Posts tagged science

Scientists have described a way to convert human skin cells directly into a specific type of brain cell affected by Huntington’s disease, an ultimately fatal neurodegenerative disorder. Unlike other techniques that turn one cell type into another, this new process does not pass through a stem cell phase, avoiding the production of multiple cell types, the study’s authors report.

(Image caption: Human skin cells (top) can be converted into medium spiny neurons (bottom) with exposure to the right combination of microRNAs and transcription factors, according to work by Andrew Yoo and his research team)
The researchers, at Washington University School of Medicine in St. Louis, demonstrated that these converted cells survived at least six months after injection into the brains of mice and behaved similarly to native cells in the brain.
“Not only did these transplanted cells survive in the mouse brain, they showed functional properties similar to those of native cells,” said senior author Andrew S. Yoo, PhD, assistant professor of developmental biology. “These cells are known to extend projections into certain brain regions. And we found the human transplanted cells also connected to these distant targets in the mouse brain. That’s a landmark point about this paper.”
The work appears Oct. 22 in the journal Neuron.
The investigators produced a specific type of brain cells called medium spiny neurons, which are important for controlling movement. They are the primary cells affected in Huntington’s disease, an inherited genetic disorder that causes involuntary muscle movements and cognitive decline usually beginning in middle adulthood. Patients with the condition live about 20 years following the onset of symptoms, which steadily worsen over time.
The research involved adult human skin cells, rather than more commonly studied mouse cells or even human cells at an earlier stage of development. In regard to potential future therapies, the ability to convert adult human cells presents the possibility of using a patient’s own skin cells, which are easily accessible and won’t be rejected by the immune system.
To reprogram these cells, Yoo and his colleagues put the skin cells in an environment that closely mimics the environment of brain cells. They knew from past work that exposure to two small molecules of RNA, a close chemical cousin of DNA, could turn skin cells into a mix of different types of neurons.
In a skin cell, the DNA instructions for how to be a brain cell, or any other type of cell, are neatly packed away, unused. In past research published in Nature, Yoo and his colleagues showed that exposure to two microRNAs called miR-9 and miR-124 altered the machinery that governs packaging of DNA. Though the investigators are still unraveling the details of this complex process, these microRNAs appear to open up the tightly packaged sections of DNA important for brain cells, allowing expression of genes governing the development and function of neurons.
Knowing that exposure to these microRNAs alone could change skin cells into a mix of neurons, the researchers then began to fine-tune the chemical signals, exposing the cells to additional molecules called transcription factors that they knew were present in the part of the brain where medium spiny neurons are common.
“We think that the microRNAs are really doing the heavy lifting,” said co-first author Matheus B. Victor, a graduate student in neuroscience. “They are priming the skin cells to become neurons. The transcription factors we add then guide the skin cells to become a specific subtype, in this case medium spiny neurons. We think we could produce different types of neurons by switching out different transcription factors.”
Yoo also explained that the microRNAs, but not the transcription factors, are important components for the general reprogramming of human skin cells directly to neurons. His team, including co-first author Michelle C. Richner, senior research technician, showed that when the skin cells were exposed to the transcription factors alone, without the microRNAs, the conversion into neurons wasn’t successful.
The researchers performed extensive tests to demonstrate that these newly converted brain cells did indeed look and behave like native medium spiny neurons. The converted cells expressed genes specific to native human medium spiny neurons and did not express genes for other types of neurons. When transplanted into the mouse brain, the converted cells showed morphological and functional properties similar to native neurons.
To study the cellular properties associated with the disease, the investigators now are taking skin cells from patients with Huntington’s disease and reprogramming them into medium spiny neurons using the approach described in the new paper. They also plan to inject healthy reprogrammed human cells into mice with a model of Huntington’s disease to see if this has any effect on the symptoms.
(Source: news.wustl.edu)

Brain simulation raises questions
What does it mean to simulate the human brain? Why is it important to do so? And is it even possible to simulate the brain separately from the body it exists in? These questions are discussed in a new paper published in the scientific journal Neuron today.
Simulating the brain means modeling it on a computer. But in real life, brains don’t exist in isolation. The brain is a complex and adaptive system that is seated within our bodies and entangled with all the other adaptive systems inside us that together make up a whole person. And the fact that the brain is a brain inside our bodies is something we can’t ignore when we attempt to simulate it realistically. Today, two Human Brain Project (HBP) researchers, Kathinka Evers, philosopher at the Centre for Research Ethics and Bioethics at Uppsala University, and Yadin Dudai, neuroscientist at the Weizmann Institute of Science, publish a paper in Neuron that discusses the questions raised by brain simulations within and beyond the EU flagship project HBP.
In the article, Kathinka Evers and Yadin Dudai discuss the goal of simulation. In broad terms it has to do with understanding. But what does understanding mean in neuroscience? For many scientists, understanding means being able to create a mental model that allows them to predict how a system would behave under different conditions. For the brain sciences, this type of understanding is currently only possible for a limited number of basic functions.
Because it dwells inside our bodies, the brain is always a result of what the individual has experienced up to that point. That is why, when we simulate the brain, we have to take this ‘experienced brain’ into account and try to reflect it.
According to Kathinka Evers, leader of the Ethics and Society part of the Human Brain Project, neglecting this experience would severely limit the outcome of any brain simulation. But if we are to include experience we have to simulate real-life situations.
“That is a daunting task: a large part of that experience is the brain’s interaction with the rest of the human body existing and interacting in a still larger social context”, says Kathinka Evers.
What outcome would be realistic to hope for in the Human Brain Project’s simulation? In neuroscience, computer simulations of specific systems are already in use. These simulations are a complement to other tools scientists use.
But there are some caveats. According to Kathinka Evers and Yadin Dudai, our knowledge to date is still very limited, and many neuroscientists think it is too early for large-scale brain simulations. Collecting the data needed is not an easy task, and it remains an open question whether we can truly understand what we are about to build. There are also technical limitations: there simply isn’t enough computing power available today.
But if we do manage to simulate the brain, would that mean we have created artificial consciousness? And can a computer be conscious at all? According to Kathinka Evers and Yadin Dudai, that depends on what consciousness is: if it is the result of certain types of organization or function of biological matter, like the cells in the human body, then a computer can never gain consciousness. But if it is a matter of organization alone, without the need for biological matter, then the answer could be yes. It is, however, still a very hypothetical stance.

Mathematical model shows how the brain remains stable during learning
Complex biochemical signals that coordinate fast and slow changes in neuronal networks keep the brain in balance during learning, according to an international team of scientists from the RIKEN Brain Science Institute in Japan, UC San Francisco (UCSF), and Columbia University in New York.
The work, reported on October 22 in the journal Neuron, culminates a six-year quest by a collaborative team from the three institutions to solve a decades-old question and opens the door to a more general understanding of how the brain learns and consolidates new experiences on dramatically different timescales.
Neuronal networks form a learning machine that allows the brain to extract and store new information from its surroundings via the senses. Researchers have long puzzled over how the brain achieves sensitivity and stability to unexpected new experiences during learning—two seemingly contradictory requirements.
A new model devised by this team of mathematicians and brain scientists shows how the brain’s network can learn new information while maintaining stability.
To address the problem, the team turned to a classic experimental system. After birth, the visual area of the brain’s cortex undergoes rapid modification to match the properties of neurons when seeing the world through the left and right eyes, a phenomenon termed “ocular dominance plasticity,” or ODP. The discovery of this dramatic plasticity was recognized by the 1981 Nobel Prize in Physiology or Medicine awarded to David H. Hubel and Torsten N. Wiesel.
ODP learning contains a paradox that puzzled researchers—it relies on fast-acting changes in activity called “Hebbian plasticity” in which neural connections strengthen or weaken almost instantly depending on their frequency of use. However, acting alone, this process could lead to unstable activity levels.
In 2008, the UCSF team of Megumi Kaneko and Michael P. Stryker found that a second process, termed “homeostatic plasticity,” also controls ODP by tuning the activity of the whole neural network up in a slower manner, resembling the system for controlling the overall brightness of a TV screen without changing its images.
By modeling Hebbian and homeostatic plasticity together, mathematicians Taro Toyoizumi and Ken Miller of Columbia saw a possible resolution to the paradox of brain stability during learning. Dr. Toyoizumi, who is now at the RIKEN Brain Science Institute in Japan, explains, “We were running simulations of ODP using a conventional model. When we failed to reconcile Kaneko and Stryker’s data to the model, we had to develop a new theoretical solution.”
"It seemed important to explore the interactions between these two different types of plasticity to understand the computations performed by neurons in the visual area," Dr. Stryker adds. Testing the new mathematical model in an animal during experimental ODP was necessary, so the teams decided to collaborate.
The theory and experimental findings showed that fast Hebbian and slow homeostatic plasticity work together during learning, but only after each has independently assured stability on its own timescale. “The essential idea is that the fast and slow processes control separate biochemical factors,” said Dr. Miller.
"Our model solves the ODP paradox and may explain in general terms how learning occurs in other areas of the brain," said Dr. Toyoizumi. "Building on our general mathematical model for learning could reveal insights into new principles of brain capacities and diseases."
A nano-sized discovery by Northwestern Medicine® scientists helps explain how bipolar disorder affects the brain and could one day lead to new drug therapies to treat the mental illness.

Scientists used a new super-resolution imaging method — the same method recognized with the 2014 Nobel Prize in chemistry — to peer deep into brain tissue from mice with bipolar-like behaviors. In the synapses (where communication between brain cells occurs), they discovered tiny “nanodomain” structures with concentrated levels of ANK3 — the gene most strongly associated with bipolar disorder risk. ANK3 encodes the protein ankyrin-G.
“We knew that ankyrin-G played an important role in bipolar disease, but we didn’t know how,” said Northwestern Medicine scientist Peter Penzes, corresponding author of the paper. “Through this imaging method we found the gene formed in nanodomain structures in the synapses, and we determined that these structures control or regulate the behavior of synapses.”
Penzes is a professor in physiology and psychiatry and behavioral sciences at Northwestern University Feinberg School of Medicine. The results were published Oct. 22 in the journal Neuron.
High-profile cases, including actress Catherine Zeta-Jones and politician Jesse Jackson, Jr., have brought attention to bipolar disorder. The illness causes unusual shifts in mood, energy, activity levels and the ability to carry out day-to-day tasks. About 3 percent of Americans experience bipolar disorder symptoms, and there is no cure.
Recent large-scale human genetic studies have shown that genes can contribute to disease risk along with stress and other environmental factors. However, how these risk genes affect the brain is not known.
This is the first time any psychiatric risk gene has been analyzed at such a detailed level of resolution. As explained in the paper, Penzes used the Nikon Structured Illumination Super-resolution Microscope to study a mouse model of bipolar disorder. The microscope achieves resolution down to 115 nanometers. To put that size in perspective, a nanometer is one-thousandth of a micron, and there are 25,400 microns in one inch. Very few of these microscopes exist worldwide.
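The scale comparison can be sanity-checked with quick arithmetic (the 25,400-microns-per-inch figure is from the article; the rest are standard metric conversions):

```python
NM_PER_MICRON = 1_000         # 1 micron = 1,000 nanometers
MICRONS_PER_INCH = 25_400     # conversion cited in the article

resolution_nm = 115           # microscope's resolution limit
resolution_microns = resolution_nm / NM_PER_MICRON        # 0.115 microns
resolution_inches = resolution_microns / MICRONS_PER_INCH
# roughly a tenth of a micron, or a few millionths of an inch
```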
“There is important information about genes and diseases that can only be seen at this level of resolution,” Penzes said. “We provide a neurobiological explanation of the function of the leading risk gene, and this might provide insight into the abnormalities in bipolar disorder.”
The biological framework presented in this paper could be used in human studies of bipolar disorder in the future, with the goal of developing therapeutic approaches to target these genes.
(Source: northwestern.edu)

Researchers Identify Key Factor in Transition from Moderate to Problem Drinking
A team of UC San Francisco researchers has found that a tiny segment of genetic material known as a microRNA plays a central role in the transition from moderate drinking to binge drinking and other alcohol use disorders.
Previous research in the UCSF laboratory of Dorit Ron, PhD, Endowed Chair of Cell Biology of Addiction in Neurology, has demonstrated that the level of a protein known as brain-derived neurotrophic factor, or BDNF, is increased in the brain when alcohol is consumed in moderation. In turn, experiments in Ron’s lab have shown, BDNF prevents the development of alcohol use disorders.
In the new study, Ron and first author Emmanuel Darcq, PhD, a former postdoctoral fellow now at McGill University in Canada, found that when mice consumed excessive amounts of alcohol for a prolonged period, there was a marked decrease in the amount of BDNF in the medial prefrontal cortex (mPFC), a brain region important for decision making. As reported in the October 21, 2014 online edition of Molecular Psychiatry, this decline was associated with a corresponding increase in the level of a microRNA called miR-30a-5p.
MicroRNAs lower the levels of proteins such as BDNF by binding to messenger RNA, the molecular middleman that carries instructions from genes to the protein-making machinery of the cell, and tagging it for destruction.
Ron and colleagues then showed that if they increased the levels of miR-30a-5p in the mPFC, BDNF was reduced, and the mice consumed large amounts of alcohol. When mice were treated with an inhibitor of miR-30a-5p, however, the level of BDNF in the mPFC was restored to normal and alcohol consumption was restored to normal, moderate levels.
“Our results suggest BDNF protects against the transition from moderate to uncontrolled drinking and alcohol use disorders,” said Ron, senior author of the study and a professor in UCSF’s Department of Neurology. “When there is a breakdown in this protective pathway, however, uncontrolled excessive drinking develops, and microRNAs are a possible mechanism in this breakdown. This mechanism may be one possible explanation as to why 10 percent of the population develop alcohol use disorders and this study may be helpful for the development of future medications to treat this devastating disease.”
One reason many potential therapies for alcohol abuse have been unsuccessful is because they inhibit the brain’s reward pathways, causing an overall decline in the experience of pleasure. But in the new study, these pathways continued to function in mice in which the actions of miR-30a-5p had been tamped down—the mice retained the preference for a sweetened solution over plain water that is seen in normal mice.
This result has significant implications for future treatments, Ron said. “In searching for potential therapies for alcohol abuse, it is important that we look for future medications that target drinking without affecting the reward system in general. One problem with current alcohol abuse medications is that patients tend to stop taking them because they interfere with the sense of pleasure.”
A rich vocabulary can protect against cognitive impairment
Some people suffer incipient dementia as they get older. To make up for this loss, the brain’s cognitive reserve is put to the test. Researchers from the University of Santiago de Compostela have studied what factors can help to improve this ability and they conclude that having a higher level of vocabulary is one such factor.
‘Cognitive reserve’ is the name given to the brain’s capacity to compensate for the loss of its functions. This reserve cannot be measured directly; rather, it is calculated through indicators believed to increase this capacity.
A research project at the University of Santiago de Compostela (USC) has studied how having a wide vocabulary influences cognitive reserve in the elderly.
As Cristina Lojo Seoane, from the USC, co-author of the study published in the journal ‘Anales de Psicología’ (Annals of Psychology), explains to SINC: “We focused on level of vocabulary as it is considered an indicator of crystallised intelligence (the use of previously acquired intellectual skills). We aimed to deepen our understanding of its relation to cognitive reserve.”
The research team chose a sample of 326 subjects over the age of 50 – 222 healthy individuals and 104 with mild cognitive impairment. They then measured their levels of vocabulary, along with other measures such as their years of schooling, the complexity of their jobs and their reading habits.
They also analysed the scores they obtained in various tests, such as the vocabulary subtest of the ‘Wechsler Adult Intelligence Scale’ (WAIS) and the Peabody Picture Vocabulary Test.
“With a regression analysis we calculated the probability of impairment based on the vocabulary levels of the participants,” Lojo Seoane continues.
The results revealed a greater prevalence of mild cognitive impairment in participants who achieved a lower vocabulary level score.
“This led us to the conclusion that a higher level of vocabulary, as a measure of cognitive reserve, can protect against cognitive impairment,” the researcher concludes.
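The kind of regression the researchers describe can be sketched as a logistic model mapping a vocabulary score to a probability of impairment. The intercept and slope below are hypothetical, chosen only to show the shape of such a model; the study’s actual fitted coefficients are not reported in this article.

```python
import math

# Hypothetical coefficients for illustration (not the study's values):
B0, B1 = 4.0, -0.15   # intercept, slope per vocabulary point

def p_impairment(vocab_score):
    """Estimated probability of mild cognitive impairment for a given
    vocabulary score under an illustrative logistic regression model."""
    return 1.0 / (1.0 + math.exp(-(B0 + B1 * vocab_score)))

# With a negative slope, a lower vocabulary score yields a higher
# estimated probability of impairment, matching the reported trend:
risk_low_vocab = p_impairment(20)
risk_high_vocab = p_impairment(40)
```

The negative slope encodes the study’s finding: the lower the vocabulary score, the greater the estimated prevalence of mild cognitive impairment.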
Memory decline — a frequent complaint of menopausal women — potentially could be lessened by hypnotic relaxation therapy, say Baylor University researchers, who already have done studies showing that such therapy eases hot flashes, improves sleep and reduces stress in menopausal women.
Their review — “Memory Decline in Peri- and Post-menopausal Women: The Potential of Mind-Body Medicine to Improve Cognitive Performance” — is published in the journal Integrative Medicine Insights. October has been designated World Menopause Month by the International Menopause Society.
Initial research by Baylor, funded by the National Institutes of Health, focused on hot flashes, finding that hypnotic relaxation therapy lessened them, but “along the way, we discovered there are a lot of secondary benefits, including significantly improved sleep and mood,” said Jim R. Sliwinski, a doctoral student in the department of psychology and neuroscience in Baylor’s College of Arts & Sciences.
Co-researcher Gary Elkins, Ph.D., theorizes that sleep, mood and hot flashes associated with decreased estrogen also have a bearing on memory. Their publication, which reviews previous research by other scholars, proposes a framework for how mind-body interventions may improve memory, which could prove fruitful in doing future research.
“Memory decline may not be solely about decreased estrogen,” said Elkins, director of Baylor’s Mind-Body Medicine Research Laboratory and a professor of psychology and neuroscience.
Peri- and post-menopausal women may find mind-body therapies attractive for many reasons, among them that they do not have the side effects of medications or hormone therapy, said Elkins, author of “Relief from Hot Flashes: The Natural, Drug-Free Program to Reduce Hot Flashes, Improve Sleep and Ease Stress.”
While hormone therapy can increase estrogen, it also is associated with an increased risk of breast cancer and cardiovascular disease for some women, he said.
Researchers have noted that while memory decline can occur with aging in both men and women, women are more likely to report a greater number of memory problems, associating them with estrogen decline. Menopausal women also report more concerns about memory than pre-menopausal women do, according to several large-scale survey studies.
A factor that may impact memory is that women are dealing with increased responsibilities, stress or depression over such issues as caring for aging parents. In addition, their concern about memory problems may cause them to be more aware of memory lapses, Sliwinski said.
Even women who can safely be treated with estrogen do not necessarily have improved memory. “It sometimes even is associated with cognition problems,” he said.
Although there are questions about sleep’s specific role in forming and storing memories, researchers generally agree that consolidated sleep throughout a whole night is optimal for learning and memory.
Memory tests and scores over time with study participants — both pre- and post-menopausal — could help shed light on how menopause affects recollection, the Baylor researchers said.
(Image: Shutterstock)
(Image caption: Pictured is a mouse hippocampal neuron studded with thousands of synaptic connections (yellow). The number and location of synapses — not too many or too few — is critical to healthy brain function. The researchers found that MHCI proteins, known for their role in the immune system, also are one of the only known factors that ensure synapse density is not too high. The protein does so by inhibiting insulin receptors, which promote synapse formation. Credit: Lisa Boulanger)
Immune proteins moonlight to regulate brain-cell connections
When it comes to the brain, “more is better” seems like an obvious assumption. But in the case of synapses, which are the connections between brain cells, too many or too few can both disrupt brain function.
Researchers from Princeton University and the University of California-San Diego (UCSD) recently found that an immune-system protein called MHCI, or major histocompatibility complex class I, moonlights in the nervous system to help regulate the number of synapses, which transmit chemical and electrical signals between neurons. The researchers report in the Journal of Neuroscience that in the brain MHCI could play an unexpected role in conditions such as Alzheimer’s disease, type II diabetes and autism.
MHCI proteins are known for their role in the immune system where they present protein fragments from pathogens and cancerous cells to T cells, which are white blood cells with a central role in the body’s response to infection. This presentation allows T cells to recognize and kill infected and cancerous cells.
In the brain, however, the researchers found that MHCI immune molecules are one of the only known factors that limit the density of synapses, ensuring that synapses form in the appropriate numbers necessary to support healthy brain function. MHCI limits synapse density by inhibiting insulin receptors, which regulate the body’s sugar metabolism and, in the brain, promote synapse formation.
Senior author Lisa Boulanger, an assistant professor in the Department of Molecular Biology and the Princeton Neuroscience Institute (PNI), said that MHCI’s role in ensuring appropriate insulin signaling and synapse density raises the possibility that changes in the protein’s activity could contribute to conditions such as Alzheimer’s disease, type II diabetes and autism. These conditions have all been associated with a complex combination of disrupted insulin-signaling pathways, changes in synapse density, and inflammation, which activates immune-system molecules such as MHCI.
Patients with type II diabetes develop “insulin resistance” in which insulin receptors become incapable of responding to insulin, the reason for which is unknown, Boulanger said. Similarly, patients with Alzheimer’s disease develop insulin resistance in the brain that is so pronounced some have dubbed the disease “type III diabetes,” Boulanger said.
"Our results suggest that changes in MHCI immune proteins could contribute to disorders of insulin resistance," Boulanger said. "For example, chronic inflammation is associated with type II diabetes, but the reason for this link has remained a mystery. Our results suggest that inflammation-induced changes in MHCI could have consequences for insulin signaling in neurons and maybe elsewhere."
MHCI levels also are “dramatically altered” in the brains of people with Alzheimer’s disease, Boulanger said. Normal memory depends on appropriate levels of MHCI. Boulanger was senior author on a 2013 paper in the journal Learning and Memory that found that mice bred to produce less functional MHCI proteins exhibited striking changes in the function of the hippocampus, a part of the brain where some memories are formed, and had severe memory impairments.
"MHCI levels are altered in the Alzheimer’s brain, and altering MHCI levels in mice disrupts memory, reduces synapse number and causes neuronal insulin resistance, all of which are core features of Alzheimer’s disease," Boulanger said.
Links between MHCI and autism also are emerging, Boulanger said. People with autism have more synapses than usual in specific brain regions. In addition, several autism-associated genes regulate synapse number, often via a signaling protein known as mTOR (mammalian target of rapamycin). In their study, Boulanger and her co-authors found that mice with reduced levels of MHCI had increased insulin-receptor signaling via the mTOR pathway, and, consequently, more synapses. When elevated mTOR signaling was reduced in MHCI-deficient mice, normal synapse density was restored.
Thus, Boulanger said, MHCI and autism-associated genes appear to converge on the mTOR-synapse regulation pathway. This is intriguing given that inflammation during pregnancy, which alters MHCI levels in the fetal brain, may slightly increase the risk of autism in genetically predisposed individuals, she said.
"Up-regulating MHCI is essential for the maternal immune response, but changing MHCI activity in the fetal brain when synaptic connections are being formed could potentially affect synapse density," Boulanger said.
Ben Barres, a professor of neurobiology, developmental biology and neurology at the Stanford University School of Medicine, said that while it was known that insulin-receptor signaling increases synapse density and MHCI signaling decreases it, the researchers are the first to show that MHCI acts directly on insulin receptors to control synapse density.
"The idea that there could be a direct interaction between these two signaling systems comes as a great surprise," said Barres, who was not involved in the research. "This discovery not only will lead to new insight into how brain circuitry develops but to new insight into declining brain function that occurs with aging."
Particularly, the research suggests a possible functional connection between type II diabetes and Alzheimer’s disease, Barres said.
"Type II diabetes has recently emerged as a risk factor for Alzheimer’s disease but it has not been clear what the connection is to the synapse loss experienced with Alzheimer’s disease," he said. "Given that type II diabetes is accompanied by decreased insulin responsiveness, it may be that the MHCI signaling becomes able to overcome normal insulin signaling and contribute to synapse decline in this disease."
Research during the past 15 years has shown that MHCI lives a prolific double-life in the brain, Boulanger said. The brain is “immune privileged,” meaning the immune system doesn’t respond as rapidly or effectively to perceived threats in the brain. Dozens of studies have shown, however, that MHCI is not only present throughout the healthy brain, but is essential for normal brain development and function, Boulanger said. A 2013 paper from her lab published in the journal Molecular and Cellular Neuroscience showed that MHCI is even present in the fetal-mouse brain, at a stage when the immune system is not yet mature.
"Many people thought that immune molecules like MHCI must be missing from the brain," Boulanger said. "It turns out that MHCI immune proteins do operate in the brain — they just do something completely different. The dual roles of these proteins in the immune system and nervous system may allow them to mediate both harmful and beneficial interactions between the two systems."
New Research on Walnuts and the Fight Against Alzheimer’s Disease
A new animal study published in the Journal of Alzheimer’s Disease indicates that a diet including walnuts may have a beneficial effect in reducing the risk, delaying the onset, slowing the progression of, or preventing Alzheimer’s disease.
Research led by Abha Chauhan, PhD, head of the Developmental Neuroscience Laboratory at the New York State Institute for Basic Research in Developmental Disabilities (IBR), found significant improvements in learning skills, memory, and motor development, along with reduced anxiety, in mice fed a walnut-enriched diet.
The researchers suggest that the high antioxidant content of walnuts (3.7 mmol/ounce) may have been a contributing factor in protecting the mouse brain from the degeneration typically seen in Alzheimer’s disease. Oxidative stress and inflammation are prominent features in this disease, which affects more than five million Americans.
“These findings are very promising and help lay the groundwork for future human studies on walnuts and Alzheimer’s disease – a disease for which there is no known cure,” said lead researcher Abha Chauhan, PhD. “Our study adds to the growing body of research that demonstrates the protective effects of walnuts on cognitive functioning.”
The research group examined the effects of dietary supplementation with 6 percent or 9 percent walnuts in mice, amounts equivalent to 1 ounce and 1.5 ounces per day, respectively, of walnuts in humans. This research stemmed from a previous cell culture study led by Dr. Chauhan that highlighted the protective effects of walnut extract against the oxidative damage caused by amyloid beta protein. This protein is the major component of amyloid plaques that form in the brains of those with Alzheimer’s disease.
Someone in the United States develops Alzheimer’s disease every 67 seconds, and the number of Americans with Alzheimer’s disease and other dementias is expected to escalate rapidly in coming years as the baby boom generation ages. By 2050, the number of people age 65 and older with Alzheimer’s disease may nearly triple, from five million to as many as 16 million, emphasizing the importance of determining ways to prevent, slow or stop the disease. Estimated total payments in 2014 for all individuals with Alzheimer’s disease and other dementias are $214 billion.
Walnuts have other nutritional benefits as well: they contain numerous vitamins and minerals and are the only nut that is a significant source of alpha-linolenic acid (ALA; 2.5 grams per ounce), an omega-3 fatty acid with heart- and brain-health benefits. The researchers also suggest that ALA may have played a role in improving the behavioral symptoms seen in the study.
Tarantula Toxin is Used to Report on Electrical Activity in Live Cells
Crucial experiments to develop a novel probe of cellular electrical activity were conducted in the Neurobiology course at the Marine Biological Laboratory (MBL) in 2013. Today, that optical probe, which combines a tarantula toxin with a fluorescent compound, is introduced in a paper in the Proceedings of the National Academy of Sciences.
The lead authors of the paper are Drew C. Tilley of University of California-Davis and the late Kenneth Eum, who was a teaching assistant in the Neurobiology course and a Ph.D. candidate at UC-Davis.
The probe takes advantage of the potent ability of tarantula toxin to bind to electrically active cells, such as neurons, while the cells are in a resting state. The team discovered that a trace amount of toxin combined with a fluorescent compound would bind to a specific subset of voltage-activated proteins (Kv2-type potassium ion channels) in live cells. The probe lights up cell surfaces with this ion channel, and the fluorescent signal dims when the channel is activated by electrical signals.
This is the first time that researchers have been able to visually observe these ion channels “turning on” without first genetically modifying them. All that is required is a means to detect probe location, suggesting that related probes could potentially one day be used to map neural activity in the human brain.
“This is a demonstration, a prototype probe. But the promise is that we could use it to measure the activity state of the electrical system in an organism that has not been genetically compromised,” says senior author Jon Sack, an assistant professor in the Department of Physiology and Membrane Biology at UC-Davis. Sack is a faculty member in the MBL Neurobiology course.
Since the probe binds selectively to one of the many different kinds of ion channels, it can help scientists disentangle the function of those specific channels in neuronal signaling. This can, in turn, lead to the identification of drug targets for neurological diseases and disorders.
“We have an incredible diversity of ion channels, and even of voltage-activated ion channels. The real trouble has been determining which ones perform which roles. Which ones turn on and when in normal nervous system functioning? Which are involved in abnormal states or syndromes?” Sack says. “The dream is to be able to see what the different types of ion channels are doing and when, to understand what they contribute to physiology and pathophysiology.”
These probes respond to movement of ion channel voltage sensors, and it is particularly fulfilling to have conducted some of this work at the MBL, Sack says. The first measurements of voltage sensor movement were conducted at the MBL in the early 1970s by Clay M. Armstrong and Francisco Bezanilla (Nature 242: 459-461, 1973). Armstrong and Bezanilla used electrophysiological methods to measure the movement of voltage sensors. The spider toxin probe creates an optical signal when voltage sensors move, and no electrophysiology or genetic mutation of the channels is required.