Posts tagged science
Cinnamon: Can the red-brown spice with the unmistakable fragrance and variety of uses offer an important benefit? The common baking spice might hold the key to delaying the onset of Alzheimer’s disease –– or warding off its effects.
That is, according to Roshni George and Donald Graves, scientists at UC Santa Barbara. The results of their study, “Interaction of Cinnamaldehyde and Epicatechin with Tau: Implications of Beneficial Effects in Modulating Alzheimer’s Disease Pathogenesis,” appear in the online early edition of the Journal of Alzheimer’s Disease and in the upcoming Volume 36, Issue 1 print edition.
Alzheimer’s disease is the most common form of dementia, a neurodegenerative disease that progressively worsens over time as it kills brain cells. No cure has yet been found, nor has the major cause of Alzheimer’s been identified.
However, two compounds found in cinnamon –– cinnamaldehyde and epicatechin –– are showing some promise in the effort to fight the disease. According to George and Graves, the compounds have been shown to prevent the development of the filamentous “tangles” found in the brain cells that characterize Alzheimer’s.
Responsible for the assembly of microtubules in a cell, a protein called tau plays a large role in the structure of the neurons, as well as their function.
“The problem with tau in Alzheimer’s is that it starts aggregating,” said George, a graduate student researcher. When the protein does not bind properly to the microtubules that form the cell’s structure, it has a tendency to clump together, she explained, forming insoluble fibers in the neuron. The older we get the more susceptible we are to these twists and tangles; Alzheimer’s patients develop them more often and in larger amounts.
The use of cinnamaldehyde, the compound responsible for the bright, sweet smell of cinnamon, has proven effective in preventing the tau knots. By protecting tau from oxidative stress, the compound, an oil, could inhibit the protein’s aggregation. To do this, cinnamaldehyde binds to two residues of an amino acid called cysteine on the tau protein. The cysteine residues are vulnerable to modifications, a factor that contributes to the development of Alzheimer’s.
“Take, for example, sunburn, a form of oxidative damage,” said Graves, adjunct professor in UCSB’s Department of Molecular, Cellular, and Developmental Biology. “If you wore a hat, you could protect your face and head from the oxidation. In a sense this cinnamaldehyde is like a cap.” While it can protect the tau protein by binding to its vulnerable cysteine residues, it can also come off, Graves added, which can ensure the proper functioning of the protein.
Oxidative stress is a major factor to consider in the health of cells in general. Through normal cellular processes, free radical-generating substances like peroxides are formed, but antioxidants in the cell work to neutralize them and prevent oxidation. Under some conditions, however, the scales are tipped, with increased production of peroxides and free radicals and decreased amounts of antioxidants, leading to oxidative stress.
Epicatechin, which is also present in other foods, such as blueberries, chocolate, and red wine, has proven to be a powerful antioxidant. Not only does it quench the burn of oxidation, it is actually activated by oxidation so the compound can interact with the cysteines on the tau protein in a way similar to the protective action of cinnamaldehyde.
“Cell membranes that are oxidized also produce reactive derivatives, such as acrolein, that can damage the cysteines,” said George. “Epicatechin also sequesters those byproducts.”
Studies indicate that there is a high correlation between Type 2 diabetes and the incidence of Alzheimer’s disease. The elevated glucose levels typical of diabetes lead to the overproduction of reactive oxygen species, resulting in oxidative stress, which is a common factor in both diabetes and Alzheimer’s disease. Other research has shown cinnamon’s beneficial effects in managing blood glucose and other problems associated with diabetes.
“Since tau is vulnerable to oxidative stress, this study then asks whether Alzheimer’s disease could benefit from cinnamon, especially looking at the potential of small compounds,” said George.
Although this research shows promise, Graves said, they are “still a long way from knowing whether this will work in human beings.” The researchers caution against ingesting more than the typical amounts of cinnamon already used in cooking.
If cinnamon and its compounds do live up to their promise, it could be a significant step in the ongoing battle against Alzheimer’s. A major risk factor for the disease –– age –– is uncontrollable. In the United States, Alzheimer’s presents a particular problem as the population lives longer and the Baby Boom generation turns gray, leading to a steep rise in the prevalence of the disease. It is a phenomenon that threatens to overwhelm the U.S. health care system. According to the Alzheimer’s Association, in 2013, Alzheimer’s disease will cost the nation $203 billion.
“Wouldn’t it be interesting if a small molecule from a spice could help?” commented Graves, “perhaps prevent it, or slow down the progression.”
As the human body fine-tunes its neurological wiring, nerve cells often must fix a faulty connection by amputating an axon — the “business end” of the neuron that sends electrical impulses to tissues or other neurons. It is a dance with death, however, because the molecular poison the neuron deploys to sever an axon could, if uncontained, kill the entire cell.
Researchers from the University of North Carolina School of Medicine have uncovered some surprising insights about the process of axon amputation, or “pruning,” in a study published May 21 in the journal Nature Communications. Axon pruning has mystified scientists curious to know how a neuron can unleash a self-destruct mechanism within its axon, but keep it from spreading to the rest of the cell. The researchers’ findings could offer clues about the processes underlying some neurological disorders.
“Aberrant axon pruning is thought to underlie some of the causes for neurodevelopmental disorders, such as schizophrenia and autism,” said Mohanish Deshmukh, PhD, professor of cell biology and physiology at UNC and the study’s senior author. “This study sheds light on some of the mechanisms by which neurons are able to regulate axon pruning.”
Axon pruning is part of normal development and plays a key role in learning and memory. Another important process, apoptosis — the purposeful death of an entire cell — is also crucial because it allows the body to cull broken or incorrectly placed neurons. But both processes have been linked with disease when improperly regulated.
The research team placed mouse neurons in special devices called microfluidic chambers that allowed the researchers to independently manipulate the environments surrounding the axon and cell body to induce axon pruning or apoptosis.
They found that although the nerve cell uses the same poison — a group of molecules known as caspases — whether it intends to kill the whole cell or just the axon, it deploys the caspases in a different way depending on the context.
“People had assumed that the mechanism was the same regardless of whether the context was axon pruning or apoptosis, but we found that it’s actually quite distinct,” said Deshmukh. “The neuron essentially uses the same components for both cases, but tweaks them in a very elegant way so the neuron knows whether it needs to undergo apoptosis or axon pruning.”
In apoptosis, the neuron deploys the deadly caspases using an activator known as Apaf-1. In the case of axon pruning, Apaf-1 was simply not involved, despite the presence of caspases. “This is really going to take the field by surprise,” said Deshmukh. “There’s very little precedent of caspases being activated without Apaf-1. We just didn’t know they could be activated through a different mechanism.”
In addition, the team discovered that neurons employ other molecules as safety brakes to keep the “kill” signal contained to the axon alone. “Having this brake keeps that signal from spreading to the rest of the body,” said Deshmukh. “Remarkably, just removing one brake makes the neurons more vulnerable.”
Deshmukh said the findings offer a glimpse into how nerve cells reconfigure themselves during development and beyond. Enhancing our understanding of these basic processes could help illuminate what has gone wrong in the case of some neurological disorders.
A study from the June issue of Anesthesiology found that feedback from the front region of the brain is a crucial building block for consciousness and that its disruption is associated with unconsciousness when the anesthetics ketamine, propofol or sevoflurane are administered.
Brain centers and mechanisms of consciousness have not been well understood, resulting in a need for better monitors of consciousness during anesthesia. In addition, how anesthetics with different structures and pharmacological properties can generate unconsciousness has been a persistent question in anesthesiology since the beginning of the field in the mid-19th century.
A team of researchers from the University of Michigan, Ann Arbor, Mich., and Asan Medical Center, Seoul, South Korea, conducted a brain wave (electroencephalographic, or EEG) study of the front and back regions of the brain in 30 surgical patients who received intravenous ketamine. They compared the results of this study to the EEG data collected from 18 surgical patients who received either intravenous propofol or inhaled sevoflurane in a previous study. These three anesthetics, known to act on different parts of the brain and produce different EEG patterns, had the same effect of disrupting communication in the brain.
“Understanding a commonality among the actions of these diverse drugs could lead to a more comprehensive theory of how general anesthetics induce unconsciousness,” said study author George Mashour, M.D., Ph.D., assistant professor and associate chair for faculty affairs, Department of Anesthesiology, University of Michigan. “Our research shows that studying general anesthesia from the perspective of consciousness may be a fruitful approach and create new avenues for further investigation of anesthetic mechanisms and monitoring.”
An accompanying editorial by Jamie W. Sleigh, M.D., professor of anaesthesiology and intensive care, Department of Anaesthesia, University of Auckland, Hamilton, New Zealand, supported the study’s contribution to a better understanding of the neurobiology of consciousness.
“If the study’s findings are confirmed by subsequent work, the paper will achieve landmark status,” said Dr. Sleigh. “The study not only sheds light on the phenomenon of general anesthesia, but also shows how it is necessary for certain regions of the brain to communicate accurately with one another for consciousness to emerge.”
In addition, Dr. Sleigh recognized the study’s potential to lead to the development of better depth-of-anesthesia monitors that work for all general anesthetics.
Moving objects attract greater attention – a fact exploited by video screens in public spaces and animated advertising banners on the Internet. For most animal species, moving objects also play a major role in the processing of sensory impressions in the brain, as they often signal the presence of welcome prey or an imminent threat. This is also true of the zebrafish larva, which has to react to the movements of its prey. Scientists at the Max Planck Institute for Medical Research in Heidelberg have investigated how the brain uses information from the visual system to execute rapid movements. The animals’ visual system records the movements of the prey so that the brain can redirect the animals’ movements through targeted swim bouts in a matter of milliseconds. Two hitherto unknown types of neurons in the midbrain are involved in the processing of movement stimuli.
In principle, the visual system of zebrafish larvae resembles that of other vertebrates. Moreover, its genome has been decoded, it is a small organism, and its transparent skin is easily penetrated by light under a fluorescence microscope. Therefore, these animals are very suitable for studying visual motion perception. They also display very clear prey capture behaviour. With the help of their finely tuned visual system, they pursue and catch small ciliates. To do this, they execute a series of swimming manoeuvres in a matter of one or two seconds, during which they repeatedly verify the direction and distance of the prey so that they can adapt their subsequent movement steps. The larva’s brain must, therefore, filter and evaluate visual information extremely rapidly so that it can select appropriate motor patterns.
Using high-speed video recordings, researchers working with Johann Bollmann at the Max Planck Institute for Medical Research began by studying the natural course of prey capture by the larvae under a variety of starting conditions. It emerged that the larvae repeatedly execute a basic motion pattern and can apply an orientation component that re-directs the hunter towards the prey with each swim bout. To do this, the larvae must process visual information in just a few hundred milliseconds.
Using an innovative experimental design, the scientists then modelled, in a second step, the natural swimming environment as a “virtual reality”, in which the larvae execute typical prey capture sequences without actually moving. The virtual prey consisted of computer-controlled images, which were projected onto a small screen. In this way, the role of motion parameters, for example the size and speed of the “prey”, could be studied quantitatively in relation to the processing of visual stimuli by the animals.
In the “virtual reality”, the scientists can test how the fish larvae respond to unexpected shifts in the prey after a swim bout. “When we direct our gaze at a target through movements of our eyes and head, we expect the object to appear in a central position in our field of view. In the larvae, very slight deviations from the target position or delays in the re-appearance of the virtual prey increased the reaction times. When it receives unexpected visual feedback, the larva’s brain presumably needs extra processing time to calculate the next swim bout,” explains Johann Bollmann from the Max Planck Institute in Heidelberg.
In addition, with the help of fluorescent microscopes, the researchers can examine the activity of groups of neurons in the larval brain which are likely to control the targeted prey capture movements. In a previous study, they discovered cell types that react specifically to opposing directions of movement. These previously unknown neurons in the dorsal region of the midbrain (tectum) differ in their directional sensitivity and in the structure of their finely branched projections. “It appears that different directions of motion are processed in different layers of the tectum, since the dendritic ramifications of these cell types are spatially separated from each other,” says Bollmann.
Until now, little was scientifically known about the human potential to cultivate compassion — the emotional state of caring for people who are suffering in a way that motivates altruistic behavior.
A new study by researchers at the Center for Investigating Healthy Minds at the Waisman Center of the University of Wisconsin-Madison shows that adults can be trained to be more compassionate. The report, recently published online in the journal Psychological Science, is the first to investigate whether training adults in compassion can result in greater altruistic behavior and related changes in neural systems underlying compassion.
“Our fundamental question was, ‘Can compassion be trained and learned in adults? Can we become more caring if we practice that mindset?’” says Helen Weng, a graduate student in clinical psychology and lead author of the paper. “Our evidence points to yes.”
In the study, the investigators trained young adults to engage in compassion meditation, an ancient Buddhist technique to increase caring feelings for people who are suffering. In the meditation, participants envisioned a time when someone has suffered and then practiced wishing that his or her suffering was relieved. They repeated phrases to help them focus on compassion such as, “May you be free from suffering. May you have joy and ease.”
Participants practiced with different categories of people, first starting with a loved one, someone for whom they easily felt compassion, like a friend or family member. Then, they practiced compassion for themselves and, then, a stranger. Finally, they practiced compassion for someone with whom they actively had conflict, called the “difficult person,” such as a troublesome coworker or roommate.
“It’s kind of like weight training,” Weng says. “Using this systematic approach, we found that people can actually build up their compassion ‘muscle’ and respond to others’ suffering with care and a desire to help.”
Compassion training was compared to a control group that learned cognitive reappraisal, a technique where people learn to reframe their thoughts to feel less negative. Both groups listened to guided audio instructions over the Internet for 30 minutes per day for two weeks. “We wanted to investigate whether people could begin to change their emotional habits in a relatively short period of time,” says Weng.
The real test of whether compassion could be trained was to see if people would be willing to be more altruistic — even helping people they had never met. The research tested this by asking the participants to play a game in which they were given the opportunity to spend their own money to respond to someone in need (called the “Redistribution Game”). They played the game over the Internet with two anonymous players, the “Dictator” and the “Victim.” They watched as the Dictator shared an unfair amount of money (only $1 out of $10) with the Victim. They then decided how much of their own money to spend (out of $5) in order to equalize the unfair split and redistribute funds from the Dictator to the Victim.
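The arithmetic behind the game can be sketched in a few lines. The sketch below is a hypothetical rendering: the 2-to-1 transfer ratio is an assumption chosen for illustration, not a rule reported in the article.

```python
def redistribute(dictator, victim, spent, transfer_ratio=2):
    """Hypothetical Redistribution Game arithmetic.

    The participant spends `spent` dollars of their own endowment;
    each dollar spent moves `transfer_ratio` dollars from the Dictator
    to the Victim. The transfer ratio is an illustrative assumption --
    the article does not specify the exact payout rule.
    """
    moved = spent * transfer_ratio
    return dictator - moved, victim + moved

# The Dictator kept $9 of $10, sharing only $1 with the Victim.
# Under the assumed 2:1 ratio, spending $2 equalizes the split.
final = redistribute(dictator=9, victim=1, spent=2)  # → (5, 5)
```

Under this assumed rule, a participant willing to give up $2 of their $5 endowment fully erases the unfair split, so the amount spent serves as a direct behavioral measure of altruism.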
“We found that people trained in compassion were more likely to spend their own money altruistically to help someone who was treated unfairly than those who were trained in cognitive reappraisal,” Weng says.
“We wanted to see what changed inside the brains of people who gave more to someone in need. How are they responding to suffering differently now?” asks Weng. The study measured changes in brain responses using functional magnetic resonance imaging (fMRI) before and after training. In the MRI scanner, participants viewed images depicting human suffering, such as a crying child or a burn victim, and generated feelings of compassion towards the people using their practiced skills. The control group was exposed to the same images, and asked to recast them in a more positive light as in reappraisal.
The researchers measured how much brain activity had changed from the beginning to the end of the training, and found that the people who were the most altruistic after compassion training were the ones who showed the most brain changes when viewing human suffering. They found that activity was increased in the inferior parietal cortex, a region involved in empathy and understanding others. Compassion training also increased activity in the dorsolateral prefrontal cortex and the extent to which it communicated with the nucleus accumbens, brain regions involved in emotion regulation and positive emotions.
“People seem to become more sensitive to other people’s suffering, but this is challenging emotionally. They learn to regulate their emotions so that they approach people’s suffering with caring and wanting to help rather than turning away,” explains Weng.
Compassion, like physical and academic skills, appears to be something that is not fixed, but rather can be enhanced with training and practice. “The fact that alterations in brain function were observed after just a total of seven hours of training is remarkable,” explains UW-Madison psychology and psychiatry professor Richard J. Davidson, founder and chair of the Center for Investigating Healthy Minds and senior author of the article.
“There are many possible applications of this type of training,” Davidson says. “Compassion and kindness training in schools can help children learn to be attuned to their own emotions as well as those of others, which may decrease bullying. Compassion training also may benefit people who have social challenges such as social anxiety or antisocial behavior.”
Weng is also excited about how compassion training can help the general population. “We studied the effects of this training with healthy participants, which demonstrated that this can help the average person. I would love for more people to access the training and try it for a week or two — what changes do they see in their own lives?”
Both compassion and reappraisal trainings are available on the Center for Investigating Healthy Minds’ website. “I think we are only scratching the surface of how compassion can transform people’s lives,” says Weng.
A new study provides neurobiological evidence for dysfunction in the neural circuitry underlying emotion regulation in people with insomnia, which may have implications for the risk relationship between insomnia and depression.
“Insomnia has been consistently identified as a risk factor for depression,” said lead author Peter Franzen, PhD, an assistant professor of psychiatry at the University of Pittsburgh School of Medicine. “Alterations in the brain circuitry underlying emotion regulation may be involved in the pathway for depression, and these results suggest a mechanistic role for sleep disturbance in the development of psychiatric disorders.”
The study involved 14 individuals with chronic primary insomnia without other primary psychiatric disorders, as well as 30 good sleepers who served as a control group. Participants underwent an fMRI scan during an emotion regulation task in which they were shown negative or neutral pictures. They were asked to passively view the images or to decrease their emotional responses using cognitive reappraisal, a voluntary emotion regulation strategy in which one reinterprets the meaning depicted in the picture in order to feel less negative.
Results show that in the primary insomnia group, amygdala activity was significantly higher during reappraisal than during passive viewing. Located in the temporal lobe of the brain, the amygdala plays an important role in emotional processing and regulation.
In analysis between groups, amygdala activity during reappraisal trials was significantly greater in the primary insomnia group compared with good sleepers. The two groups did not significantly differ when passively viewing negative pictures.
“Previous studies have demonstrated that successful emotion regulation using reappraisal decreases amygdala response in healthy individuals, yet we were surprised that activity was even higher during reappraisal of, versus passive viewing of, pictures with negative emotional content in this sample of individuals with primary insomnia,” said Franzen.
The research abstract was published recently in an online supplement of the journal SLEEP, and Franzen will present the findings Wednesday, June 5, in Baltimore, Md., at SLEEP 2013, the 27th annual meeting of the Associated Professional Sleep Societies LLC.
The American Academy of Sleep Medicine reports that about 10 to 15 percent of adults have an insomnia disorder with distress or daytime impairment. According to the National Institute of Mental Health, 6.7 percent of the U.S. adult population suffers from major depressive disorder. Both insomnia and depression are more common in women than in men.
Brain freeze is practically a rite of summer.
It happens when you eat ice cream or gulp something ice cold too quickly. The scientific term is sphenopalatine ganglioneuralgia, but that’s a mouthful. Brain freeze is your body’s way of putting on the brakes, telling you to slow down and take it easy. Wake Forest Baptist Medical Center neuroscientist Dwayne Godwin, Ph.D., explains how it works.
“Brain freeze is really a type of headache that is rapid in onset, but rapidly resolved as well,” he said. “Our mouths are highly vascularized, including the tongue - that’s why we take our temperatures there. But drinking a cold beverage fast doesn’t give the mouth time to absorb the cold very well.”
Here’s how it happens: When you slurp a really cold drink or eat ice cream too fast you are rapidly changing the temperature in the back of the throat at the juncture of the internal carotid artery, which feeds blood to the brain, and the anterior cerebral artery, which is where brain tissue starts.
“One thing the brain doesn’t like is for things to change, and brain freeze is a mechanism to prevent you from doing that,” Godwin said.
The brain can’t actually feel pain despite its billions of neurons, Godwin said, but the pain associated with brain freeze is sensed by receptors in the outer covering of the brain called the meninges, where the two arteries meet. When the cold hits, it causes a dilation and contraction of these arteries and that’s the sensation that the brain is interpreting as pain.
Analyzing brain freeze may seem like silly science to some, but “it’s helpful in understanding other types of headaches,” Godwin said.
“We can’t easily give people migraines or a cluster headache, but we can easily induce brain freeze without any long-term problems,” he said. “We can learn something about headache mechanisms and extend that to our understanding to develop better treatments for patients.”
Is there a cure for brain freeze? Yes - stop drinking the icy cold beverage. You can also jam your tongue up to the roof of your mouth because it’s warm or drink something tepid to normalize the temperature in your mouth.
Older people with a history of migraines and depression may have smaller brain tissue volumes than people with only one or neither of the conditions, according to a new study in the May 22, 2013, online issue of Neurology®, the medical journal of the American Academy of Neurology.
“Studies show that people with migraine have double the risk of depression compared to people without migraine,” said study author Larus S. Gudmundsson, PhD, with the National Institute on Aging and the Uniformed Services University of the Health Sciences, in Bethesda, Md. Gudmundsson is also a member of the American Academy of Neurology. “We wanted to find out whether having both conditions together possibly affected brain size.”
For the study, 4,296 people with an average age of 51 were tested for migraine headache from 1967 to 1991; they were later assessed from 2002 to 2006 at an average age of 76 for a history of major depressive disorder (depression). Participants also underwent MRI, from which brain tissue volumes were estimated. A total of 37 participants had a history of both migraine and depression, while 2,753 had neither condition.
The study found that people with both migraine and depression had total brain tissue volumes an average of 19.2 milliliters smaller than those without either condition. There was no difference in the total brain volume when comparing people with only one of the conditions to people with neither condition.
“It is important to note that participants in this study were imaged using MRI once, so we cannot say that migraine and depression resulted in brain atrophy. In future studies, we need to examine at what age participants develop both migraine and depression and measure their brain volume changes over time in order to determine what comes first,” said Gudmundsson.
Gudmundsson noted that some of the factors leading to a joint effect of migraine and depression on brain volume may include pain, brain inflammation, genetics and differences in a combination of social and economic factors. “Our study suggests that people with both migraine and depression may represent a unique group from those with only one of these conditions and may also require different strategies for long-term treatment.”
It is known that signs of neurological disorders such as Alzheimer’s and Huntington’s disease can appear years before the disease becomes manifest; these signs take the form of subtle changes in the brain and behavior of individuals affected. For the first time, an international group of researchers led by the DZNE and the Bonn University Hospital has proven the existence of such signatures for motor disorders belonging to the group of “spinocerebellar ataxias”. The scientists report these findings in the current online edition of “The Lancet Neurology”. This pan-European study could open up new possibilities of early diagnosis and smooth the way for treatments which tackle diseases before the patient’s nervous system is irreparably damaged.
“Spinocerebellar ataxias” comprise a group of genetic diseases of the cerebellum and other parts of the brain. Persons affected only have limited control of their muscles. They also suffer from balance disorders and impaired speech. The symptoms originate from mutations in the patient’s genetic make-up. These cause nerve cells to become damaged and to die off. Such genetic defects are comparatively rare: it is estimated that about 3,000 people in Germany are affected.
It is known that there are various subtypes of these neurodegenerative diseases. The age at which the symptoms first appear varies accordingly, between about 30 and 50. “Our aim was to find out whether specific signs can be recognized before a disease becomes obvious,” says project leader Prof. Thomas Klockgether, Director for Clinical Research at the DZNE and Director of the Clinic for Neurology at Bonn University Hospital.
The study, which involved 14 research centers in all, focused on the four most common forms of spinocerebellar ataxia. These account for more than half of all cases. More than 250 siblings and children of patients throughout Europe declared their willingness to participate in appropriate tests. These individuals had no obvious symptoms of ataxia. However, about half of them had inherited the genetic defects which invariably cause the disease to manifest in the long term.
With the aid of a mathematical model that considered the genetic mutations and their effects, the scientists were able to estimate the time remaining until the disease could be expected to manifest. In the test group, this “time to onset” varied from 2 to 24 years. These and all other test results remained anonymous: the data was not disclosed to the test subjects, nor could the researchers assign it to specific participants. The same applied to individuals whose DNA turned out to be inconspicuous. “People in families with cases of ataxia usually have not taken a genetic test and they don’t want to know any results. This kind of information has to be treated very carefully for ethical reasons,” emphasizes Klockgether.
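The article does not spell out the model, but for repeat-expansion ataxias of this kind, published models typically relate expected age of onset to the length of the CAG repeat expansion, with longer repeats predicting earlier onset. A minimal sketch under that assumption follows; the exponential form and the coefficients are invented for illustration and are not taken from the study.

```python
import math

def expected_onset_age(cag_repeats, a=120.0, b=0.02):
    """Expected age of onset as a decaying function of repeat length.

    The decay form mirrors the general shape of published onset models
    for repeat-expansion diseases; coefficients `a` and `b` here are
    made up for illustration, not fitted values from the study.
    """
    return a * math.exp(-b * cag_repeats)

def time_to_onset(current_age, cag_repeats):
    """Estimated years remaining until the disease is expected to manifest."""
    return max(0.0, expected_onset_age(cag_repeats) - current_age)
```

Under this toy model, a younger carrier with a shorter expansion has more estimated years remaining than an older carrier with a longer one, which is the kind of per-participant “time to onset” estimate the study reports ranging from 2 to 24 years.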
The study participants made themselves available for various examinations including standardized tests of muscular coordination. These included measuring the time needed by the subjects to walk a specific distance. Another series of experiments involved inserting small pins into the holes of a board and taking them back out as quickly as possible. Yet another test measured how often the participants could repeat a certain sequence of syllables in ten seconds. “The tests were designed in such a way that they would provide significant information but still be easy to perform,” says Klockgether. “Tests like these can be performed anywhere without need for special technology.”
Technically complex methods were also used: all study participants were tested for the genetic defects relevant to ataxia. At some of the research centers involved in the study, it was also possible to examine the subjects with the aid of magnetic resonance imaging (MRI). This enabled researchers to measure the total brain volume as well as the dimensions of individual parts of the brain in about a third of the subjects.
In two of the four types of ataxia investigated, the scientists found signs of impending disease. “We found a loss in brain volume, particularly shrinkage in the area of the cerebellum and brain stem. These subjects also had subtle difficulties with coordination,” Klockgether summarizes the results. “This means that manifestations of this kind can be measured years before the disease is likely to become obvious.”
The findings for the other two types of ataxia were less conclusive. “I assume that there are indications for these types of the disease as well. However, this subgroup of participants was relatively small. It is therefore difficult to make statistically reliable statements about these subjects,” says the Bonn-based researcher.
In his view, the study results testify to the modern-day view of neurodegenerative processes: “Neurodegeneration doesn’t begin when the symptoms surface. Rather, it is a stealthy disease which starts developing years or even decades beforehand.”
Klockgether believes that this gradual development offers certain opportunities: “If we intervened in this process by appropriate treatments and at a sufficiently early stage, it might be possible to slow down or even stop the disease process.”
More investigations planned
The current results will be the basis for long-term investigations. A new series of tests with the same group of individuals has already started; further tests are scheduled every two years. The scientists intend to monitor the study participants for as long as possible.
Scientists from the University of Southampton have developed a device which records the brain activity of worms to help test the effects of drugs.
NeuroChip is a microfluidic electrophysiological device which can trap the microscopic worm Caenorhabditis elegans and record the activity of discrete neural circuits in its ‘brain’ – a worm equivalent of the EEG.
C. elegans has been enormously important in providing insight into fundamental signalling processes in the nervous system, and this device opens the way for new kinds of analysis. Prior to this development, electrophysiological recordings that resolve the activity of excitatory and inhibitory nerve cells in the nervous system of the worm required a high level of technical expertise: single microscopic (1 mm long) worms had to be trapped on the end of a glass tube, a microelectrode, in order to make the recording. Because the worms are both very mobile and very small, this can be a challenging procedure.
The microfluidic invention consists of a reservoir through which worms can be fed, one after the other, into a narrow fluid-filled channel. The channel tapers at one end and this captures the worm by the front end. The worm is then in the correct orientation for recording the activity of the nervous system in the anterior of its body. The device incorporates metal electrodes, which are connected to an amplifier to make the recording. The design of the trapping channel has been optimised by PhD student Chunxiao Hu, so that the quality of the worm ‘EEG’ recording is sufficient to resolve the activity of components of the neural circuit in the worm’s nervous system.
This device has been used to detect the effects of drugs and is highly suitable for high throughput screens (which allow researchers to quickly conduct millions of chemical, genetic or pharmacological tests) in neurotoxicology and for generic screening for neuroactive drugs. It has more power to resolve discrete effects on excitatory, inhibitory or modulatory transmission than previously possible with behavioural screens.
Lindy Holden-Dye, Professor of Neuroscience at the University of Southampton and lead author of the paper, says: “We are particularly interested in using this as a sensitive new tool for screening compounds for neurotoxicity. It will allow us to precisely quantify sub-lethal effects on neural network activity. It can also provide an information-rich platform by reporting the effects of compounds on a diverse array of neurotransmitter pathways, which are implicated in mammalian toxicology.”
The research, which is published in the latest issue of the journal PLOS ONE, is a joint project between the University’s Centre for Biological Sciences and the Hybrid Biodevices Group.