Neuroscience

Articles and news from the latest research reports.

(Image caption: These are mature nerve cells generated from human cells using enhanced transcription factors. Credit: Fahad Ali)
Functional nerve cells from skin cells
A new method of generating mature nerve cells from skin cells could greatly enhance understanding of neurodegenerative diseases, and could accelerate the development of new drugs and stem cell-based regenerative medicine.
The nerve cells generated by this new method show the same functional characteristics as the mature cells found in the body, making them much better models for the study of age-related diseases such as Parkinson’s and Alzheimer’s, and for the testing of new drugs.
Eventually, the technique could also be used to generate mature nerve cells for transplantation into patients with a range of neurodegenerative diseases.
By studying how nerves form in developing tadpoles, researchers from the University of Cambridge were able to identify ways to speed up the cellular processes by which human nerve cells mature. The findings are reported in the May 27th edition of the journal Development.
Stem cells are our master cells, which can develop into almost any cell type within the body. Within a stem cell, there are mechanisms that tell it when to divide, and when to stop dividing and transform into another cell type, a process known as cell differentiation. Several years ago, researchers determined that a group of proteins known as transcription factors, which are found in many tissues throughout the body, regulate both mechanisms.
More recently, it was found that by adding these proteins to skin cells, the cells can be reprogrammed to form other cell types, including nerve cells. These cells are known as induced neurons, or iN cells. However, this method generates only a small number of cells, and those produced are not fully functional — yet full functionality is required for them to serve as useful models of disease: for example, cortical neurons for stroke, or motor neurons for motor neuron disease.
In addition, for age-related diseases such as Parkinson’s and Alzheimer’s, both of which affect millions worldwide, mature nerve cells which show the same characteristics as those found in the body are crucial in order to enhance understanding of the disease and ultimately determine the best way to treat it.
"When you reprogramme cells, you’re essentially converting them from one form to another but often the cells you end up with look like they come from embryos rather than looking and acting like more mature adult cells," said Dr Anna Philpott of the Department of Oncology, who led the research. "In order to increase our understanding of diseases like Alzheimer’s, we need to be able to work with cells that look and behave like those you would see in older individuals who have developed the disease, so producing more ‘adult’ cells after reprogramming is really important."
By manipulating the signals which transcription factors send to the cells, Dr Philpott and her collaborators were able to promote cell differentiation and maturation, even in the presence of conflicting signals that were directing the cell to continue dividing.
When cells are dividing, transcription factors are modified by the addition of phosphate groups, a process known as phosphorylation, but this can limit how well cells convert into mature nerves. However, by engineering proteins that cannot be phosphorylated and adding them to human cells, the researchers found they could produce nerve cells that were significantly more mature, and therefore more useful as models for diseases such as Alzheimer’s.
Additionally, very similar protein control mechanisms are at work to mature important cells in other tissues such as pancreatic islets, the cell type that fails to function effectively in type 2 diabetes. As well as making more mature nerves, Dr Philpott’s lab is now using similar methods to improve the function of insulin-producing pancreas cells for future therapeutic applications.
"We’ve found that not only do you have to think about how you start the process of cell differentiation in stem cells, but you also have to think about what you need to do to make differentiation complete - we can learn a lot from how cells in developing embryos manage this," said Dr Philpott.

Filed under neurodegenerative diseases nerve cells skin cells stem cells cell differentiation neuroscience science

Breakthrough: Nasal spray may soon replace the pill
When the doctor gives us medicine, it is often in the form of a pill. But when it comes to brain diseases, pills are an extremely inefficient way to deliver drugs to the brain, and according to researchers from the University of Southern Denmark we need to find new and more efficient ways of transporting drugs there. Spraying them into the patient’s nose could be one such way.
Every time we have an infection or a headache and take a pill, we get far more drug than our body actually needs. The reason is that only a fraction of the drug in a pill reaches the right places in the body; the rest never reaches its destination and may cause unwelcome side effects before it is flushed out of the body again. This kind of major overdosing is especially pronounced when doctors treat brain diseases, because the brain does not easily admit incoming drugs.
"People with brain diseases are often given huge amounts of unnecessary drugs. During a long life, or if you have a chronic disease, this may become problematic for your health", says Massimiliano Di Cagno, assistant professor at the Department of Physics, Chemistry and Pharmacy, University of Southern Denmark.
He is concerned with finding more efficient ways of delivering drugs to the brain. He and his colleagues at University of Southern Denmark and Aalborg University have turned their attention to the nose - specifically the nasal wall and the slimy mucosa that covers it.
Substances can be absorbed extremely quickly and directly through the nose, as is well known from cocaine use. Many medicinal substances, however, need help to be transported through the nasal wall and onward to the relevant places in the brain.
Researchers have long struggled with this challenge and have come up with different kinds of transport vehicles that are very good at transporting the active ingredients through the nasal wall into the brain. The problem with these vehicles, though, is that they cannot release their cargo of drugs once they have reached the inside of the brain. The drugs stay locked inside the strong vehicles.
“If the drugs cannot get out of their vehicles, they are no help to the patient. So we needed to develop a vehicle that does not lock the drug in”, explains Massimiliano Di Cagno.
The vehicles for drug delivery through the nose are typically made of so-called polymers. A polymer is a large molecule composed of many repeated subunits — one or more types of atoms or groups of atoms bound to each other. Polymers can be natural or synthetic, simple or complex.
Direct track to the brain
Massimiliano Di Cagno and his colleagues tested a natural sugar polymer and they now report that this particular polymer is not only capable of carrying the drugs through the nasal wall but also – and most importantly – releasing the drug where it is needed.
"This is an important breakthrough, which will bring us closer to delivering brain drugs by nasal spray", says Massimiliano Di Cagno.
With this discovery two out of three major challenges in nasal delivery of brain drugs have been met:
“We have solved the problem of getting the drug through the nose, and we have solved the problem of getting the drug released once it has entered the brain. Now there is a third major challenge left: To secure a steady supply of drugs over a long period. This is especially important if you are a chronic patient and need drug delivery every hour or so”, says Massimiliano Di Cagno.
When a patient sprays a drug solution into the nasal cavity, the solution hits the nasal wall and travels from there through the wall to the relevant places in the brain.
“But gravity also rules inside the nose cavity and therefore the spray solution will start to run down as soon as it has been sprayed up the nose. We need it to cling to the nasal wall for a long time, so we need to invent some kind of glue that will help the solution stick to the nasal wall and not run down and out of the nose within minutes”, says Massimiliano Di Cagno.

Filed under brain diseases drug delivery cyclodextrins medicine science

Blocking pain receptors extends lifespan, boosts metabolism in mice

Blocking a pain receptor in mice not only extends their lifespan, it also gives them a more youthful metabolism, including an improved insulin response that allows them to deal better with high blood sugar.

"We think that blocking this pain receptor and pathway could be very, very useful not only for relieving pain, but for improving lifespan and metabolic health, and in particular for treating diabetes and obesity in humans," said Andrew Dillin, a professor of molecular and cell biology at the University of California, Berkeley, and senior author of a new paper describing these results. "As humans age they report a higher incidence of pain, suggesting that pain might drive the aging process."

The “hot” compound in chili peppers, capsaicin, is already known to activate this pain receptor, called TRPV1 (transient receptor potential cation channel subfamily V member 1). In fact, TRPV1 is often called the capsaicin receptor. Constant activation of the receptor on a nerve cell results in death of the neuron, mimicking loss of TRPV1, which could explain why diets rich in capsaicin have been linked to a lower incidence of diabetes and metabolic problems in humans.

More relevant therapeutically, however, is an anti-migraine drug already on the market that inhibits a protein called CGRP that is triggered by TRPV1, producing an effect similar to that caused by blocking TRPV1. Dillin showed that giving this drug to older mice restored their metabolic health to that of younger mice.

"Our findings suggest that pharmacological manipulation of TRPV1 and CGRP may improve metabolic health and longevity," said Dillin, who is a Howard Hughes Medical Institute investigator and the Thomas and Stacey Siebel Distinguished Chair in Stem Cell Research. "Alternatively, chronic ingestion of compounds that affect TRPV1 might help prevent metabolic decline with age and lead to increased longevity in humans."

Dillin and his colleagues at UC Berkeley and The Salk Institute for Biological Studies in La Jolla, Calif., will publish their results in the May 22 issue of the journal Cell.

Pain and obesity

TRPV1 is a receptor found in the skin, nerves and joints that reacts to extremely high temperatures and other painful stimuli. The receptor is also found in nerve fibers that contact the pancreas, where it stimulates the release of substances that cause inflammation or, like CGRP (calcitonin gene-related peptide), prevent insulin release. Insulin promotes the uptake of sugar from the blood and storage in the body’s tissue, including fat.

Past research has shown that mice lacking TRPV1 are protected against diet-induced obesity, suggesting that this receptor plays a role in metabolism. Disrupting sensory perception also increases longevity in worms and flies. But until now, it was not known whether sensory perception also affects aging in mammals.

Dillin and his team have now found that mice genetically manipulated to lack TRPV1 receptors lived, on average, nearly four months – or about 14 percent – longer than normal mice. The TRPV1-deficient mice also showed signs of a youthful metabolism late in life, due to low levels of CGRP, a molecule that blocks insulin release, raising blood glucose levels, and could thus contribute to the development of type 2 diabetes. Throughout aging, these mice cleared sugar from the blood more quickly than normal mice and showed signs of burning more calories without increased exercise.

Moreover, old mice treated with the anti-migraine drug, which inhibits the activity of CGRP receptors, showed a more youthful metabolic profile than untreated old mice.

UC Berkeley and The Salk Institute filed a patent May 16 on the technology described in the Cell paper. Dillin plans to continue his studies of the effects of TRPV1 and CGRP blockers on mice and, if possible, humans.

(Source: eurekalert.org)

Filed under TRPV1 pain pain receptors longevity lifespan obesity neuroscience science

Genes discovered linking circadian clock with eating schedule
For most people, the urge to eat a meal or snack comes at a few, predictable times during the waking part of the day. But for those with a rare syndrome, hunger comes at unwanted hours, interrupts sleep and causes overeating.
Now, Salk scientists have discovered a pair of genes that normally keeps eating schedules in sync with daily sleep rhythms, and, when mutated, may play a role in so-called night eating syndrome. In mice with mutations in one of the genes, eating patterns are shifted, leading to unusual mealtimes and weight gain. The results were published in Cell Reports today.
"We really never expected that we would be able to decouple the sleep-wake cycle and the eating cycle, especially with a simple mutation," says senior study author Satchidananda Panda, an associate professor in Salk’s Regulatory Biology Laboratory. "It opens up a whole lot of future questions about how these cycles are regulated."
More than a decade ago, researchers discovered that individuals with an inherited sleep disorder often carry a particular mutation in a protein called PER2. The mutation lies in an area of the protein that can be phosphorylated — that is, modified by the addition of a phosphate group, which changes the protein’s function. Humans have three PER, or period, genes, all thought to play a role in the daily circadian clock and all containing the same phosphorylation site.
The Salk scientists joined forces with a Chinese team led by Ying Xu of Nanjing University to test whether mutations in the equivalent area of PER1 would have the same effect as the PER2 mutations that caused the sleep disorder. They bred mice lacking the mouse period genes and added a human PER1 or PER2 gene carrying a mutation in the phosphorylation site. As expected, mice with a mutated PER2 had sleep defects, dozing off earlier than usual. The same was not true for PER1 mutations, however.
"In the mice without PER1, there was no obvious defect in their sleep-wake cycles," says Panda. "Instead, when we looked at their metabolism, we suddenly saw drastic changes."
Mice with the PER1 phosphorylation defects ate earlier than other mice—causing them to wake up and snack before their sleep cycle was over—and ate more food throughout their normal waking period. When the researchers looked at the molecular details of the PER1 protein, they found that the mutated PER1 led to lower protein levels during the sleeping period, higher levels during the waking period, and a faster degradation of protein whenever it was produced by cells.
Panda and his colleagues hypothesize that normally, PER1 and PER2 are kept synchronized since they have identical phosphorylation sites—they are turned on and off at the same times, keeping sleep and eating cycles aligned. But a mutation in one of the genes could break this link, and cause off-cycle eating or sleeping.
"For a long time, people discounted night eating syndrome as not real," says Panda. "These results in mice suggest that there could actually be a genetic basis for the syndrome." The researchers haven’t yet tested, however, whether any humans with night eating syndrome have mutations in PER1.
When Panda and Xu’s team restricted access to food, providing it only at the mice’s normal mealtimes, they found that even mice with the PER1 mutation could maintain a normal weight. Over a 10-week follow-up, these mice — carrying the PER1 mutation but given timed access to food — showed no differences from control animals. This tells the researchers that the weight gain caused by the PER1 mutation stems entirely from mistimed meals, not from other metabolic defects.
Next, they hope to study exactly how PER1 controls appetite and eating behavior—whether its molecular actions work through the liver, fat cells, brain or other organs.

Filed under night eating syndrome circadian rhythms overeating gene mutation PER sleep neuroscience science

Releasing the brakes for learning

Learning can only occur if certain neuronal “brakes” are released. As the group led by Andreas Lüthi at the Friedrich Miescher Institute for Biomedical Research has now discovered, learning processes in the brain are dynamically regulated by various types of interneurons. The new connections essential for learning can only be established if inhibitory inputs from interneurons are reduced at the right moment. These findings have now been published in Nature.

Image caption: Example of a dendrite of a principal neuron (white) and synaptic contacts (yellow arrowheads) from SOM1 interneurons.

For some years, most neurobiologists studying learning processes have assumed that the new connections required for learning can only be established and ultimately reinforced if certain neuronal “brakes” are released – a process known as disinhibition. It has also been supposed for some time that various types of interneurons could be involved in disinhibition. Interneurons are nerve cells that surround and – via their connections – inhibit the activity of principal neurons. It has not been clear, however, whether these cell types actually play a role in disinhibition and how they control learning.

Andreas Lüthi and his group at the Friedrich Miescher Institute for Biomedical Research have now demonstrated for the first time how a learning process is dynamically regulated by specific types of interneurons.

In Lüthi’s experiments, mice were trained to associate a sound with an unpleasant stimulus, so that the animals subsequently knew what would happen when they heard the auditory cue. The researchers showed that, during the learning process, the sound stimulus released a brake in some of the principal neurons. More precisely, it induced the activation of parvalbumin-positive (PV+) interneurons, leading indirectly – via somatostatin-positive (SOM+) interneurons – to disinhibition of the principal neurons. The latter thus became receptive to further sensory inputs. If this was immediately followed by the unpleasant stimulus, then another brake was released. Once again, PV+ interneurons were involved, but this time the principal neurons were directly disinhibited. Steffen Wolff, a postdoc in Lüthi’s group and first author of the publication, explains: “The principal neurons temporarily reached a level of activation enabling neuronal connections to be reinforced in such a way that the animal could learn the association between the sound and the unpleasant stimulus.”

Lüthi comments: “This is the first time we’ve been able to identify so clearly the function of defined interneurons in a learning process, and to show how successive disinhibition can enable this process. We assume that interneurons disinhibit the principal neurons in a highly dynamic manner. They integrate, as it were, the state of numerous different neural networks, activated for example by sensory input, earlier experiences or emotional states, and thus permit or prevent learning. I think these findings are also of interest in the context of conditions where learning processes are impaired or dysfunctional, as in the case of anxiety disorders.”

(Source: fmi.ch)

Filed under learning interneurons disinhibition neural circuits amygdala neuroscience science

156 notes

One Molecule To Block Both Pain And Itch

Duke University researchers have found an antibody that simultaneously blocks the sensations of pain and itching in studies with mice.

The new antibody works by targeting the voltage-sensitive sodium channels in the cell membrane of neurons. The results appear online on May 22 in Cell.

Voltage-sensitive sodium channels control the flow of sodium ions through the neuron’s membrane. These channels open and close in response to changes in the membrane voltage, such as those that occur during an action potential. One particular type of sodium channel, called the Nav1.7 subtype, is responsible for sensing pain.

Mutations in the human gene encoding the Nav1.7 sodium channel can lead to either the inability to sense pain or pain hypersensitivity. Interestingly, these mutations do not affect other sensations such as touch or temperature. Hence, the Nav1.7 sodium channel might be a very specific target for treating pain disorders without perturbing the patients’ ability to feel other sensations.

"Originally, I was interested in isolating these sodium channels from cells to study their structure," said Seok-Yong Lee, assistant professor of biochemistry in the Duke University Medical School and principal investigator of the study. He designed antibodies that would capture the sodium channels so that he could study them. "But then I thought, what if I could make an antibody that interferes with the channel function?"

The team first tested the antibody in cultured cells engineered to express the Nav1.7 sodium channel. They found that the antibody can bind to the channel and stabilize its closed state.

"The channel is off when it is closed," Lee explained. "Since the antibody stabilizes the closed state, the channel becomes less sensitive to pain." If this held true in live animals, then the animals would also be less sensitive to pain.
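
One way to picture this is a minimal two-state (closed ⇌ open) gating model. This is a deliberate simplification with invented rate constants — real Nav channels have more states — but it shows why stabilizing the closed state, i.e. lowering the closed-to-open rate, reduces the fraction of channels open at steady state.

```python
def open_fraction(k_open, k_close):
    """Steady-state open probability of a two-state (closed <-> open) channel.

    At equilibrium the fluxes balance: (1 - p_open) * k_open = p_open * k_close,
    which solves to p_open = k_open / (k_open + k_close).
    """
    return k_open / (k_open + k_close)

# Hypothetical rates for an unmodified channel.
p_free = open_fraction(k_open=1.0, k_close=1.0)

# "Stabilizing the closed state" = lowering the opening rate,
# as the antibody is proposed to do; k_close is left unchanged.
p_bound = open_fraction(k_open=0.2, k_close=1.0)

print(round(p_free, 3), round(p_bound, 3))  # -> 0.5 0.167
```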

To test this idea, Lee sought the help of Ru-Rong Ji, professor of anesthesiology and neurobiology, who is an expert in the study of pain and itch sensation. Using laboratory mouse models of inflammatory and neuropathic pain, they showed that the antibody can target the Nav1.7 channel and reduce the pain sensation in these mice. More importantly, mice receiving the treatment did not show signs of physical dependence or enhanced tolerance toward the antibody.

"Pain and itch are distinct sensations, and pain is often known to suppress itch," said Ji.

The team found that the antibody can also relieve acute and chronic itch in mouse models, making them the first to discover the role of Nav1.7 in transmitting the itch sensation.

"Now we have a compound that can potentially treat both pain and itch at the same time," said Lee. Both of these symptoms are common in allergic contact dermatitis, which affects more than 10 million patients a year in the United States alone.

The team is pursuing a patent for the antibody.

"We hope our discovery will garner interest from pharmaceutical companies that can help us expand our studies into clinical trials," Lee said. Their goal is to develop a safer treatment for pain and itch as an alternative to opioids, which often cause addiction and other detrimental side effects.

(Source: today.duke.edu)

Filed under sodium ions neurons Nav1.7 pain itch antibody neuroscience science

105 notes

A New Target for Alcoholism Treatment: Kappa Opioid Receptors

The list of brain receptor targets for opiates reads like a fraternity: Mu Delta Kappa. The mu opioid receptor is the primary target for morphine and endogenous opioids like endorphin, whereas the delta opioid receptor shows the highest affinity for endogenous enkephalins. The kappa opioid receptor (KOR) is very interesting, but the least understood of the opiate receptor family.

Until now, the mu opioid receptor received the most attention in alcoholism research. Naltrexone, a drug approved by the U.S. Food and Drug Administration for the treatment of alcoholism, acts by blocking opiate action at brain receptors and is most potent at the mu opioid receptor. In addition, research has suggested that a variant of the gene that codes for the mu opioid receptor (OPRM1) may be associated with the risk for alcoholism and the response to naltrexone treatment.

However, naltrexone also acts at the kappa opioid receptor and it has not been clear whether this effect of naltrexone is relevant to alcoholism treatment.

A growing body of research in animals implicates the KOR in alcoholism. Stimulation of the KOR, which occurs with alcohol intake, is thought to produce unpleasant and aversive effects. This receptor is hypothesized to play a role in alcohol dependence, at least in part, by promoting negative reinforcement processes. In other words, the theory postulates that during development of alcohol dependence, the KOR system becomes overstimulated, producing dysphoria and anhedonia, which then leads to further alcohol seeking and escalation of alcohol intake that serves to self-medicate those negative symptoms.

A new study in Biological Psychiatry, led by Dr. Brendan Walker at Washington State University, used a rat model of alcohol dependence to directly investigate the KOR system following chronic alcohol exposure and withdrawal.

They found that the KOR system is dysregulated in the amygdala of alcohol-dependent rats, a vital brain region with many functions, including regulation of emotional behavior and decision-making. Chronic alcohol consumption is known to cause neuroadaptations in the amygdala. In this study specifically, they found increased dynorphin A and increased KOR signaling in the amygdala of alcohol-dependent rats.

When the rats were in acute alcohol withdrawal, the researchers administered different drugs, each of which targets the KOR system in a precise way, directly into the amygdala. Using this site-specific antagonism, they observed that alcohol dependence-related KOR dysregulation directly contributes to the excessive alcohol consumption that occurs during withdrawal.

“These data provide important new support for the hypothesis that kappa opioid receptor blockers might play a role in the treatment of alcoholism,” said Dr. John Krystal, Editor of Biological Psychiatry. “This study suggests that one role might be to prevent a relapse to alcohol use among patients recently withdrawn from alcohol.”

“This dataset demonstrates the extensive nature of the neuroadaptations the brain undergoes when chronically exposed to alcohol. The implications of these results are far reaching and should help guide pharmacotherapeutic development efforts for the treatment of alcohol use disorders,” said Walker. “Pharmacological compounds that alleviate the negative emotional / mood states that accompany alcohol withdrawal, by attenuating the excessive signaling in the dynorphin / kappa-opioid receptor system, should result in enhanced treatment compliance and facilitate the transition away from alcohol dependence.”

Additional extensive research will be necessary to identify and test the effectiveness of specific drugs that act on the KOR system, but these findings provide researchers with a potentially successful path forward to developing new drugs for the treatment of alcoholism.

Filed under alcohol alcohol dependence opioid receptors amygdala neuroscience science

139 notes

Visual hallucinations more common than previously thought

Vivid hallucinations experienced by people with sight loss last far longer and have more serious consequences than previously thought, according to new research from King’s College London and the Macular Society. 

The study is the largest survey of the phenomenon, known as Charles Bonnet Syndrome, and documented the experiences of 492 visually impaired people who had experienced visual hallucinations. The findings, published in the British Journal of Ophthalmology, show there is a serious discrepancy between medical opinion and the realities of the condition.

Charles Bonnet Syndrome is widely considered by the medical profession to be benign and short-lived. However, the new research shows that 80% of respondents had hallucinations for five years or more and 32% found them predominantly unpleasant, distressing and negative. 

The study described this group of people as having “negative outcome Charles Bonnet Syndrome”. The group was more likely to have frequent, fear-inducing, longer-lasting hallucinations, which affected daily activities. They were more likely to attribute hallucinations to serious mental illness and were less likely to have been warned about the possibility of hallucinations before they started. 

Of respondents, 38% regarded their hallucinations as startling, terrifying or frightening when they first occurred and 46% said hallucinations had an effect on their ability to complete daily tasks. 36% of people who discussed the issue with a medical professional said the professional was “unsure or did not know” about the diagnosis.

Dr Dominic ffytche, who led the research at the Institute of Psychiatry at King’s, says:  “Charles Bonnet Syndrome has been traditionally thought of as benign. Indeed, it has been questioned whether it should even be considered a medical condition given it does not cause problems and goes away by itself. The results of our survey paint a very different picture.

“With no specific treatments for Charles Bonnet Syndrome, the survey highlights the importance of raising awareness to reduce the distress it causes, particularly before symptoms start. All people with Charles Bonnet Syndrome are relieved or reassured to find out about the cause of their hallucinations and our evidence shows the knowledge may help reduce negative outcome.”

People with macular disease are particularly prone to Charles Bonnet hallucinations. They are thought to be a reaction of the brain to the loss of visual stimulation. More than half of people with severe sight loss experience them but many do not tell others for fear they will be thought to have a serious mental illness. 

Age-related macular degeneration (AMD) affects the central vision and is the most common cause of sight loss in the UK. Nearly 600,000 people have late-stage AMD today and more people will become affected as our population ages. Around half will have hallucinations at some stage. 

Tony Rucinski, Chief Executive, the Macular Society, said: “It is essential that people affected by sight loss are given information about Charles Bonnet Syndrome at diagnosis or as soon after as possible. 

“Losing your sight is bad enough without the fear that you have something like dementia as well. We need medical professionals to recognise the seriousness of Charles Bonnet Syndrome and ensure that people don’t suffer unnecessarily. More research is also needed to investigate Charles Bonnet Syndrome and possible ways of reducing its impact.”

Dr ffytche is also leading a large NIHR funded research programme on visual hallucinations to develop a much-needed evidence base to inform NHS practice in managing and treating the symptoms. 

Filed under hallucinations Charles Bonnet Syndrome vision visual impairment neuroscience science

654 notes

How the gut feeling shapes fear

An unlit, deserted car park at night, footsteps in the gloom. The heart beats faster and the stomach ties itself in knots. We often feel threatening situations in our stomachs. While the brain has long been viewed as the centre of all emotions, researchers are increasingly trying to get to the bottom of this proverbial gut instinct.

It is not only the brain that controls processes in our abdominal cavity; our stomach also sends signals back to the brain. At the heart of this dialogue between the brain and abdomen is the vagus nerve, which transmits signals in both directions – from the brain to our internal organs (via the so-called efferent nerves) and from the stomach back to our brain (via the afferent nerves). By cutting the afferent nerve fibres in rats, a team of scientists led by Urs Meyer, a researcher in the group of ETH Zurich professor Wolfgang Langhans, turned this two-way communication into a one-way street, enabling the researchers to get to the bottom of the role played by gut instinct. In the test animals, the brain was still able to control processes in the abdomen, but no longer received any signals from the other direction.

Less fear without gut instinct

In the behavioural studies, the researchers determined that the rats were less wary of open spaces and bright lights compared with control rats with an intact vagus nerve. “The innate response to fear appears to be influenced significantly by signals sent from the stomach to the brain,” says Meyer.

Nevertheless, the loss of their gut instinct did not make the rats completely fearless: the situation for learned fear behaviour looked different. In a conditioning experiment, the rats learned to link a neutral acoustic stimulus – a sound – to an unpleasant experience. Here, the signal path between the stomach and brain appeared to play no role, with the test animals learning the association as well as the control animals. If, however, the researchers switched from a negative to a neutral stimulus, the rats without gut instinct required significantly longer to associate the sound with the new, neutral situation. This also fits with the results of a recently published study conducted by other researchers, which found that stimulation of the vagus nerve facilitates relearning, says Meyer.

These findings are also of interest to the field of psychiatry, as post-traumatic stress disorder (PTSD), for example, is linked to the association of neutral stimuli with fear triggered by extreme experiences. Stimulation of the vagus nerve could help people with PTSD to once more associate the triggering stimuli with neutral experiences. Vagus nerve stimulation is already used today to treat epilepsy and, in some cases, depression.

Stomach influences signalling in the brain

“A lower level of innate fear, but a longer retention of learned fear – this may sound contradictory,” says Meyer. However, innate and conditioned fear are two different behavioural domains in which different signalling systems in the brain are involved. On closer investigation of the rats’ brains, the researchers found that the loss of signals from the abdomen changes the production of certain signalling substances, so-called neurotransmitters, in the brain.

“We were able to show for the first time that the selective interruption of the signal path from the stomach to the brain changed complex behavioural patterns. This has traditionally been attributed to the brain alone,” says Meyer. The study shows clearly that the stomach also has a say in how we respond to fear; however, what it says, i.e. precisely what it signals, is not yet clear. The researchers hope, however, that they will be able to further clarify the role of the vagus nerve and the dialogue between brain and body in future studies.

Filed under fear anxiety gut feeling emotions vagus nerve neuroscience science

101 notes

Screening for Autism: There’s an App for That

Most schools across the United States provide simple vision tests to their students—not to prescribe glasses, but to identify potential problems and recommend a trip to the optometrist. Researchers are now on the cusp of providing the same kind of service for autism.

Researchers at Duke University have developed software that tracks and records infants’ activity during videotaped autism screening tests. Their results show that the program is as good at spotting behavioral markers of autism as experts giving the test themselves, and better than non-expert medical clinicians and students in training.

The results appear online in the journal Autism Research and Treatment.

“We’re not trying to replace the experts,” said Jordan Hashemi, a graduate student in computer and electrical engineering at Duke. “We’re trying to transfer the knowledge of the relatively few autism experts available into classrooms and homes across the country. We want to give people tools they don’t currently have, because research has shown that early intervention can greatly impact the severity of the symptoms common in autism spectrum disorders.”

The study focused on three behavioral tests that can help identify autism in very young children.

In one test, an infant’s attention is drawn to a toy being shaken on the left side and then redirected to a toy being shaken on the right side. Clinicians count how long it takes for the child’s attention to shift in response to the changing stimulus. The second test passes a toy across the infant’s field of view and looks for any delay in the child tracking its motion. In the last test, a clinician rolls a ball to a child and looks for eye contact afterward—a sign of the child’s engagement with their play partner.

In all of the tests, the person administering them isn’t just controlling the stimulus; he or she is also counting how long it takes for the child to react – an imprecise science at best. The new program frees testers from taking the measurements themselves while also providing more accuracy, recording reaction times down to tenths of a second.
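
As a rough sketch of the kind of measurement such software automates (the frame annotations and function below are hypothetical, not Duke's actual program): given per-frame gaze labels from a video, the attention-shift latency is the time from stimulus onset to the first frame on which the child looks toward the new stimulus.

```python
def shift_latency(frames, onset_time, target_side):
    """Latency from stimulus onset to the first frame gazing at target_side.

    frames: list of (timestamp_seconds, gaze_label) pairs in time order,
    e.g. [(0.0, "left"), (0.1, "left"), (0.2, "right")].
    Returns the latency in seconds, or None if the child never shifts.
    """
    for t, gaze in frames:
        if t >= onset_time and gaze == target_side:
            return round(t - onset_time, 1)  # tenth-of-a-second resolution
    return None

# Hypothetical 10-frames-per-second annotation: the child looks left,
# the stimulus moves to the right side at t = 0.3 s, and the child's
# gaze reaches the right side at t = 0.6 s.
frames = [(0.0, "left"), (0.1, "left"), (0.2, "left"), (0.3, "left"),
          (0.4, "left"), (0.5, "center"), (0.6, "right"), (0.7, "right")]
print(shift_latency(frames, onset_time=0.3, target_side="right"))  # -> 0.3
```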

“The great benefit of the video and software is for general practitioners who do not have the trained eye to look for subtle early warning signs of autism,” said Amy Esler, an assistant professor of pediatrics and autism researcher at the University of Minnesota, who participated in some of the trials highlighted in the paper.

“The software has the potential to automatically analyze a child’s eye gaze, walking patterns or motor behaviors for signs that are distinct from typical development,” Esler said. “These signs would signal to doctors that they need to refer a family to a specialist for a more detailed evaluation.”

According to Hashemi and his adviser, Guillermo Sapiro, professor of electrical and computer engineering and biomedical engineering at Duke, because the program is non-invasive, it could be useful immediately in homes and clinics. Neither, however, expects it to become widely used—not because clinicians, teachers and parents aren’t willing, but because the researchers are working on an even more practical solution.

Later this year, the Duke team (which includes students and faculty from engineering and psychiatry) plans to test a new tablet application that could do away with the need for a person to administer any tests at all. The program would watch for physical and facial responses to visual cues played on the screen, analyze the data and automatically report any potential red flags. Any parent, teacher or clinician would simply need to download the app and sit their child down in front of it for a few minutes.

The efforts are part of the Information Initiative at Duke, which connects researchers from disparate fields to experts in computer programming to help analyze large data sets.

“We’re currently working with autism experts at Duke Medicine to determine what sorts of easy tests could be used on just a computer or tablet screen to spot any potential concerns,” said Sapiro. “The goal is to mimic the same sorts of social interactions that the tests with the toys and balls measure, but without the toys and balls. The research has shown that the earlier autism can be spotted, the more beneficial intervention can be. And we want to provide everyone in the world with the ability to spot those signs as early as possible.”

(Source: pratt.duke.edu)

Filed under autism infants social interaction eye movements attention ASD neuroscience science
