Neuroscience

Articles and news from the latest research reports.

Balancing mitochondrial dynamics in Alzheimer’s disease
Many diseases are multifactorial and cannot be understood through simple molecular associations alone. Alzheimer’s disease (AD) is associated with toxic transformations in two classes of protein, amyloid beta and tau, but these do not explain the full underlying pathology. On the cellular scale, many of the real-time morphological changes in neurons can be attributed to their underlying mitochondrial dynamics—namely fission, fusion, and the motions between these events. Last year, researchers from Harvard Medical School made the intriguing discovery that alterations in tau could double the length of mitochondria. This week, they published a review article in Trends in Neurosciences in which they seek to explain the primary features of AD in terms of mitochondrial dynamics.
Together with a collaborator from the Queensland Brain Institute, the Harvard researchers arrive at the conclusion that, like many other neurological diseases, AD is fundamentally an energy problem. While some risk factors, like the APOE-ɛ4 variant, can predispose one to AD, point defects in individual proteins cannot account for AD in the way that a single alteration in hemoglobin leads to sickle cell disease. Attempts to assign causal relations to the complex interactions of tau or amyloid with hundreds of other proteins inside neurons have frequently served to cloud, rather than simplify, the AD story.
In years gone by, it was possible to publish a paper about how phosphorylation at certain sites on proteins like tau could lead to any number of downstream events. Tau is one of many proteins that control the assembly and stability of microtubules, critical structures that are among those compromised in AD. The problem now is that tau comes in many flavors: it is a large family of isoforms with different properties depending on how they are processed. As for phosphorylation alone, tau has been found to have 79 potential sites, at least 30 of which are normally phosphorylated.
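The scale of that modification space is easy to make concrete. A back-of-envelope sketch, treating each site as independently phosphorylated or not (a simplification; real sites are not independent):

```python
def phospho_states(n_sites: int) -> int:
    """Number of binary on/off phosphorylation patterns over n_sites.
    Assumes independent sites, which real biology does not guarantee."""
    return 2 ** n_sites

print(f"79 potential sites: {phospho_states(79):,} patterns")
print(f"30 in-vivo sites:   {phospho_states(30):,} patterns")  # 1,073,741,824
```

Even the restricted in-vivo set yields over a billion combinatorial states, which is why site-by-site accounting has failed to simplify the story.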
A welcome simplification to this compounding molecular complexity is that many pathways converge onto convenient pre-existing packets of time, space, and predictable molecular structure: the mitochondria. As opposed to massive cell-wide molecular accounting, describing a few subcellular morphological features may be a more tractable approach, not only to capture disease etiology but perhaps to treat it.
To this end, the researchers apply existing knowledge about some of the molecular players in AD to a few of the well-established control points in mitochondrial dynamics. State transitions between fission and fusion are, at the moment at least, characterized by only a small handful of proteins. The formula might be prescribed as follows: a molecular pathway locally affects the organelle’s dynamics, and the dynamic behavior of the organelle in turn accounts for the disease. Imposing this middleman can potentially simplify much of the vast body of fact and conjecture associated with the disease.
The elongation of mitochondria by tau can be caused by increasing fusion, decreasing fission, or both. One function of tau is to stabilize F-actin networks, which prevents a key fission protein from ever reaching the mitochondria. Elongated mitochondria do not necessarily cause AD; in fact, amyloid beta, which is concentrated inside mitochondria, has been shown to cause increased fission and decreased fusion. When the balance between fission and fusion is pushed too far in either direction, the result is bad news for neurons. And if mitochondrial transport is defective, as seems to be the case in many neurological diseases, redistribution cannot compensate for this loss of balance.
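The balance described here can be caricatured in a toy model: a pool of mitochondrial lengths in which fusion merges two organelles and fission splits one. The rates, pool size, and units below are arbitrary illustrations, not measured values; the point is only that mean organelle length tracks the fusion/fission ratio.

```python
import random

def mean_length(k_fus: float, k_fis: float, steps: int = 20000, seed: int = 0) -> float:
    """Toy fission/fusion balance. Total mass is conserved, so mean
    length rises as fusion outpaces fission (and vice versa)."""
    rng = random.Random(seed)
    lengths = [1.0] * 200
    for _ in range(steps):
        fuse = rng.random() < k_fus / (k_fus + k_fis)
        if fuse and len(lengths) > 1:
            a, b = rng.sample(range(len(lengths)), 2)  # merge two organelles
            lengths[a] += lengths[b]
            lengths.pop(b)
        elif not fuse:
            i = rng.randrange(len(lengths))            # split one in half
            lengths[i] /= 2
            lengths.append(lengths[i])
    return sum(lengths) / len(lengths)

print(f"balanced 1:1 rates      -> mean length {mean_length(1.0, 1.0):.2f}")
print(f"fusion-favored 4:1 rate -> mean length {mean_length(4.0, 1.0):.2f}")
```

Tilting the ratio toward fusion (as tau’s stabilization of F-actin effectively does, by excluding the fission machinery) produces the elongated organelles described above.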
Specific disease-associated isoforms and phosphorylation states of tau can lead to AD through the loss of mitochondria in axons. In studies of AD tissue, mitochondria have been found to be preferentially redistributed to the soma. These selective localizations can take place quickly, and are therefore difficult to quantify except by live videomicroscopy. In synapses, the mitochondria have been observed to be longer lived, and to play a more critical role in calcium regulation than those elsewhere. Disruption in the normal handling of calcium has been implicated in many aspects of AD, particularly synaptic pathology.
The canonical dogma that action potentials lead to vesicle fusion and transmitter release exclusively through the entry of extracellular calcium has recently been enhanced by the understanding that mitochondria contribute significantly to the synaptic calcium cycle. While mitochondria clearly do not depolarize as rapidly as whole spiking cells (generally, when mitochondria are depolarized, something is wrong), their calcium transporters operate quickly to mop up and redistribute calcium. Whether mitochondria might single-handedly initiate vesicle fusion, or for that matter minipotentials or full-blown spikes, awaits experimental corroboration.
Countless papers over the years have attempted to make sense of the myriad synaptic pathways underlying memory and LTP. These might be better understood if mitochondria are viewed as the primary authors of synaptic vesicle release probability and, by implication, of “spontaneous” release (vesicle fusion in the absence of a spike). As in disease states, specific pathways, structures, and organelles have significant roles to play in many aspects of brain function—but causally relating the motions and dynamics of mitochondria to these phenomena now gives the broadest interpretive power.

Researchers Discover How Brain Circuits Can Become Miswired During Development
Researchers at Weill Cornell Medical College have uncovered a mechanism that guides the exquisite wiring of neural circuits in a developing brain — gaining unprecedented insight into the faulty circuits that may lead to brain disorders ranging from autism to mental retardation.
In the journal Cell, the researchers show, for the first time, that faulty wiring occurs when RNA molecules embedded in a growing axon are not degraded after they give instructions that help steer the nerve cell. So, for example, the signal that tells the axon to turn — which should disappear after the turn is made — remains active, interfering with new signals meant to guide the axon in other directions.
The scientists say that there may be a way to use this new knowledge to fix the circuits.
"Understanding the basis of brain miswiring can help scientists come up with new therapies and strategies to correct the problem," says the study’s senior author, Dr. Samie Jaffrey, a professor in the Department of Pharmacology.
"The brain is quite ‘plastic’ and changeable in the very young, and if we know why circuits are miswired, it may be possible to correct those pathways, allowing the brain to build new, functional wiring," he says.
Disorders associated with faulty neuronal circuits include epilepsy, autism, schizophrenia, mental retardation and spasticity and movement disorders, among others.
In their study, the scientists describe a process of brain wiring that is much more dynamic than was previously known — and thus more prone to error.
Proteins Sense the Environment to Steer the Axon
During brain development, neurons have to connect to each other, which they do by extending their long axons to touch one another. Ultimately, these neurons form a circuit between the brain and the target tissue through which chemical and electrical signals are relayed. In this study, researchers investigated neurons that travel up the spinal cord into the brain. “It is very critical that axons are precisely positioned in the spinal cord,” Dr. Jaffrey says. “If they are improperly positioned, they will form the wrong connections, which can lead to signals being sent to the wrong target cells in the brain.”
The way that an axon guides and finds its proper target is through so-called growth cones located at the tips of axons. “These growth cones have the ability to sense the environment, determine where the targets are and navigate toward them. The question has always been — how do they know how to do this? Where do the instructions come from that tell them how to find their proper target?” Dr. Jaffrey says. The team found that RNA molecules embedded in the growth cone are responsible for instructing the axon to move left or right, up or down. These RNAs are translated in growth cones to produce antenna-like proteins that steer the axon like a self-guided missile.
"As a circuit is being built, RNAs in the neuron’s growth cones are mostly silent. We found that specific RNAs are only read at precise stages in order to produce the right protein needed to steer the axon at the right time. After the protein is produced, we saw that the RNA instruction is degraded and disappears," he says.
"If these RNAs do not disappear when they should, the axon does not position itself properly — it may go right instead of left — and the wiring will be incorrect and the circuit may be faulty," Dr. Jaffrey says.
RNAs Have Tremendous Power over Brain Development
The research finding answers a long-standing puzzle in the quest to understand brain wiring, says Dr. Dilek Colak, a postdoctoral associate in Dr. Jaffrey’s laboratory.
"There have been a series of discoveries over the last five years showing that proteins that control RNA degradation are very important for brain development and, when they are mutated, you can have spasticity or other movement disorders," Dr. Colak says. "That has raised a major question — why would RNA degradation pathways be so critical for properly creating brain circuits?
"What we show here is that not only does RNA need to be present in growth cones to give instructions, it then also needs to be removed from the growth cones to take away those instructions at the right time," she says. "Both those processes are critical and it may explain why there are so many different brain disorders associated with ineffective RNA regulation."
"The idea that control of brain wiring is located in these RNA molecules that are constantly being dynamically turned over is something that we didn’t anticipate," Dr. Jaffrey adds. "This tells us that regulating these RNA degradation pathways could have a tremendous impact on brain development. Now we know where to look to tease apart this process when it goes awry, and to think about how we can repair it."
(Image: Chad Baker)

Mapping the Brain
Freiburg Researchers Use Signals from Natural Movements to Identify Brain Regions
Whether we run to catch a bus or reach for a pen: activities that involve the use of muscles are related to very specific areas in the brain. Traditionally, their exact location has only been determined through electrical stimulation or unnatural, experimental tasks. A team of scientists in Freiburg has now succeeded for the first time in mapping the brain’s surface using measurements of everyday movements.

Attributing abilities to specific brain regions and identifying pathological areas is especially important in the treatment of epilepsy patients, as severe cases require removal of neural tissue. Until now, such “mapping” involved stimulating individual regions of the brain’s surface with electric currents and observing the reaction or sensation. Alternatively, patients were asked to perform the same movements again and again until the physicians isolated the corresponding patterns in brain activity. However, these methods required the patient to cooperate and to provide detailed answers to the physicians’ questions. This is a prerequisite that small children or patients with impaired mental abilities can hardly meet, and hence there is a need for other strategies.
Scientists from the group of Dr. Tonio Ball at the Cluster of Excellence “BrainLinks-BrainTools” and the Bernstein Center Freiburg report in the current issue of NeuroImage that the brain’s natural activity during everyday movements can also be used to reliably identify the regions responsible for arm and leg movements.
The researchers examined data from epilepsy patients who had electrodes implanted under their skull prior to surgery. Using video recordings, the team captured the spontaneous movements of their patients, searching for concurrent signals of a certain frequency in the data gathered on the surface of the brain. They succeeded in creating a map of the brain’s surface for arm and leg movements that is as accurate as those created through established experimental methods.
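The core of such a mapping (compare each electrode's signal power during spontaneous movement against rest) can be sketched in a few lines. This is a simplification: the study analyzed specific ECoG frequency bands, whereas the stand-in below uses broadband variance, and all data here are synthetic.

```python
import random

def power(x):
    """Mean squared deviation: a broadband stand-in for the band-limited
    power the actual analysis would compute per frequency band."""
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / len(x)

def map_electrodes(recordings, move_windows, rest_windows):
    """Score each electrode by its movement-vs-rest power ratio.
    recordings: {name: samples}; windows: (start, stop) sample indices."""
    scores = {}
    for name, sig in recordings.items():
        p_move = sum(power(sig[a:b]) for a, b in move_windows) / len(move_windows)
        p_rest = sum(power(sig[a:b]) for a, b in rest_windows) / len(rest_windows)
        scores[name] = p_move / p_rest
    return scores

# Synthetic demo: one "motor" electrode whose amplitude rises during movement.
rng = random.Random(1)
quiet = lambda n: [rng.gauss(0, 1) for _ in range(n)]
active = lambda n: [rng.gauss(0, 3) for _ in range(n)]
sig_motor = quiet(100) + active(100) + quiet(100)
sig_other = quiet(300)
scores = map_electrodes({"motor": sig_motor, "other": sig_other},
                        move_windows=[(100, 200)],
                        rest_windows=[(0, 100), (200, 300)])
print(scores)  # "motor" scores far above 1; "other" stays near 1
```

Electrodes whose score stands well above 1 would be flagged as movement-related, which is the map the video-annotated recordings made possible without any prescribed task.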
The team also hopes to gain new insights into how the brain controls movements, as their method allows them to explore all manner of behaviors and is no longer limited to experimental conditions. Last but not least, the scientists expect this new way of analyzing brain signals to contribute to the development of brain-machine interfaces suitable for daily use.

Weird: Nuclear Bomb Tests Reveal Adults Grow New Brain Cells
Aboveground nuclear bomb testing in the 1950s and 1960s inadvertently gave modern scientists a way to prove the adult brain regularly creates new neurons, research reveals.
Researchers used to believe that the brain changed little once it finished maturing. That view is now considered out of date, as studies have revealed how changeable — or plastic — the adult brain can be.
Much of this plasticity is related to the brain’s organization; brain cells can alter their connections and communications with other brain cells. What has been less clear is whether, and to what extent, the human brain grows brand-new neurons in adulthood.
"There was a lot in the literature showing there was neurogenesis in rodents and every animal studied," said study researcher Kirsty Spalding, a biologist at the Karolinska Institute in Sweden, "But there was very little evidence of whether this happens in humans."
Tantalizing clues
Scientists had reason to believe it does. In adult mice, the hippocampus, a structure deep in the brain involved in memory and navigation, turns over cells all the time. Some of the biological markers linked to this turnover are seen in the human hippocampus. But the only direct evidence of new brain cells forming in the region came from a 1998 study in which researchers looked at the brains of five people who had been injected with a compound called BrdU that cells take up into their DNA. (The compound was once used in experimental cancer studies, but is no longer used for safety reasons.)
The BrdU study revealed that neurons in the hippocampuses of the participants contained the compound in their DNA, indicating these brain cells had formed after the injections. The oldest person in the study was 72, suggesting new neuron creation, known as neurogenesis, continues well into old age.
The 1998 study was the only direct evidence of such neurogenesis in the human hippocampus, however. Spalding and her colleagues wanted to change that. Ten years ago, they began a project to track the age of neurons in the human brain using an unusual tool: spare molecules left over from Cold War-era nuclear bomb tests.
Learning to love the bomb
Between 1945 and 1962, the United States conducted hundreds of aboveground nuclear bomb tests. These tests largely stopped with the Limited Test Ban Treaty of 1963, but their effects remained in the atmosphere. The neutrons sent flying by the bombs reacted with nitrogen in the atmosphere, creating a spike in carbon 14, an isotope (or variation) of carbon.
This carbon 14, in turn, did what carbon in the atmosphere does. It combined with oxygen to form carbon dioxide, and was then taken in by plants, which use carbon dioxide in photosynthesis. Humans ate some of these plants, along with some of the animals that also ate these plants, and the carbon 14 inside ended up in their bodies.
When a cell divides, this carbon 14 is built into the DNA of the newly formed cells. The atmospheric level of carbon 14 has changed at a known rate since the bomb tests, so by matching the concentration locked into a cell’s DNA against the atmospheric record, scientists can pinpoint when the cell was born.
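The dating step can be sketched as a lookup against the atmospheric record. The curve below is a cartoon (baseline 1.0, a rise to roughly double by the 1963 peak, then a smooth decline), not real calibration data, and the search assumes the cell was born on the post-peak declining branch.

```python
import math

def atmospheric_c14(year: float) -> float:
    """Cartoon of the atmospheric 14C level relative to the pre-bomb
    baseline (1.0). Real analyses use measured atmospheric records."""
    if year < 1955:
        return 1.0
    if year <= 1963:
        return 1.0 + (year - 1955) / 8.0               # rise to the 1963 peak
    return 1.0 + math.exp(-(year - 1963) / 16.0)        # post-treaty decline

def birth_year(dna_ratio: float, search=range(1964, 2014)) -> int:
    """Date a cell born after the 1963 peak: pick the year whose
    atmospheric level best matches the ratio locked into its DNA."""
    return min(search, key=lambda y: abs(atmospheric_c14(y) - dna_ratio))

print(birth_year(1.37))  # 1979 on this cartoon curve
```

The real work required measuring minute carbon-14 quantities in small tissue samples and comparing them against the true atmospheric curve, but the matching logic is the same.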
Over the past decade, Spalding and her colleagues have used the technique in a variety of cells, including fat cells, refining it along the way until it became sensitive enough to measure tiny amounts of carbon 14 in small hippocampus samples. The researchers collected samples, with family permission, from autopsies in Sweden.
They found the tantalizing 1998 evidence was correct: Human hippocampuses do grow new neurons. In fact, about a third of the brain region is subject to cell turnover, with about 700 new neurons being formed each day in each hippocampus (humans have two, a mirror-image set on either side of the brain). Hippocampus neurons die each day, too, keeping the overall number more or less in balance, with some slow loss of cells with aging, Spalding said.
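The reported rate invites a quick sanity check. Taking the article's 700 new neurons per day per hippocampus, and assuming roughly 15 million dentate granule cells per hippocampus (an illustrative figure, not from this article), annual turnover comes out to a couple of percent:

```python
# Back-of-envelope on the reported numbers; the granule-cell count is an
# assumed round figure for illustration, not a value from the article.
new_per_day = 700
granule_cells = 15_000_000
annual_turnover = new_per_day * 365 / granule_cells
print(f"annual turnover ≈ {annual_turnover:.1%}")  # roughly 1.7%
```

A rate that small explains why direct evidence was so hard to come by, and why a tracer as sensitive as the bomb pulse was needed.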
This turnover occurs at a ridge in the hippocampus known as the dentate gyrus, a spot known to contribute to the formation of new memories. Researchers aren’t sure what the function of this constant renewal is, but it could relate to allowing the brain to cope with novel situations, Spalding told LiveScience.
"Neurogenesis gives a particular kind of plasticity to the brain, a cognitive flexibility," she said.
Spalding and her colleagues had used the same techniques in other regions of the brain, including the cortex, the cerebellum and the olfactory bulb, and found no evidence of newborn neurons being integrated into those areas. The researchers now plan to study whether there are any links between neurogenesis and psychiatric conditions such as depression.
The new findings are detailed in the journal Cell.

Weird: Nuclear Bomb Tests Reveal Adults Grow New Brain Cells

Aboveground nuclear bomb testing in the 1950s and 1960s inadvertently gave modern scientists a way to prove the adult brain regularly creates new neurons, research reveals.

Researchers used to believe that the brain changed little once it finished maturing. That view is now considered out of date, as studies have revealed how changeable — or plastic — the adult brain can be.

Much of this plasticity is related to the brain’s organization; brain cells can alter their connections and communications with other brain cells. What has been less clear is whether, and to what extent, the human brain grows brand-new neurons in adulthood.

"There was a lot in the literature showing there was neurogenesis in rodents and every animal studied," said study researcher Kirsty Spalding, a biologist at the Karolinska Institute in Sweden. "But there was very little evidence of whether this happens in humans."

Tantalizing clues

Scientists had reason to believe it does. In adult mice, the hippocampus, a structure deep in the brain involved in memory and navigation, turns over cells all the time. Some of the biological markers linked to this turnover are seen in the human hippocampus. But the only direct evidence of new brain cells forming in the region came from a 1998 study in which researchers looked at the brains of five people who had been injected with a compound called BrdU that cells take up into their DNA. (The compound was once used in experimental cancer studies, but is no longer used for safety reasons.)

The BrdU study revealed that neurons in the hippocampuses of the participants contained the compound in their DNA, indicating these brain cells had formed after the injections. The oldest person in the study was 72, suggesting new neuron creation, known as neurogenesis, continues well into old age.

The 1998 study was the only direct evidence of such neurogenesis in the human hippocampus, however. Spalding and her colleagues wanted to change that. Ten years ago, they began a project to track the age of neurons in the human brain using an unusual tool: spare molecules left over from Cold War-era nuclear bomb tests.

Learning to love the bomb

Between 1945 and 1962, the United States conducted hundreds of aboveground nuclear bomb tests. These tests largely stopped with the Limited Test Ban Treaty of 1963, but their effects remained in the atmosphere. The neutrons sent flying by the bombs reacted with nitrogen in the atmosphere, creating a spike in carbon 14, an isotope (or variation) of carbon.

This carbon 14, in turn, did what carbon in the atmosphere does. It combined with oxygen to form carbon dioxide, and was then taken in by plants, which use carbon dioxide in photosynthesis. Humans ate some of these plants, along with some of the animals that also ate these plants, and the carbon 14 inside ended up in their bodies.

When a cell divides, it incorporates this carbon 14 into the DNA of the newly forming cells. Atmospheric carbon 14 levels have been falling at a well-documented rate since the tests stopped, so scientists can match the carbon 14 concentration in a cell's DNA against the atmospheric record to pinpoint when the cell was born.
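As a rough illustration of the dating logic, the falling atmospheric carbon 14 curve can be treated as a lookup table: a cell's measured carbon 14 excess is matched against tabulated atmospheric levels to estimate its birth year. The curve values below are invented for illustration; real analyses use measured atmospheric records.

```python
# Simplified post-1963 atmospheric carbon 14 levels, expressed as the
# fraction of excess above the pre-bomb baseline. These values are
# invented for illustration, not taken from measurement records.
YEARS = [1963, 1970, 1980, 1990, 2000, 2010]
C14_EXCESS = [1.00, 0.55, 0.30, 0.16, 0.09, 0.05]

def estimate_birth_year(measured_excess):
    """Estimate a cell's birth year by matching its DNA carbon 14 excess
    to the monotonically falling atmospheric curve (linear interpolation)."""
    if not (C14_EXCESS[-1] <= measured_excess <= C14_EXCESS[0]):
        raise ValueError("measurement outside tabulated range")
    for (y0, c0), (y1, c1) in zip(
        zip(YEARS, C14_EXCESS), zip(YEARS[1:], C14_EXCESS[1:])
    ):
        if c1 <= measured_excess <= c0:
            # Interpolate within the bracketing interval of the curve.
            frac = (c0 - measured_excess) / (c0 - c1)
            return y0 + frac * (y1 - y0)

print(estimate_birth_year(0.425))  # halfway between 0.55 and 0.30 -> 1975.0
```

The same idea scales to the real analysis: a finer atmospheric curve and corrections for diet and tissue turnover, but still fundamentally a match against the bomb pulse.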

Over the past decade, Spalding and her colleagues have used the technique in a variety of cells, including fat cells, refining it along the way until it became sensitive enough to measure tiny amounts of carbon 14 in small hippocampus samples. The researchers collected samples, with family permission, from autopsies in Sweden.

They found the tantalizing 1998 evidence was correct: Human hippocampuses do grow new neurons. In fact, about a third of the brain region is subject to cell turnover, with about 700 new neurons being formed each day in each hippocampus (humans have two, a mirror-image set on either side of the brain). Hippocampus neurons die each day, too, keeping the overall number more or less in balance, with some slow loss of cells with aging, Spalding said.

This turnover occurs at a ridge in the hippocampus known as the dentate gyrus, a spot known to contribute to the formation of new memories. Researchers aren’t sure what the function of this constant renewal is, but it could relate to allowing the brain to cope with novel situations, Spalding told LiveScience.

"Neurogenesis gives a particular kind of plasticity to the brain, a cognitive flexibility," she said.

Spalding and her colleagues had used the same techniques in other regions of the brain, including the cortex, the cerebellum and the olfactory bulb, and found no evidence of newborn neurons being integrated into those areas. The researchers now plan to study whether there are any links between neurogenesis and psychiatric conditions such as depression.

The new findings are detailed in the journal Cell.

Filed under adult brain neurogenesis cognitive function neurons nuclear bomb hippocampus memory neuroscience science

129 notes

Rapid, Irregular Heartbeat May Be Linked to Problems with Memory and Thinking 
People who develop a type of irregular heartbeat common in old age called atrial fibrillation may also be more likely to develop problems with memory and thinking, according to new research published in the June 5, 2013, online issue of Neurology®, the medical journal of the American Academy of Neurology.
“Problems with memory and thinking are common for people as they get older. Our study shows that on average, problems with memory and thinking may start earlier or get worse more quickly in people who have atrial fibrillation,” said study author Evan L. Thacker, PhD, of the University of Alabama at Birmingham. “This means that heart health is an important factor related to brain health.”
The study involved people age 65 and older from four communities in the United States who were enrolled in the Cardiovascular Health Study. Participants did not have a history of atrial fibrillation or stroke at the start of the study. They were followed for an average of seven years, and received a 100-point memory and thinking test every year. People who had a stroke were not included in this analysis after the stroke. Of the 5,150 participants, 552, or about 11 percent, developed atrial fibrillation during the study.
The study found that people with atrial fibrillation were more likely to experience lower memory and thinking scores at earlier ages than people with no history of atrial fibrillation. For example, from age 80 to age 85 the average score on the 100-point test went down by about 6 points for people without atrial fibrillation, but it went down by about 10 points for people with atrial fibrillation.
For participants ages 75 and older, the average rate of decline was about three to four points faster per five years of aging with atrial fibrillation compared to those without the condition.
“This suggests that on average, people with atrial fibrillation may be more likely to develop cognitive impairment or dementia at earlier ages than people with no history of atrial fibrillation,” Thacker said.
Thacker noted that scores below 78 points on the 100-point test are suggestive of dementia. People without atrial fibrillation in the study were predicted on average to score below 78 points at age 87, while people with atrial fibrillation were predicted to score below 78 points at age 85, two years earlier.
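The threshold arithmetic is easy to check with a toy linear-decline model. The starting scores below are hypothetical, chosen only so the trajectories land near the ages reported in the study; the decline rates come from the article's 5-year figures.

```python
def age_crossing_threshold(score_at_80, points_lost_per_5yr, threshold=78.0):
    """Age at which a linearly declining test score first drops below the
    threshold, given the score at age 80 and the 5-year decline rate."""
    rate_per_year = points_lost_per_5yr / 5.0
    return 80 + (score_at_80 - threshold) / rate_per_year

# Hypothetical starting scores chosen so the trajectories roughly
# reproduce the crossing ages reported in the study.
no_af = age_crossing_threshold(score_at_80=86.4, points_lost_per_5yr=6)     # ~87
with_af = age_crossing_threshold(score_at_80=88.0, points_lost_per_5yr=10)  # 85
print(round(no_af), round(with_af))
```

The point of the sketch is that even a modestly faster decline rate pulls the dementia-suggestive threshold forward by years.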
“If there is indeed a link between atrial fibrillation and memory and thinking decline, the next steps are to learn why that decline happens and how we can prevent that decline,” said Thacker.

Filed under atrial fibrillation cognitive decline cognition irregular heartbeat medicine neuroscience science

47 notes

Targeting an aspect of Down syndrome
University of Michigan researchers have determined how a gene that is known to be defective in Down syndrome is regulated and how its dysregulation may lead to neurological defects, providing insights into potential therapeutic approaches to an aspect of the syndrome.
Normally, nerve cells called neurons undergo an intense period of extending and branching of neuronal protrusions around the time of birth. During this period, the neurons produce the proteins of the gene called Down syndrome cell-adhesion molecule, or Dscam, at high levels. After this phase, the growth and the levels of protein taper off.
However, in the brains of patients with Down syndrome, epilepsy and several other neurological disorders, the amount of Dscam remains high. The impact of the elevated Dscam amount on how neurons develop is unknown.
Bing Ye, a faculty member at U-M’s Life Sciences Institute, found that in the fruit fly Drosophila, the amount of Dscam proteins in a neuron determines the size to which a neuron extends its protrusions before it forms connections with other nerve cells. An overproduction of Dscam proteins leads to abnormally large neuronal protrusions.
Ye also identified two molecular pathways that converge to regulate the abundance of Dscam. One, dual leucine zipper kinase (DLK), which is involved in nerve regeneration, promotes the synthesis of Dscam proteins. Another, fragile X mental retardation protein (FMRP), which causes fragile X syndrome when defective, represses Dscam protein synthesis. Because humans share these genes with Drosophila, the DLK-FMRP-Dscam relationship presents a possible target for therapeutic intervention, Ye said.
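A toy push-pull model makes the regulatory logic concrete: treat DLK as scaling Dscam synthesis up and FMRP as scaling it down, with first-order degradation. All rate constants here are invented for illustration, not taken from the paper.

```python
def dscam_steady_state(dlk, fmrp, base_synthesis=1.0, degradation=0.1):
    """Toy steady-state Dscam level under opposing regulation: DLK scales
    synthesis up, FMRP scales it down, and degradation is first-order.
    At steady state, synthesis rate = degradation rate * level."""
    synthesis = base_synthesis * (1.0 + dlk) / (1.0 + fmrp)
    return synthesis / degradation

balanced = dscam_steady_state(dlk=1.0, fmrp=1.0)    # opposing inputs cancel
fmrp_lost = dscam_steady_state(dlk=1.0, fmrp=0.0)   # repression removed: level doubles
print(balanced, fmrp_lost)
```

The sketch mirrors the paper's qualitative finding: losing FMRP repression (as in fragile X) or boosting DLK both push Dscam abundance up, which is the axis a therapy would try to restore.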
Many genes are involved in neurological disorders like Down syndrome, and how molecular defects cause the disease is complex.
"But because of the important roles of Dscam in the development of neurons, its related defect is very likely to be an aspect of Down syndrome and it may be an aspect of the syndrome that can be treated," said Ye, an assistant professor in the Department of Cell and Developmental Biology at the U-M Medical School.
Ye’s next step is to test the effects of overexpression of Dscam in mice to see how it changes the development of the nervous system and the behavior of the animal.
Down syndrome occurs in about one in 830 newborns; an estimated 250,000 people in the U.S. have the condition, according to the National Library of Medicine’s Genetics Home Reference.

Filed under fruit flies nerve cells nerve regeneration down syndrome dscam proteins fragile X syndrome neuroscience science

58 notes

These scientists are ‘itching’ to help you stop scratching



Itch and scratch, itch and scratch.  It’s not the most serious physical problem in our lives, but it is common and it is very annoying. Now, researchers at the Hebrew University of Jerusalem and in Boston have come up with new findings that can stop the itching through silencing the neurons that transmit itch-generating stimuli.
The research was a collaborative effort by a group led by Dr. Alex Binshtok at the Hebrew University’s Department of Medical Neurobiology at the Institute of Medical Research Israel-Canada, and the Edmond & Lily Safra Center for Brain Sciences; along with Dr. Clifford Woolf’s group in the Boston Children’s Hospital and Harvard Medical School.
The study demonstrated the presence of functionally distinct sets of neurons that detect and transmit itch-generating stimuli. The researchers were further able to show that they could selectively target and silence those itch-generating neurons while active. These results provide a basis for developing novel therapeutic approaches for the selective treatment of itch not induced by histamine (non-histaminergic itch), a previously unmet clinical need that includes dry skin itch and allergic dermatitis.
(Histaminergic itch is brought on when histamine triggers an inflammatory immune response to foreign agents, such as occurs, for example, in hay fever.)
The findings of the Israeli-US researchers were published in the journal Nature Neuroscience. In addition to the senior researchers, student major contributors to the project were Sagi Gudes and Felix Blasl from the Hebrew University; and David Roberson and Jared Sprague from Harvard Medical School.
Itch is a complex, unpleasant, cutaneous sensation that in some respects resembles pain, yet is different in terms of its intrinsic sensory quality and the urge to scratch. Although some types of itch like urticaria (hives) could be effectively treated with anti-histaminergic agents, itch accompanying most chronic itch-inducing diseases, including atopic dermatitis (eczema), allergic itch and dry skin itch, is not predominantly induced by histamine. An understanding of the molecular and cellular mechanisms underlying the sensation of itch, therefore, is essential for the development of effective and selective treatment of itch, which in some cases could become a devastating condition, say the researchers.
The researchers’ findings suggest that primary itch-generating neurons carrying messages toward the central nervous system encode functionally distinct histaminergic and non-histaminergic itch pathways that can be selectively blocked. This is the first time this has been demonstrated, and it means that it is possible to block itch signals in the neurons that mediate non-histamine itch.
These findings have great clinical importance, since they could be translated into novel, selective and effective therapies for previously largely untreatable dry skin itch and allergic dermatitis itch.

Filed under itch sensory neurons histamine neuroscience science

89 notes

Study Expands Concerns About Anesthesia’s Impact on the Brain
As pediatric specialists become increasingly aware that surgical anesthesia may have lasting effects on the developing brains of young children, new research suggests the threat may also apply to adult brains.
Researchers from Cincinnati Children’s Hospital Medical Center report June 5 in the Annals of Neurology that testing in laboratory mice shows anesthesia’s neurotoxic effects depend on the age of brain neurons – not the age of the animal undergoing anesthesia, as once thought.
Although more research is needed to confirm the study’s relevance to humans, the study suggests possible health implications for millions of children and adults who undergo surgical anesthesia annually, according to Andreas Loepke, MD, PhD, a physician and researcher in the Department of Anesthesiology.
“We demonstrate that anesthesia-induced cell death in neurons is not limited to the immature brain, as previously believed,” said Loepke. “Instead, vulnerability seems to target neurons of a certain age and maturational stage. This finding brings us a step closer to understanding the phenomenon’s underlying mechanism.”
New neurons are generated abundantly in most regions of the very young brain, which explains why previous research has focused on that developmental stage. In the mature brain, neuron formation slows considerably but continues into later life in the dentate gyrus and olfactory bulb.
The dentate gyrus, which helps control learning and memory, is the region Loepke and his research colleagues paid particular attention to in their study. Also collaborating were researchers from the University of Cincinnati College of Medicine and the Children’s Hospital of Fudan University, Shanghai, China.
Researchers exposed newborn, juvenile and young adult mice to a widely used anesthetic called isoflurane in doses approximating those used in surgical practice. Newborn mice exhibited widespread neuronal loss in forebrain structures – confirming previous research – with no significant impact on the dentate gyrus. However, the effect in juvenile mice was reversed, with minimal neuronal impact in the forebrain regions and significant cell death in the dentate gyrus.
The team then performed extensive studies to discover that age and maturational stage of the affected neurons were the defining characteristics for vulnerability to anesthesia-induced neuronal cell death. The researchers observed similar results in young adult mice as well.
Research over the past 10 years has made it increasingly clear that commonly used anesthetics increase brain cell death in developing animals, raising concerns from the Food and Drug Administration, clinicians, neuroscientists and the public. Several follow-up studies in children and adults who have undergone surgical anesthesia have also shown links to learning and memory impairment.
Cautioning against immediate application of the current study’s findings to children and adults undergoing anesthesia, Loepke said his research team is trying to learn enough about anesthesia’s impact on brain chemistry to develop protective therapeutic strategies, in case they are needed. To this end, their next step is to identify specific molecular processes triggered by anesthesia that lead to brain cell death.
“Surgery is often vital to save lives or maintain quality of life and usually cannot be performed without general anesthesia,” Loepke said. “Physicians should carefully discuss with patients, parents and caretakers the risks and benefits of procedures requiring anesthetics, as well as the known risks of not treating certain conditions.”
Loepke is also collaborating with researchers from the Pediatric Neuroimaging Research Consortium at Cincinnati Children’s Hospital Medical Center to examine anesthesia’s impact on children’s brains using non-invasive magnetic resonance imaging (MRI) technology.

Filed under anesthesia neurons cell death apoptosis dentate gyrus neurology neuroscience science

115 notes

Neurochemical Traffic Signals May Open New Avenues for the Treatment of Schizophrenia
Researchers at Boston University School of Medicine (BUSM) have uncovered important clues about a biochemical pathway in the brain that may one day expand treatment options for schizophrenia. The study, published online in the journal Molecular Pharmacology, was led by faculty within the department of pharmacology and experimental therapeutics at BUSM.
Patients with schizophrenia suffer from a life-long condition that can produce delusions, disordered thinking, and breaks with reality. A number of treatments are available for schizophrenia, but many patients do not respond to these therapies or experience side effects that limit their use.
This research focused on key components of the brain known as NMDA receptors. These receptors are located on nerve cells in the brain and serve as biochemical gates that allow calcium ions (electrically charged calcium atoms) to enter the cell when a neurotransmitter, such as glutamate, binds to the receptor. Proper activation of these receptors is critical for sensory perception, memory and learning, including the transfer of short-term memory into long-term storage. Patients with schizophrenia have poorly functioning or “hypoactive” NMDA receptors, suggesting the possibility of treatment with drugs that positively affect these receptors. Currently the only way to enhance NMDA receptor function is through the use of agents called agonists that directly bind to the receptor on the outer surface of the cell, opening the gates to calcium ions outside the cell.
In this study, the researchers discovered a novel “non-canonical” pathway in which NMDA receptors residing inside the cell are stimulated by a neuroactive steroid to migrate to the cell surface (a process known as trafficking), thus increasing the number of receptors available for glutamate activation. The researchers treated neural cells from the cerebral cortex with the steroid pregnenolone sulfate (PregS) and found that the number of working NMDA receptors on the cell surface increased by 60 to 100 percent within 10 minutes. The exact mechanism is not completely clear, but it appears that PregS increases calcium ions within the cell, which in turn acts as a green-light signal for more frequent trafficking of NMDA receptors to the cell surface.
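As a back-of-the-envelope sketch, the reported rise in surface receptors is consistent with simple first-order trafficking kinetics approaching a new steady state. The time constant and fold change below are illustrative assumptions, not measured values from the study.

```python
import math

def surface_receptors(t_min, n0=100.0, fold_increase=1.8, tau_min=4.0):
    """Surface NMDA receptor count after PregS exposure, modeled as a
    first-order approach to a new steady state. Parameter values
    (1.8-fold target, 4-minute time constant) are illustrative guesses."""
    n_target = n0 * fold_increase
    return n_target - (n_target - n0) * math.exp(-t_min / tau_min)

# With these assumed parameters, the modeled count at 10 minutes has
# risen by roughly 70-75 percent, inside the 60-100 percent range
# reported in the study.
print(surface_receptors(10.0))
```

The model is deliberately minimal: one pool of internal receptors, one trafficking rate. Its only job is to show that a fast time constant of a few minutes is enough to produce the observed jump within 10 minutes.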
Although still in the early stages, further research in this area may be instrumental in the development of treatments not only for schizophrenia, but also for other conditions associated with malfunctioning NMDA receptors, such as age-related decreases in memory and learning ability.

Neurochemical Traffic Signals May Open New Avenues for the Treatment of Schizophrenia

Researchers at Boston University School of Medicine (BUSM) have uncovered important clues about a biochemical pathway in the brain that may one day expand treatment options for schizophrenia. The study, published online in the journal Molecular Pharmacology, was led by faculty within the department of pharmacology and experimental therapeutics at BUSM.

Patients with schizophrenia suffer from a life-long condition that can produce delusions, disordered thinking, and breaks with reality. A number of treatments are available for schizophrenia, but many patients do not respond to these therapies or experience side effects that limit their use.

This research focused on key components of the brain known as NMDA receptors. These receptors are located on nerve cells in the brain and serve as biochemical gates that allow calcium ions (positively charged particles) to enter the cell when a neurotransmitter, such as glutamate, binds to the receptor. Proper activation of these receptors is critical for sensory perception, memory, and learning, including the transfer of short-term memory into long-term storage. Patients with schizophrenia have poorly functioning or “hypoactive” NMDA receptors, suggesting the possibility of treatment with drugs that positively affect these receptors. Currently, the only way to enhance NMDA receptor function is through the use of agents called agonists that directly bind to the receptor on the outer surface of the cell, opening the gates to calcium ions outside the cell.

In this study, the researchers discovered a novel “non-canonical” pathway in which NMDA receptors residing inside the cell are stimulated by a neuroactive steroid to migrate to the cell surface (a process known as trafficking), thus increasing the number of receptors available for glutamate activation. The researchers treated neural cells from the cerebral cortex with the neurosteroid pregnenolone sulfate (PregS) and found that the number of working NMDA receptors on the cell surface increased by 60 to 100 percent within 10 minutes. The exact mechanism by which this occurs is not completely clear, but it appears that PregS increases calcium ions within the cell, which in turn acts as a green-light signal for more frequent trafficking of NMDA receptors to the cell surface.

Although still in the early stages, further research in this area may be instrumental in the development of treatments not only for schizophrenia, but also for other conditions associated with malfunctioning NMDA receptors, such as age-related decreases in memory and learning ability.

Filed under schizophrenia NMDA receptors nerve cells calcium ions glutamate trafficking neuroscience science

156 notes

Pioneering Study Demonstrates Benefit of Imaging Technique in Identifying Mental Illness
MRI may be an effective way to diagnose mental illnesses such as bipolar disorder, according to experts from the Icahn School of Medicine at Mount Sinai. In a landmark study using advanced techniques, the researchers were able to correctly distinguish bipolar patients from healthy individuals based on their brain scans alone. The data are published in the journal Psychological Medicine.
Currently, most mental illnesses are diagnosed based on symptoms only, creating an urgent need for new approaches to diagnosis. In bipolar disorder, there may be a significant delay in diagnosis due to the complex clinical presentation of the illness. In this study, Sophia Frangou, MD, Professor of Psychiatry and Chief of the Psychosis Research Program at the Icahn School of Medicine at Mount Sinai, teamed up with Andy Simmons, MD, of King’s College London and Janaina Mourao-Miranda, MD, of University College London, to explore whether brain imaging could help correctly identify patients with bipolar disorder.
“Bipolar disorder affects patients’ ability to regulate their emotions successfully, which puts them at great disadvantage in their lives,” said Dr. Frangou. “The situation is made worse by unacceptably long delays, sometimes of up to 10 years, in making the correct diagnosis. Bipolar disorder may be easily misdiagnosed as other disorders, such as depression or schizophrenia. This is why bipolar disorder ranks among the top ten disorders causing significant disability worldwide.”
Dr. Frangou and her team used MRI to scan the brains of people with bipolar disorder and of healthy individuals. Using advanced computational models, they correctly separated people with bipolar disorder from healthy individuals with 73 percent accuracy based on the brain imaging scans alone. They replicated the finding in a separate group of patients and healthy individuals, reaching a 72 percent accuracy rate.
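The study doesn’t describe its computational models in detail, but the general logic of this kind of work can be sketched with a toy example: represent each subject’s scan as a vector of numbers, fit a classifier on some subjects, and measure accuracy only on subjects the model never saw. Everything below is illustrative and hypothetical — the nearest-centroid classifier, the synthetic “scans,” and the size of the group difference are assumptions for the sketch, not the study’s actual method or data.

```python
import numpy as np

rng = np.random.default_rng(0)
n_per_group, n_features = 50, 50  # 50 subjects per group, 50 imaging features each

# Synthetic "scans": patients get a small mean shift on 10 of the features
healthy = rng.normal(0.0, 1.0, (n_per_group, n_features))
patients = rng.normal(0.0, 1.0, (n_per_group, n_features))
patients[:, :10] += 0.8  # hypothetical group difference

X = np.vstack([healthy, patients])
y = np.array([0] * n_per_group + [1] * n_per_group)  # 0 = healthy, 1 = patient

# Split subjects: even-indexed subjects train the model, odd-indexed ones test it
train, test = np.arange(0, 100, 2), np.arange(1, 100, 2)

# Nearest-centroid classifier: assign each held-out scan to the closer group mean
c0 = X[train][y[train] == 0].mean(axis=0)
c1 = X[train][y[train] == 1].mean(axis=0)
pred = (np.linalg.norm(X[test] - c1, axis=1) <
        np.linalg.norm(X[test] - c0, axis=1)).astype(int)

accuracy = (pred == y[test]).mean()
print(f"held-out accuracy: {accuracy:.2f}")
```

A more powerful model would follow the same train/held-out-test logic; the key safeguard, mirrored by the study’s separate replication sample, is that accuracy is always reported on subjects the model was never trained on.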
Dr. Simmons added, “The level of accuracy we achieved is comparable to that of many other tests used in medicine. Additionally, brain scanning is very acceptable to patients as most people consider it a routine diagnostic test.”
“This approach does not undermine the importance of rigorous clinical assessment and the importance of building relationships with patients but provides biological justification for the type of diagnosis made,” said Dr. Frangou. “However, diagnostic imaging for psychiatry is still under investigation and not ready for widespread use. Nonetheless, our results together with those from other labs are a harbinger of a major shift in the way we approach diagnosis in psychiatry.”

Filed under bipolar depression bipolar disorder neuroimaging MRI mental health psychology neuroscience science
