Neuroscience

Articles and news from the latest research reports.

68 notes

Research opens up possibility of therapies to restore blood-brain barrier

Research led by Queen Mary, University of London, has opened up the possibility that drug therapies may one day be able to restore the integrity of the blood-brain barrier, potentially slowing or even reversing the progression of diseases like multiple sclerosis (MS). The study, funded by the Wellcome Trust, is published in Proceedings of the National Academy of Sciences.

The blood-brain barrier (BBB) is a layer of cells, including endothelial cells, lining the blood vessels in the brain and spinal cord. These cells act as a barrier, preventing certain molecules and cells, including immune cells and viruses, from passing out of the bloodstream into the central nervous system (the brain and spinal cord).

In a number of neurodegenerative brain diseases, including MS, the BBB is compromised, allowing inappropriate cells to pass into the brain with devastating consequences.

In this study the researchers identified a specific protein – known as Annexin A1 (ANXA1) – as integral to maintaining the BBB in the brain. The authors initially found that mice bred to lack this protein showed decreased integrity of the BBB compared to controls.

Taking this finding, they then investigated the potential role of ANXA1 in conditions which involve progressive breakdown of the BBB, including MS and Parkinson’s disease, by examining post-mortem human brain tissue samples. ANXA1 was present in the cells of samples from individuals who did not have a neurological disease and also in samples from patients who had died with Parkinson’s disease. However, it was not detectable in the endothelial cells in samples from patients who had died with MS.

Crucially, the researchers found that treating brain endothelial cells in vitro with human recombinant ANXA1 restored the key cellular features needed to reinstate the integrity of the BBB. The same was seen in the ANXA1 knockout mice, where administering the protein reversed the increased permeability of the BBB within 24 hours.

Dr Egle Solito, from Barts and The London School of Medicine and Dentistry, part of Queen Mary, who co-ordinated the study said: “Our findings suggest this protein plays a key role in maintaining a functioning BBB and, more importantly, has the potential to rescue defects in the BBB. We now need to carry on our research to see how much this molecule may be exploited for therapeutic uses in conditions such as MS, or as a biomarker to help in early diagnosis.”

(Source: qmul.ac.uk)

Filed under brain cells CNS blood–brain barrier neurological diseases science

121 notes

Electric stimulation of brain releases powerful, opiate-like painkiller
Researchers used electricity on certain regions in the brain of a patient with chronic, severe facial pain to release an opiate-like substance that’s considered one of the body’s most powerful painkillers.
The findings expand on previous work done at the University of Michigan, Harvard University and the City University of New York, in which researchers delivered electricity through sensors on the skulls of chronic migraine patients and found a decrease in the intensity and pain of their headache attacks. At the time, however, the researchers couldn’t completely explain how or why it worked.
The current findings help explain what happens in the brain that decreases pain during the brief sessions of electricity, says Alexandre DaSilva, the senior researcher in the study from the University of Michigan School of Dentistry. Other study authors include DaSilva’s PhD student, Marcos DosSantos, and also Dr. Jon-Kar Zubieta from the Molecular and Behavioral Neuroscience Institute.
In their current study, DaSilva and colleagues intravenously administered a radiotracer that reached important brain areas in a patient with trigeminal neuropathic pain (TNP), a type of chronic, severe facial pain. They applied the electrodes and electrically stimulated the skull right above the motor cortex of the patient for 20 minutes during a PET scan (positron emission tomography). The stimulation is called transcranial direct current stimulation (tDCS).
The radiotracer was specifically designed to measure, indirectly, the local brain release of mu-opioid, a natural substance that alters pain perception. For an opiate to function, it needs to bind to the mu-opioid receptor (the study assessed levels of this receptor).
"This is arguably the main resource in the brain to reduce pain," DaSilva said. "We’re stimulating the release of our (body’s) own resources to provide analgesia. Instead of giving more pharmaceutical opiates, we are directly targeting and activating the same areas in the brain on which they work. (Therefore), we can increase the power of this pain-killing effect and even decrease the use of opiates in general, and consequently avoid their side effects, including addiction."
Most pharmaceutical opiates, especially morphine, target the mu-opioid receptors in the brain, DaSilva says.
The dose of electricity is very small, he says. Consider that electroconvulsive therapy (ECT), which is used to treat depression and other psychiatric conditions, uses amperage in the brain ranging from 200 to 1600 milliamperes (mA). The tDCS protocol used in DaSilva’s study delivered 2 mA, considerably lower than ECT.
Just one session immediately improved the patient’s threshold for cold pain by 36 percent, but not the patient’s clinical TNP/facial pain. This suggests that repetitive electrical stimulation over several sessions is required to have a lasting effect on clinical pain, as shown in their previous migraine study, DaSilva says.
The manuscript appears in the journal Frontiers in Psychiatry. The group just completed another study with more subjects, and the initial results seem to confirm the findings above, but further analysis is necessary.
Next, researchers will investigate long-term effects of electric stimulation on the brain and find specific targets in the brain that may be more effective depending on the pain condition and patients’ status. For example, the frontal areas may be more helpful for chronic pain patients with depression symptoms.

Filed under brain pain facial pain migraine electricity painkiller neuroscience science

118 notes

Newborn memories of the “oohs” and “ahs” heard in the womb
Newborns are much more attuned to the sounds of their native language than previously thought. In fact, these linguistic whizzes can pick up on distinctive sounds of their mother tongue while in utero, a new study has concluded.
Research led by Christine Moon, a professor of psychology at Pacific Lutheran University, shows that infants only hours old showed marked interest in the vowels of a language that was not their mother tongue.


"We have known for over 30 years that we begin learning prenatally about voices by listening to the sound of our mother talking," Moon said. "This is the first study that shows we learn about the particular speech sounds of our mother’s language before we are born."
Before the study, the general consensus was that infants learned about the small parts of speech, the vowels and the consonants, postnatally, Moon added. “This study moves the measurable result of experience with individual speech sounds from six months of age to before birth,” she said. The findings were published in Acta Paediatrica.

Filed under babies language native language learning womb psychology neuroscience science

78 notes

Risk Genes for Alzheimer’s and Mental Illness Linked to Brain Changes at Birth
Some brain changes that are found in adults with common gene variants linked to disorders such as Alzheimer’s disease, schizophrenia, and autism can also be seen in the brain scans of newborns.
“These results suggest that prenatal brain development may be a very important influence on psychiatric risk later in life,” said Rebecca C. Knickmeyer, PhD, lead author of the study and assistant professor of psychiatry in the University of North Carolina School of Medicine. The study was published by the journal Cerebral Cortex on Jan. 3, 2013.
The study included 272 infants who received MRI scans at UNC Hospitals shortly after birth. The DNA of each was tested for 10 common variations in 7 genes that have been linked to brain structure in adults. These genes have also been implicated in conditions such as schizophrenia, bipolar disorder, autism, Alzheimer’s disease, anxiety disorders and depression.
For some polymorphisms – such as a variation in the APOE gene which is associated with Alzheimer’s disease – the brain changes in infants looked very similar to brain changes found in adults with the same variants, Knickmeyer said. “This could stimulate an exciting new line of research focused on preventing onset of illness through very early intervention in at-risk individuals.”
But this was not true for every polymorphism included in the study, said John H. Gilmore, MD, senior author of the study and Thad & Alice Eure Distinguished Professor and Vice Chair for Research and Scientific Affairs in the UNC Department of Psychiatry.
For example, the study included two variants in the DISC1 gene. For one of these variants, known as rs821616, the infant brains looked very similar to the brains of adults with this variant. But there was no such similarity between infant brains and adult brains for the other variant, rs6675281.
“This suggests that the brain changes associated with this gene variant aren’t present at birth but develop later in life, perhaps during puberty,” Gilmore said.
“It’s fascinating that different variants in the same gene have such unique effects in terms of when they affect brain development,” said Knickmeyer.

Filed under brain brain development anxiety disorders autism schizophrenia genes neuroscience science

116 notes

Itchy Wool Sweaters Explained
Johns Hopkins researchers have uncovered strong evidence that mice have a specific set of nerve cells that signal itch but not pain, a finding that may settle a decades-long debate about these sensations, and, if confirmed in humans, help in developing treatments for chronic itch, including itch caused by life-saving medications.
At the heart of their discovery is a type of sensory nerve cell whose endings receive information from the skin and relay it to other nerves in the spinal cord, which then coordinates a response to the stimulus. Published online Dec. 23 in Nature Neuroscience, a report on the research suggests that even when the itch-specific nerve cells receive stimuli that are normally pain-inducing, the message they send isn’t “That hurts!” but rather “That itches!”
Pain and itch are both important sensations that help organisms survive. And pain is arguably more important because it tells us to withdraw the pained body part in order to prevent tissue damage. But itch also warns us of the presence of irritants, as in an allergic reaction. However, “when either of these sensations continues for weeks or months, they are no longer helpful. We even see patients stop taking life-saving medications because they cause such horrible itchiness all over,” says Xinzhong Dong, Ph.D., a Howard Hughes early career scientist and associate professor of neuroscience at the Institute for Basic Biomedical Sciences at the Johns Hopkins University School of Medicine. “And sometimes when we try to suppress chronic pain, with morphine for example, we end up causing chronic itchiness. So the two sensations are somehow related, and this study has begun to untangle them,” he says.
Because nerve cells send their messages as electrical currents that flow through them just as they would through wires, scientists can plug tiny monitors into individual nerve cells to detect the moment of stimulation. The scientific controversy over pain and itch centers around a group of nerve cells known to respond electrically to painful stimuli such as molecules of capsaicin, the fiery ingredient in chili peppers. A small subset of these nerve cells also responds electrically to itchy stimuli because they have on their surfaces receptors for molecules like histamine. One of these itchy receptors, called MrgA3, binds the anti-malaria drug chloroquine, causing serious itchiness in many patients.
Sensory nerve scientists have not known whether the nerves with itchy receptors and pain receptors were actually sending both types of messages to the brain, or just itch messages. What the current study found is that, in nerves with the itchy receptor MrgA3, electrical signals sent in response to both painful and itchy stimuli are interpreted by the brain as itch.
To reach this conclusion, the researchers first used a genetic trick to label the MrgA3 cells in mice with a glowing protein that allowed them to see the cells under the microscope. Aided by the glow, they were able to plug in those tiny electricity monitors and watch nerve cell responses to different stimuli. The cells transmitted electrical signals when the mice were exposed to itch-inducing chloroquine and histamine, as well as pain-inducing capsaicin and heat. Based on this result, the researchers tentatively concluded that the cells could send both pain and itch signals.
In the next experiment, the researchers monitored the behavioral responses of mice to the different stimuli. As expected, when the tails of normal mice were placed in hot water, they quickly pulled them out; when normal mice were given a bit of chloroquine or histamine, they scratched vigorously with their hind legs.
Then, to examine the role of MrgA3 cells in pain and itch, the scientists selectively killed MrgA3 nerve cells in adult mice and retested their responses. Presumably, the researchers noted, because MrgA3 cells are only a small fraction of all pain-sensing nerve cells, the mice had normal withdrawal responses to painful stimuli like hot water. However, when exposed to itchy stimuli, their scratching responses were reduced to varying degrees depending on the stimulus, most significantly in response to chloroquine. The fact that some stimuli still caused scratching suggested to the scientists that MrgA3 cells are not the only ones in the body that respond to itch. “We were convinced that MrgA3 cells are responsible for much of the sensation of itch, but it wasn’t yet clear whether MrgA3 cells could also relay painful information,” says Dong.
In their final experiments, the scientists used genetic techniques to create mice in which the MrgA3 cells were the only cells in the body capable of responding to capsaicin, that peppery pain-inducing substance. When capsaicin was injected into the cheeks of normal mice, they massaged the area with their forepaws to relieve the hot sensation. When it was injected into the experimental mice, they vigorously scratched their cheeks with their hind legs, suggesting that this normally painful stimulus had been communicated to the brain—by MrgA3 cells—as itchiness.
"Now that we have disentangled these itchy sensations from painful ones, we should be able to design drugs that target itch-specific nerve cells to combat chronic itchiness," says Dong. "We hope that this will not only provide relief, but also increase people’s faithfulness to their drug plans, particularly for deadly diseases like malaria and cancer."

Filed under itchiness nerve cells tissue damage sensation neuroscience science

242 notes

Study Refutes Accepted Model of Memory Formation
A study by Johns Hopkins researchers has shown that a widely accepted model of long-term memory formation — that it hinges on a single enzyme in the brain — is flawed. The new study, published in the Jan. 2 issue of Nature, found that mice lacking the enzyme that purportedly builds memory were in fact still able to form long-term memories as well as normal mice could.
“The prevailing theory is that when you learn something, you strengthen connections between your brain cells called synapses,” explains Richard Huganir, Ph.D., a professor and director of the Johns Hopkins University School of Medicine’s Solomon H. Snyder Department of Neuroscience. “The question is, how exactly does this strengthening happen?”
A research group at SUNY Downstate, led by Todd Sacktor, Ph.D., has suggested that key to the process is an enzyme they discovered, known as PKM-zeta. In 2006, Sacktor’s group made waves when it created a molecule that seemed to block the action of PKM-zeta — and only PKM-zeta. When the molecule, dubbed ZIP, was given to mice, it erased existing long-term memories. The molecule caught the attention of reporters and bloggers, who mused on the social and ethical implications of memory erasure.
But for researchers, ZIP was exciting primarily as a means for studying PKM-zeta. “Since 2006, many papers have been published on PKM-zeta and ZIP, but no one knew what PKM-zeta was acting on,” says Lenora Volk, Ph.D., a member of Huganir’s team. “We thought that learning the enzyme’s target could tell us a lot about how memories are stored and maintained.”
For the current study, Volk and fellow team member Julia Bachman made mice that lacked working PKM-zeta, so-called genetic “knockouts.” The goal was to compare the synapses of the modified mice with those of normal mice, and find clues about how the enzyme works.
But, says Volk, “what we got was not at all what we expected. We thought the strengthening capacity of the synapses would be impaired, but it wasn’t.” The brains of the mice without PKM-zeta were indistinguishable from those of other mice, she says. Additionally, the synapses of the PKM-zeta-less mice responded to the memory-erasing ZIP molecule just as the synapses of normal mice do.
The team then considered whether, in the absence of PKM-zeta, the mouse brains had developed a substitute synapse-strengthening pathway, much as a blind person learns to glean more information from her other senses. So the researchers made mice whose PKM-zeta genes functioned normally until they were given a drug that would suddenly shut the gene down. This allowed them to study PKM-zeta-less adult mice that had had no opportunity to develop a way around the loss of the gene. Still, the synapses of these so-called conditional knockout mice responded to stimuli just as synapses in normal mice did.
What this means, the researchers say, is that PKM-zeta is not the key long-term memory molecule previous studies had suggested, although it may have some role in memory. “We don’t know what this ZIP peptide is really acting on,” says Volk. “Finding out what its target is will be quite important, because then we can begin to understand at the molecular level how synapses strengthen and how memories form in response to stimuli.”


Filed under brain cells memory formation memory LTM synapses neuroscience science

126 notes

Imaging Study Examines Effect of Fructose on Brain Regions That Regulate Appetite

In a preliminary study published in the January 2 issue of JAMA, which examined factors that may underlie the association between fructose consumption and weight gain, brain magnetic resonance imaging of study participants indicated that ingestion of glucose, but not fructose, reduced cerebral blood flow and activity in brain regions that regulate appetite, and that ingestion of glucose, but not fructose, produced increased ratings of satiety and fullness.

“Increases in fructose consumption have paralleled the increasing prevalence of obesity, and high-fructose diets are thought to promote weight gain and insulin resistance. Fructose ingestion produces smaller increases in circulating satiety hormones compared with glucose ingestion, and central administration of fructose provokes feeding in rodents, whereas centrally administered glucose promotes satiety,” according to background information in the article. “Thus, fructose possibly increases food-seeking behavior and increases food intake.” How the brain activity associated with fructose- and glucose-mediated changes in animal feeding behavior translates to humans is not completely understood.

Kathleen A. Page, M.D., of Yale University School of Medicine, New Haven, Conn., and colleagues conducted a study to examine neurophysiological factors that might underlie associations between fructose consumption and weight gain. The study included 20 healthy adult volunteers who underwent two magnetic resonance imaging sessions in conjunction with fructose or glucose drink ingestion. The primary outcome measure for the study was the relative change in regional cerebral blood flow (CBF) in the hypothalamus, a region of the brain, after glucose or fructose ingestion.
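As a rough illustration of that outcome measure, relative CBF change is simply the post-ingestion flow expressed as a percent change from baseline. A minimal sketch, using hypothetical CBF values chosen for illustration only (the paper reports relative changes, not these numbers):

```python
# Illustrative only: the CBF values below (mL/100 g/min) are invented,
# not data from the JAMA study.

def relative_cbf_change(baseline, post_ingestion):
    """Percent change in regional cerebral blood flow relative to baseline."""
    return 100.0 * (post_ingestion - baseline) / baseline

# Hypothetical hypothalamic CBF before and after each drink:
glucose_change = relative_cbf_change(baseline=60.0, post_ingestion=55.2)   # clear reduction
fructose_change = relative_cbf_change(baseline=60.0, post_ingestion=59.4)  # little change
print(round(glucose_change, 1), round(fructose_change, 1))  # -8.0 -1.0
```

A more negative value for the glucose condition than the fructose condition is the pattern the study reports for hypothalamic CBF.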

The researchers found that there was a significantly greater reduction in hypothalamic CBF after glucose vs. fructose ingestion. “Glucose but not fructose ingestion reduced the activation of the hypothalamus, insula, and striatum—brain regions that regulate appetite, motivation, and reward processing; glucose ingestion also increased functional connections between the hypothalamic-striatal network and increased satiety.”

“The disparate responses to fructose were associated with reduced systemic levels of the satiety-signaling hormone insulin and were not likely attributable to an inability of fructose to cross the blood-brain barrier into the hypothalamus or to a lack of hypothalamic expression of genes necessary for fructose metabolism.”

(Image: iStockphoto)

Filed under MRI brain activity cerebral blood flow fructose obesity science

137 notes

Second impact syndrome: A devastating injury to the young brain

Physicians at Indiana University School of Medicine and the Northwest Radiology Network (Indianapolis, Indiana) report the case of a 17-year-old high school football player with second impact syndrome (SIS). A rare and devastating traumatic brain injury, SIS occurs when a person, most often a teenager, sustains a second head injury before recovery from an earlier head injury is complete. To the best of the authors’ knowledge, this is the first reported case in which imaging studies were performed after both injuries, adding new knowledge of the event. Findings in this case are reported and discussed in “Second impact syndrome in football: new imaging and insights into a rare and devastating condition. Case report,” by Elizabeth Weinstein, M.D., and colleagues, published today online, ahead of print, in the Journal of Neurosurgery: Pediatrics.

Filed under brain TBI second impact syndrome head injuries case study neuroscience science

41 notes

Late-Life Depression Associated with Prevalent Mild Cognitive Impairment, Increased Risk of Dementia

Depression in a group of Medicare recipients ages 65 years and older appears to be associated with prevalent mild cognitive impairment and an increased risk of dementia, according to a report published Online First by Archives of Neurology, a JAMA Network publication.

Depressive symptoms occur in 3 percent to 63 percent of patients with mild cognitive impairment (MCI) and some studies have shown an increased dementia risk in individuals with a history of depression. The mechanisms behind the association between depression and cognitive decline have not been made clear and different mechanisms have been proposed, according to the study background.

Edo Richard, M.D., Ph.D., of the University of Amsterdam, the Netherlands, and colleagues evaluated the association of late-life depression with MCI and dementia in a group of 2,160 community-dwelling Medicare recipients.

“We found that depression was related to a higher risk of prevalent MCI and dementia, incident dementia, and progression from prevalent MCI to dementia, but not to incident MCI,” the authors note.

Baseline depression was associated with prevalent MCI (odds ratio [OR], 1.4) and dementia (OR, 2.2), while baseline depression was associated with an increased risk of incident dementia (hazard ratio [HR], 1.7) but not with incident MCI (HR, 0.9). Patients with MCI and coexisting depression at baseline also had a higher risk of progression to dementia (HR, 2.0), especially vascular dementia (HR, 4.3), but not Alzheimer disease (HR, 1.9), according to the study results.
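For readers unfamiliar with these measures, an odds ratio compares the odds of an outcome between two groups, computed from a 2×2 table. A minimal sketch, with invented counts (not the study's data) chosen so the result reproduces the reported OR of 1.4 for prevalent MCI:

```python
# Illustrative only: these counts are hypothetical, not taken from the
# Archives of Neurology study.

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Odds ratio from a 2x2 contingency table."""
    odds_exposed = exposed_cases / exposed_noncases
    odds_unexposed = unexposed_cases / unexposed_noncases
    return odds_exposed / odds_unexposed

# Hypothetical counts: participants with/without baseline depression,
# with/without prevalent MCI.
or_mci = odds_ratio(exposed_cases=70, exposed_noncases=250,
                    unexposed_cases=300, unexposed_noncases=1500)
print(round(or_mci, 1))  # 1.4
```

Hazard ratios are analogous but come from time-to-event (survival) models, so they reflect the instantaneous rate of developing dementia over follow-up rather than a single snapshot.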

“Our finding that depression was associated cross-sectionally with both MCI and dementia and longitudinally only with dementia suggests that depression develops with the transition from normal cognition to dementia,” the authors conclude.

(Source: media.jamanetwork.com)

Filed under depression MCI cognitive impairment dementia neuroscience science

148 notes

Definitive proof for receptor’s role in synapse development

Jackson Laboratory researchers led by Associate Professor Zhong-wei Zhang, Ph.D., have provided direct evidence that a specific neurotransmitter receptor is vital to the process of pruning synapses in the brains of newborn mammals.

Faulty pruning at this early developmental stage is implicated in autism-spectrum disorders and schizophrenia. Definitive evidence for the role of the N-methyl-D-aspartate receptor (NMDAR) in pruning had eluded researchers until now, but in research published in the Proceedings of the National Academy of Sciences, Zhang’s lab had serendipitous help in the form of a mouse model containing brain cells lacking NMDAR side-by-side with cells containing the receptor.

Soon after birth, mammals’ brains undergo significant development and change. Initially, large numbers of synapses form between neurons. Then, in response to stimuli, the synaptic connections are refined—some synapses are strengthened and others eliminated, or pruned.

In most synapses, glutamate serves as the neurotransmitter, and NMDAR, a major type of post-synaptic glutamate receptor, was previously known to play an important role in neural circuit development. Previous research had pointed to the importance of NMDARs in pruning, but it remained unclear whether they played a direct or indirect role.

Zhang and colleagues focused on the thalamus, a brain region where synapse pruning and strengthening can be monitored and quantified with relative ease. They got unexpected help when they realized the mouse model they were using had thalamus cells lacking NMDARs right next to cells with normal NMDAR levels.

The researchers showed that the refinement process was disrupted in the absence of NMDARs. At the same time, neighboring neurons with the receptors proceeded through normal synaptic strengthening and pruning, clearly establishing the necessity of NMDARs in postsynaptic neurons for synaptic refinement.

"Whenever I give a talk or meet colleagues," Zhang says, "the first question that comes up is whether the NMDA receptor is important. It’s good that this is now settled definitively."

There has been extensive research into synaptic strengthening, and most of these studies indicate that the presence of NMDARs may support the recruitment of larger numbers of another kind of glutamate receptor to strengthen the synaptic connections. How NMDARs regulate the pruning process remains largely unknown, however.

Filed under synaptic connections receptors neurotransmitters brain cells synapses neuroscience science
