Neuroscience

Articles and news from the latest research reports.

80 notes

The pain puzzle: Uncovering how morphine increases pain in some people

For individuals with agonizing pain, it is a cruel blow when the gold-standard medication actually causes more pain. Adults and children whose pain gets worse when treated with morphine may be closer to a solution, based on research published in the January 6 online edition of Nature Neuroscience.

"Our research identifies a molecular pathway by which morphine can increase pain, and suggests potential new ways to make morphine effective for more patients," says senior author Dr. Yves De Koninck, Professor at Université Laval in Quebec City. The team included researchers from The Hospital for Sick Children (SickKids) in Toronto, the Institut universitaire en santé mentale de Québec, the US and Italy.

New pathway in pain management

The research not only identifies a target pathway to suppress morphine-induced pain but teases apart the pain hypersensitivity caused by morphine from tolerance to morphine, two phenomena previously considered to be caused by the same mechanisms.

"When morphine doesn’t reduce pain adequately, the tendency is to increase the dosage. If a higher dosage produces pain relief, this is the classic picture of morphine tolerance, which is very well known. But sometimes increasing the morphine can, paradoxically, make the pain worse," explains co-author Dr. Michael Salter. Dr. Salter is Senior Scientist and Head of Neurosciences & Mental Health at SickKids, Professor of Physiology at the University of Toronto, and Canada Research Chair in Neuroplasticity and Pain.
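The clinical reasoning Dr. Salter describes can be summarized as a simple decision rule. The sketch below is purely illustrative, not a diagnostic tool; the function name and the bare comparison logic are invented for illustration.

```python
# Illustrative sketch of the dose-escalation reasoning quoted above --
# not a diagnostic tool. Names and logic are invented for illustration.
def interpret_dose_increase(pain_before: float, pain_after: float) -> str:
    """Classify a patient's response to an increased morphine dose."""
    if pain_after < pain_before:
        return "tolerance"      # higher dose restored relief: classic tolerance
    if pain_after > pain_before:
        return "hyperalgesia"   # pain paradoxically worse: hypersensitivity
    return "no change"
```

The point of the study is that these two outcomes, though they look like points on one continuum at the bedside, arise from distinct cellular mechanisms.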

"Pain experts have thought tolerance and hypersensitivity (or hyperalgesia) are simply different reflections of the same response," says Dr. De Koninck, "but we discovered that cellular and signalling processes for morphine tolerance are very different from those of morphine-induced pain."

Dr. Salter adds, “We identified specialized cells – known as microglia – in the spinal cord as the culprit behind morphine-induced pain hypersensitivity. When morphine acts on certain receptors in microglia, it triggers the cascade of events that ultimately increase, rather than decrease, activity of the pain-transmitting nerve cells.”

The researchers also identified the molecule responsible for this side effect of morphine. “It’s a protein called KCC2, which regulates the transport of chloride ions and the proper control of sensory signals to the brain,” explains Dr. De Koninck. “Morphine inhibits the activity of this protein, causing abnormal pain perception. By restoring normal KCC2 activity we could potentially prevent pain hypersensitivity.” Dr. De Koninck and researchers at Université Laval are testing new molecules capable of preserving KCC2 functions and thus preventing hyperalgesia.

The KCC2 pathway appears to apply to short-term as well as to long-term morphine administration, says Dr. De Koninck. “Thus, we have the foundation for new strategies to improve the treatment of post-operative as well as chronic pain.”

Dr. Salter adds, “Our discovery could have a major impact on individuals with various types of intractable pain, such as that associated with cancer or nerve damage, who have stopped morphine or other opiate medications because of pain hypersensitivity.”

Cost of pain

Pain has been labelled the silent health crisis, afflicting tens of millions of people worldwide. It has a profound negative effect on quality of life, touching nearly all aspects of human existence, and untreated or under-treated pain is the most common cause of disability. The Canadian Pain Society estimates that chronic pain affects at least one in five Canadians and costs Canada $55-60 billion per year, including health care expenses and lost productivity.

"People with incapacitating pain may be left with no alternatives when our most powerful medications intensify their suffering," says Dr. De Koninck, who is also Director of Cellular and Molecular Neuroscience at Institut universitaire en santé mentale de Québec.

Dr. Salter adds, “Pain interferes with many aspects of an individual’s life. Too often, patients with chronic pain feel abandoned and stigmatized. Among the many burdens on individuals and their families, chronic pain is linked to increased risk of suicide. The burden of chronic pain affects children and teens as well as adults.” These risks affect individuals with many types of pain, ranging from migraine and carpal tunnel syndrome to cancer, AIDS, diabetes, traumatic injuries, Parkinson’s disease and dozens of other conditions.

Filed under pain pain management morphine molecular pathways neuroscience science

307 notes

Humanity becomes technology

Humanity’s merge with its technology, which began shortly after the taming of fire, is still happening today. Many predict that the fine-tuning of our DNA-based biology through stem cell and genetic research will spark a powerful nanotech revolution that promises to redesign and rebuild our bodies and the environment, pushing the limits of today’s understanding of life and the world we live in.

Nanotech will change our physical world much the same way that computers have transformed our information world. Physical things such as cars and houses could follow the same path as computers, for which Moore’s Law correctly predicted that value-to-cost would increase by 50% every 18 months.
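Taken at face value, that growth rate compounds dramatically. A quick calculation, using the 50%-per-18-months figure quoted above (a popularized reading of Moore’s Law rather than its original transistor-count form), shows roughly a fifteen-fold gain per decade:

```python
# Compound the quoted figure: a 50% value-to-cost gain every 18 months.
def value_to_cost_multiple(years: float, gain: float = 0.5,
                           period_years: float = 1.5) -> float:
    """Cumulative improvement after `years` of compound growth."""
    return (1 + gain) ** (years / period_years)

per_decade = value_to_cost_multiple(10)    # roughly 15x every ten years
per_30_years = value_to_cost_multiple(30)  # several thousandfold
```

Sustained over thirty years, the same rate implies improvements in the thousands, which is the arithmetic behind the claims about solar cells and nanofactories that follow.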

Existing products that are now expensive, such as photovoltaic solar cells, will become so cheap in the decades ahead that it may one day be possible to surface roads with solar-collecting materials that would also gather energy to power cars, ending much of the world’s dependency on fossil fuels.

In addition, imagine machines that create clothing, medicine, food and most essentials, with only your voice needed to command the action. Today, such devices are not available, but by the early 2030s, experts predict, a home nanofactory will provide most of your family’s needs at little or no cost.

Now bring on the most amazing impending revolution – human-level robots – with intelligence derived from us, but with redesigned bodies that exceed human capabilities. These powerful android creatures, expected by 2030, will enable us to tap into their super-computer minds to increase our own intelligence. Constructed with molecular nanotech processes, they will be affordable for every family.

Finally, by mid-century, many people will complete the technology merge by replacing more of their biology with nanomaterials, creating a powerful body that can automatically repair itself when damaged. No more concerns over sickness, accidents, or unwanted death.

Evolution created humanity; humanity created technology; humanity will soon become technology. This is simply our next evolutionary step. Where this trip will take us may be beyond present-day knowledge, but whatever the future holds, many people alive today can expect to experience all of its wonders.

Of course, not everyone may hold such a glowing vision of how life may unfold, but for one who has seen so many amazing changes over the past eighty-two years, I find it difficult to imagine a negative outcome as we trek through what promises to be an incredible future.

Filed under technology nanotech robotics AI evolution science

248 notes

Totally blind mice get sight back

Totally blind mice have had their sight restored by injections of light-sensing cells into the eye, UK researchers report. The team in Oxford said their studies closely resemble the treatments that would be needed in people with degenerative eye disease. Similar results have already been achieved with night-blind mice.

Experts said the field was advancing rapidly, but there were still questions about the quality of vision restored. Patients with retinitis pigmentosa gradually lose light-sensing cells from the retina and can become blind. The research team, at the University of Oxford, used mice with a complete lack of light-sensing photoreceptor cells in their retinas. The mice were unable to tell the difference between light and dark.

Reconstruction

They injected “precursor” cells which will develop into the building blocks of a retina once inside the eye. Two weeks after the injections a retina had formed, according to the findings presented in the Proceedings of the National Academy of Sciences journal. Prof Robert MacLaren said: “We have recreated the whole structure, basically it’s the first proof that you can take a completely blind mouse, put the cells in and reconstruct the entire light-sensitive layer.”

Previous studies have achieved similar results with mice that had a partially degenerated retina. Prof MacLaren said this was like “restoring a whole computer screen rather than repairing individual pixels”. The mice were tested to see whether they fled a bright area and whether their pupils constricted in response to light, and their brains were scanned to see if visual information was being processed.

Vision

Prof Pete Coffey, from the Institute of Ophthalmology at University College London, said the findings were important as they looked at the “most clinically relevant and severe case” of blindness. “This is probably what you would need to do to restore sight in a patient that has lost their vision,” he said.

However, he said this and similar studies needed to show how good the recovered vision was as brain scans and tests of light sensitivity were not enough. He said: “Can they tell the difference between a nasty animal and something to eat?”

Prof Robin Ali published research in the journal Nature showing that transplanting cells could restore vision in night-blind mice, and then showed the same technique worked in a range of mice with degenerated retinas. He said: “These papers demonstrate that it is possible to transplant photoreceptor cells into a range of mice even with a severe level of degeneration. I think it’s great that another group is showing the utility of photoreceptor transplantation.”

Researchers are already trialling human embryonic stem cells, at Moorfields Eye Hospital, in patients with Stargardt’s disease. Early results suggest the technique is safe but reliable results will take several years.

Retinal chips or bionic eyes are also being trialled in patients with retinitis pigmentosa.

Filed under retina light-sensing cells retinitis pigmentosa eye disease photoreceptors retinal degeneration neuroscience science

130 notes

Decoding Dreams

“[I was] somewhere, in a place like a studio to make a TV program or something,” a groggy study participant recounted (in Japanese). “A male person ran with short steps from the left side to the right side. Then, he tumbled.” The participant had recently been awoken by Masako Tamaki, a postdoc in the lab of neuroscientist Yukiyasu Kamitani of the ATR Computational Neuroscience Laboratories in Kyoto, Japan. He was lying in a functional magnetic resonance imaging (fMRI) scanner, doing his best to recall what he had been dreaming about. “He stumbled over something, and stood up while laughing, and said something,” the participant continued. “He said something to persons on the left side.”

At first blush, the story doesn’t seem particularly informative. But the study subject saw a man, not a woman. And he was inside some sort of workplace. That fragmented information is enough for Kamitani and his team, who recorded dream appearances of 20 key objects, such as “male” or “room,” and used a machine-learning algorithm to correlate those concepts with the fMRI images to find patterns that could be used to predict what people were dreaming about without having to wake them. Such information could help inform the study of why people dream, an elusive question in neurobiology, Kamitani says. “Knowing what is represented during sleep would help to understand the function of dreaming.”

Analyzing more than 200 dream reports—some 30–45 hours of interviews with each of three participants—Kamitani and his colleagues built a “dream-trained decoder” based on fMRI imagery of the V1, V2, and V3 areas of the visual cortex. “We find some rule, or mapping, or pattern between what the person is seeing and what activity is happening in the brain,” Kamitani explains. And it worked, according to Kamitani, who presented the results at the Society for Neuroscience meeting in New Orleans in October 2012, predicting whether or not the 20 objects occurred in dreams with 75–80 percent accuracy.

But while Kamitani’s dream-decoding study is interesting, says neurobiologist David Kahn of Harvard Medical School, the algorithms used are quite primitive, only providing a handful of clues about the dream’s content. “We still have a long way to go before we can actually re-create the story that is the dream,” he says. “This is almost science fiction, because we’re way, way far from it … [but] this is an added tool.”

“Decoding is very primitive,” Kamitani agrees, “but I think there are a lot of potentials.” One way to get a more complete picture of the dream is to increase the complexity of the decoder, he notes. In this first study, for example, the researchers focused on nouns representing visual objects, but going forward, Kamitani says he hopes to include other concepts, like verbs. “By analyzing that aspect we may be able to add some action aspects in the dream.”

Furthermore, researchers might not have to fully interpret the dream themselves to benefit from the new decoder. Instead, the clues gleaned from the fMRI images could simply be used to jog participants’ memories. “We know that dreams—even the most vivid dreams we remember, [like] nightmares or lucid dreams—are really fragile memories,” says Antonio Zadra, an experimental psychologist at the University of Montreal. “Unless you wrote it down or told it to someone in the morning, usually even before lunch, that memory will start fading. And by night, you might just have the essence.”

Until now, that fading memory was the only resource available to researchers studying dreams. With a little supplemental information, they may be able to help participants recall dreams more precisely. “The subjective reports are never complete,” Kamitani says. “By giving the subject what we reconstructed, they may remember something more.”

At an even more basic level, the decoder could help scientists understand what’s happening in the brain during dreaming. “To create this whole virtual world out of nothing—with no visual input or auditory input—is quite fascinating and undoubtedly very complex,” Zadra says. “This research will certainly help us better understand what brain areas are doing what, to even allow for this to happen.”

In Kamitani’s study, for example, the researchers found that areas of higher-level visual processing, which respond to more abstract features, were more useful for interpreting dream content than lower-level processing areas. This makes sense, given that those lower areas of the visual cortex are more closely connected to the direct input from the retina. But, Kamitani notes, this could simply have to do with the way the study was designed. “We didn’t train the decoder with low-level visual features,” such as shape or contrast, he says. “We just used the semantic category information.”

Indeed, given the richness of the dreaming experience, such visual qualities may well be encoded during sleep. “Your brain creates a whole virtual world for you when you are dreaming, complete with characters, settings, interactions, dialogues,” says Zadra. “But you’re actually in your bed asleep; there is no visual input. So your brain is literally creating this virtual world from A to Z.”

Filed under Neuroscience 2012 dream-trained decoder dreaming neuroscience sleep brain science

252 notes

Scientists explore the illusion of memory
A memory might seem like a permanent, precious essence carved deep into the circuits of the brain. But it is not. Instead, scientists are discovering that a memory changes every time you think about it.
"Every time you recall a memory, it becomes sensitive to disruption. Often that is used to incorporate new information into it." That’s the blunt assessment from one of the world’s leading experts on memory, Dr. Eric Kandel from Columbia University.
And that means our memories are not abstract snapshots stored forever in a bulging file in our mind, but rather, they’re a collection of brain cells — neurons that undergo chemical changes every time they’re engaged.
So when we think about something from the past, the memory is called up like a computer file, reviewed and revised in subtle ways, and then sent back to the brain’s archives, now modified slightly, updated, and changed.
As scientists increasingly understand the biological process of memory, they are also learning how to interrupt it, and that means they might one day be able to ease the pain of past trauma, or alter destructive habits and addictions, as though shaking an Etch A Sketch, erasing the scribbles on the mind, and starting fresh.
In his McGill University lab, researcher Karim Nader routinely erases the memory of his laboratory rats. But first he has to give them a memory, and he does that by putting them in an isolation cubicle, playing a tone, and then delivering a small electrical shock to their feet.
Read more

Filed under brain memory memory disruption PTSD OCD neuroscience psychology science

98 notes

Why good resolutions about taking up a physical activity can be hard to keep
The collective appraisal conducted by Inserm in 2008 highlighted the many preventive health benefits of regular physical activity. Such activity is limited, however, by our lifestyle in today’s industrial society. While varying degrees of physical inactivity may be partly explained by social causes, they are also rooted in biology.
“The inability to experience pleasure during physical activity, which is often quoted as one explanation why people partially or completely drop out of physical exercise programmes, is a clear sign that the biology of the nervous system is involved”, explains Francis Chaouloff.
But how exactly? The neurobiological mechanisms underlying physical inactivity had yet to be identified.
Francis Chaouloff (Giovanni Marsicano’s team at the NeuroCentre Magendie; Inserm joint research unit, Université Bordeaux Ségalen) and his team have now begun to decipher these mechanisms. Their work clearly identifies the endogenous cannabinoid (or endocannabinoid) system as playing a decisive role, in particular one of its brain receptors. This is by no means the first time that data has pointed to interactions between the endocannabinoid system, which is the target of delta-9-tetrahydrocannabinol (the active ingredient of cannabis), and physical exercise. It was discovered ten years ago that physical exercise activated the endocannabinoid system in trained sportsmen, but its exact role remained a mystery for many years. Three years ago, the same research team in Bordeaux observed that when given the opportunity to use a running wheel, mutant mice lacking the CB1 cannabinoid receptor, which is the principal receptor of the endocannabinoid system in the brain, ran for a shorter time and over shorter distances than healthy mice. The research published in Biological Psychiatry this month seeks to understand how, where and why the lack of the CB1 receptor reduces voluntary exercise performance (by 20 to 30%) in mice allowed access to a running wheel three hours per day.
The researchers used various lines of mutant mice for the CB1 receptor, together with pharmacological tools. They began by demonstrating that the CB1 receptor controlling running performance is located at the GABAergic nerve endings. They went on to show that the receptor is located in the ventral tegmental area of the brain, which is an area involved in motivational processes relating to reward, whether the reward is natural (food, sex) or associated with the consumption of psychoactive substances.

Filed under cannabinoids endocannabinoid system neurotransmitters physical activity physical exercise neuroscience science

111 notes

Editing the genome with high precision
Researchers at MIT, the Broad Institute and Rockefeller University have developed a new technique for precisely altering the genomes of living cells by adding or deleting genes. The researchers say the technology could offer an easy-to-use, less-expensive way to engineer organisms that produce biofuels; to design animal models to study human disease; and to develop new therapies, among other potential applications.
To create their new genome-editing technique, the researchers modified a set of bacterial proteins that normally defend against viral invaders. Using this system, scientists can alter several genome sites simultaneously and can achieve much greater control over where new genes are inserted, says Feng Zhang, an assistant professor of brain and cognitive sciences at MIT and leader of the research team.
“Anything that requires engineering of an organism to put in new genes or to modify what’s in the genome will be able to benefit from this,” says Zhang, who is a core member of the Broad Institute and MIT’s McGovern Institute for Brain Research.
Zhang and his colleagues describe the new technique in the Jan. 3 online edition of Science. Lead authors of the paper are graduate students Le Cong and Fei Ann Ran.
Early efforts
The first genetically altered mice were created in the 1980s by adding small pieces of DNA to mouse embryonic cells. This method is now widely used to create transgenic mice for the study of human disease, but, because it inserts DNA randomly in the genome, researchers can’t target the newly delivered genes to replace existing ones.
In recent years, scientists have sought more precise ways to edit the genome. One such method, known as homologous recombination, involves delivering a piece of DNA that includes the gene of interest flanked by sequences that match the genome region where the gene is to be inserted. However, this technique’s success rate is very low because the natural recombination process is rare in normal cells.
More recently, biologists discovered that they could improve the efficiency of this process by adding enzymes called nucleases, which can cut DNA. Zinc fingers are commonly used to deliver the nuclease to a specific location, but zinc finger arrays can’t target every possible sequence of DNA, limiting their usefulness. Furthermore, assembling the proteins is a labor-intensive and expensive process.
Complexes known as transcription activator-like effector nucleases (TALENs) can also cut the genome in specific locations, but these complexes can also be expensive and difficult to assemble.
Precise targeting
The new system is much more user-friendly, Zhang says. Making use of naturally occurring bacterial protein-RNA systems that recognize and snip viral DNA, the researchers can create DNA-editing complexes that include a nuclease called Cas9 bound to short RNA sequences. These sequences are designed to target specific locations in the genome; when they encounter a match, Cas9 cuts the DNA.
This approach can be used either to disrupt the function of a gene or to replace it with a new one. To replace the gene, the researchers must also add a DNA template for the new gene, which would be copied into the genome after the DNA is cut.
Each of the RNA segments can target a different sequence. “That’s the beauty of this — you can easily program a nuclease to target one or more positions in the genome,” Zhang says.
The method is also very precise — if there is a single base-pair difference between the RNA targeting sequence and the genome sequence, Cas9 is not activated. This is not the case for zinc finger nucleases or TALENs. The new system also appears to be more efficient than TALENs, and much less expensive.
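The targeting logic described above can be sketched in a few lines of toy code (illustration only, not real bioinformatics; the sequences and function names are invented, and the real system also requires features such as a PAM site that are omitted here):

```python
# Toy sketch of guide-directed cutting: a guide sequence directs a cut
# only at an exact match; even a single base-pair mismatch leaves the
# genome uncut, mimicking the precision described in the article.

def find_cut_sites(genome: str, guide: str):
    """Return indices where the guide matches the genome exactly."""
    return [i for i in range(len(genome) - len(guide) + 1)
            if genome[i:i + len(guide)] == guide]

def cas9_cut(genome: str, guide: str):
    """Cut the genome at the first exact guide match, if any."""
    sites = find_cut_sites(genome, guide)
    if not sites:
        return None                       # mismatch: Cas9 not activated
    i = sites[0] + len(guide)             # cut just after the matched site
    return genome[:i], genome[i:]

genome = "ATGGCCTTAGGCATCGA"
print(cas9_cut(genome, "TTAGGC"))   # exact match -> genome is cut in two
print(cas9_cut(genome, "TTAGGG"))   # one-base mismatch -> no cut
```

Because each guide sequence is just a short string, several guides can be run against the same genome, which is the sense in which the system can be "programmed" to target one or more positions at once.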
The new system “is a significant advancement in the field of genome editing and, in its first iteration, already appears comparable in efficiency to what zinc finger nucleases and TALENs have to offer,” says Aron Geurts, an associate professor of physiology at the Medical College of Wisconsin. “Deciphering the ever-increasing data emerging on genetic variation as it relates to human health and disease will require this type of scalable and precise genome editing in model systems.”
The research team has deposited the necessary genetic components with a nonprofit called Addgene, making the components widely available to other researchers who want to use the system. The researchers have also created a website with tips and tools for using this new technique.
Engineering new therapies
Among other possible applications, this system could be used to design new therapies for diseases such as Huntington’s disease, which appears to be caused by a single abnormal gene. Clinical trials that use zinc finger nucleases to disable genes are now under way, and the new technology could offer a more efficient alternative.
The system might also be useful for treating HIV by removing patients’ lymphocytes and mutating the CCR5 receptor, through which the virus enters cells. After being put back in the patient, such cells would resist infection.
This approach could also make it easier to study human disease by inducing specific mutations in human stem cells. “Using this genome editing system, you can very systematically put in individual mutations and differentiate the stem cells into neurons or cardiomyocytes and see how the mutations alter the biology of the cells,” Zhang says.
In the Science study, the researchers tested the system in cells grown in the lab, but they plan to apply the new technology to study brain function and diseases.

Filed under genome genomic sequencing DNA genome-editing technique engineering cells science

84 notes

Pesticides and Parkinson’s: UCLA researchers uncover further proof of a link
For several years, neurologists at UCLA have been building a case that a link exists between pesticides and Parkinson’s disease. To date, paraquat, maneb and ziram — common chemicals sprayed in California’s Central Valley and elsewhere — have been tied to increases in the disease, not only among farmworkers but in individuals who simply lived or worked near fields and likely inhaled drifting particles.
Now, UCLA researchers have discovered a link between Parkinson’s and another pesticide, benomyl, whose toxicological effects still linger some 10 years after the chemical was banned by the U.S. Environmental Protection Agency.
Even more significantly, the research suggests that the damaging series of events set in motion by benomyl may also occur in people with Parkinson’s disease who were never exposed to the pesticide, according to Jeff Bronstein, senior author of the study and a professor of neurology at UCLA, and his colleagues.
Benomyl exposure, they say, starts a cascade of cellular events that may lead to Parkinson’s. The pesticide prevents an enzyme called ALDH (aldehyde dehydrogenase) from keeping a lid on DOPAL, a toxin that naturally occurs in the brain. When left unchecked by ALDH, DOPAL accumulates, damages neurons and increases an individual’s risk of developing Parkinson’s.
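The production-versus-clearance balance described above can be illustrated with a deliberately simple numerical sketch (the rate constants are invented for illustration and have no biological units): DOPAL is produced at a constant rate and cleared in proportion to ALDH activity, so suppressing ALDH lets the toxin accumulate.

```python
# Toy model of the cascade: constant DOPAL production vs
# ALDH-dependent clearance, integrated with a simple Euler step.

def simulate_dopal(aldh_activity, production=1.0, clearance=0.5,
                   steps=100, dt=0.1):
    """Return the DOPAL level after integrating production minus
    ALDH-proportional clearance."""
    dopal = 0.0
    for _ in range(steps):
        dopal += dt * (production - clearance * aldh_activity * dopal)
    return dopal

normal = simulate_dopal(aldh_activity=1.0)      # healthy enzyme
inhibited = simulate_dopal(aldh_activity=0.2)   # benomyl-suppressed ALDH
print(f"normal: {normal:.2f}, inhibited: {inhibited:.2f}")
# with ALDH suppressed, the toxin settles at a much higher level
```

In this sketch the steady-state level is production divided by (clearance × ALDH activity), so any drug that restores ALDH activity lowers the accumulated toxin, which is the logic behind the ALDH-protecting drug strategy mentioned below.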
The investigators believe their findings concerning benomyl may be generalized to all Parkinson’s patients. Developing new drugs to protect ALDH activity, they say, may eventually help slow the progression of the disease, whether or not an individual has been exposed to pesticides.
The research is published in the current online edition of Proceedings of the National Academy of Sciences.

Filed under parkinson's disease pesticides benomyl brain neuron environment science

54 notes

Induction of adult cortical neurogenesis by an antidepressant

The production of new neurons in the normal adult cortex in response to the antidepressant fluoxetine is reported in a study published online this week in Neuropsychopharmacology.

The research team, which is based at the Institute for Comprehensive Medical Science, Fujita Health University, Aichi, has previously demonstrated that neural progenitor cells exist at the surface of the adult cortex, and, moreover, that ischemia enhances the generation of new inhibitory neurons from these neural progenitor cells. These cells were accordingly named “Layer 1 Inhibitory Neuron Progenitor cells” (L1-INP). However, until now it was not known whether L1-INP-related neurogenesis could be induced in the normal adult cortex.

Tsuyoshi Miyakawa, Koji Ohira, and their colleagues employed fluoxetine, a selective serotonin reuptake inhibitor and one of the most widely used antidepressants, to stimulate the production of new neurons from L1-INP cells. A large percentage of these newly generated neurons were inhibitory GABAergic interneurons, and their generation coincided with a reduction in apoptotic cell death following ischemia. This finding highlights the potential neuroprotective response induced by this antidepressant drug. It also lends further support to the hypothesis that induction of adult neurogenesis in the cortex is a relevant prevention/treatment option for neurodegenerative diseases and psychiatric disorders.

(Source: eurekalert.org)

Filed under neurogenesis fluoxetine neuron antidepressants interneurons neuroscience science

57 notes

USF and VA researchers find long-term consequences for those suffering traumatic brain injury
Researchers from the University of South Florida and colleagues at the James A. Haley Veterans’ Hospital, studying the long-term consequences of traumatic brain injury (TBI) using rat models, have found that, over time, TBI results in progressive brain deterioration characterized by elevated inflammation and suppressed cell regeneration. However, therapeutic intervention, even in the chronic stage of TBI, may still help prevent cell death.
Their study is published in the current issue of the journal PLOS ONE.
“In the U.S., an estimated 1.7 million people suffer from traumatic brain injury,” said Dr. Cesar V. Borlongan, professor and vice chair of the department of Neurosurgery and Brain Repair at the University of South Florida (USF). “In addition, TBI is responsible for 52,000 early deaths, accounts for 30 percent of all injury-related deaths, and costs approximately $52 billion yearly to treat.”
While TBI is generally considered an acute injury, secondary cell death caused by neuroinflammation and an impaired repair mechanism accompany the injury over time, said the authors. Long-term neurological deficits from TBI related to inflammation may cause more severe secondary injuries and predispose long-term survivors to age-related neurodegenerative diseases, such as Alzheimer’s disease, Parkinson’s disease and post-traumatic dementia.
Since the U.S. military has been involved in conflicts in Iraq and Afghanistan, the incidence of traumatic brain injury suffered by troops has increased dramatically, primarily from improvised explosive devices (IEDs), according to Martin Steele, Lieutenant General, U.S. Marine Corps (retired), USF associate vice president for veterans research, and executive director of Military Partnerships. In response, the U.S. Veterans Administration has increasingly focused on TBI research and treatment.
“Progressive injury to hippocampal, cortical and thalamic regions contributes to long-term cognitive damage post-TBI,” said study co-author Dr. Paul R. Sanberg, USF senior vice president for research and innovation. “Both military and civilian patients have shown functional and cognitive deficits resulting from TBI.”
Because TBI involves both acute and chronic stages, the researchers noted that animal model research on the chronic stages of TBI could provide insight into identifying therapeutic targets for treatment in the post-acute stage.
“Using animal models of TBI, our study investigated the prolonged pathological outcomes of TBI in different parts of the brain, such as the dorsal striatum, thalamus, corpus callosum white matter, hippocampus and cerebral peduncle,” explained Borlongan, the study’s lead author. “We found that a massive neuroinflammation after TBI causes a second wave of cell death that impairs cell proliferation and impedes the brain’s regenerative capabilities.”
Upon examining the rat brains eight weeks post-trauma, the researchers found “a significant up-regulation of activated microglia cells, not only in the area of direct trauma, but also in adjacent as well as distant areas.” The location of inflammation correlated with the cell loss and impaired cell proliferation researchers observed.
Microglia cells act as the first and main form of immune defense in the central nervous system and make up 20 percent of the total glial cell population within the brain. They are distributed across large regions throughout the brain and spinal cord.
“Our study found that cell proliferation was significantly affected by a cascade of neuroinflammatory events in chronic TBI and we identified the susceptibility of newly formed cells within neurologic niches and suppression of neurological repair,” wrote the authors.
The researchers concluded that, while the progressive deterioration of the TBI-affected brain over time suppressed efforts of repair, intervention, even in the chronic stage of TBI, could help prevent further deterioration.

Filed under TBI brain injury cell regeneration neurodegenerative diseases neuroscience science
