Neuroscience

Articles and news from the latest research reports.

Posts tagged science

60 notes

Protein that Causes Frontotemporal Dementia also Implicated in Alzheimer’s Disease

Researchers at the Gladstone Institutes have shown that low levels of the protein progranulin in the brain can increase the formation of amyloid-beta plaques (a hallmark of Alzheimer’s disease), cause neuroinflammation, and worsen memory deficits in a mouse model of this condition. Conversely, by using a gene therapy approach to elevate progranulin levels, scientists were able to prevent these abnormalities and block cell death in this model.

Progranulin deficiency is known to cause another neurodegenerative disorder, frontotemporal dementia (FTD), but its role in Alzheimer’s disease was previously unclear. Although the two conditions are similar, FTD is associated with greater injury to cells in the frontal cortex, causing behavioral and personality changes, whereas Alzheimer’s disease predominantly affects memory centers in the hippocampus and temporal cortex.

Earlier research showed that progranulin levels were elevated near plaques in the brains of patients with Alzheimer’s disease, but it was unknown whether this effect counteracted or exacerbated neurodegeneration. The new evidence, published today in Nature Medicine, shows that a reduction of the protein can severely aggravate symptoms, while increases in progranulin may be the brain’s attempt at fighting the inflammation associated with the disease.

According to first author S. Sakura Minami, PhD, a postdoctoral fellow at the Gladstone Institutes, “This is the first study providing evidence for a protective role of progranulin in Alzheimer’s disease. Prior research had shown a link between Alzheimer’s and progranulin, but the nature of the association was unclear. Our study demonstrates that progranulin deficiency may promote Alzheimer’s disease, with decreased levels rendering the brain vulnerable to amyloid-beta toxicity.”

In the study, the researchers manipulated several different mouse models of Alzheimer’s disease, genetically raising or lowering their progranulin levels. Reducing progranulin markedly increased amyloid-beta plaque deposits in the brain as well as memory impairments. Progranulin deficiency also triggered an over-active immune response in the brain, which can contribute to neurological disorders. In contrast, increasing progranulin levels via gene therapy effectively lowered amyloid beta levels, protecting against cell toxicity and reversing the cognitive deficits typically seen in these Alzheimer’s models.

These effects appear to be linked to progranulin’s involvement in phagocytosis, a type of cellular housekeeping whereby cells “eat” other dead cells, debris, and large molecules. Low levels of progranulin can impair this process, leading to increased amyloid-beta deposition. Conversely, increasing progranulin levels enhanced phagocytosis, decreasing the plaque load and preventing neuron death.

“The profound protective effects of progranulin against both amyloid-beta deposits and cell toxicity have important therapeutic implications,” said senior author Li Gan, PhD, an associate investigator at Gladstone and associate professor of neurology at the University of California, San Francisco. “The next step will be to develop progranulin-enhancing approaches that can be used as potential novel treatments, not only for frontotemporal dementia, but also for Alzheimer’s disease.”

(Source: gladstoneinstitutes.org)

Filed under progranulin alzheimer's disease dementia beta amyloid phagocytosis neuroscience science

114 notes

Scientists Identify the Signature of Aging in the Brain

How the brain ages is still largely an open question – in part because this organ is mostly insulated from direct contact with other systems in the body, including the blood and immune systems. In research that was recently published in Science, Weizmann Institute researchers Prof. Michal Schwartz of the Neurobiology Department and Dr. Ido Amit of the Immunology Department found evidence of a unique “signature” that may be the “missing link” between cognitive decline and aging. The scientists believe that this discovery may lead, in the future, to treatments that can slow or reverse cognitive decline in older people.


(Image caption: Immunofluorescence microscope image of the choroid plexus. Epithelial cells are in green and chemokine proteins (CXCL10) are in red)

Until a decade ago, scientific dogma held that the blood-brain barrier prevents the blood-borne immune cells from attacking and destroying brain tissue. Yet in a long series of studies, Schwartz’s group had shown that the immune system actually plays an important role both in healing the brain after injury and in maintaining the brain’s normal functioning. They have found that this brain-immune interaction occurs across a barrier that is actually a unique interface within the brain’s territory.

This interface, known as the choroid plexus, is found in each of the brain’s four ventricles, and it separates the blood from the cerebrospinal fluid. Schwartz: “The choroid plexus acts as a ‘remote control’ for the immune system to affect brain activity. Biochemical ‘danger’ signals released from the brain are sensed through this interface; in turn, blood-borne immune cells assist by communicating with the choroid plexus. This cross-talk is important for preserving cognitive abilities and promoting the generation of new brain cells.”

This finding led Schwartz and her group to suggest that cognitive decline over the years may be connected not only to one’s “chronological age” but also to one’s “immunological age,” that is, changes in immune function over time might contribute to changes in brain function – not necessarily in step with the count of one’s years.

To test this theory, Schwartz and research students Kuti Baruch and Aleksandra Deczkowska teamed up with Amit and his research group in the Immunology Department. The researchers used next-generation sequencing technology to map changes in gene expression in 11 different organs, including the choroid plexus, in both young and aged mice, to identify and compare pathways involved in the aging process.

That is how they identified a strikingly unique “signature of aging” that exists solely in the choroid plexus – not in the other organs. They discovered that one of the main elements of this signature was interferon beta – a protein that the body normally produces to fight viral infection. This protein appears to have a negative effect on the brain: When the researchers injected an antibody that blocks interferon beta activity into the cerebrospinal fluid of the older mice, their cognitive abilities were restored, as was their ability to form new brain cells. The scientists were also able to identify this unique signature in elderly human brains. The scientists hope that this finding may, in the future, help prevent or reverse cognitive decline in old age, by finding ways to rejuvenate the “immunological age” of the brain.
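The screening logic described above – flagging genes whose expression changes with age in the choroid plexus but in no other organ – can be sketched in code. This is an illustrative toy, not the authors' sequencing pipeline; the gene names, expression values, and fold-change threshold are all made-up assumptions.

```python
# Toy sketch of an organ-specific "signature of aging" screen.
# young/aged map each organ to per-gene expression levels (illustrative data).

def aging_signature(young, aged, target="choroid_plexus", min_fold=2.0):
    """Return genes up-regulated >= min_fold with age in `target` only."""
    signature = []
    for gene in young[target]:
        fold = aged[target][gene] / young[target][gene]
        if fold < min_fold:
            continue
        # Require that no other organ shows the same age-related change.
        elsewhere = any(
            aged[organ][gene] / young[organ][gene] >= min_fold
            for organ in young if organ != target
        )
        if not elsewhere:
            signature.append(gene)
    return signature

young = {
    "choroid_plexus": {"Ifnb1": 1.0, "Actb": 5.0},
    "liver":          {"Ifnb1": 1.0, "Actb": 5.0},
}
aged = {
    "choroid_plexus": {"Ifnb1": 4.0, "Actb": 5.2},  # interferon beta rises here...
    "liver":          {"Ifnb1": 1.1, "Actb": 5.1},  # ...but not elsewhere
}
print(aging_signature(young, aged))  # ['Ifnb1']
```

With this filter, interferon beta (Ifnb1 in the toy data) survives because its age-related increase is unique to the choroid plexus, matching the signature the study reports.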

(Source: wis-wander.weizmann.ac.il)

Filed under aging cognitive decline brain function blood-brain barrier choroid plexus gene expression neuroscience science

107 notes

Research mimics brain cells to boost memory power

RMIT University researchers have brought ultra-fast, nano-scale data storage within striking reach, using technology that mimics the human brain.


The researchers have built a novel nano-structure that offers a new platform for the development of highly stable and reliable nanoscale memory devices. 

The pioneering work will feature on a forthcoming cover of prestigious materials science journal Advanced Functional Materials (11 November). 

Project leader Dr Sharath Sriram, co-leader of the RMIT Functional Materials and Microsystems Research Group, said the nanometer-thin stacked structure was created using a thin film of functional oxide material more than 10,000 times thinner than a human hair.

“The thin film is specifically designed to have defects in its chemistry to demonstrate a ‘memristive’ effect – where the memory element’s behaviour is dependent on its past experiences,” Dr Sriram said.

“With flash memory rapidly approaching fundamental scaling limits, we need novel materials and architectures for creating the next generation of non-volatile memory. 

“The structure we developed could be used for a range of electronic applications – from ultrafast memory devices that can be shrunk down to a few nanometers, to computer logic architectures that replicate the versatility and response time of a biological neural network.

“While more investigation needs to be done, our work advances the search for next-generation memory technology that can replicate the complex functions of the human neural system – bringing us one step closer to the bionic brain.”

The research relies on memristors, touted as a transformational replacement for current memory and storage technologies such as flash, SSD and DRAM. Memristors have the potential to be fashioned into non-volatile solid-state memory and offer building blocks for computing that could be trained to mimic synaptic interfaces in the human brain.
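The defining “memristive” property quoted above – behaviour that depends on the element's past experiences – can be illustrated with a minimal state model. This is a conceptual sketch, assuming a simple linear drift of conductance with applied voltage; the parameters are illustrative, not measurements from the RMIT device.

```python
# Minimal sketch of a memristive element: its conductance (the stored
# state) is driven by the history of applied voltage and persists when
# the voltage is removed, which is what makes the memory non-volatile.

class Memristor:
    def __init__(self, g_min=0.1, g_max=1.0, rate=0.05):
        self.g = g_min                      # conductance = stored state
        self.g_min, self.g_max, self.rate = g_min, g_max, rate

    def apply(self, voltage, dt=1.0):
        # Positive bias raises conductance, negative bias lowers it;
        # the state is clamped between physical limits.
        self.g += self.rate * voltage * dt
        self.g = max(self.g_min, min(self.g_max, self.g))
        return self.g

m = Memristor()
for _ in range(10):       # a train of identical "write" pulses...
    m.apply(+1.0)
high = m.g                # ...drives the device to a higher-conductance state
m.apply(0.0)              # zero bias: the state is retained (memory)
assert m.g == high
```

Reading the device then amounts to measuring its conductance, and a negative pulse train would erase the stored state, loosely mirroring how such elements could serve as non-volatile memory cells or synapse-like weights.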

(Source: alphagalileo.org)

Filed under memristor memory perovskite oxide brain cells technology neuroscience science

154 notes

From Rats to Humans: Project NEUWalk Closer to Clinical Trials

EPFL scientists have discovered how to control the limbs of a completely paralyzed rat in real time to help it walk again. Their results are published today in Science Translational Medicine.

Building on earlier work in rats, this new breakthrough is part of a more general therapy that could one day be implemented in rehabilitation programs for people with spinal cord injury, currently being developed in a European project called NEUWalk. Clinical trials could start as early as next summer using the new Gait Platform, built with the support of the Valais canton and the SUVA, and now assembled at the CHUV (Lausanne University Hospital).

How it works

The human body needs electricity to function. The electrical output of the human brain, for instance, is about 30 watts. When the circuitry of the nervous system is damaged, the transmission of electrical signals is impaired, often leading to devastating neurological disorders like paralysis.

Electrical stimulation of the nervous system is known to help relieve these neurological disorders at many levels. Deep brain stimulation is used to treat tremors related to Parkinson’s disease, for example. Electrical signals can be engineered to stimulate nerves to restore a sense of touch in the missing limb of amputees. And electrical stimulation of the spinal cord can restore movement control in spinal cord injury.

But can electrical signals be engineered to help a paraplegic walk naturally? The answer is yes, for rats at least.

“We have complete control of the rat’s hind legs,” says EPFL neuroscientist Grégoire Courtine. “The rat has no voluntary control of its limbs, but the severed spinal cord can be reactivated and stimulated to perform natural walking. We can control in real-time how the rat moves forward and how high it lifts its legs.”

The scientists studied rats whose spinal cords were completely severed in the middle-back, so signals from the brain were unable to reach the lower spinal cord. That’s where flexible electrodes were surgically implanted. Sending electric current through the electrodes stimulated the spinal cord.

They realized that there was a direct relationship between how high the rat lifted its limbs and the frequency of the electrical stimulation. Based on this and careful monitoring of the rat’s walking patterns – its gait – the researchers specially designed the electrical stimulation to adapt the rat’s stride in anticipation of upcoming obstacles, like barriers or stairs.
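The control idea in the paragraph above – leg-lift height scales with stimulation frequency, so the stride can be adapted one step ahead of an obstacle – can be sketched as a simple mapping. This is a hedged illustration of the concept only, not EPFL's actual controller; the linear gain, base frequency, and ceiling are invented assumptions.

```python
# Toy closed-loop idea: choose a spinal stimulation frequency from the
# step height the next obstacle requires (all parameters illustrative).

def stimulation_frequency(step_height_cm, base_hz=40.0,
                          gain_hz_per_cm=5.0, max_hz=100.0):
    """Map a desired step height to a stimulation frequency (Hz)."""
    return min(max_hz, base_hz + gain_hz_per_cm * step_height_cm)

def plan_gait(obstacle_heights_cm):
    """Adapt the stride in anticipation of each upcoming obstacle."""
    return [stimulation_frequency(h) for h in obstacle_heights_cm]

# Flat ground, then a low barrier, then a stair:
print(plan_gait([0.0, 2.0, 5.0]))  # [40.0, 50.0, 65.0]
```

In the real system this mapping would be driven by the gait-monitoring cameras described later, closing the loop between observed movement and the stimulation delivered through the implanted electrodes.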

“Simple scientific discoveries about how the nervous system works can be exploited to develop more effective neuroprosthetic technologies,” says co-author and neuroengineer Silvestro Micera. “We believe that this technology could one day significantly improve the quality of life of people confronted with neurological disorders.”

Taking this idea a step further, Courtine and Micera together with colleagues from EPFL’s Center for Neuroprosthetics are also exploring the possibility of decoding signals directly from the brain about leg movement and using this information to stimulate the spinal cord.

Towards clinical trials using the Gait Platform at the CHUV

The electrical stimulation reported in this study will be tested in patients with incomplete spinal cord injury in a clinical study that may start as early as next summer, using a new Gait Platform that brings together innovative monitoring and rehabilitation technology.

Designed by Courtine’s team, the Gait Platform consists of custom-made equipment like a treadmill and an overground support system, as well as 14 infrared cameras that detect reflective markers on the patient’s body and two video cameras, all of which generate extensive amounts of information about leg and body movement. This information can be fully synchronized for complete monitoring and fine-tuning of the equipment in order to achieve intelligent assistance and adaptive electrical spinal cord stimulation of the patient.

The Gait Platform is housed in a 100 square meter room provided by the CHUV. The hospital already has a rehabilitation center dedicated to translational research, notably for orthopedic and neurological pathologies.

“The Gait Platform is not a rehabilitation center,” says Courtine. “It is a research laboratory where we will be able to study and develop new therapies using very specialized technology in close collaboration with medical experts here at the CHUV, like physiotherapists and doctors.”

Filed under spinal cord spinal cord injury NEUWalk paralysis electrical stimulation neuroscience science

52 notes

Study Identifies Unexpected Clue to Peripheral Neuropathies

New research shows that disrupting the molecular function of a tumor suppressor causes improper formation of a protective insulating sheath on peripheral nerves – leading to neuropathy and muscle wasting in mice similar to that in human diabetes and neurodegeneration.

Scientists from Cincinnati Children’s Hospital Medical Center report their findings online Sept. 26 in Nature Communications. The study suggests that normal molecular function of the tumor suppressor gene Lkb1 is essential to an important metabolic transition in cells as peripheral nerve fibers (axons) are coated with the protective myelin sheath by Schwann glial cells.

“This study is just the tip of the iceberg and a fundamental discovery because of the unexpected finding that a well-known tumor suppressor gene has a novel and important role in myelinating glial cells,” said Biplab Dasgupta, PhD, principal investigator and a researcher at the Cincinnati Children’s Cancer and Blood Diseases Institute (CBDI). “Additional study is needed, as the function of Lkb1 may have broader implications – not only in normal development, but also in metabolic reprogramming in human pathologies. This includes functional regeneration of axons after injury and demyelinating neuropathies.”

The process of myelin sheath formation (called myelination) requires extraordinarily high levels of lipid (fat) synthesis because most of myelin is composed of lipids, according to Dasgupta. Lipids are made from citric acid, which is produced in the mitochondria, the powerhouse of the cell. Success of this sheathing process depends on the cells shifting from glycolytic to mitochondrial oxidative metabolism, which generates citric acid, the authors report.

Dasgupta’s research team used Lkb1 mutant mice in the current study. Because the mice did not express Lkb1 in myelin-forming glial cells, scientists could analyze the gene’s role in glial cell metabolism and in formation of the myelin sheath.

When the function of Lkb1 was disrupted in laboratory mice, it blocked the metabolic shift from glycolytic to mitochondrial metabolism, resulting in a thinner myelin sheath (hypomyelination) of the nerves. This caused muscle atrophy, hind limb dysfunction, peripheral neuropathy and even premature death of these mice, according to the authors.

Peripheral neuropathy involves damage to the peripheral nervous system – which transmits information from the brain and spinal cord (the central nervous system) to other parts of the body, according to the National Institute of Neurological Disorders and Stroke (NINDS). There are more than 100 types of peripheral neuropathy, and damage to the peripheral nervous system interferes with crucial messages from the brain to the rest of the body.

The scientists also reported that reducing Lkb1 in Schwann cells decreased the activity of citrate synthase, the critical metabolic enzyme that makes citric acid. Enhancing Lkb1 increased this activity.

They tested the effect of boosting citric acid levels in the Lkb1 mutant Schwann cells. This enhanced lipid production and partially reversed the myelin sheath formation defects. Dasgupta said this further underscores the importance of Lkb1 and of citric acid production.

Dasgupta and his colleagues are currently testing whether increasing the fat content in the diet of Lkb1 mutant mice improves the hypomyelination defects. The researchers emphasized the importance of additional research into the laboratory findings to extend their relevance more directly to human disease.

(Source: cincinnatichildrens.org)

Filed under Lkb1 myelination glial cells mitochondria neuropathy neuroscience science

83 notes

Brain chemical potential new hope in controlling Tourette Syndrome tics

A chemical in the brain plays a vital role in controlling the involuntary movements and vocal tics associated with Tourette Syndrome (TS), a new study has shown.


The research by psychologists at The University of Nottingham, published in the latest edition of the journal Current Biology, could offer a potential new target for the development of more effective treatments to suppress these unwanted symptoms.

The study, led by PhD student Amelia Draper under the supervision of Professor Stephen Jackson, found that higher levels of a neurochemical called GABA in a part of the brain known as the supplementary motor area (SMA) helps to dampen down hyperactivity in the cortical areas that produce movement.

By reducing this hyperactivity, only the strongest signals would get through and produce a movement.

Greater control

Amelia said: “This result is significant because new brain stimulation techniques can be used to increase or decrease GABA in targeted areas of the cortex. It may be possible that such techniques to adjust the levels of GABA in the SMA could help young people with TS gain greater control over their tics.”

Tourette Syndrome is a developmental disorder associated with these involuntary and repetitive vocal and movement tics. Although the exact cause of TS is unknown, research has shown that people with TS have alterations in the brain ‘circuitry’ involved in producing and controlling motor functions.

Both the primary motor cortex (M1) and the supplementary motor area (SMA) are thought to be hyperactive in the brains of those with TS, causing the tics which can be both embarrassing and disruptive, especially for children who often find it difficult to concentrate at school.

Many people with TS can partially control their tics, but this often takes enormous mental energy, can leave them exhausted by the end of the day, and can make the tics more frequent and excessive when they finally ‘relax’. Most people diagnosed with TS in childhood gradually gain control over their tics until only mild symptoms remain by early adulthood, but by then their education and social friendships may already have been disrupted.

Greater detail

The scientists used a technique called magnetic resonance spectroscopy (MRS) in a 7 Tesla Magnetic Resonance Imaging (MRI) scanner to measure the concentration of certain chemicals in the brain known as neurotransmitters which offer an indication of brain activity.

The chemicals were measured in the M1, the SMA and an area involved in visual processing (V1) which was used as a control (comparison) site. They tested a group of young people with TS and a matched group of typical young people with no known disorders.

They discovered that the people with TS had higher concentrations of GABA, which inhibits neuronal activity, in the SMA.
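The group comparison described above – GABA concentrations measured by MRS in each region for a TS group and matched controls, with V1 as the comparison site – can be sketched as follows. The concentration values and the 10% threshold are invented for illustration; this is not the study's data or its statistical analysis.

```python
# Illustrative sketch: flag brain regions where mean GABA concentration
# is substantially higher in the TS group than in matched controls.
from statistics import mean

ts      = {"SMA": [2.8, 3.0, 3.1, 2.9],   # TS group (made-up values)
           "M1":  [2.1, 2.0, 2.2, 2.1],
           "V1":  [1.9, 2.0, 2.0, 1.9]}   # V1 serves as the control site
control = {"SMA": [2.2, 2.1, 2.3, 2.2],   # matched typical group
           "M1":  [2.1, 2.2, 2.0, 2.1],
           "V1":  [2.0, 1.9, 2.0, 2.0]}

# A region counts as "elevated" if the TS mean exceeds the control
# mean by more than 10% (an arbitrary illustrative criterion).
elevated = [region for region in ts
            if mean(ts[region]) > mean(control[region]) * 1.1]
print(elevated)  # ['SMA']
```

A real analysis would of course use a proper statistical test rather than a fixed percentage cut-off, but the shape of the comparison – per-region group means, with V1 expected to show no difference – is the same.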

They used other neuroscience techniques to explore the result in greater detail, finding that having more GABA in the SMA meant that the people with Tourette Syndrome had less activity in the SMA when asked to perform a simple motor task, in this case tapping their finger, which they were able to measure using functional MRI.

Using another technique called transcranial magnetic stimulation (TMS), in which a magnetic field is passed over the brain to stimulate neuron activity, they found that those with the most GABA dampened activity in the M1 when preparing to make a movement. In contrast, the typically developing group increased their activity during movement preparation.

Paradoxical finding

Finally, they considered how GABA was related to brain structure, specifically the white matter fibre bundles that connect the two hemispheres of the brain, a structure called the corpus callosum. They discovered that those with the highest levels of GABA also had the most connecting fibres, and concluded that more connecting fibres produce more excitatory signals, which in turn demand even more GABA to calm the excess hyperactivity.

The results could lead the way to more targeted approaches to controlling tics. New brain stimulation techniques such as transcranial direct-current stimulation (tDCS), a form of neurostimulation that delivers a constant, low-level electrical current directly to the brain via electrodes, have already been shown to increase or decrease GABA in targeted areas of the cortex.

Professor Stephen Jackson added: “This finding is paradoxical because prior to our finding, most scientists working on this topic would have thought that GABA levels in TS would be reduced and not increased as we show. This is because a distinction should be made between brain changes that are causes of the disorder (e.g., reduced GABA cells in some key brain areas) and secondary consequences of the disorder (e.g., increased release of GABA in key brain areas) that act to reduce the effects of the disorder.”

New tDCS devices, similar to commercially available TENS machines, could potentially be produced for young people with TS to ‘train’ their brains and help them gain control over their tics. Such devices could be relatively cheap and could be used in the home while performing other tasks such as watching television.

(Source: nottingham.ac.uk)

Filed under tourette syndrome supplementary motor area GABA motor cortex neuroimaging brain activity neuroscience science

93 notes

Device lets docs stay ‘tuned in’ to brain bloodflow

For Dr. John Murkin, the medical device business is all about “making a better mouse trap.”

The Schulich School of Medicine & Dentistry professor is part of a team of Western and Lawson Health Research Institute (LHRI) researchers studying a new technology that may change the way patients undergoing cardiac surgery are monitored and managed in the hospital.

The device, known as CerOx, non-invasively monitors cerebral blood flow and helps physicians and nurses assess brain perfusion in real time. Murkin, who has been involved in the machine’s development, said this information could be used to support critical treatment decisions made to protect the patient from potential complications.

“We use near-infrared light routinely in all hospitals to measure oxygen saturation in the brain. That’s been out for 30 years,” Murkin said. “This new device is not just measuring oxygen saturation; it’s also measuring blood flow to the brain, in real time, and non-invasively.

“If a patient has a brain injury, the more you know about the brain, the better you are at being tuned into their needs.”

In cardiac surgery, cerebral monitoring significantly reduces complications, including permanent stroke.

An anesthetist at London Health Sciences Centre and a researcher at LHRI, Murkin has studied cognitive and neurological outcomes in cardiac surgery for more than three decades. He said there has been an unmet clinical need for a noninvasive tool that provides accurate, real-time measurements of cerebral blood flow in these highly vulnerable patients.

To date, 11 studies have evaluated CerOx across a range of applications.

“We’ve seen the potential of the machine and we’re convinced it works,” he added. “If you don’t know what’s going on in the brain, you can’t help. But when you start to monitor this, and you see changes in blood flow or oxygen saturation, and it’s because of the blood pressure, or hemoglobin, or whatever it is, if you pick things up early enough, you can hopefully avoid any possible complications.

“If you can monitor in real time, you can act in real time.”
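As a purely illustrative sketch of the kind of early-warning logic real-time monitoring makes possible (this is not the CerOx device’s actual algorithm; the function, values, and threshold are all invented), a simple check might flag readings that dip well below a patient’s baseline:

```python
# Illustrative only: flag cerebral blood-flow readings that fall more than
# a set fraction below baseline. Names, units, and thresholds are
# hypothetical, not part of any real device or its software.

def flag_low_flow(readings, baseline, drop_fraction=0.2):
    """Return indices of readings more than drop_fraction below baseline."""
    threshold = baseline * (1 - drop_fraction)
    return [i for i, value in enumerate(readings) if value < threshold]

# A transient dip below 80% of baseline is flagged as it happens,
# giving clinicians a chance to intervene early.
stream = [50, 49, 48, 38, 37, 47, 50]  # arbitrary units
print(flag_low_flow(stream, baseline=50))  # prints [3, 4]
```

The point of the sketch is the one Murkin makes: a continuous stream of measurements lets a simple rule catch a deviation the moment it appears, rather than after the fact.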

The device is expected to be used primarily by physicians in neuro-critical care areas.

“While the device can alert you to potential problems, the next part is what are you going to do about it? You still need to act,” he said. “We want to start looking at what are some of the therapeutic interventions we can use to improve outcomes.”

CerOx was developed by U.S.- and Israel-based Ornim Medical; Murkin is a member of the company’s scientific advisory board.

Filed under CerOx cerebral blood flow oxygen saturation medicine science

191 notes

Turmeric compound boosts regeneration of brain stem cells

A bioactive compound found in turmeric promotes stem cell proliferation and differentiation in the brain, reveals new research published today in the open access journal Stem Cell Research & Therapy. The findings suggest aromatic turmerone could be a future drug candidate for treating neurological disorders, such as stroke and Alzheimer’s disease.

The study looked at the effects of aromatic (ar-) turmerone on endogenous neural stem cells (NSCs), which are stem cells found within adult brains. NSCs differentiate into neurons and play an important role in self-repair and recovery of brain function in neurodegenerative diseases. Previous studies of ar-turmerone have shown that the compound can block activation of microglia. When activated, these cells cause neuroinflammation, which is associated with various neurological disorders. However, ar-turmerone’s impact on the brain’s capacity to self-repair was unknown.

Researchers from the Institute of Neuroscience and Medicine in Jülich, Germany, studied the effects of ar-turmerone on NSC proliferation and differentiation both in vitro and in vivo. Rat fetal NSCs were cultured and grown in six different concentrations of ar-turmerone over a 72-hour period. At certain concentrations, ar-turmerone was shown to increase NSC proliferation by up to 80%, without any effect on cell death. The cell differentiation process also accelerated in ar-turmerone-treated cells compared to untreated control cells.
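A proliferation increase of “up to 80%” is simply the change in cell count expressed relative to untreated controls. As a minimal sketch with invented numbers (these are not the study’s data):

```python
# Illustrative only: percent increase in proliferation relative to
# untreated controls. The cell counts below are made up.

def percent_increase(treated, control):
    """Percent change in cell count relative to the untreated control."""
    return (treated - control) / control * 100

print(percent_increase(treated=1800, control=1000))  # prints 80.0
```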

To test the effects of ar-turmerone on NSCs in vivo, the researchers injected adult rats with ar-turmerone. Using PET imaging and a tracer to detect proliferating cells, they found that both the subventricular zone (SVZ) and the hippocampus were expanded in the brains of rats injected with ar-turmerone compared with control animals. The SVZ and hippocampus are the two sites in adult mammalian brains where neurogenesis, the growth of neurons, is known to occur.

Lead author of the study, Adele Rueger, said: “While several substances have been described to promote stem cell proliferation in the brain, fewer drugs additionally promote the differentiation of stem cells into neurons, which constitutes a major goal in regenerative medicine. Our findings on aromatic turmerone take us one step closer to achieving this goal.”

Ar-turmerone is the lesser-studied of two major bioactive compounds found in turmeric. The other compound is curcumin, which is well known for its anti-inflammatory and neuroprotective properties.

Filed under microglia cells stem cells neurodegenerative diseases curcumin turmeric neuroscience science

157 notes

Protein pairing builds brain networks

Neural networks are formed by the interconnection of specific neurons in the brain. The molecular mechanisms involved in creating these connections, however, have so far eluded scientists. Research led by Jun Aruga from the RIKEN Brain Science Institute has now identified an interaction between two proteins that is crucial for making connections between specific types of neurons, with implications for some neurological disorders.

Connections between neurons are made via synapses—small gaps across which chemicals called neurotransmitters pass, relaying signals from a presynaptic neuron to a postsynaptic neuron. Aruga and his colleagues focused on a protein called mGluR7, which is found only at synapses with a specific type of postsynaptic neuron in an area of the brain involved in forming memories.

“mGluR7 is located on the presynaptic side of connections made with hippocampal local inhibitory neurons,” explains Aruga. “Previous studies have proposed that this protein prevents neurotransmitter release from the presynaptic neuron when the neurotransmitter glutamate binds to it.”

The researchers discovered that the localization of mGluR7 to specific synapses is determined by the presence of another protein called Elfn1. This protein is found on the other side of the same synapses, directly opposite mGluR7. When the researchers artificially introduced Elfn1 into cultured cells, mGluR7 became associated with the same cells, and they showed that this was due to a physical interaction between the two proteins. Conversely, deleting Elfn1 in the brains of mice reduced the amount of mGluR7 at the synapses.

These changes interfered with the process of strengthening connections at synapses, which takes place during memory formation, and caused patterns of brain waves that indicated abnormally high levels of electrical activity. Genetically altered mice also exhibited other symptoms that resembled human conditions.

“Deleting Elfn1 increased the susceptibility of mice to seizures,” explains Aruga. “It also enhanced behaviors similar to attention deficit hyperactivity disorder (ADHD).”

Indeed, the researchers found that humans with epilepsy and ADHD also had a faulty version of the gene encoding Elfn1, suggesting that a deficit in the ability of Elfn1 to localize mGluR7 and form specific connections in neural networks is important in some neurological conditions.

“In combination, the human and mouse results implicate the Elfn1–mGluR7 complex in the pathophysiology of epilepsy and ADHD, at least in part,” explains Aruga, although he remains cautious at this early stage of research. “Because of sample size limitations, the human genetics result is not conclusive, but we are now awaiting the results of follow-up studies with additional subjects.”

Filed under mGluR7 Elfn1 interneurons synapses epilepsy ADHD neuroscience science

164 notes

How plankton gets jet lagged

A hormone that governs sleep and jet lag in humans may also drive the mass migration of plankton in the ocean, scientists at the European Molecular Biology Laboratory (EMBL) in Heidelberg, Germany, have found. The molecule in question, melatonin, is essential to maintain our daily rhythm, and the European scientists have now discovered that it governs the nightly migration of a plankton species from the surface to deeper waters. The findings, published online today in Cell, indicate that melatonin’s role in controlling daily rhythms probably evolved early in the history of animals, and hold hints to how our sleep patterns may have evolved.

In vertebrates, melatonin is known to play a key role in controlling daily activity patterns – patterns which get thrown out of synch when we fly across time zones, leading to jet lag. But virtually all animals have melatonin. What is its role in other species, and how did it evolve the task of promoting sleep? To find out, Detlev Arendt’s lab at EMBL turned to the marine ragworm Platynereis dumerilii. This worm’s larvae take part in what has been described as the planet’s biggest migration, in terms of biomass: the daily vertical movement of plankton in the ocean. By beating a set of microscopic ‘flippers’ – cilia – arranged in a belt around its midline, the worm larvae are able to migrate toward the sea’s surface every day. They reach the surface at dusk, and then throughout the night they settle back down to deeper waters, where they are sheltered from damaging UV rays at the height of day. 

“We found that a group of multitasking cells in the brains of these larvae that sense light also run an internal clock and make melatonin at night,” says Detlev Arendt, who led the research. “So we think that melatonin is the message these cells produce at night to regulate the activity of other neurons that ultimately drive day-night rhythmic behaviour.”

Maria Antonietta Tosches, a postdoc in Arendt’s lab, discovered a group of specialised motor neurons that respond to melatonin. Using modern molecular sensors, she was able to visualise the activity of these neurons in the larva’s brain, and found that it changes radically from day to night. The night-time production of melatonin drives changes in these neurons’ activity, which in turn cause the larva’s cilia to take long pauses from beating. Thanks to these extended pauses, the larva slowly sinks down. During the day, no melatonin is produced, the cilia pause less, and the larva swims upwards.
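The mechanism described above can be caricatured as a simple day-night model: melatonin at night lengthens the ciliary pauses so the larva sinks, while daytime beating drives it upward. All rates in this sketch are invented for illustration and are not measurements from the study:

```python
# Toy model of the described mechanism: melatonin on -> cilia pause more,
# larva sinks; melatonin off -> cilia beat, larva swims upward.
# Rates are arbitrary illustration values, not data from the paper.

def vertical_drift(melatonin_states, rise_per_hour=1.0, sink_per_hour=0.5):
    """Net vertical displacement (positive = upward) over hourly states."""
    position = 0.0
    for melatonin_on in melatonin_states:
        if melatonin_on:
            position -= sink_per_hour   # night: long ciliary pauses, slow sinking
        else:
            position += rise_per_hour   # day: steady beating, upward swimming
    return position

day = [False] * 12    # 12 daylight hours, no melatonin
night = [True] * 12   # 12 night hours, melatonin produced
print(vertical_drift(day), vertical_drift(night))  # prints 12.0 -6.0
```

Exposing larvae to melatonin during the day is, in this caricature, equivalent to setting the night-time state during daylight hours, which flips the direction of drift just as the researchers observed.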

“When we exposed the larvae to melatonin during the day, they switched towards night-time behaviour,” says Tosches. “It’s as if they were jet-lagged.”

The work strongly suggests that the light-sensing, melatonin-producing cells at the heart of this larva’s nightly migration have evolutionary relatives in the human brain. This implies that the cells that control our rhythms of sleep and wakefulness may have first evolved in the ocean, hundreds of millions of years ago, in response to pressure to move away from the sun.

“Step by step we can elucidate the evolutionary origin of key functions of our brain. The fascinating picture emerges that human biology finds its roots in some deeply conserved, fundamental aspects of ocean ecology that dominated life on Earth since ancient evolutionary times,” Arendt concludes.

Filed under melatonin jet lag circadian clock opsins plankton motor neurons neuroscience science
