Neuroscience


LCSB discovers endogenous antibiotic in the brain

Scientists from the Luxembourg Centre for Systems Biomedicine (LCSB) of the University of Luxembourg have discovered that immune cells in the brain can produce a substance that prevents bacterial growth: namely itaconic acid.

Until now, biologists had assumed that only certain fungi produced itaconic acid. A team working with Dr. Karsten Hiller, head of the Metabolomics Group at LCSB and funded by the ATTRACT program of Luxembourg’s National Research Fund, and Dr. Alessandro Michelucci has now shown that so-called microglial cells in mammals are also capable of producing this acid. “This is a groundbreaking result,” says Prof. Dr. Rudi Balling, director of LCSB: “It is the first proof of an endogenous antibiotic in the brain.” The researchers have now published their results in the prestigious scientific journal PNAS.

Alessandro Michelucci is a cell biologist with a focus on neuroscience. This is an ideal combination for LCSB, with its focus on neurodegenerative diseases – especially Parkinson’s disease – i.e. changes in the cells of the human nervous system. “Little is still known about the immune responses of the brain,” says Michelucci. “However, because we suspect there are connections between the immune system and Parkinson’s disease, we want to find out what happens in the brain when we trigger an immune response there.” For this purpose, Michelucci brought cell cultures of microglial cells, the immune cells of the brain, into contact with specific constituents of bacterial membranes. The microglial cells responded by producing a cocktail of metabolic products.

This cocktail was subsequently analysed by Karsten Hiller’s metabolomics group. Upon closer examination, the scientists discovered that production of one substance in particular – itaconic acid – was upregulated. “Itaconic acid plays a central role in plastics production. Industrial bioreactors use fungi to mass-produce it,” says Hiller. “The realisation that mammalian cells synthesise itaconic acid came as a major surprise.”

However, it was not known how mammalian cells can synthesise this compound. By comparing the fungal enzyme’s sequence to human protein sequences, Karsten Hiller then identified a human gene that encodes a protein similar to the one in fungi: immunoresponsive gene 1, or IRG1 for short – a most exciting discovery, as the function of this gene was not known. Says Hiller: “When it comes to IRG1, there is a lot of uncharted territory. What we did know is that it seems to play some role in the big picture of the immune response, but what exactly that role was, we were not sure.”
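The sequence-comparison step can be illustrated with a toy sketch. Everything below is invented for illustration: real gene hunts use tools such as BLAST, with substitution matrices and statistical significance scores, run against full protein databases. Here a crude sliding-window identity score simply flags the made-up candidate that shares a motif with the "fungal" fragment.

```python
# Illustrative sketch only: a crude sliding-window identity score between a
# "fungal enzyme" fragment and candidate human protein fragments. All
# sequences below are invented; real comparisons use BLAST-style alignment.

def window_identity(query: str, target: str) -> float:
    """Best fraction of identical residues over any alignment offset."""
    best = 0.0
    n = min(len(query), len(target))
    for offset in range(len(target) - n + 1):
        matches = sum(q == t for q, t in zip(query, target[offset:offset + n]))
        best = max(best, matches / n)
    return best

fungal_fragment = "MKVLATGGSA"        # hypothetical enzyme motif (made up)
human_candidates = {
    "IRG1_like": "AMKVLATGGSAQ",     # shares the motif -> high score
    "unrelated": "PQRSTWYHDEKC",     # no similarity -> low score
}

scores = {name: window_identity(fungal_fragment, seq)
          for name, seq in human_candidates.items()}
best_hit = max(scores, key=scores.get)
```

Under this toy scoring, the motif-sharing candidate scores a perfect 1.0 while the unrelated one scores near zero; a real screen would rank thousands of proteins this way, with proper statistics.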

To change this situation, the team turned off IRG1 in cell cultures and instead added the gene to cells that normally do not express it. The experiments confirmed that in mammals, IRG1 codes for an itaconic acid-producing enzyme. But why? When immune cells like macrophages and microglial cells take up bacteria in order to inactivate them, the intruders are actually able to survive by using a special metabolic pathway called the glyoxylate shunt. According to Hiller, “macrophages produce itaconic acid in an effort to foil this bacterial survival strategy. The acid blocks the first enzyme in the glyoxylate pathway, which is how macrophages partially inhibit bacterial growth in order to support the innate immune response and digest the bacteria they have taken up.”

LCSB director Prof. Dr. Rudi Balling describes the possibilities that these insights offer: “Parkinson’s disease is highly complex and has many causes. We now intend to study the importance of infections of the nervous system in this respect – and whether itaconic acid can play a role in diagnosing and treating Parkinson’s disease.”

May 7, 2013 · 63 notes
#itaconic acid #microglial cells #immune cells #neurodegenerative diseases #neuroscience #science
Effects of stress on brain cells offer clues to new anti-depressant drugs

Research from King’s College London reveals the detailed mechanism behind how stress hormones reduce the number of new brain cells - a process considered to be linked to depression. 


The researchers identified a key protein responsible for the long-term detrimental effect of stress on cells, and importantly, successfully used a drug compound to block this effect, offering a potential new avenue for drug discovery.

The study, published in Proceedings of the National Academy of Sciences (PNAS) was co-funded by the National Institute for Health Research Biomedical Research Centre (NIHR BRC) for Mental Health at the South London and Maudsley NHS Foundation Trust and King’s College London.

Depression affects approximately 1 in 5 people in the UK at some point in their lives. The World Health Organisation estimates that by 2030, depression will be the leading cause of the global burden of disease. Treatment for depression involves medication or talking therapy, usually in combination. Current antidepressant medication is successful in treating depression in about 50-65% of cases, highlighting the need for new, more effective treatments.

Depression and successful antidepressant treatment are associated with changes in a process called “neurogenesis” – the ability of the adult brain to continue to produce new brain cells. At a molecular level, stress is known to increase levels of cortisol (a stress hormone), which in turn acts on a receptor called the glucocorticoid receptor (GR). However, the exact mechanism by which the GR decreases neurogenesis in the brain has remained unclear.

Professor Carmine Pariante, from King’s College London’s Institute of Psychiatry and lead author of the paper, says: “With as much as half of all depressed patients failing to improve with currently available medications, developing new, more effective antidepressants is an important priority. In order to do this, we need to understand the abnormal mechanisms that we can target. Our study shows the importance of conducting research on cellular models, animal models and clinical samples, all under one roof in order to better facilitate the translation of laboratory findings to patient benefit.”

In this study, the multidisciplinary team of researchers studied cellular and animal models before confirming their findings in human blood samples. First, the researchers studied human hippocampal stem cells, which are the source of new cells in the human brain. They gave the cells cortisol to measure the effect on neurogenesis and found that a protein called SGK1 was important in mediating the effects of stress hormones on neurogenesis and on the activity of the GR.

By measuring the effect of cortisol over time, they found that increased levels of SGK1 prolong the detrimental effects of stress hormones on neurogenesis. Specifically, SGK1 enhances and maintains the long-term effect of stress hormones by keeping the GR active even after cortisol has been washed out of the cells.

Next, the researchers used a pharmacological compound (GSK650394) known to inhibit SGK1, and found they were able to block the detrimental effects of stress hormones and ultimately increase the number of new brain cells.

Finally, the research team were able to confirm these findings by studying levels of SGK1 in animal models and human blood samples of 25 drug-free depressed patients.

Dr Christoph Anacker, from King’s College London’s Institute of Psychiatry and first author of the paper, says: “Because a reduction of neurogenesis is considered part of the process leading to depression, targeting the molecular pathways that regulate this process may be a promising therapeutic strategy. This novel mechanism may be particularly important for the effects of chronic stress on mood, and ultimately depressive symptoms. Pharmacological interventions aimed at reducing the levels of SGK1 in depressed patients may therefore be a potential strategy for future antidepressant treatments.”

May 7, 2013 · 102 notes
#stress hormones #brain cells #depression #antidepressant medication #neuroscience #science
Study examines cognitive impairment in families with exceptional longevity

A study by Stephanie Cosentino, Ph.D., of Columbia University, New York, and colleagues examines the relationship between families with exceptional longevity and cognitive impairment consistent with Alzheimer disease.

The cross-sectional study included a total of 1,870 individuals (1,510 family members and 360 spouse controls) recruited through the Long Life Family Study. The main outcome measure was the prevalence of cognitive impairment based on a diagnostic algorithm validated using the National Alzheimer’s Coordinating Center data set.

According to study results, the cognitive algorithm classified 546 individuals (38.5 percent) as having cognitive impairment consistent with Alzheimer disease. Long Life Family Study probands had a slight but not statistically significant reduction in risk of cognitive impairment compared with spouse controls (121 of 232 for probands versus 45 of 103 for spouse controls), whereas Long Life Family Study sons and daughters had a reduced risk of cognitive impairment (11 of 213 for sons and daughters versus 28 of 216 for spouse controls). Restricting the offspring generation to nieces and nephews attenuated this association (37 of 328 for nieces and nephews versus 28 of 216 for spouse controls).
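The raw counts quoted above can be turned into crude, unadjusted risk ratios. This is a back-of-envelope check only: the study itself used age-adjusted models, so raw ratios need not match the adjusted findings (indeed, the raw proband ratio comes out slightly above 1, one reason such adjustment matters).

```python
# Unadjusted risk ratios computed directly from the counts in the paragraph
# above; the study's conclusions rest on age-adjusted analyses, so these
# rough ratios are only a sanity check on the reported proportions.

def risk_ratio(cases_a: int, total_a: int, cases_b: int, total_b: int) -> float:
    return (cases_a / total_a) / (cases_b / total_b)

# probands vs. spouse controls: 121/232 vs. 45/103
rr_probands = risk_ratio(121, 232, 45, 103)
# sons and daughters vs. spouse controls: 11/213 vs. 28/216
rr_offspring = risk_ratio(11, 213, 28, 216)
# nieces and nephews vs. spouse controls: 37/328 vs. 28/216
rr_nieces = risk_ratio(37, 328, 28, 216)
```

The offspring ratio lands around 0.4 (a markedly reduced risk), while the niece/nephew ratio sits much closer to 1, matching the "attenuated association" the study reports.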

"Overall, our results appear to be consistent with a delayed onset of disease in long-lived families, such that individuals who are part of exceptionally long-lived families are protected from cognitive impairment, but only until later in life," the study concludes.

May 7, 2013 · 36 notes
#longevity #cognitive impairment #alzheimer's disease #Long Life Family Study #neuroscience #science
Mind-body Genomics

A new study from investigators at the Benson-Henry Institute for Mind/Body Medicine at Massachusetts General Hospital and Beth Israel Deaconess Medical Center finds that eliciting the relaxation response—a physiologic state of deep rest induced by practices such as meditation, yoga, deep breathing and prayer—produces immediate changes in the expression of genes involved in immune function, energy metabolism and insulin secretion.


“Many studies have shown that mind/body interventions like the relaxation response can reduce stress and enhance wellness in healthy individuals and counteract the adverse clinical effects of stress in conditions like hypertension, anxiety, diabetes and aging,” said Herbert Benson, HMS professor of medicine at Mass General and co-senior author of the report.

Benson is director emeritus of the Benson-Henry Institute.

“Now for the first time we’ve identified the key physiological hubs through which these benefits might be induced,” he said.

Published in the open-access journal PLOS ONE, the study combined advanced expression profiling and systems biology analysis to both identify genes affected by relaxation response practice and to determine the potential biological relevance of those changes.

“Some of the biological pathways we identify as being regulated by relaxation response practice are already known to play specific roles in stress, inflammation and human disease. For others, the connections are still speculative, but this study is generating new hypotheses for further investigation,” said Towia Libermann, HMS associate professor of medicine at Beth Israel Deaconess and co-senior author of the study.

Benson first described the relaxation response—the physiologic opposite of the fight-or-flight response—almost 40 years ago, and his team has pioneered the application of mind/body techniques to a wide range of health problems. Studies in many peer-reviewed journals have documented how the relaxation response both alleviates symptoms of anxiety and many other disorders and also affects factors such as heart rate, blood pressure, oxygen consumption and brain activity. 

In 2008, Benson and Libermann led a study finding that long-term practice of the relaxation response changed the expression of genes involved with the body’s response to stress. The current study examined changes produced during a single session of relaxation response practice, as well as those taking place over longer periods of time.

The study enrolled a group of 26 healthy adults with no experience in relaxation response practice, who then completed an 8-week relaxation-response training course.

Before they started their training, they went through what was essentially a control group session: Blood samples were taken before and immediately after the participants listened to a 20-minute health education CD and again 15 minutes later. After completing the training course, a similar set of blood tests was taken before and after participants listened to a 20-minute CD used to elicit the relaxation response as part of daily practice. 

The sets of blood tests taken before the training program were designated “novice,” and those taken after training completion were called “short-term practitioners.” For further comparison, a similar set of blood samples was taken from a group of 25 individuals with 4 to 25 years’ experience regularly eliciting the relaxation response through many different techniques before and after they listened to the same relaxation response CD.

Blood samples from all participants were analyzed to determine the expression of more than 22,000 genes at the different time points.

The results revealed significant changes in the expression of several important groups of genes between the novice samples and those from both the short- and long-term sets. Even more pronounced changes were shown in the long-term practitioners. 

A systems biology analysis of known interactions among the proteins produced by the affected genes revealed that pathways involved with energy metabolism, particularly the function of mitochondria, were upregulated during the relaxation response. Pathways controlled by activation of a protein called NF-κB—known to have a prominent role in inflammation, stress, trauma and cancer—were suppressed after relaxation response elicitation. The expression of genes involved in insulin pathways was also significantly altered.
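As a rough illustration of the kind of expression screen described, the toy sketch below flags genes whose mean expression shifts at least two-fold between "novice" and "long-term practitioner" samples. The gene names and values are invented; the actual study profiled more than 22,000 genes and applied proper statistics and pathway analysis.

```python
# Toy differential-expression screen: flag genes with >= 2-fold change in
# mean expression between conditions. Gene names and values are invented
# for illustration only.
from statistics import mean

expression = {
    # gene: ([novice samples], [practitioner samples])
    "ATP5F1": ([10.0, 11.0, 9.5], [22.0, 20.5, 21.0]),   # mitochondrial, up
    "NFKB1":  ([30.0, 28.0, 31.0], [12.0, 13.5, 11.0]),  # inflammatory, down
    "ACTB":   ([50.0, 49.0, 51.0], [50.5, 48.0, 52.0]),  # housekeeping, flat
}

def fold_change(novice: list[float], practitioner: list[float]) -> float:
    return mean(practitioner) / mean(novice)

# Keep only genes outside the 0.5x-2x "no change" band.
hits = {g: fold_change(n, p) for g, (n, p) in expression.items()
        if not (0.5 < fold_change(n, p) < 2.0)}
```

In this made-up data, the "mitochondrial" gene is upregulated and the "NF-κB" gene is suppressed, mirroring the direction of the pathways the study highlights, while the housekeeping gene is filtered out.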

“The combination of genomics and systems biology in this study provided great insight into the key molecules and physiological gene interaction networks that might be involved in relaying beneficial effects of relaxation response in healthy subjects,” said Manoj Bhasin, HMS assistant professor of medicine, co-lead author of the study, and co-director of the Beth Israel Deaconess Genomics, Proteomics, Bioinformatics and Systems Biology Center.

Bhasin noted that these insights should provide a framework for determining, on a genomic basis, whether the relaxation response will help alleviate symptoms of diseases triggered by stress. The work could also lead to developing biomarkers that may suggest how individual patients will respond to interventions.

Benson stressed that the long-term practitioners in this study elicited the relaxation response through many different techniques—various forms of meditation, yoga or prayer—but those differences were not reflected in the gene expression patterns.

“People have been engaging in these practices for thousands of years, and our finding of this unity of function on a basic-science, genomic level gives greater credibility to what some have called ‘new age medicine,’ ” he said.

“While this and our previous studies focused on healthy participants, we currently are studying how the genomic changes induced by mind/body interventions affect pathways involved in hypertension, inflammatory bowel disease and irritable bowel syndrome. We have also started a study—a collaborative undertaking between Dana-Farber Cancer Institute, Mass General and Beth Israel Deaconess—in patients with precursor forms of multiple myeloma, a condition known to involve activation of NF-κB pathways,” said Libermann, who is the director of the Beth Israel Deaconess Medical Center Genomics, Proteomics, Bioinformatics and Systems Biology Center.

May 6, 2013 · 243 notes
#meditation #stress response #relaxation response #anxiety #inflammation #metabolism #neuroscience #science
Epilepsy Cured in Mice Using Brain Cells

Epilepsy that does not respond to drugs can be halted in adult mice by transplanting a specific type of cell into the brain, UC San Francisco researchers have discovered, raising hope that a similar treatment might work in severe forms of human epilepsy.

UCSF scientists controlled seizures in epileptic mice with a one-time transplantation of medial ganglionic eminence (MGE) cells, which inhibit signaling in overactive nerve circuits, into the hippocampus, a brain region associated with seizures, as well as with learning and memory. Other researchers had previously used different cell types in rodent cell transplantation experiments and failed to stop seizures. 

Cell therapy has become an active focus of epilepsy research, in part because current medications, even when effective, only control symptoms and not underlying causes of the disease, according to Scott C. Baraban, PhD, who holds the William K. Bowes Jr. Endowed Chair in Neuroscience Research at UCSF and led the new study. In many types of epilepsy, he said, current drugs have no therapeutic value at all.

“Our results are an encouraging step toward using inhibitory neurons for cell transplantation in adults with severe forms of epilepsy,” Baraban said. “This procedure offers the possibility of controlling seizures and rescuing cognitive deficits in these patients.”

The findings, which are the first ever to report stopping seizures in mouse models of adult human epilepsy, will be published online May 5 in the journal Nature Neuroscience.

During epileptic seizures, extreme muscle contractions and, often, a loss of consciousness can cause seizure sufferers to lose control, fall and sometimes be seriously injured. The unseen malfunction behind these effects is the abnormal firing of many excitatory nerve cells in the brain at the same time.

In the UCSF study, the transplanted inhibitory cells quenched this synchronous, nerve-signaling firestorm, eliminating seizures in half of the treated mice and dramatically reducing the number of spontaneous seizures in the rest. Robert Hunt, PhD, a postdoctoral fellow in the Baraban lab, guided many of the key experiments.

In another encouraging step, UCSF researchers reported May 2 that they found a way to reliably generate human MGE-like cells in the laboratory, and that, when transplanted into healthy mice, the cells similarly spun off functional inhibitory nerve cells. That research can be found online in the journal Cell Stem Cell.

In many forms of epilepsy, loss or malfunction of inhibitory nerve cells within the hippocampus plays a critical role. MGE cells are progenitor cells that form early within the embryo and are capable of generating mature inhibitory nerve cells called interneurons. In the Baraban-led UCSF study, the transplanted MGE cells from mouse embryos migrated and generated interneurons, in effect replacing the cells that fail in epilepsy. The new cells integrated into existing neural circuits in the mice, the researchers found.

“These cells migrate widely and integrate into the adult brain as new inhibitory neurons,” Baraban said. “This is the first report in a mouse model of adult epilepsy in which mice that already were having seizures stopped having seizures after treatment.”

The mouse model of disease that Baraban’s lab team worked with is meant to resemble a severe and typically drug-resistant form of human epilepsy called mesial temporal lobe epilepsy, in which seizures are thought to arise in the hippocampus. In contrast to transplants into the hippocampus, transplants into the amygdala, a brain region involved in memory and emotion, failed to halt seizure activity in this same mouse model, the researchers found.

Temporal lobe epilepsy often develops in adolescence, in some cases long after a seizure episode triggered during early childhood by a high fever. A similar condition in mice can be induced with a chemical exposure, and in addition to seizures, this mouse model shares other pathological features with the human condition, such as loss of cells in the hippocampus, behavioral alterations and impaired problem solving.

In the Nature Neuroscience study, in addition to having fewer seizures, treated mice became less abnormally agitated, less hyperactive, and performed better in water-maze tests.

May 6, 2013 · 87 notes
#epilepsy #seizures #neurons #cell transplantation #inhibitory cells #neuroscience #science
Children’s brain processing speed indicates risk of psychosis

New research from Bristol and Cardiff universities shows that children whose brains process information more slowly than their peers are at greater risk of psychotic experiences.


Psychotic experiences can include hearing voices, seeing things that are not present or holding unrealistic beliefs that other people don’t share. These experiences can often be distressing and frightening and interfere with everyday life.

Children with psychotic experiences are more likely to develop psychotic illnesses like schizophrenia later in life.

Using data gathered from 6,784 participants in Children of the 90s, researchers from the MRC Centre for Neuropsychiatric Genetics and Genomics in Cardiff University and the School of Social and Community Medicine in the University of Bristol examined whether performance in a number of cognitive tests conducted at ages 8, 10 and 11 was related to the risk of having psychotic experiences at age 12.

The tests assessed how quickly the children could process information, as well as their attention, memory, reasoning, and ability to solve problems.

Among those interviewed, 787 (11.6 per cent) had suspected or definite psychotic experiences at age 12. Children who scored less well in the various tests at ages 8, 10 and 11 were more likely to have psychotic experiences at age 12.

This was particularly the case for the test that assessed how quickly the children processed information. Furthermore, children whose speed of processing information became slower between ages 8 and 11 had greater risk of having psychotic experiences at age 12.

These findings did not change when other factors, including the parent’s psychiatric history and the children’s own developmental delay, were taken into account. The study’s findings could have important implications for identifying children at risk of psychosis, with the benefit of early treatment.

Speaking about the findings, lead author and PhD student, Miss Maria Niarchou from Cardiff University’s School of Medicine, said:

‘Previous research has shown a link between the slowing down of information processing and schizophrenia and this was found to be at least in part the result of anti-psychotic medication.

‘However, this study shows that impaired information processing speed can already be present in childhood and associated with higher risk of psychotic experiences, irrespective of medication.

‘Our findings improve our understanding of the brain processes that are associated with high risk of psychotic experiences in childhood and in turn high risk of psychotic disorder later in life.’

Senior author, Dr Marianne van den Bree of Cardiff University’s School of Medicine, said:

‘Schizophrenia is a complex and relatively rare mental health condition, occurring at a rate of 1 per cent in the general population. Not every child with impaired information processing speed is at risk of psychosis later in life. Further research is needed to determine whether interventions to improve processing speed in at-risk children can lead to decreased transition to psychotic disorders.’

Ruth Coombs, Manager for Influence and Change at Mind Cymru, said:

‘This is a very interesting piece of research, which could help young people at risk of developing mental health problems in later life build resilience and benefit from early intervention. It is important to remember that people can and do recover from mental health problems and we also welcome further research which supports resilience building in young people.’

May 6, 2013 · 103 notes

#brain #psychotic experiences #schizophrenia #children #child development #psychology #neuroscience #science
The woman who can't recognise her face

"I’ve been in a crowded elevator with mirrors all around, and a woman will move and I’ll go to get out the way and then realise: ‘oh that woman is me’."

Heather Sellers has prosopagnosia, more commonly known as face blindness. “I can’t remember any image of the human face. It’s simply not special to me,” she says. “I don’t process them like I do a car or a dog. It’s not a visual problem, it’s a perception problem.”


Heather knew from a young age that something was different about the way she navigated her world, but her condition wasn’t diagnosed until she was in her 30s. “I always knew something was wrong – it was impossible for me to trust my perceptions of the world. I was diagnosed as anxious. My parents thought I was crazy.”

The condition is estimated to affect around 2.5 per cent of the population, and it’s common for those who have it not to realise that anything is wrong. “In many ways it’s a subtle disorder,” says Heather. “It’s easy for your brain to compensate because there are so many other things you can use to identify a person: hair colour, gait or certain clothes. But meet that person out of context and it’s socially devastating.”

As a child, she was once separated from her mum at a grocery store. Store staff reunited the pair, but it was confusing for Heather, since she didn’t initially recognise her mother. “But I didn’t know that I wasn’t recognising her.”

Chaos explained

Heather was 36 when she stumbled across the phrase face blindness in a psychology textbook. “When I saw those two words I knew instantly that was exactly what I had – that explained all the chaos.”

She found her way to Harvard neuroscientist Brad Duchaine who diagnosed her as having one of the three worst cases of the disorder that he had ever seen.

So what’s it like to not recognise anyone you know? Heather says the biggest difficulty with the disorder is recognising people who she is close to – the people that are most important to recognise. In the school where she teaches English she is fine, because she recognises people by their clothes or hair and asks her students to wear name badges.

But it can be harder in social settings. Once she went up to the wrong person at a party and put her arm around him thinking he was her partner. And at college men would phone her angry that she had walked straight past them after they had had a date. “At the time I was thinking ‘I didn’t see you, why is everyone making my life so difficult?’”

It’s not just other people Heather doesn’t recognise – she can’t identify her own face either. “A few times I have been in a crowded elevator with mirrors all around and a woman will move, and I will go to get out the way and then realise ‘oh that woman is me’.” She also finds it unsettling to see photos and not recognise herself in them.

Face processing

To try and understand the condition, Duchaine and his colleagues recorded brain activity while 12 people with prosopagnosia looked at famous and non-famous faces. The team found that part of the brain responsible for stored visual memory was activated in six people when they saw the famous faces.

But another component of brain activity thought to represent a later stage of face processing wasn’t triggered. “Some part of their brain was recognising the face,” says Duchaine, but the brain was failing to pass this information into higher-level consciousness (Brain).

"There may be training where we give people feedback and say ‘look you recognise that face even though you’re not aware of it’," says Duchaine.

Now Zaira Cattaneo at the University of Milano-Bicocca in Italy and colleagues have identified the specific brain areas that allow us to recognise our friends. The team used transcranial magnetic stimulation to block two vital aspects of face processing in people without prosopagnosia. Targeting the left prefrontal cortex blocked the ability to distinguish individual features like the nose and eyes, and blocking the right prefrontal cortex impaired the ability to distinguish the location of those features from one another (NeuroImage).

"We made performance worse," says Cattaneo. "We want to make it better." Now the team are trying to activate these areas of the brain. "The aim is to enhance face recognition abilities by directly modulating excitability in the prefrontal cortices," says Cattaneo.

Would Heather want a cure, should one be found? “I can’t imagine what you see when you see a face, and it’s scary,” she says. “I go back and forth on what I’d do. I’ve done so much work in figuring out how to chart my world, I’d need to do a whole new rewrite. But it would be fascinating.”

May 6, 2013 · 223 notes
#prosopagnosia #face blindness #visual perception #visual memory #psychology #neuroscience #science
Insect-Eye Camera Offers Wide-Angle Vision for Tiny Drones


Eye See You: Composites of hard and soft materials and circuits make up an electronic version of an insect’s compound eye.

New “insect eye” cameras could someday help flying drones see into every corner of a battlefield or give tiny medical scopes an all-around view inside the human body. A team of researchers from the United States has constructed such a camera, which offers an almost 180-degree field of view using hundreds of tiny lenses.

The centimeter-wide digital camera has 180 microlenses—roughly what fire ants or bark beetles have in their compound eyes—placed on a hemispherical array. Researchers hope their design will eventually lead to insect-eye cameras that exceed even nature’s blueprints, according to a report in the 2 May issue of the journal Nature.

“We think of the insect world as an inspiration for design, but we’re not constrained by it,” says John Rogers, a physical chemist and materials engineer at the University of Illinois at Urbana-Champaign. “It’s not biomimicry; it’s bioinspiration.”

Biological insect eyes consist of hundreds or thousands of the tiny units, each having a lens, pigment, and photoreceptors. Each unit’s lens is mounted on a transparent crystalline cone that pipes light down to the photoreceptors. Black pigment isolates each of the eye units and screens out background light.


Biomimicry: The 160-degree, 180-pixel eye is inspired by an insect’s compound eye.

Nature’s design offers two huge advantages over that of ordinary cameras. First, the hemispherical shape allows for extremely wide-angle fields of view. Second, the hemispherical array of tiny lenses has an almost infinite depth of field, which keeps objects in focus regardless of their distance from the camera.

But camera chips aren’t usually shaped like fly eyes. Researchers faced the tricky task of bending the camera into a hemispherical shape without distorting the image created by each lens or ruining the electronics beneath the tiny lenses. Their solution “relies on composites of hard and soft materials in strategic layouts that allow stretching and bending and flexing to go from planar [flat] to hemispherical form,” Rogers says.

Rogers and his colleagues put the tiny lenses on top of columns connected to a flexible base membrane—all made from elastomeric polydimethylsiloxane material, which is also used in contact lenses. Each supporting cylindrical post protected its lens from any bending or stretching in the base membrane.

The array of tiny lenses sat on a second layer of stretchable silicon photodiodes that converted the focused light from the lenses into current or voltage. Tiny serpentine wires connected the array of photodiodes with the other electronics.

A third, “black matrix” layer sat on top of both the lens layer and the photodiode layer to act as the shield against background light. The black pigment of real insect eyes can adjust in real time to changing light conditions, but the artificial camera version must use software to make such adjustments.

The design allowed researchers to freely inflate the flat layers into the final hemispherical shape—a camera with a 160-degree field of view. (The prototype camera’s array of lenses didn’t quite stretch all the way to the edge of the hemispherical shape.)

A next step could involve figuring out how to dynamically “tune” the inflated shape of the camera, says Rogers. He has also challenged his team to try inflating the camera shape into an almost full spherical shape—he envisions flexible camera designs based on the different compound eyes of other creatures, such as lobsters and shrimp (reflecting superposition eyes), moths and lacewings (refracting superposition eyes), and houseflies (neural superposition eyes).  

The insect-eye camera depends on each individual unit to contribute 1 pixel of resolution. A 180-pixel-resolution camera may not do much right now, but the camera design can scale up its resolution by adding more units to the overall array. Rogers anticipates making camera designs with better resolution than the eyes of praying mantises (15 000 eye units) and dragonflies (28 000 eye units).
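The scaling logic above can be made concrete with a rough back-of-the-envelope calculation. The sketch below estimates the angular footprint of a single eye unit by dividing the camera's field of view evenly among its units; the even-tiling assumption, the square-patch equivalence, and the reuse of the prototype's 160-degree field for the dragonfly-scale case are all illustrative simplifications, not figures from the paper.

```python
import math

def angular_resolution_deg(fov_deg, n_units):
    """Rough per-unit angular footprint in degrees, assuming the units
    tile the field of view evenly (an illustrative simplification)."""
    # Solid angle of a cone with half-angle fov/2: 2*pi*(1 - cos(half))
    half = math.radians(fov_deg / 2)
    solid_angle = 2 * math.pi * (1 - math.cos(half))  # steradians
    per_unit_sr = solid_angle / n_units
    # Side length of an equivalent square patch, converted to degrees
    return math.degrees(math.sqrt(per_unit_sr))

# The 180-lens prototype vs. a dragonfly-scale array of ~28,000 units
print(round(angular_resolution_deg(160, 180), 1))    # ≈ 9.7 degrees per unit
print(round(angular_resolution_deg(160, 28000), 2))  # ≈ 0.78 degrees per unit
```

The roughly tenfold-finer angular sampling at dragonfly scale illustrates why adding units is the route to useful resolution in this design.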

The technology won’t likely be used in consumer digital cameras any time soon. But the insect-eye cameras could be used in medical devices, such as endoscopes, which give physicians a look inside the human body. Alexander Borst, director of the Max Planck Institute of Neurobiology, in Germany, envisions commercial versions of the cameras within the next year or two.

Such cameras may also prove useful for small drones to explore disaster areas such as those left behind by the Chernobyl and Fukushima nuclear disasters, Borst says. He was not involved in the latest research but hopes to work with Rogers and his colleagues to put the insect-eye camera to use in a robo-fly developed at his institution.

May 5, 2013 · 74 notes
#insects #robotic vision #digital cameras #engineering #biomimicry #drones #technology #science
Human Brain Cells Developed in Lab, Grow in Mice

A key type of human brain cell developed in the laboratory grows seamlessly when transplanted into the brains of mice, UC San Francisco researchers have discovered, raising hope that these cells might one day be used to treat people with Parkinson’s disease, epilepsy, and possibly even Alzheimer’s disease, as well as complications of spinal cord injury such as chronic pain and spasticity.

image

“We think this one type of cell may be useful in treating several types of neurodevelopmental and neurodegenerative disorders in a targeted way,” said Arnold Kriegstein, MD, PhD, director of the Eli and Edythe Broad Center of Regeneration Medicine and Stem Cell Research at UCSF and co-lead author on the paper.

The researchers generated and transplanted a type of human nerve-cell progenitor called the medial ganglionic eminence (MGE) cell, in experiments described in the May 2 edition of Cell Stem Cell. Development of these human MGE cells within the mouse brain mimics what occurs in human development, they said.

Kriegstein sees MGE cells as a potential treatment to better control nerve circuits that become overactive in certain neurological disorders. Unlike other neural stem cells that can form many cell types — and that may potentially be less controllable as a consequence — most MGE cells are restricted to producing a type of cell called an interneuron. Interneurons integrate into the brain and provide controlled inhibition to balance the activity of nerve circuits.

To generate MGE cells in the lab, the researchers reliably directed the differentiation of human pluripotent stem cells — either human embryonic stem cells or induced pluripotent stem cells derived from human skin. These two kinds of stem cells have virtually unlimited potential to become any human cell type. When transplanted into a strain of mice that does not reject human tissue, the human MGE-like cells survived within the rodent forebrain, integrated into the brain by forming connections with rodent nerve cells, and matured into specialized subtypes of interneurons.

These findings may serve as a model to study human diseases in which mature interneurons malfunction, according to Kriegstein. The researchers’ methods may also be used to generate vast numbers of human MGE cells in quantities sufficient to launch potential future clinical trials, he said.

Kriegstein was a co-leader of the research, along with Arturo Alvarez-Buylla, PhD, UCSF professor of neurological surgery; John Rubenstein, MD, PhD, UCSF professor of psychiatry; and UCSF postdoctoral scholars Cory Nicholas, PhD, and Jiadong Chen, PhD.

Nicholas utilized key growth factors and other molecules to direct the derivation and maturation of the human MGE-like interneurons. He timed the delivery of these factors to shape their developmental path and confirmed their progression along this path. Chen used electrical measurements to carefully study the physiological and firing properties of the interneurons, as well as the formation of synapses between neurons.

Previously, UCSF researchers led by Allan Basbaum, PhD, chair of anatomy at UCSF, have used mouse MGE cell transplantation into the mouse spinal cord to reduce neuropathic pain, a surprising application outside the brain. Kriegstein, Nicholas and colleagues now are exploring the use of human MGE cells in mouse models of neuropathic pain and spasticity, Parkinson’s disease and epilepsy.

“The hope is that we can deliver these cells to various places within the nervous system that have been overactive and that they will functionally integrate and provide regulated inhibition,” Nicholas said.

The researchers also plan to develop MGE cells from induced pluripotent stem cells derived from skin cells of individuals with autism, epilepsy, schizophrenia and Alzheimer’s disease, in order to investigate how the development and function of interneurons might become abnormal — creating a lab-dish model of disease.

One mystery and challenge to both the clinical and pre-clinical study of human MGE cells is that they develop at a slower, human pace, reflecting an “intrinsic clock”. In fast-developing mice, the human MGE-like cells still took seven to nine months to form interneuron subtypes that normally are present near birth.

“If we could accelerate the clock in human cells, then that would be very encouraging for various applications,” Kriegstein said.

May 5, 2013 · 102 notes
#brain cells #neurodegenerative diseases #medial ganglionic eminence cell #mouse brain #interneurons #neuroscience #science
Printable 'bionic' ear melds electronics and biology

Scientists at Princeton University used off-the-shelf printing tools to create a functional ear that can “hear” radio frequencies far beyond the range of normal human capability.

image

The researchers’ primary purpose was to explore an efficient and versatile means to merge electronics with tissue. The scientists used 3D printing of cells and nanoparticles followed by cell culture to combine a small coil antenna with cartilage, creating what they term a bionic ear.

"In general, there are mechanical and thermal challenges with interfacing electronic materials with biological materials," said Michael McAlpine, an assistant professor of mechanical and aerospace engineering at Princeton and the lead researcher. "Previously, researchers have suggested some strategies to tailor the electronics so that this merger is less awkward. That typically happens between a 2D sheet of electronics and a surface of the tissue. However, our work suggests a new approach — to build and grow the biology up with the electronics synergistically and in a 3D interwoven format."

McAlpine’s team has made several advances in recent years involving the use of small-scale medical sensors and antennas. Last year, a research effort led by McAlpine and Naveen Verma, an assistant professor of electrical engineering, and Fio Omenetto of Tufts University, resulted in the development of a “tattoo” made up of a biological sensor and antenna that can be affixed to the surface of a tooth.

This project, however, is the team’s first effort to create a fully functional organ: one that not only replicates a human ability, but extends it using embedded electronics.

"The design and implementation of bionic organs and devices that enhance human capabilities, known as cybernetics, has been an area of increasing scientific interest," the researchers wrote in the article which appears in the scholarly journal Nano Letters. “This field has the potential to generate customized replacement parts for the human body, or even create organs containing capabilities beyond what human biology ordinarily provides.”

Standard tissue engineering involves seeding types of cells, such as those that form ear cartilage, onto a scaffold of a polymer material called a hydrogel. However, the researchers said that this technique has problems replicating complicated three-dimensional biological structures. Ear reconstruction “remains one of the most difficult problems in the field of plastic and reconstructive surgery,” they wrote.

To solve the problem, the team turned to a manufacturing approach called 3D printing. These printers use computer-assisted design to conceive of objects as arrays of thin slices. The printer then deposits layers of a variety of materials – ranging from plastic to cells – to build up a finished product. Proponents say additive manufacturing promises to revolutionize home industries by allowing small teams or individuals to create work that could previously only be done by factories.

Creating organs using 3D printers is a recent advance; several groups have reported using the technology for this purpose in the past few months. But this is the first time that researchers have demonstrated that 3D printing is a convenient strategy to interweave tissue with electronics.

The technique allowed the researchers to combine the antenna electronics with tissue within the highly complex topology of a human ear. The researchers used an ordinary 3D printer to combine a matrix of hydrogel and calf cells with silver nanoparticles that form an antenna. The calf cells later develop into cartilage.

Manu Mannoor, a graduate student in McAlpine’s lab and the paper’s lead author, said that additive manufacturing opens new ways to think about the integration of electronics with biological tissue and makes possible the creation of true bionic organs in form and function. He said that it may be possible to integrate sensors into a variety of biological tissues, for example, to monitor stress on a patient’s knee meniscus.

David Gracias, an associate professor at Johns Hopkins and co-author on the publication, said that bridging the divide between biology and electronics represents a formidable challenge that needs to be overcome to enable the creation of smart prostheses and implants.

"Biological structures are soft and squishy, composed mostly of water and organic molecules, while conventional electronic devices are hard and dry, composed mainly of metals, semiconductors and inorganic dielectrics," he said. "The differences in physical and chemical properties between these two material classes could not be any more pronounced."

The finished ear consists of a coiled antenna inside a cartilage structure. Two wires lead from the base of the ear and wind around a helical “cochlea” – the part of the ear that senses sound – which can connect to electrodes. Although McAlpine cautions that further work and extensive testing would need to be done before the technology could be used on a patient, he said the ear in principle could be used to restore or enhance human hearing. He said electrical signals produced by the ear could be connected to a patient’s nerve endings, similar to a hearing aid. The current system receives radio waves, but he said the research team plans to incorporate other materials, such as pressure-sensitive electronic sensors, to enable the ear to register acoustic sounds.

In addition to McAlpine, Verma, Mannoor and Gracias, the research team includes: Winston Soboyejo, a professor of mechanical and aerospace engineering at Princeton; Karen Malatesta, a faculty fellow in molecular biology at Princeton; Yong Lin Kong, a graduate student in mechanical and aerospace engineering at Princeton; and Teena James, a graduate student in chemical and biomolecular engineering at Johns Hopkins.

The team also included Ziwen Jiang, a high school student at the Peddie School in Hightstown who participated as part of an outreach program for young researchers in McAlpine’s lab.

"Ziwen Jiang is one of the most spectacular high school students I have ever seen," McAlpine said. "We would not have been able to complete this project without him, particularly in his skill at mastering CAD designs of the bionic ears."

May 4, 2013 · 154 notes
#bionic ear #3D printing #cybernetics #biological tissue #human ear #neuroscience #science
Mathematicians help to unlock brain function

In two recently published studies, mathematicians from Queen Mary, University of London have brought researchers one step closer to understanding how the structure of the brain relates to its function.

image

Publishing in Physical Review Letters the researchers from the Complex Networks group at Queen Mary’s School of Mathematical Sciences describe how different areas in the brain can have an association despite a lack of direct interaction. 

The team, in collaboration with researchers in Barcelona, Pamplona and Paris, combined two different human brain networks - one that maps all the physical connections among brain areas known as the backbone network, and another that reports the activity of different regions as blood flow changes, known as the functional network. They showed that the presence of symmetrical neurons within the backbone network might be responsible for the synchronised activity of physically distant brain regions.

Lead author Vincenzo Nicosia, said “We don’t fully understand how the human brain works. So far the focus has been more on the analysis of the function of single, localised regions. However, there isn’t a complete model that brings the whole functionality of the brain together. Hopefully, our research will help neuroscientists to develop a more accurate map of the brain and investigate its functioning beyond single areas.”

The research adds to the recent findings published in Proceedings of the National Academy of Sciences in which the QM researchers along with the Department of Psychiatry at University of Cambridge analysed the development of the brain of a small worm called Caenorhabditis elegans. In this paper, the team examined the number of links formed in the brain during the worm’s lifespan, and observed an unexpected abrupt change in the pattern of growth, corresponding with the time of egg hatching.

“The research is important as it’s the first time that a sharp transition in the growth of a neural network has ever been observed,” added Dr Nicosia.

“Although we don’t know which biological factors are responsible for the change in the growth pattern, we were able to reproduce the pattern using a simple economical model of synaptic formation. This result can pave the way to a deeper understanding of how neural networks grow in more complex organisms.” 

May 4, 2013 · 91 notes
#brain #brain function #c. elegans #brain development #synaptic formation #neural networks #neuroscience #science
Kelly the Robot Helps Kids Tackle Autism

Using a kid-friendly robot during behavioral therapy sessions may help some children with autism gain better social skills, a preliminary study suggests.

image

The study, of 19 children with autism spectrum disorders (ASDs), found that kids tended to do better when their visit with a therapist included a robot “co-therapist.” On average, they made bigger gains in social skills such as asking “appropriate” questions, answering questions and making conversational comments.

So-called humanoid robots are already being marketed for this purpose, but there has been little research to back it up.

"Going into this study, we were skeptical," said lead researcher Joshua Diehl, an assistant professor of psychology at the University of Notre Dame in Indiana, who said he has no financial interest in the technology.

"We found that, to our surprise, the kids did better when the robot was added," he said.

There are still plenty of caveats, however, said Diehl, who is presenting his team’s findings Saturday at the International Meeting for Autism Research (IMFAR) in San Sebastian, Spain.

For one, the study was small. And it’s not clear that the results seen in a controlled research setting would be the same in the real world of therapists’ offices, according to Diehl.

"I’d say this is not yet ready for prime time," he said.

ASDs are a group of developmental disorders that affect a person’s ability to communicate and interact socially. The severity of those effects ranges widely: Some people have mild problems socializing, but have normal to above-normal intelligence; some people have profound difficulties relating to others, and may have intellectual impairment as well.

Experts have become interested in using technology — from robots to iPads — along with standard ASD therapies because it may help bridge some of the communication issues kids have.

Human communication is complex and unpredictable, with body language, facial expressions and other subtle cues coming into the mix, explained Geraldine Dawson, chief science officer for the advocacy group Autism Speaks.

A robot or a computer game, on the other hand, can be programmed to be simple and predictable, and that may help kids with ASDs better process the information they are being given, Dawson said.

"Broadly speaking," she said, "we are very excited about the potential role for technology in diagnosing and treating ASDs." But she also agreed with Diehl that the findings are "very preliminary," and that researchers have a lot more to learn about how technology — robots or otherwise — fits into ASD therapies.

For the study, Diehl’s team used a humanoid robot manufactured by Aldebaran Robotics, which markets the NAO robot for use in education, including special education for kids with ASDs. The robot, which stands about 2 feet tall, looks like a toy, but it’s priced more like a small car, Diehl noted.

The NAO H25 “Academic Edition” rings up at about $16,000. (Diehl said the study was funded by government and private grants, not the manufacturer.)

The researchers had 19 kids aged 6 to 13 complete 12 behavioral therapy sessions, where a therapist worked with the child on social skills. Half of the sessions involved the robot, named Kelly, which was wheeled out so the child could practice conversing with her, while the therapist stood by.

"So the child might say, ‘Hi Kelly, how are you?’" Diehl explained. "Then Kelly would say, ‘Fine. What did you do today?’" During the non-Kelly sessions, another person entered the room and carried on the same conversation with the child that the robot would have.

On average, Diehl’s team found, kids made bigger gains from the sessions that included Kelly — based on both their interactions with their therapists, and their parents’ reports.

"There was one child who, when his dad came home from work, asked him how his day was," Diehl said. "He’d never done that before."

Still, he stressed that while the robot sessions seemed more successful on average, the children varied widely in their responses to Kelly. Going forward, Diehl said, it will be important to figure out whether there are certain kids with ASDs more likely to benefit from a robot co-therapist.

Dawson agreed that there is no one-size-fits-all ASD therapy. “Any therapy for a person with an ASD has to be individualized,” she said. The idea with any technology, she added, is to give therapists and doctors extra “tools” to work with.

A separate study presented at the same meeting looked at another type of tool. Researchers had 60 “minimally verbal” children with ASDs attend two “play-based” sessions per week, aimed at boosting their ability to speak and gesture. Half of the kids were also given a “speech-generating device,” like an iPad.

Three and six months later, children who worked with the devices were able to say more words and were quicker to take up conversational skills.

Dawson said the robot and iPad studies are just part of the growing body of research into how technology can not only aid in ASD therapies, but also help doctors diagnose the disorders or help parents manage at home.

But both Diehl and Dawson stressed that no robot or iPad is intended to stand in for human connection. The idea, after all, is to enhance kids’ ability to communicate and have relationships, Dawson noted. “Technology will never take the place of people,” she said.

The data and conclusions of research presented at meetings should be viewed as preliminary until published in a peer-reviewed journal.

May 3, 2013 · 86 notes
#ASD #autism #humanoid robots #robots #robotics #communication #social skills #neuroscience #psychology #science
Kids with brains that under-react to painful images

When children with conduct problems see images of others in pain, key parts of their brains don’t react in the way they do in most people. This pattern of reduced brain activity upon witnessing pain may serve as a neurobiological risk factor for later adult psychopathy, say researchers who report their findings in the Cell Press journal Current Biology on May 2.

image

(Image: Shutterstock)

That’s not to say that all children with conduct problems are the same, or that all children showing this brain pattern in young life will become psychopaths. The researchers emphasize that many children with conduct problems do not persist with their antisocial behavior.

"Our findings indicate that children with conduct problems have an atypical brain response to seeing other people in pain," says Essi Viding of University College London. "It is important to view these findings as an indicator of early vulnerability, rather than biological destiny. We know that children can be very responsive to interventions, and the challenge is to make those interventions even better, so that we can really help the children, their families, and their wider social environment."

Conduct problems represent a major societal problem and include physical aggression, cruelty to others, and a lack of empathy, or “callousness.” In the United Kingdom, where the study was conducted, about five percent of children qualify for a diagnosis of conduct problems. But very little is known about the underlying biology.

In the new study, Viding, Patricia Lockwood, and their colleagues scanned children’s brains by functional magnetic resonance imaging (fMRI) to see how those with conduct problems differ in their response to viewing images of others in pain.

The brain images showed that, relative to controls, children with conduct problems show reduced responses to others’ pain specifically in regions of the brain known to play a role in empathy. The researchers also saw variation among those with conduct problems, with those deemed to be more callous showing lower brain activation than less callous individuals.

"Our findings very clearly point to the fact that not all children with conduct problems share the same vulnerabilities; some may have neurobiological vulnerability to psychopathy, while others do not," Viding says. "This raises the possibility of tailoring existing interventions to suit the specific profile of atypical processing that characterizes a child with conduct problems."

May 3, 2013 · 104 notes
#brain activity #children #fMRI #antisocial behavior #aggression #psychopathy #neuroscience #science
Brilliant dye to probe the brain

To obtain very-high-resolution 3D images of the cerebral vascular system, a dye is used that fluoresces in the near infrared and can pass through the skin. The Lem-PHEA chromophore, a new product outclassing the best dyes, has been synthesized by a team from the Laboratoire de Chimie (CNRS/ENS de Lyon/Université Claude Bernard Lyon 1). Conducted in collaboration with researchers from the Institut des Neurosciences (Université Joseph Fourier - Grenoble/CEA/Inserm/CHU) and the Laboratoire Chimie et Interdisciplinarité: Synthèse, Analyse, Modélisation (CNRS/Université de Nantes), this work has been published online in the journal Chemical Science. It opens up significant prospects for better observing the brain and understanding how it works.

Different cerebral imaging techniques, such as two-photon microscopy or magnetic resonance imaging (MRI), contribute to our understanding of how the healthy or diseased brain works. One of their essential characteristics is their spatial resolution, in other words the dimension of the smallest details observable by each technique. Typically, for MRI, this resolution is limited to several millimeters, which does not make it possible to obtain images such as the one below, whose resolution is of the order of a micrometer.

image

To obtain such images of the vascular system of a mouse brain, it is necessary to use a fluorescent dye that combines several properties: luminescence in the near infrared, solubility in biological media, low cost, non-toxicity and suitability for 3D imaging (two-photon absorption). The researchers have developed a new product, Lem-PHEA, which combines these properties and is easy to synthesize. When injected into the blood vessels of a mouse, it has revealed details of the rodent’s vascular system with previously unattained precision, thanks to a considerably enhanced fluorescence compared to “conventional” dyes (such as Rhodamine-B and cyanine derivatives). With Lem-PHEA, the researchers have obtained brighter, higher-contrast images than with these standard dyes. Finally, the product is easily eliminated by the kidneys and no toxic residues have been found in the liver. These results pave the way for a better understanding of the working of the brain.

May 3, 2013 · 48 notes
#brain #cerebral vascular system #cerebral imaging techniques #fluorescent dye #neuroscience #science
Turning human stem cells into brain cells sheds light on neural development

Medical researchers have manipulated human stem cells into producing types of brain cells known to play important roles in neurodevelopmental disorders such as epilepsy, schizophrenia and autism. The new model cell system allows neuroscientists to investigate normal brain development, as well as to identify specific disruptions in biological signals that may contribute to neuropsychiatric diseases.

Scientists from The Children’s Hospital of Philadelphia and the Sloan-Kettering Institute for Cancer Research led a study team that described their research in the journal Cell Stem Cell, published online today.

The research harnesses human embryonic stem cells (hESCs), which differentiate into a broad range of different cell types. In the current study, the scientists directed the stem cells into becoming cortical interneurons—a class of brain cells that, by releasing the neurotransmitter GABA, controls electrical firing in brain circuits.

"Interneurons act like an orchestra conductor, directing other excitatory brain cells to fire in synchrony," said study co-leader Stewart A. Anderson, M.D., a research psychiatrist at The Children’s Hospital of Philadelphia. "However, when interneurons malfunction, the synchrony is disrupted, and seizures or mental disorders can result."

Anderson and study co-leader Lorenz Studer, M.D., of the Center for Stem Cell Biology at Sloan-Kettering, derived interneurons in a laboratory model that simulates how neurons normally develop in the human forebrain.

"Unlike, say, liver diseases, in which researchers can biopsy a section of a patient’s liver, neuroscientists cannot biopsy a living patient’s brain tissue," said Anderson. Hence it is important to produce a cell culture model of brain tissue for studying neurological diseases. Significantly, the human-derived cells in the current study also "wire up" in circuits with other types of brain cells taken from mice, when cultured together. Those interactions, Anderson added, allowed the study team to observe cell-to-cell signaling that occurs during forebrain development.

In ongoing studies, Anderson explained, he and colleagues are using their cell model to better define molecular events that occur during brain development. By selectively manipulating genes in the interneurons, the researchers seek to better understand how gene abnormalities may disrupt brain circuitry and give rise to particular diseases. Ultimately, those studies could help inform drug development by identifying molecules that could offer therapeutic targets for more effective treatments of neuropsychiatric diseases.

In addition, Anderson’s laboratory is studying interneurons derived from stem cells made from skin samples of patients with chromosome 22q.11.2 deletion syndrome, a genetic disease which has long been studied at The Children’s Hospital of Philadelphia. In this multisystem disorder, about one third of patients have autistic spectrum disorders, and a partially overlapping third of patients develop schizophrenia. Investigating the roles of genes and signaling pathways in their model cells may reveal specific genes that are crucial in those patients with this syndrome who have neurodevelopmental problems.

May 3, 2013 · 75 notes
#stem cells #embryonic stem cells #neurological disorders #brain cells #brain tissue #neuroscience #science
Study uses Botox to find new wrinkle in brain communication

National Institutes of Health researchers used the popular anti-wrinkle agent Botox to discover a new and important role for a group of molecules that nerve cells use to quickly send messages. This novel role for the molecules, called SNAREs, may be a missing piece that scientists have been searching for to fully understand how brain cells communicate under normal and disease conditions.

"The results were very surprising," said Ling-Gang Wu, Ph.D., a scientist at NIH’s National Institute of Neurological Disorders and Stroke. "Like many scientists, we thought SNAREs were only involved in fusion."

Every day almost 100 billion nerve cells throughout the body send thousands of messages through nearly 100 trillion communication points called synapses. Cell-to-cell communication at synapses controls thoughts, movements, and senses and could provide therapeutic targets for a number of neurological disorders, including epilepsy.

Nerve cells use chemicals, called neurotransmitters, to rapidly send messages at synapses. Like pellets inside shotgun shells, neurotransmitters are stored inside spherical membranes, called synaptic vesicles. Messages are sent when a carrier shell fuses with the nerve cell’s own shell, called the plasma membrane, and releases the neurotransmitter “pellets” into the synapse.

SNAREs (soluble N-ethylmaleimide-sensitive factor attachment protein receptor) are three proteins known to be critical for fusion between carrier shells and nerve cell membranes during neurotransmitter release.

"Without SNAREs there is no synaptic transmission," said Dr. Wu.

Botulinum toxin, or Botox, disrupts SNAREs. In a study published in Cell Reports, Dr. Wu and his colleagues describe how they used Botox and similar toxins as tools to show that SNAREs may also be involved in retrieving message carrier shells from nerve cell membranes immediately after release.

To study this, the researchers used advanced electrical recording techniques to directly monitor, in real time, carrier shells being fused with and retrieved from nerve cell membranes while the cells sent messages at synapses. The experiments were performed on a unique synapse involved with hearing called the calyx of Held. As expected, treating the synapses with toxins reduced fusion. However, Dr. Wu and his colleagues also noticed that the toxins reduced retrieval.

For at least a decade scientists have known that carrier shells have to be retrieved before more messages can be sent. Retrieval occurs in two modes: fast and slow. A different group of molecules is known to control the slow mode.

"Until now, most scientists thought fusion and retrieval were two separate processes controlled by different sets of molecules," said Dr. Wu.

Nevertheless several studies suggested that one of the SNARE molecules could be involved with both modes.

In this study, Dr. Wu and his colleagues systematically tested this idea to fully understand retrieval. The results showed that all three SNARE proteins may be involved in both fast and slow retrieval.

"Our results suggest that SNAREs link fusion and retrieval," said Dr. Wu.

The results may have broad implications. SNAREs are commonly used by other cells throughout the body to release chemicals. For example, SNAREs help control the release of insulin from pancreas cells, making them a potential target for diabetes treatments. Recent studies suggest that SNAREs may be involved in neurological and psychiatric disorders, such as schizophrenia and spastic ataxia.

"We think SNAREs work like this in most nerve cell synapses. This new role could change the way scientists think about how SNAREs are involved in neuronal communication and diseases," said Dr. Wu.

May 3, 2013 · 51 notes
#nerve cells #brain cells #synaptic transmission #botulinum toxin #botox #medicine #neuroscience #science
May 3, 2013 · 212 notes
#stroke #stroke symptoms #brain #medicine
Persistent pain after stressful events may have a neurobiological basis

A new study led by University of North Carolina School of Medicine researchers is the first to identify a genetic risk factor for persistent pain after traumatic events such as motor vehicle collision and sexual assault.

In addition, the study contributes further evidence that persistent pain after stressful events has a specific biological basis. A manuscript of the study was published online ahead of print by the journal Pain on April 29.

“Our study findings indicate that mechanisms influencing chronic pain development may be related to the stress response, rather than any specific injury caused by the traumatic event,” said Samuel McLean, MD, MPH, senior author of the study and assistant professor of anesthesiology. “In other words, our results suggest that in some individuals something goes wrong with the body’s ‘fight or flight’ response or the body’s recovery from this response, and persistent pain results.”

The study assessed the role of the hypothalamic-pituitary adrenal (HPA) axis, a physiologic system of central importance to the body’s response to stressful events. The study evaluated whether the HPA axis influences musculoskeletal pain severity six weeks after motor vehicle collision (MVC) and sexual assault. Its findings revealed that variation in the gene encoding for the protein FKBP5, which plays an important role in regulating the HPA axis response to stress, was associated with a 20 percent higher risk of moderate to severe neck pain six weeks after a motor vehicle collision, as well as a greater extent of body pain. The same variant also predicted increased pain six weeks after sexual assault.

"Right now, if someone comes to the emergency department after a car accident, we don’t have any interventions to prevent chronic pain from developing," McLean said. "Similarly, if a woman comes to the emergency department after sexual assault, we have medications to prevent pregnancy or sexually transmitted disease, but no treatments to prevent chronic pain. This is because we understand what causes pregnancy or infection, but we have no idea what the biologic mechanisms are that cause chronic pain. Chronic pain after these events is common and can cause great suffering, and there is an urgent need to understand what causes chronic pain so that we can start to develop interventions. This study is an important first step in developing this understanding."

"In addition, because we don’t understand what causes these outcomes, individuals with chronic pain after traumatic events are often viewed with suspicion, as if they are making up their symptoms for financial gain or having a psychological reaction," McLean said. "An improved understanding of the biology helps with this stigma."

May 3, 2013 · 92 notes
#chronic pain #stress response #traumatic events #hypothalamic-pituitary adrenal axis #genes #neuroscience #science
New brain research shows two parents may be better than one

A team of researchers at the University of Calgary’s Hotchkiss Brain Institute (HBI) has discovered that adult brain cell production might be determined, in part, by the early parental environment. The study suggests that dual parenting may be more beneficial than single parenting.

Scientists studied mouse pups that were raised by either dual or single parents and found that adult cell production in the brain might be triggered by early life experiences. The scientists also found that the increased adult brain cell production varied based on gender. Specifically, female pups raised by two parents had enhanced white matter production as adults, increasing motor coordination and sociability. Male pups raised by dual parents displayed more grey matter production as adults, which improves learning and memory.

“Our new work adds to a growing body of knowledge, which indicates that early, supportive experiences have long lasting, positive impact on adult brain function,” says Samuel Weiss, PhD, senior author of the study and director of the HBI.

Surprisingly, the advantages of dual parenting were also passed along to the next generation when these mice reproduced, even when their offspring were raised by a single female.

To conduct the study, scientists divided mice into three groups: (i) pups raised to adulthood by one female, (ii) pups raised to adulthood by one female and one male, and (iii) pups raised to adulthood by two females. Researchers then waited for the offspring to reach adulthood to find out if there was any impact on brain cell production.

Scientists say that this research provides evidence that, in the mouse model, parenting and the environment directly impact adult brain cell production. While it’s not known at this point, it is possible that similar effects could be seen in other mammals, such as humans. The study is published in the May 1 edition of PLOS ONE.

May 2, 2013 · 108 notes
#adult brain #brain cells #cell production #animal model #brain function #parenting #neuroscience #science
PTSD research: distinct gene activity patterns from childhood abuse

Abuse during childhood is different.

A study of adult civilians with PTSD (post-traumatic stress disorder) has shown that individuals with a history of childhood abuse have distinct, profound changes in gene activity patterns, compared to adults with PTSD but without a history of child abuse.

A team of researchers from Atlanta and Munich probed blood samples from 169 participants in the Grady Trauma Project, a study of more than 5000 Atlanta residents with high levels of exposure to violence, physical and sexual abuse and with high risk for civilian PTSD.

The results were published Monday, April 29 in Proceedings of the National Academy of Sciences, Early Edition.

“These are some of the most robust findings to date showing that different biological pathways may describe different subtypes of a psychiatric disorder, which appear similar at the level of symptoms but may be very different at the level of underlying biology,” says Kerry Ressler, MD, PhD, professor of psychiatry and behavioral sciences at Emory University School of Medicine and Yerkes National Primate Research Center.

“As these pathways become better understood, we expect that distinctly different biological treatments would be implicated for therapy and recovery from PTSD based on the presence or absence of past child abuse.”

Ressler, a Howard Hughes Medical Institute Investigator, is co-director of the Grady Trauma Project, along with co-author Bekh Bradley, PhD, assistant professor of psychiatry and behavioral sciences at Emory and director of the Trauma Recovery Program at the Atlanta Veterans Affairs Medical Center.

The first author of the paper is Divya Mehta, PhD, a postdoctoral fellow in Munich. The senior author is Elisabeth Binder, MD, PhD, associate professor of psychiatry and behavioral sciences at Emory and group leader at the Max-Planck Institute of Psychiatry in Munich, Germany.

Mehta and her colleagues examined changes in the patterns of which genes were turned on and off in blood cells from patients. They also looked at patterns of methylation, a DNA modification on top of the four letters of the genetic code that causes genes to be ‘silenced’ or made inactive.

Study participants were divided into three groups: people who experienced trauma without developing PTSD, people with PTSD who were exposed to child abuse, and people with PTSD who were not exposed to child abuse.

The researchers were surprised to find that although hundreds of genes had significant changes in activity in the PTSD with and without child abuse groups, there was very little overlap in patterns between these groups. The two groups shared similar symptoms of PTSD, which include intrusive thoughts such as nightmares and flashbacks, avoidance of trauma reminders, and symptoms of hyperarousal and hypervigilance.

The PTSD with child abuse group displayed more changes in genes linked with development of the nervous system and regulation of the immune system, while the PTSD minus child abuse group displayed more changes in genes linked with apoptosis (cell death) and growth rate regulation. In addition, changes in methylation were more frequent in the PTSD with child abuse group. The authors believe that these biological pathways may lead to different mechanisms of PTSD symptom formation within the brain.

The Max Planck/Emory scientists were probing gene activity in blood cells, rather than brain tissue. Similar results have been obtained by researchers studying the influence of child abuse on the brains of people who had committed suicide.

“Traumatic events that happen in childhood are embedded in the cells for a long time,” Binder says. “Not only the disease itself, but the individual’s life experience is important in the biology of PTSD, and this should be reflected in the way we treat these disorders.”

May 2, 2013 · 278 notes
#science #child abuse #PTSD #gene activity #dna methylation #blood cells #psychology #neuroscience
May 2, 2013 · 80 notes
#surgical implant #brain activity #focal epilepsy #epilepsy #seizures #neuroscience #science
May 2, 2013 · 126 notes
#science #migraines #headache #genetic mutation #cortical spreading depression #astrocytes #neuroscience
Scientists discover how brain’s auditory center transmits information for decisions and actions

When a pedestrian hears the screech of a car’s brakes, she has to decide whether, and if so, how, to move in response. Is the action taking place blocks away, or 20 feet to the left?

One of the truly primal mechanisms that we depend on every day of our lives — acting on the basis of information gathered by our sense of hearing — is yielding its secrets to modern neuroscience. A team of researchers from Cold Spring Harbor Laboratory (CSHL) today publishes experimental results in the journal Nature which they describe as surprising. The results fill in a key piece of the puzzle about how mammals act on the basis of sound cues.

It’s well known that sounds detected by the ears wind up in a part of the brain called the auditory cortex, where they are translated – transduced – into information that scientists call representations. These representations, in turn, form the informational basis upon which other parts of the brain can make decisions and issue commands for specific actions. What scientists have not understood is what happens between the auditory cortex and portions of the brain that ultimately issue commands, say, for muscles to move in response to the sound of that car’s screeching brakes.

To find out, CSHL Professor Anthony Zador and Dr. Petr Znamenskiy trained rats to listen to sounds and to make decisions based on those sounds. When a high-frequency sound is played, the animals are rewarded if they move to the left. When the sound is low-pitched, the reward is given if the animal moves right.

To the striatum

On the simplest level, says Zador, “we know that sound is coming into the ear; and we know what’s coming out in the end – a decision,” in the form of a muscle movement. The surprise, he says, is the destination of the information used by the animal to perform this task of discriminating between sounds of high and low frequency, as revealed in his team’s experiments.

“It turns out the information passes through a particular subset of neurons in the auditory cortex whose axons wind up in another part of the brain, called the striatum,” says Zador. The classic series of experiments that inspired and served as a model for this work, performed at Stanford University by William Newsome and colleagues, involved the visual system of primates; by analogy, they had led Zador to expect that representations formed in the auditory cortex would be routed to other locations within the cortex.

These experiments in rats have implications for how neural circuits make decisions, according to Zador. Even though many neurons in auditory cortex are “tuned” to low or high frequencies, most do not transmit their information directly to the striatum. Rather, their information is transmitted by a much smaller number of neurons in their vicinity, which convey their “votes” directly to the striatum.

“This is like the difference between a direct democracy and a representative democracy, of the type we have in the United States,” Zador explains. “In a direct democracy model of how the auditory cortex conveys information to the rest of the brain, every neuron activated by a low- or high-pitched sound would have a ‘vote.’ Since there is noise in every perception, some minority of neurons will indicate ‘low’ when the sound is in fact ‘high,’ and vice-versa. In the direct democracy model, the information sent to the striatum for further action would be the equivalent of a simple sum of all these votes.

“In contrast – and this is what we found to be the case – the neurons registering ‘high’ and ‘low’ are represented by a specialized subset of neurons in their local area, which we might liken to members of Congress or the Electoral College: these in turn transmit the votes of the larger population to the place — in this case the auditory striatum — in which decisions are made and actions are taken.”
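Zador’s voting analogy can be sketched as a toy simulation. To be clear, this is an illustration of the analogy only, not the team’s actual analysis; every function name and parameter below is invented for the example. Many noisy neurons "vote" on the pitch, and the readout tallies either the whole population or a small relaying subset:

```python
import random

def noisy_votes(population, error, rng):
    """Each neuron 'votes' for the true (high) pitch but errs with probability `error`."""
    return [rng.random() >= error for _ in range(population)]  # True = correct vote

def majority(votes):
    """The tallied decision: did the correct pitch win the vote?"""
    return sum(votes) * 2 >= len(votes)

def accuracy(pool_size, population=1000, error=0.3, trials=2000, seed=1):
    """Fraction of trials in which the relayed decision matches the true pitch."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        votes = noisy_votes(population, error, rng)
        # 'Representative democracy': only a small subset of neurons
        # projects to the striatum and relays the population's vote.
        correct += majority(rng.sample(votes, pool_size))
    return correct / trials

# A tally over the whole population is almost never wrong; a handful of
# 'representatives' is noisier, so exactly which neurons project matters.
print(accuracy(pool_size=1000))  # 'direct democracy': every neuron counted
print(accuracy(pool_size=5))     # small relaying subset
```

The simulation makes the point of the analogy concrete: when only a few neurons relay the population’s vote, the downstream decision inherits their noise, which is why identifying the specific striatum-projecting subset was the interesting finding.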

May 2, 2013 · 59 notes
#auditory cortex #hearing #striatum #muscle movement #neuroscience #science
May 2, 2013 · 65 notes
#alzheimer's disease #dementia #blood test #amyloid beta #biomarkers #neuroscience #science
Hypothalamus and Aging: Brain Region May Hold Key to Aging

While the search continues for the Fountain of Youth, researchers may have found the body’s “fountain of aging”: the brain region known as the hypothalamus. For the first time, scientists at Albert Einstein College of Medicine of Yeshiva University report that the hypothalamus of mice controls aging throughout the body. Their discovery of a specific age-related signaling pathway opens up new strategies for combating diseases of old age and extending lifespan. The paper was published today in the online edition of Nature.

“Scientists have long wondered whether aging occurs independently in the body’s various tissues or if it could be actively regulated by an organ in the body,” said senior author Dongsheng Cai, M.D., Ph.D., professor of molecular pharmacology at Einstein. “It’s clear from our study that many aspects of aging are controlled by the hypothalamus. What’s exciting is that it’s possible — at least in mice — to alter signaling within the hypothalamus to slow down the aging process and increase longevity.”

The hypothalamus, an almond-sized structure located deep within the brain, is known to have fundamental roles in growth, development, reproduction, and metabolism. Dr. Cai suspected that the hypothalamus might also play a key role in aging through the influence it exerts throughout the body.

“As people age,” he said, “you can detect inflammatory changes in various tissues. Inflammation is also involved in various age-related diseases, such as metabolic syndrome, cardiovascular disease, neurological disease and many types of cancer.” Over the past several years, Dr. Cai and his research colleagues showed that inflammatory changes in the hypothalamus can give rise to various components of metabolic syndrome (a combination of health problems that can lead to heart disease and diabetes).    

To find out how the hypothalamus might affect aging, Dr. Cai decided to study hypothalamic inflammation by focusing on a protein complex called NF-κB (nuclear factor kappa-light-chain-enhancer of activated B cells). “Inflammation involves hundreds of molecules, and NF-κB sits right at the center of that regulatory map,” he said.

In the current study, Dr. Cai and his team demonstrated that activating the NF-κB pathway in the hypothalamus of mice significantly accelerated the development of aging, as shown by various physiological, cognitive, and behavioral tests. “The mice showed a decrease in muscle strength and size, in skin thickness, and in their ability to learn — all indicators of aging. Activating this pathway promoted systemic aging that shortened the lifespan,” he said.

Conversely, Dr. Cai and his group found that blocking the NF-κB pathway in the hypothalamus of mouse brains slowed aging and increased median longevity by about 20 percent, compared to controls.

The researchers also found that activating the NF-κB pathway in the hypothalamus caused declines in levels of gonadotropin-releasing hormone (GnRH), which is synthesized in the hypothalamus. Release of GnRH into the blood is usually associated with reproduction. Suspecting that reduced release of GnRH from the brain might contribute to whole-body aging, the researchers injected the hormone into a hypothalamic ventricle (chamber) of aged mice. Strikingly, the hormone injections protected the mice from the impaired neurogenesis (the creation of new neurons in the brain) associated with aging. When aged mice received daily GnRH injections for a prolonged period, the therapy also slowed age-related cognitive decline, probably as a result of neurogenesis.

According to Dr. Cai, preventing the hypothalamus from causing inflammation and increasing neurogenesis via GnRH therapy are two potential strategies for increasing lifespan and treating age-related diseases. This technology is available for licensing.

May 2, 2013 · 142 notes
#hypothalamus #aging #longevity #metabolic syndrome #inflammation #neuroscience #science
Neon exposes hidden ALS cells

A small group of elusive neurons in the brain’s cortex plays a big role in ALS (amyotrophic lateral sclerosis), a swift and fatal neurodegenerative disease that paralyzes its victims. But the neurons have always been difficult to study because there are so few of them and they look so similar to other neurons in the cortex.

In a new preclinical study, a Northwestern Medicine® scientist has isolated the motor neurons in the brain that die in ALS and, for the first time, dressed them in a green fluorescent jacket. Now they’re impossible to miss and easy to study.

The cells slide on neon jackets when they are born and continue to wear them as they age and become sick. As a result, scientists will now be able to track what goes wrong in these cells to cause their deaths and be able to search for effective treatments.

"We have developed the tool to investigate what makes these cells become vulnerable and sick," said Hande Ozdinler, senior author of the study and assistant professor of neurology at Northwestern University Feinberg School of Medicine. "This was not possible before."

Ozdinler and colleagues also identified the motor neurons that don’t die, enabling scientists to study what protects them.

The study will be published in the Journal of Neuroscience on May 1.

ALS, also known as Lou Gehrig’s disease, causes the death of muscle-controlling nerve cells in the brain and spinal cord (motor neurons). It results in rapidly progressing paralysis and death usually within three to five years of the onset of symptoms.

There are about 75,000 upper motor neurons affected in ALS out of some 2 billion cells in the brain. Previously, the only way to study the upper motor neurons was to extract them through surgery, a difficult process that was beyond the scope of most scientists and still didn’t allow examination of the ailing neurons at various stages of the disease.

"You couldn’t study them at the cellular level, so the research field ignored them," Ozdinler said. She is one of the few scientists in the country who studies cortical motor neurons. Most of ALS research has focused on the death of motor neurons in the spinal cord.

Key puzzle piece: Why ALS moves so swiftly

But the brain’s motor neurons are a key piece of the ALS puzzle. Their disintegration explains why the disease advances more swiftly than other neurodegenerative diseases. It had previously been thought that the spinal motor neurons died first and their demise led to the secondary death of the brain’s motor neurons. But Ozdinler’s recent research showed that the motor neurons in the brain and spinal cord die simultaneously.

"The whole system collapses at once," Ozdinler said. "It’s degeneration from both ends which is why the disease moves so swiftly."

Every voluntary movement is initiated and modulated by upper motor neurons — answering a cell phone, typing an email, walking to the store. The upper motor neurons tell the spinal motor neurons what to do. In ALS, both the directing neurons and the neurons that create the movement disintegrate at the same time.

Finding the light that never goes out

Ozdinler spent the last four years figuring out how to permanently sheath cortical motor neurons in fluorescence.

Although scientists can flag spinal cord motor neurons with fluorescence, the label wears off as the neurons age because the process uses an embryonic gene. Ozdinler wanted a longer-lasting effect so scientists could study the neurons as they age and develop ALS. She sorted through 6,000 genes of upper motor neurons that are vulnerable to ALS before she found one — UCHL1 — that is expressed through adulthood.

She used that gene — which had been cloned with the fluorescence molecule — and created a mouse model whose upper motor neurons shimmer in green. Then she mated that mouse with an ALS transgenic mouse model. The result is a mouse with fluorescent diseased motor neurons in the brain.

"Now we have a model of one motor neuron population that dies and one that is resistant," Ozdinler said. "That’s the perfect experiment. You can ask what does this neuron have that makes it resistant and what does the other one have that makes it vulnerable? That’s what we will find out."

Marina Yasvoina, a graduate student, and Baris Genc, a postdoctoral fellow, both in Ozdinler’s lab, are the lead authors of the paper. Ozdinler collaborated with Gordon Shepherd, associate professor of physiology, and C.J. Heckman, professor in physiology, both at Feinberg.

"This work was possible thanks to the collaborative nature of Northwestern," Ozdinler said.

May 1, 2013 · 45 notes
#ALS #Lou Gehrig's disease #motor neurons #nerve cells #cortex #neuroscience #science
Professor finds neuroscience provides insights into brains of complex and adaptive leaders

“This study represents a fusion of the leadership and neuroscience fields, and this fusion can revolutionize approaches to assessing and developing leaders,” says Hannah, the Tylee Wilson Chair in business ethics and professor of management at the Wake Forest University School of Business. Hannah is lead author of the paper in the May 2013 Journal of Applied Psychology titled, “The Psychological and Neurological Bases of Leader Self-Complexity and Effects on Adaptive Decision-Making.”

Hannah and four colleagues tested 103 young military leaders between the ranks of officer cadet and major at a U.S. Army base on the east coast. They administered psychological exams to assess the complexity of leaders’ identities, and neurological exams to assess the complexity of soldiers’ brain activity. For the brain tests, the researchers attached quantitative electroencephalogram (qEEG) electrodes to 19 areas of the soldier’s scalp.

Hannah and his fellow researchers wanted to know whether great leaders have more complex brains, as measured by electrodes reporting which parts of the brain were firing together at the same time. A low-complexity brain shows many areas operating at the same time at the same electrical amplitude and frequency, which suggests those areas converge to process the same task, leaving fewer brain resources for other tasks and processes. This is a process called “phase lock.”

But in high-complexity brains, the activity patterns are far more differentiated and varied, which suggests more of the brain’s resources are available at any one time to handle other situations or tasks.

“Think of it as a single core versus a multicore computer’s central processing unit (CPU),” Hannah says. “A multicore CPU can multitask because one core can process a task while the other CPU cores remain free to process new tasks. More complex brains are also more efficient in locking together only the brain resources needed to process a task and then efficiently releasing them when no longer needed.”
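The “phase lock” idea can be made concrete with the phase-locking value (PLV), a standard measure of how tightly two EEG channels oscillate together. The sketch below is a toy illustration of that measure, not the study’s actual qEEG analysis; the signals and parameters are invented for the example:

```python
import cmath
import math
import random

def plv(phases_a, phases_b):
    """Phase-locking value: 1.0 = perfectly locked phases, near 0 = independent."""
    diffs = [cmath.exp(1j * (a - b)) for a, b in zip(phases_a, phases_b)]
    return abs(sum(diffs) / len(diffs))

rng = random.Random(0)
t = [i / 250.0 for i in range(1000)]  # 4 s of samples at 250 Hz

# Two channels riding the same 10 Hz rhythm at a fixed lag: "phase-locked"
locked_a = [2 * math.pi * 10 * x for x in t]
locked_b = [2 * math.pi * 10 * x + 0.8 for x in t]

# A channel drifting at its own frequency with random jitter: independent activity
free_b = [2 * math.pi * 13.7 * x + rng.uniform(-math.pi, math.pi) for x in t]

print(plv(locked_a, locked_b))  # high: areas firing in lock-step
print(plv(locked_a, free_b))    # low: differentiated, independent activity
```

In the terms of the article, many channel pairs with a PLV near 1 would correspond to a low-complexity, heavily phase-locked brain, while lower pairwise PLVs correspond to the more differentiated activity patterns described next.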

The study showed that the high-complexity brains of the great leaders had a different “landscape”: the scans revealed more differentiated activation patterns in the frontal and prefrontal lobes of leaders who demonstrated greater decisiveness, adaptive thinking and positive action orientation in the experiment.

“Further, individuals who have developed richer and more elaborate self-concepts as leaders were found to be more complex and adaptable,” Hannah says. “These findings have important implications for identifying and developing leaders who can lead effectively in today’s changing, dynamic, and often volatile organizational contexts.”

The research team suggests that once it validates neurological profiles of leaders with high-complexity brains, it will be able to use established techniques like neurofeedback to enhance these leadership skills in others. Neurofeedback has been used successfully in the training of elite athletes, concert musicians and financial traders. These profiles can also be used to assess leaders and track their development over time.

These findings have relevance to the WFU Schools of Business’ new student development framework, which focuses on developing practical wisdom, strategic thinking and critical thinking skills, along with the ability to embrace complexity and ambiguity.

Hannah’s co-authors include Pierre Balthazard, dean of the School of Business at Saint Bonaventure University; David A. Waldman, professor of business at Arizona State University; Peter L. Jennings, of the Center for the Army Profession and Ethic at West Point; and Robert W. Thatcher of the University of South Florida.

This research team is at the forefront of applying neuroscience to study effective leadership. The team previously published a 2012 paper in the Leadership Quarterly, which identified unique brain functioning in leaders who are seen by their followers as highly inspirational and charismatic.

May 1, 2013 · 59 notes
#brain activity #leadership #decision-making #prefrontal cortex #neuroscience #science
Musical memory deficits start in auditory cortex

Congenital amusia is a disorder characterized by impaired musical skills, which can extend to an inability to recognize very familiar tunes. The neural bases of this deficit are now being deciphered. According to a study conducted by researchers from CNRS and Inserm at the Centre de Recherche en Neurosciences de Lyon (CNRS / Inserm / Université Claude Bernard Lyon 1), amusics exhibit altered processing of musical information in two regions of the brain: the auditory cortex and the frontal cortex, particularly in the right cerebral hemisphere. These alterations seem to be linked to anatomical anomalies in these same cortices. This work, published in May in the journal Brain, adds invaluable information to our understanding of amusia and, more generally, of the “musical brain”, in other words the cerebral networks involved in the processing of music.

Congenital amusia, which affects between 2 and 4% of the population, can manifest itself in various ways: by difficulty in hearing a “wrong note”, by singing “out of tune” and sometimes by an aversion to music. For some of these individuals, music is like a foreign language or a simple noise. Amusia is not due to any auditory or psychological problem and does not seem to be linked to other neurological disorders. Research on the neural bases of this impairment only began a decade ago with the work of the Canadian neuropsychologist Isabelle Peretz.

Two teams from the Centre de Recherche en Neurosciences de Lyon (CNRS / Inserm / Université Claude Bernard Lyon 1) have studied the encoding of musical information and the short-term memorization of notes. According to previous work, amusical individuals experience particular difficulty in hearing the pitch of notes (low or high) and, although they remember sequences of words normally, they have difficulty in memorizing sequences of notes.

To pinpoint the brain regions involved in these memorization difficulties, the researchers recorded magnetoencephalography (MEG) signals (a technique that measures, at the surface of the head, the very weak magnetic fields produced by neural activity) from a group of amusics while they performed a musical task. The task consisted of listening to two tunes separated by a two-second gap; the volunteers were asked to determine whether the tunes were identical or different.

The scientists observed that, when hearing and memorizing notes, amusics exhibited altered sound processing in two regions of the brain: the auditory cortex and the frontal cortex, essentially in the right hemisphere. Compared to non-amusics, their neural activity was delayed and impaired in these specific areas when encoding musical notes. These anomalies occurred 100 milliseconds after the start of a note.

These results agree with an anatomical observation that the researchers confirmed using MRI: amusic individuals have an excess of grey matter in the inferior frontal cortex, accompanied by a deficit in white matter, one of whose essential constituents is myelin. Myelin surrounds and protects the axons of neurons, helping nerve signals propagate rapidly. The researchers also observed anatomical anomalies in the auditory cortex. These data lend weight to the hypothesis that amusia could be due to insufficient communication between the auditory cortex and the frontal cortex.

Amusia thus stems from impairments that arise at the very first stages of sound processing in the auditory nervous system. This work makes it possible to envisage a program to remedy these musical difficulties by targeting the early steps of sound processing and memorization.

May 1, 2013 · 91 notes
#congenital amusia #auditory cortex #pitch perception #memory #music #neuroscience #science
Tiny worm sheds light on giant mystery about neurons

Scientists have identified a gene that keeps our nerve fibers from clogging up. Researchers in Ken Miller’s laboratory at the Oklahoma Medical Research Foundation (OMRF) found that the unc-16 gene of the roundworm Caenorhabditis elegans encodes a gatekeeper that restricts flow of cellular organelles from the cell body to the axon, a long, narrow extension that neurons use for signaling. Organelles clogging the axon could interfere with neuronal signaling or cause the axon to degenerate, leading to neurodegenerative disorders. This research, published in the May 2013 Genetics Society of America’s journal GENETICS, adds an unexpected twist to our understanding of trafficking within neurons.

Proteins equivalent to UNC-16 are present in the neurons of all animals, including humans, and are known to interact with proteins associated with neurodegenerative disorders in humans (Hereditary Spastic Paraplegia) and mice (Legs at Odd Angles). However, the underlying cause of these disorders is not well understood.

"Our UNC-16 study provides the first insights into a previously unrecognized trafficking system that protects axons from invasion by organelles from the cell soma," Dr. Miller said. "A breakdown in this gatekeeper may be the underlying cause of this group of disorders," he added.

The use of the model organism C. elegans, a tiny, translucent roundworm with only 300 neurons, enabled the discovery because the researchers were able to apply complex genetic techniques and imaging methods in living organisms, which would be impossible in larger animals. Dr. Miller’s team tagged organelles with fluorescent proteins and then used time-lapse imaging to follow the movements of the organelles. In normal axons, organelles exited the cell body and entered the initial segment of the axon, but did not move beyond that. In axons of unc-16 mutants, the organelles hitched a ride on tiny motors that carried them deep into the axon, where they accumulated.

Dr. Miller acknowledges there are still a lot of unanswered questions. His lab is currently investigating how UNC-16 performs its crucial gatekeeper function by looking for other mutant worms with similar phenotypes. A Commentary on the article, also published in this issue of GENETICS, calls the work “provocative”, and highlights several important questions prompted by this pioneering study.

"This research once again shows how studies of simple model organisms can bring insight into complex neurodegenerative diseases in humans," said Mark Johnston, Editor-in-Chief of the journal GENETICS. “This kind of basic research is necessary if we are to understand diseases that can’t easily be studied in more complex animals.”

May 1, 2013 · 63 notes
#C. elegans #organelles #neurodegenerative diseases #neurons #proteins #neuroscience #science
Paralyzed Patient Moves Prosthetic Arm With Her Mind

It sounds like science fiction, but researchers are gaining ground in developing mind-controlled robotic arms that could give people with paralysis or amputated limbs more independence.

The technology, known as brain-computer (or brain-machine) interface, is in its infancy as far as human use — though scientists have been studying the concept for years. But experts say that people with paralysis or amputations could be using the technology at home within the next decade.

It basically boils down to people using their thoughts to control a robot arm that then performs a desired task, like grasping and moving a cup. That’s done via tiny electrode “grids” implanted in the brain that read the movement signals firing from individual nerve cells, then translate them to the robot arm.

"We have the ability to capture information from the brain and use it to control the robotic arm," said Dr. Elizabeth Tyler-Kabara, who presented her team’s latest findings on the technology Tuesday, at the annual meeting of the American Association of Neurological Surgeons, in New Orleans.

However, she stressed, “we still have a ton to learn.”

Right now, the robot arm is confined to the lab. After getting their electrodes implanted, study patients come to the lab to work with the robotic limb under the researchers’ supervision. So far, Tyler-Kabara and her colleagues at the University of Pittsburgh School of Medicine have tested the approach in one patient. Researchers at Brown University in Providence, R.I., have done it in a handful of others.

One of the big questions, Tyler-Kabara said, is “how much control is enough?” That is, how well does the mind-controlled arm need to work to bring real everyday benefits to people?

At the meeting on Tuesday, Tyler-Kabara presented an update on how her team’s patient is faring. The 53-year-old woman had long-standing quadriplegia due to a disease called spinocerebellar degeneration — where, for unknown reasons, the connections between the brain and muscles slowly deteriorate.

Tyler-Kabara performed the surgery, where two tiny electrode grids were placed in the area of the brain that would normally control the movement of the right hand and arm. The electrode points penetrate the brain’s surface by about one-sixteenth of an inch.

"The idea is pretty scary," Tyler-Kabara acknowledged. But her team’s patient had no complications from the surgery and left the hospital the next day. There’ve been no longer-term problems either, she said — though, in theory, there would be concerns about infection or bleeding over the long haul.

The surgery left the patient with two terminals that protrude through her skull. The researchers used those to connect the implanted electrodes to a computer, where they could see brain cells firing when the patient thought about moving her hand.

She was quickly able to master simple movements with the robotic arm, like high-fiving the researchers. And after six months, she was performing “10-degrees-of-freedom” movements, Tyler-Kabara reported at the meeting.

That includes not only moving the arm, but also flexing and rotating the wrist, grasping objects and affecting several different hand “postures.” She has accomplished feats like feeding herself chocolate.

The researchers initially used a computer in training sessions with the patient, but the robot arm is now directly linked to the electrodes, so there is no need for “computer assistance,” according to Tyler-Kabara.

Still, before the technology can ultimately be used at home, she said, researchers have to devise a “fully implanted” wireless system for controlling the robot arm.

Another expert offered his perspective on the new technology.

"This is one more encouraging step toward developing something practical that people can use in their daily lives," said Dr. Robert Grossman, a neurosurgeon at Methodist Neurological Institute in Houston, who was not involved in the research.

It’s hard to put a timeline on it all, Grossman said, since technological advances could change things. He also noted that several research groups are looking at different approaches to brain-computer interfaces.

One, Grossman said, is to do it noninvasively, through electrodes placed on the scalp.

Study author Tyler-Kabara said that noninvasive approach has met with success in helping people perform simple tasks, like moving a cursor on a computer screen. “But I don’t think it will ever be good enough for performing complicated tasks,” she said, noting that it can’t work as precisely as the implanted electrodes.

A next step, Tyler-Kabara said, is to develop a “two-way” electrode system that stimulates the brain to generate sensation — with the aim of helping people adjust the robot’s grip strength.

She said there is also much to learn about which people will ultimately be good candidates for the technology. There may, for example, be some brain injuries that prevent people from benefiting.

Because this study was presented at a medical meeting, the data and conclusions should be viewed as preliminary until published in a peer-reviewed journal.

May 1, 2013 · 264 notes
#BCI #robots #robotics #prosthetic limbs #prosthetic arm #neuroscience #science
Researchers Successfully Treat Autism in Infants

Most infants respond to a game of peek-a-boo with smiles at the very least, and, for those who find the activity particularly entertaining, gales of laughter. For infants with autism spectrum disorders (ASD), however, the game can be distressing rather than pleasant, and they’ll do their best to tune out all aspects of it –– and that includes the people playing with them.

That disengagement is a hallmark of ASD, and one of the characteristics that amplifies the disorder as infants develop into children and then adults.

A study conducted by researchers at the Koegel Autism Center at UC Santa Barbara has found that replacing such games with those the infant prefers can actually lessen the severity of the infants’ ASD symptoms, and, perhaps, alleviate the condition altogether. Their work is highlighted in the current issue of the Journal of Positive Behavior Interventions.

Lynn Koegel, clinical director of the center and the study’s lead author, described the game-playing protocol as a modified Pivotal Response Treatment (PRT). Developed at UCSB, PRT is based on principles of positive motivation. The researchers identified the activities that seemed to be more enjoyable to the infants and taught the respective parents to focus on those rather than on the typical games they might otherwise choose. “We had them play with their infants for short periods, and then give them some kind of social reward,” Koegel said. “Over time, we conditioned the infants to enjoy all the activities that were presented by pairing the less desired activities with the highly desired ones.” The social reward is preferable to, say, a toy, Koegel noted, because it maintains the ever-crucial personal interaction.

"The idea is to get them more interested in people," she continued, "to focus on their socialization. If they’re avoiding people and avoiding interacting, that creates a whole host of other issues. They don’t form friendships, and then they don’t get the social feedback that comes from interacting with friends."

According to Koegel, by the end of the relatively short one- to three-month intervention period, which included teaching the parents how to implement the procedures, all the infants in the study had normal reactions to stimuli. “Two of the three have no disabilities at all, and the third is very social,” she said. “The third does have a language delay, but that’s more manageable than some of the other issues.”

On a large scale, Koegel hopes to establish some benchmark for identifying social deficits in infants so parents and health care providers can intervene sooner rather than later. “We have a grant from the Autism Science Foundation to look at lots of babies and try to really figure out which signs are red flags, and which aren’t,” she said. “A number of the infants who show signs of autism will turn out to be perfectly fine; but we’re saying, let’s not take the risk if we can put an intervention in play that really works. Then we don’t have to worry about whether or not these kids would develop the full-blown symptoms of autism.”

Historically, ASD is diagnosed in children 18 months or older, and treatment generally begins around 4 years. “You can pretty reliably diagnose kids at 18 months, especially the more severe cases,” said Koegel. “The mild cases might be a little harder, especially if the child has some verbal communication. There are a few measures –– like the ones we used in our study –– that can diagnose kids pre-language, even as young as six months. But ours was the first that worked with children under 12 months and found an effective intervention.”

Given the increasing number of children being diagnosed with ASD, Koegel’s findings could be life altering –– literally. “When you consider that the recommended intervention for preschoolers with autism is 30 to 40 hours per week of one-on-one therapy, this is a fairly easy fix,” she said. “We did a single one-hour session per week for four to 12 weeks until the symptoms improved, and some of these infants were only a few months old. We saw a lot of positive change.”

May 1, 2013 · 171 notes
#ASD #autism #infants #socialization #social interaction #psychology #neuroscience #science

April 2013

Apr 30, 2013 · 165 notes
#language #speech #speech perception #language processing #linguistics #psychology #neuroscience #science
Size, wiring of brain structures in kids predict benefit from math tutoring

Why do some children learn math more easily than others? Research from the Stanford University School of Medicine has yielded an unexpected new answer.

In a study of third-graders’ responses to math tutoring, Stanford scientists found that the size and wiring of specific brain structures predicted how much an individual child would benefit from math tutoring. However, traditional intelligence measures, such as children’s IQs and their scores on tests of mathematical ability, did not predict improvements from tutoring.

The research is the first to use brain scans to look for a link between math-learning abilities and brain structure or function, and also the first to compare neural and cognitive predictors of kids’ responses to tutoring. In addition, it provides information on the differences between how children and adults learn math, and could help researchers understand the origins of math-learning disabilities.

The study was published online April 29 in Proceedings of the National Academy of Sciences.

"What was really surprising was that intrinsic brain measures can predict change - we can actually predict how much a child is going to learn during eight weeks of math tutoring based on measures of brain structure and connectivity," said Vinod Menon, PhD, the study’s senior author and a professor of psychiatry and behavioral sciences. Menon is also a member of the Child Health Research Institute at Lucile Packard Children’s Hospital.

"The results are a significant step toward the development of targeted learning programs based on a child’s current as well as predicted learning trajectory," said the study’s lead author, Kaustubh Supekar, PhD, postdoctoral scholar in psychiatry and behavioral sciences.

Menon’s team focused on third-grade students ages 8 and 9 because these children are at a critical stage for acquiring basic arithmetic skills. The study included 24 third-graders who participated in a well-validated program of 15 to 20 hours of individualized math tutoring over eight weeks. The tutors explained new concepts to children and also got them to practice math skills with an emphasis on speed, and the sessions were tailored to each child’s level of understanding.

Before tutoring began, the children were given several standard neuropsychological assessments, including tests of IQ, working memory, reading and math-problem-solving abilities. Both before and after the eight-week tutoring period, children’s arithmetic performance was tested, and all children had structural and functional magnetic resonance imaging scans performed on their brains. To control for the effects of math instruction the children received at school (rather than during tutoring), a comparison group of 16 third-grade children who did not receive tutoring, but who had the same testing and brain scans before and after an eight-week interval, was also included in the study.

All 24 children receiving tutoring improved their arithmetic performance. Their performance efficiency, a composite measure of accuracy and speed of problem solving, improved an average of 67 percent after tutoring. But individual gains varied widely, ranging from 8 percent to 198 percent improvement. The children who did not receive tutoring did not show any change in arithmetic performance during the study.
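The article describes “performance efficiency” as a composite of accuracy and speed but does not give the study’s formula. As a purely illustrative sketch (one common way to build such a composite, not necessarily the one the researchers used), efficiency can be taken as proportion correct divided by mean response time, with improvement expressed as the relative pre-to-post gain; all numbers below are made up:

```python
# Hypothetical sketch: combine accuracy and speed into one "performance
# efficiency" score as correct responses per unit time. The study's exact
# formula is not given in the article, so treat this as illustrative only.

def performance_efficiency(n_correct, n_items, mean_rt_seconds):
    """Accuracy (proportion correct) divided by mean response time."""
    accuracy = n_correct / n_items
    return accuracy / mean_rt_seconds

def percent_improvement(before, after):
    """Relative gain from pre- to post-tutoring, in percent."""
    return 100.0 * (after - before) / before

# Invented example: a child goes from 20/30 correct at 4.0 s per item
# to 27/30 correct at 2.5 s per item over eight weeks of tutoring.
pre = performance_efficiency(20, 30, 4.0)    # ~0.167
post = performance_efficiency(27, 30, 2.5)   # 0.36
gain = percent_improvement(pre, post)        # ~116 percent
```

A composite like this rewards getting faster and more accurate at the same time, which matches the article’s description of wide individual variation in gains.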

When the researchers analyzed the children’s structural brain scans, they found that larger gray matter volume in three brain structures predicted greater ability to benefit from math tutoring. (The predictions were generated with a machine learning algorithm, the same type of data-analysis tool used to create movie recommendations for users of websites like Netflix, for example.) Of the three structures, the best predictor of improvement with tutoring was a larger hippocampus, a structure traditionally considered one of the brain’s most important memory centers. Functional connections between the hippocampus and several other brain regions, especially the prefrontal cortex and basal ganglia, also predicted ability to benefit from tutoring. These regions are important for forming long-term memories.
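The study’s actual machine-learning pipeline is not detailed in the article. As a rough sketch of the underlying idea (predicting each child’s tutoring gain from structural brain measures), here is a toy linear regression on entirely synthetic data; the feature names and weights are invented for illustration:

```python
# Illustrative sketch (not the study's actual analysis): fit a linear model
# predicting each child's tutoring gain from structural brain measures.
# All data below are synthetic; the weights encode the article's finding
# that hippocampal measures were the strongest predictor.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 24 children x 3 standardized structural features
# (hippocampal, basal-ganglia, prefrontal gray-matter volume).
X = rng.normal(size=(24, 3))
true_weights = np.array([1.5, 0.6, 0.4])  # hippocampus weighted most
gains = X @ true_weights + rng.normal(scale=0.3, size=24)

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, gains, rcond=None)

predicted = A @ coef
r = np.corrcoef(predicted, gains)[0, 1]
print(f"fitted weights: {coef[1:].round(2)}, fit correlation r = {r:.2f}")
```

In practice a study like this would validate such a model on held-out children (e.g. cross-validation) rather than report in-sample fit, but the core idea is the same: brain measures in, predicted learning gain out.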

"The part of the brain that is recruited in memories for places and events also plays a pivotal role in determining how much and how well a child learns math," Supekar said.

None of the neuropsychological assessment scores, such as IQ or tests of working memory, could predict how much an individual child would benefit from tutoring.

The brain systems highlighted by this study - including the hippocampus, basal ganglia and prefrontal cortex - are different from those previously implicated for math learning in adults, the researchers noted. When solving math problems, adults rely on brain regions that are specialized for representing complex visual objects and processing spatial information.

And the findings suggest that the tutoring approach used, which was tailored to each child’s level of understanding and included lots of repetitive, high-speed arithmetic practice to help cement facts in children’s heads, works because it is compatible with the way their brains encode facts. “Memory resources provided by the hippocampal system create a scaffold for learning math in the developing brain,” Menon said. “Our findings suggest that, while conceptual knowledge about numbers is necessary for math learning, repeated, speeded practice and testing of simple number combinations is also needed to encode facts and encourage children’s reliance on retrieval - the most efficient strategy for answering simple arithmetic problems.” Once kids are able to pull up answers to basic arithmetic problems automatically from memory, their brains can tackle more complex problems.

The researchers’ next steps will include comparing brain structure and wiring in children with and without math learning disabilities, analyzing how the wiring of the brain changes in response to tutoring and examining whether lower-performing children’s brains can be exercised to help them learn math. “We’re pushing a very ecologically relevant model of learning,” Menon said. “Academic instruction should rely on validated instructional principles while incorporating individualized training to provide feedback on whether students are right or wrong, how they’re wrong and how they can improve their math skills.”

Apr 30, 2013 · 78 notes
#children #math tutoring #brain connections #brain scans #psychology #neuroscience #science
Ear-witness precision: Congenitally blind people have more accurate memories

Distortions and illusions within human memory are well documented in scientific and forensic work and appear to be a basic feature of memory functioning.

Yet several studies suggest that blind individuals, especially those without any visual experience, possess superior verbal and memory skills.

Researchers from the Department of Psychology ran memory tests on groups of congenitally blind people, people with late-onset blindness, and sighted people, in collaboration with a research assistant at Queen Mary, University of London.

Each participant was asked to listen to a series of word lists and then recall the words they heard. Past research has found that such word lists normally cause people to falsely “remember” words that are related to those heard but that were never actually presented. For example, hearing ‘chimney’, ‘cigar’, and ‘fire’ can prompt some people to produce a false memory of the word ‘smoke’ when asked to recall the list.
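The scoring logic of this word-list paradigm can be sketched in a few lines. The studied list extends the text’s example with hypothetical items, and the recalled responses are invented for illustration:

```python
# Minimal sketch of scoring the word-list paradigm described above.
# "chimney", "cigar", "fire" and the lure "smoke" follow the text's example;
# the extra studied items and the recalled responses are hypothetical.

def score_recall(studied, critical_lure, recalled):
    """Count correctly recalled studied words and flag a false memory
    of the related-but-unstudied 'critical lure' word."""
    studied_set = {w.lower() for w in studied}
    recalled_set = {w.lower() for w in recalled}
    correct = recalled_set & studied_set
    false_memory = critical_lure.lower() in recalled_set
    return len(correct), false_memory

studied = ["chimney", "cigar", "fire", "ash", "match"]
hits, lured = score_recall(studied, "smoke", ["fire", "smoke", "chimney"])
# hits == 2 correct words; lured == True, i.e. the related word "smoke"
# was falsely "remembered" even though it was never presented.
```

Under this scoring, the study’s finding amounts to congenitally blind participants producing a higher hit count and a lower false-memory rate than the other groups.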

The researchers found that the congenitally blind participants not only remembered more words but were also less likely to create false memories of the related words. In contrast, the sighted and late-blind participants remembered fewer words and were much more likely to falsely remember related words that had never been read to them.

Dr Achille Pasqualotto, postdoctoral researcher and first author of the study, said: “We found that congenitally blind participants reported significantly more correct words than both late onset blind and sighted people. Most of the congenitally blind participants avoided unrelated words, therefore congenitally blind participants can store more items and with a higher fidelity.”

Dr Michael Proulx who led the study added: “Our results show that visual experience has a significant negative impact on both the number of items remembered and the accuracy of semantic memory and also demonstrate the importance of adaptive neural plasticity in the congenitally blind brain for enhanced memory retrieval mechanisms.

“There is an old Hebrew proverb that believes the blind were the most trustworthy sources for quotations and that certainly seems true in this case. It will be interesting to see whether congenitally blind individuals would also be better witnesses in forensic studies.”

The research is from the paper “Congenital blindness improves semantic and episodic memory”, published in the journal Behavioural Brain Research.

Apr 30, 2013 · 69 notes
#congenital blindness #false memories #memory #visual experience #psychology #neuroscience #science
Sniffing Out Schizophrenia

Neurons in the nose could be the key to early, fast, and accurate diagnosis, says a TAU researcher

A debilitating mental illness, schizophrenia can be difficult to diagnose. Because physiological evidence confirming the disease can only be gathered from the brain during an autopsy, mental health professionals have had to rely on a battery of psychological evaluations to diagnose their patients.

Now, Dr. Noam Shomron and Prof. Ruth Navon of Tel Aviv University’s Sackler Faculty of Medicine, together with PhD student Eyal Mor from Dr. Shomron’s lab and Prof. Akira Sawa of Johns Hopkins Hospital in Baltimore, Maryland, have discovered a method for physical diagnosis — by collecting tissue from the nose through a simple biopsy. Surprisingly, collecting and sequencing neurons from the nose may lead to “more sure-fire” diagnostic capabilities than ever before, Dr. Shomron says.

This finding, which was reported in the journal Neurobiology of Disease, could not only lead to a more accurate diagnosis, it may also permit the crucial, early detection of the disease, giving rise to vastly improved treatment overall.

From the nose to diagnosis

Until now, biomarkers for schizophrenia had only been found in the neuron cells of the brain, which can’t be collected before death. By that point it’s obviously too late to do the patient any good, says Dr. Shomron. Instead, psychiatrists depend on psychological evaluations for diagnosis, including interviews with the patient and reports by family and friends.

For a solution to this diagnostic dilemma, the researchers turned to the olfactory system, which includes neurons located on the upper part of the inner nose. Researchers at Johns Hopkins University collected samples of olfactory neurons from patients diagnosed with schizophrenia and a control group of non-affected individuals, then sent them to Dr. Shomron’s TAU lab.

Dr. Shomron and his fellow researchers applied a high-throughput technology to these samples, studying the microRNA of the olfactory neurons. Within these molecules, which help to regulate our genetic code, they were able to identify a microRNA which is highly elevated in those with schizophrenia, compared to individuals who do not have the disease.

"We were able to narrow down the microRNA to a differentially expressed set, and from there down to a specific microRNA which is elevated in individuals with the disease compared to healthy individuals," explains Dr. Shomron. Further research revealed that this particular microRNA controls genes associated with the generation of neurons.

In practice, material for biopsy could be collected through a quick and easy outpatient procedure, using a local anesthetic, says Dr. Shomron. And with microRNA profiling results ready in a matter of hours, this method could evolve into a relatively simple and accurate test to diagnose a very complicated illness.

Early detection, early intervention

Though there is much more to investigate, Dr. Shomron has high hopes for this diagnostic method. It’s important to determine whether this alteration in microRNA expression begins before schizophrenic symptoms begin to exhibit themselves, or only after the disease fully develops, he says. If this change comes near the beginning of the timeline, it could be invaluable for early diagnostics. This would mean early intervention, better treatment, and possibly even the postponement of symptoms.

If, for example, a person has a family history of schizophrenia, this test could reveal whether they too suffer from the disease. And while such advanced warning doesn’t mean a cure is on the horizon, it will help both patient and doctor identify and prepare for the challenges ahead.

Apr 30, 2013 · 115 notes
#schizophrenia #olfactory system #diagnosis #neurons #microRNA #neuroscience #science
New subtype of ataxia identified

The finding opens the door to presymptomatic diagnostics and genetic counselling for patients, and is a first step toward identifying the cause and developing therapies

(Image: Antony Gormley)

Researchers from the Germans Trias i Pujol Health Sciences Research Institute Foundation (IGTP), the Bellvitge Biomedical Research Institute (IDIBELL), and the Sant Joan de Déu de Martorell Hospital have identified a new subtype of ataxia, a rare disease without treatment that causes atrophy of the cerebellum and affects around 1.5 million people worldwide. The results were published online on April 29 in the journal JAMA Neurology.

Ataxia has diverse genetic causes, which is why it is classified into subtypes. The new subtype described by the researchers has been named SCA37. The study found this subtype in members of the same family living in Barcelona, Huelva, Madrid and Salamanca (Spain). In the medium term, the finding will allow these families, and anyone carrying the identified genetic alteration, to receive personalized therapies and a diagnosis before the disease develops. The study was funded by the 2009 edition of La Marató de TV3 (the Catalan public television telethon), which was dedicated to rare diseases.

The cerebellum is a part of the brain, located at the back of the skull beneath the cerebral hemispheres, that coordinates, among other functions, the movements of the human body. When it atrophies, movement disorders appear, and as the ataxia progresses, patients suffer frequent falls and swallowing problems. Progressively, they end up needing a wheelchair. Until now, more than 30 different subtypes of ataxia have been identified, the first of which was described in 1993 by Dr. Antoni Matilla, head of the Neurogenetics Unit at IGTP, and Dr. Victor Volpini, head of the Center for Molecular Genetic Diagnosis at IDIBELL.

The publication of this paper was made possible thanks to the collaboration of the Hospital de Sant Pau, Universitat Pompeu Fabra and the Pitié-Salpêtrière Hospital in Paris.

Particular eye movements

The first symptoms of ataxia may appear in childhood or adulthood, depending on the subtype. The SCA37 subtype, the first cases of which were identified by Carme Serrano, a neurologist at the Sant Joan de Déu Hospital in Martorell (Barcelona), first manifests at an average age of 48. One of its distinguishing features is difficulty with vertical eye movements. Besides the patients identified in Spain by Dr. Serrano and the Germans Trias and IDIBELL researchers, there is evidence of more people affected by this subtype of ataxia in France, Holland and Britain, so it appears to be a fairly prevalent subtype in Europe.

All SCA37 patients share a genetic alteration in band 32 of the short arm of chromosome 1, a region containing about a hundred genes. Researchers are currently sequencing this region with next-generation technologies to find the specific mutation that causes the ataxia. Once it is found, it will be possible to make an accurate diagnosis in family members who have not yet developed symptoms. It will also be possible to investigate the biological mechanisms that cause the ataxia in order to develop and implement personalized therapies, whether with drugs or stem cell therapy.

Apr 30, 2013 · 43 notes
#ataxia #cerebellum #genetic alteration #SCA37 subtype #eye movements #neuroscience #science
Microglia Can Be Derived From Patient-Specific Human Induced Pluripotent Stem Cells and May Help Modulate the Course of Central Nervous System Diseases

Today, during the 81st American Association of Neurological Surgeons (AANS) Annual Scientific Meeting, researchers announced new findings regarding the development of methods to turn human induced pluripotent stem cells (iPSC) into microglia, which could be used for not only research but potentially in treatments for various diseases of the central nervous system (CNS).

Microglia are the resident inflammatory cells of the CNS and can modulate the outcomes of a wide range of disorders including trauma, infections, stroke, brain tumors, and various degenerative, inflammatory and psychiatric diseases. However, the effective therapeutic use of microglia demonstrated in various animal CNS disease models currently cannot be translated to patients due to the lack of methods for procuring high-purity patient-specific microglia. Developing a method for obtaining these cells would be highly valuable.

In the study Differentiation of Induced Pluripotent Stem Cells to Microglia for Treatment of CNS Diseases, mouse and human iPSCs were generated and sequentially co-cultured on various cell monolayers and in the presence of added growth factors. The microglial identity of the resulting cells was confirmed using fluorescence activated cell sorting analyses, functional assays, gene expression analyses and brain engraftment ability. The study results will be shared by presenting author John K. Park, MD, PhD, FAANS, from 3:34-3:42 p.m. on Monday, April 29. Co-authors are Michael Shen, BS; Yong Choi, PhD; and Hetal Pandya, PhD.

In the results, researchers found that mouse and human iPSCs co-cultured with OP9 cells differentiate into hematopoietic progenitor cells (HPCs). HPCs co-cultured in turn with astrocytes generate cells that express CD11b, Iba-1 and CX3CR1; secrete the cytokines IL-6, IL-1β and TNF-α; generate reactive oxygen species; and phagocytose fluorescent particles, all consistent with a microglial phenotype. Gene expression clustering using self-organizing maps indicates that iPSC-derived microglia more closely resemble normal microglia than other inflammatory cell types. The iPSC-derived microglia engraft and migrate to areas of injury within the brain. These findings have led researchers to conclude that iPSC-derived microglia may one day be useful as gene and protein delivery vehicles to the CNS.
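The clustering step mentioned above, grouping expression profiles with a self-organizing map, can be illustrated in code. The sketch below is not the authors' pipeline; it is a minimal 1-D self-organizing map in plain NumPy, trained on simulated expression vectors, with all function names and parameters invented for illustration.

```python
import numpy as np

def train_som(data, n_units=8, n_iters=500, lr0=0.5, sigma0=2.0, seed=0):
    """Train a 1-D self-organizing map on rows of `data` (samples x features)."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = data.shape
    # Initialize unit weights from randomly chosen data points, lightly jittered
    weights = data[rng.integers(0, n_samples, n_units)].astype(float)
    weights += rng.normal(scale=1e-3, size=weights.shape)
    grid = np.arange(n_units)
    for t in range(n_iters):
        x = data[rng.integers(0, n_samples)]
        # Best matching unit: the unit whose weight vector is closest to x
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
        # Linearly decaying learning rate and neighborhood radius
        frac = t / n_iters
        lr = lr0 * (1 - frac)
        sigma = sigma0 * (1 - frac) + 0.5
        # Gaussian neighborhood around the BMU on the 1-D grid
        h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)
    return weights

def assign_clusters(data, weights):
    """Map each sample to the index of its best matching unit."""
    return np.array([np.argmin(np.linalg.norm(weights - x, axis=1))
                     for x in data])

# Hypothetical usage: two simulated cell types with distinct expression means
rng = np.random.default_rng(1)
type_a = rng.normal(0.0, 0.1, size=(20, 5))   # e.g. iPSC-derived microglia
type_b = rng.normal(5.0, 0.1, size=(20, 5))   # e.g. another inflammatory cell type
profiles = np.vstack([type_a, type_b])
w = train_som(profiles, n_units=6, n_iters=400, seed=0)
labels = assign_clusters(profiles, w)
```

After training, samples whose expression profiles are similar land on the same or neighboring map units, which is how one can judge whether iPSC-derived cells cluster with reference microglia rather than with other cell types.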

“The actual results of our research were not surprising to us, but the overall importance of microglia in a wide variety of brain and spinal cord diseases was surprising. Microglia likely have a role in improving or worsening diseases such as multiple sclerosis, Alzheimer’s disease, Parkinson’s disease, obsessive compulsive disorder and Rett’s syndrome, just to name a few,” said John K. Park, MD, PhD, FAANS. “Microglia are the principal immune system cells of the brain and spinal cord, and help fight infections as well as help the healing process after injuries such as trauma and strokes. They also play a poorly understood role in many neurodegenerative and psychiatric diseases. We have developed methods to turn iPSCs into microglia. Because human iPSC can easily be obtained in large numbers, we can now generate large numbers of human microglia not only for use in experiments, but also potentially for use in treatments. The ability to study normal and diseased human microglia will lead to a greater understanding of their roles in healthy brains and various diseases. Diseases that are caused or exacerbated by defective microglia or a paucity of normal microglia may potentially be treated by microglia generated from a patient’s iPSC.”

Apr 30, 2013 · 40 notes
#induced pluripotent stem cells #microglia cells #nervous system #CNS #stem cells #neuroscience #science