Posts tagged neuroscience

A tangle of talents untangles neurons
Brown’s growing programs in brain science and engineering come together in the lab of Diane Hoffman-Kim. In a recent paper, her group employed techniques ranging from semiconductor-style circuit patterning to rat cell culture to optimize the growth of nerve cells for applications such as reconstructive surgery.
Two wrongs don’t make a right, they say, but here’s how one tangle can straighten out another.
Diane Hoffman-Kim, associate professor of medicine in the Department of Molecular Pharmacology, Physiology, and Biotechnology, is an affiliate of both Brown’s Center for Biomedical Engineering and the Brown Institute for Brain Science. Every thread of expertise woven through those multidisciplinary titles mattered in the Hoffman-Kim lab’s most recent paper, led by graduate student Cristina Lopez-Fagundo.
In research published online last month in Acta Biomaterialia, Hoffman-Kim and Lopez-Fagundo employed their neurophysiological knowledge and technological ingenuity to unravel a tangle of branching, tendrilous nerve cells, or neurons.
The scientist-engineers helped explain how neurons grow in new tissues in response to physical guideposts, called Schwann cells. Their paper also provided medical device makers with a clear demonstration of how best to craft artificial Schwann cell implants in silicone to make neurons grow as straight as possible in a desired direction.
“If you’ve got an injury in your arm or your leg then you’d like to have proper reconnection so you can get function,” Hoffman-Kim said. “If it’s a small injury, your body does that fairly well in natural ways that largely depend on the Schwann cells. If the injury gets even just a little bit large then the Schwann cells can’t do it alone.”
Silicone Schwanns
Hoffman-Kim and Lopez-Fagundo did not invent the idea of creating an implant to direct neural growth through repaired or reattached tissues. Their clinical goal is to make that technology the best it can be by systematically studying neural growth on Schwann-like substrates. As a matter of basic science, they wanted to learn how neural growth proceeds.
Lopez-Fagundo, whom Hoffman-Kim recruited to her lab in 2008 when she applied to Brown after graduating from the University of Puerto Rico, started the research with rigorous measurements of Schwann cells in cultures of rat neural tissue: the cells’ size, their elliptical shape, and the average distance between any two, as well as the length and width of the wispy extensions, or “processes,” that connect them.
“We were able to deconstruct the topography of Schwann cells,” said Lopez-Fagundo. “We were then able to manipulate it into different designs to better understand the influence this topography has.”
They came up with six archetypal designs. One mimicked the somewhat messy real-world layout of Schwann cells, but the other five were arranged in neat horizontal rows. In one, the elliptical Schwann cell bodies were few and far between; in another they were densely packed; in a third, their spacing matched the exact average of Lopez-Fagundo’s measurements. A fourth design had no “processes” to connect the ellipses, and the last had only processes but no ellipses.
Using Brown’s microfabrication facility, Lopez-Fagundo patterned their designs on silicon wafers (like those used to make computer chips) and then transferred them to silicone squares about a centimeter on a side so that the ellipses and processes were in raised relief on the silicone. Then they put each pattern in a cell culture of rat neurons and watched them as the neurons grew across each pattern of artificial Schwann cells. As a control for their experiment, they also cultured cells on unpatterned silicone squares.
All of the patterns encouraged some directed neuron growth compared to the random growth of neurons on the unpatterned squares, but clearly some patterns did better than others.
After 17 hours, the two best patterns were the one with only processes and the one with average ellipse spacing. The natural replica pattern and the one with only ellipses fared the worst.
But by day five, new winners emerged: the patterns in which the ellipses were spaced farther apart than average and closer together than average. Hoffman-Kim said she was surprised that the nerve cells weren’t content simply to follow the plain horizontal tracks formed by the process-only pattern. Meanwhile, the neurons grew in the proper direction, to some extent, even without a continuous track at all, for instance on the ellipse-only pattern.
Lopez-Fagundo puzzled over the question of why the ellipses, also called “soma,” matter even as the neurons clearly also grow along the processes.
“I asked myself that question a lot and it wasn’t until I sat at the computer and looked at the [time lapse] videos over and over,” Lopez-Fagundo said. “They use the soma as anchor points. They jump from soma to soma and use the long axis of the soma to guide themselves.”
It’s as if the neurons navigated most effectively when they had both roads (processes) and rest stops (ellipses or soma) where they could get their bearings.
And thus the neurons made their way along the artificially optimized straight and narrow. To the researchers, who also included co-authors Jennifer Mitchel, Talisha Ramchal, and Yu-Ting Dingle, the experiments were a triumph of how the meticulous analytical control afforded by engineering can demystify a complex biological phenomenon.
“Sometimes when I give lectures I say, ‘Biomedical engineers are control freaks and we consider that a compliment,’” Hoffman-Kim said.
A new study from investigators at the Benson-Henry Institute for Mind/Body Medicine at Massachusetts General Hospital and Beth Israel Deaconess Medical Center finds that eliciting the relaxation response—a physiologic state of deep rest induced by practices such as meditation, yoga, deep breathing and prayer—produces immediate changes in the expression of genes involved in immune function, energy metabolism and insulin secretion.

“Many studies have shown that mind/body interventions like the relaxation response can reduce stress and enhance wellness in healthy individuals and counteract the adverse clinical effects of stress in conditions like hypertension, anxiety, diabetes and aging,” said Herbert Benson, HMS professor of medicine at Mass General and co-senior author of the report.
Benson is director emeritus of the Benson-Henry Institute.
“Now for the first time we’ve identified the key physiological hubs through which these benefits might be induced,” he said.
Published in the open-access journal PLOS ONE, the study combined advanced expression profiling and systems biology analysis both to identify genes affected by relaxation response practice and to determine the potential biological relevance of those changes.
“Some of the biological pathways we identify as being regulated by relaxation response practice are already known to play specific roles in stress, inflammation and human disease. For others, the connections are still speculative, but this study is generating new hypotheses for further investigation,” said Towia Libermann, HMS associate professor of medicine at Beth Israel Deaconess and co-senior author of the study.
Benson first described the relaxation response—the physiologic opposite of the fight-or-flight response—almost 40 years ago, and his team has pioneered the application of mind/body techniques to a wide range of health problems. Studies in many peer-reviewed journals have documented how the relaxation response both alleviates symptoms of anxiety and many other disorders and also affects factors such as heart rate, blood pressure, oxygen consumption and brain activity.
In 2008, Benson and Libermann led a study finding that long-term practice of the relaxation response changed the expression of genes involved with the body’s response to stress. The current study examined changes produced during a single session of relaxation response practice, as well as those taking place over longer periods of time.
The study enrolled a group of 26 healthy adults with no experience in relaxation response practice, who then completed an 8-week relaxation-response training course.
Before they started their training, they went through what was essentially a control group session: Blood samples were taken before and immediately after the participants listened to a 20-minute health education CD and again 15 minutes later. After completing the training course, a similar set of blood tests was taken before and after participants listened to a 20-minute CD used to elicit the relaxation response as part of daily practice.
The sets of blood tests taken before the training program were designated “novice,” and those taken after training completion were called “short-term practitioners.” For further comparison, a similar set of blood samples was taken from a group of 25 individuals with 4 to 25 years’ experience of regularly eliciting the relaxation response through many different techniques, before and after they listened to the same relaxation response CD.
Blood samples from all participants were analyzed to determine the expression of more than 22,000 genes at the different time points.
The results revealed significant changes in the expression of several important groups of genes between the novice samples and those from both the short- and long-term sets. Even more pronounced changes were shown in the long-term practitioners.
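The kind of screen described above, in which more than 22,000 genes are tested for expression changes between sample sets, can be sketched in miniature. The Python sketch below uses entirely invented data, a standard paired t-test per gene, and Benjamini-Hochberg false-discovery-rate correction; it illustrates the general shape of such an analysis, not the study’s actual pipeline, and none of the numbers come from the paper.

```python
# Hypothetical sketch of a differential-expression screen: paired
# before/after samples, one test per gene, with FDR correction.
# All data are simulated; gene counts mirror only the scale reported.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_genes, n_subjects = 22000, 26          # scale reported in the study
before = rng.normal(8.0, 1.0, size=(n_genes, n_subjects))
after = before + rng.normal(0.0, 0.3, size=(n_genes, n_subjects))
after[:50] += 0.8                        # pretend 50 genes respond

# Paired t-test per gene (before vs. after a session)
t, p = stats.ttest_rel(after, before, axis=1)

# Benjamini-Hochberg step-up false-discovery-rate correction
order = np.argsort(p)
ranked = p[order] * n_genes / (np.arange(n_genes) + 1)
adjusted = np.minimum.accumulate(ranked[::-1])[::-1]
significant = np.zeros(n_genes, dtype=bool)
significant[order] = adjusted < 0.05
print(significant.sum(), "genes pass FDR < 0.05")
```

With a strong simulated effect, roughly the 50 planted genes survive correction; in real data, follow-up systems biology analysis (as in the study) would then look for pathway-level structure among the survivors.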
A systems biology analysis of known interactions among the proteins produced by the affected genes revealed that pathways involved with energy metabolism, particularly the function of mitochondria, were upregulated during the relaxation response. Pathways controlled by activation of a protein called NF-κB—known to have a prominent role in inflammation, stress, trauma and cancer—were suppressed after relaxation response elicitation. The expression of genes involved in insulin pathways was also significantly altered.
“The combination of genomics and systems biology in this study provided great insight into the key molecules and physiological gene interaction networks that might be involved in relaying beneficial effects of relaxation response in healthy subjects,” said Manoj Bhasin, HMS assistant professor of medicine, co-lead author of the study, and co-director of the Beth Israel Deaconess Genomics, Proteomics, Bioinformatics and Systems Biology Center.
Bhasin noted that these insights should provide a framework for determining, on a genomic basis, whether the relaxation response will help alleviate symptoms of diseases triggered by stress. The work could also lead to developing biomarkers that may suggest how individual patients will respond to interventions.
Benson stressed that the long-term practitioners in this study elicited the relaxation response through many different techniques—various forms of meditation, yoga or prayer—but those differences were not reflected in the gene expression patterns.
“People have been engaging in these practices for thousands of years, and our finding of this unity of function on a basic-science, genomic level gives greater credibility to what some have called ‘new age medicine,’ ” he said.
“While this and our previous studies focused on healthy participants, we currently are studying how the genomic changes induced by mind/body interventions affect pathways involved in hypertension, inflammatory bowel disease and irritable bowel syndrome. We have also started a study—a collaborative undertaking between Dana-Farber Cancer Institute, Mass General and Beth Israel Deaconess—in patients with precursor forms of multiple myeloma, a condition known to involve activation of NF-κB pathways,” said Libermann, who is the director of the Beth Israel Deaconess Medical Center Genomics, Proteomics, Bioinformatics and Systems Biology Center.
(Source: hms.harvard.edu)
Epilepsy that does not respond to drugs can be halted in adult mice by transplanting a specific type of cell into the brain, UC San Francisco researchers have discovered, raising hope that a similar treatment might work in severe forms of human epilepsy.
UCSF scientists controlled seizures in epileptic mice with a one-time transplantation of medial ganglionic eminence (MGE) cells, which inhibit signaling in overactive nerve circuits, into the hippocampus, a brain region associated with seizures, as well as with learning and memory. Other researchers had previously used different cell types in rodent cell transplantation experiments and failed to stop seizures.
Cell therapy has become an active focus of epilepsy research, in part because current medications, even when effective, only control symptoms and not underlying causes of the disease, according to Scott C. Baraban, PhD, who holds the William K. Bowes Jr. Endowed Chair in Neuroscience Research at UCSF and led the new study. In many types of epilepsy, he said, current drugs have no therapeutic value at all.
“Our results are an encouraging step toward using inhibitory neurons for cell transplantation in adults with severe forms of epilepsy,” Baraban said. “This procedure offers the possibility of controlling seizures and rescuing cognitive deficits in these patients.”
The findings, which are the first ever to report stopping seizures in mouse models of adult human epilepsy, will be published online May 5 in the journal Nature Neuroscience.
During epileptic seizures, extreme muscle contractions and, often, a loss of consciousness can cause seizure sufferers to lose control, fall and sometimes be seriously injured. The unseen malfunction behind these effects is the abnormal firing of many excitatory nerve cells in the brain at the same time.
In the UCSF study, the transplanted inhibitory cells quenched this synchronous, nerve-signaling firestorm, eliminating seizures in half of the treated mice and dramatically reducing the number of spontaneous seizures in the rest. Robert Hunt, PhD, a postdoctoral fellow in the Baraban lab, guided many of the key experiments.
In another encouraging step, UCSF researchers reported May 2 that they found a way to reliably generate human MGE-like cells in the laboratory, and that, when transplanted into healthy mice, the cells similarly spun off functional inhibitory nerve cells. That research can be found online in the journal Cell Stem Cell.
In many forms of epilepsy, loss or malfunction of inhibitory nerve cells within the hippocampus plays a critical role. MGE cells are progenitor cells that form early within the embryo and are capable of generating mature inhibitory nerve cells called interneurons. In the Baraban-led UCSF study, the transplanted MGE cells from mouse embryos migrated and generated interneurons, in effect replacing the cells that fail in epilepsy. The new cells integrated into existing neural circuits in the mice, the researchers found.
“These cells migrate widely and integrate into the adult brain as new inhibitory neurons,” Baraban said. “This is the first report in a mouse model of adult epilepsy in which mice that already were having seizures stopped having seizures after treatment.”
The mouse model of disease that Baraban’s lab team worked with is meant to resemble a severe and typically drug-resistant form of human epilepsy called mesial temporal lobe epilepsy, in which seizures are thought to arise in the hippocampus. In contrast to transplants into the hippocampus, transplants into the amygdala, a brain region involved in memory and emotion, failed to halt seizure activity in this same mouse model, the researchers found.
Temporal lobe epilepsy often develops in adolescence, in some cases long after a seizure episode triggered during early childhood by a high fever. A similar condition in mice can be induced with a chemical exposure, and in addition to seizures, this mouse model shares other pathological features with the human condition, such as loss of cells in the hippocampus, behavioral alterations and impaired problem solving.
In the Nature Neuroscience study, in addition to having fewer seizures, treated mice became less abnormally agitated, less hyperactive, and performed better in water-maze tests.
(Source: newswise.com)
New research from Bristol and Cardiff universities shows that children whose brains process information more slowly than their peers are at greater risk of psychotic experiences.

These can include hearing voices, seeing things that are not present or holding unrealistic beliefs that other people don’t share. These experiences can often be distressing and frightening and interfere with their everyday life.
Children with psychotic experiences are more likely to develop psychotic illnesses like schizophrenia later in life.
Using data gathered from 6,784 participants in Children of the 90s, researchers from the MRC Centre for Neuropsychiatric Genetics and Genomics in Cardiff University and the School of Social and Community Medicine in the University of Bristol examined whether performance in a number of cognitive tests conducted at ages 8, 10 and 11 was related to the risk of having psychotic experiences at age 12.
The tests assessed how quickly the children could process information, as well as their attention, memory, reasoning, and ability to solve problems.
Among those interviewed, 787 (11.6 per cent) had suspected or definite psychotic experiences at age 12. Children who scored less well in the various tests at ages 8, 10 and 11 were more likely to have psychotic experiences at age 12.
This was particularly the case for the test that assessed how quickly the children processed information. Furthermore, children whose speed of processing information became slower between ages 8 and 11 had greater risk of having psychotic experiences at age 12.
These findings did not change when other factors, including the parents’ psychiatric history and the children’s own developmental delay, were taken into account. The study’s findings could have important implications for identifying children at risk of psychosis, with the benefit of early treatment.
Speaking about the findings, lead author and PhD student, Miss Maria Niarchou from Cardiff University’s School of Medicine, said:
‘Previous research has shown a link between the slowing down of information processing and schizophrenia, and this was found to be at least in part the result of anti-psychotic medication.
‘However, this study shows that impaired information processing speed can already be present in childhood and be associated with a higher risk of psychotic experiences, irrespective of medication.
‘Our findings improve our understanding of the brain processes that are associated with high risk of psychotic experiences in childhood and in turn high risk of psychotic disorder later in life.’
Senior author, Dr Marianne van den Bree of Cardiff University’s School of Medicine, said:
‘Schizophrenia is a complex and relatively rare mental health condition, occurring at a rate of 1 per cent in the general population. Not every child with impaired information processing speed is at risk of psychosis later in life. Further research is needed to determine whether interventions to improve processing speed in at-risk children can lead to decreased transition to psychotic disorders.’
Ruth Coombs, Manager for Influence and Change at Mind Cymru, said:
‘This is a very interesting piece of research, which could help young people at risk of developing mental health problems in later life build resilience and benefit from early intervention. It is important to remember that people can and do recover from mental health problems and we also welcome further research which supports resilience building in young people.’
(Source: bristol.ac.uk)
“I’ve been in a crowded elevator with mirrors all around, and a woman will move and I’ll go to get out the way and then realise: ‘oh that woman is me’.”
Heather Sellers has prosopagnosia, more commonly known as face blindness. “I can’t remember any image of the human face. It’s simply not special to me,” she says. “I don’t process them like I do a car or a dog. It’s not a visual problem, it’s a perception problem.”

Heather knew from a young age that something was different about the way she navigated her world, but her condition wasn’t diagnosed until she was in her 30s. “I always knew something was wrong – it was impossible for me to trust my perceptions of the world. I was diagnosed as anxious. My parents thought I was crazy.”
The condition is estimated to affect around 2.5 per cent of the population, and it’s common for those who have it not to realise that anything is wrong. “In many ways it’s a subtle disorder,” says Heather. “It’s easy for your brain to compensate because there are so many other things you can use to identify a person: hair colour, gait or certain clothes. But meet that person out of context and it’s socially devastating.”
As a child, she was once separated from her mum at a grocery store. Store staff reunited the pair, but it was confusing for Heather, since she didn’t initially recognise her mother. “But I didn’t know that I wasn’t recognising her.”
Chaos explained
Heather was 36 when she stumbled across the phrase face blindness in a psychology textbook. “When I saw those two words I knew instantly that was exactly what I had – that explained all the chaos.”
She found her way to Harvard neuroscientist Brad Duchaine who diagnosed her as having one of the three worst cases of the disorder that he had ever seen.
So what’s it like to not recognise anyone you know? Heather says the biggest difficulty with the disorder is recognising people who she is close to – the people that are most important to recognise. In the school where she teaches English she is fine, because she recognises people by their clothes or hair and asks her students to wear name badges.
But it can be harder in social settings. Once she went up to the wrong person at a party and put her arm around him thinking he was her partner. And at college men would phone her angry that she had walked straight past them after they had had a date. “At the time I was thinking ‘I didn’t see you, why is everyone making my life so difficult?’”
It’s not just other people Heather doesn’t recognise – she can’t identify her own face either. “A few times I have been in a crowded elevator with mirrors all around and a woman will move, and I will go to get out the way and then realise ‘oh that woman is me’.” She also finds it unsettling to see photos and not recognise herself in them.
Face processing
To try and understand the condition, Duchaine and his colleagues recorded brain activity while 12 people with prosopagnosia looked at famous and non-famous faces. The team found that part of the brain responsible for stored visual memory was activated in six people when they saw the famous faces.
But another component of brain activity thought to represent a later stage of face processing wasn’t triggered. “Some part of their brain was recognising the face,” says Duchaine, but the brain was failing to pass this information into higher-level consciousness (Brain).
“There may be training where we give people feedback and say ‘look, you recognise that face even though you’re not aware of it’,” says Duchaine.
Now Zaira Cattaneo at the University of Milano-Bicocca in Italy and colleagues have identified the specific brain areas that allow us to recognise our friends. The team used transcranial magnetic stimulation to block two vital aspects of face processing in people without prosopagnosia. Targeting the left prefrontal cortex blocked the ability to distinguish individual features like the nose and eyes, and blocking the right prefrontal cortex impaired the ability to distinguish the location of those features from one another (NeuroImage).
“We made performance worse,” says Cattaneo. “We want to make it better.” Now the team are trying to activate these areas of the brain. “The aim is to enhance face recognition abilities by directly modulating excitability in the prefrontal cortices,” says Cattaneo.
Would Heather want a cure, should one be found? “I can’t imagine what you see when you see a face, and it’s scary,” she says. “I go back and forth on what I’d do. I’ve done so much work in figuring out how to chart my world, I’d need to do a whole new rewrite. But it would be fascinating.”
A little brain training goes a long way
People who use a ‘brain-workout’ program for just 10 hours have a mental edge over their peers even a year later, researchers report today in PLoS ONE.
The search for a regimen of mental callisthenics to stave off age-related cognitive decline is a booming area of research — and a multimillion-dollar business. But critics argue that even though such computer programs can improve performance on specific mental tasks, there is scant proof that they have broader cognitive benefits.
For the study, adults aged 50 and older played a computer game designed to boost the speed at which players process visual stimuli. Processing speed is thought to be “the first domino that falls in cognitive decline”, says Fredric Wolinsky, a public-health researcher at the University of Iowa in Iowa City, who led the research.
The game was developed by academic researchers but is now sold under the name Double Decision by Posit Science, based in San Francisco, California. (Posit did not fund the study.) Players are timed on how fast they click on an image in the centre of the screen and on others that appear around the periphery. The program ratchets up the difficulty as a player’s performance improves.
Participants played the training game for 10 hours on site, some with an extra 4-hour ‘booster’ session later, or for 10 hours at home. A control group worked on computerized crossword puzzles for 10 hours on site. Researchers measured the mental agility of all 621 subjects before the brain training began, and again one year later, using eight well-established tests of cognitive performance.
The control group’s scores did not increase over the course of that year, but all the brain-training groups significantly improved their scores in the Useful Field of View test, which requires a subject to identify items in a scene at just a quick glance, and in four others. When the researchers compared participants’ scores with those expected for people of their age, they found improvements that translated to 3 to 4.1 years of protection against age-related decline on the field-of-view test and 1.5 to 6.6 years on the other tasks.
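The “years of protection” framing above amounts to a simple conversion: divide a group’s score gain on a test by the expected annual age-related decline on that same test. The sketch below shows that arithmetic; all numbers in it are invented placeholders, not values from the study.

```python
# Back-of-the-envelope sketch of converting a test-score gain into
# "years of protection" against age-related decline. Illustrative only.
def years_of_protection(score_gain: float, annual_decline: float) -> float:
    """Express a score gain as years of typical decline offset."""
    if annual_decline <= 0:
        raise ValueError("annual decline rate must be positive")
    return score_gain / annual_decline

# e.g. a 6-point gain on a test that typically drops ~1.5 points/year
print(round(years_of_protection(6.0, 1.5), 1))  # -> 4.0
```

The real analysis uses normed decline rates per test, which is why the reported ranges (3 to 4.1 years, 1.5 to 6.6 years) differ across tasks.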
“It was interesting that it didn’t matter whether you were on site at the clinic or just did this at home — you got basically the same bang for your buck,” says Frederick Unverzagt, a neuropsychologist at the Indiana University School of Medicine in Indianapolis, who was not involved with the study.
But Peter Snyder, a neuropsychologist at Brown University in Providence, Rhode Island, points out that players’ performance could have improved simply because they were familiar with the game — not because their cognitive skills improved. “To me, that makes it hard to interpret the results with the same degree of certainty” that the authors have, he says.
Snyder also doubts that 10 hours of training could affect brain wiring enough to provide long-lasting general benefits, but Henry Mahncke, chief executive of Posit Science, disagrees. “If you’ve never played piano before and spend 10 hours practising, a year later you will be better than when you started,” he says. “The new study shows that there’s science to be done here. Some things you can do with your brain are highly productive and others are not.”
If a mosquito approaches a human ear or a bee heads for the next flower, two things are important: the insects must be able to locate their destination and correct course deviations, caused by a gust of wind for example. How does the brain process these different situations so that both behaviours are possible? Scientists at the Max Planck Institute of Neurobiology in Martinsried have demonstrated in behavioural experiments that both behaviours are controlled by separate circuits in the brain of the fruit fly (Drosophila). One of these neural networks processes motion information in the surrounding environment and helps the fly to stabilise its course. The other is responsible for determining the position of an object and is used for object fixation.
If a drum with vertical stripes rotates around an insect, the animal will rotate in the same direction as the stripes. This innate behaviour is known as the optomotor reaction. The experiment replicates a natural phenomenon: if, for example, a gust of wind pushes a flying fly to the right, the surroundings appear, from the fly’s perspective, to move to the left past its eyes. The optomotor reaction consequently compensates for the gust of wind and brings the fly back on course. Scientists have long suspected that the nerve cells controlling this behaviour are located in the lobula plate of the fly’s brain. Up until now, however, it was not clear whether these cells are necessary to control the observed behaviour.
Alexander Borst and his department at the Max Planck Institute of Neurobiology are investigating how motion information is processed in the brain of the fly. To find out whether the lobula plate plays a role in the optomotor reaction, the neurobiologists developed a behavioural testing apparatus: in a virtual environment, they presented flies with a rotating striped pattern to which the flies displayed a clear optomotor reaction. However, when the scientists blocked the nerve cells from which the lobula plate receives its information, the behaviour disappeared completely. The flies were thus motion-blind. The experiments show that the lobula plate is a necessary element in stabilising the course of the fly.
In nature, however, flies must also be able to process information about things other than motion. Was this still possible? The neurobiologists next concentrated on another well-documented insect behaviour: object fixation. If a single vertical stripe is displayed during the experiment, flies will turn towards the stripe and try to keep it in front of them. This object fixation enables the animals to approach an object or to “keep an eye” on it. In the experiment, the scientists allowed a vertical stripe to appear slowly at different locations in the flies’ field of vision and then disappear again. If the stripe appeared on the right side of the fly, the animals turned to the right; if it appeared on the left, they turned to the left. If the motion perception system controlled this behaviour, then motion-blind animals should no longer be able to locate the stripes. Interestingly, motion-blind flies and control flies responded in exactly the same way.
The scientists concluded from these experiments that an independent position perception system must co-exist with the motion perception system. If a small object moves in space, local changes in brightness occur. These are recorded by the position perception system. Motion-blind flies can therefore still recognise the position of an object even if they can no longer see it moving.
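The two-pathway conclusion can be illustrated with a toy model: a turning command driven by the sum of a motion (lobula-plate) term and a position term, where blocking the motion pathway zeroes only the first. The linear form, the gains, and the function name are invented for illustration; this is not the authors’ model.

```python
# Toy model of two parallel visual pathways in the fly: a wide-field
# motion (optomotor) pathway and an object-position (fixation) pathway.
# All gains and signal conventions are invented placeholders.
def turning_response(pattern_velocity, object_azimuth,
                     motion_gain=1.0, position_gain=0.5,
                     lobula_plate_blocked=False):
    """Turn command: follow wide-field motion unless the motion pathway
    is blocked; always turn toward a bar's azimuth (positive = right)."""
    motion_term = 0.0 if lobula_plate_blocked else motion_gain * pattern_velocity
    position_term = position_gain * object_azimuth
    return motion_term + position_term

# Rotating stripe drum, no bar: a normal fly follows, a blocked fly does not
print(turning_response(10.0, 0.0))                             # 10.0
print(turning_response(10.0, 0.0, lobula_plate_blocked=True))  # 0.0
# Single bar at +30 degrees, no drum: both flies turn toward it
print(turning_response(0.0, 30.0))                             # 15.0
print(turning_response(0.0, 30.0, lobula_plate_blocked=True))  # 15.0
```

The last two lines mirror the experimental result: "motion-blind" flies fixate the bar just as well as controls, because fixation is carried by the separate position pathway.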
“It was a very complicated process to set up the experiment in a way that solid results could be obtained,” explains Armin Bahl, the lead author of the study. It was previously assumed that cells in the lobula plate are responsible for motion perception, as well as for object fixation. The scientists have now refuted this assumption and already described important properties of the fixation behaviour. “We do not yet know exactly where the cells of the position perception system are located in the fly’s brain, but we have a few good candidates,” says Armin Bahl, indicating the direction that the research will now take.
Unusual comparison nets new sleep loss marker
For years, Paul Shaw, PhD, a researcher at Washington University School of Medicine in St. Louis, has used what he learns in fruit flies to look for markers of sleep loss in humans.
Shaw reverses the process in a new paper, taking what he finds in humans back to the flies and gaining new insight into humans as a result: identification of a human gene that is more active after sleep deprivation.
“I’m calling the approach cross-translational research,” says Shaw, associate professor of neurobiology. “Normally we go from model to human, but there’s no reason why we can’t take our studies from human to model and back again.”
Shaw and his colleagues plan to use the information they are gaining to create a panel of tests for sleep loss. The tests may one day help assess a person’s risk of falling asleep at the wheel of a car or in other dangerous contexts.
PLOS ONE published the results on April 24.
Scientists have known for years that sleep disorders and disruption raise blood serum levels of interleukin 6, an inflammatory immune compound. Shaw showed that this change is also detectable in saliva samples from sleep-deprived rats and humans.
Based on this link, Shaw tested the activity of other immune proteins in humans to see if any changed after sleep loss. The scientists took saliva samples from research participants after they had a normal night’s sleep and after they stayed awake for 30 hours. They found two immune genes whose activity levels rose during sleep deprivation.
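The screening step described above, comparing gene activity after a normal night's sleep with activity after 30 hours awake and flagging genes that rise, can be illustrated with a simple fold-change calculation. The gene names and expression values below are invented for illustration; only IL6's involvement comes from the article.

```python
# Hypothetical illustration (numbers invented): flag genes whose
# activity rises after sleep deprivation by comparing expression
# levels in rested vs. sleep-deprived saliva samples.

rested = {"IL6": 1.0, "GENE_A": 0.8, "GENE_B": 1.2}    # baseline activity
deprived = {"IL6": 2.1, "GENE_A": 1.9, "GENE_B": 1.1}  # after 30 h awake

def upregulated(rested, deprived, threshold=1.5):
    """Return genes whose activity rose at least `threshold`-fold."""
    return sorted(
        gene for gene in rested
        if deprived[gene] / rested[gene] >= threshold
    )

print(upregulated(rested, deprived))  # candidate sleep-loss markers
```

With these made-up numbers, IL6 and GENE_A clear the 1.5-fold threshold while GENE_B does not, mirroring how a screen separates candidate markers from unresponsive genes.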
“Normally we would do additional human experiments to verify these links,” Shaw says. “But those studies can be quite expensive, so we thought we’d test the connections in flies first.”
The researchers identified genes in the fruit fly that were equivalent to the human genes, but their activity didn’t increase when flies lost sleep. When they screened other, similar fruit fly genes, though, the scientists found one that did.
“We’ve seen this kind of switch happen before as we compared families of fly genes and families of human genes,” Shaw says. “Sometimes the gene performing a particular role will change, but the task will still be handled by a gene in the same family.”
When the scientists looked for the human version of the newly identified fly marker for sleep deprivation, they found ITGA5 and realized it hadn’t been among the human immune genes they screened at the start of the study. Testing ITGA5 activity in the saliva samples revealed that its activity levels increased during sleep deprivation.
“We will need more time to figure out how useful this particular marker will be for detecting sleep deprivation in humans,” Shaw says. “In the meantime, we’re going to continue jumping between our flies and humans to maximize our insights.”
A key type of human brain cell developed in the laboratory grows seamlessly when transplanted into the brains of mice, UC San Francisco researchers have discovered, raising hope that these cells might one day be used to treat people with Parkinson’s disease, epilepsy, and possibly even Alzheimer’s disease, as well as complications of spinal cord injury such as chronic pain and spasticity.

“We think this one type of cell may be useful in treating several types of neurodevelopmental and neurodegenerative disorders in a targeted way,” said Arnold Kriegstein, MD, PhD, director of the Eli and Edythe Broad Center of Regeneration Medicine and Stem Cell Research at UCSF and co-lead author on the paper.
The researchers generated and transplanted a type of human nerve-cell progenitor called the medial ganglionic eminence (MGE) cell, in experiments described in the May 2 edition of Cell Stem Cell. Development of these human MGE cells within the mouse brain mimics what occurs in human development, they said.
Kriegstein sees MGE cells as a potential treatment to better control nerve circuits that become overactive in certain neurological disorders. Unlike other neural stem cells that can form many cell types — and that may potentially be less controllable as a consequence — most MGE cells are restricted to producing a type of cell called an interneuron. Interneurons integrate into the brain and provide controlled inhibition to balance the activity of nerve circuits.
To generate MGE cells in the lab, the researchers reliably directed the differentiation of human pluripotent stem cells — either human embryonic stem cells or induced pluripotent stem cells derived from human skin. These two kinds of stem cells have virtually unlimited potential to become any human cell type. When transplanted into a strain of mice that does not reject human tissue, the human MGE-like cells survived within the rodent forebrain, integrated into the brain by forming connections with rodent nerve cells, and matured into specialized subtypes of interneurons.
These findings may serve as a model to study human diseases in which mature interneurons malfunction, according to Kriegstein. The researchers’ methods may also be used to generate vast numbers of human MGE cells in quantities sufficient to launch potential future clinical trials, he said.
Kriegstein was a co-leader of the research, along with Arturo Alvarez-Buylla, PhD, UCSF professor of neurological surgery; John Rubenstein, MD, PhD, UCSF professor of psychiatry; and UCSF postdoctoral scholars Cory Nicholas, PhD, and Jiadong Chen, PhD.
Nicholas utilized key growth factors and other molecules to direct the derivation and maturation of the human MGE-like interneurons. He timed the delivery of these factors to shape their developmental path and confirmed their progression along this path. Chen used electrical measurements to carefully study the physiological and firing properties of the interneurons, as well as the formation of synapses between neurons.
Previously, UCSF researchers led by Allan Basbaum, PhD, chair of anatomy at UCSF, have used mouse MGE cell transplantation into the mouse spinal cord to reduce neuropathic pain, a surprising application outside the brain. Kriegstein, Nicholas and colleagues now are exploring the use of human MGE cells in mouse models of neuropathic pain and spasticity, Parkinson’s disease and epilepsy.
“The hope is that we can deliver these cells to various places within the nervous system that have been overactive and that they will functionally integrate and provide regulated inhibition,” Nicholas said.
The researchers also plan to develop MGE cells from induced pluripotent stem cells derived from skin cells of individuals with autism, epilepsy, schizophrenia and Alzheimer’s disease, in order to investigate how the development and function of interneurons might become abnormal — creating a lab-dish model of disease.
One mystery and challenge to both the clinical and pre-clinical study of human MGE cells is that they develop at a slower, human pace, reflecting an “intrinsic clock”. In fast-developing mice, the human MGE-like cells still took seven to nine months to form interneuron subtypes that normally are present near birth.
“If we could accelerate the clock in human cells, then that would be very encouraging for various applications,” Kriegstein said.
(Source: newswise.com)
Scientists at Princeton University used off-the-shelf printing tools to create a functional ear that can “hear” radio frequencies far beyond the range of normal human capability.

The researchers’ primary purpose was to explore an efficient and versatile means to merge electronics with tissue. The scientists used 3D printing of cells and nanoparticles followed by cell culture to combine a small coil antenna with cartilage, creating what they term a bionic ear.
"In general, there are mechanical and thermal challenges with interfacing electronic materials with biological materials," said Michael McAlpine, an assistant professor of mechanical and aerospace engineering at Princeton and the lead researcher. "Previously, researchers have suggested some strategies to tailor the electronics so that this merger is less awkward. That typically happens between a 2D sheet of electronics and a surface of the tissue. However, our work suggests a new approach — to build and grow the biology up with the electronics synergistically and in a 3D interwoven format."
McAlpine’s team has made several advances in recent years involving the use of small-scale medical sensors and antennas. Last year, a research effort led by McAlpine and Naveen Verma, an assistant professor of electrical engineering, and Fio Omenetto of Tufts University, resulted in the development of a “tattoo” made up of a biological sensor and antenna that can be affixed to the surface of a tooth.
This project, however, is the team’s first effort to create a fully functional organ: one that not only replicates a human ability but extends it using embedded electronics.
"The design and implementation of bionic organs and devices that enhance human capabilities, known as cybernetics, has been an area of increasing scientific interest," the researchers wrote in the article, which appears in the scholarly journal Nano Letters. “This field has the potential to generate customized replacement parts for the human body, or even create organs containing capabilities beyond what human biology ordinarily provides.”
Standard tissue engineering involves seeding types of cells, such as those that form ear cartilage, onto a scaffold of a polymer material called a hydrogel. However, the researchers said that this technique has problems replicating complicated three-dimensional biological structures. Ear reconstruction “remains one of the most difficult problems in the field of plastic and reconstructive surgery,” they wrote.
To solve the problem, the team turned to a manufacturing approach called 3D printing. These printers use computer-assisted design to conceive of objects as arrays of thin slices. The printer then deposits layers of a variety of materials – ranging from plastic to cells – to build up a finished product. Proponents say additive manufacturing promises to revolutionize home industries by allowing small teams or individuals to create work that could previously only be done by factories.
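The slicing idea described above, treating a solid as a stack of thin layers that are deposited one at a time, can be sketched with a toy example. The sphere, radius, and layer height here are arbitrary choices for illustration, not parameters from the Princeton work.

```python
# Simplified sketch of the slicing step in 3D printing: a solid is
# decomposed into thin horizontal layers, each printed in turn.
# Here a sphere of radius r is sliced into circular cross-sections.

import math

def slice_sphere(radius, layer_height):
    """Return the cross-section radius of each printed layer."""
    layers = []
    z = -radius
    while z <= radius:
        # Circle radius at height z, from r^2 = x^2 + z^2.
        layers.append(math.sqrt(max(radius**2 - z**2, 0.0)))
        z += layer_height
    return layers

radii = slice_sphere(10.0, 2.5)
print(len(radii), [round(r, 2) for r in radii])
```

A real slicer works from a CAD mesh rather than an analytic shape, but the principle is the same: each layer is a 2D outline the print head fills in, whether with plastic, hydrogel, or nanoparticle ink.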
Creating organs using 3D printers is a recent advance; several groups have reported using the technology for this purpose in the past few months. But this is the first time that researchers have demonstrated that 3D printing is a convenient strategy to interweave tissue with electronics.
The technique allowed the researchers to combine the antenna electronics with tissue within the highly complex topology of a human ear. The researchers used an ordinary 3D printer to combine a matrix of hydrogel and calf cells with silver nanoparticles that form an antenna. The calf cells later develop into cartilage.
Manu Mannoor, a graduate student in McAlpine’s lab and the paper’s lead author, said that additive manufacturing opens new ways to think about the integration of electronics with biological tissue and makes possible the creation of true bionic organs in form and function. He said that it may be possible to integrate sensors into a variety of biological tissues, for example, to monitor stress on a patient’s knee meniscus.
David Gracias, an associate professor at Johns Hopkins and co-author on the publication, said that bridging the divide between biology and electronics represents a formidable challenge that needs to be overcome to enable the creation of smart prostheses and implants.
"Biological structures are soft and squishy, composed mostly of water and organic molecules, while conventional electronic devices are hard and dry, composed mainly of metals, semiconductors and inorganic dielectrics," he said. "The differences in physical and chemical properties between these two material classes could not be any more pronounced."
The finished ear consists of a coiled antenna inside a cartilage structure. Two wires lead from the base of the ear and wind around a helical “cochlea” – the part of the ear that senses sound – which can connect to electrodes. Although McAlpine cautions that further work and extensive testing would need to be done before the technology could be used on a patient, he said the ear in principle could be used to restore or enhance human hearing. He said electrical signals produced by the ear could be connected to a patient’s nerve endings, similar to a hearing aid. The current system receives radio waves, but he said the research team plans to incorporate other materials, such as pressure-sensitive electronic sensors, to enable the ear to register acoustic sounds.
In addition to McAlpine, Verma, Mannoor, and Gracias, the research team includes: Winston Soboyejo, a professor of mechanical and aerospace engineering at Princeton; Karen Malatesta, a faculty fellow in molecular biology at Princeton; Yong Lin Kong, a graduate student in mechanical and aerospace engineering at Princeton; and Teena James, a graduate student in chemical and biomolecular engineering at Johns Hopkins.
The team also included Ziwen Jiang, a high school student at the Peddie School in Hightstown who participated as part of an outreach program for young researchers in McAlpine’s lab.
"Ziwen Jiang is one of the most spectacular high school students I have ever seen," McAlpine said. "We would not have been able to complete this project without him, particularly in his skill at mastering CAD designs of the bionic ears."
(Source: eurekalert.org)