Posts tagged science

![Early cerebellum injury hinders neural development](http://40.media.tumblr.com/af3e898055f15645d00eb91715335762/tumblr_nbbmmhzo6S1rog5d1o1_400.jpg)
Early cerebellum injury hinders neural development, possible root of autism, theory suggests
A brain region largely known for coordinating motor control has a largely overlooked role in childhood development that could reveal information crucial to understanding the onset of autism, according to Princeton University researchers.
The cerebellum — an area located in the lower rear of the brain — is known to process external and internal information such as sensory cues that influence the development of other brain regions, the researchers report in the journal Neuron. Based on a review of existing research, the researchers offer a new theory that an injury to the cerebellum during early life potentially disrupts this process and leads to what they call “developmental diaschisis,” in which a loss of function in one part of the brain leads to problems in another region.
The researchers specifically apply their theory to autism, though they note that it could help understand other childhood neurological conditions. Conditions within the autism spectrum present “longstanding puzzles” related to cognitive and behavioral disruptions that their ideas could help resolve, they wrote. Under their theory, cerebellar injury causes disruptions in how other areas of the brain develop an ability to interpret external stimuli and organize internal processes, explained first author Sam Wang, an associate professor of molecular biology and the Princeton Neuroscience Institute (PNI).
"It is well known that the cerebellum is an information processor. Our neocortex [the largest part of the brain, responsible for much higher processing] does not receive information unfiltered. There are critical steps that have to happen between when external information is detected by our brain and when it reaches the neural cortex," said Wang, who worked with doctoral student Alexander Kloth and postdoctoral research associate Aleksandra Badura, both in PNI.
"At some point, you learn that smiling is nice because Mom smiles at you. We have all these associations we make in early life because we don’t arrive knowing that a smile is nice," Wang said. "In autism, something in that process goes wrong and one thing could be that sensory information is not processed correctly in the cerebellum."
Mustafa Sahin, a neurologist at Boston Children’s Hospital and associate professor of neurology at Harvard Medical School, said that Wang and his co-authors build upon known links between cerebellar damage and autism to suggest that the cerebellum is essential to healthy neural development. Numerous studies — including from his own lab — support their theory, said Sahin, who is familiar with the work but was not involved in it.
"The association between cerebellar deficits and autism has been around for a while," Sahin said. "What Sam Wang and colleagues do in this perspective article is to synthesize these two themes and hypothesize that in a critical period of development, cerebellar dysfunction may disrupt the maturation of distant neocortical circuits, leading to cognitive and behavioral symptoms including autism."
Traditionally, the cerebellum has been studied in relation to motor movement and coordination in adults. Recent studies, however, strongly suggest that it also influences childhood cognition, Wang said. Several studies also have found a correlation between cerebellar injury and the development of a disorder in the autism spectrum, the researchers report. For instance, the researchers cite a 2007 paper in the journal Pediatrics that found that individuals who experienced cerebellum damage at birth were 40 times more likely to score highly on autism screening tests. They also reference studies in 2004 and 2005 that found that the cerebellum is the most frequently disrupted brain region in people with autism.
"What we realized from looking at the literature is that these two problems — autism and cerebellar injury — might be related to each other" via the cerebellum’s influence on wider neural development, Wang said. "We hope to get people and scientists thinking differently about the cerebellum or about autism so that the whole field can move forward."
The researchers conclude by suggesting methods for testing their theory. First, by inactivating brain-cell electrical activity, it should be possible to pinpoint the developmental stage in which injury to one part of the brain affects the maturation of another. A second, more advanced method is to reconstruct the neural connections between the cerebellum and other brain regions; the federal BRAIN Initiative announced in 2013 aims to map the activity of all the brain’s neurons. Finally, mouse brains can be used to disable and restore brain-region function to observe the “upstream” effect in other areas.
(Image caption: A consensus shape for the calcium ion channel in the worm’s pain receptor nerve that was reached by computer modeling. Credit: Damian van Rossum and Andriy Anishkin, Penn State University)
Surprising New Role for Calcium in Sensing Pain
When you accidentally touch a hot oven, you rapidly pull your hand away. Although scientists know the basic neural circuits involved in sensing and responding to such painful stimuli, they are still sorting out the molecular players.
Duke researchers have made a surprising discovery about the role of a key molecule involved in pain in worms, and have built a structural model of the molecule. These discoveries, described Sept. 2 in Nature Communications, may help direct new strategies to treat pain in people.
In humans and other mammals, a family of molecules called TRP ion channels plays a crucial role in nerve cells that directly sense painful stimuli. Researchers are now blocking these channels in clinical trials to evaluate this as a possible treatment for various types of pain.
The roundworm Caenorhabditis elegans also expresses TRP channels — one of which is called OSM-9 — in its single head pain-sensing neuron (which is similar to the pain-sensing nerve cells for the human face). OSM-9 is not only vital for detecting danger signals in the tiny worms, but is also a functional match to TRPV4, a mammalian TRP channel involved in sensing pain.
In the new study, researchers created a series of genetic mutant worms in which parts of the OSM-9 channel were disabled or replaced, and then tested the engineered worms’ reactions to an overly salty solution, which is normally aversive and painful.
Specifically, the mutant worms had alterations in the pore of the OSM-9 channels in their pain-sensing neuron, which gets fired up upon channel activation to allow calcium and sodium to flow into the neuron. That, in turn, was thought to switch on the neural circuit that encodes rapid withdrawal behavior — like pulling the finger from the stove.
“People strongly believed that calcium entering the cell through the TRP channel is everything in terms of cellular activation,” said lead author Wolfgang Liedtke, M.D., Ph.D., an associate professor of neurology, anesthesiology and neurobiology at Duke University School of Medicine and an attending physician in the Duke Pain Clinics, where he sees patients with chronic head-neck and face-pain.
With then-graduate student Amanda Lindy, “we wanted to systematically mutagenize the OSM-9 pore and see what we could find in the live animal, in its pain behavior,” Liedtke said.
To the group’s surprise, changing various bits of OSM-9’s pore did not change most of the mutant worms’ reactions to the salty solution. However, these mutations did affect the flow of calcium into the cell. The disconnect they saw suggested the calcium was not playing a direct role in the worms’ avoidance of danger signals.
Calcium has been thought to be indispensable for pain behavior — not only in worms’ channels but in pain-related TRP channels in mammals. So results from the engineered OSM-9 mutant worms will change a central concept for the understanding of pain, Liedtke said.
To see whether calcium might instead play a role in the worms’ ability to adapt to repeated painful stimuli, the group then repeatedly exposed pore-mutant worms to the aversive, painful salt stimulus.
After the tenth trial, a normal worm becomes less sensitive to high salt. But one mutant worm with a minimal change to one specific part of its OSM-9 pore — altered so that calcium no longer entered but sodium did — was just as sensitive on the tenth trial as on the first.
The results confirmed that calcium flow through the channel makes the worms more adaptable to painful stimuli; it helps them cope with the onslaught by desensitizing them. This could well represent a survival advantage, Liedtke said.
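The desensitization result can be pictured with a toy model (the decay rate and its form are illustrative assumptions for this sketch, not the study’s data): calcium entry through OSM-9 dampens the response a little with each exposure, while a calcium-blind pore mutant keeps responding at full strength.

```python
# Toy sketch of OSM-9-mediated desensitization (illustrative only):
# calcium entry weakens the response on each trial; without calcium
# entry the response never adapts.
def responses(n_trials, calcium_enters, decay=0.8):
    """Return the response magnitude on each of n_trials exposures."""
    r, out = 1.0, []
    for _ in range(n_trials):
        out.append(round(r, 3))
        if calcium_enters:
            r *= decay  # each exposure desensitizes the neuron a bit

    return out

print(responses(10, calcium_enters=True)[-1])   # weak response by trial 10
print(responses(10, calcium_enters=False)[-1])  # still at full strength: 1.0
```

With `calcium_enters=True` the tenth-trial response has decayed to roughly 13% of the first; with it off, the mutant-like model stays as sensitive on trial ten as on trial one, mirroring the behavioral observation.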
To put the findings into a structural context, Liedtke collaborated with computational protein scientists Damian van Rossum and Andriy Anishkin from Penn State University, who built a structural model of OSM-9 that was based on established structures of several of the channel’s relatives, including the recently resolved structure of TRPV1, the molecule that senses pain caused by heat and hot chili peppers.
The team was then able to visualize the key parts of the OSM-9 pore in the context of the entire channel. They understood better how the pore holds its shape and allows sodium and calcium to pass.
Liedtke said that understanding this structure could be a great help in designing compounds that will not completely block the channel but will just prevent calcium from entering the cell. Although calcium helps desensitize worms to painful stimuli in the near term, it might set up chronic, pathological pain circuits in the long term, Liedtke said.
So, as a next step, the group plans to assess the longer-term effects calcium flow has in pain neurons. For example, calcium could change the expression of particular genes in the sensory neuron. And such gene expression changes could underlie chronic, pathologic pain.
“We assume, and so far the evidence is quite good, that chronic, pathological pain has to do with people’s genetic switches in their sensory system set in the wrong way, long term. That’s something our new worm model will now allow us to approach rationally by experimentation,” Liedtke said.
Research hints at why stress is more devastating for some
Some people take stress in stride; others are done in by it. New research at Rockefeller University has identified the molecular mechanisms of this so-called stress gap in mice with very similar genetic backgrounds — a finding that could lead researchers to better understand the development of psychiatric disorders such as anxiety and depression.
“Like people, each animal has unique experiences as it goes through its life. And we suspect that these life experiences can alter the expression of genes, and as a result, affect an animal’s susceptibility to stress,” says senior author Bruce McEwen, Alfred E. Mirsky Professor and head of the Harold and Margaret Milliken Hatch Laboratory of Neuroendocrinology. “We have taken an important step toward explaining the molecular origins of this stress gap by showing that inbred mice react differently to stress, with some developing behaviors that resemble anxiety and depression, and others remaining resilient.”
The results, published September 2 in Molecular Psychiatry, point toward potential new markers to aid the diagnosis of stress-related disorders, such as anxiety and depression, and a promising route to the development of new treatments for these devastating disorders.
In experiments, researchers stressed the mice by exposing them to daily, unpredictable bouts of cage tilting, altered dark-light cycles, confinement in tight spaces and other conditions mice dislike, with the goal of reproducing the sort of stressful experiences thought to be a primary cause of depression in humans. Afterward, in tests to see whether the mice displayed the rodent equivalent of anxiety and depression symptoms, the researchers found that about 40 percent showed high levels of behaviors such as a preference for a dark compartment over a brightly lit one, or a loss of interest in sugar water. The remaining 60 percent recovered well from the stress. This distinction between the susceptible mice and the resilient ones was so fundamental that it emerged even before the mice were subjected to stress: some unstressed mice already showed an anxiety-like preference for a dark compartment over a lighted one.
The researchers found that the highly stress-susceptible mice had less of an important molecule known as mGlu2 in a stress-involved region of the brain known as the hippocampus. The mGlu2 decrease, they determined, resulted from an epigenetic change, which affects the expression of genes, in this case the gene that codes for mGlu2.
“If you think of the genetic code as words in a book, the book must be opened in order for you to read it. These epigenetic changes, which affect histone proteins associated with DNA, effectively close the book, so the code for mGlu2 cannot be read,” says first author Carla Nasca, a postdoc in the lab and a fellow of the American Foundation for Suicide Prevention. Previously, she and colleagues implicated mGlu2 in depression when they showed that a promising potential treatment known as acetyl carnitine rapidly alleviated depression-like symptoms in rats and mice by reversing these epigenetic changes to mGlu2 and causing its levels to increase.
“Currently, depression is diagnosed only by its symptoms,” Nasca says. “But these results put us on track to discover molecular signatures in humans that may have the potential to serve as markers for certain types of depression. Our work could also lead to a new generation of rapidly acting antidepressants, such as the candidate acetyl carnitine, which would be particularly important to reduce the risk of suicide.”
A reduction in mGlu2 matters because this molecule regulates the neurotransmitter glutamate. While glutamate plays a crucial role relaying messages between neurons as part of many important processes, too much can lead to harmful structural changes in the brain.
“The brain is constantly changing. When stressful experiences lead to anxiety and depressive disorders the brain becomes locked in a state it cannot spontaneously escape,” McEwen says. “Studies like this one are increasingly focusing on the regulation of glutamate as an underlying mechanism in depression and, we hope, opening promising new avenues for the diagnosis and treatment of this devastating disorder.”
(Image caption: Aggressor cells, which have the potential to cause autoimmunity, are targeted by treatment, causing conversion of these cells to protector cells. Gene expression changes gradually at each stage of treatment, as illustrated by the color changes in this series of heat maps. Credit: University of Bristol/Dr. Bronwen Burton)
Scientists discover how to ‘switch off’ autoimmune diseases
Scientists have made an important breakthrough in the fight against debilitating autoimmune diseases such as multiple sclerosis by revealing how to stop cells attacking healthy body tissue.
Rather than the body’s immune system destroying its own tissue by mistake, researchers at the University of Bristol have discovered how cells convert from being aggressive to actually protecting against disease.
The study, funded by the Wellcome Trust, is published in Nature Communications.
It’s hoped this latest insight will lead to the widespread use of antigen-specific immunotherapy as a treatment for many autoimmune disorders, including multiple sclerosis (MS), type 1 diabetes, Graves’ disease and systemic lupus erythematosus (SLE).
MS alone affects around 100,000 people in the UK and 2.5 million people worldwide.
Scientists were able to selectively target the cells that cause autoimmune disease by dampening down their aggression against the body’s own tissues while converting them into cells capable of protecting against disease.
This type of conversion has been previously applied to allergies, known as ‘allergic desensitisation’, but its application to autoimmune diseases has only been appreciated recently.
The Bristol group has now revealed how the administration of fragments of the proteins that are normally the target for attack leads to correction of the autoimmune response.
Most importantly, their work reveals that effective treatment is achieved by gradually increasing the dose of antigenic fragment injected.
In order to figure out how this type of immunotherapy works, the scientists delved inside the immune cells themselves to see which genes and proteins were turned on or off by the treatment.
They found changes in gene expression that help explain how effective treatment leads to conversion of aggressor into protector cells. The outcome is to reinstate self-tolerance whereby an individual’s immune system ignores its own tissues while remaining fully armed to protect against infection.
By specifically targeting the cells at fault, this immunotherapeutic approach avoids the need for the immune suppressive drugs associated with unacceptable side effects such as infections, development of tumours and disruption of natural regulatory mechanisms.
Professor David Wraith, who led the research, said: “Insight into the molecular basis of antigen-specific immunotherapy opens up exciting new opportunities to enhance the selectivity of the approach while providing valuable markers with which to measure effective treatment. These findings have important implications for the many patients suffering from autoimmune conditions that are currently difficult to treat.”
This treatment approach, which could improve the lives of millions of people worldwide, is currently undergoing clinical development through biotechnology company Apitope, a spin-out from the University of Bristol.
Conscious Brain-to-Brain Communication in Humans Using Non-Invasive Technologies
Human sensory and motor systems provide the natural means for the exchange of information between individuals, and, hence, the basis for human civilization. The recent development of brain-computer interfaces (BCI) has provided an important element for the creation of brain-to-brain communication systems, and precise brain stimulation techniques are now available for the realization of non-invasive computer-brain interfaces (CBI). These technologies, BCI and CBI, can be combined to realize the vision of non-invasive, computer-mediated brain-to-brain (B2B) communication between subjects (hyperinteraction). Here we demonstrate the conscious transmission of information between human brains through the intact scalp and without intervention of motor or peripheral sensory systems. Pseudo-random binary streams encoding words were transmitted between the minds of emitter and receiver subjects separated by great distances, representing the realization of the first human brain-to-brain interface. In a series of experiments, we established internet-mediated B2B communication by combining a BCI based on voluntary motor imagery-controlled electroencephalographic (EEG) changes with a CBI inducing the conscious perception of phosphenes (light flashes) through neuronavigated, robotized transcranial magnetic stimulation (TMS), with special care taken to block sensory (tactile, visual or auditory) cues. Our results provide a critical proof-of-principle demonstration for the development of conscious B2B communication technologies. More fully developed, related implementations will open new research venues in cognitive, social and clinical neuroscience and the scientific study of consciousness. We envision that hyperinteraction technologies will eventually have a profound impact on the social structure of our civilization and raise important ethical issues.
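The abstract does not spell out the encoding used in the study; as a minimal, hypothetical stand-in for “binary streams encoding words,” plain ASCII bits convey the idea of what the emitter’s BCI sends and the receiver’s CBI delivers one bit (phosphene or no phosphene) at a time:

```python
# Illustrative stand-in encoding, NOT the scheme from the study:
# each character becomes 8 bits, and the receiver decodes the stream.
def word_to_bits(word: str) -> str:
    """Encode a word as a binary stream (8 bits per ASCII character)."""
    return "".join(f"{ord(ch):08b}" for ch in word)

def bits_to_word(bits: str) -> str:
    """Decode an 8-bit-per-character binary stream back into a word."""
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

bits = word_to_bits("hola")
print(bits)                # a 32-bit stream, one byte per letter
print(bits_to_word(bits))  # → hola
```

In the actual experiments, each bit would correspond to one trial: the emitter produces an EEG-detectable motor image for one bit value, and the receiver consciously perceives (or does not perceive) a TMS-induced phosphene.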

Training Your Brain to Prefer Healthy Foods
It may be possible to train the brain to prefer healthy low-calorie foods over unhealthy higher-calorie foods, according to new research by scientists at the Jean Mayer USDA Human Nutrition Research Center on Aging (USDA HNRCA) at Tufts University and at Massachusetts General Hospital. Published online today in the journal Nutrition & Diabetes, a brain scan study in adult men and women suggests that it is possible to reverse the addictive power of unhealthy food while also increasing preference for healthy foods.
“We don’t start out in life loving French fries and hating, for example, whole wheat pasta,” said senior and co-corresponding author Susan B. Roberts, Ph.D., director of the Energy Metabolism Laboratory at the USDA HNRCA, who is also a professor at the Friedman School of Nutrition Science and Policy at Tufts University and an adjunct professor of psychiatry at Tufts University School of Medicine. “This conditioning happens over time in response to eating – repeatedly! – what is out there in the toxic food environment.”
Scientists have suspected that, once unhealthy food addiction circuits are established, they may be hard or impossible to reverse, subjecting people who have gained weight to a lifetime of unhealthy food cravings and temptation. To find out whether the brain can be re-trained to support healthy food choices, Roberts and colleagues studied the reward system in thirteen overweight and obese men and women, eight of whom were participants in a new weight loss program designed by Tufts University researchers and five who were in a control group and were not enrolled in the program.
Both groups underwent magnetic resonance imaging (MRI) brain scans at the beginning and end of a six-month period. Among those who participated in the weight loss program, the brain scans revealed changes in areas of the brain reward center associated with learning and addiction. After six months, this area had increased sensitivity to healthy, lower-calorie foods, indicating an increased reward and enjoyment of healthier food cues. The area also showed decreased sensitivity to the unhealthy higher-calorie foods.
“The weight loss program is specifically designed to change how people react to different foods, and our study shows those who participated in it had an increased desire for healthier foods along with a decreased preference for unhealthy foods, the combined effects of which are probably critical for sustainable weight control,” said co-author Sai Krupa Das, Ph.D., a scientist in the Energy Metabolism Laboratory at the USDA HNRCA and an assistant professor at the Friedman School. “To the best of our knowledge this is the first demonstration of this important switch.” The authors hypothesize that several features of the weight loss program were important, including behavior change education and high-fiber, low glycemic menu plans.
“Although other studies have shown that surgical procedures like gastric bypass surgery can decrease how much people enjoy food generally, this is not very satisfactory because it takes away food enjoyment generally rather than making healthier foods more appealing,” said first author and co-corresponding author Thilo Deckersbach, Ph.D., a psychologist at Massachusetts General Hospital. “We show here that it is possible to shift preferences from unhealthy food to healthy food without surgery, and that MRI is an important technique for exploring the brain’s role in food cues.”
“There is much more research to be done here, involving many more participants, long-term follow-up and investigating more areas of the brain,” Roberts added. “But we are very encouraged that the weight loss program appears to change what foods are tempting to people.”
Bats do not use sight to navigate when flying. Instead, they emit ultrasound pulses and measure the echoes reflected from their surroundings. They have an extremely flexible internal navigation system that enables them to do this. A new study published in Nature Communications shows that when a bat flies close to an object, the number of active neurons in the part of a bat’s brain responsible for processing acoustic information about spatial positioning increases. This information helps these masters of flight to react rapidly and avoid obstacles.
As nocturnal animals, bats are perfectly adapted to a life without light. They emit echolocation sounds and use the delay between the reflected echoes to measure distance to obstacles or prey. In their brains, they have a spatial map representing different echo delays. A study carried out by researchers at Technische Universität München (TUM) has shown for the first time that this map dynamically adapts to external factors.
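The delay-to-distance arithmetic behind echolocation is simple to state. As a rough illustration (not code from the study), the distance to a reflector is the speed of sound times half the round-trip delay of the pulse:

```python
# Back-of-the-envelope echolocation arithmetic (illustrative only).
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def distance_from_echo_delay(delay_s: float) -> float:
    """Distance to a reflector, given the round-trip echo delay in seconds."""
    # The pulse travels out and back, so the one-way distance is half
    # the total path covered during the delay.
    return SPEED_OF_SOUND * delay_s / 2.0

# A 10 ms echo delay corresponds to an obstacle about 1.7 m away.
print(distance_from_echo_delay(0.010))  # → 1.715
```

The brain map described in the study represents exactly this quantity — different neurons respond to different echo delays, i.e. different distances — and the new finding is that the map’s magnification changes dynamically for nearby objects.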
Closer objects appear larger
When a bat flies in too close to an object, the number of activated neurons in its brain increases. As a result, the object appears disproportionately larger on the bat’s brain map than objects at a safe distance, as if it were magnified. “The map is similar to the navigation systems used in cars in that it shows bats the terrain in which they are moving,” explains study director Dr. Uwe Firzlaff at the TUM Chair of Zoology. “The major difference, however, is that the bats’ inbuilt system warns them of an impending collision by enhancing neuronal signals for objects that are in close proximity.”
Bats constantly adapt their flight maneuvers to their surroundings to avoid collisions with buildings, trees or other animals. The ability to determine lateral distance to other objects also plays a key role here, which is why bats process more spatial information than just echo delays. “Bats evaluate their own motion and map it against the lateral distance to objects,” elaborates the researcher.
Brain processes complex spatial information
In addition to the echo reflection time, bats process the reflection angle of echoes. They also compare the sound volume of their calls with those of the reflected sound waves and measure the wave spectrum of the echo. “Our research has led us to conclude that bats display much more spatial information on their acoustic maps than just echo reflection.”
The results show that nerve cells support the bats’ rapid responses to external stimuli by enlarging the active area of the brain map that displays important information. “We may have just uncovered one of the fundamental mechanisms that enables vertebrates to adapt flexibly to continuously changing environments,” concludes Firzlaff.
Neurons in human skin perform advanced calculations
A fundamental characteristic of neurons that extend into the skin and record touch, so-called first-order neurons in the tactile system, is that they branch in the skin so that each neuron reports touch from many highly-sensitive zones on the skin.
According to researchers at the Department of Integrative Medical Biology, IMB, Umeå University, this branching allows first-order tactile neurons not only to send signals to the brain that something has touched the skin, but also process geometric data about the object touching the skin.
“Our work has shown that two types of first-order tactile neurons that supply the sensitive skin at our fingertips not only signal information about when and how intensely an object is touched, but also information about the touched object’s shape,” says Andrew Pruszynski, one of the researchers behind the study.
The study also shows that the sensitivity of individual neurons to the shape of an object depends on the layout of the neuron’s highly sensitive zones in the skin.
“Perhaps the most surprising result of our study is that these peripheral neurons, which are engaged when a fingertip examines an object, perform the same type of calculations done by neurons in the cerebral cortex. Somewhat simplified, it means that our touch experiences are already processed by neurons in the skin before they reach the brain for further processing,” says Andrew Pruszynski.
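The idea that the layout of a neuron's sensitive zones makes its response shape-dependent can be sketched with a toy model (our illustration, with arbitrary zone positions, not the study's actual data):

```python
import numpy as np

# Toy model: a first-order tactile neuron samples the skin through
# several highly sensitive zones. Because the zones have a spatial
# layout, the neuron's firing to an edge pressed on the fingertip
# already carries information about the edge's orientation.

# Zone centres on a 1x1 patch of skin (cross-shaped layout, arbitrary).
zones = np.array([[0.2, 0.5], [0.5, 0.5], [0.8, 0.5],
                  [0.5, 0.2], [0.5, 0.8]])

def edge_response(angle, width=0.1):
    """Count the zones lying within `width` of a straight edge.

    The edge runs through the patch centre (0.5, 0.5) at orientation
    `angle` (radians); each zone close to the edge adds one unit of drive.
    """
    normal = np.array([np.sin(angle), -np.cos(angle)])
    dist = np.abs((zones - 0.5) @ normal)   # point-to-line distances
    return int(np.sum(dist < width))

# The same contact point yields different responses at different edge
# orientations, so downstream neurons can read out edge orientation
# from first-order spike counts alone.
print([edge_response(a) for a in (0.0, np.pi / 4, np.pi / 2)])  # [3, 1, 3]
```

In this sketch a horizontal or vertical edge excites three zones while a diagonal edge excites only one, so a single neuron's output already distinguishes orientations — the kind of computation usually attributed to cortex.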
Swinburne researchers have developed a technique to create a highly sensitive surface for measuring the concentration of a peptide that is a biomarker for early stage Alzheimer’s disease.

(Image caption: Ultrashort-laser pulses were used to write ripples on the surface of sapphire. The self-organised nano-structure of ripples (seen in the image) is a perfect sensing surface after coating with a nanometre-thin layer of gold made by evaporation or sputtering. Such surface ripples were used in the study of amyloid detection.)
Alzheimer’s disease was first recorded more than 100 years ago, but there is still no effective therapy to stop or slow the progression of the disease. Sufferers can lose up to 60 per cent of their neuronal cells before a diagnosis is obtained.
Diagnosis at the very early stages before neuronal degeneration has begun is vital for testing and developing new treatments.
Abnormal levels of the beta amyloid peptide in cerebrospinal fluid appear to be the earliest and most significant marker of Alzheimer’s. Currently there are no standardised tests to detect this biomarker.
The researchers have developed a sensor based on nanotechnology that outperforms commercial sensors and demonstrates fast and reliable measurement of beta amyloid oligomers at low concentrations.
The key to the high sensitivity is the laser nano-textured gold coated surface. This sensor can identify concentrations of beta amyloid in a quantitative manner for the first time.
“We showed that sensors based on light scattering can indeed deliver quantitative measurements, and they can be made fast,” Professor of Nanophotonics Saulius Juodkazis said.
“The sensor platform we developed by laser nano-texturing of surfaces is delivering results of the highest sensitivity and repeatability.
“The challenge is to create fast and efficient fabrication of sensors based on nanotechnology and develop new analytical methods of detection. This means we should be able to detect markers of diseases at far lower levels.”
Surface enhanced Raman spectroscopy (SERS) is one of the most sensitive and highly specific label-free detection methods. It may evolve into a detection technique for different forms of beta amyloid, or into a rapid, low-cost way to validate new biomarkers before standard enzyme-linked immunosorbent assays (ELISAs) are developed.
This research was the PhD project of Dr Ricardas Buividas, who received his doctorate in May 2014. It was published in the Journal of Biophotonics.
(Source: swinburne.edu.au)
Chinese Doctors Use 3D-Printing in Pioneering Surgery to Replace Half of Man’s Skull
Surgeons at Xijing Hospital in Xi’an, Shaanxi province in Northwest China are using 3D-printing in a pioneering surgery to help rebuild the skull of a man who suffered brain damage in a construction accident.
Hu, a 46-year-old farmer, was overseeing construction to expand his home in Zhouzhi county last October when he was hit by a pile of wood and fell down three storeys.
Although he survived the fall, the left side of his skull was severely crushed and the shattered bone fragments had to be removed, leaving a depression on one side of his head.
Due to his injuries, Hu cannot see well out of his left eye, experiences double vision (diplopia) and is unable to speak or write.