Neuroscience

September 2013

Old memories recombine to give a taste of the unknown

Ever tried beetroot custard? Probably not, but your brain can imagine how it might taste by reactivating old memories in a new pattern.

Helen Barron and her colleagues at University College London and Oxford University wondered if our brains combine existing memories to help us decide whether to try something new.

So the team used an fMRI scanner to look at the brains of 19 volunteers who were asked to remember specific foods they had tried.

Each volunteer was then given a menu of 13 unusual food combinations – including beetroot custard, tea jelly, and coffee yoghurt – and asked to imagine how good or bad they would taste, and whether or not they would eat them.

"Tea jelly was popular," says Barron. "Beetroot custard not so much."

When each volunteer imagined a new combination, they showed brain activity associated with each of the known ingredients at the same time. It is the first evidence to suggest that we use memory combination to make decisions, says Barron.

Sep 9, 2013 · 175 notes
#decision making #memory #medial prefrontal cortex #hippocampus #neuroscience #science
Genetic breakthrough another step to understanding schizophrenia

A consortium of scientists from 20 countries, including researchers from The University of Western Australia, has made a major breakthrough in understanding the genetic basis of the debilitating disorder, schizophrenia.

More than 175 scientists from 99 institutions across Europe, the United States of America and Australia contributed to a genome-wide association analysis which identified 13 new risk loci for schizophrenia.

In an article published in the journal Nature Genetics, the study authors write that the results provide deeper insight into the genetic architecture of schizophrenia than ever before achieved, and provide a pathway to further research.

"For the first time, there is a clear path to increased knowledge of the etiology of schizophrenia through the application of standard, off-the-shelf genomic technologies for elucidating the effects of common variation," the authors wrote.

Schizophrenia is a complex mental disorder which affects about one per cent of people over their lifetime, leading to prolonged or recurrent episodes that severely impair social functioning and quality of life.

In terms of the ‘global burden of disease and disability’ index, developed by the World Health Organization, it ranks among the top 10 disorders, along with cancer, heart disease, diabetes and other non-communicable diseases.

Winthrop Professor Assen Jablensky, director of UWA’s Centre for Clinical Research in Neuropsychiatry (CCRN) at Graylands Hospital, and Professor Luba Kalaydjieva, of the UWA-affiliated Western Australian Institute for Medical Research (WAIMR), led the UWA research team which took part in the study.

Professor Jablensky said that while a strong genetic component in the causation of schizophrenia had been well established, the role of specific genes and the mechanisms of their regulation remained largely unknown.

"Until recently, results of genetic linkage and association studies could explain only a small fraction of the estimated heritability of the disorder and of its ‘genetic architecture’," Professor Jablensky said.

However recent technological advances, enabling efficient coverage of the entire human genome with millions of single nucleotide polymorphisms (SNPs) as genetic markers, had given rise to a new generation of genome-wide association studies (GWAS), which trace the DNA differences between people affected with the disease and healthy control individuals.

"Since the effects of individual SNPs are quite tiny, their reliable measurement requires very large samples of adequately diagnosed patients and controls," Professor Jablensky said.
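The sample-size point can be illustrated with a toy association test. The sketch below uses invented allele counts and a plain Pearson chi-square on a 2x2 table; real GWAS pipelines use dedicated tools and correct for population structure, so this is only a caricature of why tiny per-SNP effects demand very large cohorts:

```python
# Toy illustration (hypothetical counts): a 2x2 allele-count chi-square
# test for one SNP in a case-control design. Not the consortium's method.

def chi_square_2x2(case_ref, case_alt, ctrl_ref, ctrl_alt):
    """Pearson chi-square statistic for a 2x2 table of allele counts."""
    table = [[case_ref, case_alt], [ctrl_ref, ctrl_alt]]
    total = case_ref + case_alt + ctrl_ref + ctrl_alt
    row_sums = [sum(row) for row in table]
    col_sums = [case_ref + ctrl_ref, case_alt + ctrl_alt]
    chi2 = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_sums[i] * col_sums[j] / total
            chi2 += (table[i][j] - expected) ** 2 / expected
    return chi2

# A risk allele at 31% frequency in cases vs 30% in controls: a tiny effect.
# With 2,000 alleles per group the signal is negligible...
small = chi_square_2x2(1380, 620, 1400, 600)
# ...but with 40,000 alleles per group (closer to PGC-scale numbers),
# the identical frequency difference produces a 20x larger statistic.
large = chi_square_2x2(27600, 12400, 28000, 12000)
print(small, large)
```

Because the chi-square statistic scales linearly with sample size at a fixed frequency difference, only very large samples can reliably separate real tiny effects from noise.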

"This recent study reports on a major breakthrough in the understanding of the genetic basis of schizophrenia, achieved through meta-analysis of GWAS datasets contributed by a large international Psychiatric Genomics Consortium (PGC) - which includes the UWA research team."

A WA case-control sample consisting of 893 schizophrenia patients and healthy controls was part of a collection of 21,246 schizophrenia cases and 38,072 controls from 19 research centres and consortia across Europe, Australia and the USA.

The study found that a total of 8300 SNPs contribute to the risk for schizophrenia and account for at least 32 per cent of the variance in liability.
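To make the "many tiny effects" idea concrete, here is a minimal polygenic-score sketch. The effect sizes and genotypes below are simulated, not the study's data; only the approximate SNP count (8,300) is taken from the article:

```python
import random

# Hypothetical sketch of a polygenic risk score: risk is spread across
# thousands of SNPs, each contributing a tiny amount. All numbers here
# are simulated for illustration.

random.seed(0)
n_snps = 8300
effects = [random.gauss(0, 0.01) for _ in range(n_snps)]  # tiny per-SNP effects

def polygenic_score(genotype):
    """genotype[i] = risk-allele count (0, 1 or 2) at SNP i."""
    return sum(g * b for g, b in zip(genotype, effects))

person = [random.choice([0, 1, 2]) for _ in range(n_snps)]
score = polygenic_score(person)
print(round(score, 3))  # one person's aggregate liability score
```

No single term in the sum is informative on its own; it is the aggregate over thousands of loci that carries the ~32 per cent of liability variance reported.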

"A particularly important result of this study is that many of these SNPs are located on a molecular pathway involved in neuronal calcium signalling, which suggests a novel pathogenetic link in the causation of schizophrenia and possibly other psychotic disorders," Professor Jablensky said.

He said ongoing and future studies by the UWA research team would aim to further refine the genetic analyses of the WA schizophrenia study (which at present includes 1259 persons), and to test neurobiological hypotheses about the treatment responses of genetically defined subsets of patients. 

Sep 9, 2013 · 111 notes
#schizophrenia #GWAS #genetics #neuroscience #science
Finally mapped: The brain region that distinguishes bits from bounty

In comparing amounts of things — be it the grains of sand on a beach, or the size of a sea gull flock inhabiting it — humans use a part of the brain that is organized topographically, researchers have finally shown. In other words, the neurons that work to make this “numerosity” assessment are laid out in a shape that allows those most closely related to communicate and interact over the shortest possible distance.

This layout, referred to as a topographical map, is characteristic of all primary senses — sight, hearing, touch, smell and taste — and scientists have long assumed that numerosity, while not a primary sense (but perceived similarly to one), might be characterized by such a map, too.

But they have not been able to find it, which has caused some doubt in the field as to whether a map for numerosity exists.

Now, however, Utrecht University’s Benjamin Harvey and his colleagues have sussed out signals showing that the hypothesized numerosity map is real.

Numerosity, it is important to note, is distinct from symbolic numbers. “We use symbolic numbers to represent numerosity and other aspects of magnitude, but the symbol itself is only a representation,” Harvey said. He went on to explain that numerosity selectivity in the brain is derived from visual processing of image features, whereas symbolic number selectivity is derived from recognizing the shapes of numerals, written words, and linguistic sounds that represent numbers. “This latter task relies on very different parts of the brain that specialize in written and spoken language.”

Understanding whether the brain’s processing of numerosity and symbolic numbers is related, as we might be tempted to think, is just one area that will be better informed by Harvey’s new map.

To uncover it, he and his colleagues asked eight adult study participants to look at patterns of dots that varied in number over time, all the while analysing the neural response properties in a numerosity-linked part of their brain using high-field fMRI (functional magnetic resonance imaging). Use of this advanced neuroimaging method allowed them to scan the subjects for far fewer hours per sitting than would have been required with a less powerful scanning technology.

With the resulting fMRI data, Harvey and his team used population receptive field modelling, which aims to measure neural response as directly and quantitatively as possible. “This was the key to our success,” Harvey said. It allowed the researchers to model the human fMRI response properties after results of recordings from macaque neurons, in which numerosity experiments had been conducted more extensively.
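The modelling idea can be caricatured in a few lines: each voxel's response is treated as a tuning curve over numerosity, and the preferred numerosity is whichever curve best fits the measured responses. The Gaussian-over-log-numerosity form and all numbers below are illustrative assumptions, not the study's actual analysis code:

```python
import math

# Minimal sketch of tuning-model fitting in the spirit of population
# receptive field analysis. Voxel "responses" are simulated.

def tuning(n, preferred, width=0.5):
    """Gaussian tuning curve over log numerosity (one common model choice)."""
    return math.exp(-((math.log(n) - math.log(preferred)) ** 2) / (2 * width ** 2))

numerosities = [1, 2, 3, 4, 5, 6, 7]     # dot counts shown to the subject
# Simulated voxel responses that actually peak around n = 3
responses = [tuning(n, 3.0) for n in numerosities]

def fit_preferred(responses):
    """Grid search for the preferred numerosity minimising squared error."""
    best, best_err = None, float("inf")
    for candidate in [p / 10 for p in range(10, 80)]:  # 1.0 .. 7.9
        err = sum((r - tuning(n, candidate)) ** 2
                  for n, r in zip(numerosities, responses))
        if err < best_err:
            best, best_err = candidate, err
    return best

print(fit_preferred(responses))  # recovers ~3.0
```

Repeating a fit like this for every voxel, then plotting each voxel's preferred numerosity by cortical position, is what reveals whether preferences change smoothly across the surface, i.e. whether a topographic map exists.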

Their efforts revealed a topographical layout of numerosity in the human brain; the small quantities of dots the participants observed were encoded by neurons in one part of the brain, and the larger quantities, in another.

This finding demonstrates that topography can emerge not just for lower-level cognitive functions, like the primary senses, but for higher-level cognitive functions, too.

"We are very excited that association cortex can produce emergent topographic structures," Harvey said.

Because scientists know a great deal about topographical maps (and have the tools to probe them), the work of Harvey et al. may help scientists better analyse the neural computation underlying number processing.

"We believe this will lead to a much more complete understanding of humans’ unique numerical and mathematical skills," Harvey said.

Having heard from others in the field about the difficulty associated with the hunt for a topographical map of numerosity, Harvey and colleagues were surprised to obtain the results they did.

They also found the variations between their subjects interesting.

"Every individual brain is a complex and very different system," Harvey explained. "I was very surprised then that the map we report is in such a consistent location between our subjects, and that numerosity preferences always increased in the same direction along the cortex."

"On the other hand," he continued, "the extent of individual differences … is also striking." Harvey explained that understanding the consequences of these differences for their subjects’ perception or task performance will require further study.

Sep 7, 2013 · 83 notes
#numerosity #parietal cortex #topographical map #neuroimaging #neuroscience #science
Salk scientists and colleagues discover important mechanism underlying Alzheimer's disease

Details of destructive neuronal pathway should help improve drug therapies

Alzheimer’s disease affects more than 26 million people worldwide, and its prevalence is predicted to skyrocket as baby boomers age—nearly 106 million people are projected to have the disease by 2050. Fortunately, scientists are making progress towards therapies. A collaboration among several research entities, including the Salk Institute and the Sanford-Burnham Medical Research Institute, has defined a key mechanism behind the disease’s progression, giving hope that a newly modified Alzheimer’s drug will be effective.

In a previous study in 2009, Stephen F. Heinemann, a professor in Salk’s Molecular Neurobiology Laboratory, found that a nicotinic receptor called Alpha7 may help trigger Alzheimer’s disease. “Previous studies exposed a possible interaction between Alpha-7 nicotinic receptors (α7Rs) with amyloid beta, the toxic protein found in the disease’s hallmark plaques,” says Gustavo Dziewczapolski, a staff researcher in Heinemann’s lab. “We showed for the first time, in vivo, that the binding of these two proteins, α7Rs and amyloid beta, provokes detrimental effects in mice similar to the symptoms observed in Alzheimer’s disease.”

Their experiments, published in The Journal of Neuroscience, with Dziewczapolski as first author, consisted of testing Alzheimer’s disease-induced mice with and without the gene for α7Rs. They found that while both types of mice developed plaques, only the ones with α7Rs showed the impairments associated with Alzheimer’s.

But that still left a key question: Why was the pairing deleterious?

In a recent paper in the Proceedings of the National Academy of Sciences, Heinemann and Dziewczapolski here at Salk with Juan Piña-Crespo, Sara Sanz-Blasco, Stuart A. Lipton of the Sanford-Burnham Medical Research Institute and their collaborators announced they had found the answer in unexpected interactions among neurons and other brain cells.

Neurons communicate by sending electrical and chemical signals to each other across gaps called synapses. The biochemical mix at synapses resembles a major airport on a holiday weekend—it’s crowded, complicated and exquisitely sensitive to increases and decreases in traffic. One of these signaling chemicals is glutamate, an excitatory neurotransmitter, which is essential for learning and storing memories. In the right balance, glutamate is part of the normal functioning of neuronal synapses. But neurons are not the only cells in the brain capable of releasing glutamate. Astrocytes, once thought to be merely cellular glue between neurons, also release this neurotransmitter.

In this new understanding of Alzheimer’s disease, there is a cellular signaling cascade, in which amyloid beta stimulates the alpha 7 nicotine receptors, which trigger astrocytes to release additional glutamate into the synapse, overwhelming it with excitatory (“go”) signals.

This release in turn activates another set of receptors outside of the synapse, called extrasynaptic-N-methyl-D-aspartate receptors (eNMDARs) that depress synaptic activity. Unfortunately, the eNMDARs seem to overly depress synaptic function, leading to the memory loss and confusion associated with Alzheimer’s.

Now that the team has finally determined the steps in this destructive pathway, the good news is that a drug developed by Lipton’s laboratory called NitroMemantine, a modification of the earlier Alzheimer’s medication Memantine, may block the entry of eNMDARs into the cascade.

"Thanks to the joint effort of our colleagues and collaborators, we seem to finally have a clear mechanistic link between a key target of the amyloid beta in the brain, the Alpha7 nicotinic receptors, triggering downstream harmful effects associated with the initiation and progression of Alzheimer’s disease," says Dziewczapolski. "This is a clear demonstration of the value of basic biomedical research. Drug development cannot proceed without knowing the details of interactions at the molecular and cellular level. Our research revealed two potential targets, α7Rs and eNMDARs, for future disease-modifying therapeutics, which Dr. Heinemann and I both hope will translate into a better treatment for Alzheimer’s patients."

Sep 7, 2013 · 55 notes
#alzheimer's disease #amyloid beta #nicotine receptors #eNMDARs #neuroscience #science
Shout now! ‒ How Nerve Cells Initiate Voluntary Calls

University of Tübingen neuroscientists show that monkeys can decide to call out or keep silent

“Should I say something or not?” Human beings are not alone in pondering this dilemma – animals also face decisions when they communicate by voice. University of Tübingen neurobiologists Dr. Steffen Hage and Professor Andreas Nieder have now demonstrated that nerve cells in the brain signal the targeted initiation of calls – forming the basis of voluntary vocal expression. Their results are published in “Nature Communications.”

When we speak, we use the sounds we make for a specific purpose – we intentionally say what we think, or consciously withhold information. Animals, however, usually make sounds according to what they feel at that moment. Even our closest relations among the primates make sounds as a reflex based on their mood. Now, Tübingen neuroscientists have shown that rhesus monkeys are able to call (or be silent) on command. They can instrumentalize the sounds they make in a targeted way, an important behavioral ability which we also use to put language to a purpose.

To find out how the neural cells in the brain catalyse the production of controlled vocal noises, the researchers taught rhesus monkeys to call out quickly when a spot appeared on a computer screen. While the monkeys solved the task, measurements taken in their prefrontal cortex revealed astonishing reactions in the cells there. The nerve cells became active whenever the monkey saw the spot of light which was the instruction to call out. But if the monkey simply called out spontaneously, these nerve cells were not activated. The cells therefore did not signal for just any vocalisation – only for calls that the monkey actively decided to make.

The results published in “Nature Communications” provide valuable insights into the neurobiological foundations of vocalization. “We want to understand the physiological mechanisms in the brain which lead to the voluntary production of calls,” says Dr. Steffen Hage of the Institute for Neurobiology, “because it played a key role in the evolution of human ability to use speech.” The study offers important indicators of the function of part of the brain which in humans has developed into one of the central locations for controlling speech. “Disorders in this part of the human brain lead to severe speech disorders or even complete loss of speech in the patient,” Professor Andreas Nieder explains. The results – giving insights into how the production of sound is initiated – may help us better understand speech disorders.

Sep 7, 2013 · 51 notes
#speech production #vocalizations #primates #nerve cells #Broca's area #neuroscience #science
Experimental Compound Reverses Down Syndrome-Like Learning Deficits In Mice

Researchers at Johns Hopkins and the National Institutes of Health have identified a compound that dramatically bolsters learning and memory when given to mice with a Down syndrome-like condition on the day of birth. As they report in the Sept. 4 issue of Science Translational Medicine, the single-dose treatment appears to enable the cerebellum of the rodents’ brains to grow to a normal size.

The scientists caution that use of the compound, a small molecule known as a sonic hedgehog pathway agonist, has not been proven safe to try in people with Down syndrome, but say their experiments hold promise for developing drugs like it.

“Most people with Down syndrome have a cerebellum that’s about 60 percent of the normal size,” says Roger Reeves, Ph.D., a professor in the McKusick-Nathans Institute of Genetic Medicine at the Johns Hopkins University School of Medicine. “We treated the Down syndrome-like mice with a compound we thought might normalize the cerebellum’s growth, and it worked beautifully. What we didn’t expect were the effects on learning and memory, which are generally controlled by the hippocampus, not the cerebellum.”

Reeves has devoted his career to studying Down syndrome, a condition that occurs when people have three, rather than the usual two, copies of chromosome 21. As a result of this “trisomy,” people with Down syndrome have extra copies of the more than 300 genes housed on that chromosome, which leads to intellectual disabilities, distinctive facial features and sometimes heart problems and other health effects. Since the condition involves so many genes, developing treatments for it is a formidable challenge, Reeves says.

For the current experiments, Reeves and his colleagues used mice that were genetically engineered to have extra copies of about half of the genes found on human chromosome 21.
The mice have many characteristics similar to those of people with Down syndrome, including relatively small cerebellums and difficulty learning and remembering how to navigate through a familiar space. (In the case of the mice, this was tested by tracking how readily the animals located a platform while swimming in a so-called water maze.)
Based on previous experiments on how Down syndrome affects brain development, the researchers tried supercharging a biochemical chain of events known as the sonic hedgehog pathway that triggers growth and development. They used a compound — a sonic hedgehog pathway agonist — that could do just that.

The compound was injected into the Down syndrome-like mice just once, on the day of birth, while their cerebellums were still developing. “We were able to completely normalize growth of the cerebellum through adulthood with that single injection,” Reeves says.

But the research team went beyond measuring the cerebellums, looking for changes in behavior, too. “Making the animals, synthesizing the compound and guessing the right dose were so difficult and time-consuming that we wanted to get as much data out of the experiment as we could,” Reeves says. The team tested the treated mice against untreated Down syndrome-like mice and normal mice in a variety of ways, and found that the treated mice did just as well as the normal ones on the water maze test.

Reeves says further research is needed to learn why exactly the treatment works, because their examination of certain cells in the hippocampus known to be involved in learning and affected by Down syndrome appeared unchanged by the sonic hedgehog agonist treatment. One idea is that the treatment improved learning by strengthening communication between the cerebellum and the hippocampus, he says.

As for the compound’s potential to become a human drug, the problem, Reeves says, is that altering an important biological chain of events like sonic hedgehog would likely have many unintended effects throughout the body, such as raising the risk of cancer by triggering inappropriate growth. But now that the team has seen the potential of this strategy, they will look for more targeted ways to safely harness the power of sonic hedgehog in the cerebellum. Even if his team succeeds in developing a clinically useful drug, however, Reeves cautions that it wouldn’t constitute a “cure” for the learning and memory-related effects of Down syndrome. “Down syndrome is very complex, and nobody thinks there’s going to be a silver bullet that normalizes cognition,” he says. “Multiple approaches will be needed.”

Sep 7, 2013 · 52 notes
#down syndrome #trisomy #sonic hedgehog pathway #cerebellum #animal model #neuroscience #science
“Seeing” Faces Through Touch

Our sense of touch can contribute to our ability to perceive faces, according to new research published in Psychological Science, a journal of the Association for Psychological Science.

“In daily life, we usually recognize faces through sight and almost never explore them through touch,” says lead researcher Kazumichi Matsumiya of Tohoku University in Japan. “But we use information from multiple sensory modalities in order to perceive many everyday non-face objects and events, such as speech perception or object recognition — these new findings suggest that even face processing is essentially multisensory.”

In a series of studies, Matsumiya took advantage of a phenomenon called the “face aftereffect” to investigate whether our visual system responds to nonvisual signals for processing faces. In the face aftereffect, we adapt to a face with a particular expression — happiness, for example — which causes us to perceive a subsequent neutral face as having the opposite facial expression (i.e., sadness).

Matsumiya hypothesized that if the visual system really does respond to signals from another modality, then we should see evidence for face aftereffects from one modality to the other. So, adaptation to a face that is explored by touch should produce visual face aftereffects.

To test this, Matsumiya had participants explore face masks concealed below a mirror by touching them. After this adaptation period, the participants were visually presented with a series of faces that had varying expressions and were asked to classify the faces as happy or sad. The visual faces and the masks were created from the same exemplar.

In line with his hypothesis, Matsumiya found that participants’ experiences exploring the face masks by touch shifted their perception of the faces presented visually compared to participants who had no adaptation period, such that the visual faces were perceived as having the opposite facial expression.
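A perceptual shift like this is typically quantified as a change in the point of subjective equality (PSE): the morph level at which "happy" and "sad" responses are equally likely. The sketch below uses invented response proportions, not Matsumiya's data, to show the arithmetic:

```python
# Hypothetical sketch: after adapting to a happy face (here, by touch),
# neutral faces read as sadder, so the 50% "happy" point shifts toward
# the happy end of the morph continuum. All proportions are invented.

morph_levels = [-3, -2, -1, 0, 1, 2, 3]   # sad ... neutral ... happy
baseline_p_happy = [0.02, 0.10, 0.30, 0.50, 0.70, 0.90, 0.98]
adapted_p_happy  = [0.01, 0.05, 0.15, 0.30, 0.50, 0.80, 0.95]

def pse(levels, p_happy):
    """Linearly interpolate the morph level where p(happy) crosses 0.5."""
    for (x0, y0), (x1, y1) in zip(zip(levels, p_happy),
                                  zip(levels[1:], p_happy[1:])):
        if y0 <= 0.5 <= y1:
            return x0 + (0.5 - y0) * (x1 - x0) / (y1 - y0)
    raise ValueError("no 0.5 crossing")

shift = pse(morph_levels, adapted_p_happy) - pse(morph_levels, baseline_p_happy)
print(shift)  # positive shift: more "happiness" needed to look happy
```

A reliably nonzero shift between adaptation conditions is the signature of a face aftereffect; here it crosses from the touch modality to vision.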

Further experiments ruled out other explanations for the results, including the possibility that the face aftereffects emerged because participants were intentionally imagining visual faces during the adaptation period.

And a fourth experiment revealed that the aftereffect also works the other way: Visual stimuli can influence how we perceive a face through touch.

According to Matsumiya, current views on face processing assume that the visual system only receives facial signals from the visual modality — but these experiments suggest that face perception is truly crossmodal.

“These findings suggest that facial information may be coded in a shared representation between vision and haptics in the brain,” notes Matsumiya, who suggests the findings may have implications for developing vision and telecommunication aids for the visually impaired.

Sep 6, 2013 · 65 notes
#face perception #face processing #face aftereffects #adaptation #psychology #neuroscience #science
Nasal inhalation of oxytocin improves face blindness

Prosopagnosia (face blindness) may be temporarily improved following inhalation of the hormone oxytocin.

This is the finding of research led by Dr Sarah Bate and Dr Rachel Bennetts of the Centre for Face Processing Disorders at Bournemouth University that will be presented today, Friday 6 September, at the British Psychological Society’s Joint Cognitive and Developmental annual conference at the University of Reading.

Dr Bate explained: “Prosopagnosia is characterised by a severe impairment in face recognition, whereby a person cannot identify the faces of their family or friends, or even their own face.”

The researchers tested twenty adults (10 with prosopagnosia and 10 control participants). Each participant visited the laboratory on two occasions, approximately two weeks apart. On one visit they inhaled the oxytocin nasal spray, and on the other visit they inhaled the placebo spray. The two sprays were prepared by an external pharmaceutical company in identical bottles, and neither the participants nor the researchers knew the identity of the sprays until the data had been analysed.

Regardless of which spray the person inhaled, the testing sessions had an identical format. Participants inhaled the spray, then sat quietly for 45 minutes to allow the spray to take effect. They then participated in two face processing tests: one testing their ability to remember faces and the other testing their ability to match faces of the same identity.

The researchers found that the participants with prosopagnosia achieved higher scores on both face processing tests in the oxytocin condition. Interestingly, no improvement was observed in the control participants, suggesting the hormone may be more effective in those with impaired face recognition systems.
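Because each participant was tested under both sprays, the natural analysis is within-subject: compare each person's oxytocin score to their own placebo score. The sketch below uses invented scores (not the Bournemouth data) to show that per-person differencing, which removes stable individual differences in baseline ability:

```python
# Hypothetical within-subject (crossover) comparison. Scores are invented
# percent-correct values for the 10 prosopagnosic participants.

placebo  = [52, 48, 55, 50, 47, 53, 49, 51, 46, 54]
oxytocin = [58, 53, 57, 56, 50, 59, 52, 57, 49, 60]

# Each participant serves as their own control:
diffs = [o - p for o, p in zip(oxytocin, placebo)]
mean_diff = sum(diffs) / len(diffs)
improved = sum(d > 0 for d in diffs)
print(mean_diff, improved)  # average gain, and how many of 10 improved
```

In a real analysis these paired differences would feed a paired-samples test; the crossover design plus double-blinding (neither participants nor researchers knew which spray was which) is what lets the difference be attributed to the hormone.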

The initial ten participants with prosopagnosia had a developmental form of the condition. Individuals with developmental prosopagnosia have never experienced brain damage, and this form of face blindness is thought to be very common, affecting one in 50 people. Much more rarely, people can acquire prosopagnosia following a brain injury. At a later date, the researchers had the opportunity to test one person with acquired prosopagnosia, and also observed a large improvement following oxytocin inhalation in this individual.

Dr Bate said: “This study provides the first evidence that oxytocin may be used to temporarily improve face recognition in people with either developmental or acquired prosopagnosia. The effects of the hormone are thought to last 2-3 hours, and it may be that the nasal spray can be used to improve face recognition on a special occasion. However, much more research needs to be carried out, as we don’t currently know whether there are benefits or risks associated with longer-term inhalation of the hormone.”

Sep 6, 2013 · 80 notes
#prosopagnosia #oxytocin #face recognition #psychology #neuroscience #science
Robots Could One Day Help Surgeons Remove Hard to Reach Brain Tumors

NIBIB-funded scientists and engineers are teaming up with neurosurgeons to develop technologies that enable less invasive, image-guided removal of hard-to-reach brain tumors. Their technologies combine novel imaging techniques that allow surgeons to see deep within the brain during surgery with robotic systems that enhance the precision of tissue removal.

A robot that worms its way in

The median survival for patients with glioblastomas, or high-grade primary brain cancer, is less than two years. One factor contributing to this poor prognosis is that many deep-seated and pervasive tumors are not entirely accessible, or even visible, using current neurosurgical tools and imaging techniques.

But several years ago, J. Marc Simard, M.D., a professor of neurosurgery at the University of Maryland School of Medicine in Baltimore (UMB), had an insight that he hoped might address this problem. At the time, he had been watching a TV show in which plastic surgeons were using sterile maggots to remove damaged or dead tissue from a patient.

“Here you had a natural system that recognized bad from good and good from bad,” said Simard. “In other words, the maggots removed all the bad stuff and left all the good stuff alone and they’re really small. I thought, if you had something equivalent to that to remove a brain tumor that would be an absolute home run.”

Image: Initial prototype for the minimally invasive neurosurgical intracranial robot. Image courtesy of University of Maryland.

And so Simard teamed up with Rao Gullapalli, Ph.D., professor of diagnostic radiology and nuclear medicine also at UMB, as well as Jaydev Desai, Ph.D., professor of mechanical engineering at the University of Maryland, College Park, to develop a small neurosurgical robot that could be used to remove deep-seated brain tumors.

Within four years, the team had designed, constructed, and tested their first prototype, a finger-like device with multiple joints, allowing it to move in many directions. At the tip of the robot is an electrocautery tool, which uses electricity to heat and ultimately destroy tumors, as well as a suction tube for removing debris.

“The idea was to have a device that’s small but that can do all the work a surgeon normally does,” said Simard. “You could place this small robotic device inside a tumor and have it work its way around from within, removing pieces of diseased tissue.”

A key component of the team’s device is its ability to be used while a patient is undergoing MRI. By replacing normal vision with continuously updated MRI, the surgeon is able to visualize deep-seated tumors and monitor the robot’s movement without having to create a large incision in the brain.

In addition to reducing incision size, Simard says the ability to view the brain under continuous MRI also helps surgeons keep track of tumor boundaries throughout an operation. “When we’re operating in a conventional way, we get an MRI on a patient before we do the surgery, and we use landmarks that can either be affixed to the scalp or are part of the skull to know where we are within the patient’s brain. But when the surgeon gets in there and starts to remove the tumor, the tissues shift around so that now the boundaries that were well-established when everything was in place don’t exist anymore, and you’re confronted once again with having to distinguish normal brain from tumor. This is very difficult for a surgeon using direct vision, but with MRI, the ability to discriminate tumor from non-tumor is much more powerful.”

Steve Krosnick, M.D., a program director at NIBIB, says real-time MRI guidance during brain tumor surgery would be a tremendous advantage. “Unlike pre-operative MRI or intermittent MRI, which requires interruption of the surgical procedure, real-time intra-operative MRI offers rapid delineation of normal tissue from tumor while accounting for brain shifts that occur during surgery.”

But designing a neurosurgical device that can be used inside an MRI magnet is no easy task. One of the first issues you have to consider, said Gullapalli, is a surgeon’s access to the brain. “When you scan a person’s brain during an MRI, he’s deep inside the machine’s tunnel. The problem is, how do you get your hands on the brain while the patient’s in the scanner?”

The team’s solution was to give the surgeon robotic control of the device in order to circumvent the need to access the brain directly. In other words, a surgeon can insert the robot into the brain while the patient is outside of the scanner. Then, when the patient moves into the scanner, the surgeon can sit in a different room and, while watching MRI images of the brain on a monitor, move the robot deep inside the brain and direct it to electrocauterize and aspirate the tissue.

Jaydev Desai, the team’s mechanical engineer, says the most challenging aspect of the project has been designing a robot that can be controlled inside the magnetic field of an MRI. While robots are often controlled via electromagnetic motors, this was not an option because, besides being magnetic, these motors create significant image distortion, making it impossible for the surgeon to perform the task. Other potential mechanisms such as hydraulic systems were off the table due to concerns about fluid leakage.

Instead, Desai decided to use shape memory alloy (SMA)—a material that alters its shape in response to changes in temperature—to control the robot’s movement. The most recent prototype—developed by Desai and his team at the Robotics, Automation, and Medical Systems (RAMS) laboratory at the University of Maryland, College Park—uses a system of cables, pulleys, and SMA springs. This cable-and-pulley system is an improvement over their previous prototype, which caused some image distortion.

image

Image: The newest prototype for the minimally invasive neurosurgical intracranial robot uses a system of pulleys and springs to move the robot. Source: Jaydev Desai, University of Maryland

With continued support from NIBIB, Desai and colleagues are now working to further reduce image distortion and to test the safety and efficacy of their device in swine as well as in human cadavers. Though it will be several years before their device finds its way into the operating room, Simard is excited by the prospect. “Advancing brain surgery to this level where tiny machines or robots could navigate inside people’s heads while being directed by neurosurgeons with the help of MRI imaging…It’s beyond anything that most people dream of.”

Scoping the brain

On the opposite side of the country, a different group of engineers and neurosurgeons is also working to develop an image-guided, robotically controlled neurosurgical tool. Led by Eric Seibel, Ph.D., a professor of mechanical engineering at the University of Washington, the team is attempting to adapt a scanning fiber endoscope—a tool initially developed by Seibel to image inside the narrow bile ducts of the liver—so that it can be used to visualize the brain during surgery.

An endoscope is a thin, tube-like instrument with a video camera attached to its end that can be inserted through a small incision or natural opening in the body to produce real-time video during surgery. Endoscopes are an essential component of minimally invasive surgeries because they allow surgeons to view the inside of the body on a monitor without having to make a large incision.

However, there are many parts of the body such as small vessels and ducts as well as areas deep in the brain that are inaccessible to conventional endoscopes. Although ultrathin endoscopes have recently been developed, Seibel says these smaller scopes come with the price of greatly reduced image resolution.

“Right now, with the current state of the art ultrathin endoscopes, I calculate based on the field of view and their resolution that the person looking at that display would see so little as to be classified in the US as legally blind,” said Seibel.

image

Image: Microfabricated optical fiber scanner emitting red laser light, with scan amplitude of 1 mm peak-to-peak. Image courtesy of Eric Seibel, University of Washington

But with support from NIBIB over ten years ago, Seibel began working on a new type of endoscope that could fit into tiny crevices in the body while retaining high image quality. His end product was a new type of endoscope that, despite having the diameter of a toothpick, can provide doctors with microscopic views of the inside of the body.

Seibel retained image quality while significantly reducing the size of his scope by eschewing traditional endoscope designs. Instead of a light source and a video camera, Seibel’s scope consists of a single optical fiber—approximately the size of a human hair—located in the middle of the scope. The fiber releases white laser light (a combination of red, green, and blue lasers) when vibrated at a particular frequency. A series of lenses in the scope spreads the laser light widely within the body, providing a 100-degree field of view. As the white laser light interacts with tissue, it picks up coloration and scatters back to a ring of additional optical fibers, which transmit this information to a monitor.
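To make the scanning idea concrete, here is a hypothetical sketch (not Seibel’s actual software) of how an image could be assembled from such a scan: the fiber tip sweeps a spiral, each time sample carries the red, green, and blue intensities scattered back at that instant, and each sample’s spiral position maps onto a pixel grid. All parameters and data here are illustrative assumptions.

```python
# Hypothetical sketch: reconstruct an image from a spiral fiber scan.
# The spiral geometry, grid size, and sample data are illustrative.
import math

def spiral_position(t, turns=3.0, t_max=1.0):
    """Position of the fiber tip at time t on an outward spiral,
    normalized so the radius runs from 0 to 1 over the scan."""
    r = t / t_max
    theta = 2 * math.pi * turns * r
    return r * math.cos(theta), r * math.sin(theta)

def assemble_image(samples, size=8):
    """samples: list of (t, (r, g, b)) readings. Returns a size x size
    grid of RGB tuples; unsampled pixels stay black."""
    img = [[(0, 0, 0)] * size for _ in range(size)]
    for t, rgb in samples:
        x, y = spiral_position(t)
        # map [-1, 1] coordinates onto pixel indices
        col = min(size - 1, int((x + 1) / 2 * size))
        row = min(size - 1, int((y + 1) / 2 * size))
        img[row][col] = rgb
    return img

# Toy scan: 100 time samples with placeholder intensities
samples = [(i / 100.0, (i, i, i)) for i in range(100)]
img = assemble_image(samples)
```

The real device scans far more densely and at video rate; the point of the sketch is only that each backscattered reading is tied to a known tip position, which is what lets a single hair-thin fiber stand in for a camera.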

“It’s almost like putting your eyes inside the body so you can see with the wide field view of your human vision,” said Seibel.

In collaboration with three neurosurgeons and an electrical engineer, Seibel is now working to secure his novel endoscope to the tip of a robotically controlled micro-dissection neurosurgical tool.

As opposed to larger traditional endoscopes, Seibel says his scanning fiber endoscope is barely noticeable.

“It’s like a piece of wet spaghetti,” said Seibel. “It’s even smaller than a piece of wet spaghetti in diameter, but it feels like that. So when it is actually at the tip of the surgeon’s tool, the surgeon wouldn’t feel it dragging behind her.”

One advantage of having the endoscope under robotic control is that the brain can be imaged at a higher magnification.

“A surgeon couldn’t hold a microscope steady in her hand while performing surgery, but the robot can,” said Seibel.

Microscopic detail is essential when trying to determine the border between healthy tissue—which if removed could lead to neurological deficits—and cancerous tissue—which if left in the brain could allow a tumor to return.

Krosnick says he’s excited by the combination of high-quality imaging and robot-enabled micro-neurosurgery. “It addresses a critical need, which is to discern tumor margins at high resolution while minimizing disruption to normal structures.”

Seibel believes this discrimination between cancerous and healthy tissue could be enhanced even further by taking advantage of the fact that his scanning endoscope is also able to detect fluorescence. One of the main focuses of his current research is a collaboration with Jim Olson, M.D., Ph.D. at the Fred Hutchinson Cancer Research Center, who is the inventor of a substance called “tumor paint”.

Tumor paint is a fluorescent probe that attaches to cancerous but not healthy cells when injected into the body. Seibel says the ultimate goal would be to give a patient an injection of tumor paint and then use his endoscope to create an image of the fluorescing cancer cells as well as a colored anatomic image of the brain. The two images could then be merged on a screen for the surgeon to view during an operation. “You would be able to see all the structure that a surgeon would see, but you’d also see those molecular pinpoints of light that are cancer cells…and from there the robot can be used to resect, or remove, these small cells of cancer, and it can do it very precisely because you don’t have the shaking of a human holding it.”

image

Image: Tumor paint is made of a compound extracted from scorpion venom that can travel through the blood brain barrier and bind specifically to tumor cells. Source: iStockphoto

Seibel concluded by saying, “There’s a real niche for video-quality, high-resolution, multi-modal imaging that’s in a tiny package so that it can be put on microscopic tools for minimally invasive medicine. I really feel it’s an enabling technology that could move the whole field forward.”

Krosnick is enthusiastic about the progress the two teams have made so far. “These are innovative technologies that, if effective, could significantly add to the brain surgery armamentarium. They’re still early in development, but I think both show considerable promise.” He concluded by emphasizing that, like all new devices, these technologies would need to undergo a series of clinical trials to ensure that they are safe and effective before making their way into an operating room.

Sep 6, 2013 · 119 notes
#brain tumors #robotics #glioblastoma #neurosurgery #neuroscience #science
TB and Parkinson’s Disease Linked By Unique Protein

UCSF Researchers Seek Way to Boost Parkin to Fight Both Diseases

A protein at the center of Parkinson’s disease research now also has been found to play a key role in causing the destruction of bacteria that cause tuberculosis, according to scientists led by UC San Francisco microbiologist and tuberculosis expert Jeffery Cox, PhD.

The protein, named Parkin, already is the focus of intense investigation in Parkinson’s disease, in which its malfunction is associated with a loss of nerve cells. Cox and colleagues now report that Parkin also acts on tuberculosis, triggering destruction of the bacteria by immune cells known as macrophages. Results appear online today (September 4, 2013) in the journal Nature.

The finding suggests that disease-fighting strategies already under investigation in pre-clinical studies for Parkinson’s disease might also prove useful in fighting tuberculosis, according to Cox. Cox is investigating ways to ramp up Parkin activity in mice infected with tuberculosis using a strategy similar to one being explored by his UCSF colleague Kevan Shokat, PhD, as a way to ward off neurodegeneration in Parkinson’s disease.

Globally, tuberculosis kills 1.4 million people each year, spreading from person to person through the air. Parkinson’s disease, the most common neurodegenerative movement disorder, also affects millions of mostly elderly people worldwide.

Cox homed in on the enzyme Parkin as a common element in Parkinson’s and tuberculosis through his investigations of how macrophages engulf and destroy bacteria. In a sense the macrophage — which translates from Greek as “big eater” — gobbles down foreign bacteria, through a process scientists call xenophagy.

Mycobacterium tuberculosis, along with a few other types of bacteria, including Salmonella and leprosy-causing Mycobacterium leprae, are different from other kinds of bacteria in that, like viruses, they need to get inside cells to mount a successful infection.

The battle between macrophage and mycobacterium can be especially intense. M. tuberculosis invades the macrophage, but then becomes engulfed in a sac within the macrophage that is pinched off from the cell’s outer membrane. The bacteria often escape this intracellular jail by secreting a protein that degrades the sac, only to be targeted yet again by molecular chains made from a protein called ubiquitin. Previously, Cox discovered molecules that escort these chained mycobacteria to more secure confinement within compartments inside cells called lysosomes, where the bacteria are destroyed.

The cells of non-bacterial organisms ranging in complexity from baker’s yeast to humans also use a similar mechanism — called autophagy — to dispose of their own unneeded molecules or worn out cellular components. Among the most abundant and crucial of these components are the cell’s mitochondria, metabolic powerhouses that convert food molecules into a source of energy that the cell can readily use to carry out its everyday housekeeping chores, as well as its more specialized functions.

Like other cellular components, mitochondria can wear out and malfunction, and often require replacement. The process through which mitochondria are disposed of, called mitophagy, depends on Parkin.

Cox became curious about the enzyme when he learned that specific, naturally occurring variations in the Parkin gene, called polymorphisms, are associated with increased susceptibility to tuberculosis infection.

“Because of the commonalities between mitophagy and the xenophagy of intracellular mycobacteria, as well as the links between Parkin gene polymorphisms and increased susceptibility to bacterial infection in humans, we speculated that Parkin may also be recruited to M. tuberculosis and target it for xenophagy,” Cox said.

In both mouse and human macrophages infected with M. tuberculosis in the lab, Parkin played a key role in fighting the bacteria, Cox and colleagues found. In addition, genetically engineered mice lacking Parkin died when infected with M. tuberculosis, while mice with normal Parkin survived infection.

The involvement of Parkin in targeting both damaged mitochondria and infectious mycobacteria arose long ago in evolution, Cox argues. As part of the Nature study, the research team found that Parkin-deficient mice and flies – creatures quite distant from humans in evolutionary time – also are more sensitive than normal mice and flies to intracellular bacterial infections.

Looking back more than 1 billion years, Cox noted that mitochondria evolved from bacteria that were taken up by cells in a symbiotic relationship.

In the same way that the immune system recognizes infectious bacteria as foreign, Cox said, “The evolutionary origin of mitochondria from bacteria suggests that perhaps mitochondrial dysfunction triggers the recognition of a mitochondrion as non-self.”

Having now demonstrated the importance of Parkin in fighting mycobacterial infection, Cox has begun working with Shokat to find a way to boost Parkin activity against cell-invading bacteria. “We are exploring the possibility that small-molecule drugs could be developed to activate Parkin to better fight tuberculosis infection,” Cox said.

Sep 5, 2013 · 65 notes
#parkinson's disease #tuberculosis #parkin protein #macrophages #lysosomes #medicine #neuroscience #science
Discovery helps to unlock brain’s speech-learning mechanism

USC scientists have discovered a population of neurons in the brains of juvenile songbirds that are necessary for allowing the birds to recognize the vocal sounds they are learning to imitate.

image

These neurons encode a memory of learned vocal sounds and form a crucial (and hitherto only theorized) part of the neural system that allows songbirds to hear, imitate and learn their species’ songs — just as human infants acquire speech sounds.

The discovery will allow scientists to uncover the exact neural mechanisms that allow songbirds to hear their own self-produced songs, compare them to the memory of the song that they are trying to imitate and then adjust their vocalizations accordingly.

Because this brain-behavior system is thought to be a model for how human infants learn to speak, understanding it could prove crucial to future understanding and treatment of language disorders in children. In both songbirds and humans, feedback of self-produced vocalizations is compared to memorized vocal sounds and progressively refined to achieve a correct imitation.

“Every neurodevelopmental disorder you can think of — including Tourette syndrome, autism and Rett syndrome — entails in some way a breakdown in auditory processing and vocal communication,” said Sarah Bottjer, senior author of an article on the research that appears in the Journal of Neuroscience on Sept. 4. “Understanding mechanisms of vocal learning at a cellular level is a huge step toward being able to someday address the biological issues behind the behavioral issues.”

Bottjer, professor of neurobiology at the USC Dornsife College of Letters, Arts and Sciences, collaborated with lead author Jennifer Achiro, a graduate student at USC, to record the activity of individual neurons in songbirds’ brains using electrodes.

In the basal ganglia — a complex system of neurons in the brain responsible for, among other things, procedural learning — Bottjer and Achiro were able to isolate two different types of neurons in young songbirds: ones that were activated only when the birds heard themselves singing and others that were activated only when the birds heard the songs of adult birds that they were trying to imitate.

The two sets of neurons allow the songbirds to recognize both their current behavior and a goal behavior that they would like to achieve.

“The process of learning speech requires the brain to compare feedback of current vocal behavior to a memory of target vocal sounds,” Achiro said. “The discovery of these two distinct populations of neurons means that this brain region contains separate neural representation of current and goal behaviors. Now, for the first time, we can test how these two neural representations are compared so that correct matches between the two are somehow rewarded.”

The next step for scientists will be to learn how the brain rewards correct matches between feedback of current vocal behavior and the memorized goal song as songbirds bring their current behavior closer to their goal, Bottjer said.

Sep 5, 2013 · 86 notes
#songbirds #neural activity #basal ganglia #vocal learning #speech #neuroscience #science
New laser-based tool could dramatically improve the accuracy of brain tumor surgery

Imaging technique tells tumor tissue from normal tissue, could be used in operating room for real-time guidance of surgery

A new laser-based technology may make brain tumor surgery much more accurate, allowing surgeons to tell cancer tissue from normal brain at the microscopic level while they are operating, and avoid leaving behind cells that could spawn a new tumor.

image

This image of a human glioblastoma brain tumor in the brain of a mouse was made with stimulated Raman scattering, or SRS, microscopy. The technique allows the tumor (blue) to be easily distinguished from normal tissue (green) based on faint signals emitted by tissue with different cellular structures.

In a new paper, featured on the cover of the journal Science Translational Medicine, a team of University of Michigan Medical School and Harvard University researchers describes how the technique allows them to “see” the tiniest areas of tumor cells in brain tissue.

They used this technique to distinguish tumor from healthy tissue in the brains of living mice — and then showed that the same was possible in tissue removed from a patient with glioblastoma multiforme, one of the most deadly brain tumors.

Now, the team is working to develop the approach, called SRS microscopy, for use during an operation to guide them in removing tissue, and test it in a clinical trial at U-M. The work was funded by the National Institutes of Health.

A need for improvement in tumor removal

On average, patients diagnosed with glioblastoma multiforme live only 18 months after diagnosis. Surgery is one of the most effective treatments for such tumors, but less than a quarter of patients’ operations achieve the best possible results, according to a study published last fall in the Journal of Neurosurgery.

“Though brain tumor surgery has advanced in many ways, survival for many patients is still poor, in part because surgeons can’t be sure that they’ve removed all tumor tissue before the operation is over,” says co-lead author Daniel Orringer, M.D., a lecturer in the U-M Department of Neurosurgery who has worked with the Harvard team since a chance meeting with a team member during his U-M residency.

image

On the left, the view of the brain that neurosurgeons currently see during an operation using bright-field microscopy. On the right, an SRS microscopy view of the same area of brain - in this case, a mouse brain that has had human brain tumor tissue transplanted into it. SRS might someday allow surgeons to see this same view of patients’ brains.

“We need better tools for visualizing tumor during surgery, and SRS microscopy is highly promising,” he continues. “With SRS we can see something that’s invisible through conventional surgical microscopy.”

The SRS in the technique’s name stands for stimulated Raman scattering. Named for C.V. Raman, the Indian scientist who co-discovered the effect and won the 1930 Nobel Prize in physics for it, Raman scattering allows researchers to measure the unique chemical signature of materials.

In the SRS technique, they can detect a weak light signal that comes out of a material after it’s hit with light from a non-invasive laser. By carefully analyzing the spectrum of colors in the light signal, the researchers can tell a lot about the chemical makeup of the sample.

Over the past 15 years, Sunney Xie, Ph.D., of the Department of Chemistry and Chemical Biology at Harvard University, the senior author of the new paper, has advanced the technique for high-speed chemical imaging. By amplifying the weak Raman signal by more than 10,000 times, it is now possible to make multicolor SRS images of living tissue or other materials. The team can even make 30 new images every second — the rate needed to create videos of the tissue in real time.

Seeing the brain’s microscopic architecture

A multidisciplinary team of chemists, neurosurgeons, pathologists and others worked to develop and test the tool. The new paper is the first time SRS microscopy has been used in a living organism to see the “margin” of a tumor – the boundary area where tumor cells infiltrate among normal cells. That’s the hardest area for a surgeon to operate in – especially when a tumor has invaded a region with an important function.

As the images in the paper show, the technique can distinguish brain tumor from normal tissue with remarkable accuracy, by detecting the difference between the signal given off by the dense cellular structure of tumor tissue, and the normal healthy grey and white matter.
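As a rough illustration of this kind of two-channel discrimination, the sketch below classifies pixels by the ratio of a protein-like signal to a lipid-like signal, on the idea that dense tumor tissue looks different from normal grey and white matter. The channel names, threshold, and data are assumptions for the demo, not values from the paper.

```python
# Illustrative sketch only: classify pixels of a two-channel image
# by their channel ratio. Threshold and channel names are assumed.

def classify_pixel(lipid, protein, ratio_threshold=1.0):
    """Return 'tumor' when the protein-like signal dominates the
    lipid-like signal (suggesting dense cellularity), else 'normal'."""
    if lipid <= 0:
        return "tumor" if protein > 0 else "normal"
    return "tumor" if protein / lipid > ratio_threshold else "normal"

def classify_image(lipid_img, protein_img):
    """Apply the per-pixel rule across two same-sized 2-D grids."""
    return [
        [classify_pixel(l, p) for l, p in zip(lrow, prow)]
        for lrow, prow in zip(lipid_img, protein_img)
    ]

# Toy 2x2 input: left column lipid-rich (normal-like),
# right column protein-rich (tumor-like)
lipid = [[2.0, 0.5], [1.5, 0.2]]
protein = [[1.0, 1.2], [0.5, 0.9]]
labels = classify_image(lipid, protein)
```

The actual published analysis is far more sophisticated; the sketch only conveys why a chemically specific signal makes per-pixel tumor/normal labeling possible at all, where plain bright-field vision cannot.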

The authors suggest that SRS microscopy may be as accurate for detecting tumor as the approach currently used in brain tumor diagnosis – called H&E staining.

image

This image shows the same areas of brain, imaged with SRS microscopy (left) and conventional H&E staining, which is the current technique used to diagnose brain tumors at the tissue level. The research suggests that SRS microscopy could be as accurate as H&E staining in allowing doctors to see tumors - without having to remove tissue or inject dyes into the patient.

The paper contains data from a test that pitted H&E staining directly against SRS microscopy. Three surgical pathologists, trained in studying brain tissue and spotting tumor cells, had nearly the same level of accuracy no matter which images they studied. But unlike H&E staining, SRS microscopy can be done in real time, and without dyeing, removing or processing the tissue.

Next steps: A smaller laser, a clinical trial

The current SRS microscopy system is not yet small or stable enough to use in an operating room. The team is collaborating with a start-up company formed by members of Xie’s group, called Invenio Imaging Inc., which is developing a laser to perform SRS through inexpensive fiber-optic components. The team is also working with AdvancedMEMS Inc. to reduce the size of the probe that makes the images possible.

A validation study, to examine tissue removed from consenting U-M brain tumor patients, may begin as soon as next year.

Sep 5, 2013 · 70 notes
#brain tumor #glioblastoma #brain tissue #neuroimaging #SRS microscopy #neuroscience #science
Brain study uncovers vital clue in bid to beat epilepsy

People with epilepsy could be helped by new research into the way a key molecule controls brain activity during a seizure.

Researchers have identified the role played by a protein – called BDNF – and say the discovery could lead to new drugs that calm the symptoms of epileptic seizures.

Scientists analysed the way cells communicate when the brain is most active – such as in epileptic seizures – when electrical signalling by the brain’s neurons is increased.

They found that the BDNF molecule – which is known to be released in the brain during seizures – blocks a specific process known as activity-dependent bulk endocytosis (ADBE).

By blocking this process during an epileptic seizure, BDNF increases the release of neurotransmitters and causes heightened electrical activity in the brain.

Since ADBE is only triggered during high brain activity, drugs designed to target this process could have fewer side effects for normal day to day brain function, researchers say.

Experts say that not all epilepsy patients respond to current drug treatments and the finding could lead to the development of new medicines.

The team, however, offered a word of caution. Since ADBE is also implicated in a range of brain functions, such as creating new memories, more research is needed to establish how manipulating this process might affect them.

The study, led by the University of Edinburgh, is published in the journal Nature Communications. The research was funded by the Wellcome Trust and the Medical Research Council.

Dr Mike Cousin, of the University of Edinburgh’s Centre for Integrative Physiology, who led the research, said: “Around one third of people with epilepsy do not respond to the treatments we currently have available. By studying the way brain cells behave during seizures, we have been able to uncover an exciting new avenue for research into anti-epileptic therapies.”

Researchers will now focus on identifying specific genes that control this brain process to determine whether they hold the key to new drug treatments.

Sep 4, 2013 · 62 notes
#epilepsy #seizures #BDNF #activity-dependent bulk endocytosis #brain activity #neuroscience #science
Scientists fish for new epilepsy model and reel in potential drug

NIH-funded study finds zebrafish model may help identify treatments for a severe form of childhood epilepsy

image

According to new research on epilepsy, zebrafish have certainly earned their stripes. Results of a study in Nature Communications suggest that zebrafish carrying a specific mutation may help researchers discover treatments for Dravet syndrome (DS), a severe form of pediatric epilepsy that results in drug-resistant seizures and developmental delays.

Scott C. Baraban, Ph.D., and his colleagues at the University of California, San Francisco (UCSF), carefully assessed whether the mutated zebrafish could serve as a model for DS, and then developed a new screening method to quickly identify potential treatments for DS using these fish. The study was supported by the National Institute of Neurological Disorders and Stroke (NINDS), part of the National Institutes of Health, and builds on pioneering zebrafish epilepsy models first described by the Baraban laboratory in 2005.

Dravet syndrome is commonly caused by a mutation in the Scn1a gene, which encodes Nav1.1, a specific sodium ion channel found in the brain. Sodium ion channels are critical for communication between brain cells and proper brain functioning.

The researchers found that the zebrafish that were engineered to have the Scn1a mutation that causes DS in humans exhibited some of the same characteristics, such as spontaneous seizures, commonly seen in children with DS. Unprovoked seizure activity in the mutant fish resulted in hyperactivity and whole-body convulsions associated with very fast swimming. These types of behaviors are not seen in normal healthy zebrafish.

“We were also surprised at how similar the mutant zebrafish drug profile was to that of Dravet patients,” said Dr. Baraban. “Antiepileptic drugs shown to have some benefits in patients (such as benzodiazepines or stiripentol) also exhibited some antiepileptic activity in these mutants. Conversely, many of the antiepileptic drugs that do not reduce seizures in these patients showed no effect in the mutant zebrafish.”

In this study, the researchers developed a fast and automated drug screen to quickly test the effectiveness of various compounds in mutant zebrafish. The researchers tracked behavior and measured brain activity in the mutant zebrafish to determine if the compounds had an impact on seizures.

“Scn1a mutants seize often, so it is relatively easy to monitor their seizure behavior at baseline and then again after a drug application,” said Dr. Baraban. “Using zebrafish placed individually in a 96-well dish, we can accurately quantify this seizure behavior. In this way, we can test almost 100 fish at one time and quickly determine whether a drug candidate has any effect on these spontaneous seizures.”
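To make the screening logic concrete, here is a hypothetical Python sketch of a plate-based behavioral readout of this kind: seizure-like activity is proxied by high-speed swim bouts in each well, measured before and after dosing. The velocity threshold, scoring rule, and data are illustrative assumptions, not values from the UCSF study.

```python
# Hypothetical plate-screen sketch: one fish per well; seizure-like
# activity proxied by frames with very fast swimming. All numbers
# here are illustrative, not from the study.

def seizure_score(velocities, threshold=20.0):
    """Fraction of tracked frames with swim velocity above a
    seizure-like speed threshold (mm/s, illustrative value)."""
    if not velocities:
        return 0.0
    return sum(v > threshold for v in velocities) / len(velocities)

def screen_compound(baseline, post_drug, min_reduction=0.5):
    """Flag a compound as a hit if it cuts the plate-averaged seizure
    score by at least `min_reduction` (fractional) versus baseline."""
    base = sum(seizure_score(w) for w in baseline) / len(baseline)
    post = sum(seizure_score(w) for w in post_drug) / len(post_drug)
    if base == 0:
        return False, 0.0
    reduction = (base - post) / base
    return reduction >= min_reduction, reduction

# Toy plate: 3 wells, per-frame velocities in mm/s
baseline = [[5, 30, 40, 6], [4, 25, 35, 5], [6, 28, 3, 45]]
post_drug = [[5, 6, 7, 4], [4, 5, 30, 5], [6, 4, 3, 5]]
hit, reduction = screen_compound(baseline, post_drug)
```

Because the readout is a simple per-well statistic, a screen like this parallelizes naturally across a whole plate, which is what makes testing hundreds of compounds feasible.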

In the first such application of this approach, UCSF researchers screened 320 compounds and found that clemizole was most effective in inhibiting seizure activity. Clemizole is approved by the U.S. Food and Drug Administration and has a safe toxicology profile. “This finding was completely unexpected. Based on what is currently known about clemizole, we did not predict that it would have antiepileptic effects,” said Dr. Baraban.

These findings suggest that Scn1a mutant zebrafish may serve as a good model of DS and that the drug screen may be effective in quickly identifying novel therapies for epilepsy. 

Dr. Baraban also noted that someday these experiments can be “personalized,” by looking at mutated zebrafish that use genetic information from individual patients. 

Sep 4, 2013 · 51 notes
#Dravet syndrome #epilepsy #zebrafish #ion channels #Scn1a gene #mutations #neuroscience #science
Research confirms Mediterranean diet is good for the mind

The first systematic review of related research confirms a positive impact on cognitive function, but an inconsistent effect on mild cognitive impairment.

image

Over recent years many pieces of research have identified a link between adherence to a Mediterranean diet and a lower risk of age-related diseases such as dementia.

Until now there has been no systematic review of such research, where a number of studies regarding a Mediterranean diet and cognitive function are reviewed for consistencies, common trends and inconsistencies.

A team of researchers from the University of Exeter Medical School, supported by the National Institute for Health Research Collaboration for Leadership in Applied Health Research and Care in the South West Peninsula (NIHR PenCLAHRC), has carried out the first such systematic review and their findings are published in Epidemiology.

The team analysed 12 eligible pieces of research, 11 observational studies and one randomised control trial. In nine out of the 12 studies, a higher adherence to a Mediterranean diet was associated with better cognitive function, lower rates of cognitive decline and a reduced risk of Alzheimer’s disease.

However, results for mild cognitive impairment were inconsistent.

A Mediterranean diet typically consists of higher levels of olive oil, vegetables, fruit and fish. A higher adherence to the diet means higher daily intakes of fruit and vegetables and fish, and reduced intakes of meat and dairy products.

The study was led by researcher Iliana Lourida. She said: “Mediterranean food is both delicious and nutritious, and our systematic review shows it may help to protect the ageing brain by reducing the risk of dementia. While the link between adherence to a Mediterranean diet and dementia risk is not new, ours is the first study to systematically analyse all existing evidence.”

She added: “Our review also highlights inconsistencies in the literature and the need for further research. In particular research is needed to clarify the association with mild cognitive impairment and vascular dementia. It is also important to note that while observational studies provide suggestive evidence we now need randomized controlled trials to confirm whether or not adherence to a Mediterranean diet protects against dementia.”

Sep 4, 2013157 notes
#Mediterranean diet #cognitive function #dementia #cognitive impairment #neuroscience #science
Aging really is ‘in your head’

Scientists answer hotly debated questions about how calorie restriction delays aging process

image

Among scientists, the role of proteins called sirtuins in enhancing longevity has been hotly debated, driven by contradictory results from many different scientists. But new research at Washington University School of Medicine in St. Louis may settle the dispute.

Reporting Sept. 3 in Cell Metabolism, Shin-ichiro Imai, MD, PhD, and his colleagues have identified the mechanism by which a specific sirtuin protein called Sirt1 operates in the brain to bring about a significant delay in aging and an increase in longevity. Both have been associated with a low-calorie diet.

The Japanese philosopher and scientist Ekiken Kaibara first described the concept of dietary control as a method to achieve good health and longevity in 1713. He died the following year at the ripe old age of 84—a long life for someone in the 18th century.

Since then, science has proven a link between a low-calorie diet (without malnutrition) and longevity in a variety of animal models. In the new study, Imai and his team have shown how Sirt1 prompts neural activity in specific areas of the hypothalamus of the brain, which triggers dramatic physical changes in skeletal muscle and increases in vigor and longevity.

“In our studies of mice that express Sirt1 in the brain, we found that the skeletal muscular structures of old mice resemble young muscle tissue,” said Imai. “Twenty-month-old mice (the equivalent of 70-year-old humans) look as active as five-month-olds.”

Imai and his team began their quest to define the critical junctures responsible for the connection between dietary restriction and longevity with the knowledge from previous studies that the Sirt1 protein played a role in delaying aging when calories are restricted. But the specific mechanisms by which it carried out its function were unknown.

Imai’s team studied mice that had been genetically modified to overproduce Sirt1 protein. Some of the mice had been engineered to overproduce Sirt1 in body tissues, while others were engineered to produce more of the Sirt1 protein only in the brain.

“We found that only the mice that overexpressed Sirt1 in the brain (called BRASTO) had significant lifespan extension and delay in aging, just like normal mice reared under dietary restriction regimens,” said Imai, an expert in aging research and a professor in the departments of Developmental Biology and Medicine.

The BRASTO mice demonstrated significant life span extension without undergoing dietary restriction. “They were free to eat regular chow whenever they wished,” he said.

In addition to positive skeletal muscle changes in the BRASTO mice, the investigators also observed significant increases in nighttime physical activity, body temperature and oxygen consumption compared with age-matched controls.

Mice are characteristically most active at night. The BRASTO mice also experienced better or deeper sleep, and both males and females had significant increases in longevity.

The median life span of BRASTO mice in the study was extended by 16 percent for females and 9 percent for males. Translated to humans, this could mean an extra 13 or 14 years for women, making their average life span almost 100 years, Imai said. For men, this would add another seven years, increasing their average life span to the mid-80s.
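The arithmetic behind this translation can be made explicit. A brief sketch follows; the baseline human lifespans used here (86 years for women, 78 for men) are assumptions chosen for illustration, not figures reported by the study:

```python
# Illustrative arithmetic only: the human baseline lifespans are assumptions,
# not values from the Cell Metabolism paper.
def extended_lifespan(baseline_years, extension_pct):
    """Apply a proportional median-lifespan extension to a baseline."""
    gain = baseline_years * extension_pct
    return gain, baseline_years + gain

gain_f, total_f = extended_lifespan(86, 0.16)  # women, 16% extension
gain_m, total_m = extended_lifespan(78, 0.09)  # men, 9% extension
print(f"Women: +{gain_f:.1f} years -> {total_f:.1f}")
print(f"Men:   +{gain_m:.1f} years -> {total_m:.1f}")
```

With those assumed baselines, the numbers land close to the figures quoted in the article: roughly 14 extra years for women (an average near 100) and about seven for men (mid-80s).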

Delay in cancer-dependent death also was observed in the BRASTO mice relative to control mice, the researchers noted.

Imai said that the longevity and health profile associated with the BRASTO mice appears to be the result of a shift in the onset of aging rather than the pace of aging. “What we have observed in BRASTO mice is a delay in the time when age-related decline begins, so while the rate of aging does not change, aging and the risk of cancer have been postponed.”

Having narrowed control of aging to the brain, Imai’s team then traced the control center of aging regulation to two areas of the hypothalamus called the dorsomedial and lateral hypothalamic nuclei. They then were able to identify specific genes within those areas that partner with Sirt1 to kick off the neural signals that elicit the physical and behavioral responses observed.

“We found that overexpression of Sirt1 in the brain leads to an increase in the cellular response of a receptor called orexin type 2 receptor in the two areas of the hypothalamus,” said first author Akiko Satoh, PhD, a postdoctoral staff scientist in Imai’s lab.

“We have demonstrated that the increased response by the receptor initiates signaling from the hypothalamus to skeletal muscles,” said Satoh. She noted that the mechanism by which the signal is specifically directed to skeletal muscle remains to be discovered.

According to Imai, the tight association discovered between Sirt1-prompted brain activation and the regulation of aging and longevity raises the tantalizing possibility of a “control center of aging and longevity” in the brain, which could be manipulated to maintain youthful physiology and extend life span in other mammals as well.

Sep 4, 2013204 notes
#science #aging #calorie restriction #sirtuins #hypothalamus #Sirt1 #neuroscience
Sep 4, 2013125 notes
#sleep #oligodendrocytes #myelin #nerve cells #genes #MS #neuroscience #science
Sep 4, 2013100 notes
#parkinson's disease #brain cells #mitochondria #ursodeoxycholic acid #neuroscience #science
Sep 4, 2013303 notes
#science #tech #neurological disorders #cranial implants #brain imaging #neuroimaging #neuroscience
Sep 4, 2013207 notes
#auditory system #schizophrenia #psychosis #brain circuitry #motor cortex #neuroscience #science
Sep 3, 2013131 notes
#primates #vocalizations #language #categorization #psychology #neuroscience #science
Sep 3, 2013125 notes
#auditory system #auditory attention filter #cochlea #hair cells #neuroscience #science
Sep 3, 201341 notes
#fruit flies #hearing #noise-induced hearing loss #auditory system #neuroscience #science
Administering Natural Substance Spermidine Stopped Dementia

Scientists from Freie Universität Berlin and the University of Graz Have Shown That Feeding Fruit Flies with Spermidine Suppresses Age-dependent Memory Impairment

Age-induced memory impairment can be suppressed by administration of the natural substance spermidine. This was found in a recent study conducted by Prof. Dr. Stephan Sigrist from Freie Universität Berlin and the Neurocure Cluster of Excellence and Prof. Dr. Frank Madeo from Karl-Franzens-Universität Graz. The two biologists were able to show that the endogenous substance spermidine triggers a cellular cleansing process, which is followed by an improvement in the memory performance of older fruit flies. At the molecular level, memory processes in animal organisms such as fruit flies and mice are similar to those in humans. The work by Sigrist and Madeo therefore has potential for developing substances to treat age-related memory impairment. The study was first published in the online version of Nature Neuroscience.

Aggregated proteins are potential candidates for causing age-related dementia. With increasing age, the proteins accumulate in the brains of fruit flies, mice, and humans. In 2009 Madeo’s group in Graz had already found that the spermidine molecule has an anti-aging effect by setting off autophagy, a cleaning process at the cellular level. Protein aggregates and other cellular waste are delivered to lysosomes, the digestive apparatus in cells, and degraded.

Feeding the fruit flies spermidine significantly reduced the amount of protein aggregates in their brains, and their memories improved to juvenile levels. This can be measured because flies can learn under classical Pavlovian conditioning and adjust their behavior accordingly.

In humans, memory capacity decreases beginning around the age of 50, and this loss accelerates with increasing age. Due to increasing life expectancy, age-related memory impairment is expected to become far more common. The spermidine concentration decreases with age in flies as in humans. If it were possible to delay the onset of age-related dementia by giving individuals spermidine as a food supplement, it would be a great breakthrough for individuals and for society. Patient studies are the next step for Sigrist and Madeo.

Sep 2, 201374 notes
#spermidin #fruit flies #memory impairment #dementia #aging #neuroscience #science
Sep 2, 2013364 notes
#science #language #language acquisition #brain activity #fetus #womb #neuroscience
Sep 2, 2013148 notes
#science #language #toolmaking #tool use #brain activity #blood flow #evolution #neuroscience #psychology
Sep 2, 201378 notes
#neurological diseases #microphages #microglia #calcium channel #lysosome #neuroscience #science
Sep 1, 2013147 notes
#brain lateralization #brain hemispheres #cognitive ability #psychology #neuroscience #science
Shutting off Neurons Helps Bullied Mice Overcome Symptoms of Depression

Findings Point to New Potential Drug Target—GABA Neurons—to Treat Patients with Depression and Other Mood Disorders

A new drug target to treat depression and other mood disorders may lie in a group of GABA neurons (gamma-aminobutyric acid –the neurotransmitters which inhibit other cells) shown to contribute to symptoms like social withdrawal and increased anxiety, Penn Medicine researchers report in a new study in the Journal of Neuroscience.

Experts know that people suffering from depression and other mood disorders often react to rejection or bullying by withdrawing socially more than the average person, who takes it in stride, yet the biological processes behind these responses have remained unclear.

Now, a preclinical study from the labs of Olivier Berton, PhD, an assistant professor in the department of Psychiatry, with Collin Challis of the Neuroscience Graduate Group, and Sheryl Beck, PhD, a professor in the department of Anesthesiology at Children’s Hospital of Philadelphia, found that bullying and other social stresses triggered symptoms of depression in mice by activating GABA neurons, in a never-before-seen direct relationship between social stimuli and this neural circuitry. Activation of those neurons, they found, directly inhibited levels of serotonin, long known to play a vital role in behavioral responses; without it, a depressed person is more likely to withdraw socially.

Conversely, when the researchers successfully put the brake on the GABA neurons, mice became more resilient to bullying and no longer avoided once-perceived threats.

“This is the first time that GABA neuron activity—found deep in the brainstem—has been shown to play a key role in the cognitive processes associated with social approach or avoidance behavior in mammals,” said Dr. Berton. “The results help us to understand why current antidepressants may not work for everyone and how to make them work better—by targeting GABA neurons that put the brake on serotonin cells.”

Less serotonin elicits socially defensive responses such as avoidance or submission, whereas enhancement—the main goal of antidepressants—induces a positive shift in the perception of socio-affective stimuli, promoting affiliation and dominance. However, current antidepressants targeting serotonin, like SSRIs, are only effective in about 50 percent of patients.

These new findings point to GABA neurons as a new, neural drug target that could help treat the other patients who don’t respond to today’s treatment.

For the study, “avoidant” mice were exposed to brief bouts of aggression from trained “bully” mice. By comparing gene expression in the brains of resilient and avoidant mice, Berton and colleagues discovered that bullying puts the GABA neurons of avoidant mice into a state in which they become more excitable, and the mice exhibit signs of social defeat. Resilient mice, by contrast, showed no such changes in neuronal excitability or behavior.

To better understand the link between GABA and the development of stress resilience, Berton, Beck, and colleagues also devised an approach to directly manipulate GABA activity: lifting GABA inhibition of serotonin neurons reduced social avoidance and anxiety symptoms in mice exposed to bullies and also fully prevented the neurobiological changes caused by stress.

“Our paper provides a novel cellular understanding of how social defensiveness and social withdrawal develop in mice and gives us a stepping stone to better understand the basis of similar social symptoms in humans,” said Berton. “This has important implications for the understanding and treatment of mood disorders.”

Sep 1, 2013173 notes
#depression #mood disorders #GABA neurons #serotonin #social withdrawal #stress #neuroscience #science
Sep 1, 201391 notes
#alzheimer's disease #frontotemporal dementia #stem cells #iPSCs #tauopathies #medicine #neuroscience #science
Researchers Discover New Way to Track Huntington’s Disease Progression Using PET Scans

Investigators at The Feinstein Institute for Medical Research have discovered a new way to measure the progression of Huntington’s disease, using positron emission tomography (PET) to scan the brains of carriers of the gene. The findings are published in the September issue of The Journal of Clinical Investigation.

Huntington’s disease causes the progressive breakdown of nerve cells in the brain, which leads to impairments in movement, thinking and emotions. Most people with Huntington’s disease develop signs and symptoms in their 40s or 50s, but the onset of disease may be earlier or later in life. Medications are available to help manage the symptoms of Huntington’s disease, but treatments do not prevent the physical, mental and behavioral decline associated with the condition.

Huntington’s disease is an inherited disease, passed from parent to child through a mutation in the normal gene. Each child of a parent with Huntington’s disease has a 50/50 chance of inheriting the Huntington’s disease gene, and a child who inherits the gene will eventually develop the disease. Genetic testing for Huntington’s disease can be performed to determine whether a person carries the gene and is developing the disease even before symptoms appear.

Having this ability provides an opportunity for scientists to study how the disease first develops and how it progresses in its early, presymptomatic stages. Even though a carrier of the Huntington’s disease gene may not have experienced symptoms, changes in the brain have already taken place, which ultimately lead to severe disability. Brain imaging is one tool that could be used to track how quickly Huntington’s disease progresses in gene carriers. Having a better way to track the disease at its earliest stages will make it easier to test drugs designed to delay or even prevent the onset of symptoms.

Researchers at the Feinstein Institute used PET scanning to map changes in brain metabolism in 12 people with the Huntington’s disease gene who had not developed clinical signs of the illness. The researchers scanned the subjects repeatedly over a seven-year period and found a characteristic set (network) of abnormalities in their brains. The network was used to measure the rate of disease progression in the study participants. The Feinstein Institute investigators then confirmed the progression rate through independent measurements in scans from a separate group of Huntington’s disease gene carriers who were studied in the Netherlands. The investigators believe that progression networks similar to the one identified in Huntington’s disease carriers will have an important role in evaluating new drugs for degenerative brain disorders.

“Huntington’s disease is an extremely debilitating disease. The findings make it possible to evaluate the effects of new drugs on disease progression before symptoms actually appear. This is a major advance in the field,” said David Eidelberg, MD, Susan and Leonard Feinstein Professor and head of the Center for Neurosciences at the Feinstein Institute.

Sep 1, 201331 notes
#huntington's disease #brain imaging #PET scan #metabolic network #medicine #neuroscience #science
Why We Look At The Puppet, Not The Ventriloquist

The brain doesn’t require simultaneous visual and audio stimulation to locate the source of a sound

image

As ventriloquists have long known, your eyes can sometimes tell your brain where a sound is coming from more convincingly than your ears can.

A series of experiments in humans and monkeys by Duke University researchers has found that the brain does not require simultaneous visual and audio stimulation to locate the source of a sound. Rather, visual feedback obtained from trying to find a sound with the eyes had a stronger effect than visual stimuli presented at the same time as the audio, according to the Duke study.

The findings could help those with mild hearing loss learn to localize voices better, improving their ability to communicate in noisy environments, said Jennifer Groh, a professor of psychology and neuroscience at Duke.

Locating where a sound is coming from is partially learned with the aid of vision. Researchers sought to learn more about how the brain locates the source of a sound when the source is unclear and there are a number of possible visual matches.

"Our study is related to ventriloquism, in which the visual image of a puppet’s mouth ‘captures’ the sound of the puppeteer’s voice," Groh said. "It is thought that one reason this illusion occurs is because vision normally teaches the brain how to tell where sounds are coming from. We investigated how the brain knows which visual stimulus should capture the location of a sound, such as why it is the puppet’s mouth and not some other visual stimulus."

The study, which appears Thursday (Aug. 29) in the journal PLOS ONE, tested two competing hypotheses. In one, the brain determines the location of a sound based on the simultaneous occurrence of audio and its visual source. In the other, the brain uses a “guess and check” method. In this scenario, visual feedback sent to the brain after the eye focuses on a sound affects how the eye searches for that sound in the future, possibly through the brain’s reward-related circuitry.

In both paradigms, the visual stimulus — an LED — was displaced from the sound. Groh’s team then looked for evidence that the LED caused a persistent mislocation of the sound.

"Surprisingly, we found that visual feedback exerts the more powerful effect on altering localization of sounds," Groh said. "This suggests that the active behavior of looking at the puppet during a ventriloquism performance plays a role in causing the shift in where you hear the voice."

Participants in the study — 11 humans and two rhesus monkeys — shifted their sight to a sound under different visual and audio scenarios.

In one scenario, called the “synchrony-only” task, a visual stimulus appeared at the same time as a sound but too briefly to provide feedback after an eye movement to that sound.

In another, the “feedback-only” task, the visual stimulus appeared during the execution of an eye movement to a sound, but was never on at the same time as the sound.

The study found that the “feedback-only task” exerted a much more powerful effect on the estimation of sound location, as measured with eye tracking, than did the other scenario. This suggests that those who have difficulty localizing sounds may benefit from practice involving eye movements.

On average, participants altered their eye movements in the direction of the lights’ location to a greater degree, about a quarter of the way, when the visual stimulus was presented as feedback than when it was presented at the same time as the sound, the study found.
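This quarter-of-the-way shift can be pictured as a simple linear update of the brain’s estimate of where a sound is. The sketch below is illustrative only; the 0.25 gain echoes the reported average, and the example coordinates are assumptions:

```python
# Illustrative model of the reported shift: the perceived sound location
# moves a fraction of the way toward the visual stimulus.
def updated_estimate(heard_at_deg, seen_at_deg, gain=0.25):
    """Shift the sound-location estimate a fraction ('gain') of the
    distance toward the visual stimulus location (in degrees)."""
    return heard_at_deg + gain * (seen_at_deg - heard_at_deg)

# Sound heard at 0 degrees, LED displaced to 8 degrees:
print(updated_estimate(0.0, 8.0))  # moves a quarter of the way, to 2.0
```

A zero displacement between the LED and the sound leaves the estimate unchanged, which matches the intuition that the visual stimulus only pulls the estimate when the two conflict.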

"This is about the brain’s self-improvement skills," said co-author Daniel Pages, a graduate student in Psychology & Neuroscience at Duke. "What we’re getting at is how the brain uses different types of information to improve how it does its job. In this case, it uses vision coupled with eye movements to improve hearing."

"We were surprised at how important the eye movements were," Groh said. "But finding sounds is really hard. Feedback about your performance is important for anything that is difficult, whether it is the B- you get on your homework or the error your eyes detect in localizing a sound."

Sep 1, 201390 notes
#science #eye movements #visual stimulus #hearing loss #sound location #neuroscience #psychology
Sep 1, 2013105 notes
#nerve cells #intellectual disability #mental retardation #primary cilium #brain development #neuroscience #medicine #science

August 2013

Researchers develop new model to study schizophrenia and other neurological conditions

Schizophrenia is one of the most devastating neurological conditions, with only 30 percent of sufferers ever experiencing full recovery. While current medications can control most psychotic symptoms, their side effects can leave individuals so severely impaired that the disease ranks among the top ten causes of disability in developed countries.

Now, in this week’s issue of the Proceedings of the National Academy of Sciences, Thomas Albright and Ricardo Gil-da-Costa of the Salk Institute for Biological Studies describe a model system that completes the bridge between cellular and human studies of schizophrenia, an advance that should help speed the development of therapeutics for schizophrenia and other neurological disorders.

"Part of the terror of schizophrenia is that the brain can’t properly integrate sensory information, so the world is a disorientating series of unrelated bits of input," says Albright, the Conrad T. Prebys Chair in Vision Research. "We’ve created a model that tests the ability to do sensory integration, which should be extremely useful for pharmaceutical research."

Currently, over 1.1 percent of the world’s population has schizophrenia, with an estimated three million individuals in the United States alone. The economic cost is high: In 2002, Americans spent nearly $63 billion on treatment and managing disability. The emotional cost is higher still: Ten percent of those with schizophrenia are driven to commit suicide by the burden of coping with the disease.

Initially, it was thought that excessive amounts of the neurotransmitter dopamine caused psychotic symptoms, and indeed, current anti-psychotic drugs work by blocking dopamine from entering brain cells. But nearly all of these drugs have severe cognitive side effects, which led researchers to speculate that some other mechanism must also be involved.

A major clue to understanding schizophrenia came with the development of phencyclidine (PCP) in 1956. It was intended to keep patients safely asleep during surgeries, but many woke up with symptoms similar to those experienced by people with schizophrenia, including hallucinations and the disorientation of feeling “dissociated” from their limbs, resulting in PCP being abandoned for clinical purposes. A decade later, it was replaced by a derivative called ketamine. At doses high enough to put patients to sleep, ketamine is an effective anesthetic. At lower doses, it temporarily produces the same schizophrenia-like effects as PCP.

The two drugs are part of a class called N-methyl-D-aspartate (NMDA) receptor antagonists. Essentially, they work by gumming up the mechanism by which glutamate, the main excitatory neurotransmitter, would enter brain cells. Thus, it is clear that glutamate dysfunction accounts for some of the symptoms of psychosis, although that is probably not the full story.

"While dopamine has limited reach in the brain, any dysfunction in glutamate would be expected to have the sort of widespread effects we see in the perceptual disorders associated with schizophrenia," says Albright. "Nevertheless, which neurotransmitter was primary to these disorders—glutamate or dopamine—has been argued about for years."

Standing in the way of a definitive answer was a researcher’s Catch-22: many experiments designed to understand cognitive disorders such as schizophrenia or Alzheimer’s require a participant’s conscious attention, yet these disorders interfere with attention.

To get around this, scientists turned to electroencephalograms (EEGs), which can be used to detect changes in cases where a subject is not consciously paying attention to a stimulus, by recording the brain’s electrical signals through electrodes placed in a scalp cap. In one test, a series of tones is played, but an “oddball” tone breaks the pattern in the sequence. A healthy brain can still easily spot the differences, even if a participant is concentrating on another task, such as reading a magazine.
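The oddball paradigm described above can be sketched in a few lines. The tone labels, the 10 percent oddball probability, and the fixed seed below are illustrative assumptions, not parameters from the study:

```python
import random

def oddball_sequence(n_tones, p_oddball=0.1, standard="A", oddball="B", seed=0):
    """Generate a tone sequence in which a rare 'oddball' tone
    occasionally breaks the pattern set by the frequent standard tone."""
    rng = random.Random(seed)  # seeded for reproducibility
    return [oddball if rng.random() < p_oddball else standard
            for _ in range(n_tones)]

seq = oddball_sequence(20)
print("".join(seq))  # mostly standards with an occasional oddball
```

In an EEG session the brain’s response to the rare tone versus the frequent one is what yields the mismatch signal, even when the listener’s attention is elsewhere.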

"The test works because the brain is a prediction machine: it’s built to anticipate what should come next," says Albright. "If you have healthy working memory, you should be able to perceive a pattern and notice when something violates it, but patients suffering from some mental health disorders lack that basic ability."

In their latest research, Albright’s team detected the difference through two signals, event-related brain potentials called mismatch negativity (MMN) and P3. The MMN reflects differential brain activity to the detected oddball tone, below the level of conscious awareness. P3 picks up the next phase: a subject’s attention orientation to the oddball tone.

Still, a gap in understanding remained. While scientists could do cellular work in animal models on the role of dopamine versus glutamate, and they could run EEGs in human beings, a bridge between the two remained elusive. Such a bridge could advance scientists’ understanding of how healthy and disordered brains work, from the cellular level up to the interactions between brain areas. Moreover, it could enable preclinical and clinical trials that link the cellular and systems levels, opening new therapeutic avenues.

Gil-da-Costa has at last crossed the bridge by crafting the first non-invasive scalp EEG setup that records accurately from the brains of non-human primates, with the same proportional density of electrodes as a human cap and no distortions in signal caused by an incorrect fit. This setup allows him to get accurate measurements of MMN and P3, with the same protocols that are followed in humans. As a result, the lab has come closer than ever before to untangling the roles of dopamine and glutamate.

"While rodents are essential for understanding mechanisms at a cellular or molecular level, at a higher cognitive level, the best you could do was a sort of rough analogy. Now, finally, we can have a one-to-one correspondence," says Gil-da-Costa. "For sensory integration, our findings with this model support the glutamate hypothesis."

Pharmaceutical companies are interested in the model because of the potential for more precise testing and the universality of the MMN/P3 assays. “These brain markers are the same across dozens of neurological diseases, as well as brain trauma, so you can test potential therapies not just for schizophrenia, but for conditions such as Parkinson’s, Alzheimer’s, bipolar disorder, and traumatic brain injuries,” says Gil-da-Costa. “We hope this will help begin a new era in neurological therapeutics.”

Aug 31, 2013118 notes
#schizophrenia #psychosis #glutamate #dopamine #brain activity #neuroscience #science