Posts tagged science

Repairing mitochondria in neurodegenerative disease
The relationship between fine-scale structure and function in the brain is perhaps best explored today by the study of neurodegenerative disease. Disorders like Rett syndrome may be considered developmental in origin, and defined by exotic mechanisms including X-linked inactivation, DNA methylation, and genomic imprinting, but even here the larger physical pathology evolves through the course of life and continues to be revealed almost anywhere researchers look. When diseases directly involve inputs to the brain like vitamins or diet, and can also be controlled by them, things get even more interesting. More often than not, these disorders have a clear genetic component, are frequently linked to the mitochondria, and lead to progressive and often perplexing deficits of movement. One such enigma is known as pantothenate kinase-associated neurodegeneration, or PKAN syndrome in its most frequent form. A recent open-access paper in the journal Brain explains.
This particular syndrome can be caused by any of a hundred or so mutations in the PANK2 gene, which codes for the mitochondrial enzyme pantothenate kinase 2. Of the four nuclear-coded PANK genes, only PANK2 is targeted to the mitochondria. Its protein product is involved in coenzyme A biosynthesis and catalyzes the phosphorylation of pantothenate (vitamin B5). The hallmark pathology, as defined by T2-weighted MRI, can be seen in the globus pallidus and even has its own unique name: the eye-of-the-tiger sign.
The researchers used a mouse model of the disease with a Pank2 double gene knockout. On a standard diet, the mice showed growth issues, azoospermia (lack of sperm), and minor mitochondrial dysfunction, but not some of the other typical issues like iron accumulation in the brain or retinal degeneration. Since coenzyme A is crucial to several metabolic pathways, the researchers also tested the mice on a high-fat ketogenic diet. Under these conditions, ketone bodies produced through fatty acid oxidation bypass the normal glycolytic pathways and feed directly into the citric acid cycle.
On the ketogenic diet, the mitochondria, which were already ailing with abnormal, swollen cristae, fared much worse, losing some cristae entirely. Extensive lipofuscin deposits were also found in these mice, and movement issues were amplified. It had previously been established in other organisms, like flies, that pantethine (a dimeric form of vitamin B5 linked by cysteamine bridging groups) could counteract these issues. When the mice were given pantethine, the general pathology was resolved. In particular, the mitochondria were completely rescued, presumably restored to health or otherwise replaced in the natural course of events.
The researchers also evaluated mitochondrial membrane potential using dye staining methods. In the knockout mice, membrane potential was compromised; however, it was completely restored by pantethine. At present there is no definitive way to predict functional variables, like membrane potential, from the morphology as seen in processed EM tissue. In a recent review of new brain mapping techniques, we discussed this issue, and also pointed to new technologies which may permit closer examinations.
On EM images, one of the most striking features in the interior of mitochondria is the crista junction. This protein structure functionally divides the inner and intermembrane spaces, and controls exchanges between them. While mitochondria come in a variety of forms, the junctions generally converge on a preferred shape and size. Efforts to thermodynamically characterize them in terms of shape entropy have been initiated, as have conceptions of how they evolve as mechanical conditions in the mitochondria change. The so-called “baffle model” of mitochondrial cristae has been entirely replaced by the new cristae junction model, which aims to relate structure to function for these organelles, just as we seek to do on larger scales for the brain.
Several issues in PKAN-type neurodegeneration still stand out like a sore thumb. The iron accumulation is still unexplained, but may be related to another unexplained issue: namely, not only does pantethine fail to cross the blood-brain barrier (BBB), it does not even appear to be working through a vitamin B5 function. When pantethine is metabolized into two pantothenic acid molecules, it also forms two cysteamines. While cysteamine is associated with various side effects, and it can bind and inactivate certain liver enzymes, it also can cross the BBB, perhaps as seen here, to great effect.
The doses necessary for vitamin B5 function are far below those needed here for restorative function. More work is needed to constrain the range of possible mechanisms at play, but in addition to helping find cures for the disease, it will also help cure our ignorance regarding structure-function relations.
Children get plenty of benefits from music lessons. Learning to play instruments can fuel their creativity, and practicing can teach much-needed focus and discipline. And the payoff, whether in learning a new song or just mastering a chord, often boosts self-esteem.
But Harvard researchers now say that one oft-cited benefit — that studying music improves intelligence — is a myth.
Though it has been embraced by everyone from advocates for arts education to parents hoping to encourage their kids to stick with piano lessons, a pair of studies conducted by Samuel Mehr, a Harvard Graduate School of Education (HGSE) doctoral student working in the lab of Elizabeth Spelke, the Marshall L. Berkman Professor of Psychology, found that music training had no effect on the cognitive abilities of young children. The studies are described in a Dec. 11 paper published in the open-access journal PLoS One.
“More than 80 percent of American adults think that music improves children’s grades or intelligence,” Mehr said. “Even in the scientific community, there’s a general belief that music is important for these extrinsic reasons. But there is very little evidence supporting the idea that music classes enhance children’s cognitive development.”
The notion that music training can make someone smarter, Mehr said, can largely be traced to a single study published in Nature. In it, researchers identified what they called the “Mozart effect.” After listening to music, test subjects performed better on spatial tasks.
Though the study was later debunked, the notion that simply listening to music could make someone smarter became firmly embedded in the public imagination, and spurred a host of follow-up studies, including several that focused on the cognitive benefits of music lessons.
Though dozens of studies have explored whether and how music and cognitive skills might be connected, when Mehr and colleagues reviewed the literature they found only five studies that used randomized trials, the gold standard for determining causal effects of educational interventions on child development. Of the five, only one showed an unambiguously positive effect, and it was so small — just a 2.7 point increase in IQ after a year of music lessons — that it was barely enough to be statistically significant.
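To put that 2.7-point figure in perspective, a rough back-of-envelope sketch can show how small it is as a standardized effect and roughly how large a trial would need to be to detect it reliably. The calculation below assumes the conventional IQ-scale standard deviation of 15 and a standard power rule of thumb; none of it comes from the study itself.

```python
# Back-of-envelope sketch (assumed numbers, not the study's own analysis):
# how small a 2.7-point IQ gain is as a standardized effect, and roughly
# how many children per group a trial would need to detect it reliably.

def cohens_d(mean_diff, sd):
    """Standardized effect size (Cohen's d) for a mean difference."""
    return mean_diff / sd

def n_per_group(d):
    """Lehr's rule of thumb: n ~ 16 / d^2 per group gives ~80% power
    for a two-sided test at alpha = 0.05."""
    return 16 / d ** 2

d = cohens_d(2.7, 15)         # IQ scale conventionally has SD 15
print(round(d, 2))            # 0.18 -- a "small" effect by convention
print(round(n_per_group(d)))  # ~494 children per group
```

An effect this small sits near the detection threshold of typical classroom-sized samples, which is consistent with the marginal significance described above.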
“The experimental work on this question is very much in its infancy, but the few published studies on the topic show little evidence for ‘music makes you smarter,’” Mehr said.
To explore the connection between music and cognition, Mehr and his colleagues recruited 29 parents and their 4-year-old children from the Cambridge area. After initial vocabulary tests for the children and music aptitude tests for the parents, each pair was randomly assigned to one of two classes: one that offered music training, and another that focused on visual arts.
“We wanted to test the effects of the type of music education that actually happens in the real world, and we wanted to study the effect in young children, so we implemented a parent-child music enrichment program with preschoolers,” Mehr said. “The goal is to encourage musical play between parents and children in a classroom environment, which gives parents a strong repertoire of musical activities they can continue to use at home with their kids.”
Among the key changes Mehr and his colleagues made from earlier studies were controlling for the effect of different teachers — Mehr taught both the music and visual arts classes — and using assessment tools designed to test four specific areas of cognition: vocabulary, mathematics, and two spatial tasks.
“Instead of using something general, like an IQ test, we tested four specific domains of cognition,” Mehr said. “If there really is an effect of music training on children’s cognition, we should be able to better detect it here than in previous studies, because these tests are more sensitive than tests of general intelligence.”
The study’s results, however, showed no evidence for cognitive benefits of music training.
While the groups performed comparably on vocabulary and number-estimation tasks, the assessments showed that children who received music training performed slightly better at one spatial task, while those who received visual arts training performed better at the other.
“Study One was very small. We only had 15 children in the music group, and 14 in the visual arts,” Mehr said. “The effects were tiny, and their statistical significance was marginal at best. So we attempted to replicate the study, something that hasn’t been done in any of the previous work.”
To replicate the effect, Mehr and colleagues designed a second study that recruited 45 parents and children, half of whom received music training, and half of whom received no training.
Just as in the first study, Mehr said, there was no evidence that music training offered any cognitive benefit. Even when the results of both studies were pooled to allow researchers to compare the effect of music training, visual arts training, and no training, there was no sign that any group outperformed the others.
“There were slight differences in performance between the groups, but none were large enough to be statistically significant,” Mehr said. “Even when we used the finest-grained statistical analyses available to us, the effects just weren’t there.”
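Pooling results across studies, as described above, is commonly done with fixed-effect inverse-variance weighting. A minimal sketch of that idea follows; the effect sizes and standard errors are invented for illustration and are not the paper's actual numbers.

```python
# Minimal sketch of fixed-effect pooling across studies via
# inverse-variance weighting. The effect sizes and standard errors
# below are invented for illustration, not taken from the paper.

def pool_fixed(studies):
    """Pool (effect, standard_error) pairs into one weighted estimate."""
    weights = [1 / se ** 2 for _, se in studies]
    pooled = sum(w * eff for (eff, _), w in zip(studies, weights)) / sum(weights)
    pooled_se = (1 / sum(weights)) ** 0.5
    return pooled, pooled_se

# Two hypothetical small studies, each with a near-zero effect:
pooled, se = pool_fixed([(0.10, 0.35), (0.05, 0.30)])
print(round(pooled, 3), round(se, 3))  # 0.071 0.228 -> z well below 1.96
```

Even combined, two small near-null studies like these yield a pooled estimate whose z-score falls far short of the conventional 1.96 cutoff, mirroring the pooled null result reported here.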
While the results suggest studying music may not be a shortcut to educational success, Mehr said there is still substantial value in music education.
“There’s a compelling case to be made for teaching music that has nothing to do with extrinsic benefits,” he said. “We don’t teach kids Shakespeare because we think it will help them do better on the SATs. We do it because we believe Shakespeare is important.
“Music is an ancient, uniquely human activity. The oldest flutes that have been dug up are 40,000 years old, and human song long preceded that,” he said. “Every single culture in the world has music, including music for children. Music says something about what it means to be human, and it would be crazy not to teach this to our children.”
The act of laughing at a joke is the result of a two-stage process in the brain, first detecting an incongruity before then resolving it with an expression of mirth. The brain actions involved in understanding humour differ between young boys and girls. These are the conclusions reached by a US-based scientist supported by the Swiss National Science Foundation.

Since science has demonstrated that animals are also capable of planning into the future, the once deep cleft between the brain capacities of humans and animals is rapidly disappearing. Fortunately, we can still claim humour as our unique selling point. This makes it even more astonishing that researchers have considered this attribute but fleetingly (and have spent much more time on negative emotions such as fear), write the Swiss neuroscientist Pascal Vrticka and his US colleagues at Stanford University, in the journal “Nature Reviews Neuroscience”.
Strangely cheerful feelings
In their recently published article (*), the researchers demonstrate that, while laughter at a joke requires activity in many different areas of the brain, just two separate elements can be identified among the complex patterns of activity. In the first part, the brain detects a logical incongruity, which, in the second part, it proceeds to resolve. The ensuing feeling of cheerfulness arises from a brain activity that can be clearly differentiated from that of other positive emotions.
Moreover, in the study of 22 children aged between six and thirteen, the research team led by Vrticka showed that sex-specific differences in the processing of humour are formed early on in life. The researchers recorded the children’s brain activity while they were enjoying film clips that were either funny – slapstick home video – or entertaining – such as clips of children break-dancing. On average, the girls’ brains responded more to the funny scenes, while the boys showed greater reaction to the entertaining clips.
Benefits of improved understanding
Vrticka speculates that these sex-based differences could play a role in helping women to select a suitable (and humorous) mate. Aside from this, humour also plays a key role in psychological health. This is demonstrated, among other things, by the fact that adults with psychological disorders such as autism or depression often show altered humour processing and respond less markedly to humour than people who do not have these disorders. Vrticka believes that an improved understanding of the processes that take place in our brain when we enjoy an amusing joke could be of great benefit in the development of treatments.
(Source: alphagalileo.org)
Picturing pain could help unlock its mysteries and lead to better treatments
Understanding the science behind pain, from a simple “ouch” to the chronic and excruciating, has been an elusive goal for centuries. But now, researchers are reporting a promising step toward studying pain in action. In a study published in the Journal of the American Chemical Society, scientists describe the development of a new technique, which they tested in rats, that could result in better ways to relieve pain and monitor healing.
Sandip Biswal, Frederick T. Chin, Justin Du Bois and colleagues note that current ways to diagnose pain basically involve asking the patient if something hurts. These subjective approaches are fraught with bias and can lead doctors in the wrong direction if a patient doesn’t want to talk about the pain or can’t communicate well. It can also be difficult to tell how well a treatment is really working. No existing method can measure pain intensity objectively or help physicians pinpoint the exact location of the pain. Past research has shown an association between pain and a certain kind of protein, called a sodium channel, that helps nerve cells transmit pain and other sensations to the brain. Certain forms of this channel are overproduced at the site of an injury, so the team set out to develop an imaging method to visualize high concentrations of this protein.
They turned to a small molecule called saxitoxin, produced naturally by certain types of microscopic marine creatures, and attached a signal to it so they could trace it by PET imaging. PET scanners are used in hospitals to diagnose diseases and injuries. When the researchers injected the molecule into rats, often a stand-in for humans in lab tests, they saw that the molecule accumulated where the rats had nerve damage. The rats didn’t show signs of toxic side effects. The work is one of the first attempts to mark these sodium channels in a living animal, they say.
A study in mice shows a breakdown of the brain’s blood vessels may amplify or cause problems associated with Alzheimer’s disease. The results published in Nature Communications suggest that blood vessel cells called pericytes may provide novel targets for treatments and diagnoses.

“This study helps show how the brain’s vascular system may contribute to the development of Alzheimer’s disease,” said study leader Berislav V. Zlokovic, M.D., Ph.D., director of the Zilkha Neurogenetic Institute at the Keck School of Medicine of the University of Southern California, Los Angeles. The study was co-funded by the National Institute of Neurological Disorders and Stroke (NINDS) and the National Institute on Aging (NIA), parts of the National Institutes of Health.
Alzheimer’s disease is the leading cause of dementia. It is an age-related disease that gradually erodes a person’s memory, thinking, and ability to perform everyday tasks. Brains from Alzheimer’s patients typically have abnormally high levels of plaques made up of accumulations of beta-amyloid protein next to brain cells, tau protein that clumps together to form neurofibrillary tangles inside neurons, and extensive neuron loss.
Vascular dementias, the second leading cause of dementia, are a diverse group of brain disorders caused by a range of blood vessel problems. Brains from Alzheimer’s patients often show evidence of vascular disease, including ischemic stroke, small hemorrhages, and diffuse white matter disease, plus a buildup of beta-amyloid protein in vessel walls. Furthermore, previous studies suggest that APOE4, a genetic risk factor for Alzheimer’s disease, is linked to brain blood vessel health and integrity.
“This study may provide a better understanding of the overlap between Alzheimer’s disease and vascular dementia,” said Roderick Corriveau, Ph.D., a program director at NINDS.
One hypothesis about Alzheimer’s disease states that increases in beta-amyloid lead to nerve cell damage. This is supported by genetic studies that link familial forms of the disease to mutations in amyloid precursor protein (APP), the larger protein from which plaque-forming beta-amyloid molecules are derived. However, previous studies on mice showed that increased beta-amyloid levels reproduce only some of the problems associated with Alzheimer’s. The animals have memory problems, beta-amyloid plaques in the brain, and vascular damage, but none of the neurofibrillary tangles and neuron loss that are hallmarks of the disease.
In this study, the researchers show that pericytes may be a key to whether increased beta-amyloid leads to tangles and neuron loss.
Pericytes are cells that surround the outside of blood vessels. Many are found in the brain’s plumbing system, the blood-brain barrier. It is a network that exquisitely controls the movement of cells and molecules between the blood and the interstitial fluid that surrounds the brain’s nerve cells. Pericytes work with other blood-brain barrier cells to transport nutrients and waste molecules between the blood and the interstitial brain fluid.
To study how pericytes influence Alzheimer’s disease, Dr. Zlokovic and his colleagues crossbred mice genetically engineered to have a form of APP linked to familial Alzheimer’s with ones that have reduced levels of platelet-derived growth factor beta receptor (PDGFR-beta), a protein known to control pericyte growth and survival. Previous studies showed that PDGFR-beta mutant mice have fewer pericytes than normal, decreased brain blood flow, and damage to the blood-brain barrier.
“Pericytes act like the gatekeepers of the blood-brain barrier,” said Dr. Zlokovic.
Both the APP and PDGFR-beta mutant mice had problems with learning and memory. Crossbreeding the mice slightly enhanced these problems. The mice also had increased beta-amyloid plaque deposition near brain cells and along brain blood vessels. Surprisingly, the brains of the crossbred mice had enhanced neuronal cell death and extensive neurofibrillary tangles in the hippocampus and cerebral cortex, regions that are typically affected during Alzheimer’s.
“Our results suggest that damage to the vascular system may be a critical step in the development of full-blown Alzheimer’s disease pathology,” said Dr. Zlokovic.
Further experiments suggested that pericytes may transport beta-amyloid across the blood-brain barrier into the blood and showed that crossbreeding the mice slowed the rate at which beta-amyloid was cleared away from nerve cells in the brain.
Next, the researchers addressed how beta-amyloid may affect the vascular system. The crossbred mutants had more pericyte death and more damage to the blood-brain barrier than the PDGFR-beta mutant mice, suggesting beta-amyloid may enhance vascular damage. The investigators also confirmed previous findings showing that beta-amyloid accumulation leads to pericyte death.
Dr. Zlokovic and his colleagues concluded that their results support a two-hit vascular hypothesis of Alzheimer’s. The hypothesis states that the toxic effects of increased beta-amyloid deposition on pericytes in aged blood vessels leads to a breakdown of the blood-brain barrier and a reduced ability to clear amyloid from the brain. In turn, the progressive accumulation of beta-amyloid in the brain and death of pericytes may become a damaging feedback loop that causes dementia. If true, then pericytes and other blood-brain barrier cells may be new therapeutic targets for treating Alzheimer’s disease.
(Source: ninds.nih.gov)
If you have ever said or done the wrong thing at the wrong time, you should read this. Neuroscientists at The University of Texas Health Science Center at Houston (UTHealth) and the University of California, San Diego, have successfully demonstrated a technique to enhance a form of self-control through a novel form of brain stimulation.

Study participants were asked to perform a simple behavioral task that required the braking/slowing of action – inhibition – in the brain. In each participant, the researchers first identified the specific location for this brake in the prefrontal region of the brain. Next, they increased activity in this brain region using stimulation with brief and imperceptible electrical charges. This led to increased braking – a form of enhanced self-control.
This proof-of-principle study appears in the Dec. 11 issue of The Journal of Neuroscience and its methods may one day be useful for treating attention deficit hyperactivity disorder (ADHD), Tourette’s syndrome and other severe disorders of self-control.
“There is a circuit in the brain for inhibiting or braking responses,” said Nitin Tandon, M.D., the study’s senior author and associate professor in The Vivian L. Smith Department of Neurosurgery at the UTHealth Medical School. “We believe we are the first to show that we can enhance this braking system with brain stimulation.”
A computer stimulated the prefrontal cortex exactly when braking was needed. This was done using electrodes implanted directly on the brain surface.
When the test was repeated with stimulation of a brain region outside the prefrontal cortex, there was no effect on behavior, showing the effect to be specific to the prefrontal braking system.
This was a double-blind study, meaning that participants and scientists did not know when or where the charges were being administered.
The method of electrical stimulation was novel in that it apparently enhanced prefrontal function, whereas other human brain stimulation studies mostly disrupt normal brain activity. This is the first published human study to enhance prefrontal lobe function using direct electrical stimulation, the researchers report.
The study involved four volunteers with epilepsy who agreed to participate while being monitored for seizures at the Mischer Neuroscience Institute at Memorial Hermann-Texas Medical Center (TMC). Stimulation enhanced braking in all four participants.
Tandon has been working on self-control research with researchers at the University of California, San Diego, for five years. “Our daily life is full of occasions when one must inhibit responses. For example, one must stop speaking when it’s inappropriate to the social context and stop oneself from reaching for extra candy,” said Tandon, who is a neurosurgeon with the Mischer Neuroscience Institute at Memorial Hermann-TMC.
The researchers are quick to point out that while their results are promising, they do not yet point to the ability to improve self-control in general. In particular, this study does not show that direct electrical stimulation is a realistic option for treating human self-control disorders such as obsessive-compulsive disorder, Tourette’s syndrome and borderline personality disorder. Notably, direct electrical stimulation requires an invasive surgical procedure, which is now used only for the localization and treatment of severe epilepsy.
(Source: uth.edu)
A research team from The University of Nottingham has helped uncover a second rare genetic mutation which strongly increases the risk of Alzheimer’s disease in later life.

In an international collaboration, the University’s Translational Cell Sciences Human Genetics research group has pinpointed a rare coding variation in the Phospholipase D3 (PLD3) gene which is more common in people with late-onset Alzheimer’s than non-sufferers.
The discovery is an important milestone on the road to early diagnosis of the disease and eventual improved treatment. Having surveyed the human genome for common variants associated with Alzheimer’s, geneticists are now turning the spotlight on rare mutations which may be even stronger risk factors.
More than 820,000 people in the UK have dementia and the number is rising as the population ages. The condition, of which Alzheimer’s disease is the predominant cause, costs the UK economy £23 billion per year, much more than other diseases like cancer and heart disease.
Nottingham’s genetic experts have been working with long-term partners from Washington University, St Louis, USA and University College, London, to carry out next-generation whole exome sequencing on families where Alzheimer’s affects several members.
Earlier this year the collaboration uncovered the first ever rare genetic mutation implicated in disease risk, linking the TREM2 gene to a higher risk of Alzheimer’s (published in the New England Journal of Medicine). Now, in a new study published today in the international journal Nature, the team reveal that after analysis of the genes of around 2,000 people with Alzheimer’s, a second genetic variation has been found, in the PLD3 gene.
PLD3 influences the processing of amyloid precursor protein, which results in the generation of the characteristic amyloid plaques seen in AD brain tissue, suggesting that it may be a potential therapeutic target.
The international research team used Nottingham’s Alzheimer’s Research UK DNA bank, one of the largest collections of DNA from Alzheimer’s patients, to completely sequence the entire coding region (exome) of the PLD3 gene. The results showed several mutations in the gene occurred more frequently in people who had the disease than in non-sufferers. Carriers of PLD3 coding variants showed a two-fold increased risk for the disease.
Leading the team at Nottingham, Professor of Human Genomics and Molecular Genetics, Kevin Morgan, said:
“This second crucial discovery has confirmed that this latest scientific approach does deliver: it is able to find these clues. However, it also suggests that there are lots more AD-significant variations out there, and before we can use it for diagnosis we need to find all of the other genetic variations involved in Alzheimer’s too.
“Our research is forming the basis of potential diagnostics later on and more importantly it shows pathways that can be diagnostic targets which could lead to therapeutic interventions in the future.
“The next step will be to examine how this particular rare gene variant functions in the cell and see if it can be targeted, to see if there are any benefits to finding out how this gene operates in both normal and diseased cells. If we can do this, we may be able eventually to correct the defect with drug therapy. Here in Nottingham we will keep looking for more rare gene variations.
“Even if we could eventually slow or halt the progress of the disease with new drugs rather than curing it completely, the benefits would be huge in terms of the real impact on patients’ lives and also in vast savings to the health economy. The University of Nottingham has played a significant role in all of the recent AD genetics discoveries that have highlighted 20 new regions of interest in the genome in the last five years, and we will continue to do so into the future.”
Rebecca Wood, Chief Executive of Alzheimer’s Research UK, the UK’s leading dementia research charity, said: “Advances in genetic technology are allowing researchers to understand more than ever about the genetic risk factors for the most common form of Alzheimer’s. This announcement, made just off the back of the G8 dementia research summit, is a timely reminder of the progress that can be made by worldwide collaboration. We know that late-onset Alzheimer’s is caused by a complex mix of risk factors, including both genetic and lifestyle. Understanding all of these risk factors and how they work together to affect someone’s likelihood of developing Alzheimer’s is incredibly important for developing interventions to slow the onset of the disease. Alzheimer’s Research UK is proud to have contributed to this discovery, both by funding researchers and through the establishment of a DNA collection that has been used in many of the recent genetic discoveries in Alzheimer’s.”
(Source: nottingham.ac.uk)
Brain structure shows affinity with numbers
The structure of the brain reflects the way in which we process numbers. People do this either spatially or non-spatially. A study by Florian Krause from the Donders Institute in Nijmegen shows for the first time that these individual differences have a structural basis in the brain. The Journal of Cognitive Neuroscience published the results in an early-access version of the article.
People who process numbers spatially do this using an imaginary horizontal line along which the numbers are arranged from low to high, left to right. A non-spatial representation is also possible, by comparing numbers to other magnitudes such as force or luminosity.
Different grey matter volumes
Florian Krause identified this predisposition to spatial or non-spatial number processing in MRI scans of test subjects. He discovered differences in grey matter volume, which contains the cell bodies of nerve cells, in two specific locations. Spatially oriented brains have an above-average grey matter volume in the right precuneus, a small area of the brain associated with processing visual-spatial information. Non-spatially oriented brains have more grey matter in the left angular gyrus, an area associated with semantic and conceptual processing.
Spatial numbers
For a long time, scientists thought that everyone processed numbers predominantly in a spatial way. Krause demonstrates that this is not the case. In his own words: ‘Our current study stresses the importance of non-spatial number representations. This is important since researchers in the field tend to focus mainly on spatial representations. Personally, I think that numbers are understood in terms of our body experiences. We use information about size in real life to understand number size in our heads.’
Classifying numbers
The thirty people taking part in the study were put into an MRI scanner and were shown numbers between 1 and 9 (except 5). In two consecutive judgement tasks, they had to classify the presented digits as odd or even. The two tasks differed only in the required response: in the spatial task, subjects clicked with their index finger or middle finger to classify the digits, and in the non-spatial task they applied either a small or a large force to a pressure sensor with their thumb. Both tasks were carried out using the right hand. Importantly, participants coupled the spatial response as well as the force response to the size of the presented number: they responded faster with a left or soft press for small numbers and with a right or hard press for large numbers. Krause worked out those couplings for each subject and compared the scores with the information from their brain scans.
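The per-subject coupling described above amounts to a congruency score on reaction times. The sketch below is a hypothetical illustration of that idea (the trial format and reaction times are invented), not the study's actual analysis pipeline.

```python
# Hedged sketch of the per-subject "coupling" idea described above: faster
# responses when the response (side or force) matches the number's size.
# The trial format and reaction times are invented for illustration.

def coupling_score(trials):
    """Mean reaction time on incongruent trials minus congruent trials (ms).
    A positive score means responses are coupled to number size."""
    congruent = [rt for rt, cong in trials if cong]
    incongruent = [rt for rt, cong in trials if not cong]
    return sum(incongruent) / len(incongruent) - sum(congruent) / len(congruent)

# One hypothetical subject: (reaction time in ms, was the trial congruent?)
# Congruent = small number with a left/soft press, large with a right/hard press.
trials = [(480, True), (495, True), (540, False), (555, False)]
print(coupling_score(trials))  # 60.0 ms slower on incongruent trials
```

A score like this, computed per subject, is the kind of behavioral measure that can then be correlated with grey matter volume across participants.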
Potential benefits for teaching maths
At present, maths is largely taught on the basis of spatial number processing. ‘People with a non-spatial representation of numbers would probably benefit from a different approach to maths teaching’, says Krause. ‘It is possible to let pupils experience the size of numbers in a non-spatial way. This could involve expressing numbers with your body while doing simple arithmetic, for example.’ Krause is planning several new studies to explore the scientific basis of methods like these in more detail.
Optogenetics as good as electrical stimulation
Neuroscientists are eagerly, but not always successfully, looking for proof that optogenetics – a celebrated technique in which brain cells are genetically altered so that pulses of visible light can excite or silence them – can be as successful in complex, large brains as it has been in rodent models.
A new study in the journal Current Biology may be the most definitive demonstration yet that the technique can work in nonhuman primates as well as, or even a little better than, the tried-and-true method of perturbing brain circuits with small bursts of electrical current. Brown University researchers directly compared the two techniques to test how well they could influence the visual decision-making behavior of two primates.
“For most of my colleagues in neuroscience to say ‘I’ll be able to incorporate [optogenetics] into my daily work with nonhuman primates,’ you have to get beyond ‘It does seem to sort of work’,” said study senior author David Sheinberg, professor of neuroscience affiliated with the Brown Institute for Brain Science. “In our comparison, one of the nice things is that in some ways we found quite analogous effects between electrical and optical [stimulation] but in the optical case it seemed more focused.”
If it consistently proves safe and effective in the large, complex brains of primates, optogenetics could eventually be used in humans, where it could provide a variety of diagnostic and therapeutic benefits.
Evidence in sight
With that in mind, Sheinberg, lead author Ji Dai and second author Daniel Brooks designed their experiments to determine whether and how much optical or electrical stimulation in a particular area of the brain called the lateral intraparietal area (LIP) would affect each subject’s decision making when presented with a choice between a target and a similar-looking, distracting character.
“This is an area of the brain involved in registering the location of salient objects in the visual world,” said Sheinberg, who added that the experimental task was more cognitively sophisticated than those tested in previous optogenetics experiments in nonhuman primates.
The main task for the subjects was to fixate on a central point in the middle of the screen and then look toward the letter “T” when it appeared around the edge of the screen. In some trials, they had to decide quickly between the T and a similar-looking “+” or “†” character presented on opposite ends of the screen. They were rewarded if they glanced toward the T.
Before beginning those trials, the researchers had carefully placed a very thin combination sensor of an optical fiber and an electrode amid a small population of cells in the LIP of each subject. Then they mapped where on the screen an object should be in order for the researchers to detect a response in those cells. They called that area the receptive field. With this information, they could then look to see what difference either optical or electrical stimulation of those cells would have on the subject’s inclination to look when the T or the distracting character appeared at various locations in visual space.
They found that stimulating with either method increased both subjects’ accuracy in choosing the target when it appeared in their receptive field. They also found the primates became less accurate when the distracting character appeared in their receptive field. Generally accuracy was unchanged when neither character was in the receptive field.
In other words, the stimulation of a particular group of LIP cells significantly biased the subjects to look at objects that appeared in the receptive field associated with those cells. Either stimulation method could therefore make the subjects more accurate or effectively distract them from making the right choice.
The magnitude of the differences made by either stimulation method compared to no stimulation was small but statistically significant. When the T was in the receptive field, one research subject became 10 percentage points more accurate (80 percent vs. 70 percent) when optically stimulated and eight points more accurate when electrically stimulated. When the distracting character was in the receptive field, the same subject was five points less accurate (73 percent vs. 78 percent) with optical stimulation and six points less accurate with electrical stimulation.
The other subject showed similar differences. In all, the two primates made thousands of choices over scores of sessions between the T and the distracting character with either kind of stimulation or none. Compared head-to-head in a statistical analysis, electrical and optical stimulation showed essentially similar effects in biasing the decisions.
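Whether an accuracy change of a few percentage points is reliable depends on how many trials back it up. The sketch below shows one standard way such a comparison could be checked, a two-proportion z-test; the trial counts are hypothetical assumptions chosen for illustration, and this is not a reproduction of the statistics actually used in the paper.

```python
# Hedged sketch: a two-proportion z-test of the kind that could be used
# to check whether an accuracy change (e.g. 80% vs. 70% correct) is
# statistically reliable. Trial counts below are hypothetical.
from math import sqrt, erf

def two_proportion_z(hits1, n1, hits2, n2):
    """z statistic for the difference between two proportions."""
    p1, p2 = hits1 / n1, hits2 / n2
    pooled = (hits1 + hits2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

def two_sided_p(z):
    """Two-sided p-value from the normal approximation."""
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# 400/500 correct with stimulation vs. 350/500 without (hypothetical).
z = two_proportion_z(400, 500, 350, 500)
print(round(z, 2), two_sided_p(z) < 0.05)
```

With hundreds of trials per condition, even a 10-point accuracy difference like the one reported clears conventional significance thresholds, which is why the thousands of trials collected over scores of sessions matter.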
Optical advantages
Although the two methods performed at parity on the main measure of accuracy, the optogenetic method had a couple of advantages, Sheinberg said.
Electrical stimulation appeared to be less precise in the cells it reached, a possibility suggested by a reduction in electrically stimulated subjects’ reaction time when the T appeared outside the receptive field. Optogenetic stimulation, Sheinberg said, did not produce such unintended effects.
Electrical stimulation also makes simultaneous electrical recording very difficult, Sheinberg said. That makes it hard to understand what neurons do when they are stimulated. Optogenetics, he said, allows for easier simultaneous electrical recording of neural activity.
Sheinberg said he is encouraged about using optogenetics to investigate even more sophisticated questions of cognition.
“Our goal is to be able to now expand this and use it again as a daily tool to probe circuits in more complicated paradigms,” Sheinberg said.
He plans a new study in which his group will look at memory of visual cues in the LIP.
Most people – including scientists – assumed we can’t just sniff out danger.
It was thought that we become afraid of an odor – such as leaking gas – only after information about a scary scent is processed by our brain.

But neuroscientists at Rutgers University studying the olfactory system – the sense of smell – in mice have discovered that this fear reaction can occur at the sensory level, even before the brain has the opportunity to interpret that the odor could mean trouble.
In a new study published today in Science, John McGann, associate professor of behavioral and systems neuroscience in the Department of Psychology, and his colleagues, report that neurons in the noses of laboratory animals reacted more strongly to threatening odors before the odor message was sent to the brain.
“What is surprising is that we tend to think of learning as something that only happens deep in the brain after conscious awareness,” says McGann, whose laboratory studies the sense of smell. “But now we see how the nervous system can become especially sensitive to threatening stimuli and that fear-learning can affect the signals passing from sensory organs to the brain.”
McGann and students Marley Kass and Michelle Rosenthal made this discovery by using light to observe activity in the brains of genetically engineered mice through a window in the mouse’s skull. They found that those mice that received an electric shock simultaneously with a specific odor showed an enhanced response to the smell in the cells in the nose, before the message was delivered to the neurons in the brain.
This new research – which indicates that fearful memories can influence the senses – could help to better understand conditions like Post Traumatic Stress Disorder, in which feelings of anxiety and fear exist even though an individual is no longer in danger.
“We know that anxiety disorders like PTSD can sometimes be triggered by smell, like the smell of diesel exhaust for a soldier,” says McGann, who received funding from the National Institute of Mental Health and the National Institute on Deafness and Other Communication Disorders for this research. “What this study does is give us a new way of thinking about how this might happen.”
In their study, the scientists also discovered a heightened sensitivity to odors in the mice traumatized by shock. When these mice smelled the odor associated with the electrical shocks, the amount of neurotransmitter – chemicals that carry communications between nerve cells – released from the olfactory nerve into the brain was as big as if the odor were four times stronger than it actually was.
This created mice whose brains were hypersensitive to the fear-associated odors. Before now, scientists did not think that reward or punishment could influence how the sensory organs process information.
The next step in the continuing research, McGann says, is to determine whether the hypersensitivity to threatening odors can be reversed by using exposure therapy to teach the mice that the electrical shock is no longer associated with a specific odor. This could help develop a better understanding of fear learning that might someday lead to new therapeutic treatments for anxiety disorders in humans, he says.
(Source: news.rutgers.edu)