Neuroscience

June 2014

Researchers Use Human Stem Cells to Create Light-Sensitive Retina in a Dish

Using a type of human stem cell, Johns Hopkins researchers say they have created a three-dimensional complement of human retinal tissue in the laboratory, which notably includes functioning photoreceptor cells capable of responding to light, the first step in the process of converting it into visual images.

(Image caption: Rod photoreceptors (in green) within a “mini retina” derived from human iPS cells in the lab. Image courtesy of Johns Hopkins Medicine)

“We have basically created a miniature human retina in a dish that not only has the architectural organization of the retina but also has the ability to sense light,” says study leader M. Valeria Canto-Soler, Ph.D., an assistant professor of ophthalmology at the Johns Hopkins University School of Medicine. She says the work, reported online June 10 in the journal Nature Communications, “advances opportunities for vision-saving research and may ultimately lead to technologies that restore vision in people with retinal diseases.”

Like many processes in the body, vision depends on many different types of cells working in concert, in this case to turn light into something that can be recognized by the brain as an image. Canto-Soler cautions that photoreceptors are only part of the story in the complex eye-brain process of vision, and her lab hasn’t yet recreated all of the functions of the human eye and its links to the visual cortex of the brain. “Is our lab retina capable of producing a visual signal that the brain can interpret into an image? Probably not, but this is a good start,” she says.

The achievement emerged from experiments with human induced pluripotent stem cells (iPS) and could, eventually, enable genetically engineered retinal cell transplants that halt or even reverse a patient’s march toward blindness, the researchers say.

The iPS cells are adult cells that have been genetically reprogrammed to their most primitive state. Under the right circumstances, they can develop into most or all of the 200 cell types in the human body. In this case, the Johns Hopkins team turned them into retinal progenitor cells destined to form light-sensitive retinal tissue that lines the back of the eye.

Using a simple, straightforward technique they developed to foster the growth of the retinal progenitors, Canto-Soler and her team saw retinal cells and then tissue grow in their petri dishes, says Xiufeng Zhong, Ph.D., a postdoctoral researcher in Canto-Soler’s lab. The growth, she says, corresponded in timing and duration to retinal development in a human fetus in the womb. Moreover, the photoreceptors were mature enough to develop outer segments, a structure essential for photoreceptors to function.

Retinal tissue is complex, comprising seven major cell types, including six kinds of neurons, which are all organized into specific cell layers that absorb and process light, “see,” and transmit those visual signals to the brain for interpretation. The lab-grown retinas recreate the three-dimensional architecture of the human retina. “We knew that a 3-D cellular structure was necessary if we wanted to reproduce functional characteristics of the retina,” says Canto-Soler, “but when we began this work, we didn’t think stem cells would be able to build up a retina almost on their own. In our system, somehow the cells knew what to do.”

When the retinal tissue was at a stage equivalent to 28 weeks of development in the womb, with fairly mature photoreceptors, the researchers tested these mini-retinas to see if the photoreceptors could in fact sense and transform light into visual signals.

They did so by placing an electrode into a single photoreceptor cell and then giving a pulse of light to the cell, which reacted in a biochemical pattern similar to the behavior of photoreceptors in people exposed to light.

Specifically, she says, the lab-grown photoreceptors responded to light the way retinal rods do. Human retinas contain two major photoreceptor cell types called rods and cones. The vast majority of photoreceptors in humans are rods, which enable vision in low light. The retinas grown by the Johns Hopkins team were also dominated by rods.

Canto-Soler says that the newly developed system gives them the ability to generate hundreds of mini-retinas at a time directly from a person affected by a particular retinal disease such as retinitis pigmentosa. This provides a unique biological system to study the cause of retinal diseases directly in human tissue, instead of relying on animal models.

The system, she says, also opens an array of possibilities for personalized medicine such as testing drugs to treat these diseases in a patient-specific way. In the long term, the potential is also there to replace diseased or dead retinal tissue with lab-grown material to restore vision.

Jun 11, 2014 · 89 notes
#stem cells #iPSCs #photoreceptors #retinal tissue #vision #medicine #science
That Sounds Familiar, But Why?

When it comes to familiarity, a slew of memories, including seemingly unrelated ones, can come flooding into the brain, according to mathematical theories called global similarity models.

After conducting an fMRI study on memory and categorization, researchers, including a Texas Tech University psychologist, have shown for the first time that these mathematical models seem to correctly explain processing in the medial temporal lobes, a region of the brain associated with long-term memory that is disrupted by memory disorders like Alzheimer’s disease.

The findings were published in The Journal of Neuroscience.

Tyler Davis, assistant director of Texas Tech’s Neuroimaging Institute and an assistant professor of psychology, specializes in neurobiological approaches to learning and memory. He was part of a team that delved into global similarity models.

“Since at least the 1980s, scientists researching memory have believed that when a person finds someone’s face or a new experience familiar, that person is not simply retrieving a memory of only this previous experience, but memories of many other related and unrelated experiences as well,” Davis said. “Formal mathematical theories of memory called global similarity models suggest that when we judge familiarity, we match an experience, such as a face or a trip to a restaurant, to all of the memories that we have stored in our brains. Our recent work using fMRI suggests these models are correct.”

People may believe when they see someone’s familiar face or take a trip to a familiar restaurant, they only activate the most similar or recent memories for comparison. However, Davis said this is not the case. According to global similarity models, the feeling of familiarity for the taste of brisket at a particular restaurant draws on a spectrum of memories that a person has stored in his or her brain.

Eating the brisket can activate memories not only of a previous trip to that restaurant, but also of the décor, eating brisket at a similar restaurant, what that person’s home-cooked brisket tastes like and even seemingly tangential memories such as a recent trip to another city.

“In terms of global similarity theories and our new findings, the important thing is when you are judging familiarity, your brain doesn’t just retrieve the most relevant memories but many other memories as well,” Davis said. “This seems counter-intuitive to how memory feels. We often feel like we are just retrieving that previous trip to that one particular restaurant when we are asked whether we’d been there before, but there is a lot of behavioral evidence that we activate many other memories as well when we judge familiarity.”

This does not mean that every memory we have stored contributes to familiarity in the same way. The more similar a previous memory is to the current experience, the more it will contribute to judgments of familiarity.

In terms of the brisket example, Davis said, previous trips to the restaurant are going to impact the familiarity more than dissimilar memories, such as the recent trip out of town. However, similarity from these other less-related experiences can have a measurable effect in judgments of familiarity.
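
Davis’s description maps directly onto the summed-similarity rules used in the exemplar-memory literature: every stored memory contributes to felt familiarity, weighted by its similarity to the current experience. A minimal sketch, with purely illustrative feature vectors standing in for memories (the exponential decay of similarity with distance is one common modeling choice, not necessarily the one used in this study):

```python
import math

def familiarity(probe, memories, c=1.0):
    """Summed-similarity familiarity: every stored memory contributes,
    weighted by its similarity to the probe (closer = larger weight)."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    # Similarity decays exponentially with distance (a common modeling choice).
    return sum(math.exp(-c * distance(probe, m)) for m in memories)

# Illustrative "memories" as feature vectors: two similar restaurant visits
# and one unrelated trip. All contribute, but near memories dominate.
memories = [(1.0, 1.0), (1.1, 0.9), (5.0, 5.0)]
probe = (1.0, 1.0)        # a new experience resembling past visits
far_probe = (9.0, 9.0)    # a novel, dissimilar experience
assert familiarity(probe, memories) > familiarity(far_probe, memories)
```

Note that the dissimilar memory still contributes a small amount to the total, which is exactly the point of the global similarity account: familiarity is graded by similarity, not gated by it.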

In his recent research, Davis and others used fMRI to examine how memory similarity related to behavioral measures of familiarity, in terms of activation patterns in the medial temporal lobes.

“We found that people’s memory for the items in our experiments was related to their activation patterns in the medial temporal lobes in a manner that was anticipated by mathematical global similarity models,” Davis said. “The more similar the activation pattern for an item was to all of the other activation patterns, the more strongly people remembered it. This is consistent with global similarity models, which suggest that the items that are most similar to all other items stored in memory will be most familiar.”

The findings suggest that global similarity models may have a neurobiological basis, he said. This is evidence that similarity, in terms of neural processing, may impact memory. People may find things familiar not just because they are identical to things we’ve previously experienced, but because they are similar to a number of things we’ve previously experienced.

Jun 11, 2014 · 160 notes
#neuroimaging #global similarity models #memory #neuroscience #science
Game Technology Teaches Mice and Men to Hear Better in Noisy Environments

The ability to hear soft speech in a noisy environment is difficult for many and nearly impossible for the 48 million in the United States living with hearing loss. Researchers from the Massachusetts Eye and Ear, Harvard Medical School and Harvard University programmed a new type of game that trained both mice and humans to enhance their ability to discriminate soft sounds in noisy backgrounds. Their findings will be published in PNAS Online Early Edition the week of June 9-13, 2014.

In the experiment, adult humans and mice with normal hearing were trained on a rudimentary ‘audiogame’ inspired by sensory foraging behavior that required them to discriminate changes in the loudness of a tone presented in a moderate level of background noise. Their findings suggest new therapeutic options for clinical populations that receive little benefit from conventional sensory rehabilitation strategies.

“Like the children’s game ‘hot and cold’, our game provided instantaneous auditory feedback that allowed our human and mouse subjects to hone in on the location of a hidden target,” said senior author Daniel Polley, Ph.D., director of the Mass. Eye and Ear’s Amelia Peabody Neural Plasticity Unit of the Eaton-Peabody Laboratories and assistant professor of otology and laryngology at Harvard Medical School. “Over the course of training, both species learned adaptive search strategies that allowed them to more efficiently convert noisy, dynamic audio cues into actionable information for finding the target. To our surprise, human subjects who mastered this simple game over the course of 30 minutes of daily training for one month exhibited a generalized improvement in their ability to understand speech in noisy background conditions. Comparable improvements in the processing of speech in high levels of background noise were not observed for control subjects who heard the sounds of the game but did not actually play the game.”
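
The “hot and cold” loop Dr. Polley describes, feedback that grows as the player closes in on a hidden target, can be caricatured in a few lines. The distance-to-loudness mapping and the greedy search strategy below are illustrative assumptions, not the actual game used in the study:

```python
def feedback_level(position, target, max_level=1.0, scale=5.0):
    """Closer to the hidden target -> stronger auditory cue ('hotter')."""
    distance = abs(position - target)
    return max_level / (1.0 + distance / scale)

def search(start, target, step=1.0, max_iters=100):
    """A simple searcher that probes both directions and follows the cue."""
    pos = start
    for _ in range(max_iters):
        if abs(pos - target) <= step:
            return pos  # close enough: target found
        # Move toward whichever direction makes the cue grow.
        if feedback_level(pos + step, target) > feedback_level(pos - step, target):
            pos += step
        else:
            pos -= step
    return pos

assert abs(search(0.0, 42.0) - 42.0) <= 1.0
```

The training effect reported in the study comes from learning to convert this kind of noisy, dynamic cue into an efficient search strategy; the sketch only shows the feedback structure itself.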

The researchers recorded the electrical activity of neurons in auditory regions of the mouse cerebral cortex to gain some insight into how training might have boosted the ability of the brain to separate signal from noise. They found that training substantially altered the way the brain encoded sound.

In trained mice, many neurons became highly sensitive to faint sounds that signaled the location of the target in the game. Moreover, neurons displayed increased resistance to noise suppression; they retained an ability to encode faint sounds even under conditions of elevated background noise.

“Again, changes of this ilk were not observed in control mice that watched (and listened) to their counterparts play the game. Active participation in the training was required; passive listening was not enough,” Dr. Polley said.

These findings illustrate the utility of brain training exercises that are inspired by careful neuroscience research. “When combined with conventional assistive devices such as hearing aids or cochlear implants, ‘audiogames’ of the type we describe here may be able to provide the hearing impaired with an improved ability to reconnect to the auditory world. Of particular interest is the finding that brain training improved speech processing in noisy backgrounds – a listening environment where conventional hearing aids offer limited benefit,” concluded Dr. Jonathon Whitton, lead author on the paper. Dr. Whitton is a principal investigator at the Amelia Peabody Neural Plasticity Unit and affiliated with the Program in Speech Hearing Bioscience and Technology, Harvard–Massachusetts Institute of Technology Division of Health Sciences and Technology.

Jun 10, 2014 · 80 notes
#hearing #hearing loss #auditory cortex #foraging #noise suppression #neuroscience #science
To recover consciousness, brain activity passes through newly detected states

Anesthesia makes otherwise painful procedures possible by derailing a conscious brain, rendering it incapable of sensing or responding to a surgeon’s knife. But little research exists on what happens when the drugs wear off.

(Image caption: Unconscious states. New findings suggest the anesthetized brain must pass through certain ‘way stations’ on the path back to consciousness. Above, the prevalence of particular clusters of brain activity states as recorded in rats that had been administered an anesthetic. The longest appear in red and the shortest in yellow and green.)

“I always found it remarkable that someone can recover from anesthesia, not only that you blink your eyes and can walk around, but you return to being yourself. So if you learned how to do something on Sunday, and on Monday you have surgery, you wake up and you still know how to do it,” says Alexander Proekt, a visiting fellow in Don Pfaff’s Laboratory of Neurobiology and Behavior at Rockefeller University and an anesthesiologist at Weill Cornell Medical College. “It seemed like there ought to be some kind of guide or path for the system to follow.”

The obvious explanation is that as the anesthetic washes out of the body, electrical activity in the brain gradually returns to its conscious patterns. However, new research by Proekt and colleagues suggests the trip back is not so simple.

“Using statistical analysis, our research shows that the recovery from deep anesthesia is not a smooth, linear process. Instead, there are dynamic ‘way stations’ or states of activity the brain must temporarily occupy on the way to full recovery,” Pfaff says. “These results have implications for understanding how someone’s ability to recover consciousness can be disrupted by, for example, brain injury.”

Proekt, along with former postdoc Andrew Hudson, now an assistant professor in anesthesiology at the University of California, Los Angeles, and Diany Paola Calderon, a research associate in the lab, put rats “under” using the common medical and veterinary anesthetic isoflurane. As the rats recovered, the team monitored the electrical potential outside neurons, known as local field potentials (LFPs), in particular parts of the brain known, from previous electrophysiological and pharmacological studies, to be associated with wakefulness and anesthesia. These recordings gave them a sensitive handle on the activities of whole groups of neurons in particular parts of the thalamus and cortex.

In the awake brain, of both humans and rats, neurons generate electrical voltage that oscillates. Many of these oscillations together form a signal that appears as a squiggly line on a recording of brain activity, such as an LFP. When someone is asleep, under anesthesia, or in a coma, these oscillations occur more slowly, or at a low frequency. When he or she is awake, they speed up. The researchers examined the recordings from the rats’ brains to figure out how the electrical activity in these regions changed as they moved from anesthetized to awake.
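
The slow-versus-fast distinction described above can be captured crudely by counting zero crossings of a recorded signal: slower oscillations cross zero fewer times per second. A toy illustration on synthetic sine waves (not the spectral analysis the researchers actually used):

```python
import math

def zero_crossing_rate(signal, sample_rate):
    """Sign changes per second; roughly 2x the dominant frequency
    for a clean oscillation."""
    crossings = sum(
        1 for a, b in zip(signal, signal[1:]) if (a < 0) != (b < 0)
    )
    return crossings * sample_rate / len(signal)

def sine(freq_hz, seconds=1.0, sample_rate=1000):
    n = int(seconds * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate) for i in range(n)]

slow = sine(2)    # ~2 Hz: anesthesia/sleep-like slow oscillation
fast = sine(40)   # ~40 Hz: awake-like fast oscillation
assert zero_crossing_rate(slow, 1000) < zero_crossing_rate(fast, 1000)
```

Real LFPs mix many frequencies at once, which is why the study tracked the full spectral profile of the recordings rather than a single summary number like this one.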

“Recordings from each animal wound up having particular features that spontaneously appeared, suggesting their brain activity was abruptly transitioning through particular states,” Hudson says. “We analyzed the probability of a brain jumping from one state to another, and we found that certain states act as hubs through which the brain must pass to continue on its way to consciousness.” While the electrical activity in all the rats’ brains passed through these hubs, the precise path back to consciousness was not the same each time, the team reports today in the Proceedings of the National Academy of Sciences.
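
The hub analysis Hudson describes can be sketched as follows: from a sequence of discrete activity states, tally the transitions between states and flag states that many distinct states funnel into. The state sequence below is invented purely for illustration:

```python
from collections import Counter, defaultdict

def transition_counts(state_sequence):
    """Count observed transitions between consecutive states."""
    counts = defaultdict(Counter)
    for a, b in zip(state_sequence, state_sequence[1:]):
        counts[a][b] += 1
    return counts

def hub_scores(state_sequence):
    """Score each state by how many *distinct* other states transition
    into it: a crude proxy for a 'hub' the system passes through."""
    counts = transition_counts(state_sequence)
    scores = Counter()
    for src, dests in counts.items():
        for dest in dests:
            if dest != src:
                scores[dest] += 1
    return scores

# Invented recovery sequence: anesthetized states 'A'/'B' funnel
# through hub state 'H' before reaching the awake state 'W'.
seq = list("AABAHBBAHABHHW")
scores = hub_scores(seq)
assert scores["H"] > scores["W"]
```

The study's version of this idea works on empirically clustered spectral states and full transition probabilities, but the logic is the same: hubs are the states the trajectory cannot avoid on its way to wakefulness.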

“These results suggest there is indeed an intrinsic way in which the unconscious brain finds its way back to consciousness. The anesthetic is just a tool for severely reducing brain activity in a way we can control,” Hudson says.

In other scenarios, including coma caused by brain injury or neurological disease, the disruption to brain activity cannot be controlled, making these states much more difficult to study. However, the team’s results may help explain what is going on in these cases. “Maybe a pathway has shut down, or a brain structure that was key for full consciousness is no longer working. We don’t know yet, but our results suggest the possibility that under certain circumstances, someone may be theoretically capable of returning to consciousness but, due to the inability to transition through the hubs we have identified, his or her brain is unable to navigate the way back,” Calderon says.

Jun 10, 2014 · 152 notes
#consciousness #brain activity #anaesthesia #neurons #neuroscience #science
Research lays foundations for brain damage study

Researchers at The University of Queensland have made a key step that could eventually offer hope for stroke survivors and other people with brain damage.

The international study, led by researchers at UQ, could help explain a debilitating neurological condition known as unilateral spatial neglect, which commonly occurs after a stroke causing damage to the right side of the brain.

People with this condition become unaware of the left side of their sensory world, making everyday tasks such as eating and dressing almost impossible to perform.

ARC Discovery Early Career Research Fellow Dr Marta Garrido from UQ’s Queensland Brain Institute (QBI) said this lack of awareness of the left side might be caused by an uneven brain network involving interactions between different brain regions.

“Patients with spatial neglect are impaired in attending to sensory information on the left or the right side of space, but this inability is a lot stronger for objects coming from the left,” she said.

“This research has enabled us to establish what happens in a healthy brain, so that we can then further understand exactly what goes on in the brain of someone who is experiencing spatial neglect.”

QBI co-investigator and ARC Australian Laureate Fellow Professor Jason Mattingley said the human brain performed many functions in an uneven way.

“We already know that in a healthy brain even basic perception can be lopsided. For example, when we look at others’ faces we tend to focus more on the left than the right side,” he said.

“Research like this helps us take a key step in understanding some of the puzzling symptoms observed in people following brain damage.”

The researchers at QBI collaborated with UQ’s School of Psychology, and colleagues from Aarhus University in Denmark, and University College London in the UK.

The study involved recording electrical activity in the brains of healthy adult volunteers using electroencephalography (EEG) while listening to sequences of sounds from the left, right or centre.

The next step for the researchers will be to study how people with brain damage use the left and right sides of the brain when perceiving visual objects and sounds. 

Findings of the study were published in The Journal of Neuroscience.

Jun 9, 2014 · 81 notes
#unilateral spatial neglect #hemispatial neglect #brain damage #EEG #audiospatial perception #neuroscience #science
A tiny molecule may help battle depression

Levels of a small molecule found only in humans and in other primates are lower in the brains of depressed individuals, according to researchers at McGill University and the Douglas Institute. This discovery may hold a key to improving treatment options for those who suffer from depression.

Depression is a common cause of disability, and while viable medications exist to treat it, finding the right medication for individual patients often amounts to trial and error for the physician. In a new study to be published in the journal Nature Medicine, Dr. Gustavo Turecki, a psychiatrist at the Douglas and professor in the Faculty of Medicine, Department of Psychiatry at McGill, together with his team, discovered that the levels of a tiny molecule, miR-1202, may provide a marker for depression and help detect individuals who are likely to respond to antidepressant treatment.

“Using samples from the Douglas Bell-Canada Brain Bank, we examined brain tissues from individuals who were depressed and compared them with brain tissues from psychiatrically healthy individuals,” says Turecki, who is also Director of the McGill Group for Suicide Studies. “We identified this molecule, a microRNA known as miR-1202, found only in humans and primates, and discovered that it regulates an important receptor of the neurotransmitter glutamate.”

The team conducted a number of experiments that showed that antidepressants change the levels of this microRNA. “In our clinical trials with living depressed individuals treated with citalopram, a commonly prescribed antidepressant, we found lower levels of miR-1202 in depressed individuals compared to non-depressed individuals before treatment,” says Turecki. “Clearly, microRNA miR-1202 increased as the treatment worked and individuals no longer felt depressed.”

Antidepressant drugs are the most common treatment for depressive episodes, and are among the most prescribed medications in North America. “Although antidepressants are clearly effective, there is variability in how individuals respond to antidepressant treatment,” says Turecki. “We found that miR-1202 is different in individuals with depression, particularly among those patients who eventually respond to antidepressant treatment.”
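
Purely as an illustration of how such a marker might eventually be used: if baseline miR-1202 levels differ between eventual responders and non-responders, the simplest possible decision rule is a midpoint threshold between the group means. All numbers below are invented for the sketch and are not the study’s data:

```python
def midpoint_threshold(group_a, group_b):
    """Midpoint between two group means: a minimal decision boundary."""
    mean_a = sum(group_a) / len(group_a)
    mean_b = sum(group_b) / len(group_b)
    return (mean_a + mean_b) / 2

# Hypothetical baseline miR-1202 levels (arbitrary units), invented here:
responders = [0.8, 0.9, 1.1, 1.0]       # lower baseline levels
non_responders = [1.6, 1.8, 1.7, 1.9]   # higher baseline levels
cut = midpoint_threshold(responders, non_responders)

def predicted_responder(level):
    # Assumes (for this sketch only) that lower baseline predicts response.
    return level < cut

assert predicted_responder(0.95) and not predicted_responder(1.75)
```

A real biomarker test would need validated effect directions, cross-validated thresholds, and far larger samples; the sketch only shows the shape of the idea.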

The discovery may provide “a potential target for the development of new and more effective antidepressant treatments,” he adds.

Jun 9, 2014 · 313 notes
#depression #miR-1202 #gene expression #glutamate #antidepressants #neuroscience #science
New Evidence Links Air Pollution to Autism, Schizophrenia

New research from the University of Rochester Medical Center describes how exposure to air pollution early in life produces harmful changes in the brains of mice, including an enlargement of part of the brain that is seen in humans who have autism and schizophrenia.  

As in autism and schizophrenia, the changes occurred predominately in males. The mice also performed poorly in tests of short-term memory, learning ability, and impulsivity.

The new findings are consistent with several recent studies that have shown a link between air pollution and autism in children. Most notably, a 2013 study in JAMA Psychiatry reported that children who lived in areas with high levels of traffic-related air pollution during their first year of life were three times as likely to develop autism.

“Our findings add to the growing body of evidence that air pollution may play a role in autism, as well as in other neurodevelopmental disorders,” said Deborah Cory-Slechta, Ph.D., professor of Environmental Medicine at the University of Rochester and lead author of the study, published in the journal Environmental Health Perspectives.

In three sets of experiments, Cory-Slechta and her colleagues exposed mice to levels of air pollution typically found in mid-sized U.S. cities during rush hour. The exposures were conducted during the first two weeks after birth, a critical time in the brain’s development. The mice were exposed to polluted air for four hours each day for two four-day periods.

In one group of mice, the brains were examined 24 hours after the final pollution exposure. In all of those mice, inflammation was rampant throughout the brain, and the lateral ventricles — chambers on each side of the brain that contain cerebrospinal fluid — were enlarged to two to three times their normal size.

“When we looked closely at the ventricles, we could see that the white matter that normally surrounds them hadn’t fully developed,” said Cory-Slechta. “It appears that inflammation had damaged those brain cells and prevented that region of the brain from developing, and the ventricles simply expanded to fill the space.”

The problems were also observed in a second group of mice 40 days after exposure and in another group 270 days after exposure, indicating that the damage to the brain was permanent. Brains of mice in all three groups also had elevated levels of glutamate, a neurotransmitter, which is also seen in humans with autism and schizophrenia.

Most air pollution is made up mainly of carbon particles that are produced when fuel is burned by power plants, factories, and cars. For decades, research on the health effects of air pollution has focused on the part of the body where the damage is most obvious — the lungs. That research began to show that different-sized particles produce different effects. Larger particles — the ones regulated by the Environmental Protection Agency (EPA) — are actually the least harmful because they are coughed up and expelled. But many researchers believe that smaller particles known as ultrafine particles — which are not regulated by the EPA — are more dangerous, because they are small enough to travel deep into the lungs and be absorbed into the bloodstream, where they can produce toxic effects throughout the body.

That assumption led Cory-Slechta to design a set of experiments that would show whether ultrafine particles have a damaging effect on the brain, and if so, to reveal the mechanism by which they inflict harm. Her study published today is the first scientific work to do both.

“I think these findings are going to raise new questions about whether the current regulatory standards for air quality are sufficient to protect our children,” said Cory-Slechta.

Jun 7, 2014 · 262 notes
#schizophrenia #autism #air pollution #health #science
Biologists pave the way for improved epilepsy treatments

University of Toronto biologists investigating the cells that regulate proper brain function have identified and located the key players whose actions contribute to afflictions such as epilepsy and schizophrenia. The discovery is a major step toward developing improved treatments for these and other neurological disorders.

“Neurons in the brain communicate with other neurons through synapses, communication that can either excite or inhibit other neurons,” said Professor Melanie Woodin in the Department of Cell and Systems Biology at the University of Toronto (U of T), lead investigator of a study published today in Cell Reports. “An imbalance among the levels of excitation and inhibition – a tip towards excitation, for example – causes improper brain function and can produce seizures. We identified a key complex of proteins that can regulate excitation-inhibition balance at the cellular level.”
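
Woodin’s point about excitation-inhibition balance can be illustrated with a toy firing-rate model: with inhibition matched to excitation, activity settles at a modest level, but weakening inhibition tips the network into runaway, seizure-like activity. The equations and parameters below are generic textbook choices, not a model of the protein complex described in the study:

```python
def simulate(excitation, inhibition, steps=200):
    """Toy firing-rate model with recurrent E/I drive, constant external
    input, and a leak term. Returns the final rate in [0, 1]; a final
    rate pinned at 1.0 indicates runaway (seizure-like) activity."""
    rate, leak, external_input = 0.1, 0.2, 0.05
    for _ in range(steps):
        recurrent = (excitation - inhibition) * rate
        rate = rate + recurrent + external_input - leak * rate
        rate = max(0.0, min(1.0, rate))  # rates are bounded
    return rate

balanced = simulate(excitation=1.0, inhibition=1.0)  # E and I cancel
tipped = simulate(excitation=1.0, inhibition=0.5)    # inhibition weakened

assert balanced < 0.3   # settles near a fixed point: bounded activity
assert tipped == 1.0    # saturates at the ceiling: runaway activity
```

The sketch makes the qualitative point in the quote concrete: it is the *balance* between the excitatory and inhibitory terms, not either one alone, that determines whether activity stays controlled.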

This complex brings together three key proteins – KCC2, Neto2 and GluK2 – required for inhibitory and excitatory synaptic communication. KCC2 is required for inhibitory impulses, GluK2 is a receptor for the main excitatory transmitter glutamate, and Neto2 is an auxiliary protein that interacts with both KCC2 and GluK2. The discovery of the complex of three proteins is pathbreaking as it was previously believed that KCC2 and GluK2 were in separate compartments of the cell and acted independently of each other.

“Finding that they are all directly interacting and can co-regulate each other’s function reveals for the first time a system that can mediate excitation-inhibition balance among neurons themselves,” said Vivek Mahadevan, a PhD candidate in Woodin’s group and lead author of the study.

Mahadevan and fellow researchers made the discovery through biochemistry, fluorescence imaging and electrophysiology experiments on mouse brains. The most fruitful technique was Blue Native PAGE, an advanced, sensitive gel system for identifying native protein complexes in neurons. The process provided the biochemical conditions necessary to preserve the protein complexes that normally exist in neurons, an advantage over standard gel electrophoresis, in which proteins are separated from their normal protein complexes based on their molecular weights.

“The results reveal the proteins that can be targeted by drug manufacturers in order to reset imbalances that occur in neurological disorders such as epilepsy, autism spectrum disorder, schizophrenia and neuropathic pain,” said Woodin. “There is no cure for epilepsy; the best available treatments only control its effects, such as convulsions and seizures. We can now imagine preventing them from occurring in the first place.”

“It was the cellular mechanisms that determine the excitation-inhibition balance that needed to be identified. Now that we know the key role played by KCC2 in moderating excitatory activity, further research can be done into its occasional dysfunction and how it can also be regulated by excitatory impulses,” said Mahadevan.

Jun 7, 2014 · 113 notes
#epilepsy #hippocampal neurons #schizophrenia #neurons #neuroscience #science
‘Map of pain’ reveals how our ability to identify the source of pain varies across the body

“Where does it hurt?” is the first question asked of anyone in pain.

A new UCL study defines for the first time how our ability to identify where it hurts, called “spatial acuity”, varies across the body: it is most sensitive at the forehead and fingertips.


Using lasers to cause pain to 26 healthy volunteers without any touch, the researchers produced the first systematic map of how acuity for pain is distributed across the body. The work is published in the journal Annals of Neurology and was funded by the Wellcome Trust.

With the exception of the hairless skin on the hands, spatial acuity improves towards the centre of the body whereas the acuity for touch is best at the extremities. This spatial pattern was highly consistent across all participants.

The experiment was also conducted on a rare patient who lacks a sense of touch but whose sense of pain is intact. The results for this patient were consistent with those for healthy volunteers, demonstrating that acuity for pain does not require a functioning sense of touch.

“Acuity for touch has been known for more than a century, and tested daily in neurology to assess the state of sensory nerves on the body. It is striking that until now nobody had done the same for pain,” says lead author Dr Flavia Mancini of the UCL Institute of Cognitive Neuroscience. “If you try to test pain with a physical object like a needle, you are also stimulating touch. This clouds the results, like taking an eye test wearing sunglasses. Using a specially-calibrated laser, we stimulate only the pain nerves in the upper layer of skin and not the deeper cells that sense touch.”

Volunteers were blindfolded and had specially-calibrated pairs of lasers targeted at various parts of their body. These lasers cause a brief sensation of pinprick pain. Sometimes only one laser would be activated, and sometimes both would be, unknown to participants. They were asked whether they felt one ‘sting’ or two, at varying distances between the two beams. The researchers recorded the minimum distance between the beams at which people were able to accurately say whether it was one sting or two.
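The procedure above amounts to estimating a two-point discrimination threshold from yes/no responses. As a rough illustration only (not the study's analysis code), here is a minimal Python sketch; the 75% accuracy criterion and the toy data are hypothetical:

```python
def discrimination_threshold(trials, criterion=0.75):
    """Estimate the minimum discriminable separation.

    trials: dict mapping beam separation (mm) -> list of bools,
            one per trial (True = volunteer answered correctly).
    Returns the smallest separation at which accuracy meets the
    criterion, or None if no tested separation qualifies.
    """
    correct_rate = {sep: sum(r) / len(r) for sep, r in trials.items()}
    passing = [sep for sep, rate in sorted(correct_rate.items())
               if rate >= criterion]
    return passing[0] if passing else None

# Toy data: accuracy improves as the two beams move farther apart.
trials = {
    5:  [True, False, False, True],   # 50% correct at 5 mm
    10: [True, True, False, True],    # 75% correct at 10 mm
    20: [True, True, True, True],     # 100% correct at 20 mm
}
print(discrimination_threshold(trials))  # -> 10
```

Mapping this threshold across body sites is what yields the spatial-acuity gradients the researchers describe.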

“This measure tells us how precisely people can locate the source of pain on different parts of their body,” explains senior author Dr Giandomenico Iannetti of the UCL Department of Neuroscience, Physiology and Pharmacology. “Touch and pain are mediated by different sensory systems. While tactile acuity has been well studied, pain acuity has been largely ignored, beyond the common textbook assertion that pain has lower acuity than touch. We found the opposite: acuity for touch and pain are actually very similar. The main difference is in their gradients across the body. For example, pain acuity across the arm is much higher at the shoulder than at the wrist, whereas the opposite is true for touch.”

Acuity for both touch and pain normally correlates with the density of the relevant nerve fibres in each part of the body. However, the fingertips remain highly sensitive despite having a low density of pain-sensing nerve cells.

“The high pain acuity of the fingertips is something of a mystery that requires further investigation,” says Dr Mancini. “This may be because people regularly use their fingertips, and so the central nervous system may learn to process the information accurately.”

The findings have important implications for the assessment of both acute and chronic pain. Dr Roman Cregg of the UCL Centre for Anaesthesia, who was not involved in the research, is a clinical expert who treats patients with chronic pain.

“Chronic pain affects around 10 million people in the UK each year according to the British Pain Society, but we still have no reliable, reproducible way to test patients’ pain acuity,” says Dr Cregg. “This method offers an exciting, non-invasive way to test the state of pain networks across the body. Chronic pain is often caused by damaged nerves, but this is incredibly difficult to monitor and to treat. The laser method may enable us to monitor nerve damage across the body, offering a quantitative way to see if a condition is getting better or worse. I am excited at the prospect of taking this into the clinic, and now hope to work with Drs Mancini and Iannetti to translate their study to the chronic pain setting.”

Jun 7, 2014 · 149 notes
#spatial acuity #touch #pain #neuroscience #science
Making artificial vision look more natural

In laboratory tests, researchers have used electrical stimulation of retinal cells to produce the same patterns of activity that occur when the retina sees a moving object. Although more work remains, this is a step toward restoring natural, high-fidelity vision to blind people, the researchers say. The work was funded in part by the National Institutes of Health.


(Image caption: Chichilnisky and colleagues used an electrode array to record activity from retinal ganglion cells (yellow and blue) and feed it back to them, reproducing the cells’ responses to visual stimulation. Credit: E.J. Chichilnisky, Stanford.)

Just 20 years ago, bionic vision was more a science fiction cliché than a realistic medical goal. But in the past few years, the first artificial vision technology has come on the market in the United States and Western Europe, allowing people who’ve been blinded by retinitis pigmentosa to regain some of their sight. While remarkable, the technology has its limits. It has enabled people to navigate through a door and even read headline-sized letters, but not to drive, jog down the street, or see a loved one’s face.

A team based at Stanford University in California is working to improve the technology by targeting specific cells in the retina—the neural tissue at the back of the eye that converts light into electrical activity.

"We’ve found that we can reproduce natural patterns of activity in the retina with exquisite precision," said E.J. Chichilnisky, Ph.D., a professor of neurosurgery at Stanford’s School of Medicine and Hansen Experimental Physics Laboratory. The study was published in Neuron, and was funded in part by NIH’s National Eye Institute (NEI) and National Institute of Biomedical Imaging and Bioengineering (NIBIB).

The retina contains several cell layers. The first layer contains photoreceptor cells, which detect light and convert it into electrical signals. Retinitis pigmentosa and several other blinding diseases are caused by a loss of these cells. The strategy behind many bionic retinas, or retinal prosthetics, is to bypass the need for photoreceptors and stimulate the retinal ganglion cell layer, the last stop in the retina before visual signals are sent to the brain.

Several types of retinal prostheses are under development. The Argus II, which was developed by Second Sight Medical Products with more than $25 million in support from NEI, is the best known of these devices. In the United States, it was approved for treating retinitis pigmentosa in 2013, and it’s now available at a limited number of medical centers throughout the country. It consists of a camera, mounted on a pair of goggles, which transmits wireless signals to a grid of electrodes implanted on the retina. The electrodes stimulate retinal ganglion cells and give the person a rough sense of what the camera sees, including changes in light and contrast, edges, and rough shapes.

"It’s very exciting for someone who may not have seen anything for 20-30 years. It’s a big deal. On the other hand, it’s a long way from natural vision," said Dr. Chichilnisky, who was not involved in development of the Argus II.

Current technology does not have enough specificity or precision to reproduce natural vision, he said. Although much of visual processing occurs within the brain, some processing is accomplished by retinal ganglion cells. There are 1 to 1.5 million retinal ganglion cells inside the retina, in at least 20 varieties. Natural vision—including the ability to see details in shape, color, depth and motion—requires activating the right cells at the right time.

The new study shows that patterned electrical stimulation can do just that in isolated retinal tissue. The lead author was Lauren Jepson, Ph.D., who was a postdoctoral fellow in Dr. Chichilnisky’s former lab at the Salk Institute in La Jolla, California. The pair collaborated with researchers at the University of California, San Diego, the Santa Cruz Institute for Particle Physics, and the AGH University of Science and Technology in Krakow, Poland.

They focused their efforts on a type of retinal ganglion cell called parasol cells, which are known to be important for detecting the direction and speed of movement within a visual scene. When a moving object passes through visual space, these cells are activated in waves across the retina.

The researchers placed patches of retina on a 61-electrode grid. Then they sent out pulses at each of the electrodes and listened for cells to respond, almost like sonar. This enabled them to identify parasol cells, which have distinct responses from other retinal ganglion cells. It also established the amount of stimulation required to activate each of the cells. Next, the researchers recorded the cells’ responses to a simple moving image—a white bar passing over a gray background. Finally, they electrically stimulated the cells in this same pattern, at the required strengths. They were able to reproduce the same waves of parasol cell activity that they observed with the moving image.
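The record-then-replay logic of that procedure can be sketched in code. The following is a hypothetical illustration, not the study's actual pipeline; the electrode assignments and stimulation amplitudes stand in for the sonar-like cell-mapping and calibration steps described above:

```python
def build_stimulus(spike_times_by_cell, electrode_of, amplitude_of):
    """Convert a target pattern of cell activity into a pulse train.

    spike_times_by_cell: dict mapping cell id -> list of spike times (ms)
    electrode_of: dict mapping cell id -> electrode index (from mapping)
    amplitude_of: dict mapping cell id -> pulse amplitude (from calibration)
    Returns a time-sorted list of (time_ms, electrode, amplitude) pulses
    that replays the recorded activity.
    """
    pulses = []
    for cell, times in spike_times_by_cell.items():
        for t in times:
            pulses.append((t, electrode_of[cell], amplitude_of[cell]))
    return sorted(pulses)

# Toy example: two parasol cells firing in sequence, as for a moving bar.
spikes = {"cellA": [1.0, 3.0], "cellB": [2.0]}
electrode = {"cellA": 17, "cellB": 42}   # hypothetical grid positions
amp = {"cellA": 0.8, "cellB": 1.1}       # hypothetical calibrated strengths
print(build_stimulus(spikes, electrode, amp))
# -> [(1.0, 17, 0.8), (2.0, 42, 1.1), (3.0, 17, 0.8)]
```

The hard part, which this sketch omits entirely, is the mapping and calibration itself: identifying which cell each electrode reaches and how much current activates it without recruiting its neighbors.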

"There is a long way to go between these results and making a device that produces meaningful, patterned activity over a large region of the retina in a human patient," Dr. Chichilnisky said. "But if we can handle the many technical hurdles ahead, we may be able to speak to the nervous system in its own language, and precisely reproduce its normal function."

Such advances could help make artificial vision more natural, and could be applied to other types of prosthetic devices, too, such as those being studied to help paralyzed individuals regain movement. NEI supports many other projects geared toward retinal prosthetics.

"Retinal prosthetics hold great promise, but this research is a marathon, not a sprint," said Thomas Greenwell, Ph.D., a program director in retinal neuroscience at NEI. "This important study helps illustrate the challenges of restoring high-quality vision, one group’s progress toward that goal, and the continued need for the entire field to keep innovating."

Jun 7, 2014 · 126 notes
#retinal ganglion cells #retinal prosthetics #artificial vision #implants #vision #neuroscience #science
Brain circuit problem likely sets stage for the “voices” that are symptom of schizophrenia

St. Jude Children’s Research Hospital scientists have identified problems in a connection between brain structures that may predispose individuals to hearing the “voices” that are a common symptom of schizophrenia. The work appears in the June 6 issue of the journal Science.


(Image: Getty Images)

Researchers linked the problem to a gene deletion. This leads to changes in brain chemistry that reduce the flow of information between two brain structures involved in processing auditory information.

The research marks the first time that a specific circuit in the brain has been linked to the auditory hallucinations, delusions and other psychotic symptoms of schizophrenia. The disease is a chronic, devastating brain disorder that affects about 1 percent of Americans and causes them to struggle with a variety of problems, including thinking, learning and memory.

The disrupted circuit identified in this study solves the mystery of how current antipsychotic drugs ease symptoms and provides a new focus for efforts to develop medications that quiet “voices” but cause fewer side effects.

“We think that reducing the flow of information between these two brain structures that play a central role in processing auditory information sets the stage for stress or other factors to come along and trigger the ‘voices’ that are the most common psychotic symptom of schizophrenia,” said the study’s corresponding author Stanislav Zakharenko, M.D., Ph.D., an associate member of the St. Jude Department of Developmental Neurobiology. “These findings also integrate several competing models regarding changes in the brain that lead to this complex disorder.”

The work was done in a mouse model of the human genetic disorder 22q11 deletion syndrome. The syndrome occurs when part of chromosome 22 is deleted and individuals are left with one rather than the usual two copies of about 25 genes. About 30 percent of individuals with the deletion syndrome develop schizophrenia, making it one of the strongest risk factors for the disorder. DNA is the blueprint for life. Human DNA is organized into 23 pairs of chromosomes that are found in nearly every cell.

Earlier work from Zakharenko’s laboratory linked one of the lost genes, Dgcr8, to brain changes in mice with the deletion syndrome that affect a structure important for learning and memory. They found evidence that the same mechanism was at work in patients with schizophrenia. Dgcr8 carries instructions for making small molecules called microRNAs that help regulate production of different proteins.

For this study, researchers used state-of-the-art tools to link the loss of Dgcr8 to changes that affect a different brain structure, the auditory thalamus. For decades antipsychotic drugs have been known to work by binding to a protein named the D2 dopamine receptor (Drd2). The binding blocks activity of the chemical messenger dopamine. Until now, however, how that quieted the “voices” of schizophrenia was unclear.

Working in mice with and without the 22q11 deletion, researchers showed that the strength of the nerve impulse from neurons in the auditory thalamus was reduced in mice with the deletion compared to normal mice. Electrical activity in other brain regions was not different.

Investigators showed that Drd2 levels were elevated in the auditory thalamus of mice with the deletion, but not in other brain regions. When the researchers checked Drd2 levels in tissue from the same structure collected from 26 individuals with and without schizophrenia, they found that protein levels were higher in patients with the disease.

As further evidence of Drd2’s role in disrupting signals from the auditory thalamus, researchers added the antipsychotic drugs haloperidol and clozapine, which work by targeting Drd2, to laboratory preparations of neurons from different brain regions of mutant and normal mice. Before treatment, nerve impulses in the mutant neurons were weaker than in neurons from normal mice. The antipsychotics enhanced the nerve impulses in neurons from mutant mice, but only in neurons from the auditory thalamus.

When researchers looked more closely at the missing 22q11 genes, they found that mice lacking Dgcr8 responded to a loud noise in a manner similar to schizophrenia patients. Treatment with haloperidol restored the normal startle response in the mice, just as the drug does in patients.

Studying schizophrenia and other brain disorders advances understanding of normal brain development and the missteps that lead to various catastrophic diseases, including pediatric brain tumors and other problems.

Jun 6, 2014 · 436 notes
#schizophrenia #auditory cortex #auditory hallucinations #22q11 deletion syndrome #genetics #neuroscience #science
Researchers identify new gene involved in Parkinson's disease

A team of UCLA researchers has identified a new gene involved in Parkinson’s disease, a finding that may one day provide a target for a new drug to prevent and potentially even cure the debilitating neurological disorder.

Parkinson’s disease is the second most common neurodegenerative disorder after Alzheimer’s disease, and there is no cure for the progressive and devastating illness. About 60,000 Americans are diagnosed with Parkinson’s disease each year. It is estimated that as many as 1 million Americans live with Parkinson’s disease, which is more than the number of people diagnosed with multiple sclerosis, muscular dystrophy and Lou Gehrig’s disease combined.

In Parkinson’s disease, neurons in the brain gradually break down or die. This leads to movement impairments such as tremor, rigidity, slowness of movement and difficulty walking, as well as to depression, anxiety, sleep difficulties and dementia, said Dr. Ming Guo, the study team leader, associate professor of neurology and pharmacology and a practicing neurologist at UCLA.

A handful of genes have been identified in inherited cases of Parkinson’s disease. Guo’s team was one of two groups worldwide that first reported, in 2006 in the journal Nature, that two of these genes, PTEN-induced putative kinase 1 (PINK1) and PARKIN, act together to maintain the health of mitochondria, the powerhouses of the cell, which are important in maintaining brain health. Mutations in these genes lead to early-onset Parkinson’s disease.

Guo’s team has further shown that when PINK1 and PARKIN are operating correctly, they help maintain the regular shape of healthy mitochondria and promote elimination of damaged mitochondria. Accumulation of unhealthy or damaged mitochondria in neurons and muscles ultimately results in Parkinson’s disease.

In this study, the team found that a new gene, called MUL1 (also known as MULAN and MAPL), plays an important role in mediating the pathology of the PINK1/PARKIN pathway. The study, performed in fruit flies and mice, showed that providing an extra amount of MUL1 ameliorates the mitochondrial damage caused by mutated PINK1/PARKIN, while inhibiting MUL1 in PINK1/PARKIN mutants exacerbates the damage to the mitochondria. In addition, Guo and her collaborators found that removing MUL1 from mouse neurons in the PARKIN disease model results in unhealthy mitochondria and degeneration of the neurons.

The five-year study appears June 4, 2014, in eLife, a new, open-access scientific journal for groundbreaking biomedical and life sciences research sponsored by the Howard Hughes Medical Institute (United States), the Wellcome Trust (United Kingdom) and the Max Planck Society (Germany).

"We are very excited about this finding," Guo said. "There are several implications to this work, including that MUL1 appears to be a very promising drug target and that it may constitute a new pathway regulating the quality of mitochondria."

Guo characterized the work as “a major advancement in Parkinson’s disease research.”

"We show that MUL1 dosage is key and optimizing its function is crucial for brain health and to ward off Parkinson’s disease," she said. "Our work proves that mitochondrial health is of central importance to keep us from suffering from neurodegeneration. Further, finding a drug that can enhance MUL1 function would be of great benefit to patients with Parkinson’s disease."

Going forward, Guo and her team will test these results in more complex organisms, hoping to uncover additional functions and mechanisms of MUL1. Additionally, the team will perform small molecule screens to help identify potential compounds that specifically target MUL1. Further, they will examine if mutations in MUL1 exist in some patients with inherited forms of Parkinson’s.

Jun 5, 2014 · 90 notes
#parkinson's disease #parkin #PINK1 #mitochondria #MUL1 #neurodegeneration #neuroscience #science
Researchers Decode How the Brain Miswires, Possibly Causing ADHD

Neuroscientists at Mayo Clinic in Florida and at Aarhus University in Denmark have shed light on why neurons in the brain’s reward system can be miswired, potentially contributing to disorders such as attention deficit hyperactivity disorder (ADHD).

They say findings from their study, published online today in Neuron, may increase the understanding of underlying causes of ADHD, potentially facilitating the development of more individualized treatment strategies.

The scientists looked at dopaminergic neurons, which regulate pleasure, motivation, reward, and cognition, and have been implicated in development of ADHD.

They uncovered a receptor system that is critical, during embryonic development, for correct wiring of the dopaminergic brain area. But they also discovered that after brain maturation, a cut in the same receptor, SorCS2, produces a two-chain receptor that induces cell death following damage to the peripheral nervous system.

The researchers report that the SorCS2 receptor functions as a molecular switch between apparently opposing effects of proBDNF, a neuronal growth factor that helps select the cells most beneficial to the nervous system while eliminating less favorable ones, creating a finely tuned neuronal network.

They found that some cells in mice deficient in SorCS2 are unresponsive to proBDNF and have dysfunctional contacts between dopaminergic neurons.

“This miswiring of dopaminergic neurons in mice results in hyperactivity and attention deficits,” says the study’s senior investigator, Anders Nykjaer, M.D., Ph.D., a neuroscientist at Mayo Clinic in Florida and at Aarhus University in Denmark.

“A number of studies have reported that ADHD patients commonly exhibit miswiring in this brain area, accompanied by altered dopaminergic function. We may now have an explanation as to why ADHD risk genes have been linked to regulation of neuronal growth,” he says.

“SorCS2 is produced as a single-chain protein — one long row of amino acids — but it can be cut into two chains to perform a different function. While the single-chain receptor is essential to tell the neuron that it is time to stop growing, the two-chain form tells cells that support neurons in the developing peripheral nervous system to die when they should,” says Dr. Nykjaer.

Unfortunately, if damage occurs to a nerve in the peripheral nervous system, these cells that wrap around and nourish the neurons will die, preventing efficient regeneration, he says. “Our finding suggests that it may be possible to develop drug therapy to prevent this deadly cut of SorCS2 and treat acute nerve injury,” Dr. Nykjaer says.

Jun 5, 2014 · 134 notes
#ADHD #neurons #SorCS2 #dopaminergic neurons #reward system #neuroscience #science
Study shows increasing rates of premature death and violent crime in people with schizophrenia since 1970s

New research, published in The Lancet Psychiatry journal, shows that rates of adverse outcomes, including premature death and violent crime, in people with schizophrenia are increasing, compared to the general population.


The results come from a unique study, led by Dr Seena Fazel at Oxford University, UK, which analyses long-term adverse outcomes – including conviction for a violent crime (such as homicide or bodily harm), premature death (before the age of 56), and death by suicide – between 1972 and 2009 in nearly 25,000 people in Sweden diagnosed with schizophrenia or related disorders.

For the first time, the researchers compared adverse outcomes in people with a diagnosis of schizophrenia to both the general population and to unaffected siblings, allowing them to account for risk factors within families (such as parental criminality or violence) which might be expected to affect the risk of suicide or violent behaviour in siblings.

The results show that within five years of diagnosis, around 1 in 50 people with schizophrenia died by suicide (2.3% of men and 1.7% of women), while around one in 10 men (10.7%) and around one in 37 women (2.7%) were convicted of a violent offence. Overall, men and women with schizophrenia were eight times more likely to die prematurely than the general population.

Analysing the changing rate of adverse outcomes across the study period (1972 – 2009), the researchers found that the risk of premature death, suicide, and conviction for a violent offence has increased for men and women with schizophrenia in the last 38 years, compared with both the general population, and their unaffected siblings. 

By tracking the number of nights spent in hospital by people with schizophrenia during the study period, the study shows that these increased rates of adverse outcomes appear to be associated with decreasing levels of inpatient care for these patients, although the study does not provide any evidence for a causal connection between decreasing inpatient care and adverse outcomes.

The researchers also analysed risk factors for adverse outcomes in people with schizophrenia, the general population, and unaffected siblings. Across all three groups, the risk factors for violence and premature death were broadly similar, and included drug use disorders, criminality, and self-harm, all before diagnosis – suggesting that improved strategies to address these risk factors have the potential to reduce violence and premature deaths across the population, and not just in those with schizophrenia.

According to Dr Fazel, “In recent years, there has been a lot of focus on primary prevention of schizophrenia – preventing people from getting ill. While primary prevention is clearly essential and may be some decades away, our study highlights the crucial importance of secondary prevention – treating and managing the risks of adverse outcomes, such as self-harm or violent behaviour, in patients. Risks of these adverse outcomes relative to others in society appear to be increasing in recent decades, suggesting that there is still much work to be done in developing new treatments and mitigating risks of adverse outcomes in people with schizophrenia.”

Dr Eric Elbogen and Sally Johnson, at the University of North Carolina-Chapel Hill School of Medicine, USA, write in a linked Comment that, “One of the unique aspects of this study—that violence and suicide were analysed simultaneously—has an important implication for how we as a society perceive people with mental illness. News coverage of schizophrenia and other psychiatric disorders often focuses on violence and crime. Much less attention is paid to suicide and self-harm in people with severe mental illnesses.”

However, they add that, “Importantly, we should remember that, when reporting about the intricate links between schizophrenia and these adverse outcomes, most people with schizophrenia and related disorders are neither violent nor suicidal. Despite the need to ensure people with schizophrenia are provided help to reduce their risks of suicide, violence, or premature death, researchers reporting findings also bear the burden of ensuring that most people with schizophrenia and related disorders, who are not violent, are not left to contend with stigma and discrimination. Policy makers, researchers, and clinicians need to remember the importance of appropriately weighing up the issue of schizophrenia relative to the myriad of other factors that contribute to increased risk of violence and suicide.”

Jun 4, 2014 · 115 notes
#schizophrenia #suicide #mental illness #premature death #mortality #psychology #neuroscience #science
Stress hormone receptors localized in sweet taste cells

According to new research from the Monell Center, receptors for stress-activated hormones have been localized in oral taste cells responsible for detection of sweet, umami, and bitter. The findings suggest that these hormones, known as glucocorticoids, may act directly on taste receptor cells under conditions of stress to affect how these cells respond to sugars and certain other taste stimuli.

"Sweet taste may be particularly affected by stress," said lead author M. Rockwell Parker, PhD, a chemical ecologist at Monell. "Our results may provide a molecular mechanism to help explain why some people eat more sugary foods when they are experiencing intense stress."

Glucocorticoid (GC) hormones affect the body by activating specialized GC receptors located inside of cells. Knowing that stress can have major effects on metabolism and food choice, the researchers used a mouse model to ask whether taste receptor cells contain these GC receptors.

The findings, published online ahead of print in the journal Neuroscience Letters, revealed that GC receptors are present on the tongue, where they are specifically localized to the cells that contain receptors for sweet, umami and bitter taste. The highest concentrations of GC receptors were found in Tas1r3 taste cells, which are sensitive to sweet and umami taste.

GC hormones act on cells via a multi-step process. After GCs bind to their receptors within target cells, the activated receptor complex moves, or translocates, to the cell nucleus, where it then influences gene expression and protein assembly.

To explore whether GC receptors in taste tissue are activated by stress, the researchers compared the proportion of taste cells with translocated receptors in stressed and non-stressed mice. Compared to controls, the stressed mice had a 77 percent increase of GC receptors within taste cell nuclei.

Together, the results suggest that sweet taste perception and intake, which are known to be altered by stress, may be specifically affected via secretion of GCs and subsequent activation of GC receptors in taste cells.

"Taste provides one of our initial evaluations of potential foods. If this sense can be directly affected by stress-related hormonal changes, our food interaction will likewise be altered," said Parker.

Parker noted that although stress is known to affect intake of salty foods, GC receptors were not found in cells thought to be responsible for detecting salty and sour taste. One explanation, he said, is that stress may influence salt taste processing in the brain.

Implications of the findings extend beyond the oral taste system. Noting that taste receptors are found throughout the body, senior author and Monell molecular neurobiologist Robert Margolskee, MD, PhD, said, “Taste receptors in the gut and pancreas might also be influenced by stress, potentially impacting metabolism of sugars and other nutrients and affecting appetite.”

Future studies will continue to explore how stress hormones act to affect the taste system.

Jun 4, 2014 · 166 notes
#glucocorticoids #taste #taste cells #Tas1r3 #stress #neuroscience #science
New Amyloid-Reducing Compound Could Be a Preventive Measure Against Alzheimer’s

Scientists at NYU Langone Medical Center have identified a compound, called 2-PMAP, that in animal studies reduced brain levels of the amyloid proteins associated with Alzheimer’s disease by more than half. The researchers hope that a treatment based on the molecule could someday be used to ward off the neurodegenerative disease, since it may be safe enough to be taken daily over many years.

“What we want in an Alzheimer’s preventive is a drug that modestly lowers amyloid beta and is also safe for long term use,” says Martin J. Sadowski, MD, PhD, associate professor of neurology, psychiatry, and biochemistry and molecular pharmacology, who led the research to be published online June 3 in the journal Annals of Neurology. “Statin drugs that lower cholesterol appear to have those properties and have made a big impact in preventing coronary artery disease. That’s essentially what many of us envision for the future of Alzheimer’s medicine.”

The 2-PMAP molecule that Dr. Sadowski’s team identified is non-toxic in mice, gets easily into the brain, and lowers the production of amyloid beta and associated amyloid deposits.

The prime target for Alzheimer’s prevention is amyloid beta. Decades before dementia begins, this small protein accumulates in clumps in the brain. Modestly lowering the production of amyloid beta in late middle age, and thus removing some of the burden from the brain’s natural clearance mechanisms, is believed to be a good prevention strategy. Researchers two years ago reported that something like this happens naturally in about 0.5 percent of Icelanders, due to a mutation they carry that approximately halves amyloid beta production throughout life. These fortunate people show a slower cognitive decline in old age, live longer, and almost never get Alzheimer’s.

Prevention of Alzheimer’s dementia is now considered more feasible than stopping it after it has begun, when brain damage is already severe. Every prospective Alzheimer’s drug in clinical trials has failed even to slow the disease process at that late stage. “The key is to prevent the disease process from going that far,” Dr. Sadowski says.

Dr. Sadowski and colleagues screened a library of compounds and found that 2-PMAP reduced the production of amyloid beta’s mother protein, known as amyloid precursor protein (APP). The APP protein normally is cut by enzymes in a way that leaves amyloid beta as one of the fragments. Dr. Sadowski’s team found that 2-PMAP, even at low, non-toxic concentrations, significantly reduced APP production in test cells, lowering amyloid beta levels by 50 percent or more.

The scientists subsequently found that 2-PMAP had essentially the same impact on APP and amyloid beta in the brains of living mice. The mice were engineered to have the same genetic mutations found in Alzheimer’s patients with a hereditary form of the disease, causing overproduction of APP and Alzheimer’s-like amyloid deposits. A five-day treatment with 2-PMAP lowered brain levels of APP and, even more so, levels of amyloid beta. Four months of treatment sharply reduced the amyloid deposits and prevented the cognitive deficits that are normally seen in these transgenic mice as they get older.

Dr. Sadowski and his laboratory are now working to make chemical modifications to the compound to improve its effectiveness. But 2-PMAP already seems to have advantages over other amyloid-lowering compounds, he says. One is that it can cross efficiently from the bloodstream to the brain, and thus doesn’t require complex modifications that might compromise its effects on APP.

The compound also appears to have a highly selective effect on APP production, by interfering with the translation of APP’s gene transcript into the APP protein itself. The best-known candidates for Alzheimer’s preventives lower amyloid by inhibiting the secretase enzymes that cleave amyloid beta from APP; these tend to cause unwanted side effects through off-target interference with the processing of other client proteins cleaved by the same enzymes. A clinical trial of one secretase inhibitor was halted in 2010 after it was found to worsen dementia and cause a higher incidence of skin cancer.

Alzheimer’s disease, the most common form of dementia, currently afflicts more than five million Americans, according to the Alzheimer’s Association. Unless preventive drugs or treatments are developed, the prevalence of Alzheimer’s is expected to triple by 2050.

Jun 4, 2014 · 71 notes
#alzheimer's disease #beta amyloid #dementia #amyloid precursor protein #2-PMAP #neuroscience #science
Molecular 'scaffold' could hold key to new dementia treatments

Researchers at King’s College London have discovered how a molecular ‘scaffold’ that allows key parts of cells to interact comes apart in dementia and motor neuron disease, revealing a potential new target for drug discovery.


The study, published today in Nature Communications, was funded by the UK Medical Research Council, Wellcome Trust, Alzheimer’s Research UK and the Motor Neurone Disease Association.

Researchers looked at two components of cells: mitochondria, the cell ‘power houses’, which produce energy for the cell; and the endoplasmic reticulum (ER), which makes proteins and stores calcium for signalling processes in the cell. ER and mitochondria form close associations, and these interactions enable a number of important cell functions. However, the mechanism by which ER and mitochondria become linked has not, until now, been fully understood.

Professor Chris Miller, from the Department of Neuroscience at the Institute of Psychiatry at King’s and lead author of the paper, says: “At the molecular level, many processes go wrong in dementia and motor neuron disease, and one of the puzzles we’re faced with is whether there is a common pathway connecting these different processes. Our study suggests that the loosening of this ‘scaffold’ between the mitochondria and ER in the cell may be a key process in neurodegenerative diseases such as dementia or motor neuron disease.”

By studying cells in a dish, the researchers discovered that an ER protein called VAPB binds to a mitochondrial protein called PTPIP51, to form a ‘scaffold’ enabling ER and mitochondria to form close associations. In fact, by increasing the levels of VAPB and PTPIP51, mitochondria and ER re-organised themselves to form tighter bonds.

Many of the cell’s functions that are controlled by ER-mitochondria associations are disrupted in neurodegenerative diseases, so the researchers studied how the strength of this ‘scaffold’ was affected in these diseases. TDP-43 is a protein which is strongly linked to Amyotrophic Lateral Sclerosis (ALS, a form of motor neuron disease) and Fronto-Temporal Dementia (FTD, the second most common form of dementia), but exactly how the protein causes neurodegeneration is not properly understood.

The researchers studied how TDP-43 affected mouse cells in a dish. They found that higher levels of TDP-43 resulted in a loosening of the scaffold, which reduced ER-mitochondria bonds, affecting some important cellular functions that are linked to ALS and FTD.

Professor Miller concludes: “Our findings are important in terms of advancing our understanding of basic biology, but may also provide a potential new target for developing new treatments for these devastating disorders.”

Jun 4, 2014 · 87 notes
#dementia #motor neuron disease #mitochondria #neurodegeneration #neuroscience #science
Hypnosis extends restorative slow-wave sleep

Deep sleep promotes our well-being, improves our memory and strengthens the body’s defences. Zurich and Fribourg researchers demonstrate how restorative slow-wave sleep (SWS) can also be increased without medication – using hypnosis.


Sleeping well is a crucial factor contributing to our physical and mental restoration. Slow-wave sleep in particular has a positive impact on memory and the functioning of the immune system. During periods of SWS, growth hormones are secreted, cell repair is promoted and the body’s defence system is stimulated. If you feel sick or have had a hard working day, you often simply want to get some good, deep sleep – a wish that, according to widely held belief, cannot be influenced by will alone.

Sleep researchers from the Universities of Zurich and Fribourg now demonstrate the opposite. In a study published in the scientific journal “Sleep”, they show that hypnosis has a surprisingly strong positive impact on the quality of sleep. “It opens up new, promising opportunities for improving the quality of sleep without drugs,” says biopsychologist Björn Rasch, who heads the study at the Psychological Institute of the University of Zurich in conjunction with the “Sleep and Learning” project*.

Brain waves – an indicator of sleep quality

Hypnosis is a method that can influence processes which are very difficult to control voluntarily. Patients with sleep disturbances can indeed be treated successfully with hypnotherapy. However, until now it had not been proven that this leads to an objectively measurable change in sleep. To measure sleep objectively, electrical brain activity is recorded using an electroencephalogram (EEG). The characteristic feature of slow-wave sleep, which is deemed to have a high restorative capacity, is a very even and slow oscillation in electrical brain activity.

Seventy healthy young women took part in the UZH study. They came to the sleep laboratory for a 90-minute midday nap. Before falling asleep, they listened over loudspeakers either to a special 13-minute slow-wave sleep hypnosis tape developed by hypnotherapist Professor Angelika Schlarb, a sleep specialist, or to a neutral spoken text. At the beginning of the experiment, the subjects were divided into highly suggestible and low-suggestible groups using a standard procedure (the Harvard Group Scale of Hypnotic Susceptibility). Around half of the population is moderately suggestible. With this method, women achieve on average higher values for hypnotic susceptibility than men. Nevertheless, the researchers expect the same positive effects on sleep for highly suggestible men.

Slow-wave sleep increased by 80 percent

In their study, sleep researchers Maren Cordi and Björn Rasch showed that highly suggestible women experienced 80 percent more slow-wave sleep after listening to the hypnosis tape than after listening to the neutral text. In parallel, time spent awake was reduced by around one-third. In contrast to the highly suggestible women, low-suggestible participants did not benefit as much from hypnosis. With additional control experiments, the psychologists confirmed that the beneficial impact of hypnosis on slow-wave sleep could be attributed to the hypnotic suggestion to “sleep deeper” and could not be reduced to mere expectancy effects.

According to psychologist Maren Cordi, “the results may be of major importance for patients with sleep problems and for older adults. In contrast to many sleep-inducing drugs, hypnosis has no adverse side effects.” In principle, everyone who responds to hypnosis could benefit from the improved sleep it brings.

* The project “Sleep and Learning” is headed by Professor Björn Rasch from the University of Fribourg and conducted at the Universities of Zurich and Fribourg. The project is financed by the Swiss National Fund and the University of Zurich (main area of clinical research “Sleep and Health”). The goal of the project is to identify psychological and neurophysiological mechanisms underlying the positive role of sleep for our memory and mental health.  

Jun 3, 2014 · 117 notes
#sleep #brainwaves #hypnosis #slow wave sleep #brain activity #psychology #neuroscience #science
Left-handed fetuses could show effects of maternal stress on unborn babies

Fetuses are more likely to show left-handed movements in the womb when their mothers are stressed, according to new research.


Researchers at Durham and Lancaster universities say their findings are an indicator that maternal stress could have a temporary effect on unborn babies, adding that their research highlights the importance of reducing stress during pregnancy.

However, the researchers emphasised that their study was not evidence that maternal stress led to fixed left-handedness in infants after birth. They said that some people might be genetically predisposed to being left-handed and that there are examples where right and left-handedness can switch throughout a person’s life.

Using 4D ultrasound, the researchers analysed 57 scans of 15 healthy fetuses, recording 342 facial touches.

The fetuses were scanned at four different stages between 24 and 36 weeks of pregnancy. Researchers also asked the mothers of these babies how much stress they had experienced in the four weeks between each of the scans.

The researchers found that the more stress mothers reported, the more frequently fetuses touched their faces with their left hands. A significant proportion of face touches by the fetuses of stressed mothers were made with the left rather than the right hand, indicating a left-handed tendency in the fetuses’ touching of their own faces.

As right-handedness is more common in the general population, the researchers had expected to see more of a bias towards right-handed movements in the fetuses as they grew older. The high percentage of left-handed behaviour, observed only when mothers reported being stressed, led them to conclude that maternal stress has an effect on the lateral behaviour of the babies they scanned.

The findings are published in the journal Laterality: Asymmetries of Body, Brain and Cognition.

Lead author Dr Nadja Reissland, in Durham University’s Department of Psychology, said: “Our research suggests that stressed mothers have fetuses who touch their face relatively more with their left hand.

“This suggests maternal stress could be having an effect on the child’s behaviour in the womb and highlights the importance of reducing maternal stress in pregnancy.

“Such measures may include increased emphasis on stopping stressful work early, the inclusion of relaxation classes in pre-natal care and involvement of the whole family in the pre-natal period.

“While we observed a higher degree of left-handed behaviour in the fetuses of stressed mothers than had been expected, we are not saying that maternal stress leads to a child becoming left-handed after birth, as there could be a number of reasons for this.

“The research does suggest, however, that a fetus can detect when a mother is stressed and that it responds to this stress.”

Professor Brian Francis, of Lancaster University, emphasised that the study also showed that overall preference for left or right hand varied considerably from scan to scan within each fetus, though fetuses showed more left-hand movements when mothers reported that they had experienced stress. He said: “Overall, there was no consistent handedness preference being shown by the fetuses, with most fetuses switching in preference at least once over the four scans.”

The researchers added that while mothers were asked to report their stress levels in the four weeks between scans, in practice some might have reported the stress they were experiencing at the time of being surveyed.

Previous research has shown that maternal stress in pregnancy leads to increased levels of cortisol – a hormone produced in response to stress – in mothers, which could lead to an altered preference for left-sided or right-sided behaviour in fetuses.

The current study did not assess the stress levels of fetuses and Dr Reissland said that future research could examine cortisol levels in fetuses to further determine the effect of stress on lateral behaviour.

Dr Reissland added that further research was also needed to look at whether or not maternal prenatal stress had longer-term effects on the development of infants and children after birth.

Jun 3, 2014 · 152 notes
#laterality #handedness #maternal stress #fetus #pregnancy #psychology #neuroscience #science
Antipsychotic medication during pregnancy does affect babies

A seven-year study of women who took antipsychotic medication while pregnant shows that the drugs can affect their babies.


The observational study, published in the journal PLOS ONE, reveals that while most women gave birth to healthy babies, the use of mood stabilisers or higher doses of antipsychotics during pregnancy increased the need for special care after birth: 43 per cent of babies were placed in a Special Care Nursery (SCN) or a Neonatal Intensive Care Unit (NICU), almost three times the national rate in Australia.

As well as an increased likelihood of the need for intensive care, the world-first study by experts from the Monash Alfred Psychiatry Research Centre (MAPrc) and Monash University shows antipsychotic drugs affect babies in other ways: 18 per cent were born prematurely, 37 per cent showed signs of respiratory distress and 15 per cent developed withdrawal symptoms.

Principal investigator, Professor Jayashri Kulkarni, Director of MAPrc, said the study highlights the need for clearer health guidelines when antipsychotic drugs are taken during pregnancy.

“There’s been little research on antipsychotic medication during pregnancy and if it affects babies. The lack of data has made it very difficult for clinicians to say anything conclusively on how safe it is for babies,” Professor Kulkarni said.

“This new research confirms that most babies are born healthy, but many experience neonatal problems such as respiratory distress.”

With no existing data to draw on, MAPrc established the world-first National Register of Antipsychotic Medications in Pregnancy (NRAMP) in 2005. Women who were pregnant and taking antipsychotic medication were recruited from around Australia through clinical networks in each state and territory. In all, 147 women were interviewed every six weeks during pregnancy and then followed until their babies were one year old.

Antipsychotic drugs are currently used to treat a range of psychiatric disorders including schizophrenia, major depression and bipolar disorder. About 20 per cent of Australian women experience depression in their lifetime, compared to 10 per cent of men. In Australia 25 per cent of women experience postnatal depression and 20 per cent experience severe menopausal depression.

Women have much higher rates of anxiety disorders and there are equal percentages of men and women with schizophrenia (2 per cent) and bipolar disorder (about 3 per cent).

Professor Kulkarni said the emergence of new antipsychotic drugs means that many women with a well controlled psychiatric disorder are able to contemplate having babies, but there have always been concerns about the effect of treatment on their offspring.

“The potentially harmful effects of taking an antipsychotic drug in pregnancy have to be balanced against the harm of untreated psychotic illness. The good news is we now know there are no clear associations with specific congenital abnormalities and these drugs,” Professor Kulkarni said.

“However clinicians should be particularly mindful of neonatal problems such as respiratory distress, so it’s critical that Neonatal Intensive Care Units, or Special Care Nurseries are available for these babies.”

Jun 3, 2014 · 117 notes
#pregnancy #antipsychotics #mental illness #health
Jun 3, 2014 · 227 notes
#epilepsy #amygdalohippocampal complex #mesial temporal lobe #seizures #mesial temporal sclerosis #neuroscience #science
Marijuana shows potential in treating autoimmune disease

A team of University of South Carolina researchers led by Mitzi Nagarkatti, Prakash Nagarkatti and Xiaoming Yang have discovered a novel pathway through which marijuana can suppress the body’s immune functions. Their research has been published online in the Journal of Biological Chemistry.


Marijuana is the most frequently used illicit drug in the United States, but as more states legalize the drug for medical and even recreational purposes, research studies like this one are discovering new and innovative potential health applications for the federal Schedule I drug.

Marijuana is now regularly and successfully used to alleviate the nausea and vomiting many cancer patients experience as side effects of chemotherapy; to combat the wasting syndrome that causes some AIDS patients to lose significant amounts of weight and muscle mass; and to ease chronic pain that is unresponsive to opioids, among other applications.

The university study has uncovered yet another potential application for marijuana: the suppression of immune response to treat autoimmune diseases. The work builds on recent scientific discoveries that the environment in which humans live can trigger changes outside of human DNA that nevertheless alter the function of genes controlled by DNA. The molecules that can alter DNA function in this way are known collectively as the epigenome. In this study, the investigators wanted to find out whether the tetrahydrocannabinol (THC) found in marijuana has the capacity to affect DNA expression through epigenetic pathways outside of the DNA itself.

The recent findings show that the THC in marijuana can change critical molecules of the epigenome called histones, leading to suppression of inflammation. These results suggest that one potential negative impact of marijuana smoking could be suppression of beneficial inflammation in the body. But they also suggest that, because of its epigenetic influence toward inflammation suppression, marijuana use could be efficacious in the treatment of autoimmune diseases such as arthritis, lupus, colitis, multiple sclerosis and the like, in which chronic inflammation plays a central role.

Jun 3, 2014 · 313 notes
#marijuana #autoimmune diseases #histones #inflammation #epigenetics #science
Why inflammation leads to a leaky blood-brain barrier: MicroRNA-155

Until now, scientists have not known exactly how inflammation weakens the Blood-Brain Barrier, allowing toxins and other molecules access to the brain. A new research report appearing in the June 2014 issue of The FASEB Journal solves this mystery by showing that a molecule called “microRNA-155” is responsible for opening microscopic gaps between the barrier’s endothelial cells that let material through. Not only does this discovery help explain the molecular underpinnings of diseases like multiple sclerosis, but it also opens an entirely new avenue for developing therapies that can penetrate the Blood-Brain Barrier to deliver lifesaving drugs.


According to Ignacio A. Romero, Ph.D., “We are beginning to understand the mechanisms by which the barrier between the blood and the brain becomes leaky in inflammatory conditions. Based on these and other findings, drugs that reduce the leakiness of the barrier have the potential to improve symptoms in many neurological conditions.” Romero is one of the researchers involved in the work from the Department of Life, Health and Chemical Sciences of the Biomedical Research Network at The Open University in the United Kingdom.

To make this discovery, Romero and colleagues first measured microRNA-155 (miR-155) levels in cultured human cells and compared them to levels in cells under inflammatory conditions. They then measured levels in the blood vessels of inflamed brain areas of patients with multiple sclerosis (MS) and compared them to levels in non-inflamed areas. In both cases, miR-155 was elevated in inflammation. Next, normal mice were compared with mice genetically altered to lack miR-155. When an inflammatory reaction was induced in the two groups, the mice that could not express miR-155 showed a much smaller increase in “leakiness” of the Blood-Brain Barrier than the normal mice did. Finally, the scientists investigated in cultured human cells the mechanism by which miR-155 causes leakiness of the barrier, and concluded that miR-155 affects the organization of the complex structures that form the tight connections between endothelial cells.

"This study has the potential to be a game-changer in terms of how we treat neurological conditions and how we deliver drugs to the brain," said Gerald Weissmann, M.D., Editor-in-Chief of The FASEB Journal. “Since it was first discovered, the Blood-Brain Barrier has always been a touch elusive. Now, after careful analysis, we are learning exactly how our bodies keep our brains safe and that microRNA-155 is a key player.”

Jun 3, 2014 · 191 notes
#science #inflammation #blood brain barrier #microRNA-155 #MS #medicine