Posts tagged touch

“Where does it hurt?” is the first question asked of any person in pain.
A new UCL study defines for the first time how our ability to identify where it hurts, called “spatial acuity”, varies across the body, being most sensitive at the forehead and fingertips.

Using lasers to cause pain to 26 healthy volunteers without any touch, the researchers produced the first systematic map of how acuity for pain is distributed across the body. The work is published in the journal Annals of Neurology and was funded by the Wellcome Trust.
With the exception of the hairless skin on the hands, spatial acuity for pain improves towards the centre of the body, whereas acuity for touch is best at the extremities. This spatial pattern was highly consistent across all participants.
The experiment was also conducted on a rare patient lacking a sense of touch, but who normally feels pain. The results for this patient were consistent with those for healthy volunteers, proving that acuity for pain does not require a functioning sense of touch.
“Acuity for touch has been known for more than a century, and tested daily in neurology to assess the state of sensory nerves on the body. It is striking that until now nobody had done the same for pain,” says lead author Dr Flavia Mancini of the UCL Institute of Cognitive Neuroscience. “If you try to test pain with a physical object like a needle, you are also stimulating touch. This clouds the results, like taking an eye test wearing sunglasses. Using a specially-calibrated laser, we stimulate only the pain nerves in the upper layer of skin and not the deeper cells that sense touch.”
Volunteers were blindfolded and had specially-calibrated pairs of lasers targeted at various parts of their body. These lasers cause a brief sensation of pinprick pain. Sometimes only one laser would be activated, and sometimes both would be, unknown to participants. They were asked whether they felt one ‘sting’ or two, at varying distances between the two beams. The researchers recorded the minimum distance between the beams at which people were able to accurately say whether it was one sting or two.
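The measurement described above amounts to estimating a separation threshold from a pile of one-versus-two reports. The sketch below shows one plausible way to compute such a threshold in Python; the data, the function name and the 50% interpolation criterion are illustrative assumptions, not the study's actual analysis.

```python
from collections import defaultdict

def two_point_threshold(trials, criterion=0.5):
    """Estimate the smallest separation (mm) at which the proportion
    of 'two stings' reports reaches `criterion`, by linear
    interpolation between tested separations."""
    counts = defaultdict(lambda: [0, 0])  # separation -> [n "two" reports, n trials]
    for separation_mm, reported_two in trials:
        counts[separation_mm][1] += 1
        if reported_two:
            counts[separation_mm][0] += 1
    points = sorted((d, two / n) for d, (two, n) in counts.items())
    for (d0, p0), (d1, p1) in zip(points, points[1:]):
        if p0 < criterion <= p1:
            # interpolate between the two bracketing separations
            return d0 + (criterion - p0) * (d1 - d0) / (p1 - p0)
    return None  # criterion never crossed in the tested range

# Illustrative data: "two" reports become more common as separation grows.
trials = ([(10, False)] * 9 + [(10, True)] * 1
          + [(20, False)] * 6 + [(20, True)] * 4
          + [(30, False)] * 2 + [(30, True)] * 8)
print(two_point_threshold(trials))  # ~22.5 mm, between the 20 and 30 mm points
```

A smaller threshold means finer pain acuity, which is how a forehead or fingertip would stand out from, say, the back.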
“This measure tells us how precisely people can locate the source of pain on different parts of their body,” explains senior author Dr Giandomenico Iannetti of the UCL Department of Neuroscience, Physiology and Pharmacology. “Touch and pain are mediated by different sensory systems. While tactile acuity has been well studied, pain acuity has been largely ignored, beyond the common textbook assertion that pain has lower acuity than touch. We found the opposite: acuity for touch and pain are actually very similar. The main difference is in their gradients across the body. For example, pain acuity across the arm is much higher at the shoulder than at the wrist, whereas the opposite is true for touch.”
Acuity for both touch and pain normally correlates with the density of the relevant nerve fibres in each part of the body. However, the fingertips remain highly sensitive despite having a low density of pain-sensing nerve cells.
“The high pain acuity of the fingertips is something of a mystery that requires further investigation,” says Dr Mancini. “This may be because people regularly use their fingertips, and so the central nervous system may learn to process the information accurately.”
The findings have important implications for the assessment of both acute and chronic pain. Dr Roman Cregg of the UCL Centre for Anaesthesia, who was not involved in the research, is a clinical expert who treats patients with chronic pain.
“Chronic pain affects around 10 million people in the UK each year according to the British Pain Society, but we still have no reliable, reproducible way to test patients’ pain acuity,” says Dr Cregg. “This method offers an exciting, non-invasive way to test the state of pain networks across the body. Chronic pain is often caused by damaged nerves, but this is incredibly difficult to monitor and to treat. The laser method may enable us to monitor nerve damage across the body, offering a quantitative way to see if a condition is getting better or worse. I am excited at the prospect of taking this into the clinic, and now hope to work with Drs Mancini and Iannetti to translate their study to the chronic pain setting.”
(Source: ucl.ac.uk)

Researchers examine how touch can trigger our emotions
While touch always involves awareness, it also sometimes involves emotion. For example, picking up a spoon triggers no real emotion, while feeling a gentle caress often does. Now, scientists in the Cell Press journal Neuron describe a system of slowly conducting nerves in the skin that respond to such gentle touch. Using a range of scientific techniques, investigators are beginning to characterize these nerves and to describe the fundamental role they play in our lives as a social species—from a nurturing touch to an infant to a reassuring pat on the back. Their work also suggests that this soft touch wiring may go awry in disorders such as autism.
The nerves that respond to gentle touch, called c-tactile afferents (CTs), are similar to those that detect pain, but they serve an opposite function: they relay events that are neither threatening nor tissue-damaging but are instead rewarding and pleasant.
"The evolutionary significance of such a system for a social species is yet to be fully determined," says first author Francis McGlone, PhD, of Liverpool John Moores University in England. "But recent research is finding that people on the autistic spectrum do not process emotional touch normally, leading us to hypothesize that a failure of the CT system during neurodevelopment may impact adversely on the functioning of the social brain and the sense of self."
For some individuals with autism, the light touch of certain fabrics in clothing can cause distress. Temple Grandin, an activist and assistant professor of animal sciences at Colorado State University who has written extensively on her experiences as an individual with autism, has remarked that her lack of empathy in social situations may be partially due to a lack of “comforting tactual input.” Professor McGlone also notes that deficits in nurturing touch during early life could have negative effects on a range of behaviors and psychological states later in life.
Further research on CTs may help investigators develop therapies for autistic patients and individuals who lacked adequate nurturing touch as children. Also, a better understanding of how nerves that relay rewarding sensations interact with those that signal pain could provide insights into new treatments for certain types of pain.
Professor McGlone believes that possessing an emotional touch system in the skin is as important to well-being and survival as having a system of nerves that protect us from harm. “In a world where human touch is becoming more and more of a rarity with the ubiquitous increase in social media leading to non-touch-based communication, and the decreasing opportunity for infants to experience enough nurturing touch from a carer or parent due to the economic pressures of modern living, it is becoming more important to recognize just how vital emotional touch is to all humankind.”
Now that a long-standing scientific mystery has been solved, the common saying “you just hit a nerve” might need to be updated to “you just hit a Merkel cell,” jokes Jianguo Gu, PhD, a pain researcher at the University of Cincinnati (UC).
That’s because Gu and his research colleagues have proved that Merkel cells, which contact many sensory nerve endings in the skin, are the initial sites for sensing touch.

"Scientists have spent over a century trying to understand the function of this specialized skin cell and now we are the first to know … we’ve proved the Merkel cell to be a primary point of tactile detection," Gu, principal investigator and a professor in UC’s department of anesthesiology, says of their research study published in the April 15 edition of Cell, a leading scientific journal.
Of the five senses, touch, Gu says, has been the least understood by science, especially in relation to the Merkel cell, discovered by Friedrich Sigmund Merkel in 1875.
"It’s been a great debate because for over a century nobody really knew what function this cell had," Gu says, adding that while some scientists, including him, suspected that the Merkel cell was related to touch because of the high abundance of these cells in the ridges of the fingertips, the lips and other touch-sensitive spots throughout the body, others dismissed the cell as unrelated to sensing touch.
To prove their hypothesis that Merkel cells were indeed the very foundation of touch, Gu’s team, which included UC postgraduate fellow Ryo Ikeda, PhD, studied Merkel cells in rat whisker hair follicles, because the hair follicles are functionally similar to human fingertips and have a high abundance of Merkel cells. What they found was that the cells immediately fired in response to gentle touch of the whiskers.
"There was a marked response in Merkel cells; the recording trace ‘spiked’. With non-Merkel cells you don’t get anything," says Ikeda.
What they also found, of equal importance, both say, was that gentle touch causes Merkel cells to fire “action potentials”, and that this mechano-electrical transduction occurs through a receptor/ion channel called Piezo2.
"The implications here are profound," Gu says, pointing to the clinical applications of treating and preventing disease states that affect touch such as diabetes and fibromyalgia and pathological conditions such as peripheral neuropathy. Abnormal touch sensation, he says, can also be a side effect of many medical treatments such as with chemotherapy.
The discovery also has relevance to those who are blind and rely on touch to navigate a sighted world.
"This is a paradigm shift in the entire field," Gu says, pointing to touch as also indispensable for environmental exploration, tactile discrimination and other tasks in life such as modern social interaction.
"Think of the cellphone. You can hardly fit into social life without good touch sensation."
(Source: eurekalert.org)

Scientists Provide New Grasp of Soft Touch
A study led by scientists at The Scripps Research Institute (TSRI) has helped solve a long-standing mystery about the sense of touch.
The “gentle touch” sensations that convey the stroke of a finger, the fine texture of something grasped and the light pressure of a breeze on the skin are brought to us by nerves that often terminate against special skin cells called Merkel cells. These skin cells’ role in touch sensation has long been debated in the scientific community. The new study, however, suggests that touch sensation relies on a dual-sensor system made up of the Merkel cell and its associated nerve ending.
“In this long debate over the role of Merkel cells, it appears that both camps were right,” said the study’s senior author Ardem Patapoutian, a Howard Hughes Medical Institute (HHMI) Investigator and professor at TSRI’s Dorris Neuroscience Center and Department of Molecular & Cellular Neuroscience. “The nerve ends respond to touch, but so do the adjacent Merkel cells.”
The report appears in an Advance Online Publication of Nature on April 6, 2014.
In addition to elucidating the mammalian sense of touch, whose mechanisms until recently have been obscure, the findings could have relevance for certain pain syndromes in which touch sensations trigger pain—even the light pressure of a shirt on the skin or a breeze against the skin.
“Touch and pain are very closely related,” said Patapoutian, “and thus the characterization of these mechanisms of touch should help us to understand pain better too.”
Opening the Flow
The discovery comes four years after the Patapoutian laboratory identified a protein called Piezo2 as a mechanically activated “ion channel” protein with a likely role in touch sensation.
Ion channels are embedded in the outer membranes of various cell types and nerve fibers throughout the body. Piezo2 ion channels have been thought to respond to the stretching of the nerve membrane where they are embedded—a stretching caused by something that presses against the skin, for example.
When activated in this way, the ion channels open to allow an inflow of sodium or other positively charged ions. Such a surge of electrical charge into a nerve can initiate a signal that travels up the nerve and to the brain via a relay of neurons along the spine.
In the earlier study, Patapoutian’s team found evidence that Piezo2 proteins are made within touch-sensing neurons, including gentle-touch neurons that extend their nerves into the skin and against the mysterious Merkel cells.
In the new study, Patapoutian and his colleagues set out to learn more.
In Pursuit of Answers
The team began by creating a line of mice in which the activity of the Piezo2 gene also causes the production of a fluorescing protein called GFP. Guided by these fluorescent beacons as well as other markers, they found a high concentration of Piezo2 in Merkel cells in the skin of the mice.
“You can easily miss Piezo2 expression in the skin, because it’s not highly expressed there, aside from the tiny population of Merkel cells,” said first author Seung-Hyun Woo, a postdoctoral fellow in the Patapoutian laboratory.
Next the researchers sought proof of Piezo2’s role in Merkel cells, essentially by subtracting the protein from those cells and observing the result. To do this—a particularly challenging feat—they created a new line of mice in which the Piezo2 gene is specifically “knocked out” of all skin cells, including Merkel cells, but left intact everywhere else it is ordinarily produced.
Piezo2 skin-knockout mice and their Merkel cells appeared normal. The mice also responded normally on most standard tests of touch and pain sensitivity. But on the so-called von Frey test, in which thin, bendable fibers are pressed against the mice’s paws with varying force, the effect of the loss of Piezo2 became apparent. “The mice whose Merkel cells lacked Piezo2 didn’t respond to the gentler forces as much as the control mice did,” said Woo.
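Von Frey testing of this kind is often run as an up-down staircase: step to a weaker filament after a response, to a stronger one after no response, and estimate the threshold from the forces at which the direction reverses. The sketch below is a deliberately simplified variant with a toy subject, not the study's exact protocol; all names and numbers are illustrative.

```python
def staircase_threshold(forces, responds, n_reversals=4, max_trials=50):
    """forces: ascending filament forces (grams).
    responds(force) -> bool: did the animal withdraw its paw?
    Returns the mean force over the first `n_reversals` reversals."""
    i = len(forces) // 2  # start with a mid-range filament
    reversals, prev = [], None
    for _ in range(max_trials):
        r = responds(forces[i])
        if prev is not None and r != prev:
            reversals.append(forces[i])
            if len(reversals) == n_reversals:
                break
        prev = r
        # response -> try a weaker filament; no response -> stronger
        i = max(i - 1, 0) if r else min(i + 1, len(forces) - 1)
    return sum(reversals) / len(reversals) if reversals else None

# Toy subject that withdraws whenever the force exceeds 1.0 g
forces = [0.4, 0.6, 1.0, 1.4, 2.0, 4.0]
print(staircase_threshold(forces, lambda f: f > 1.0))  # ~1.2 g, bracketing 1.0 g
```

A knockout animal less sensitive to gentle forces would, in this scheme, simply come out with a higher estimated threshold than its control.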
Examining this change in responsiveness in more detail, Woo and her colleagues isolated Merkel cells from the two groups of mice. They found that those Merkel cells lacking Piezo2 failed to show the usual current flows when gently pushed with a probe.
Collaborating researchers in the laboratory of Cheryl L. Stucky at the Medical College of Wisconsin showed that gentle touch-sensing nerves known as slowly adapting (SA) Aβ fibers generally responded with a lower frequency of signaling in the mice lacking Piezo2 in Merkel cells. Another collaborating laboratory, led by Ellen A. Lumpkin at Columbia University, showed that Merkel cell-associated nerves also responded less durably to test stimuli on skin in these same mice.
“It all shows that the Merkel cells play an important role in touch sensing and that they need Piezo2 to do so,” Woo said.
The findings were bolstered by a separate study from Lumpkin’s laboratory—of which Patapoutian is a co-author—that is reported in the same issue of Nature. In that study, mice engineered to lack Merkel cells exhibited touch-sensing deficits very similar to those described in the Patapoutian group’s study.
(Image: iStockphoto)
Scientists Identify Key Cells in Touch Sensation
In a study published online today in the journal Nature, a team of Columbia University Medical Center researchers led by Ellen Lumpkin, PhD, associate professor of somatosensory biology, solves an age-old mystery of touch: how cells just beneath the skin surface enable us to feel fine details and textures.
Touch is the last frontier of sensory neuroscience. The cells and molecules that initiate vision—rod and cone cells and light-sensitive receptors—have been known since the early 20th century, and the senses of smell, taste, and hearing are increasingly understood. But almost nothing is known about the cells and molecules responsible for initiating our sense of touch.
This study is the first to use optogenetics—a new method that uses light as a signaling system to turn neurons on and off on demand—on skin cells to determine how they function and communicate.
The team showed that skin cells called Merkel cells can sense touch and that they work virtually hand in glove with the skin’s neurons to create what we perceive as fine details and textures.
“These experiments are the first direct proof that Merkel cells can encode touch into neural signals that transmit information to the brain about the objects in the world around us,” Dr. Lumpkin said.
The findings not only describe a key advance in our understanding of touch sensation, but may stimulate research into loss of sensitive-touch perception.
Several conditions—including diabetes and some cancer chemotherapy treatments, as well as normal aging—are known to reduce sensitive touch. Merkel cells begin to disappear in one’s early 20s, at the same time that tactile acuity starts to decline. “No one has tested whether the loss of Merkel cells causes loss of function with aging—it could be a coincidence—but it’s a question we’re interested in pursuing,” Dr. Lumpkin said.
In the future, these findings could inform the design of new “smart” prosthetics that restore touch sensation to limb amputees, as well as introduce new targets for treating skin diseases such as chronic itch.
The study was published in conjunction with a second study by the team done in collaboration with the Scripps Research Institute. The companion study identifies a touch-activated molecule in skin cells, a gene called Piezo2, whose discovery has the potential to significantly advance the field of touch perception.
“The new findings should open up the field of skin biology and reveal how sensations are initiated,” Dr. Lumpkin said. Other types of skin cells may also play a role in sensations of touch, as well as less pleasurable skin sensations, such as itch. The same optogenetics techniques that Dr. Lumpkin’s team applied to Merkel cells can now be applied to other skin cells to answer these questions.
“It’s an exciting time in our field because there are still big questions to answer, and the tools of modern neuroscience give us a way to tackle them,” she said.
Research in mouse whiskers reveals signal pathway from touch neuron to brain

Human fingertips have several types of sensory neurons that are responsible for relaying touch signals to the central nervous system. Scientists have long believed these neurons followed a linear path to the brain with a “labeled-lines” structure.
But new research on mouse whiskers from Duke University reveals a surprise — at the fine scale, the sensory system’s wiring diagram doesn’t have a set pattern. And it’s probably the case that no two people’s touch sensory systems are wired exactly the same at the detailed level, according to Fan Wang, Ph.D., an associate professor of neurobiology in the Duke Medical School.
The results, which appear online in Cell Reports, highlight a “one-to-many, many-to-one” nerve connectivity strategy. Single neurons send signals to multiple potential secondary neurons, just as signals from many neurons can converge onto a secondary neuron. Many such connections are likely formed by chance, Wang said. This connectivity scheme allows the touch system to have many possible combinations to encode a large repertoire of textures and forms.
"We take our sense of touch for granted," Wang said. "When you speak, you are not aware of the constant tactile feedback from your tongue and teeth. Without such feedback, you won’t be able to say the words correctly. When you write with a pen, you’re mostly unaware of the sensors telling you how to move it."
It’s not feasible to visualize the touch pathways in the human brain at high resolutions. So, Wang and her collaborators from the University of Tsukuba in Japan and the Friedrich Miescher Institute for Biomedical Research in Switzerland used the whiskers of laboratory mice to map how distinct sensor neurons, presumably detecting different mechanical stimuli, are connected to signal the brain. When the sensory neurons are activated, they send the signal along an axon — a long, slender nerve fiber that conducts electric impulses to the brain. The researchers traced signals running the long path from the mouse’s whiskers to the brain.
Wang’s group used a combination of genetic engineering and fluorescent tags delivered by viruses to color-code four different kinds of neurons and map their connections.
Earlier work by Wang and others had found that all of the 100 to 200 sensors associated with a single whisker project their axons to a large structure representing that whisker in the brain. Each whisker has its own neural representation structure.
"People have thought that within the large whisker-representing structure, there will be finer-scale, labeled lines," Wang said. "In other words, different touch sensors would send information through separate parallel pathways, into that large structure. But surprisingly, we did not find such organized pathways. Instead, we found a completely unorganized mosaic pattern of connections within the large structure. Information from different sensors is intermixed already at the first relay station inside the brain."
Wang said the next step will be to stimulate the labeled circuits in different ways to see how impulses travel the network.
"We want to figure out the exact functions and signals transmitted by different sensors during natural tactile behaviors and determine their exact roles on the perception of textures," she said.
(Source: today.duke.edu)
A Blueprint for Restoring Touch with a Prosthetic Hand
New research at the University of Chicago is laying the groundwork for touch-sensitive prosthetic limbs that one day could convey real-time sensory information to amputees via a direct interface with the brain.
The research, published early online in the Proceedings of the National Academy of Sciences, marks an important step toward new technology that, if implemented successfully, would increase the dexterity and clinical viability of robotic prosthetic limbs.
“To restore sensory motor function of an arm, you not only have to replace the motor signals that the brain sends to the arm to move it around, but you also have to replace the sensory signals that the arm sends back to the brain,” said the study’s senior author, Sliman Bensmaia, PhD, assistant professor in the Department of Organismal Biology and Anatomy at the University of Chicago. “We think the key is to invoke what we know about how the brain of the intact organism processes sensory information, and then try to reproduce these patterns of neural activity through stimulation of the brain.”
Bensmaia’s research is part of Revolutionizing Prosthetics, a multi-year Defense Advanced Research Projects Agency (DARPA) project that seeks to create a modular, artificial upper limb that will restore natural motor control and sensation in amputees. Managed by the Johns Hopkins University Applied Physics Laboratory, the project has brought together an interdisciplinary team of experts from academic institutions, government agencies and private companies.
Bensmaia and his colleagues at the University of Chicago are working specifically on the sensory aspects of these limbs. In a series of experiments with monkeys, whose sensory systems closely resemble those of humans, they identified patterns of neural activity that occur during natural object manipulation and then successfully induced these patterns through artificial means.
The first set of experiments focused on contact location, or sensing where the skin has been touched. The animals were trained to identify several patterns of physical contact with their fingers. Researchers then connected electrodes to areas of the brain corresponding to each finger and replaced physical touches with electrical stimuli delivered to the appropriate areas of the brain. The result: The animals responded the same way to artificial stimulation as they did to physical contact.
Next the researchers focused on the sensation of pressure. In this case, they developed an algorithm to generate the appropriate amount of electrical current to elicit a sensation of pressure. Again, the animals’ response was the same whether the stimuli were felt through their fingers or through artificial means.
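In spirit, such an algorithm maps a measured pressure onto a stimulation amplitude. The sketch below is purely hypothetical: the linear transfer function, the units and every constant are illustrative assumptions, not values from the study.

```python
def pressure_to_current_uA(pressure_kpa, threshold_uA=20.0,
                           gain_uA_per_kpa=1.5, max_uA=100.0):
    """Map a fingertip pressure reading (kPa) to an electrical
    stimulation amplitude (microamps), clamped to a safe ceiling.
    All constants here are made-up placeholders."""
    if pressure_kpa <= 0:
        return 0.0  # no contact, no stimulation
    # linear transfer: a detection-threshold offset plus a gain term
    return min(threshold_uA + gain_uA_per_kpa * pressure_kpa, max_uA)

for p in (0, 10, 40, 200):
    print(p, "kPa ->", pressure_to_current_uA(p), "uA")
```

The clamp at the top of the range reflects the kind of safety limit any real stimulator would impose, while the threshold offset ensures that the weakest delivered pulse is still perceptible.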
Finally, Bensmaia and his colleagues studied the sensation of contact events. When the hand first touches or releases an object, it produces a burst of activity in the brain. Again, the researchers established that these bursts of brain activity can be mimicked through electrical stimulation.
The result of these experiments is a set of instructions that can be incorporated into a robotic prosthetic arm to provide sensory feedback to the brain through a neural interface. Bensmaia believes such feedback will bring these devices closer to being tested in human clinical trials.
“The algorithms to decipher motor signals have come quite a long way, where you can now control arms with seven degrees of freedom. It’s very sophisticated. But I think there’s a strong argument to be made that they will not be clinically viable until the sensory feedback is incorporated,” Bensmaia said. “When it is, the functionality of these limbs will increase substantially.”
Babies learn how to anticipate touch while in the womb, according to new research.

Using 4-D scans, psychologists at Durham and Lancaster universities found, for the first time, that fetuses in the later stages of gestation were able to predict, rather than simply react to, their own hand movements towards their mouths.
The Durham-led team of researchers said that the latest findings could improve understanding about babies, especially those born prematurely, their readiness to interact socially and their ability to calm themselves by sucking on their thumb or fingers.
They said the results could also be a potential indicator of how prepared babies are for feeding.
The researchers carried out a total of 60 scans of 15 healthy fetuses at monthly intervals between 24 and 36 weeks of gestation.
Fetuses in the earlier stage of gestation more frequently touched the upper part and sides of their heads.
As the fetuses matured they began to increasingly touch the lower, more sensitive, part of their faces including their mouths.
By 36 weeks a significantly higher proportion of fetuses were observed opening their mouths before touching them, suggesting that later in pregnancy they were able to anticipate that their hands were about to touch their mouths, rather than reacting to the touch of their hands, the researchers said.
Increased sensitivity around a fetus’ mouth at this later stage of pregnancy could mean that they have more “awareness” of mouth movement, they added.
Previous theories have suggested that movement in sequence could form the basis for the development of intention in fetuses.
The researchers said their findings could potentially be an indicator of healthy development, as arguably fetuses who are delayed in this development due to illness, such as growth restriction, might not show the same behaviour observed during the study.
The research, published in the journal Developmental Psychobiology, involved eight girls and seven boys and the researchers noticed no difference in behaviour between boys and girls.
Lead author Dr Nadja Reissland, in the Department of Psychology, at Durham University, said: “Increased touching of the lower part of the face and mouth in fetuses could be an indicator of brain development necessary for healthy development, including preparedness for social interaction, self-soothing and feeding.
“What we have observed are sequential events, which show maturation in the development of fetuses, which is the basis for life after birth.
“The findings could provide more information about when babies are ready to engage with their environment, especially if born prematurely.”
Brian Francis, Professor of Social Statistics at Lancaster, added: “This effect is likely to be evolutionally determined, preparing the child for life outside the womb. Building on these findings, future research could lead to more understanding about how the child is prepared prenatally for life, including their ability to engage with their social environment, regulate stimulation and being ready to take a breast or bottle.”
The study builds on previous research by Durham and Lancaster into fetal development. Earlier this year another of their studies showed that unborn babies practise facial expressions in the womb in what is thought to be preparation for communicating after birth.
And in 2012 Dr Reissland published research showing that unborn babies yawn in the womb, suggesting that yawning is a developmental process which could potentially give doctors another index of a fetus’ health.
The Star-Nosed Mole Reveals Clues to the Molecular Basis of Mammalian Touch
Little is known about the molecular mechanisms underlying mammalian touch transduction. To identify novel candidate transducers, we examined the molecular and cellular basis of touch in one of the most sensitive tactile organs in the animal kingdom, the star of the star-nosed mole. Our findings demonstrate that the trigeminal ganglia innervating the star are enriched in tactile-sensitive neurons, resulting in a higher proportion of light touch fibers and lower proportion of nociceptors compared to the dorsal root ganglia innervating the rest of the body. We exploit this difference using transcriptome analysis of the star-nosed mole sensory ganglia to identify novel candidate mammalian touch and pain transducers. The most enriched candidates are also expressed in mouse somatosensory ganglia, suggesting they may mediate transduction in diverse species and are not unique to moles. These findings highlight the utility of examining diverse and specialized species to address fundamental questions in mammalian biology.
Honey bees trained to stick out their tongues for science
Biologists at Bielefeld University have trained honey bees to stick out their tongues when their antennae touch an object.
The tactile conditioning study was conducted by a team from the lab of Volker Dürr, professor for biological cybernetics at Bielefeld, and will allow researchers to investigate how the honey bees use touch in pattern recognition and sense memory.
"We work with honey bees because they are an important model system for behavioural biology and neurobiology," explained Dürr. "They can be trained. If you can train an insect to respond to a certain stimulus, then you can ask the bees questions in the form of ‘Is A like B? If so, stick your tongue out’."
The process by which a bee sticks out its tongue when faced with a stimulus is known as the proboscis extension response. It can be conditioned in the bees as a response to a particular textured surface using sugar water. Each time a harnessed honey bee’s antennae touched the surface, the bee was given sugar water. Eventually, the bee extended its tongue whenever it touched the right surface.
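The pairing schedule described above is classical conditioning, and its trial-by-trial course can be sketched with the textbook Rescorla-Wagner rule, in which the associative strength of the textured surface grows with each touch that is paired with sugar water. The model and parameters below are a generic illustration, not the lab's analysis.

```python
def rescorla_wagner(n_trials, alpha=0.3, reward=1.0):
    """Return the associative strength after each paired trial.
    alpha is the learning rate; reward is the sugar-water outcome."""
    v, history = 0.0, []
    for _ in range(n_trials):
        v += alpha * (reward - v)  # prediction error drives learning
        history.append(v)
    return history

strengths = rescorla_wagner(10)
print(round(strengths[0], 3), round(strengths[-1], 3))  # rises toward 1.0
```

Once the strength approaches its asymptote, the conditioned surface reliably triggers the proboscis extension response, which is what makes the trained bees usable as yes/no probes in the discrimination experiments.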
Currently the biologists are hoping to use the response to find out more about how bees use antennae movements to gather information about their surroundings.
"It is clear that if a bee touches a finely textured structure with an antenna, the bee has to move the antenna to get the information it wants," adds Dürr. "We don’t fully understand the relevance of this movement."