Posts tagged perception
Would you believe your hand could turn into marble?
Bielefeld neuroscientists present a new bodily illusion
Our bodies are made of flesh and bones. We all know this, and throughout our daily lives, all our senses constantly provide converging information about this simple, factual truth. But is this always the case? A new study by Irene Senna from Bielefeld University’s Center of Excellence CITEC and her colleagues reports a surprising bodily illusion demonstrating how we can rapidly update our assumptions about the material qualities of our bodies based on recent multisensory perceptual experience. The study was published in the international scientific journal PLOS ONE on 13 March 2014.
To induce an illusory perception of the material properties of the hand, a group of neuroscientists from Bielefeld University, the Max-Planck Institute for Biological Cybernetics (Germany), and the University of Milano-Bicocca (Italy) asked volunteers to sit with their hands lying on a table in front of them. They repeatedly hit the participants’ right hands gently with a small hammer while replacing the natural sound of the hammer against the skin with the sound of a hammer hitting a piece of marble. Within minutes, hands started feeling stiffer, heavier, harder, less sensitive, and unnatural. Moreover, when approached by a threatening stimulus (a needle that the experimenter moved near their hands), participants showed an enhanced Galvanic skin response, thus demonstrating increased physiological arousal.
To perceive our bodies and the world around us, our brains constantly combine information from different senses with prior knowledge retrieved from memory. However, unlike most bodily properties that frequently change over time (such as posture and position), our body material never changes. Hence, in principle, it would be unnecessary for the brain to constantly try to infer it.
This novel bodily illusion, the ‘Marble-Hand Illusion’, demonstrates that the perceived material of our body, surely the most stable attribute of our bodily self, can quickly be updated through multisensory integration. What is more, it shows that even impact sounds of non-biological materials – such as marble and metal – can be attributed consistently to the body, as if its core material could indeed be modified. This surprising perceptual plasticity might help to explain why tools and prostheses can merge so easily into our body schemas despite being made of non-biological materials.
Why Do Our Brains Sometimes Mess Up Simple Calculations?
If the human brain is comparable to a computer, why does it so often make mistakes that its electronic counterpart does not? New research suggests it all has to do with how various problems are presented.
Scientists like to make this comparison because both the human brain and a computer follow sets of rules to make decisions, communicate, and perform other tasks. However, University of Wisconsin-Madison cognitive scientist and psychology professor Gary Lupyan said people can get tripped up on even the simplest logic problems because they get caught up in contextual information.
For example, even a simple challenge like determining whether a number is odd or even can be tricky under the right circumstances. Lupyan said that a significant minority of people, even well-educated ones, can mistake a number such as 798 for an odd number, because even though deep down we know that only the last digit determines whether a number is even or odd, we can be fooled by the presence of two odd digits.
“Most of us would attribute an error like that to carelessness, or not paying attention, but some errors may appear more often because our brains are not as well equipped to solve purely rule-based problems,” the professor, whose work appears in a recent edition of the journal Cognition, explained in a statement Friday.
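The rule in question is trivial to state as code, which is one way to see why a computer never makes this particular mistake (a minimal sketch; the function name is ours, not from the study):

```python
def is_even(n: int) -> bool:
    """Apply the rule exactly: only the last decimal digit matters."""
    last_digit = int(str(abs(n))[-1])
    return last_digit % 2 == 0

# 798 ends in 8, so it is even -- the two odd digits (7 and 9)
# that mislead human readers are simply never consulted.
print(is_even(798))  # True
print(is_even(13))   # False
```

Unlike a human reader, the function cannot be distracted by the digits it never inspects, which is exactly the kind of pure rule-following Lupyan contrasts with human cognition.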
In multiple trials involving such tasks as sorting numbers, shapes and even people into easy categories like evens, triangles and grandmothers, Lupyan found study participants often broke simple rules based on context.
For instance, when asked to consider a contest that was open only to grandmothers, in which each eligible individual had an equal chance of winning, the subjects believed a 68-year-old woman with six grandchildren was more likely to emerge victorious than a 39-year-old woman with a single newborn grandchild.
“Even though people can articulate the rules, they can’t help but be influenced by perceptual details,” he explained. “Thinking of triangles tends to involve thinking of typical, equilateral sorts of triangles. It is difficult to focus on just the rules that make a shape a triangle, regardless of what it looks like exactly.”
Lupyan said that in many cases, overlooking these types of rules is not especially detrimental; in fact, doing so can be beneficial when it comes to evaluating unfamiliar things. The lone exception, he said, is mathematics, where rules are unequivocally necessary in order to achieve a successful outcome.
“After all, although some people may mistakenly think that 798 is an odd number, not only can people follow such rules – though not always perfectly – we are capable of building computers that can execute such rules perfectly,” Lupyan said. “That itself required very precise, mathematical cognition. A big question is where this ability comes from and why some people are better at formal rules than other people.”
He added this issue could be especially important to math and science teachers: “Students approach learning with biases shaped both by evolution and day-to-day experience. Rather than treating errors as reflecting lack of knowledge or as inattention, trying to understand their source may lead to new ways of teaching rule-based systems while making use of the flexibility and creative problem solving at which humans excel.”

A difference at the smallest level of DNA — one amino acid on one gene — can determine whether you find a given smell pleasant. A different amino acid on the same gene in your friend’s body could mean he finds the same odor offensive, according to researchers at Duke University.
There are about 400 genes coding for the receptors in our noses, and according to the 1000 Genomes Project, there are more than 900,000 variations of those genes. These receptors control the sensors that determine how we smell odors. A given odor will activate a suite of receptors in the nose, creating a specific signal for the brain.
But the receptors don’t work the same for all of us, said Hiroaki Matsunami, Ph.D., associate professor of molecular genetics and microbiology at the Duke University School of Medicine. In fact, the receptors of any two people will differ by about 30 percent, said Matsunami, who is also a member of the Neurobiology Graduate Program and the Duke Institute for Brain Sciences.
"There are many cases when you say you like the way something smells and other people don’t. That’s very common," Matsunami said. But what the researchers found is that no two people smell things the same way. "We found that individuals can be very different at the receptor levels, meaning that when we smell something, the receptors that are activated can be very different (from one person to the next) depending on your genome."
The study didn’t look at the promoter regions of the genes, which are highly variable, or gene copy number variation, which is very high in odor receptors, so the 30 percent figure for the difference between individuals is probably conservative, Matsunami said.
While researchers had earlier identified the genes that encode odor receptors, it has been a mystery how the receptors are activated, Matsunami said. To determine what turns the receptors on, his team cloned more than 500 receptors from 20 people, each receptor variant differing by only one or two amino acids, and systematically exposed them to odor molecules that might excite them.
By exposing each receptor to a very low concentration — 1, 10, or 100 micromolar — of 73 odorants, such as vanillin or guaiacol, the group was able to identify 27 receptors that had a significant response to at least one odorant. This finding, published in the December issue of Nature Neuroscience, doubles the number of known odorant-activated receptors, bringing the total to 40.
Matsunami said this research could have a big impact on the flavor, fragrance, and food industries.
"These manufacturers all want to know a rational way to produce new chemicals of interest, whether it’s a new perfume or new-flavored ingredient, and right now there’s no scientific basis for doing that," he said. "To do that, we need to know which receptors are being activated by certain chemicals and the consequences of those activations in terms of how we feel and smell."
Monkeys “understand” rules underlying language musicality
Many of us have mixed feelings when remembering painful lessons in German or Latin grammar in school. Languages feature a large number of complex rules and patterns: using them correctly makes the difference between something which “sounds good”, and something which does not. However, cognitive biologists at the University of Vienna have shown that sensitivity to very simple structural and melodic patterns does not require much learning, or even being human: South American squirrel monkeys can do it, too.
Language and music are structured systems, featuring particular relationships between syllables, words and musical notes. For instance, implicit knowledge of the musical and grammatical patterns of our language makes us notice right away whether a speaker is native or not. Similarly, the perceived musicality of some languages results from dependency relations between vowels within a word. In Turkish, for example, the last syllable in words like “kaplanlar” or “güller” must “harmonize” with the previous vowels. (Try it yourself: “güllar” requires more movement and does not sound as good as “güller”.)
Similar “dependencies” between words, syllables or musical notes can be found in languages and musical cultures around the world. The biological question is whether the ability to process dependencies evolved in human cognition along with human language, or is rather a more general skill, also present in other animal species who lack language.
Andrea Ravignani, a PhD candidate at the Department of Cognitive Biology at the University of Vienna, and his colleagues looked for this “dependency detection” ability in squirrel monkeys, small arboreal primates living in Central and South America. Inspired by the monkeys’ natural calls and hearing predispositions, the researchers designed a sort of “musical system” for monkeys. These “musical patterns” had overall acoustic features similar to monkeys’ calls, while their structural features mimicked syntactic or phonological patterns like those found in Turkish and many human languages.
Monkeys were first presented with “phrases” containing structural dependencies, and later tested using stimuli either with or without dependencies. Their reactions were measured using the “violation of expectations” paradigm. “Show up at work in your pyjamas, people will turn around and stare at you, while at a slumber party nobody will notice”, explains Ravignani: In other words, one looks longer at something that breaks the “standard” pattern. “This is not about absolute perception, rather how something is categorized and contrasted within a broader system.” Using this paradigm, the scientists found that monkeys reacted more to the “ungrammatical” patterns, demonstrating perception of dependencies. “This kind of experiment is usually done by presenting monkeys with human speech: Designing species-specific, music-like stimuli may have helped the squirrel monkeys’ perception”, argues primatologist and co-author Ruth Sonnweber.
"Our ancestors may have already acquired this simple dependency-detection ability some 30 million years ago, and modern humans would thus share it with many other living primates. Mastering basic phonological patterns and syntactic rules is not an issue for squirrel monkeys: the bar for human uniqueness has to be raised", says Ravignani: "This is only a tiny step: we will keep working hard to unveil the evolutionary origins and potential connections between language and music".
Carbonation Alters the Mind’s Perception of Sweetness
Carbonation, an essential component of popular soft drinks, alters the brain’s perception of sweetness and makes it difficult for the brain to determine the difference between sugar and artificial sweeteners, according to a new article in Gastroenterology, the official journal of the American Gastroenterological Association.
"This study proves that the right combination of carbonation and artificial sweeteners can leave the sweet taste of diet drinks indistinguishable from normal drinks," said study author, Rosario Cuomo, associate professor, gastroenterology, department of clinical medicine and surgery, "Federico II" University, Naples, Italy. "Tricking the brain about the type of sweet could be advantageous to weight loss — it facilitates the consumption of low-calorie drinks because their taste is perceived as pleasant as the sugary, calorie-laden drink."
The study identifies, however, that there is a downside to this effect; the combination of carbonation and sugar may stimulate increased sugar and food consumption since the brain perceives less sugar intake and energy balance is impaired. This interpretation might better explain the prevalence of eating disorders, metabolic diseases and obesity among diet-soda drinkers.
Investigators used functional magnetic resonance imaging to monitor changes in regional brain activity in response to naturally or artificially sweetened carbonated beverages. The findings were a result of the integration of information on gastric fullness and on nutrient depletion conveyed to the brain.
Future studies combining analysis of carbonation effect on sweetness detection in taste buds and responses elicited by the carbonated sweetened beverages in the gastrointestinal cavity will be required to further clarify the puzzling link between reduced calorie intake with diet drinks and increased incidence of obesity and metabolic diseases.
Variation in bitter receptor mRNA expression affects taste perception
Do you love chomping on raw broccoli while your best friend can’t stand the healthy veggie in any form or guise? Part of the reason may be your genes, particularly your bitter taste genes.
Over the past decade, scientists at the Monell Center and elsewhere have made headway in understanding how variants of bitter taste receptor genes can help account for how people differ with regard to taste perception and food choice.
However, some perplexing pieces of the puzzle remained, as two people with exactly the same genetic makeup can still differ markedly regarding how bitter certain foods and liquids taste to them.
Now, findings from Monell reveal that a person’s sensitivity to bitter taste is shaped not only by which taste genes that person has, but also by how much messenger RNA – the gene’s instruction guide that tells a taste cell to build a specific receptor – their cells make.
Under normal circumstances, people whose taste receptor cells make more messenger RNA (mRNA) for a given gene make more of the encoded receptor.
The findings add a new level of complexity to our understanding of the cellular mechanisms of taste perception, which may ultimately lend insight into individual differences in food preferences and dietary choices.
"The amount of messenger RNA that taste cells choose to make may be the missing link in explaining why some people with ‘moderate-taster’ genes still are extremely sensitive to bitterness in foods and drinks," said Monell taste geneticist Danielle Reed, PhD, who is an author on the study.
In the study, reported online in the American Journal of Clinical Nutrition, small biopsies of papillae – the little bumps on the tongue that contain taste receptors – were taken from 18 people known to have the same moderate-taster (heterozygous) genotype for the TAS2R38 bitter taste receptor, and the amount of mRNA expression for this gene was measured.
Before the biopsy, people rated the intensity of various bitter and non-bitter solutions, including broccoli juice. Even though the subjects had the same ‘middle-of-the road’ genotype, their responses to some of the bitter substances varied over four orders of magnitude. Analyses revealed a direct relationship between mRNA expression and bitterness ratings of broccoli juice, with subjects having the most mRNA rating the juice as most bitter.
"The next step involves learning more about what causes these individual differences in mRNA expression; does diet drive expression or is it the reverse? And, can differences in expression explain why children are more sensitive to bitter than adults with the same genotype?" said co-author Julie Mennella, PhD, a developmental psychobiologist at Monell.
It is important for robot designers to know how to make robots that interact effectively with humans. One key dimension is robot appearance, in particular how humanlike the robot should be. Uncanny Valley theory suggests that robots look uncanny when their appearance approaches, but does not quite reach, that of a human. An underlying mechanism may be that appearance affects users’ perceptions of the robot’s personality and mind. This study aimed to investigate how robot facial appearance affected perceptions of the robot’s mind, personality and eeriness. A repeated measures experiment was conducted. Thirty participants (14 female and 16 male, mean age 22.5 years) interacted with a Peoplebot healthcare robot under three conditions in a randomized order: the robot had either a humanlike face, a silver face, or no face on its display screen. Each time, the robot assisted the participant to take his/her blood pressure. Participants rated the robot’s mind, personality, and eeriness in each condition. The robot with the humanlike face display was most preferred, rated as having the most mind and as being most humanlike, alive, sociable and amiable. The robot with the silver face display was least preferred, rated most eerie, and moderate in mind, humanlikeness and amiability. The robot with the no-face display was rated least sociable and amiable. There was no difference in blood pressure readings between the robots with different face displays. Higher ratings of eeriness were related to impressions of the robot with the humanlike face display being less amiable, less sociable and less trustworthy. These results suggest that the more humanlike a healthcare robot’s face display is, the more people attribute mind and positive personality characteristics to it. Eeriness was related to negative impressions of the robot’s personality. Designers should be aware that the face on a robot’s display screen can affect both the perceived mind and personality of the robot.
It is natural to imagine that the sense of sight takes in the world as it is — simply passing on what the eyes collect from light reflected by the objects around us.
But the eyes do not work alone. What we see is a function not only of incoming visual information, but also how that information is interpreted in light of other visual experiences, and may even be influenced by language.
Words can play a powerful role in what we see, according to a study published this month by UW-Madison cognitive scientist and psychology professor Gary Lupyan, and Emily Ward, a Yale University graduate student, in the journal Proceedings of the National Academy of Sciences.
"Perceptual systems do the best they can with inherently ambiguous inputs by putting them in context of what we know, what we expect," Lupyan says. "Studies like this are helping us show that language is a powerful tool for shaping perceptual systems, acting as a top-down signal to perceptual processes. In the case of vision, what we consciously perceive seems to be deeply shaped by our knowledge and expectations."
And those expectations can be altered with a single word.
To show how deeply words can influence perception, Lupyan and Ward used a technique called continuous flash suppression to render a series of objects invisible for a group of volunteers.
Each person was shown a picture of a familiar object — such as a chair, a pumpkin or a kangaroo — in one eye. At the same time, their other eye saw a series of flashing, “squiggly” lines.
"Essentially, it’s visual noise," Lupyan says. "Because the noise patterns are high-contrast and constantly moving, they dominate, and the input from the other eye is suppressed."
Immediately before looking at the combination of the flashing lines and suppressed object, the study participants heard one of three things: the word for the suppressed object (“pumpkin,” when the object was a pumpkin), the word for a different object (“kangaroo,” when the object was actually a pumpkin), or just static.
Then researchers asked the participants to indicate whether they saw something or not. When the word they heard matched the object that was being wiped out by the visual noise, the subjects were more likely to report that they did indeed see something than in cases where the wrong word or no word at all was paired with the image.
"Hearing the word for the object that was being suppressed boosted that object into their vision," Lupyan says.
And hearing an unmatched word actually hurt study subjects’ chances of seeing an object.
"With the label, you’re expecting pumpkin-shaped things," Lupyan says. "When you get a visual input consistent with that expectation, it boosts it into perception. When you get an incorrect label, it further suppresses that."
Experiments have shown that continuous flash suppression interrupts sight so thoroughly that there are no signals in the brain to suggest the invisible objects are perceived, even implicitly.
"Unless they can tell us they saw it, there’s nothing to suggest the brain was taking it in at all," Lupyan says. "If language affects performance on a test like this, it indicates that language is influencing vision at a pretty early stage. It’s getting really deep into the visual system."
The study demonstrates a deeper connection between language and simple sensory perception than previously thought, and one that makes Lupyan wonder about the extent of language’s power. The influence of language may extend to other senses as well.
"A lot of previous work has focused on vision, and we have neglected to examine the role of knowledge and expectations on other modalities, especially smell and taste," Lupyan says. "What I want to see is whether we can really alter threshold abilities," he says. "Does expecting a particular taste for example, allow you to detect a substance at a lower concentration?"
If you’re drinking a glass of milk, but thinking about orange juice, he says, that may change the way you experience the milk.
"There’s no point in figuring out what some objective taste is," Lupyan says. "What’s important is whether the milk is spoiled or not. If you expect it to be orange juice, and it tastes like orange juice, it’s fine. But if you expected it to be milk, you’d think something was wrong."
(Source: news.wisc.edu)
Sex, Smell And Science – The Genetics Of Olfaction
No two people smell exactly alike. That is, noses sense odors in individual ways. What one nose finds offensive, another may find pleasant, while another might not smell anything at all. Scientists have long known the way things smell to us is determined by our genes.
Now, two studies appearing in the journal Current Biology (1, 2) have identified “the genetic differences that underpin the differences in smell sensitivity and perception in different individuals.” And while some of these differences merely help determine our culinary preferences, others appear to play a subconscious role in how we choose our sexual partners.
For the first study, 200 people were tested to determine their sensitivity to 10 different chemical compounds commonly found in foods. The researchers found four of the ten odors had a genetic association. These were malt, apple, blue cheese, and a floral scent associated with violets.
The research team, led by Sara Jaeger, Jeremy McRae, and Richard Newcomb of Plant and Food Research in New Zealand, used a genome-wide association study. Their first task was to identify which test subjects could smell each chemical compound and which could not. They then searched the subjects’ genomes for areas of DNA that differed between these people.
“We were surprised how many odors had genes associated with them. If this extends to other odors, then we might expect everyone to have their own unique set of smells that they are sensitive to,” explained McRae.
“These smells are found in foods and drinks that people encounter every day, such as tomatoes and apples. This might mean that when people sit down to eat a meal, they each experience it in their own personalized way.”
They further found there is no regional differentiation. A person in one part of the world is just as likely to be able to smell a particular compound as a person in another part of the world. In addition, sensitivity to one compound does not predict the ability to smell another compound.
The genes that determine our ability to perceive certain odors all lie in or near the genes that encode olfactory receptors. These receptors occur on the surface of sensory nerve cells in the upper part of the nose. A particular smell is perceived when these receptor molecules bind with a chemical compound wafting through the nose, causing nerve cells to send an impulse to the brain and producing our sensation of smell.
For the violet smell, caused by a naturally occurring chemical compound known as β-ionone, the researchers were able to pinpoint the exact mutation in gene OR5A1 that determines whether the smell is perceived as floral, sour or pungent, and whether it is found to be pleasant.
These findings might have future marketing value. According to Richard Newcomb, “Knowing the compounds that people can sense in foods, as well as other products, will have an influence on the development of future products. Companies may wish to design foods that better target people based on their sensitivity, essentially developing foods and other products personalized for their taste and smell.”
SEXY OR STINKY?
A separate study was conducted by Leslie Vosshall of the Rockefeller University Hospital. Humans have about 1,000 genes that influence smell, of which around 400 encode functional receptors, each responsible for sensing particular odor molecules.
Testing 391 human subjects, Vosshall studied olfactory responses to two closely related steroids, androstenone and androstadienone, which are found in male sweat. People generally have strong reactions to these steroids, finding them either sweet and floral or rank and noxious. The gene OR7D4 determines the intensity of these odors as well as the perception of them as either pleasant or repulsive.
According to Vosshall’s report: “People who found the smell repulsive were more likely to have two functional copies of OR7D4; those who perceived it as a more mild smell tended to have one or two impaired copies of the gene.”
This study is part of the larger goal of understanding how genetic and neuronal factors influence behaviors.
A 2002 study published in Nature Genetics provided more insight into the effect of male pheromones on women. This study looked at the link between women’s preferences for the odors given off by men and a group of genes called the Major Histocompatibility Complex (MHC), which contribute to a person’s immune response.
In this experiment, a group of 49 women were asked to smell 10 boxes. Some of the boxes held t-shirts worn by men with different MHC genes, and others contained familiar household odors such as bleach or cloves.
The t-shirts were worn by men who slept in them for two nights and avoided contact with other scents during that time, even to the point of avoiding other people. According to the report, “the women were then asked to rate each scent based on their familiarity, intensity, pleasantness and spiciness, as well as choose the one odor which they would choose if they had to smell it all the time.”
What the researchers found was the women did not choose the scents of men whose genes were similar to their own, nor did they choose those whose genes were too dissimilar. The women showed no preference for odors from men who had the same genes as their mothers, but did show a preference for odors from men who shared genes they inherited from their fathers.
Scientists believe there are two reasons for preferring a mate whose MHC genes are different from one’s own. One is that it would tend to create offspring with more genetic diversity and thus more robust immune systems. The other is that it helps to avoid inbreeding.
Of course, when people choose their mates, there are a number of social factors that come into play as well. However, studies have shown married people tend to have different types of genes than their spouses.
So, the next time you like the way a person smells, keep in mind it may mean you have complementary genes.
Researchers Uncover Cellular Mechanisms for Attention in the Brain
The ability to pay attention to relevant information while ignoring distractions is a core brain function. Without the ability to focus and filter out “noise,” we could not effectively interact with our environment. Despite much study of attention in the brain, the cellular mechanisms responsible for the effects of attention have remained a mystery… until now.
In a study appearing in the journal Nature, researchers from Dartmouth’s Geisel School of Medicine and the University of California Davis studied communications between synaptically connected neurons under conditions where subjects shifted their attention toward or away from visual stimuli that activated the recorded neurons. Using this highly sensitive measure of attention’s influence on neuron-to-neuron communication, they were able to demonstrate that attention operates at the level of the synapse to improve sensitivity to incoming signals, sharpen the precision of these signals, and selectively boost the transmission of attention-grabbing information while reducing the level of noisy or attention-disrupting information.
The results point to a novel mechanism by which attention shapes perception by selectively altering presynaptic weights to highlight sensory features among all the noisy sensory input.
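The idea of attention acting on presynaptic weights can be caricatured with a toy computation (our illustration, not the authors' model): a postsynaptic response modeled as a weighted sum of inputs, where attention multiplies the weights of attended inputs and shrinks the rest, boosting signal relative to noise.

```python
# Toy sketch (ours, not the study's model): attention as a multiplicative
# change in presynaptic weights. Attended inputs are amplified, while
# unattended (noisy) inputs are suppressed.
def postsynaptic_drive(inputs, weights, attended, gain=1.5, suppression=0.5):
    total = 0.0
    for i, (x, w) in enumerate(zip(inputs, weights)):
        factor = gain if i in attended else suppression
        total += factor * w * x  # scale each synapse by its attention factor
    return total

inputs = [1.0, 1.0, 1.0]   # three presynaptic neurons firing equally
weights = [0.6, 0.3, 0.3]  # baseline synaptic weights

# Attending to input 0 boosts its contribution relative to the others.
print(postsynaptic_drive(inputs, weights, attended={0}))    # 1.2
print(postsynaptic_drive(inputs, weights, attended=set()))  # 0.6
```

The point of the caricature is only that the same spikes arriving at the same synapses can produce a larger or smaller postsynaptic effect depending on where attention is directed, which is the synaptic-level claim the study makes.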
"While our findings are consistent with other reported changes in neuronal firing rates with attention, they go far beyond such descriptions, revealing never-before tested mechanisms at the synaptic level," said study co-author Farran Briggs, PhD, assistant professor of Physiology and Neurobiology at the Geisel School of Medicine.
In addition to expanding our understanding of the brain, this study could help people with attention deficits resulting from brain injury or disease, possibly leading to improved screening and new treatments.