Posts tagged adaptation

Neuroscientists discover adaptation mechanisms of the brain when perceiving letters of the alphabet
The headlights become two eyes, the radiator grille a smiling mouth: this is how our brain sometimes creates a face out of the front of a car. The same happens with other objects: in house facades, trees or stones, a “human face” can often be detected as well. Prof. Dr. Gyula Kovács from Friedrich Schiller University Jena (Germany) knows the reason why. “Faces are of tremendous importance for human beings,” the neuroscientist explains. That is why, in the course of evolution, our visual perception has specialized in the recognition of faces in particular. “This sometimes even goes so far that we recognize faces where there are none at all.”
Until now, researchers assumed that this phenomenon was an exception that applied only to faces. But, as Prof. Kovács and his colleague Mareike Grotheer show in a new study, these distinct adaptation mechanisms are not restricted to the perception of faces. In The Journal of Neuroscience, the Jena researchers demonstrate that the effect can also occur in the perception of letters.

Listen to this: Research upends understanding of how humans perceive sound
A key piece of the scientific model used for the past 30 years to help explain how humans perceive sound is wrong, according to a new study by researchers at the Stanford University School of Medicine.
The long-held theory helped to explain a part of the hearing process called “adaptation,” or how humans can hear everything from the drop of a pin to a jet engine blast with high acuity, without pain or damage to the ear. Its overturning could have significant impact on future research for treating hearing loss, said Anthony Ricci, PhD, the Edward C. and Amy H. Sewall Professor of Otolaryngology and senior author of the study.
“I would argue that adaptation is probably the most important step in the hearing process, and this study shows we have no idea how it works,” Ricci said. “Hearing damage caused by noise and by aging can target this particular molecular process. We need to know how it works if we are going to be able to fix it.”
The study was published Nov. 20 in Neuron. The lead author is postdoctoral scholar Anthony Peng, PhD.
Deep inside the ear, specialized cells called hair cells detect vibrations caused by air pressure differences and convert them into electrochemical signals that the brain interprets as sound. Adaptation is the part of this process that enables these sensory hair cells to regulate the decibel range over which they operate. The process helps protect the ear against sounds that are too loud by adjusting the ears’ sensitivity to match the noise level of the environment.
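The gain-control idea in this paragraph can be pictured as a simple feedback loop. The sketch below is a toy model, not the authors' model or code: it treats adaptation as an operating point that slowly tracks the ambient sound level, so the hair cell keeps responding to changes around that level instead of saturating. The function name, rate constant, and decibel values are all illustrative assumptions.

```python
# Toy sketch of automatic gain control as an analogy for hair-cell
# adaptation: the "operating point" drifts toward the ambient sound
# level, re-centring the cell's limited response range.

def adapt(levels_db, rate=0.1, start=40.0):
    """Return the operating point (in dB) after each input level."""
    op = start
    trace = []
    for level in levels_db:
        op += rate * (level - op)  # drift a fraction of the way toward the input
        trace.append(op)
    return trace

# A quiet room (30 dB) followed by loud traffic (80 dB):
trace = adapt([30.0] * 50 + [80.0] * 50)
print(trace[49])   # has settled near 30 dB during the quiet phase
print(trace[-1])   # has drifted up toward 80 dB during the loud phase
```

The point of the analogy is only that sensitivity is relative to recent input; the study's question is which molecular machinery (and whether calcium) implements the real version of this loop.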
The traditional explanation for how adaptation works, based on earlier research on frogs and turtles, is that it is controlled by at least two complex cellular mechanisms both requiring calcium entry through a specific, mechanically sensitive ion channel in auditory hair cells. The new study, however, finds that calcium is not required for adaptation in mammalian auditory hair cells and posits that one of the two previously described mechanisms is absent in auditory cochlear hair cells.
Experimenting mostly on rats, the Stanford scientists used ultrafast mechanical stimulation to elicit responses from hair cells as well as high-speed, high-resolution imaging to track calcium signals quickly before they had time to diffuse. After manipulating intracellular calcium in various ways, the scientists were surprised to find that calcium was not necessary for adaptation to occur, thus challenging the 30-year-old hypothesis and opening the door to new models of mechanotransduction (the conversion of mechanical signals into electrical signals) and adaptation.
“This somewhat heretical finding suggests that at least some of the underlying molecular mechanisms for adaptation must be different in mammalian cochlear hair cells compared with those of frog or turtle hair cells, where adaptation was first described,” Ricci said.
The study was conducted to better understand how the adaptation process works by studying the machinery of the inner ear that converts sound waves into electrical signals.
“To me this is really a landmark study,” said Ulrich Mueller, PhD, professor and chair of molecular and cellular neuroscience at the Scripps Research Institute in La Jolla, who was not involved with the study. “It really shifts our understanding. The hearing field has such precise models — models that everyone uses. When one of the models tumbles, it’s monumental.”
Humans are born with 30,000 cochlear and vestibular hair cells per ear. When a significant number of these cells are lost or damaged, hearing or balance disorders occur. Hair cell loss occurs for multiple reasons, including aging and damage to the ear from loud sounds. Damage or impairment to the process of adaptation may lead to the further loss of hair cells and, therefore, of hearing. Unlike many other species, including birds, humans and other mammals are unable to spontaneously regenerate these hair cells.
As the U.S. population has aged and noise pollution has grown more severe, health experts now estimate that one in three adults over the age of 65 has developed at least some degree of hearing disability because of the destruction of this limited number of hair cells.
“It’s by understanding just how the inner machinery of the ear works that scientists hope to eventually find ways to fix the parts that break,” Ricci said. “So when a key piece of the puzzle is shown to be wrong, it’s of extreme importance to scientists working to cure hearing loss.”
Our sense of touch can contribute to our ability to perceive faces, according to new research published in Psychological Science, a journal of the Association for Psychological Science.
“In daily life, we usually recognize faces through sight and almost never explore them through touch,” says lead researcher Kazumichi Matsumiya of Tohoku University in Japan. “But we use information from multiple sensory modalities in order to perceive many everyday non-face objects and events, such as speech perception or object recognition — these new findings suggest that even face processing is essentially multisensory.”
In a series of studies, Matsumiya took advantage of a phenomenon called the “face aftereffect” to investigate whether our visual system responds to nonvisual signals for processing faces. In the face aftereffect, we adapt to a face with a particular expression — happiness, for example — which causes us to perceive a subsequent neutral face as having the opposite facial expression (i.e., sadness).
Matsumiya hypothesized that if the visual system really does respond to signals from another modality, then we should see evidence for face aftereffects from one modality to the other. So, adaptation to a face that is explored by touch should produce visual face aftereffects.
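The aftereffect logic above can be sketched as a tiny opponent-coding model. This is a hedged illustration, not Matsumiya's analysis: expression is coded on a single happy/sad axis, and adaptation repels the percept away from the adaptor, so a physically neutral test face is read out as slightly sad after a happy adaptor. The function name, axis convention, and gain value are invented for the example.

```python
# Toy opponent-coding sketch of the face aftereffect.
# Expression axis: positive = happy, negative = sad, 0.0 = neutral.

def perceived(test, adaptor=0.0, gain=0.4):
    """Perceived expression of `test` after adapting to `adaptor`."""
    return test - gain * adaptor  # adaptation pushes percepts away from the adaptor

neutral_face = 0.0
print(perceived(neutral_face, adaptor=+1.0))  # after a happy adaptor: -0.4 (reads as sad)
print(perceived(neutral_face, adaptor=-1.0))  # after a sad adaptor:   +0.4 (reads as happy)
```

In these terms, the crossmodal prediction is simply that a touched adaptor can feed the same axis as a seen one, shifting the visual readout in the same repulsive direction.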
To test this, Matsumiya had participants explore face masks concealed below a mirror by touching them. After this adaptation period, the participants were visually presented with a series of faces that had varying expressions and were asked to classify the faces as happy or sad. The visual faces and the masks were created from the same exemplar.
In line with his hypothesis, Matsumiya found that participants’ experiences exploring the face masks by touch shifted their perception of the faces presented visually compared to participants who had no adaptation period, such that the visual faces were perceived as having the opposite facial expression.
Further experiments ruled out other explanations for the results, including the possibility that the face aftereffects emerged because participants were intentionally imagining visual faces during the adaptation period.
And a fourth experiment revealed that the aftereffect also works the other way: Visual stimuli can influence how we perceive a face through touch.
According to Matsumiya, current views on face processing assume that the visual system only receives facial signals from the visual modality — but these experiments suggest that face perception is truly crossmodal.
“These findings suggest that facial information may be coded in a shared representation between vision and haptics in the brain,” Matsumiya notes, adding that the results may have implications for enhancing vision and telecommunication aids for the visually impaired.

Insects change the way they communicate when drowned out by man-made noises
Birds and frogs do it; even whales have been known to do it. Now scientists have for the first time shown that insects also change the way they sing to one another when drowned out by man-made noises.
Grasshoppers living next to a main road respond to the increased background volume of passing traffic by adjusting their summer courtship songs, scientists have discovered.
In order to make themselves heard above the low-rumble noise pollution of moving vehicles, male bow-winged grasshoppers of central Europe alter the pitch of their songs’ lower notes so that they rise to a mini-crescendo, the scientists found.
“Bow-winged grasshoppers produce songs that include low and high frequency components,” said Ulrike Lampe of the University of Bielefeld in Germany, who led the study published in the journal Functional Ecology.
“We found that grasshoppers from noisy habitats boost the volume of the lower-frequency part of their song, which makes sense since road noise can mask signals in this part of the frequency spectrum,” Dr Lampe said.
Evolution is actually pretty predictable
“Is evolution predictable? To a surprising extent the answer is ‘yes’,” says Princeton professor Peter Andolfatto.
New research by Andolfatto and colleagues published in the journal Science suggests that knowledge of a species’ genes—and how certain external conditions affect the proteins encoded by those genes—could be used to determine a predictable evolutionary pattern driven by outside factors.
Scientists could then pinpoint how the diversity of adaptations seen in the natural world developed even in distantly related animals.
The researchers carried out a survey of DNA sequences from 29 distantly related insect species, the largest sample of organisms yet examined for a single evolutionary trait. Fourteen of these species have evolved a nearly identical characteristic due to one external influence—they feed on plants that produce cardenolides, a class of steroid-like cardiotoxins that are a natural defense for plants such as milkweed and dogbane.
Though separated by 300 million years of evolution, these diverse insects—which include beetles, butterflies, and aphids—experienced changes to a key protein called sodium-potassium adenosine triphosphatase, or the sodium-potassium pump, which regulates a cell’s crucial sodium-to-potassium ratio.
Cross-Category Adaptation: Objects Produce Gender Adaptation in the Perception of Faces
Adaptation aftereffects have been found for low-level visual features such as colour, motion and shape perception, as well as higher-level features such as gender, race and identity in domains such as faces and biological motion. It is not yet clear whether adaptation effects in humans extend beyond this set of higher-order features. The aim of this study was to investigate whether objects highly associated with one gender, e.g. high heels for females or electric shavers for males, can modulate gender perception of a face. In two separate experiments, we adapted subjects to a series of objects highly associated with one gender and subsequently asked participants to judge the gender of an ambiguous face. Results showed that participants are more likely to perceive an ambiguous face as male after being exposed to objects highly associated with females, and vice versa. A gender adaptation aftereffect was obtained despite the adaptor and test stimuli being from different global categories (objects and faces, respectively). These findings show that our perception of gender from faces is highly affected by our environment and recent experience. This suggests two possible mechanisms: (a) that perception of the gender associated with an object shares at least some brain areas with those responsible for gender perception of faces, and (b) that adaptation to gender, which is a high-level concept, can modulate brain areas involved in facial gender perception through top-down processes.
How digital culture is rewiring our brains
Our brains are superlatively evolved to adapt to our environment: a process known as neuroplasticity. The connections between our brain cells will be shaped, strengthened and refined by our individual experiences. It is this personalisation of the physical brain, driven by unique interactions with the external world, that arguably constitutes the biological basis of each mind, so what will happen to that mind if the external world changes in unprecedented ways, for example, with an all-pervasive digital technology?
A recent survey in the US showed that more than half of teenagers aged 13 to 17 spend more than 30 hours a week, outside school, using computers and other web-connected devices. If their environment is being transformed for so much of the time into a fast-paced and highly interactive two-dimensional space, the brain will adapt, for good or ill. Professor Michael Merzenich, of the University of California, San Francisco, gives a typical neuroscientific perspective.
“There is a massive and unprecedented difference in how [digital natives’] brains are plastically engaged in life compared with those of average individuals from earlier generations and there is little question that the operational characteristics of the average modern brain substantially differ,” he says.