Posts tagged language

The Road to Language Learning Is Iconic
Languages are highly complex systems and yet most children seem to acquire language easily, even in the absence of formal instruction. New research on young children’s use of British Sign Language (BSL) sheds light on one mechanism – iconicity – that may play an important role in children’s ability to learn language.
For spoken and written language, the arbitrary relationship between a word’s form – how it sounds or how it looks on paper – and its meaning is a particularly challenging feature of language acquisition. But one of the first things people notice about sign languages is that signs often represent aspects of meaning in their form. For example, in BSL the sign EAT involves bringing the hand to the mouth just as you would if you were bringing food to the mouth to eat it.
In fact, a high proportion of signs across the world’s sign languages are similarly iconic, connecting human experience to linguistic form.
Robin Thompson and colleagues David Vinson, Bencie Woll, and Gabriella Vigliocco at the Deafness, Cognition and Language Research Centre (DCAL) at University College London in the United Kingdom wanted to examine whether this kind of iconicity might provide a key to understanding how children come to link words to their meaning.
Their findings are published in Psychological Science, a journal of the Association for Psychological Science.
A new Northwestern University study shows the power of language in infants’ ability to understand the intentions of others.
As the babies watched intently, an experimenter produced an unusual behavior—she used her forehead to turn on a light. But how did babies interpret this behavior? Did they see it as an intentional act, as something worthy of imitating? Or did they see it as a fluke? To answer this question, the experimenter gave 14-month-old infants an opportunity to play with the light themselves.
The results, based on two experiments, show that introducing a novel word for the impending novel event had a powerful effect on the infants’ tendency to imitate the behavior: infants were more likely to imitate the behavior, however unconventional, if it had been named than if it remained unnamed.
When the experimenter announced her unusual behavior (“I’m going to blick the light”), infants imitated her. But when she did not provide a name, they did not follow suit.
This revealed that infants as young as 14 months of age coordinate their insights about human behavior and their intuitions about human language in the service of discovering which behaviors, observed in others, are ones to imitate.
"This work shows, for the first time, that even for infants who have only just begun to ‘crack the language code,’ language promotes culturally-shared knowledge and actions – naturally, generatively and apparently effortlessly," said Sandra R. Waxman, co-author of the study and the Louis W. Menk Professor of Psychology at Northwestern.
"This is the first demonstration of how infants’ keen observational skills, when augmented by human language, heighten their acuity for ‘reading’ the underlying intentions of their ‘tutors’ (adults) and foster infants’ imitation of adults’ actions."
Waxman said that absent language and its power to convey meaning, infants do not imitate these “strange” actions.
"This means that human language provides infants with a powerful key: it unlocks for them a broader world of social intentions," Waxman said. "We know that language, and especially the shared meaning within a linguistic community, is one of the most powerful conduits of the cultural knowledge that we humans transmit across generations."
(Source: eurekalert.org)
The unconscious brain may not be able to ace an SAT test, but new research suggests that it can handle more complex language processing and arithmetic tasks than anyone has previously believed. According to these findings, just published in the Proceedings of the National Academy of Sciences, we may be blithely unaware of all the hard work the unconscious brain is doing.

In their experiments, researchers from Hebrew University in Israel used a cutting-edge “masking” technique to keep their test subjects from consciously perceiving certain stimuli. With this technique, known as continuous flash suppression, the researchers show a rapidly changing series of colorful patterns to just one of the subject’s eyes. The bright patterns dominate the subject’s awareness to such an extent that when researchers show less flashy material to the other eye (like words or equations), it takes several seconds before the brain consciously registers it.
This masking technique is “a game changer in the study of the unconscious,” the scientists write, “because unlike all previous methods, it gives unconscious processes ample time to engage with and operate on subliminal stimuli.”
To study the unconscious brain’s ability to process language, the researchers subliminally showed subjects short phrases that made variable amounts of sense: for example, subjects might see the phrase “I ironed coffee” or “I ironed clothes.” The researchers gradually turned up the contrast between the phrase and its background and measured how long it took for the phrase to “pop” into the subject’s conscious awareness. The nonsensical phrases popped sooner, leading the researchers to hypothesize that the unconscious brain processed each phrase, found the odd ones surprising, and quickly passed them along to the conscious brain for further examination.

To determine the unconscious brain’s mathematical abilities, the researchers presented a simple subtraction or addition equation (for example, “9 − 3 − 4 = “) to a subject, but took it away before it could pop into consciousness. Then they stopped the masking pattern and displayed a single number, asking the viewer to pronounce the number as soon as it registered. When the number was the answer to the subtraction equation (for example, “2”), the subject was quicker to pronounce it. The researchers argue that the viewer was “primed” to respond to that number because the unconscious brain had solved the equation. Oddly, they didn’t find the same clear effect with easier addition equations.
(Source: spectrum.ieee.org)

An elephant that speaks Korean
An Asian elephant named Koshik can imitate human speech, speaking words in Korean that can be readily understood by those who know the language. The elephant accomplishes this in a most unusual way: he vocalizes with his trunk in his mouth.
The elephant’s vocabulary consists of exactly five words, researchers report on November 1 in Current Biology, a Cell Press publication. Those include “annyong” (“hello”), “anja” (“sit down”), “aniya” (“no”), “nuo” (“lie down”), and “choah” (“good”). Ultimately, Koshik’s language skills may provide important insights into the biology and evolution of complex vocal learning, an ability that is critical for human speech and music, the researchers say.
"Human speech basically has two important aspects, pitch and timbre," says Angela Stoeger of the University of Vienna. "Intriguingly, the elephant Koshik is capable of matching both pitch and timbre patterns: he accurately imitates human formants as well as the voice pitch of his trainers. This is remarkable considering the huge size, the long vocal tract, and other anatomical differences between an elephant and a human."
Speed-Learning a New Language May Help Brain Grow
Learning a new language over a short period of time appears to make the brain grow, new research suggests. The new study included young recruits at the Swedish Armed Forces Interpreter Academy who went from having no knowledge of a new language to speaking it fluently within 13 months. The recruits studied at a furious pace: from morning to evening, weekdays and weekends.
The recruits were compared to medicine and cognitive science students at a university (the “control” group), who also studied hard, but weren’t learning a new language. Both groups underwent MRI brain scans before and after a three-month period of intensive study. The scans showed that the brain structure of the control group remained unchanged, but certain parts of the brain of the language students grew.
This growth occurred in the hippocampus, a structure involved in learning new material and spatial navigation, and in three areas of the cerebral cortex. Among the recruits, those who took naturally to learning a new language had greater growth in the hippocampus and areas of the cerebral cortex related to language learning, while those who had to put more effort into learning a new language had greater growth in an area of the motor region of the cerebral cortex, the investigators found.
"We were surprised that different parts of the brain developed to different degrees depending on how well the students performed and how much effort they had had to put in to keep up with the course," Johan Martensson, a researcher in psychology at Lund University in Sweden, said in a university news release.
Martensson noted that previous research has indicated that bilingual and multilingual people develop Alzheimer’s disease at a later age. “Even if we cannot compare three months of intensive language study with a lifetime of being bilingual, there is a lot to suggest that learning languages is a good way to keep the brain in shape,” Martensson said.
The study appeared in the Oct. 15 issue of the journal NeuroImage.
Neuroscientists find Broca’s area is really two subunits, each with its own function
A century and a half ago, French physician Pierre Paul Broca found that patients with damage to part of the brain’s frontal lobe were unable to speak more than a few words. Later dubbed Broca’s area, this region is believed to be critical for speech production and some aspects of language comprehension.
However, in recent years neuroscientists have observed activity in Broca’s area when people perform cognitive tasks that have nothing to do with language, such as solving math problems or holding information in working memory. Those findings have stimulated debate over whether Broca’s area is specific to language or plays a more general role in cognition.
A new study from MIT may help resolve this longstanding question. The researchers, led by Nancy Kanwisher, the Walter A. Rosenblith Professor of Cognitive Neuroscience, found that Broca’s area actually consists of two distinct subunits. One of these focuses selectively on language processing, while the other is part of a brainwide network that appears to act as a central processing unit for general cognitive functions.
"I think we’ve shown pretty convincingly that there are two distinct bits that we should not be treating as a single region, and perhaps we shouldn’t even be talking about ‘Broca’s area’ because it’s not a functional unit," says Evelina Fedorenko, a research scientist in Kanwisher’s lab and lead author of the new study, which recently appeared in the journal Current Biology.
PNAS Study: Language Structure Arises from Balance of Clear and Effective Communication
When learning a new language, we automatically organize words into sentences that will be both clearly understood and efficient (quick) to communicate. That’s the finding of a new study reported today in the Proceedings of the National Academy of Sciences (PNAS) which challenges opposing theories on why and how languages come to be organized the way they are.
With more than 5,000 languages in the world, it would be easy to assume they all vary endlessly, but, in fact, there is great commonality: languages follow only a few recurrent patterns. These commonalities are called “language universals,” a notion suggested in the 1960s by Noam Chomsky and Joseph Greenberg. A team of researchers from the University of Rochester and Georgetown University Medical Center set out to investigate how these language universals come to be.
Linguists and cognitive scientists have opposing ideas on how a language is developed and shaped. Some believe that languages all derived from a common ancestor; others think that languages vary quite widely and universals do not exist at all. Some have suggested that language universals are an arbitrary evolutionary outcome. The position of the Rochester-Georgetown team is that the human mind shapes a language, even while learning it, based on the need for robust and effective information transfer.
“The thousands of natural languages in our world only have a couple of formats in which they appear, and we are good at understanding and learning languages that have just these formats. Otherwise we could never succeed in learning something so complicated as human languages,” says one of the study’s authors, Elissa L. Newport, Ph.D., a professor in the department of neurology at Georgetown University Medical Center.
A new finding could lead to strategies for treating speech loss after a stroke and helping children with dyslexia.
New research links motor skills and perception, specifically as it relates to a second finding—a new understanding of what the left and right brain hemispheres “hear.” Georgetown University Medical Center researchers say these findings may eventually point to strategies to help stroke patients recover their language abilities, and to improve speech recognition in children with dyslexia.
The study, presented at Neuroscience 2012, the annual meeting of the Society for Neuroscience, is the first to match human behavior with left brain/right brain auditory processing tasks. Before this research, neuroimaging tests had hinted at differences in such processing.
“Language is processed mainly in the left hemisphere, and some have suggested that this is because the left hemisphere specializes in analyzing very rapidly changing sounds,” says the study’s senior investigator, Peter E. Turkeltaub, M.D., Ph.D., a neurologist in the Center for Brain Plasticity and Recovery. This newly created center is a joint program of Georgetown University and MedStar National Rehabilitation Network.
Turkeltaub and his team hid rapidly and slowly changing sounds in background noise and asked 24 volunteers to simply indicate whether they heard the sounds by pressing a button.
“We asked the subjects to respond to sounds hidden in background noise,” Turkeltaub explained. “Each subject was told to use his or her right hand to respond during the first 20 sounds, then the left hand for the next 20 sounds, then right, then left, and so on.”
He says that when subjects used their right hand, they heard the rapidly changing sounds more often than when they used their left hand, and vice versa for the slowly changing sounds.
“Since the left hemisphere controls the right hand and vice versa, these results demonstrate that the two hemispheres specialize in different kinds of sounds—the left hemisphere likes rapidly changing sounds, such as consonants, and the right hemisphere likes slowly changing sounds, such as syllables or intonation,” Turkeltaub explains.
“These results also demonstrate the interaction between motor systems and perception. It’s really pretty amazing. Imagine you’re waving an American flag while listening to one of the presidential candidates. The speech will actually sound slightly different to you depending on whether the flag is in your left hand or your right hand.”
Ultimately, Turkeltaub hopes that understanding the basic organization of auditory systems and how they interact with motor systems will help explain why language resides in the left hemisphere of the brain, and will lead to new treatments for language disorders, like aphasia (language difficulties after stroke or brain injury) or dyslexia.
“If we can understand the basic brain organization for audition, this might ultimately lead to new treatments for people who have speech recognition problems due to stroke or other brain injury. Understanding better the specific roles of the two hemispheres in auditory processing will be a big step in that direction. If we find that people with aphasia, who typically have injuries to the left hemisphere, have difficulty recognizing speech because of problems with low-level auditory perception of rapidly changing sounds, maybe training the specific auditory processing deficits will improve their ability to recognize speech,” Turkeltaub concludes.
(Source: explore.georgetown.edu)
Could Stem Cells Treat Autism? Newly Approved Study May Tell
Autism researchers have been given the go-ahead by the U.S. Food and Drug Administration to launch a small study in children with autism that evaluates whether a child’s own umbilical cord blood may be an effective treatment.
Thirty children with the disorder, aged 2 to 7, will receive injections of their own stem cells from umbilical cord blood banked by their parents after their births. All of the cord blood comes from the Cord Blood Registry, the world’s largest stem cell bank.
Scientists at Sutter Neuroscience Institute, in Sacramento, Calif., said the placebo-controlled study will evaluate whether the stem cell therapy helps improve language and behavior in the youngsters.
There is anecdotal evidence that stem cell infusions may have a benefit in other conditions such as cerebral palsy, said lead study investigator Dr. Michael Chez, director of pediatric neurology at the institute.
"We’re hoping we’ll see in the autism population a group of patients that also responds," Chez said. Other autism and stem cell research is going on abroad, but this study is the first to use a child’s own cord blood stem cells.
Chez said the study will involve only patients whose autism is not linked to a genetic syndrome or brain injury, and all of the children will eventually receive the stem cells.
New Study Reveals How Humans Became Right-Handed
According to a new study led by Dr Gillian Forrester of the University of Sussex, the predominance of right-handedness is not a uniquely human trait but one shared with great apes.
The study, published in the journal Behavioural Brain Research, analyzed hand actions directed towards either objects or individuals in chimpanzees, gorillas and children, and found that all three species are right-handed for actions to objects, but not for actions directed to individuals.
The results support a theory that human right-handedness is a trait developed through tool use that was inherited from an ancestor common to both humans and great apes. The findings challenge a widely held view that right-handed dominance in humans was a species-unique trait linked to the emergence of language.
“Humans have been tool users for 2.5 million years, while the current view is that language only emerged one hundred thousand years ago,” Dr Forrester said. “Our findings provide the first non-invasive results from naturalistic behavior, suggesting that language emerged as a consequence of left hemisphere brain regions that were already evolved to process regular sequences of actions. The structure found in language may have developed from pre-existing brain processes adapted from experience with tool-use.”