Neuroscience

Articles and news from the latest research reports.

Posts tagged language development

71 notes

Gestures of Human and Ape Infants Are More Similar Than You Might Expect
Thirteen years after the release of On the Origin of Species, Charles Darwin published another report on the evolution of mankind. In the 1872 book The Expression of the Emotions in Man and Animals, the naturalist argued that people from different cultures exhibit any given emotion through the same facial expression. This hypothesis didn’t quite pan out—last year, researchers poked a hole in the idea by showing that the expression of emotions such as anger, happiness and fear wasn’t universal. Nonetheless, certain basic things—such as the urge to cry out in pain, an increase in blood pressure when feeling anger, even shrugging when we don’t understand something—cross cultures.
A new study, published today in the journal Frontiers in Psychology, compares such involuntary responses, but with an added twist: Some observable behaviors aren’t only universal to the human species, but to our closest relatives too—chimpanzees and bonobos.
Using video analysis, a team of UCLA researchers found that human, chimpanzee and bonobo babies make similar gestures when interacting with caregivers. Members of all three species reach with their arms and hands for objects or people, and point with their fingers or heads. All three also raise their arms in the same manner to signal that they want to be picked up. Such gestures, which appear to be innate in all three species, precede and eventually lead to the development of language in humans, the researchers say.
To pick up on these behaviors, the team studied three babies of differing species through videos taken over a number of months. The child stars of these videos included a chimpanzee named Panpanzee, a bonobo called Panbanisha and a human girl, identified as GN. The apes were raised together at the Georgia State University Language Research Center in Atlanta, where researchers study language and cognitive processes in chimps, monkeys and humans. There, Panpanzee and Panbanisha were taught to communicate with their human caregivers using gestures, noises and lexigrams, abstract symbols that represent words. The human child grew up in her family’s home, where her parents facilitated her learning.
Researchers filmed the child’s development for seven months, starting when she was 11 months old, while the apes were taped from 12 months of age to 26 months. In the early stages of the study, the observed gestures were of a communicative nature: all three infants engaged in these behaviors with the intention of conveying their emotions and needs. They made eye contact with their caregivers, added non-verbal vocalizations to their movements or exerted physical effort to elicit a response.
By the second half of the experiment, the production of communicative symbols—visual ones for the apes, vocal ones for the human—increased. As she grew older, the human child began using more spoken words, while the chimpanzee and bonobo learned and used more lexigrams. Eventually, the child began speaking to convey what she felt, rather than only gesturing. The apes, on the other hand, continued to rely on gestures. The study calls this divergence in behavior “the first indication of a distinctive human pathway to language.”
The researchers speculate that the matching behaviors can be traced to the last shared ancestor of humans, chimps and bonobos, which lived between four and seven million years ago. That ancestor probably exhibited the same early gestures, which all three species then inherited. When the species diverged, humans managed to build on this communicative capacity by eventually graduating to speech.
Hints of this can be seen in how the human child paired her gestures with non-speech vocalizations, the precursors to words, far more than the apes did. It’s this successful combination of gestures and words that may have led to the birth of human language.

Filed under language development evolution gestures primates symbolic development psychology neuroscience science

181 notes

How Birds and Babies Learn to Talk

Few things are harder to study than human language. The brains of living humans can only be studied indirectly, and language, unlike vision, has no analogue in the animal world. Vision scientists can study sight in monkeys using techniques like single-neuron recording. But monkeys don’t talk.
However, in an article published in Nature, a group of researchers, including myself, detail a discovery in birdsong that may help lead to a revised understanding of an important aspect of human language development. Almost five years ago, I sent a piece of fan mail to Ofer Tchernichovski, who had just published an article showing that, in just three or four generations, songbirds raised in isolation often developed songs typical of their species. He invited me to visit his lab, a cramped space stuffed with several hundred birds residing in souped-up climate-controlled refrigerators. Dina Lipkind, at the time Tchernichovski’s post-doctoral student, explained a method she had developed for teaching zebra finches two songs. (Ordinarily, a zebra finch learns only one song in its lifetime.) She had discovered that by switching the song of a tutor bird at precisely the right moment, a juvenile bird could learn a second, new song after it had mastered the first one.
Thinking about bilingualism and some puzzles I had encountered in my own lab, I suggested that Lipkind’s method could be useful in casting light on the question of how a creature—any creature—learns to put linguistic elements together. We mapped out an experiment that day: birds would learn one “grammar” in which every phrase followed the form of ABCABC, and then we would switch things up, giving them a new target, ACBACB (the As, Bs, and Cs were certain stereotyped chirps and peeps).
The results were thrilling: most of the birds could accomplish the task. But it was clearly difficult—it took several weeks for them to learn the new grammar—and it was challenging in a particular way. While the birds showed no sign of needing to relearn individual sounds, the connections between individual syllables, known as “transitions,” proved incredibly difficult. The birds proceeded slowly and systematically, incrementally working out each transition (e.g., from C to B, and B to A). They could not freely move syllables around, and did not engage in trial and error, either. Instead, they undertook a systematic struggle to learn particular connections between specific, individual syllables. The moment they mastered the third transition of the sequence, they were able to produce the entire grammar. Never, to my knowledge, had the process of learning any sort of grammar been so precisely articulated.
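The structure of the task can be made concrete with a small sketch (illustrative only, not the study's analysis code): comparing the syllable-to-syllable transitions of the two target grammars shows that the birds faced no new sounds, only new connections between familiar ones.

```python
def transitions(song):
    """Return the set of adjacent syllable pairs (transitions) in a song."""
    return set(zip(song, song[1:]))

old_grammar = "ABCABC"  # the first target phrase
new_grammar = "ACBACB"  # the switched target

# The syllable inventory is unchanged; only the links between syllables differ.
assert set(old_grammar) == set(new_grammar)

new_links = transitions(new_grammar) - transitions(old_grammar)
print(sorted(new_links))  # [('A', 'C'), ('B', 'A'), ('C', 'B')]
```

Exactly three transitions separate the grammars, which fits the observation that the birds worked through them one at a time and produced the full grammar only after mastering the third.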
We wrote up the results, but Nature declined to publish them. Then Dina and Ofer speculated that our findings might be more convincing if they were true for not only zebra finches (hardly the Einsteins of the bird world) but for other species as well. Ofer contacted a Japanese researcher, Kazuo Okanoya, who he thought might be able to gather data for Bengalese finches, which have a more complex grammar than zebra finches. Amazingly, the Bengalese finches followed almost exactly the same learning pattern as the zebra finches.
Then we decided to test our ideas about the incrementality of vocal learning in human infants, enlisting the help of a graduate student I had been working with at N.Y.U., Doug Bemis. Bemis and Lipkind analyzed an old, publicly available set of human-babbling data, drawn from the CHILDES database, in a new way. The literature said that in the later part of the first year of life, babies undergo a change from “reduplicated” babbling—repeating a syllable, like bababa—to “variegated” babbling—often switching between syllables, like babadaga. Our birdsong results led us to wonder whether such a change might be more piecemeal than is commonly presumed, and our examination of the data proved that, in fact, the change did not happen all at once. It was gradual, with new transitions worked out one by one; human babies were stymied in the same ways that the birds were. Nobody had ever really explained why babbling took so many months; our birdsong data has finally yielded a first clue.
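To illustrate what a piecemeal shift looks like in data of this kind, here is a toy analysis (the babbling record below is invented, not drawn from CHILDES): note the session in which each syllable transition first appears, and check whether the debuts are spread out or clustered.

```python
def first_appearance(sessions):
    """Map each syllable transition to the index of the first session containing it."""
    debut = {}
    for i, session in enumerate(sessions):
        for utterance in session:  # an utterance is a list of syllables
            for pair in zip(utterance, utterance[1:]):
                debut.setdefault(pair, i)
    return debut

# Invented longitudinal record: reduplicated babbling first, new transitions one at a time.
sessions = [
    [["ba", "ba", "ba"]],                # only ba->ba
    [["ba", "ba", "da"]],                # ba->da worked out
    [["ba", "da", "ga"], ["da", "ga"]],  # da->ga added
]

debut = first_appearance(sessions)
print(sorted(debut.items(), key=lambda kv: kv[1]))
# [(('ba', 'ba'), 0), (('ba', 'da'), 1), (('da', 'ga'), 2)] -- one debut per session
```

A gradual learner, like the babies in our data, shows new transitions debuting across many sessions; an all-at-once shift would pile the debuts into a single session.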
Today, almost five years after Lipkind and Tchernichovski began developing the methods that are at the paper’s core, the work is finally being published by Nature.
What we don’t yet know is whether the similarity between birds and babies stems from a fundamental similarity between species at the biological level. When two species do something in similar ways, it can be a matter of “homology,” a genuine lineage at the genetic level, or “analogy,” which is independent reinvention. It will likely be years before we know for sure, but there is reason to believe that our results are not purely an accident of independent invention. Some of the important genes in human vocal learning (including FOXP2, the gene thus far most decisively tied to human language) are also involved in avian vocal learning, as a new book, “Birdsong, Speech, and Language,” discusses at length.
Language will never be as easy to dissect as birdsong, but knowledge about one can inform knowledge about the other. Our brains didn’t evolve to be easily understood, but the fact that humans share so many genes with so many other species gives scientists a fighting chance.

Filed under birdsong language language development zebra finches vocal learning neuroscience science

134 notes

Study shows humans and apes learn language differently
How do children learn language? Many linguists believe that the stages a child goes through when learning language mirror the stages of language development in primate evolution. In a paper published in the Proceedings of the National Academy of Sciences, Charles Yang of the University of Pennsylvania points out that if this were true, then small children and non-human primates would use language in the same way. He then uses statistical analysis to show that this is not the case: the language of small children uses grammar, while language use in non-human primates relies on imitation.
Yang examines two hypotheses about language development in children. One of these says that children learn how to put words together by imitating the word combinations of adults. The other states that children learn to combine words by following grammatical rules.
Linguists who support the idea that children are parroting refer to the fact that children appear to combine the same words in the same ways. For example, an English speaker can put either the determiner “a” or the determiner “the” in front of a singular noun. “A door” and “the door” are both grammatically correct, as are “a cat” and “the cat.” However, with most singular nouns, children tend to use either “a” or “the” but not both. This suggests that children are mimicking strings of words without understanding grammatical rules about how to combine the words.
Yang, however, points out that the lack of diversity in children’s word combinations could reflect the way that adults use language. Adults are more likely to use “a” with some words and “the” with others. “The bathroom” is more common than “a bathroom.” “A bath” is more common than “the bath.”
To test this conjecture, Yang analyzed language samples of young children who had just begun making two-word combinations. He calculated the number of different noun-determiner combinations someone would make if they were combining nouns and determiners independently, and found that the diversity of the children’s language matched this profile. He also found that the children’s word combinations were much more diverse than they would be if they were simply imitating word strings.
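The logic of that comparison can be sketched in a few lines (a simplified stand-in for Yang's actual model, which treats word frequencies more carefully; the transcript and numbers below are invented). If each determiner choice is an independent draw with probability p of "the", the expected number of nouns seen with both determiners follows from each noun's sample size.

```python
from collections import defaultdict

def observed_overlap(tokens):
    """Count nouns that occur with both determiners in a transcript of (det, noun) pairs."""
    dets_by_noun = defaultdict(set)
    for det, noun in tokens:
        dets_by_noun[noun].add(det)
    return sum(1 for dets in dets_by_noun.values() if {"a", "the"} <= dets)

def expected_overlap(noun_counts, p_the):
    """Expected number of nouns paired with both determiners under independent choice.

    For a noun sampled n times, P(both determiners appear) = 1 - p^n - (1-p)^n.
    """
    return sum(1 - p_the**n - (1 - p_the) ** n for n in noun_counts)

# Invented mini-transcript: "door" occurs with both determiners, "cat" and "dog" with one.
tokens = [("the", "door"), ("a", "door"), ("the", "cat"), ("the", "cat"), ("a", "dog")]
print(observed_overlap(tokens))                    # 1
print(expected_overlap([2, 2, 1], 0.5))            # 1.0 -- small samples alone predict low overlap
```

Comparing observed diversity against this expectation is what separates productive combination (the children, whose diversity matched the independence profile) from rote imitation, where overlap falls well below it.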
Yang also studied language diversity in Nim Chimpsky, a chimpanzee who was taught American Sign Language. Nim’s word combinations are much less diverse than would be expected if he were combining words independently. This indicates that he was probably mimicking, rather than using grammar.
This difference in language use indicates that human children do not acquire language in the same way that non-human primates do. Young children learn rules of grammar very quickly, while a chimpanzee who has spent many years learning language continues to imitate rather than combine words based on grammatical rules.

Filed under primates language language development grammatical rules linguistics psychology neuroscience science

80 notes

Language Protein Differs in Males, Females
Male rat pups have more of a specific brain protein associated with language development than females, according to a study published February 20 in The Journal of Neuroscience. The study also found sex differences in the brain protein in a small group of children. The findings may shed light on sex differences in communication in animals and language acquisition in people.
Sex differences in early language acquisition and development in children are well documented — on average, girls tend to speak earlier and with greater complexity than boys of the same age. However, scientists continue to debate the origin and significance of such differences. Previous studies showed the Foxp2 protein plays an important role in speech and language development in humans and vocal communication in birds and other mammals.
In the current study, J. Michael Bowers, PhD, Margaret McCarthy, PhD, and colleagues at the University of Maryland School of Medicine examined whether sex differences in the expression of the Foxp2 protein in the developing brain might underlie communication differences between the sexes.
The researchers analyzed the levels of Foxp2 protein in the brains of four-day-old female and male rats and compared the ultrasonic distress calls made by the animals when separated from their mothers and siblings. Compared with females, males had more of the protein in brain areas associated with cognition, emotion, and vocalization. They also made more noise than females — producing nearly double the total vocalizations over the five-minute separation period — and were preferentially retrieved and returned to the nest first by the mother.
When the researchers reduced levels of the Foxp2 protein in the male pups and increased it in female pups, they reversed the sex difference in the distress calls, causing males to sound like females and the females like males. This change led the mother to reverse her behavior as well, preferentially retrieving the females over the males.
“This study is one of the first to report a sex difference in the expression of a language-associated protein in humans or animals,” McCarthy said. “The findings raise the possibility that sex differences in brain and behavior are more pervasive and established earlier than previously appreciated.”
The researchers extended their findings to humans in a preliminary study of Foxp2 protein in a small group of children. Unlike the rats, in which Foxp2 protein was elevated in males, they found that in humans, the girls had more of the Foxp2 protein in the cortex — a brain region associated with language — than age-matched boys.
“At first glance, one might conclude that the findings in rats don’t generalize to humans, but the higher levels of Foxp2 expression are found in the more communicative sex in each species,” noted Cheryl Sisk, who studies sex differences at Michigan State University and was not involved with the study.

Filed under language development brain protein sex differences vocal communication vocalization neuroscience science

169 notes

First ever UK based language tool to decode baby talk
A tool which could radically improve the diagnosis of language delays in infants in the UK is being developed by psychologists.
A £358,000 grant to develop the first standardised UK speech and language development tool means that for the first time, researchers will be able to establish language development norms for UK children aged eight months to 18 months.
The tool will plug an important gap which has left UK researchers, education and health professionals at a disadvantage.
Until now, UK language experts have been forced to rely on more complicated methods of testing child language development, or on methods designed for American English speakers, which can lead to UK babies being misdiagnosed as delayed in language development.
The two-and-a-half year project funded by the ESRC will also look into the impact of family income and education on UK children’s language development, as well as examining differences between children learning UK English, and other languages and English dialects.
The project is expected to make a major contribution to language development research as well as to the effectiveness of speech and language therapy and improved policy making.
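Establishing norms means a child's score can be placed against an age-matched reference sample. As a minimal sketch of that idea (with invented data, not the project's actual instrument or norms), a percentile lookup might work like this:

```python
# Hypothetical sketch: scoring a child's vocabulary count against
# age-matched normative samples, the kind of percentile lookup a
# standardised development tool ultimately supports. Data are invented.
from bisect import bisect_right

# Invented normative samples: vocabulary counts at each age in months
norms = {
    12: sorted([3, 5, 8, 10, 12, 15, 18, 22, 30, 45]),
    18: sorted([20, 35, 50, 60, 75, 90, 110, 140, 180, 250]),
}

def percentile_rank(age_months, vocab_count):
    """Percentage of the age-matched sample scoring at or below the child."""
    sample = norms[age_months]
    return 100.0 * bisect_right(sample, vocab_count) / len(sample)

print(percentile_rank(18, 75))  # 50.0: median of this toy 18-month sample
```

A real standardised tool would rest on a large, representative UK sample rather than a toy dictionary, but the scoring logic is the same: rank the child within the norms for their age.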

Filed under language language development UK Communicative Development Inventory children psychology science

40 notes



Video-based Test to Study Language Development in Toddlers and Children with Autism
Parents often wonder how much of the world their young children really understand. Though typically developing children are not able to speak or point to objects on command until they are between eighteen months and two years old, they do provide clues that they understand language as early as the age of one. These clues provide a point of measurement for psychologists interested in language comprehension of toddlers and young children with autism, as demonstrated in a new video-article published in JoVE (Journal of Visualized Experiments). 
In the assessment, psychologists track a child’s eye movements while the child watches two side-by-side videos. Children who understand language are more likely to look at the video that matches the audio, so comprehension is tested by attention rather than by asking the child to respond or point something out. Furthermore, all assessments can be conducted in the child’s home, using mobile, commercially available equipment. The technique, developed in the laboratory of Dr. Letitia Naigles, is known as a portable intermodal preferential looking (IPL) assessment.
"When I started working with children with autism, I realized that they have similar issues with strangers that very young typical children do," Dr. Naigles tells us. "Children with autism may understand more than they can show because they are not socially inclined and find social interaction aversive and challenging." Dr. Naigles’ approach helps make this assessment more valuable. By testing the child in the home, where they are comfortable, Dr. Naigles removes much of the anxiety associated with a new environment that may skew results.
While this technique identifies some similarities between typically developing toddlers and children with autism spectrum disorder, such as understanding some types of sentences before they produce them, this does not mean that the children are the same. “Some strategies of word learning that typical children have acquired are not demonstrated in children with autism,” Dr. Naigles says. By illuminating both strengths and weaknesses, the test is valuable for assessing language development. “JoVE is useful because in the past, I have gone to visit various labs to coach them in putting together an IPL. JoVE will enable other labs to set up the procedure more efficiently.” JoVE associate editor Allison Diamond stated, “Showing this work in a video format will allow other scientists in the field to quickly adapt Dr. Naigles’ technique, and use it to address the question of language development in autism, an extremely important field of research.”
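The dependent measure in preferential-looking designs is simple: the proportion of looking time directed at the screen that matches the audio. A minimal scoring sketch, assuming per-frame gaze codes (this is an illustration, not Dr. Naigles’ actual analysis pipeline):

```python
# Assumed per-frame gaze codes: 'target' (screen matching the audio),
# 'distractor' (the other screen), or 'away' (looking elsewhere).

def proportion_to_target(gaze_codes):
    """Proportion of on-screen looking directed at the matching video."""
    target = gaze_codes.count("target")
    distractor = gaze_codes.count("distractor")
    looking = target + distractor        # frames coded 'away' are ignored
    return target / looking if looking else 0.0

trial = ["target"] * 42 + ["distractor"] * 18 + ["away"] * 10
print(proportion_to_target(trial))  # 0.7: mostly looking at the matching video
```

A proportion reliably above 0.5 across trials is taken as evidence that the child understood the audio, with no verbal response or pointing required.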

Filed under autism language language development eye movements language comprehension psychology neuroscience science

132 notes

Moms’ depression affects babies’ language development – but so does anti-depressant drug – research shows
Janet Werker and her colleagues played recordings to babies when they were still in the womb.
Then the University of British Columbia psychologist and her team tested babies’ ability to discriminate between English and French when the infants were just six and 10 months old.
The findings, published Monday, are striking.
Both maternal depression, which affects up to 20 per cent of pregnant women, and treating mothers with a common anti-depressant drug threw off infants’ language development, Werker and her colleagues at the University of British Columbia and Harvard University report in the U.S. Proceedings of the National Academy of Sciences.
Babies of depressed mothers were slow to reach language development “milestones,” they report. And babies of mothers taking antidepressants known as serotonin reuptake inhibitors (SRIs) reached milestones months early, they report.

Filed under brain infants development language development depression maternal depression neuroscience psychology science

43 notes

Babies’ ability to detect complex rules in language outshines that of adults
New research examining auditory mechanisms of language learning in babies has revealed that infants as young as three months of age are able to automatically detect and learn complex dependencies between syllables in spoken language. By contrast, adults only recognized the same dependencies when asked to actively search for them. The study by scientists at the Max Planck Institute for Human Cognitive and Brain Sciences in Leipzig also highlights the important role of basic pitch discrimination abilities for early language development.
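Studies of this kind typically use artificial-grammar streams in which the first syllable of a triplet predicts the third while the middle syllable varies freely (A–X–B frames). As an illustrative sketch with invented syllables (not the Max Planck stimuli), such a stream could be generated like this:

```python
# Illustrative A-X-B artificial grammar: each first syllable (A) is paired
# with a specific third syllable (B); the middle syllable (X) varies freely.
# The non-adjacent A->B rule is what the infants detect. Syllables invented.
import random

dependencies = {"le": "do", "wi": "to"}   # invented A -> B pairings
middles = ["ba", "mi", "ku", "na"]        # free-varying X syllables

def make_triplet(rng):
    a = rng.choice(list(dependencies))
    return (a, rng.choice(middles), dependencies[a])

rng = random.Random(0)                    # seeded for reproducibility
stream = [make_triplet(rng) for _ in range(4)]
for a, x, b in stream:
    assert dependencies[a] == b           # every triplet obeys the rule
print(stream)
```

Learning is then assessed by whether listeners respond differently to triplets that violate the A–B pairing, which is where the infants outperformed passively listening adults.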

Filed under brain language language development linguistics neuroscience psychology learning science

162 notes

Languages are extremely diverse, but they are not arbitrary. Behind the bewildering, contradictory ways in which different tongues conceptualise the world, we can sometimes discern order. Linguists have traditionally assumed that this reflects the hardwired linguistic aptitude of the human brain. Yet recent scientific studies propose that language “universals” aren’t simply prescribed by genes but that they arise from the interaction between the biology of human perception and the bustle, exchange and negotiation of human culture.
Language has a logical job to do—to convey information—and yet it is riddled with irrationality: irregular verbs, random genders, silent vowels, ambiguous homophones. You’d think languages would evolve towards an optimal state of concision, but instead they accumulate quirks that hinder learning, not only for foreigners but also for native speakers.
Linguists have traditionally explained these peculiarities by reference to the history of the people who speak each language. That’s often fascinating, but it does not yield general principles about how languages develop, or how they will change in future. As they evolve, what guides their form?
Read more

Filed under neuroscience psychology brain language linguistics language development science
