Posts tagged grammar

The sound of small children chattering has always been considered cute – but not particularly sophisticated. However, research by a Newcastle University expert has shown their speech is far more advanced than previously understood.

Dr Cristina Dye, a lecturer in child language development, found that two- to three-year-olds use grammar far sooner than expected.
She studied fifty French-speaking youngsters aged between 23 and 37 months, capturing tens of thousands of their utterances.
Dr Dye, who carried out the research while at Cornell University in the United States, found that the children were using ‘little words’ that form the skeleton of sentences, such as a, an, can, and is, far sooner than previously thought.
Dr Dye and her team used advanced recording technology, including highly sensitive microphones placed close to the children, to capture the precise sounds the children voiced. They spent years painstakingly analysing every minute sound made by the toddlers and the context in which it was produced.
They found a clear, yet previously undetected, pattern of sounds and puffs of air, which consistently replaced grammatical words in many of the children’s utterances.
Dr Dye said: “Many of the toddlers we studied made a small sound, a soft breath, or a pause, at exactly the place that a grammatical word would normally be uttered.”
“The fact that this sound was always produced in the correct place in the sentence leads us to believe that young children are knowledgeable of grammatical words. They are far more sophisticated in their grammatical competence than we ever understood.
“Despite the fact the toddlers we studied were acquiring French, our findings are expected to extend to other languages. I believe we should give toddlers more credit – they’re much more amazing than we realised.”
For decades the prevailing view among developmental specialists has been that children’s early word combinations are devoid of grammatical words. On this view, children then undergo a ‘tadpole to frog’ transformation in which, through some unknown mechanism, grammar suddenly emerges in their speech. Dye’s results challenge that view.
Dr Dye said: “The research sheds light on a really important part of a child’s development. Language is one of the things that makes us human and understanding how we acquire it shows just how amazing children are.
“There are also implications for understanding language delay in children. When children don’t learn to speak normally it can lead to serious issues later in life. For example, those affected are more likely to suffer from mental illness or be unemployed. If we can understand what is ‘normal’ as early as possible then we can intervene sooner to help those children.”
The research was originally published in the Journal of Linguistics.
(Source: ncl.ac.uk)
Grammar errors? The brain detects them even when you are unaware
Your brain often works on autopilot when it comes to grammar. That theory has been around for years, but University of Oregon neuroscientists have captured elusive hard evidence that people indeed detect and process grammatical errors with no awareness of doing so.
Participants in the study — native-English speaking people, ages 18-30 — had their brain activity recorded using electroencephalography, from which researchers focused on a signal known as the Event-Related Potential (ERP). This non-invasive technique allows for the capture of changes in brain electrical activity during an event. In this case, events were short sentences presented visually one word at a time.
Subjects were given 280 experimental sentences, including some that were syntactically (grammatically) correct and others containing grammatical errors, such as “We drank Lisa’s brandy by the fire in the lobby,” or “We drank Lisa’s by brandy the fire in the lobby.” A 50 millisecond audio tone was also played at some point in each sentence. A tone appeared before or after a grammatical faux pas was presented. The auditory distraction also appeared in grammatically correct sentences.
This approach, said lead author Laura Batterink, a postdoctoral researcher, provided a signature of whether awareness was at work during processing of the errors. “Participants had to respond to the tone as quickly as they could, indicating if its pitch was low, medium or high,” she said. “The grammatical violations were fully visible to participants, but because they had to complete this extra task, they were often not consciously aware of the violations. They would read the sentence and have to indicate if it was correct or incorrect. If the tone was played immediately before the grammatical violation, they were more likely to say the sentence was correct even if it wasn’t.”
When tones appeared after grammatical errors, subjects detected 89 percent of the errors. In cases where subjects correctly declared errors in sentences, the researchers found a P600 effect, an ERP response in which the error is recognized and corrected on the fly to make sense of the sentence.
When the tones appeared before the grammatical errors, subjects detected only 51 percent of them. The tone before the event, said co-author Helen J. Neville, who holds the UO’s Robert and Beverly Lewis Endowed Chair in psychology, created a blink in their attention. The key to conscious awareness, she said, is whether or not a person can declare an error, and the tones disrupted participants’ ability to declare the errors. But even when the participants did not notice these errors, their brains responded to them, generating an early negative ERP response. These undetected errors also delayed participants’ reaction times to the tones.
“Even when you don’t pick up on a syntactic error your brain is still picking up on it,” Batterink said. “There is a brain mechanism recognizing it and reacting to it, processing it unconsciously so you understand it properly.”
The study was published in the May 8 issue of the Journal of Neuroscience.
The brain processes syntactic information implicitly, in the absence of awareness, the authors concluded. “While other aspects of language, such as semantics and phonology, can also be processed implicitly, the present data represent the first direct evidence that implicit mechanisms also play a role in the processing of syntax, the core computational component of language.”
It may be time to reconsider some teaching strategies, especially how adults are taught a second language, said Neville, a member of the UO’s Institute of Neuroscience and director of the UO’s Brain Development Lab.
Children, she noted, often pick up grammar rules implicitly through routine daily interactions with parents or peers, simply hearing and processing new words and their usage before any formal instruction. She likened such learning to “Jabberwocky,” the nonsense poem introduced by writer Lewis Carroll in 1871 in “Through the Looking Glass,” where Alice discovers a book in an unrecognizable language that turns out to be written inversely and readable in a mirror.
For a second language, she said, “Teach grammatical rules implicitly, without any semantics at all, like with Jabberwocky. Get them to listen to Jabberwocky, like a child does.”