Neuroscience

Articles and news from the latest research reports.

Posts tagged sign language

What sign language teaches us about the brain

The world’s leading humanoid robot, ASIMO, has recently learnt sign language. The news of this breakthrough came just as I completed Level 1 of British Sign Language (I dare say it took me longer to master signing than it did the robot!). As a neuroscientist, the experience of learning to sign made me think about how the brain perceives this means of communicating.

For instance, during my training, I found that mnemonics greatly simplified my learning process. To sign the colour blue, you use the fingers of your right hand to rub the back of your left hand; my simple mnemonic for this sign being that the veins on the back of our hand appear blue. I was therefore forming an association between the word blue (English), the sign for blue (BSL), and the visual aid that links the two. However, the two languages differ markedly in that one relies on sounds and the other on visual signs.

Do our brains process these languages differently? It seems that for the most part, they don’t. And it turns out that brain studies of sign language users have helped bust a few myths.

Filed under sign language, neuroimaging, communication, lesion studies, neuroscience, science

The Road to Language Learning Is Iconic

Languages are highly complex systems, and yet most children seem to acquire language easily, even in the absence of formal instruction. New research on young children’s use of British Sign Language (BSL) sheds light on one mechanism – iconicity – that may play an important role in children’s ability to learn language.

For spoken and written language, the arbitrary relationship between a word’s form – how it sounds or how it looks on paper – and its meaning is a particularly challenging feature of language acquisition. But one of the first things people notice about sign languages is that signs often represent aspects of meaning in their form. For example, in BSL the sign EAT involves bringing the hand to the mouth, just as you would if you were bringing food to the mouth to eat it.

In fact, a high proportion of signs across the world’s sign languages are similarly iconic, connecting human experience to linguistic form.

Robin Thompson and colleagues David Vinson, Bencie Woll, and Gabriella Vigliocco at the Deafness, Cognition and Language Research Centre (DCAL) at University College London in the United Kingdom wanted to examine whether this kind of iconicity might provide a key to understanding how children come to link words to their meaning.

Their findings are published in Psychological Science, a journal of the Association for Psychological Science.

(Photo: David Levene)

Filed under language, sign language, iconicity, BSL, language acquisition, neuroscience, psychology, science

Giving a voice to the voiceless has been a cause that many have championed throughout history, but it’s safe to say that none of those efforts involved packing a bunch of sensors into a glove. A team of Ukrainian students has done just that in order to translate sign language into vocalized speech via a smartphone.

The inspiration for the gloves came from observing deaf fellow students having difficulty communicating with other students, which resulted in their being excluded from activities. Initially, the team looked at commercially available gloves that could be modified to interpret a range of signs, but in the end they opted to develop their own.

In their glove, a total of 15 flex sensors in the fingers measure the degree of bending, while a compass, accelerometer, and gyroscope track the motion of the glove through space. The sensor data are processed by a microcontroller on the glove and then sent via Bluetooth to a mobile device, which translates the positions of the hand and fingers into text when a pattern is recognized. Using Microsoft’s Speech and Bing APIs, the text is then spoken aloud by the phone, which runs Windows Phone 7. The glove can also plug into a PC for data syncing and battery charging.
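The team hasn’t published their recognition code, but the pattern-matching step described above can be sketched roughly as follows. This is a minimal illustration, not the students’ actual implementation: the sign names, template values, and distance threshold are all hypothetical, and a real system would also fold in the IMU data and match motion over time rather than single snapshots.

```python
import math

# Hypothetical sign "templates": each sign is a stored snapshot of the
# 15 flex-sensor readings (degree of bend per sensor, scaled 0.0-1.0).
SIGN_TEMPLATES = {
    "hello": [0.1] * 15,
    "thanks": [0.9, 0.9, 0.9, 0.1, 0.1] * 3,
}

def euclidean(a, b):
    """Straight-line distance between two sensor snapshots."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(flex_readings, threshold=1.0):
    """Nearest-neighbour match of a sensor snapshot to a known sign.

    Returns the matched sign's text label (which the phone would hand
    to a text-to-speech API), or None if no template is close enough.
    """
    best_sign, best_dist = None, float("inf")
    for sign, template in SIGN_TEMPLATES.items():
        d = euclidean(flex_readings, template)
        if d < best_dist:
            best_sign, best_dist = sign, d
    return best_sign if best_dist <= threshold else None
```

For example, a noisy reading near the "hello" template, such as `recognize([0.12] * 15)`, matches "hello", while a hand position far from every template returns None and produces no speech. The threshold is what keeps the glove from vocalizing random hand movements.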

Filed under hearing loss, sign language, technology, speech, vocalization, neuroscience, psychology, science
