Sign Language, Neurolinguistics and You: Why sign language is every bit as much a language as any other, and what it helps tell us about the brain and language.
So today I’m writing a bit about the study of sign language in linguistics. Sign language research is not my area, and most of what I know comes from research in psycholinguistics, neurolinguistics and cognitive science. I don’t know how to sign, and I have only a passing familiarity with how sign language is used in the real world. That being said, there are some very interesting things about sign language, and I think it’d be pretty handy to know them.
First off – sign language is a full ‘natural language’. It is just as capable of expressing anything a spoken language can, and really the only difference is that it’s done with signs instead of speech. It is in fact so similar that sign language has syllables, vowels and consonants, as well as an equally complex syntactic system. Not only that, but it has things like rhyming too! So, in linguistics, we say that sign language is a natural language just expressed in a different modality (i.e. the hands).
Like I said above, I’m not really great with the structure of sign language, and a native signer will be able to fill you in much better on the specifics. I might talk a little about the presence of consonants and vowels in sign language, but it’s mostly an analysis thing in my mind. What I really want to make clear is – if it wasn’t obvious – that signers are not impaired in any linguistic sense at all, and a sign language is neither superior nor inferior to any other language.
Sign language has fascinating ways of encoding information, but none of these are fundamentally different from spoken languages. Sign languages have morphology, word order and agreement like every other natural language. A classic way in which the meanings of signs differ is spatially. An example of this can be seen below:
(Figures taken from Hickok, G., Bellugi, U. and Klima. E. The neural organization of language: evidence from sign language aphasia. Trends in Cognitive Sciences – Vol. 2, No. 4, April 1998)
The meaning of a sign depends on the point of the face where it is made, the orientation of the palm, and the movement of the hand. An interesting aspect of this is the role of eyebrows and facial expressions in pragmatic communication. There’s more to it, but I’m sure a native signer could explain it better. The thing I want to point out is that sign language is highly spatial, and so meaning depends on where things are placed in physical space. This is used to express different meanings, and it also allows easy reference back to subjects or objects without directly naming them again (for example, the English equivalent of signing “Tom likes Sarah.”, where the signs for Tom and Sarah are placed in different spatial areas, and can then be referred back to by signs like “He” or “She” that point to the spatial locations where they were originally signed).
So why am I going on about spatial information encoding? Well, I’ll come back to that in a second; first I want to talk about aphasias and localised hemispheric damage to the brain.
The brain is organised into two hemispheres, which are contralateral. This means that each hemisphere of the brain is responsible for the opposite side of the body: the left hemisphere controls the right side of the body, and the right hemisphere controls the left side. Different functions of the brain are localised to different hemispheres, and for 98% of right-handed people, the part dedicated to processing language is in the left hemisphere (there are some exceptions, but in the majority of cases the left hemisphere handles language in humans). The right hemisphere is usually responsible for spatial awareness and navigation. The left hemisphere’s language areas can be further divided into two – Broca’s area and Wernicke’s area (I might go into more depth on aphasias in another post, but I’ll go over these superficially now).
When damage occurs to one of these areas, the patient usually develops an aphasia, which is an impairment of language. There is a whole range of different aphasias, from the very severe loss of language to the somewhat bizarre, such as losing the ability to speak about a very specific semantic set (like vegetables, but not fruit!). However, the two most famous and classic types are Broca’s aphasia and Wernicke’s aphasia, which result from lesions to or near the respective areas.
Broca’s aphasia is characterised by heavy impairment of syntax: sufferers have incredible trouble producing words, and rarely form full grammatical sentences.
Wernicke’s aphasia, on the other hand, retains the fluency of ordinary speech, but the content is semantically nonsensical.
So why am I talking about aphasias in regard to sign language? I promise I’ll come to that very soon! The photo of the brain below gives more detail about the various functions and areas.
You may notice that near Broca’s area there’s the motor region controlling fine mouth movement/vocalisation, and near Wernicke’s area there’s the primary auditory cortex. For a long time, these were thought to be the likely culprits behind the two types of aphasia. It seems plausible that interference with the motor controls responsible for vocalisation would result in the halting speech of Broca’s aphasia, and that problems with the primary auditory cortex could result in Wernicke’s aphasia, right?
But wait, what if a native signer of a sign language has aphasia? The answer is that they show the same sorts of problems in their signing as other aphasics do in their speech!
It turns out that signers also make the same types of phonological errors as speakers do, though these manifest as different kinds of mistakes, owing to the way information is encoded in a sign language:
Now, a while back you may have noticed that I mentioned that the right hemisphere is responsible for spatial awareness and the like. So, for a heavily spatially-oriented language like sign language, what do you think would happen to a signer with right-hemisphere damage? Right-hemisphere damage can result in an inability to spatially orient oneself, and marked difficulty in using space, which shows up on tests:
The very surprising result is that even signers who are severely deficient in spatial awareness show no impairment in their signing! This is quite amazing, considering how important spatial information is to sign language.
Anyway, if people are interested in more, I definitely recommend checking out What the Hands Reveal About the Brain by Poizner, Klima and Bellugi.
For more on sign language and deaf culture, please check out im a tough tootin babby i can punch all ur buns’ excellent thread!