Sign Language in the Brain

Brain centers responsible for language processing

In 1861, Paul Broca studied patients who could understand spoken language but could not produce it. The damaged region, named Broca's area, is located in the left hemisphere's inferior frontal gyrus (Brodmann areas 44 and 45).[1] Soon after, in 1874, Carl Wernicke studied patients with the reverse deficit: they could produce spoken language but could not comprehend it. The damaged region, named Wernicke's area, is located in the left hemisphere's posterior superior temporal gyrus.[2]

It was noted that Broca's area lies near the part of the motor cortex controlling the face and mouth. Likewise, Wernicke's area lies near the auditory cortex.[2] These motor and auditory areas are important in spoken language processing and production. For this reason, the left hemisphere came to be described as the verbal hemisphere, with the right hemisphere regarded as responsible for spatial tasks.[2]

A debate then arose around sign languages: how are these languages organized in the brain? It was hypothesized that the deaf equivalent of Broca's aphasia would arise from damage near the part of the motor cortex controlling the hands, and that the deaf equivalent of Wernicke's aphasia would arise from damage near the visual cortex.[2]

The language of ASL: similarities to and differences from spoken languages

Unlike spoken languages, which are encoded in vocal-auditory signals, signed languages rely on visual-spatial signals to convey meaning. The neural organization underlying sign language abilities, however, has more in common with that of spoken language than with the neural organization underlying visuospatial processing.[2] When deaf and hearing subjects communicate in their respective languages, similar brain regions are activated in both groups, with deaf subjects showing some additional activation near the visual cortex.[3] In other words, like spoken languages, signed languages are controlled by the left hemisphere of the brain, in Broca's and Wernicke's areas.[3][4]

American Sign Language is a highly structured linguistic system that includes all the complexities of spoken languages. Similarly, many other signed languages have their own phonological, morphological, and syntactic characteristics.[5] Like spoken languages, they are not senseless strings of motor actions but stem from higher-order functions in the brain.[2][5]

A characteristic specific to signed languages is "signing space," the area in front of the signer in which signs are articulated. Signing space is often used to encode relationships between arguments in discourse, and it can also take the place of spoken prepositions; rather than using words to describe spatial relationships, signing space allows these relationships to be represented visually.[6]

Lesion studies

Left hemisphere damage

To determine the brain structures associated with the processing and production of signed languages, signers with left- and right-hemisphere damage were studied. Those with left hemisphere damage (LHD), in areas ranging from the frontal lobe to the occipital lobe, exhibited symptoms of both Broca's and Wernicke's aphasia. Patients performed poorly on many language-based tasks, such as comprehending signs and sentences and signing fluently. Similar to the "slips of the tongue" produced by hearing patients after LHD, deaf LHD patients produced paraphasias, or "slips of the hand." These slips usually involve an incorrect handshape produced in the correct location and with the correct movement, much as a hearing patient might substitute "bline" or "gine" for "fine."[2]

It was determined that these deficits in sign language articulation were not due to general motor problems: patients who had difficulty signing were nevertheless capable of producing meaningless hand and arm gestures on command.[2]

Right hemisphere damage

Signers with right-hemisphere damage (RHD), again in areas ranging from the frontal lobe to the occipital lobe, had no problems with fluency or with correct sign comprehension and production. Even when nonlinguistic visuospatial abilities, such as drawing or copying, were compromised, patients could communicate effectively.[2]

Some right hemisphere damage does disrupt sign language, however. The topographical use of signing space is often imprecise in patients with RHD: the correspondence between the location of the hands in signing space and the location of objects in physical space is impaired. Rather than being misunderstood, however, the subjects and objects of a sentence may simply be placed incorrectly relative to one another, as in saying "the pencil is in the book" rather than "the pencil is on top of the book."[6]

Treatment for deaf aphasics

Learning sign language to communicate after a stroke has been a treatment option for hearing aphasics, but there is currently little published research on treatment for signers with aphasia or other communication deficits. Understanding the neural underpinnings of sign language is, however, a large step toward such treatment research.

References

  1. Dronkers NF, Plaisant O, Iba-Zizen MT and Cabanis EA (2007) Paul Broca's historic cases: high resolution MR imaging of the brains of Leborgne and Lelong. Brain 130.5: 1432–1441.
  2. Hickok G, Bellugi U and Klima ES (2011) Sign language in the brain. Scientific American June: 46–53.
  3. MacSweeney M, Woll B, Campbell R, McGuire PK, David AS, Williams SCR, Suckling J, Calvert GA and Brammer MJ (2002) Neural systems underlying British Sign Language and audio-visual English processing in native users. Brain 125: 1583–1593.
  4. Emmorey K, Mehta S and Grabowski TJ (2007) The neural correlates of sign versus word production. NeuroImage 36.1: 202–208.
  5. Hickok G, Bellugi U and Klima ES (1996) The neurobiology of sign language and its implications for the neural basis of language. Nature 381.6584: 699–702.
  6. Emmorey K, Damasio H, McCullough S, Grabowski T, Ponto LLB, Hichwa RD and Bellugi U (2002) Neural systems underlying spatial language in American Sign Language. NeuroImage 17: 812–824.