Phonological memory in sign language relies on the visuomotor neural system outside the left hemisphere language network

  • Kanazawa, Yuji
    Human Brain Research Center, Kyoto University Graduate School of Medicine / Department of Otolaryngology-Head and Neck Surgery, Kyoto University Graduate School of Medicine
  • Nakamura, Kimihiro
    Human Brain Research Center, Kyoto University Graduate School of Medicine / Faculty of Human Sciences, University of Tsukuba
  • Ishii, Toru
    Human Brain Research Center, Kyoto University Graduate School of Medicine
  • Aso, Toshihiko
    Human Brain Research Center, Kyoto University Graduate School of Medicine
  • Yamazaki, Hiroshi
    Department of Otolaryngology-Head and Neck Surgery, Kyoto University Graduate School of Medicine
  • Omori, Koichi
    Department of Otolaryngology-Head and Neck Surgery, Kyoto University Graduate School of Medicine

Abstract

Sign language is an essential medium of everyday social interaction for deaf people and plays a critical role in verbal learning. In particular, language development in deaf people is thought to rely heavily on verbal short-term memory (STM) via sign language. Most previous studies have compared neural activation during sign language processing in deaf signers with that during spoken language processing in hearing speakers. It thus remains unclear how, in sign language users, visuospatial inputs are converted into the verbal STM operating in the left-hemisphere language network. Using functional magnetic resonance imaging, the present study examined neural activation while bilinguals of spoken and signed language performed a sequence memory span task. On each trial, participants viewed a nonsense syllable sequence (4–7 syllables in length) presented either as written letters or as fingerspelling and then held the sequence in memory for 12 s. Behavioral analysis revealed that participants relied on phonological memory while holding verbal information, regardless of input modality. At the neural level, this maintenance stage broadly activated the left-hemisphere language network, including the inferior frontal gyrus, supplementary motor area, superior temporal gyrus, and inferior parietal lobule, in both the letter and fingerspelling conditions. Interestingly, although most participants reported relying on phonological memory during maintenance, direct comparisons between letters and fingerspelling revealed strikingly different patterns of neural activation during the same period: effortful maintenance of fingerspelling inputs, relative to letter inputs, activated the left superior parietal lobule and dorsal premotor area, brain regions known to play a role in the visuomotor analysis of hand/arm movements. These findings suggest that the dorsal visuomotor neural system subserves verbal learning via sign language by relaying gestural inputs to the classical left-hemisphere language network.

Journal

  • PLOS ONE

    PLOS ONE 12(9): e0177599, 2017-09-20

    Public Library of Science (PLoS)
