Philosophy Colloquium
Xuan Wang
University of Maryland
Modality Difference in Writing and Sign Language

Natural language exists in different modalities: spoken, written, signed, or brailled. Although there is ample evidence that languages in different modalities exhibit rather different structures, there is a ‘glottocentric bias’ among theorists, i.e. the assumption that sound is central, if not essential, to language. Speech is usually considered the primary linguistic modality, while other modalities (e.g. sign language) are taken to involve only surface perceptual differences arising at the interface of the language faculty and the perceptual system. Theorists generally assume that explanations of the facts about linguistic structure in spoken language can be generalized to other modalities: many linguists (e.g. Chomsky, Bromberger) have used observed features of spoken language to draw inferences about the nature of the human language faculty, on the assumption that the principles governing spoken language also govern linguistic phenomena in other modalities. In this talk, I will argue that this traditional picture, which takes spoken language as the primary modality through which to understand the other modalities of language, is probably mistaken. I propose to examine the written and spoken modalities of logographic languages like Chinese, and to compare the syllable structures of speech with those of sign language. With respect to Chinese, I will show that there are cases in which meanings cannot be disambiguated by analyzing the spoken sounds alone; we must instead look at the written words. I argue that there may be different procedures for mapping forms in different modalities to meanings. As for sign language, I argue that its so-called ‘syllable structure’ differs substantially from that of spoken language, and that the difference may lie at the linguistic rather than merely the perceptual level.

Tuesday, January 22, 2013

Skinner 1115