
Let's not forget the role of deafness in sign/speech bilingualism*

Published online by Cambridge University Press:  02 July 2015

BENCIE WOLL
Affiliation:
ESRC Deafness, Cognition and Language Research Centre (DCAL), University College London, UK
MAIRÉAD MACSWEENEY*
Affiliation:
ESRC Deafness, Cognition and Language Research Centre (DCAL), University College London, UK Institute of Cognitive Neuroscience, University College London, UK
*
Address for correspondence: Mairéad MacSweeney, Institute of Cognitive Neuroscience, University College London, 17 Queen Square, London WC1N 3AR. Email: m.macsweeney@ucl.ac.uk


Type
Peer Commentaries
Copyright
Copyright © Cambridge University Press 2015 

Emmorey, Giezen and Gollan (in press) address the fascinating question of what can be learnt about language, cognition and the brain from the unique group of people who have grown up learning both a signed and a spoken language. The focus of their review is hearing individuals – referred to as hearing bimodal bilinguals. The review presents an excellent overview of research in this field and highlights the unique insights that this population can provide.

However, the review leaves open an important question: to what extent can research with hearing bimodal bilinguals inform our understanding of the consequences of sign–speech bilingualism in deaf people? The answer is probably less than we would wish.

The first issue to consider is one of terminology. Emmorey et al. refer to deaf bilinguals as ‘deaf bimodal bilinguals’, just as the term is applied to ‘hearing bimodal bilinguals’. However, the use of the term ‘bimodal’ here is confusing. Hearing individuals who are bilingual in sign and speech do indeed access their languages via two different modalities (here meaning senses): primarily auditory for speech and visual for sign. For those born deaf, however, access to both languages is through the visual modality. Thus, the application of the term ‘bimodal’ to this group is misleading. For the sake of clarity in the field it is perhaps more appropriate to refer to these individuals as ‘deaf unimodal sign–speech bilinguals’ or, more precisely, ‘deaf sign language and spoken/written language bilinguals’.

The use of these terms may be rejected by some in the field who prefer the term sign–print bilinguals (Piñar, Dussias & Morford, 2011; Kubus, Villwock, Morford & Rathmann, 2014). That term, however, makes the strong assumption that written text exists in isolation from what it actually represents – speech. Yet there is ample evidence that deaf signers make use of elements that are recognisably derived from spoken language.

As Emmorey et al. comment in the final section of their paper, “. . . mouthings from spoken language words . . . are often produced silently and simultaneously with signs.” The mouthings referred to are mouth actions (silent – not whispered) produced by deaf signers which represent words from the surrounding dominant spoken language. For the most part the semantics of the mouthing and sign are the same (Bank, Crasborn & van Hout, 2011). This phenomenon has been observed and studied in a large number of sign languages, beginning with the work of Vogt-Svendsen on Norwegian Sign Language (Vogt-Svendsen, 1981, 2001), and including studies of non-Western sign languages such as Adamorobe Sign Language (Ghana) (Nyst, 2007) and Inuit Sign Language (Schuit, 2013), as well as studies of British Sign Language (Sutton-Spence & Day, 2001; Sutton-Spence, 2007), Irish Sign Language (Mohr, 2012), German Sign Language (Hohenberger & Happ, 2001), and Sign Language of the Netherlands (Schermer, 1990). Although there has been very little study of mouthing in ASL, Nadolske and Rosenstock (2007) have demonstrated that the use of mouthing in ASL is comparable to that found in other sign languages. Indeed, the only sign language which has been reported not to make use of mouthings is Kata Kolok, a sign language used by a village community on Bali (de Vos & Zeshan, 2012).

There is ongoing debate about the linguistic status of mouthings. Many studies describe mouthings as part of the sign language lexicon – i.e., the lexical representations of signs include both oral and manual information (Vogt-Svendsen, 2001; van de Sande & Crasborn, 2009); other researchers have argued the opposite: mouthings and signs are represented and accessed independently and reflect knowledge of two languages (Ebbinghaus & Hessmann, 2001; Vinson, Thompson, Skinner, Fox & Vigliocco, 2010). Using fMRI with deaf native signers, we have reported that BSL signs with speech-like mouth actions showed greater superior temporal activation, whereas signs made with non-speech-like mouth actions showed more activation in posterior and inferior temporal regions (Capek et al., 2008). Thus, the brain does appear to care about the status of mouthings used in signed languages. This finding suggests that consideration of mouthings, vis-à-vis code-blends, is crucial to any discussion of bilingualism in a signed and spoken language, especially in relation to deaf signers.

Evidence for the influence of elements of speech on signed languages also comes from fingerspelling. The pattern of ‘reductions’ in fingerspelling by skilled signers indicates that they do not reduce fingerspellings randomly or arbitrarily; rather, their reductions reflect speech rather than simply orthography. For example, the fingerspelled name CHARLES is more likely to be reduced to -C-H- than to -C-, preserving the digraph that represents the initial spoken sound rather than the initial letter alone (Sutton-Spence, 1994).

Research which considers deaf individuals as bilinguals is much rarer than studies of their hearing siblings, which reflects the relative sizes of the two populations. An additional factor making research in this field difficult is, as Emmorey et al. point out, the great variability in spoken (and signed) language proficiency within the deaf population – in terms of spoken language comprehension (lipreading) (e.g., Mohammed, Campbell, MacSweeney, Barry & Coleman, 2006) and production, and indeed in accessing a spoken language via text – reading (e.g., Mayberry, del Giudice & Lieberman, 2011). However, this variability should not be a reason to ignore the role of spoken language when considering deaf sign–speech bilinguals.

Although the literature is mixed, at least some studies with deaf adults and children indicate a role for speech phonology when deaf people read text (see Mayberry et al., 2011). This variability suggests that there may be many routes to successful reading for a deaf person. That some deaf people do not appear to make use of speech phonology in their skilled reading is not, we argue, cause to ignore the role that awareness of speech structure may play in the development of reading skills, or indeed the enduring role it may play in reading and reading-related skills for some deaf people (e.g., MacSweeney, Brammer, Waters & Goswami, 2009; Emmorey, Weisberg, McCullough & Petrich, 2013).

Whilst studies of hearing bimodal bilinguals can provide great insights into language, cognition and the brain, future research which considers their deaf siblings as ‘sign language–spoken language’ bilinguals may prove even richer.

Footnotes

* We acknowledge the support of a Wellcome Trust Fellowship to M.M. (GR075214MA) and a Centre Grant from the Economic and Social Research Council of Great Britain (RES-620-28-6001) to B.W. and M.M.

References

Bank, R., Crasborn, O., & van Hout, R. (2011). Variation in mouth actions with manual signs in Sign Language of the Netherlands (NGT). Sign Language & Linguistics, 14(2), 248–270.
Capek, C., Woll, B., MacSweeney, M., Waters, D., David, A. S., McGuire, P. K., Brammer, M., & Campbell, R. (2008). Both hand and mouth: Cortical correlates of lexical processing in BSL and speechreading. Journal of Cognitive Neuroscience, 20(7), 1220–1234.
de Vos, C., & Zeshan, U. (2012). Demographic, sociocultural and linguistic variation across rural signing communities. In U. Zeshan & C. de Vos (Eds.), Sign languages in village communities: anthropological and linguistic insights. Berlin: Walter de Gruyter, pp. 2–23.
Ebbinghaus, H., & Hessmann, J. (2001). Sign language as multidimensional communication: why manual signs, mouthings and mouth gestures are three different things. In P. Boyes-Braem & R. L. Sutton-Spence (Eds.), The hands are the head of the mouth: the mouth as articulator in sign languages. Hamburg: Signum Press, pp. 133–151.
Emmorey, K., Giezen, M. R., & Gollan, T. H. (in press). Psycholinguistic, cognitive, and neural implications of bimodal bilingualism. Bilingualism: Language and Cognition. doi: 10.1017/S1366728915000085
Emmorey, K., Weisberg, J., McCullough, S., & Petrich, J. A. F. (2013). Mapping the reading circuitry for skilled deaf readers: An fMRI study of semantic and phonological processing. Brain and Language, 126, 169–180.
Hohenberger, A., & Happ, D. (2001). The linguistic primacy of signs and mouth gestures over mouthing: evidence from language production in German Sign Language. In P. Boyes-Braem & R. L. Sutton-Spence (Eds.), The hands are the head of the mouth: the mouth as articulator in sign languages. Hamburg: Signum Press, pp. 153–190.
Kubus, O., Villwock, A., Morford, J. P., & Rathmann, C. (2014). Word recognition in deaf readers: Cross-language activation of German Sign Language and German. Applied Psycholinguistics. doi: 10.1017/S0142716413000520. Published online: 27 January 2014.
MacSweeney, M., Brammer, M., Waters, D., & Goswami, U. (2009). Enhanced activation of the left inferior frontal gyrus in deaf and dyslexic adults during rhyming. Brain, 132, 1928–1940.
Mayberry, R. I., del Giudice, A. A., & Lieberman, A. M. (2011). Reading achievement in relation to phonological coding and awareness in deaf readers: A meta-analysis. Journal of Deaf Studies and Deaf Education, 16(2), 164–188.
Mohammed, T., Campbell, R., MacSweeney, M., Barry, F., & Coleman, M. (2006). Speechreading and its association with reading among deaf, hearing and dyslexic individuals. Clinical Linguistics & Phonetics, 20(7–8), 621–630.
Mohr, S. (2012). The visual-gestural modality and beyond: mouthings as a language contact phenomenon in Irish Sign Language. Sign Language & Linguistics, 15(2), 185–211.
Nadolske, M. A., & Rosenstock, R. (2007). The occurrence of mouthings in American Sign Language: a preliminary study. In P. Perniss, R. Pfau, & M. Steinbach (Eds.), Visible variation: comparative studies on sign language structure. Berlin: Mouton de Gruyter, pp. 35–62.
Nyst, V. A. S. (2007). A descriptive analysis of Adamorobe Sign Language (Ghana). Utrecht: LOT.
Piñar, P., Dussias, P., & Morford, J. (2011). Deaf readers as bilinguals: an examination of Deaf readers’ print comprehension in light of current advances in bilingualism and second language processing. Language and Linguistics Compass, 5, 691–704.
van de Sande, I., & Crasborn, O. (2009). Lexically bound mouth actions in Sign Language of the Netherlands: a comparison between different registers and age groups. Linguistics in the Netherlands, 26, 78–90.
Schermer, T. (1990). In search of a language: Influences from spoken Dutch on Sign Language of the Netherlands. Delft: Eburon.
Schuit, J. (2013). Signing in the Arctic: external influences on Inuit Sign Language. In U. Zeshan & C. de Vos (Eds.), Sign languages in village communities: anthropological and linguistic insights. Berlin: Walter de Gruyter, pp. 181–208.
Sutton-Spence, R. L. (1994). The role of the manual alphabet and fingerspelling in British Sign Language. PhD thesis, University of Bristol.
Sutton-Spence, R. L. (2007). Mouthings and simultaneity in British Sign Language. In M. Vermeerbergen, L. Leeson, & O. Crasborn (Eds.), Simultaneity in signed languages. Amsterdam: John Benjamins, pp. 147–162.
Sutton-Spence, R. L., & Day, L. (2001). Mouthings and mouth gestures in British Sign Language (BSL). In P. Boyes-Braem & R. L. Sutton-Spence (Eds.), The hands are the head of the mouth: the mouth as articulator in sign languages. Hamburg: Signum Press, pp. 69–86.
Vinson, D. P., Thompson, R., Skinner, R., Fox, N., & Vigliocco, G. (2010). The hands and mouth do not always slip together in British Sign Language: dissociating articulatory channels in the lexicon. Psychological Science, 21(8), 1158–1167.
Vogt-Svendsen, M. (1981). Mouth position & mouth movement in Norwegian Sign Language. Sign Language Studies, 33, 363–376.
Vogt-Svendsen, M. (2001). A comparison of mouth gestures and mouthings in Norwegian Sign Language (NSL). In P. Boyes-Braem & R. L. Sutton-Spence (Eds.), The hands are the head of the mouth: the mouth as articulator in sign languages. Hamburg: Signum Press, pp. 9–40.