PLM2016 Thematic session: The Three A’s: data acquisition, annotation and analysis in multimodal communication studies

Conveners: Anna Jelec and Konrad Juszczyk (Adam Mickiewicz University, Poznań)

The contact person for this thematic session is Konrad Juszczyk (e-mail: juszczyk@amu.edu.pl).

Interpersonal communication is multimodal. Beyond speech, people communicate using a range of modalities, and research on communication has accordingly embraced a growing number of data sources: not only language but also gesture (McNeill 1992; Kendon 2004), posture, gaze direction (Gullberg & Kita 2009), facial expression and other non-verbal behaviour. The increased availability of data brought about by digital recording technologies is both exciting and challenging. “Language, gestures, facial expressions, voice and movements [which] interplay in the rising of meaning” are at the heart of multimodal communication studies (Bonacchi & Karpiński 2014), but the complex interplay of multimodal data poses a unique challenge to research in this developing field.

The focus of the present workshop will be the acquisition, annotation and analysis of multimodal data: in particular, identifying problems and finding solutions to the theoretical and practical issues that researchers face at every stage of data collection and processing. Choices made at the acquisition and annotation stages, such as what counts as a data unit in gesture research, need to be well informed, since they affect all subsequent analysis. For annotators, the choice of a coding system for multimodal data annotation (Bressem, Ladewig & Müller 2013) is a vital topic, as is establishing and calculating interrater agreement (Holle & Rein 2015), so that coding systems such as NEUROGES (Lausberg & Sloetjes 2015) can support objective and reliable interdisciplinary analysis. The analysis of multimodal data is a developing field that could benefit from a more coherent set of shared assumptions.

This workshop will provide a forum for a discussion of fundamental questions that emerge from the rapidly growing data pool. In this way this thematic workshop will connect to the conference leitmotif: “Linguistics and data: A fresh look”.

References:
Bonacchi, S., & Karpiński, M. (2014). Remarks about the use of the term “multimodality”. Journal of Multimodal Communication Studies, (1), 1–7.
Bressem, J., Ladewig, S. H., & Müller, C. (2013). Linguistic Annotation System for Gestures. In C. Müller, A. J. Cienki, S. H. Ladewig, D. McNeill, & S. Tessendorf (Eds.), Body – Language – Communication. Berlin: De Gruyter Mouton.
Gibbon, D. (2011). Modelling Gesture as Speech. Poznan Studies in Contemporary Linguistics, 47(3), 447–.
Gullberg, M., & Kita, S. (2009). Attention to speech-accompanying gestures: Eye movements and information uptake. Journal of Nonverbal Behavior, 33(4), 251–277. doi:10.1007/s10919-009-0073-2.
Holle, H., & Rein, R. (2015). EasyDIAg: A tool for easy determination of interrater agreement. Behavior Research Methods, 47(3), 837–847.
Kendon, A. (2004). Gesture: Visible action as utterance. Cambridge: Cambridge University Press.
Lausberg, H., & Sloetjes, H. (2015). The revised NEUROGES-ELAN system: An objective and reliable interdisciplinary analysis tool for nonverbal behaviour and gesture. Behavior Research Methods.
McNeill, D. (1992). Hand and Mind: What Gestures Reveal about Thought. Chicago: University of Chicago Press.
Sweetser, E. (2008). What does it mean to compare language and gesture? Modalities and contrasts. In J. Guo, E. Lieven, N. Budwig, S. Ervin-Tripp, K. Nakamura, & S. Ozcaliskan (Eds.), Crosslinguistic Approaches to the Psychology of Language: Research in the Tradition of Dan Isaac Slobin. New York: Psychology Press.

The abstract submission deadline for this session has been extended to March 30, 2016. Other details of abstract submission are available on the conference website.