Cardiff University | Prifysgol Caerdydd ORCA

HeadTalk, HandTalk and the corpus: towards a framework for multi-modal, multi-media corpus development

Knight, Dawn, Evans, David, Carter, Ronald and Adolphs, Svenja 2009. HeadTalk, HandTalk and the corpus: towards a framework for multi-modal, multi-media corpus development. Corpora 4 (1), pp. 1-32. 10.3366/E1749503209000203

Full text not available from this repository.


In this paper, we address a number of key methodological challenges and concerns faced by linguists in the development of a new generation of corpora: the multi-modal, multi-media corpus, one that combines video, audio and textual records of naturally occurring discourse. We contextualise these issues according to a research project which is currently developing such a corpus: the ESRC-funded Understanding New Digital Records for e-Social Science (DReSS) project based at the University of Nottingham. (For further information, results and publications related to the project, please refer to the main DReSS website.) This paper primarily explores the questions of the functionality of the corpus, identifying the problems we faced in making multi-modal corpora ‘usable’ for further research. We focus on the need for new methods for categorising and marking up multiple streams of data, using, as examples, the coding of head nods and hand gestures. We also consider the challenges faced when integrating and representing the data in a functional corpus tool, to allow for further synthesis and analysis. Here, we also underline some of the ethical challenges faced in the development of this tool, exploring the issues faced both in the collection of data and in the future distribution of video corpora to the wider research community.

Item Type: Article
Date Type: Publication
Status: Published
Schools: English, Communication and Philosophy
Publisher: Edinburgh University Press
ISSN: 1749-5032
Last Modified: 28 Oct 2022 10:35

Citation Data

Cited 22 times in Scopus.
