Big Data and Multimodal Communication: A Perspective View

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Humans communicate face-to-face through at least two modalities: the auditory modality, speech, and the visual modality, gestures, which comprise e.g. gaze movements, facial expressions, head movements, and hand gestures. The relation between speech and gesture is complex and depends in part on factors such as culture, the communicative situation, and the interlocutors and their relationship. Investigating these factors in real data is vital for studying multimodal communication and for building models to implement natural multimodal communicative interfaces able to interact naturally with individuals of different ages, cultures, and needs. In this paper, we discuss to what extent big data “in the wild”, which are growing explosively on the internet, are useful for this purpose, also in light of legal aspects concerning the use of personal data, including multimodal data downloaded from social media.
Journal: Intelligent Systems Reference Library
Pages (from-to): 167-184
Number of pages: 17
Status: Published - 2019