4.2 Article

MUMBAI: multi-person, multimodal board game affect and interaction analysis dataset

Journal

JOURNAL ON MULTIMODAL USER INTERFACES
Volume 15, Issue 4, Pages 373-391

Publisher

SPRINGER
DOI: 10.1007/s12193-021-00364-0

Keywords

Affective computing; Facial expression analysis; Board games; Social interactions; Group dynamics; Multimodal interaction; Game experience

Summary

Board games serve as a fertile ground for social signals and insights into psychological indicators. The new MUMBAI dataset, collected from four-player board game sessions, is extensively annotated with emotional moments and additional data from questionnaires. Three benchmarks for expression detection and emotion classification are presented, along with discussions on potential research questions for analyzing social interactions and group dynamics during board games.

Abstract

Board games are fertile grounds for the display of social signals, and they provide insights into psychological indicators in multi-person interactions. In this work, we introduce a new dataset collected from four-player board game sessions, recorded via multiple cameras and containing over 46 hours of visual material. The new MUMBAI dataset is extensively annotated with emotional moments for all game sessions. Additional data come from personality and game experience questionnaires. Our four-person setup allows the investigation of non-verbal interactions beyond dyadic settings. We present three benchmarks for expression detection and emotion classification and discuss potential research questions for the analysis of social interactions and group dynamics during board games.
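
As a rough illustration of what an expression detection or emotion classification benchmark of this kind can look like, the sketch below trains a simple per-frame emotion classifier with session-disjoint cross-validation. The label set, feature dimensionality, and synthetic data are placeholders chosen for this example; they do not reflect the MUMBAI dataset's actual annotation format or the paper's evaluation protocol.

import numpy as np
from sklearn.model_selection import GroupKFold
from sklearn.svm import SVC
from sklearn.metrics import f1_score

EMOTIONS = ["neutral", "happy", "surprised", "frustrated"]  # assumed label set, not MUMBAI's

rng = np.random.default_rng(0)
n_frames = 2000
X = rng.normal(size=(n_frames, 64))                 # stand-in for per-frame facial features
y = rng.integers(0, len(EMOTIONS), size=n_frames)   # stand-in for emotion labels
sessions = rng.integers(0, 12, size=n_frames)       # stand-in for game-session IDs

# Keep all frames of a game session in the same fold, so the classifier is
# never evaluated on a session it has already seen during training.
scores = []
for train_idx, test_idx in GroupKFold(n_splits=4).split(X, y, groups=sessions):
    clf = SVC(kernel="rbf")
    clf.fit(X[train_idx], y[train_idx])
    pred = clf.predict(X[test_idx])
    scores.append(f1_score(y[test_idx], pred, average="macro"))

print(f"macro-F1 across session-disjoint folds: {np.mean(scores):.3f}")

Session-disjoint splitting is a common protocol for interaction datasets because frames from the same recording are highly correlated; a random frame-level split would otherwise inflate the reported scores.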
