
DIS at CHI 2020

Publication date: 2020-04-16

Research carried out by the Distributed and Interactive Systems (DIS) group at Centrum Wiskunde & Informatica (CWI) has resulted in several contributions to this year’s ACM CHI Conference on Human Factors in Computing Systems. CHI is the flagship conference of ACM SIGCHI, the premier international society for professionals, academics, and students interested in technology and human-computer interaction. Although this year’s conference was unfortunately cancelled due to COVID-19, below we highlight the work we would have presented. We also had to cancel the 2nd Pre-CHI 2020 event that was to be held at CWI. Our contributions consist of two full papers, two workshops, four late-breaking works (LBWs), and one discussion panel.

Our first full paper investigates the problem of collecting accurate and precise emotion ground-truth labels for mobile video watching. We contribute a validated annotation technique and an associated annotation fusion method suitable for collecting fine-grained emotion annotations while users watch mobile videos. Our second full paper investigates wearable, on-chest thermal displays and how they influence voice processing. We contribute a better understanding of how thermal displays can augment voice perception, which can enhance voice assistants and support individuals with emotional prosody impairments.

Our first workshop addresses social VR, a new medium for remote communication and collaboration. We will run this workshop online using a social VR platform, Mozilla Hubs! Our second workshop deals with emotion ground-truth data collection. We aim to (a) explore and define novel elicitation tasks, (b) survey sensing and annotation techniques, and (c) create a taxonomy of when and where to apply an elicitation method.

Our first LBW titled “Designing A Social VR Clinic for Medical Consultations” investigates how social Virtual Reality (VR) can create new opportunities for remote communication, and can potentially be a new tool for remote medical consultations. In our second LBW titled “Designing Real-time, Continuous Emotion Annotation Techniques for 360° VR Videos”, we design six continuous emotion annotation techniques for the Oculus Rift HMD aimed at minimizing workload and distraction. In our third LBW titled “Towards Improving Emotion Self-report Collection using Self-reflection”, we look at Experience Sampling Methods (ESMs) and their use for emotion self-report data collection. In our fourth LBW titled “Designing User Interface for Facilitating Live Editing in Streaming”, we look at how media assets, such as overlay graphics or comments, can make video streaming a unique and engaging experience.

Finally, in our panel discussion, we aim to gather researchers and practitioners to reflect on using extended reality (XR) technologies to support collaborative learning and co-creation, and to join forces by connecting the Learning and Education community and the XR community at CHI.
