Affective Interactive Systems
We conduct research at the intersection of human-computer interaction, affective computing, and artificial intelligence. With increased computing power, shrinking hardware, ubiquitous wireless networks, and the widespread adoption of personal computing devices, we are entering a new technological era of how humans interact with (affect-aware) machines. This is made possible by embedding low-cost, low-power (and at times personal and imperceptible) sensors and devices into our everyday environments, whether through bio-responsive wearables and mobile devices, emotion-aware avatars in eXtended Reality (XR), or embodied personal assistants that exhibit emotional intelligence.
A key focus of our group is on affective interactive systems, including:
- Recognition: Emotion data acquisition, annotation, and recognition algorithms
- Visualization: Prototypes and infrastructure for gathering, synchronizing and visualizing user affective states across environments (mobile, wearables, XR)
- Bio-responsiveness: Novel bio-responsive interactive systems and interaction techniques using real-time sensed physiological data
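As a minimal sketch of the bio-responsiveness idea above, the following illustrative Python class (not our actual system; all names are hypothetical) maps a streamed physiological signal, such as skin conductance, to a normalized arousal estimate by comparing each sample against a sliding-window baseline:

```python
from collections import deque


class BioResponsiveMapper:
    """Illustrative sketch: map a streamed physiological signal
    (e.g. skin conductance) to a normalized arousal level in [0, 1]
    by locating each sample within its recent sliding-window range."""

    def __init__(self, window=50):
        # Recent samples serve as a per-user, drift-tolerant baseline.
        self.window = deque(maxlen=window)

    def update(self, sample):
        """Feed one sensor reading; return an arousal estimate in [0, 1]."""
        self.window.append(sample)
        lo, hi = min(self.window), max(self.window)
        if hi == lo:  # flat signal: report a neutral mid-level
            return 0.5
        return (sample - lo) / (hi - lo)  # position within recent range
```

A real-time interactive system could poll `update()` at the sensor's sampling rate and drive, say, a visualization parameter from its output; the sliding-window normalization is one simple way to handle inter-person baseline differences.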
We study handheld and wearable sensors as a paradigm for collecting and processing affective data, across a range of domains including media consumption, automotive, fashion, and immersive XR experiences. Working in realistic testing grounds and collaborating with several commercial and academic partners, we have deployed our technology and infrastructure in venues such as the National Theatre of China in Shanghai and the Amsterdam Dance Event in the Netherlands. Our overall objective is to create intelligent and empathic systems that both respect user privacy and appropriately react to humans and their experiences.
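Collecting affective data from multiple handheld and wearable devices requires aligning independently timestamped streams before analysis. The hypothetical helper below (an assumption for illustration, not our deployed infrastructure) pairs samples from two `(timestamp, value)` streams by nearest timestamp within a tolerance:

```python
def align_streams(stream_a, stream_b, tolerance=0.05):
    """Pair samples from two time-sorted (timestamp, value) streams
    recorded on different devices, matching each a-sample to the
    b-sample with the nearest timestamp within `tolerance` seconds."""
    pairs, j = [], 0
    for t_a, v_a in stream_a:
        # Advance j while the next b-sample is at least as close to t_a.
        while (j + 1 < len(stream_b)
               and abs(stream_b[j + 1][0] - t_a) <= abs(stream_b[j][0] - t_a)):
            j += 1
        t_b, v_b = stream_b[j]
        if abs(t_b - t_a) <= tolerance:
            pairs.append((t_a, v_a, v_b))
    return pairs
```

Samples with no counterpart inside the tolerance window are dropped rather than interpolated, which is a deliberately conservative choice for ground-truth labeling scenarios.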
Topics
Affective Systems
- Multimodal emotion recognition
- Automotive physiological sensing
- Affect-aware avatars in XR
- Emotion-aware fashion
- Sensing audience engagement
Smart Textiles
- Bio-responsive wearables
- Self-actuating textiles
Sensing and Understanding Human Activities
- Multimodal emotion recognition
- Sleep monitoring
- Sensing nightclub activity
- Understanding urban mobility
Internet of Things
- Igor/IOTsa
Videos
- Lit Lace Interface: https://youtu.be/9Bhy-0gP33Y
- Amsterdam Dance Event (Red Bull): https://www.redbull.com/int-en/tv/video/AP-1R7QCPRDN1W11/playrooms
- Sensing Audiences (with visualization): https://vimeo.com/140198523
- Tangible Air: https://www.youtube.com/watch?v=QGQobnXHFw4
- Enhanced Jazz Concerts: https://www.youtube.com/watch?v=rl8M2o2qQl4
- Sensing Theatre Audiences: https://www.youtube.com/watch?v=dUM-qqRsTx8
- Smart Textiles and Soft Robotics Symposium (at Royal College of Art): https://www.youtube.com/playlist?list=PL36LcFICo85FA8XiNUrPQ0hzrFLahvJxK
- Volvo Design Challenge: https://www.youtube.com/watch?v=kw6YL740Hyo
Key Publications
- CEAP-360VR: A Continuous Physiological and Behavioral Emotion Annotation Dataset for 360 VR Videos. IEEE Transactions on Multimedia, 2021.
- Investigating the Relationship between Momentary Emotion Self-reports and Head and Eye Movements in HMD-based 360° VR Video Watching. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, 2021.
- RCEA-360VR: Real-time, Continuous Emotion Annotation in 360° VR Videos for Collecting Precise Viewport-dependent Ground Truth Labels. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 2021.
- CorrNet: Fine-Grained Emotion Recognition for Video Watching Using Wearable Physiological Sensors. Sensors, 21, 2021.
- AI at the Disco: Low Sample Frequency Human Activity Recognition for Night Club Experiences. In Proceedings of the 1st International Workshop on Human-centric Multimedia Analysis, Seattle, WA, USA, 2020.
- RCEA: Real-time, Continuous Emotion Annotation for Collecting Precise Mobile Video Ground Truth Labels. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (ACM CHI), Honolulu, HI, USA, 2020.
- ThermalWear: Exploring Wearable On-chest Thermal Displays to Augment Voice Messages with Affect. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (ACM CHI), Honolulu, HI, USA, 2020.
- New Signals in Multimedia Systems and Applications. IEEE Multimedia (IEEE MM), 25: pp. 12-13, 2018.
- Measuring, Understanding, and Classifying News Media Sympathy on Twitter after Crisis Events. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18), Montreal, QC, Canada, 2018.
- Fine-grained Access Control Framework for Igor, a Unified Access Solution to The Internet of Things. Procedia Computer Science, 134: pp. 385-392, 2018.
- The Play Is a Hit - But How Can You Tell? In Proceedings of the 2017 ACM SIGCHI Conference on Creativity and Cognition, Singapore, 2017, pp. 336-347.
- CWI-ADE2016 Dataset: Sensing nightclubs through 40 million BLE packets. In Proceedings of the ACM Multimedia Systems Conference (ACM MMSys 2017), Taipei, Taiwan, 2017, pp. 181-186.
- Quantifying Audience Experience in the Wild: Heuristics for Developing and Deploying a Biosensor Infrastructure in Theaters. In Proceedings of the International Workshop on Quality of Multimedia Experience (QoMEX 2016), Lisbon, Portugal, 2016, pp. 1-6.
- Sensing a Live Audience. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI'14), Toronto, ON, Canada, 2014, pp. 1909-1912.
Theses
- Chen Wang, Monitoring the Engagement of Groups by Using Physiological Sensors, Vrije Universiteit Amsterdam, Netherlands. PhD Thesis, 2018.