Affective Interactive Systems
We conduct research at the intersection of human-computer interaction, affective computing, and artificial intelligence. The combination of increased computing power, shrinking sensors and actuators, and advanced (on-device) AI models is ushering in a new era of how humans interact with (affect-aware) machines. By integrating personal, imperceptible, low-cost, and low-power sensors and devices onto ourselves and into our everyday environments, we enable new forms of interaction, including bio-responsive wearables and mobile devices, wireless affective haptics, novel eXtended Reality (XR) social interactions that leverage physiological data, and embodied personal AI assistants that exhibit emotional intelligence.
Our research on affective interactive systems focuses on:
- Sensing and visualization: Systems for acquiring affective and behavioral data and labels across real and virtual environments, including prototypes and infrastructure for gathering, synchronizing, and visualizing user affective states across mobile, wearable, and XR platforms
- Affective computing systems: Systems that can sense, recognize, and react to our affective and behavioral states, spanning fundamental and applied machine learning research in affective computing, including recent developments in generative models and agents
- Augmentation and interaction: Systems that augment our physical and virtual bodies and sensory perception across human-machine and human-human interactions, including novel haptic interfaces, avatars and embodiment, bio-responsive interactive systems, and the development and evaluation of interaction techniques using real-time sensed physiological data
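To illustrate the kind of bio-responsive loop described above, here is a deliberately simplified, hypothetical sketch (the sensor source, the resting/peak heart-rate bounds, and the linear mapping are illustrative assumptions, not our actual infrastructure): a live heart-rate stream is smoothed and mapped to an actuator intensity in real time.

```python
from collections import deque

def smooth(samples, window=5):
    """Moving-average filter over the most recent heart-rate samples."""
    recent = list(samples)[-window:]
    return sum(recent) / len(recent)

def arousal_to_intensity(bpm, rest=60.0, peak=120.0):
    """Linearly map a smoothed heart rate (bpm) to a 0..1 actuator
    intensity, clamped to that range. The rest/peak bounds are
    illustrative defaults; a real system would calibrate per user."""
    level = (bpm - rest) / (peak - rest)
    return max(0.0, min(1.0, level))

# Simulated physiological stream (in practice: a wearable PPG/ECG sensor).
stream = [62, 64, 70, 85, 98, 110, 118, 121]
buffer = deque(maxlen=32)
for bpm in stream:
    buffer.append(bpm)
    intensity = arousal_to_intensity(smooth(buffer))
    # A real system would drive a haptic actuator or visualization here.
```

In a deployed system, the simulated list would be replaced by a streaming sensor API, and the mapping would typically be calibrated to each user's baseline rather than fixed constants.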
We study desktop, handheld, and wearable sensors as a paradigm for collecting and processing physiological data, across a range of domains including news media, automotive, fashion, and immersive XR experiences. We aim to base our research on realistic testing grounds, collaborating with several commercial and academic partners. We have deployed our technology and infrastructure in places such as the National Theatre of China in Shanghai and the Amsterdam Dance Event in the Netherlands, and carried out research using sensor-instrumented vehicles on the road. Our overall objective is to create intelligent and empathic systems that can deeply understand users and their intentions, respect their privacy, and appropriately react to humans and their experiences.
Topics
- Physiological sensing and visualization
- Haptic interfaces (pneumatic, thermal, vibrotactile)
- Affective and ubiquitous computing
- Affective eXtended Reality (XR)
- Audience sensing and engagement
- Smart textiles
- Internet of Things (e.g., Igor/IOTsa)
Videos
- ShareYourReality: https://www.youtube.com/watch?v=UNrS5tpAPgc
- BreatheWithMe: https://www.youtube.com/watch?v=4tguxoZJj8k
- FeelTheNews: https://www.youtube.com/watch?v=XoqujMAZ0LA
- AvatarBiosignals: https://www.youtube.com/watch?v=6-ZZ7_IteqE
- ThermalWear: https://www.youtube.com/watch?v=dEgGRpj0HtI
- Lit Lace Interface: https://youtu.be/9Bhy-0gP33Y
- Amsterdam Dance Event (Red Bull): https://www.redbull.com/int-en/tv/video/AP-1R7QCPRDN1W11/playrooms
- Sensing Audiences (with visualization): https://vimeo.com/140198523
- Tangible Air: https://www.youtube.com/watch?v=QGQobnXHFw4
- Enhanced Jazz Concerts: https://www.youtube.com/watch?v=rl8M2o2qQl4
- Sensing Theatre Audiences: https://www.youtube.com/watch?v=dUM-qqRsTx8
- Smart Textiles and Soft Robotics Symposium (at Royal College of Art): https://www.youtube.com/playlist?list=PL36LcFICo85FA8XiNUrPQ0hzrFLahvJxK
- Volvo Design Challenge: https://www.youtube.com/watch?v=kw6YL740Hyo
Software and Datasets
- G-REx: A Real-World Dataset of Group Emotion Experiences based on Physiological Data
- CardioceptionVR
- Social VR Avatar Biosignal Visualizations
- CEAP 360VR dataset
- Bio-Signal Data Processing and Visualization Suite: The full cycle.
Key Publications
- Dark Haptics: Exploring Manipulative Haptic Design in Mobile User Interfaces. In Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA '25), New York, NY, USA, 2025. Article 166.
- Haptic Biosignals Affect Proxemics Toward Virtual Reality Agents. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (CHI '25), New York, NY, USA, 2025. Article 494.
- A Real-world Dataset of Group Emotion Experiences Based on Physiological Data. Nature Scientific Data, 2024.
- Exploring Retrospective Annotation in Long-videos for Emotion Recognition. IEEE Transactions on Affective Computing, 2024.
- Immersive Io3MT Environments: Design Guidelines, Use Cases and Future Directions. In Proceedings of the International Symposium on the Internet of Sounds (IoS), 2024.
- Augmenting Media Experiences with Affective Haptics. Interactions, 31: pp. 6-8, 2024.
- ShareYourReality: Investigating Haptic Feedback and Agency in Virtual Avatar Co-embodiment. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI '24), New York, NY, USA, 2024. Article 100.
- PhysioCHI: Towards Best Practices for Integrating Physiological Signals in HCI. In Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems (CHI EA '24), New York, NY, USA, 2024. Article 485.
- Towards an Internet of Multisensory, Multimedia and Musical Things (Io3MT) Environment. In Proceedings of the International Symposium on the Internet of Sounds (IoS), 2023.
- Affective Driver-Pedestrian Interaction: Exploring Driver Affective Responses Toward Pedestrian Crossing Actions Using Camera and Physiological Sensors. In Proceedings of the International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI), Ingolstadt, Germany, 2023.
- BreatheWithMe: Exploring Visual and Vibrotactile Displays for Social Breath Awareness during Colocated, Collaborative Tasks. In Extended Abstracts of the ACM CHI Conference on Human Factors in Computing Systems (CHI), Hamburg, Germany, 2023.
- FeelTheNews: Augmenting Affective Perceptions of News Videos with Thermal and Vibrotactile Stimulation. In Extended Abstracts of the ACM CHI Conference on Human Factors in Computing Systems (CHI), Hamburg, Germany, 2023.
- Is that my Heartbeat? Measuring and Understanding Modality-dependent Cardiac Interoception in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics (TVCG), 2023.
- From Video to Hybrid Simulator: Exploring Affective Responses toward Non-Verbal Pedestrian Crossing Actions using Camera and Physiological Sensors. International Journal of Human-Computer Interaction (IJHCI), 2023.
- Group Synchrony for Emotion Recognition using Physiological Signals. IEEE Transactions on Affective Computing, 2023.
- Understanding and Designing Avatar Biosignal Visualizations for Social Virtual Reality Entertainment. In Proceedings of the CHI Conference on Human Factors in Computing Systems (ACM CHI), New Orleans, LA, USA, 2022.
- Few-shot Learning for Fine-grained Emotion Recognition using Physiological Signals. IEEE Transactions on Multimedia, 2022.
- Weakly-supervised Learning for Fine-grained Emotion Recognition using Physiological Signals. IEEE Transactions on Affective Computing, 2022.
- AC-WGAN-GP: Augmenting ECG and GSR Signals using Conditional Generative Models for Arousal Classification. In Adjunct Proceedings of UbiComp/ISWC, 2021.
- CEAP-360VR: A Continuous Physiological and Behavioral Emotion Annotation Dataset for 360 VR Videos. IEEE Transactions on Multimedia, 2021.
- Investigating the Relationship between Momentary Emotion Self-reports and Head and Eye Movements in HMD-based 360° VR Video Watching. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, 2021.
- RCEA-360VR: Real-time, Continuous Emotion Annotation in 360° VR Videos for Collecting Precise Viewport-dependent Ground Truth Labels. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 2021.
- CorrNet: Fine-Grained Emotion Recognition for Video Watching Using Wearable Physiological Sensors. Sensors, 21, 2021.
- AI at the Disco: Low Sample Frequency Human Activity Recognition for Night Club Experiences. In Proceedings of the 1st International Workshop on Human-centric Multimedia Analysis, Seattle, WA, USA, 2020.
- RCEA: Real-time, Continuous Emotion Annotation for Collecting Precise Mobile Video Ground Truth Labels. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (ACM CHI), Honolulu, HI, USA, 2020.
- ThermalWear: Exploring Wearable On-chest Thermal Displays to Augment Voice Messages with Affect. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (ACM CHI), Honolulu, HI, USA, 2020.
- New Signals in Multimedia Systems and Applications. IEEE Multimedia (IEEE MM), 25: pp. 12-13, 2018.
- Measuring, Understanding, and Classifying News Media Sympathy on Twitter after Crisis Events. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18), Montreal, QC, Canada, 2018.
- Fine-grained Access Control Framework for Igor, a Unified Access Solution to The Internet of Things. Procedia Computer Science, 134: pp. 385-392, 2018.
- The Play Is a Hit - But How Can You Tell? In Proceedings of the 2017 ACM SIGCHI Conference on Creativity and Cognition, Singapore, 2017. pp. 336-347.
- CWI-ADE2016 Dataset: Sensing nightclubs through 40 million BLE packets. In Proceedings of the ACM Multimedia Systems Conference (ACM MMSys 2017), Taipei, Taiwan, 2017. pp. 181-186.
- Quantifying Audience Experience in the Wild: Heuristics for Developing and Deploying a Biosensor Infrastructure in Theaters. In Proceedings of the International Workshop on Quality of Multimedia Experience (QoMEX 2016), Lisbon, Portugal, 2016. pp. 1-6.
- Sensing a Live Audience. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '14), Toronto, ON, Canada, 2014. pp. 1909-1912.
Theses
- Patricia Bota, Physiological-Based Group Emotion Recognition: Novel Methods and Real-World Applications, PhD Thesis, Instituto Superior Técnico, Portugal, 2024.
- Tianyi Zhang, On Fine-grained Temporal Emotion Recognition in Video: How to Trade Off Recognition Accuracy with Annotation Complexity?, PhD Thesis, TU Delft, Netherlands, 2022.
- Tong Xue, Continuous Emotion Annotation Techniques for Mixed Reality Environments, PhD Thesis, Beijing Institute of Technology, China, 2022.
- Chen Wang, Monitoring the Engagement of Groups by Using Physiological Sensors, PhD Thesis, Vrije Universiteit Amsterdam, Netherlands, 2018.