
Affective Interactive Systems

We conduct research at the intersection of human-computer interaction, affective computing, and artificial intelligence. The combination of increased computing power, shrinking sensors and actuators, and advanced (on-device) AI models is ushering in a new technological era of how humans interact with (affect-aware) machines. By integrating personal, imperceptible, low-cost, and low-power sensors and devices onto ourselves and into our everyday environments, we enable new forms of interaction, including bio-responsive wearables and mobile devices, wireless affective haptics, novel eXtended Reality (XR) social interactions that leverage physiological data, and embodied personal AI assistants that exhibit emotional intelligence.

Our research on affective interactive systems focuses on:

  • Sensing and visualization: Systems for acquiring affective and behavioral data and labels across real and virtual environments, including prototypes and infrastructure for gathering, synchronizing, and visualizing user affective states across mobile, wearable, and XR settings
  • Affective computing systems: Systems that can sense, recognize, and react to our affective and behavioral states, spanning fundamental and applied machine learning research in affective computing, including recent developments in generative models and agents (a minimal recognition sketch follows this list)
  • Augmentation and interaction: Systems that augment our physical / virtual bodies and sensory perception across human-machine and human-human interactions, including novel haptic interfaces, avatars and embodiment, bio-responsive interactive systems, and developing and evaluating interaction techniques using real-time sensed physiological data
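
To make the recognition strand concrete, here is a minimal, illustrative sketch of the kind of sliding-window pipeline used for emotion recognition from wearable physiological signals. It is not our actual codebase: the 4 Hz electrodermal activity (EDA) stream, the 10-second windows, the statistical features, and the random-forest classifier are assumptions chosen for brevity, and the synthetic signal and labels mean the printed accuracy only demonstrates the flow.

    # Illustrative sliding-window arousal classification (assumed, not our pipeline).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    FS = 4            # assumed EDA sample rate (Hz)
    WIN = 10 * FS     # 10-second analysis window

    rng = np.random.default_rng(0)
    # Synthetic stand-ins for a one-hour EDA recording and per-sample arousal labels.
    eda = rng.normal(0.0, 1.0, 60 * 60 * FS).cumsum() * 1e-3 + 2.0
    labels = rng.integers(0, 2, eda.size)

    def window_features(signal, labels, win):
        """Cut the signal into non-overlapping windows, compute simple
        statistical features, and give each window its majority label."""
        X, y = [], []
        for start in range(0, signal.size - win, win):
            seg = signal[start:start + win]
            X.append([seg.mean(), seg.std(), seg.max() - seg.min(),
                      np.mean(np.diff(seg))])
            y.append(int(labels[start:start + win].mean() > 0.5))
        return np.array(X), np.array(y)

    X, y = window_features(eda, labels, WIN)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
    print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")  # ~chance on random labels

In practice, the synthetic arrays would be replaced by recordings and annotations such as those in our CEAP-360VR dataset, and the hand-crafted window features by learned representations.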

We study desktop, handheld, and wearable sensors as a paradigm for collecting and processing physiological data, across a range of domains including news media, automotive, fashion, and immersive XR experiences. We aim to base our research on realistic testing grounds, collaborating with several commercial and academic partners. We have deployed our technology and infrastructure in places such as the National Theatre of China in Shanghai and the Amsterdam Dance Event in the Netherlands, and carried out research using sensor-instrumented vehicles on the road. Our overall objective is to create intelligent and empathic systems that deeply understand users and their intentions, respect their privacy, and react appropriately to humans and their experiences.
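
Deployments like these combine devices with different clocks and sample rates. The sketch below shows one common way, under assumed rates and offsets rather than any specific setup of ours, to synchronize two sensor streams by resampling them onto a shared timeline.

    # Illustrative stream synchronization; the 130 Hz ECG, 4 Hz EDA, and clock
    # offset are assumptions, not measurements from our deployments.
    import numpy as np

    def make_stream(fs, duration_s, offset_s):
        """Simulate a sensor stream as (timestamps in seconds, values)."""
        t = np.arange(0, duration_s, 1.0 / fs) + offset_s
        return t, np.sin(2 * np.pi * 0.1 * t)

    ecg_t, ecg_v = make_stream(130, 60, 0.00)  # chest-strap ECG
    eda_t, eda_v = make_stream(4, 60, 0.35)    # wristband EDA, clock starts later

    # Common 4 Hz timeline covering only the overlap of both recordings.
    start, stop = max(ecg_t[0], eda_t[0]), min(ecg_t[-1], eda_t[-1])
    common_t = np.arange(start, stop, 0.25)

    # Resample each stream onto the shared clock by linear interpolation.
    ecg_sync = np.interp(common_t, ecg_t, ecg_v)
    eda_sync = np.interp(common_t, eda_t, eda_v)
    print(common_t.size, ecg_sync.size, eda_sync.size)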

Topics

  • Physiological sensing and visualization
  • Haptic interfaces (pneumatic, thermal, vibrotactile)
  • Affective and ubiquitous computing
  • Affective eXtended Reality (XR)
  • Audience sensing and engagement
  • Smart textiles
  • Internet of Things (e.g., Igor/IOTsa)

Key Publications

  • C. Tang, K. Venkatraj, H. Liu, C. Schneegass, G. Huisman, A. El Ali Dark Haptics: Exploring Manipulative Haptic Design in Mobile User Interfaces. In Extended Abstracts of the CHI Conference on Human Factors in Computing Systems (CHI EA '25), New York, NY, USA, 2025. Article 166.
  • S. Ooms, M. Lee, E. R. Stepanova, P. Cesar, A. El Ali Haptic Biosignals Affect Proxemics Toward Virtual Reality Agents. In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (CHI '25), New York, NY, USA, 2025. Article 494.
  • P. Bota, J. Brito, A. Fred, P. Cesar, H. Silva A Real-world Dataset of Group Emotion Experiences Based on Physiological Data. Nature Scientific Data, 2024.
  • P. Bota, P. Cesar, A. Fred, H. Silva Exploring Retrospective Annotation in Long-videos for Emotion Recognition. IEEE Transactions on Affective Computing, 2024.
  • R. Vieira, S. Wei, T. Röggla, D. C. Muchaluat-Saade, P. Cesar Immersive Io3MT Environments: Design Guidelines, Use Cases and Future Directions. In Proceedings of the International Symposium on the Internet of Sounds (IoS), 2024.
  • S. Ooms, T. Röggla, P. Cesar, A. El Ali Augmenting Media Experiences with Affective Haptics. Interactions, 31: pp. 6-8, 2024.
  • K. Venkatraj, W. Meijer, M. Perusquia-Hernandez, G. Huisman, A. El Ali ShareYourReality: Investigating Haptic Feedback and Agency in Virtual Avatar Co-embodiment. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI '24), New York, NY, USA, 2024. Article 100.
  • F. Chiossi, E. R. Stepanova, B. Tag, M. Perusquia-Hernandez, A. Kitson, A. Dey, S. Mayer, A. El Ali PhysioCHI: Towards Best Practices for Integrating Physiological Signals in HCI. In Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems (CHI EA '24), New York, NY, USA, 2024. Article 485.
  • R. Vieira, D. C. Muchaluat-Saade, P. Cesar Towards an Internet of Multisensory, Multimedia and Musical Things (Io3MT) Environment. In Proceedings of the International Symposium on the Internet of Sounds (IoS), 2023.
  • S. Rao, S. Wirjopawiro, G. Pons Rodriguez, T. Röggla, P. Cesar, A. El Ali Affective Driver-Pedestrian Interaction: Exploring Driver Affective Responses Toward Pedestrian Crossing Actions Using Camera and Physiological Sensors. In Proceedings of the International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI), Ingolstadt, Germany, 2023.
  • A. El Ali, E. R. Stepanova, S. Palande, A. Mader, P. Cesar, K. Jansen BreatheWithMe: Exploring Visual and Vibrotactile Displays for Social Breath Awareness during Colocated, Collaborative Tasks. In Extended Abstracts of the ACM CHI Conference on Human Factors in Computing Systems (CHI), Hamburg, Germany, 2023.
  • S. Ooms, M. Lee, P. Cesar, A. El Ali FeelTheNews: Augmenting Affective Perceptions of News Videos with Thermal and Vibrotactile Stimulation. In Extended Abstracts of the ACM CHI Conference on Human Factors in Computing Systems (CHI), Hamburg, Germany, 2023.
  • A. El Ali, R. Ney, Z.M.C. van Berlo, and P. Cesar Is that my Heartbeat? Measuring and Understanding Modality-dependent Cardiac Interoception in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics (TVCG), 2023.
  • S. Rao, S. Ghosh, G. Pons Rodriguez, T. Röggla, P. Cesar, A. El Ali From Video to Hybrid Simulator: Exploring Affective Responses toward Non-Verbal Pedestrian Crossing Actions using Camera and Physiological Sensors. International Journal of Human-Computer Interaction (IJHCI), 2023.
  • P. Bota, T. Zhang, A. El Ali, A. Fred, H. da Silva, P. Cesar Group Synchrony for Emotion Recognition using Physiological Signals. IEEE Transactions on Affective Computing, 2023.
  • S. Lee, A. El Ali, M. Wijntjes, P. Cesar Understanding and Designing Avatar Biosignal Visualizations for Social Virtual Reality Entertainment. In Proceedings of the CHI Conference on Human Factors in Computing Systems (ACM CHI), New Orleans, USA, 2022.
  • T. Zhang, A. El Ali, A. Hanjalic, P. Cesar Few-shot Learning for Fine-grained Emotion Recognition using Physiological Signals. IEEE Transactions on Multimedia, 2022.
  • T. Zhang, A. El Ali, C. Wang, A. Hanjalic, P. Cesar Weakly-supervised Learning for Fine-grained Emotion Recognition using Physiological Signals. IEEE Transactions on Affective Computing, 2022.
  • A. Furdui, T. Zhang, M. Worring, P. Cesar, A. El Ali AC-WGAN-GP: Augmenting ECG and GSR Signals using Conditional Generative Models for Arousal Classification. In Adjunct Proceedings of UbiComp/ISWC, 2021.
  • T. Xue, A. El Ali, T. Zhang, G. Ding, P. Cesar CEAP-360VR: A Continuous Physiological and Behavioral Emotion Annotation Dataset for 360 VR Videos. IEEE Transactions on Multimedia, 2021.
  • T. Xue, A. El Ali, G. Ding, and P. Cesar Investigating the Relationship between Momentary Emotion Self-reports and Head and Eye Movements in HMD-based 360° VR Video Watching. In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, 2021.
  • T. Xue, A. El Ali, T. Zhang, G. Ding, and P. Cesar RCEA-360VR: Real-time, Continuous Emotion Annotation in 360° VR Videos for Collecting Precise Viewport-dependent Ground Truth Labels. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 2021.
  • T. Zhang, A. El Ali, C. Wang, A. Hanjalic, P. Cesar CorrNet: Fine-Grained Emotion Recognition for Video Watching Using Wearable Physiological Sensors. Sensors, 21, 2021.
  • A. S. Gill, S. Cabrero, P. Cesar, D. A. Shamma AI at the Disco: Low Sample Frequency Human Activity Recognition for Night Club Experiences. In Proceedings of the 1st International Workshop on Human-centric Multimedia Analysis, Seattle, WA, USA, 2020.
  • T. Zhang, A. El Ali, C. Wang, A. Hanjalic, and P. Cesar RCEA: Real-time, Continuous Emotion Annotation for Collecting Precise Mobile Video Ground Truth Labels. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (ACM CHI), Honolulu, HI, USA, 2020.
  • A. El Ali, X. Yang, S. Ananthanarayan, T. Röggla, J. Jansen, J. Hartcher-O’Brien, K. Jansen, and P. Cesar ThermalWear: Exploring Wearable On-chest Thermal Displays to Augment Voice Messages with Affect. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (ACM CHI), Honolulu, HI, USA, 2020.
  • P. Cesar, V. Singh, R. Jain, N. Sebe, and N. Oliver New Signals in Multimedia Systems and Applications. IEEE Multimedia (IEEE MM), 25: pp. 12-13, 2018.
  • A. El Ali, T. Stratmann, S. Park, J. Schöning, W. Heuten, and S. Boll Measuring, Understanding, and Classifying News Media Sympathy on Twitter after Crisis Events. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18), Montreal QC, Canada, 2018.
  • P. S. Wen Shieng, J. Jansen, S. Pemberton Fine-grained Access Control Framework for Igor, a Unified Access Solution to The Internet of Things. Procedia Computer Science, 134: pp. 385-392, 2018.
  • C. Wang and P. Cesar The Play Is a Hit - But How Can You Tell? In Proceedings of the 2017 ACM SIGCHI Conference on Creativity and Cognition, Singapore, Singapore, 2017. pp. 336-347.
  • S. Cabrero, J. Jansen, T. Röggla, J.A. Guerra-Gomez, D.A. Shamma, and P. Cesar CWI-ADE2016 Dataset: Sensing nightclubs through 40 million BLE packets. In Proceedings of the ACM Multimedia Systems Conference (ACM MMSys 2017), Taipei, Taiwan, 2017. pp. 181-186.
  • C. Wang, J. Wong, X. Zhu, T. Röggla, J. Jansen, and P. Cesar Quantifying Audience Experience in the Wild: Heuristics for Developing and Deploying a Biosensor Infrastructure in Theaters. In Proceedings of the International Workshop on Quality of Multimedia Experience (QoMEX 2016), Lisbon, Portugal, 2016. pp. 1-6.
  • C. Wang, E. Geelhoed, P. Stenton, and P. Cesar Sensing a Live Audience. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI'14), Toronto, ON, Canada, 2014. pp. 1909-1912.

Theses

  • Patricia Bota, Physiological-Based Group Emotion Recognition: Novel Methods and Real-World Applications, PhD Thesis, Instituto Superior Técnico, 2024.
  • Tianyi Zhang, On fine-grained temporal emotion recognition in video: How to trade off recognition accuracy with annotation complexity? PhD Thesis, TU Delft, Netherlands, 2022.
  • Tong Xue, Continuous Emotion Annotation Techniques for Mixed Reality Environments, PhD Thesis, Beijing Institute of Technology, China, 2022.
  • Chen Wang, Monitoring the Engagement of Groups by Using Physiological Sensors, PhD Thesis, Vrije Universiteit Amsterdam, Netherlands, 2018.