Ubiquitous & Affective Computing
We conduct research at the intersections of ubiquitous computing, artificial intelligence, and human-computer interaction. With increased computing power, shrinking hardware, ubiquitous wireless networks, and the widespread adoption of personal computing devices, we are entering a new technological era in how humans interact with machines. This is made possible by embedding low-cost, low-power sensors and devices, at times personal and imperceptible, into our everyday environments.
A key focus of our group is on affective computing and wearable sensing, including:
- Emotion recognition and behavior understanding algorithms based on machine learning techniques
- Prototypes and infrastructure for gathering, synchronizing, and visualizing user affective states in real-world settings
- New interaction techniques, including smart textiles, that build on physiological sensing
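As a toy illustration of the kind of physiological-signal processing these topics involve, the sketch below summarizes a window of hypothetical heart-rate and electrodermal activity (EDA) samples and applies a simple hand-tuned arousal rule. The feature set, baselines, and thresholds are illustrative assumptions only, not the group's actual models, which rely on learned classifiers rather than fixed rules.

```python
import statistics

def extract_features(hr, eda):
    """Summarize one window of heart-rate (bpm) and EDA (microsiemens) samples."""
    return (statistics.mean(hr), statistics.pstdev(hr), statistics.mean(eda))

def classify_arousal(features, hr_rest=70.0, eda_rest=2.0):
    """Toy rule (illustrative only): elevated mean HR and EDA relative to a
    resting baseline suggests 'high' arousal, otherwise 'low'."""
    mean_hr, _, mean_eda = features
    if mean_hr > hr_rest + 10 and mean_eda > eda_rest * 1.5:
        return "high"
    return "low"

# Example: a calm window vs. an aroused window of (made-up) sensor samples.
calm = extract_features([68, 70, 69, 71], [1.8, 1.9, 2.0, 1.9])
excited = extract_features([88, 92, 90, 95], [4.2, 4.8, 5.1, 4.6])
print(classify_arousal(calm))     # -> low
print(classify_arousal(excited))  # -> high
```

In a real pipeline, the hand-tuned rule would be replaced by a model trained on labeled data, and features would be computed over synchronized multimodal streams.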
We study handheld and wearable sensors as a paradigm for collecting and processing affective data across a range of domains, including media consumption, automotive, fashion, immersive VR experiences, and education. Collaborating with several commercial and academic partners, we have deployed our technology and infrastructure in realistic testing grounds such as the National Theatre of China in Shanghai and the Amsterdam Dance Event in the Netherlands. Our overall objective is to create intelligent and empathic systems that both respect user privacy and react appropriately to humans and their experiences.
1. Sensing and Understanding Human Activities
- Human activity recognition
- Sleep monitoring
- Sensing nightclub activity
- Understanding urban mobility
2. Internet of Things
3. Affective Computing
- Multimodal emotion recognition
- Automotive physiological sensing
- Physiological privacy
- Emotion-aware fashion
- Sensing audience engagement
4. Smart Textiles
- Interactive fashion
- Self-actuating textiles
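To make the human activity recognition topic above concrete, here is a minimal sliding-window sketch over hypothetical 3-axis accelerometer samples: each window's acceleration-magnitude variance decides between "still" and "active". The window size, variance threshold, and labels are illustrative assumptions; deployed systems use learned models over richer features.

```python
import math
import statistics

def magnitude(sample):
    """Euclidean norm of one 3-axis accelerometer sample (in g)."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def windows(samples, size, step):
    """Yield fixed-size sliding windows over a sample stream."""
    for start in range(0, len(samples) - size + 1, step):
        yield samples[start:start + size]

def label_window(window, var_threshold=0.01):
    """Toy rule (illustrative only): low variance in acceleration magnitude
    suggests the wearer is still; higher variance suggests movement."""
    var = statistics.pvariance([magnitude(s) for s in window])
    return "still" if var < var_threshold else "active"

# Example: eight at-rest samples (gravity only) followed by eight noisy ones.
still = [(0.0, 0.0, 1.0)] * 8
moving = [(0.1 * i, 0.3, 1.0 + 0.2 * (i % 3)) for i in range(8)]
print([label_window(w) for w in windows(still + moving, size=8, step=8)])
# -> ['still', 'active']
```

The same window-then-featurize structure carries over to sleep monitoring and the other sensing topics, with different sensors and label sets.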
Videos
- Amsterdam Dance Event (Red Bull): https://www.redbull.com/int-en/tv/video/AP-1R7QCPRDN1W11/playrooms
- Sensing Audiences (with visualization): https://vimeo.com/140198523
- Tangible Air: https://www.youtube.com/watch?v=QGQobnXHFw4
- Enhanced Jazz Concerts: https://www.youtube.com/watch?v=rl8M2o2qQl4
- Sensing Theatre Audiences: https://www.youtube.com/watch?v=dUM-qqRsTx8
- Smart Textiles and Soft Robotics Symposium (at Royal College of Art): https://www.youtube.com/playlist?list=PL36LcFICo85FA8XiNUrPQ0hzrFLahvJxK
- Volvo Design Challenge: https://www.youtube.com/watch?v=kw6YL740Hyo
Open Source Infrastructure
Publications
- New Signals in Multimedia Systems and Applications. IEEE Multimedia (IEEE MM), 25, pp. 12-13.
- Measuring, Understanding, and Classifying News Media Sympathy on Twitter after Crisis Events. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18), Montreal, QC, Canada.
- The Play Is a Hit - But How Can You Tell? In Proceedings of the 2017 ACM SIGCHI Conference on Creativity and Cognition.
- CWI-ADE2016 Dataset: Sensing nightclubs through 40 million BLE packets. In Proceedings of the ACM Multimedia Systems Conference (ACM MMSys 2017).
- Quantifying Audience Experience in the Wild: Heuristics for Developing and Deploying a Biosensor Infrastructure in Theaters. In Proceedings of the International Workshop on Quality of Multimedia Experience (QoMEX 2016).
- Sensing a Live Audience. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '14), Toronto, ON, Canada.
- Fine-grained Access Control Framework for Igor, a Unified Access Solution to The Internet of Things. Procedia Computer Science, 134, pp. 385-392.
- Chen Wang, Monitoring the Engagement of Groups by Using Physiological Sensors, Vrije Universiteit Amsterdam, Netherlands. PhD Thesis, 2018.