
Sweating the Details: Emotion Recognition and the Influence of Physical Exertion in Virtual Reality Exergaming

We have received final confirmation that our paper for CHI 2024 has been accepted! We are excited to share in more detail what the team at the University of Bath has been working on in its EMIL exergaming lighthouse project…

Exergames in virtual reality (VR) are a fun way to exercise and can help people stick to their exercise routines because they make workouts more enjoyable. We have been looking at how we can make these games even better by adjusting them based on how the player feels in the moment. We know that when people are at rest we can use their physiological signals (e.g., heart rate, eye movement) to recognise what emotions they are experiencing – this is known as “affect recognition”. However, figuring out how someone feels while they are exercising is difficult: they are moving a lot, sweating, and their heart and breathing rates are much higher than at rest.

In our research paper, we explore whether it is possible to recognise different emotions during various levels of physical exertion. To do this, we gathered data from 72 participants while they experienced VR environments designed to evoke different emotions at different exercise intensities. We analysed the relationship between their physiological responses and emotions and developed a statistical model. This model allows us to predict a user’s emotional state from their physiological signals during different levels of exercise, helping us tailor the game content to match how they’re feeling!
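
To give a feel for what affect recognition from physiological signals looks like in practice, here is a minimal, purely illustrative sketch in Python. It is not the model from our paper: the feature set, labels, and data are hypothetical placeholders, and it simply shows a classifier predicting an emotion label from physiological features with exertion level included as an extra input.

```python
# Illustrative sketch only -- NOT the paper's model. It shows the general idea of
# affect recognition: predicting an emotion label from physiological features,
# with exertion level as an extra input so the classifier can account for
# exercise intensity. All feature names and data here are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500  # hypothetical number of labelled time windows

# Hypothetical per-window features: mean heart rate (bpm), heart-rate variability
# (RMSSD, ms), electrodermal activity (microsiemens), pupil diameter (mm), and
# the exertion level of the game segment (0 = rest, 1 = moderate, 2 = intense).
X = np.column_stack([
    rng.normal(110, 25, n),   # heart rate
    rng.normal(35, 12, n),    # HRV (RMSSD)
    rng.normal(4.0, 1.5, n),  # EDA
    rng.normal(3.5, 0.6, n),  # pupil diameter
    rng.integers(0, 3, n),    # exertion level
])

# Hypothetical emotion labels, e.g. quadrants of the valence-arousal space.
y = rng.integers(0, 4, n)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```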

Our paper offers emotion recognition models, methods for cleaning physiological sensor data, validated VR environments for inducing emotions, a dataset from real-time experiments with 72 participants, and the EmoSense SDK for other researchers and designers in extended reality (XR) fields.
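
One of those contributions is cleaning physiological sensor data recorded during exercise. As a rough illustration of why this matters, the sketch below shows one common, generic approach to removing motion artifacts from a heart-rate trace; it is not the specific pipeline described in the paper, and the thresholds and data are hypothetical.

```python
# Illustrative sketch only -- a common way to clean noisy heart-rate data
# recorded during exercise, not the specific pipeline from the paper.
# Samples that deviate strongly from a rolling median (typical of motion
# artifacts) are masked out and filled by interpolation.
import numpy as np
import pandas as pd

def clean_heart_rate(hr: pd.Series, window: int = 15, max_dev_bpm: float = 20.0) -> pd.Series:
    """Replace implausible heart-rate samples with interpolated values."""
    rolling_median = hr.rolling(window, center=True, min_periods=1).median()
    artifacts = (hr - rolling_median).abs() > max_dev_bpm
    return hr.mask(artifacts).interpolate(limit_direction="both")

# Hypothetical 1 Hz heart-rate trace with a few injected motion-artifact spikes.
rng = np.random.default_rng(1)
hr = pd.Series(120 + rng.normal(0, 3, 300))
hr.iloc[[50, 51, 200]] = [40, 210, 35]
print(clean_heart_rate(hr).iloc[[50, 51, 200]])
```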

If you’re interested in our work and want to read our CHI paper in advance, please contact Dr. Dominic Potts at the University of Bath: dmp59@bath.ac.uk.

#HCI #VirtualReality #Research #Exergaming #CHI2024
