All public documents and deliverables of the EMIL project will be made available on this page.
Deliverables
DRAFT
Tuija Heikura – AALTO
Christof Lutteroth – UB
Tim Moesgen – AALTO
Juhani Tenhunen – AALTO
Yu Xiao – AALTO
D1.1 – Data Management Plan
EMIL has prepared a Data Management Plan (DMP), which describes the data management life cycle for the data to be collected, processed and/or generated by the EMIL project. The DMP will be updated regularly throughout the project (D1.1). Updates of the DMP will be included in the Periodic Reporting Part B.
DRAFT D2.1 – FSTP Text and Application Procedure
This document describes the set-up of the Financial Support for Third Parties (FSTP) calls I and II. It presents the information given about the calls and procedures (preparation activities for the calls, the application form, general requirements, and criteria and expectations for the funded FSTP projects). Information about the publication of the calls and the funding timeline of the FSTP calls is also listed.
DRAFT D2.2 – FSTP call website
This document describes the FSTP call setup and the call website. It clarifies the preparation of the FSTP call website, its thematic areas, the general requirements for FSTP projects, and instructions on how to write an application. The privacy of applicants is a sensitive issue and is also discussed briefly. Our conclusions highlight the considerable workload required when starting a project that must deliver new instruments such as FSTP calls.
DRAFT D2.3 – Launch of FSTP Call 1
This document describes the actions relating to the opening of FSTP call 1.
DRAFT
Christopher Clarke – UB
Crescent Jicol – UB
Christof Lutteroth – UB
Adwait Sharma – UB
DRAFT D2.4 – Evaluated results of call 1 and public results on website
This document is a public announcement of the evaluation results of EMIL’s first FSTP call. It gives a brief overview of the evaluation and lists the successful applicants.
DRAFT
Christof Lutteroth – UB
Christopher Clarke – UB
Crescent Jicol – UB
Adwait Sharma – UB
DRAFT D2.6 – Evaluated results of call 2 and public results on website
This document is a public announcement of the evaluation results of EMIL’s second FSTP call. It gives a brief overview of the evaluation and lists the successful applicants.
DRAFT
Ramyah Gowrishankar – AALTO
Tim Moesgen – AALTO
Yu-Han Tseng – AALTO
Esa Vikberg – AALTO
Yu Xiao – AALTO
DRAFT D3.1 – Hardware specifications and APIs of smart garments
This document, “Hardware specifications and APIs of smart garments” (D3.1) describes the hardware design of smart garments that provide tactile and thermal feedback, including the spatial layout and default configurations of motors, heating and cooling elements and sensors, as well as techniques for integrating the hardware components into the textiles. The report also describes open APIs for (re)configuring and (de)activating components, and for obtaining sensor data.
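To make the role of such open APIs concrete, here is a minimal, purely illustrative sketch of how a client might (re)configure and deactivate garment components and read sensor data. The class and method names (`GarmentController`, `configure`, `deactivate`, `read_sensor`) and the element identifiers are invented for illustration and are not the actual D3.1 API.

```python
# Hypothetical sketch of a smart-garment control API; all names are
# invented and do not reflect the real D3.1 interface.
from dataclasses import dataclass, field

@dataclass
class GarmentController:
    """Minimal mock controller for addressable thermal and vibrotactile
    elements integrated into a garment."""
    active: dict = field(default_factory=dict)  # element id -> intensity

    def configure(self, element_id: str, intensity: float) -> None:
        # Clamp the commanded intensity to the [0, 1] range, then activate.
        self.active[element_id] = max(0.0, min(1.0, intensity))

    def deactivate(self, element_id: str) -> None:
        # Deactivating an element that is not active is a no-op.
        self.active.pop(element_id, None)

    def read_sensor(self, element_id: str) -> float:
        # A real device would return e.g. a temperature or pressure
        # reading; here the commanded intensity stands in for it.
        return self.active.get(element_id, 0.0)

ctrl = GarmentController()
ctrl.configure("heat_forearm_1", 0.8)  # activate a heating element
ctrl.configure("motor_wrist_2", 1.5)   # out-of-range value is clamped to 1.0
ctrl.deactivate("heat_forearm_1")      # switch the heating element off
```

The sketch only illustrates the general shape of such an interface: addressable components, clamped configuration values, and a uniform way to query state.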
DRAFT D3.3 – Hardware integration and Adaptation of foundation libraries of AR Magic Lanterns
This deliverable describes the advances in hardware and system integration of the Augmented Reality Magic Lanterns, devices designed and developed by the UPF partner. The document describes the progress in moving from the initial TRL4 demonstrator, which had been tested under controlled conditions at the historical site of Barcino (the ancient Roman Barcelona), to the TRL8 device targeted for the end of EMIL. In this process we describe the reasons for leaving behind the off-the-shelf hardware version of the TRL4 demonstrator prototype and justify the need for a different hardware integration. The document therefore provides an account of the investigations, hardware analysis, computer vision utilities, platform compatibility work and related activities needed to reach a final integration, and of how the UPF team has undertaken these tasks.
DRAFT
Christof Lutteroth – UB
Christopher Clarke – UB
Crescent Jicol – UB
Adwait Sharma – UB
DRAFT D3.6 – SDK for integration of physical activities into VR experiences
This document, “SDK for integration of physical activities into VR experiences” (D3.6) provides an overview of a Software Development Kit (SDK) that can be used to create VR experiences that integrate physical activities such as walking-in-place, cycling and walking on a crosstrainer. The SDK is able to detect physical activities purely based on headset movements. This document describes the software components of the SDK that support different forms of physical activity, and also gives brief examples of how they can be applied in a VR project.
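As a rough illustration of the underlying idea, stepping in place produces a roughly periodic vertical oscillation of the headset, so cadence can be estimated from the head-height signal alone. The following sketch is not the D3.6 SDK; the function name and thresholds are invented for illustration.

```python
# Illustrative sketch: estimate walking-in-place cadence purely from
# headset movement by counting upward zero crossings of the head-height
# signal. Not the actual D3.6 SDK.
import math

def estimate_cadence(head_heights, sample_rate_hz, min_amplitude=0.01):
    """Return estimated steps per second from a head-height trace (metres)."""
    mean_h = sum(head_heights) / len(head_heights)
    centred = [h - mean_h for h in head_heights]
    # Require a minimum oscillation amplitude so sensor noise while the
    # user stands still is not mistaken for stepping.
    if max(abs(c) for c in centred) < min_amplitude:
        return 0.0
    # Each step cycle produces one upward zero crossing of the centred signal.
    crossings = sum(1 for prev, cur in zip(centred, centred[1:]) if prev < 0 <= cur)
    return crossings / (len(head_heights) / sample_rate_hz)

# Simulate head bobbing at 2 steps/s, sampled at 50 Hz for 5 seconds.
trace = [1.7 + 0.03 * math.sin(2 * math.pi * 2 * t / 50) for t in range(250)]
cadence = estimate_cadence(trace, sample_rate_hz=50)  # close to 2 steps/s
```

A production SDK would of course filter the signal, handle variable cadence and distinguish activity types; the sketch only shows why headset motion alone can carry enough information.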
DRAFT
Christof Lutteroth – UB
Christopher Clarke – UB
Crescent Jicol – UB
Adwait Sharma – UB
Dominic Potts – UB
DRAFT D3.7 – SDK for integration of affect recognition into VR experiences
This document, “EmoSense SDK for integration of affect recognition into VR experiences” (D3.7) provides an overview of a Software Development Kit (SDK) that can be used to collect physiological measures in real-time and produce a confidence value of predicted core affect for categorical and dimensional emotion models. The supported physiological measures include: pupillometry, gaze and head movement, heart rate and heart rate variability, galvanic skin response, and facial/lip gestures. The SDK includes software to take baseline and calibration measures for all physiological signals for the purpose of data cleaning and normalisation between users. This document will describe the software components of the SDK that support affect recognition in VR and give brief examples of integrating affect recognition into VR experiences.
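The pipeline described above, baseline calibration, per-user normalisation, then a confidence value for a predicted affect, can be sketched as follows. This is not the EmoSense SDK itself; the feature names, baselines and weights are invented for illustration.

```python
# Hypothetical sketch of an affect-recognition pipeline: z-score live
# physiological readings against a per-user resting baseline, then squash
# a weighted sum into a confidence in [0, 1]. All names/values invented.
import math

def normalise(value, baseline_mean, baseline_std):
    """Z-score a live reading against the user's resting baseline."""
    return (value - baseline_mean) / baseline_std if baseline_std else 0.0

def arousal_confidence(features, baselines, weights):
    """Combine z-scored features into a confidence that the user is in a
    high-arousal state (logistic squashing of a weighted sum)."""
    score = sum(
        weights[name] * normalise(value, *baselines[name])
        for name, value in features.items()
    )
    return 1.0 / (1.0 + math.exp(-score))

# Per-user baselines (mean, std) taken during a calibration phase.
baselines = {"heart_rate": (65.0, 5.0), "pupil_mm": (3.0, 0.4), "gsr": (2.0, 0.5)}
weights = {"heart_rate": 0.5, "pupil_mm": 0.3, "gsr": 0.2}

calm = arousal_confidence({"heart_rate": 66, "pupil_mm": 3.1, "gsr": 2.1},
                          baselines, weights)
stressed = arousal_confidence({"heart_rate": 95, "pupil_mm": 4.5, "gsr": 3.8},
                              baselines, weights)
```

The baseline normalisation step mirrors the data-cleaning rationale stated above: raw physiological values differ strongly between users, so per-user calibration is what makes the combined score comparable.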
D3.10 – Report on HMDs and content creation engines for LBE
This document, “Report on HMDs and content creation engines for LBE” (D3.10), provides an overview of available HMDs, content creation engines and tracking solutions and their potential for use in LBEs. It also highlights special characteristics of the respective systems and gives advice on which technologies have the greatest potential for implementation in the context of the project.
DRAFT
Justus Blönnigen – FABW
Andreas Dahn – FABW
Volker Helzle – FABW
Alexander Kreische – FABW
Leszek Plichta – FABW
Eduard Schäfer – FABW
Simon Spielmann – FABW
D3.11 – Location Based Experience Demonstrator
This document describes the background and current status of FABW’s Lighthouse Project “MinoXR” (working title): the story, design, interaction mechanics, technical implementation and the public demonstrator of the November 2023 prototype of this scalable, interactive, multi-user location-based XR experience.
DRAFT
Narcis Pares Burgues – UPF
Volker Helzle – FABW
Alexander Kreische – FABW
Christof Lutteroth – UB
Juhani Tenhunen – AALTO
Yu Xiao – AALTO
DRAFT D4.1 EMIL’s Services Guidelines for FSTP Projects
This document, “EMIL’s Services Guidelines for FSTP Projects” (D4.1), provides detailed information for FSTP participants to understand the availability of resources at each node and to ensure their engagement in the EMIL network. The budget assigned to each FSTP project will be sufficient to cover the main workload. EMIL is a centre for promoting and supporting innovative XR projects: applicants should define their projects as self-contained, and EMIL will provide expertise and access to resources (such as personnel, spaces and technologies) in a reasonable but limited fashion, so as not to become overloaded. This guidelines document lists the resources and use limits that an FSTP project can expect for access to facilities, equipment, software and services.
DRAFT
Narcis Pares Burgues – UPF
Volker Helzle – FABW
Alexander Kreische – FABW
Christof Lutteroth – UB
Christopher Clarke – UB
Juhani Tenhunen – AALTO
Yu Xiao – AALTO
D4.2 EMIL’s Guide to Evaluation of FSTP Projects
This document describes the protocol and criteria with which Financial Support for Third Parties (FSTP) proposals from EMIL calls will be evaluated during the selection process. These criteria will also be applied throughout the development of the projects once they have been funded.
Narcis Pares Burgues – UPF
Volker Helzle – FABW
Alexander Kreische – FABW
Christof Lutteroth – UB
Juhani Tenhunen – AALTO
Yu Xiao – AALTO
DRAFT
DRAFT D5.1 Dissemination and Use Plan
This document will define the dissemination and communication activities that EMIL – European Media and Immersion Lab will execute. It will cover the following: Audience (who are we trying to reach, and which groups or organizations can help us reach this audience), Message (the purpose of the dissemination), Approach (the dissemination channels that best meet our needs), the rough Timing of the dissemination, and the Responsible party (who will lead the dissemination efforts).
Lisa Forelli – FABW
Volker Helzle – FABW
Alexander Kreische – FABW
Jonas Trottnow – FABW
DRAFT
DRAFT D5.2 Create website, social media channels, promotion video for EMIL and open calls
This document accompanies the second Deliverable of WP5T1 Dissemination and communication strategy. WP5T1 consists of the strategy for scientific publications in international journals and conferences, project showcases, thematic workshops and events, European network meetings, a joint project website, various other communication channels such as social media, press releases and newsletters, a project media package, as well as promotional material and recordings of events and webinars. The document introduces the structure of the EMIL website, EMIL’s social media channels, and the project logo and its use.
DRAFT D5.3 Annual reporting and workshop at FMX and other conferences
This document accompanies the first Deliverable of WP5T2 Demonstration workshops and events. EMIL Lighthouse projects, selected FSTP projects and associated topics will be continuously demonstrated at physical and virtual workshops, events and conferences such as FMX, Aurea Award, CVMP, Sonar and others. Industry experts will get the opportunity to engage with the innovative minds and entrepreneurship of the FSTP projects, paving the way to commercialization. Workshops and events will be utilized to extend and strengthen the EMIL network.
Open Publications
Esa Vikberg, Yu-Han Tseng, Tim Moesgen, Ramyah Gowrishankar, Yu Xiao. 2024. ThermoTouch: Exploring a modular design of a programmable wearable thermo-haptic device. In Proceedings of the 2024 ACM International Conference on Interactive Media Experiences Workshops (IMXw ’24), June 12, 2024, Stockholm, Sweden. ACM, New York, NY, USA, 5 pages. https://doi.org/10.1145/3672406.3672423
ThermoTouch: Exploring a modular design of a programmable wearable thermo-haptic device
The field of haptic and thermal interfaces for extended reality (XR) is rapidly expanding. While prototypes of thermo-haptic interfaces exist, they often suffer from shortcomings in adaptability, wearability, and integration with XR applications. This paper presents ThermoTouch, an open-source and modular haptic wearable platform featuring open APIs. It facilitates seamless prototyping for thermo-haptic interactions within extended reality applications.
Sweating the Details – PDF File – 31 MB
Dominic Potts, Zoe Broad, Tarini Sehgal, Joseph Hartley, Eamonn O’Neill, Crescent Jicol, Christopher Clarke, and Christof Lutteroth. 2024. Sweating the Details: Emotion Recognition and the Influence of Physical Exertion in Virtual Reality Exergaming. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’24), May 11–16, 2024, Honolulu, HI, USA. ACM, New York, NY, USA, 21 pages. https://doi.org/10.1145/3613904.3642611
Sweating the Details: Emotion Recognition and the Influence of Physical Exertion in Virtual Reality Exergaming
There is great potential for adapting Virtual Reality (VR) exergames based on a user’s affective state. However, physical activity and VR interfere with physiological sensors, making affect recognition challenging. We conducted a study (n=72) in which users experienced four emotion inducing VR exergaming environments (happiness, sadness, stress and calmness) at three different levels of exertion (low, medium, high). We collected physiological measures through pupillometry, electrodermal activity, heart rate, and facial tracking, as well as subjective affect ratings. Our validated virtual environments, data, and analyses are openly available. We found that the level of exertion influences the way affect can be recognised, as well as affect itself. Furthermore, our results highlight the importance of data cleaning to account for environmental and interpersonal factors interfering with physiological measures. The results shed light on the relationships between physiological measures and affective states and inform design choices about sensors and data cleaning approaches for affective VR.
RetroSketch – PDF File – 38 MB
Dominic Potts, Miloni Gada, Aastha Gupta, Kavya Goel, Klaus Philipp Krzok, Genevieve Pate, Joseph Hartley, Mark Weston-Arnold, Jakob Aylott, Christopher Clarke, Crescent Jicol, and Christof Lutteroth. 2025. RetroSketch: A Retrospective Method for Measuring Emotions and Presence in Virtual Reality. In CHI Conference on Human Factors in Computing Systems (CHI ’25), April 26–May 01, 2025, Yokohama, Japan. ACM, New York, NY, USA, 25 pages. https://doi.org/10.1145/3706598.3713957
RetroSketch: A Retrospective Method for Measuring Emotions and Presence in Virtual Reality
Virtual Reality (VR) designers and researchers often need to measure emotions and presence as they evolve over time. The experience sampling method (ESM) is a common way to achieve this; however, ESM disrupts the experience and lacks granularity. We propose RetroSketch, a new method for measuring subjective emotions and presence in VR, where users watch back their VR experience and retrospectively sketch a plot of their feelings. RetroSketch leaves the VR experience undisturbed and yields highly granular data, including information about salient events and qualitative descriptions of feelings. We compared RetroSketch and ESM in a large study (n=140) using five different VR experiences over one-hour sessions. Our results show that RetroSketch and ESM measures are highly correlated with each other, as well as with physiological measures indicative of emotion. The correlations are robust across different VR experiences and user demographics. Our results also highlight the impact of ESM on users’ experience.
FateoftheMinotaur – PDF File – 1 MB
Andreas Dahn, Leszek Plichta, Simon Spielmann, Eduard Schäfer, and Justus Blönnigen. 2024. Fate of the Minotaur – A Scalable Location-Based VR Experience. In Special Interest Group on Computer Graphics and Interactive Techniques Conference Immersive Pavilion (SIGGRAPH Immersive Pavilion ’24), July 27–August 01, 2024, Denver, CO, USA. ACM, New York, NY, USA, 2 pages. https://doi.org/10.1145/3641521.3664406
Fate of the Minotaur – A Scalable Location-Based VR Experience
Within our narrative location-based VR experience “Fate of the Minotaur”, the players embody the role of human sacrifices from Athens who are sent into the labyrinth of the Minotaur by Minos, King of Crete. The players learn about the tragic family story behind the ancient Greek myth and have to pick a side by either killing the Minotaur or sparing the troubled creature’s life. In a unique approach, the content can be experienced in different immersive scale levels, depending on the technical and physical limitations of the location presenting the experience. From a technical perspective, a novel engine-agnostic and flexible open-source virtual production framework was used to realise the multiplayer network part of the game. Our non-photorealistic visual approach is inspired by ancient Greek murals and vases, allowing us to provide the experience with a small footprint in energy consumption and required equipment.
ApredictiveModel – PDF File – 1.62 MB
Jicol C, Cheng HY, Petrini K, O’Neill E (2023) A predictive model for understanding the role of emotion for the formation of presence in virtual reality. PLoS ONE 18(3): e0280390. https://doi.org/10.1371/journal.pone.0280390
A predictive model for understanding the role of emotion for the formation of presence in virtual reality
Users’ emotions may influence the formation of presence in virtual reality (VR). Users’ expectations, state of arousal and personality may also moderate the relationship between emotions and presence. An interoceptive predictive coding model of conscious presence (IPCM) considers presence as a product of the match between predictions of interoceptive emotional states and the actual states evoked by an experience (Seth et al. 2012). The present paper aims to test this model’s applicability to VR for both high-arousal and low-arousal emotions. The moderating effect of personality traits on the creation of presence is also investigated. Results show that user expectations about emotional states in VR have an impact on presence; however, expression of this relationship is moderated by the intensity of an emotion, with only high-arousal emotions showing an effect. Additionally, users’ personality traits moderated the relationship between emotions and presence. A refined model is proposed that predicts presence in VR by weighting emotions according to their level of arousal and by considering the impact of personality traits.
AcceleratingXRInnovation – PDF File – 8 MB
Justus Blönnigen, Christopher Clarke, Andreas Dahn, Lisa Forelli, Ramyah Gowrishankar, Tuija Heikura, Volker Helzle, Paul Hine, Crescent Jicol, Alexander Kreische, Christof Lutteroth, Francisco Macía, Tim Moesgen, Narcis Pares, Leszek Plichta, Dominic Potts, Eduard Schäfer, Adwait Sharma, Simon Spielmann, Juhani Tenhunen, Jonas Trottnow, Yu-Han Tseng, Esa Vikberg, Yu Xiao. 2024. Accelerating XR Innovation through a pan-European Lab Network: An overview of the EMIL project. In Proceedings of the 2024 ACM International Conference on Interactive Media Experiences Workshops (IMXw ’24), June 12, 2024, Stockholm, Sweden. ACM, New York, NY, USA, 5 pages. https://doi.org/10.1145/3672406.3672426
Accelerating XR Innovation through a pan-European Lab Network: An overview of the EMIL project
European Media and Immersion Lab, or EMIL, is a pan-European network of extended reality (XR) labs consisting of 4 European academic institutions, with a mission to accelerate the development of virtual, augmented and mixed reality technologies, content, services and applications. The 30-month project, which started in September 2022, has been funded by the European Union and co-funded by Innovate UK. This paper gives an overview of the project’s goals, its organization, and selected results that have been achieved.
DesigningBeyondHotandCold – PDF File – 9 MB
Tim Moesgen, Ramyah Gowrishankar, and Yu Xiao. 2024. Designing Beyond Hot and Cold – Exploring Full-Body Heat Experiences in Sauna. In Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’24), February 11–14, 2024, Cork, Ireland. ACM, New York, NY, USA, 14 pages. https://doi.org/10.1145/3623509.3633364
Designing Beyond Hot and Cold – Exploring Full-Body Heat Experiences in Sauna
The design of thermal experiences, often only associated with hot and cold sensations, encompasses a much wider range of qualities that can evoke emotional and sensory effects on users extending beyond this basic binary nature. Through a phenomenological study of a traditional Finnish sauna, we delve into how individuals perceive and express full-body heat sensations using both verbal and non-verbal methods, and infer dimensions and parameters of heat that can inform future thermal experience design. Findings from participants’ expressions lead to formulating experiential dimensions such as the dynamic nature of heat, the aesthetics of discomfort, considerations of texture and heaviness, interoception, and personal memories that expand our understanding of what heat as a design material consists of and the various possibilities it may hold. Furthermore, we propose three thermal parameters of motion, timbre, and distribution which can contribute to designing more intricate heat experiences and pave the way for further research in temperature interfaces.
UnderstandingandDesigning – PDF File – 3 MB
Tim Moesgen. 2024. Understanding and Designing Thermal Experiences. In Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’24), February 11–14, 2024, Cork, Ireland. ACM, New York, NY, USA, 4 pages. https://doi.org/10.1145/3623509.3634893
Understanding and Designing Thermal Experiences
This PhD explores the potential of leveraging the unique qualities of temperature experiences in the design of tactile and thermal interfaces. Thermal stimuli can evoke various emotions and sensations, from relaxation or nostalgia to feelings of vitality. The research examines the design space of temperature-related experiences, driven by the growing body of literature in multi-sensory technologies and thermal feedback. While temperature feedback has shown promise in enhancing virtual immersion or interpersonal communication, the focus has primarily been on technological aspects, leaving experiential and aesthetic dimensions largely unexplored. This paper highlights the need for a more nuanced understanding of temperature experiences to inform future design and enhance the richness of thermal interfaces in various applications.
MultisensoryExperiences – PDF File – 4 MB
Sun, W., Banakou, D., Świdrak, J. et al. Multisensory experiences of affective touch in virtual reality enhance engagement, body ownership, pleasantness, and arousal modulation. Virtual Reality 28, 162 (2024). https://doi.org/10.1007/s10055-024-01056-2
Multisensory experiences of affective touch in virtual reality enhance engagement, body ownership, pleasantness, and arousal modulation
When engaging in physical contact, our emotional response hinges not only on the nuanced sensory details and the receptive properties of the skin but also on contextual cues related to the situation and interpersonal dynamics. The consensus is that the nature of the affective interactive experience in social touch is shaped by a combination of ascending, C-tactile (CT) afferent-mediated somatosensory information, and modulatory, top-down information. The question we pose here is whether, in the absence of somatosensory input, multisensory cues alone can suffice to create a genuinely pleasant, authentic, and engaging experience in virtual reality. The study aims to explore how affective touch is perceived in immersive virtual environments, considering varied social norms in neutral settings or settings like a physiotherapy room where the touch provider is a healthcare professional. We conducted an experiment with 58 male and female healthy adults, where we employed a within-group counterbalanced design featuring two factors: (a) visuo-tactile affective touch, and (b) visual-only affective touch. Findings, drawn from questionnaires and collected physiological data, shed light on how contextual factors influence implicit engagement, self-reported embodiment, co-presence, as well as the perceived realism and pleasantness of the touch experience. Our findings, in line with the literature, indicate that to experience the advantages of touch in immersive virtual worlds, it is essential to incorporate haptic feedback, as depending solely on visual input may not be adequate for fully realising the optimal benefits of interpersonal touch. Furthermore, in contradiction with our hypothesis, a less ambiguous context (specifically, the physiotherapy room and touch from a physiotherapist) is not linked to heightened touch pleasantness.
Additional Publications
- Moesgen, T., Ho, HN., Xiao, Y. (2025). Apparent Thermal Motion on the Forearm. In: Kajimoto, H., et al. Haptics: Understanding Touch; Technology and Systems; Applications and Interaction. EuroHaptics 2024. Lecture Notes in Computer Science, vol 14768. Springer, Cham. https://doi.org/10.1007/978-3-031-70058-3_5
- Hine, P., Tadesse Mamo, L., & Pares, N. (2022, April). AR Magic Lantern: Group-based Co-Located Augmentation Based on the World-as-Support AR Paradigm. In CHI Conference on Human Factors in Computing Systems Extended Abstracts (CHI ’22). https://doi.org/10.1145/3491101.3519918
- (Submitted) Anna Logetskaia, Paul Hine, Francisco Macia, and Narcis Pares (2026). Comparing Spatial AR to Pass-through Headset AR in a Real Virtual Heritage Experience. IEEE Conference on Virtual Reality and 3D User Interfaces (IEEE VR) 2026