As I walk across the frozen lake, brightly colored bubbles move under the surface. I chase one down and stamp on it hard with my right foot. Instantly, a crack appears where my foot hit the ice. My companions are doing the same on other parts of the lake. We’re all motivated to fill the ice with cracks because we can see the friendly dragon swimming impatiently below the surface, where he has been trapped by an evil genius and an errant knight. Finally, after a few minutes of chasing and stomping, the dragon bursts triumphantly through the hole that we have helped open in the ice.
We have been playing “DragonIce,” a new VR game being developed by doctoral students in the Full-body Interaction Lab at Universitat Pompeu Fabra in Barcelona. Unlike head-mounted VR, in which the player wears goggles, DragonIce projects the virtual environment onto the floor, and players interact with it simply by moving their bodies. This embodied interaction, grounded in theories of embodied cognition, gives players a more natural sense of the space, their bodies, and each other. That last point is important for the game’s intended player: a child with autism spectrum disorder (ASD). The researchers working on the game hope that it will become a powerful tool for helping children with ASD improve their social skills.
The room where DragonIce is installed is an XR facility created and managed by the Full-body Interaction Lab. Two projectors, mounted in the rafters, create a play space nearly 6 m in diameter, giving players plenty of room to move about freely. Their movements are tracked with striking precision thanks to an array of four HTC Vive base stations (the same ones used to track Vive head-mounted units) and up to eight tracking units. Players hold or wear small tracking units: in DragonIce, they are first attached to each player’s shoe so that the system can react properly to stomping actions, and later held in the hand so that players can manipulate game elements with more precision. The visuals are driven by a powerful workstation running Unity, which seamlessly distributes the gameplay across the two projectors and coordinates it with input from the tracking system.
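To make the stomp interaction concrete, here is a minimal, hypothetical sketch (not the lab’s actual code) of how a shoe-mounted tracker’s height samples might be turned into stomp events: a stomp is registered when the foot moves downward faster than a threshold speed and ends up close to the floor. The threshold values and function names are illustrative assumptions, not parameters from the DragonIce system.

```python
# Hypothetical stomp detection from a shoe-mounted tracker's vertical position.
# Heights are in metres, sampled at a fixed interval dt (seconds).
# STOMP_SPEED and FLOOR_EPS are assumed values for illustration only.

STOMP_SPEED = 1.0   # assumed: downward speed (m/s) that counts as a stomp
FLOOR_EPS = 0.03    # assumed: tracker is "on the floor" below 3 cm

def detect_stomps(heights, dt):
    """Return sample indices where a stomp is detected."""
    stomps = []
    for i in range(1, len(heights)):
        velocity = (heights[i] - heights[i - 1]) / dt  # vertical velocity
        if velocity < -STOMP_SPEED and heights[i] < FLOOR_EPS:
            stomps.append(i)
    return stomps

# Example: the foot lifts to 20 cm, pauses, then slams down in one 50 ms frame.
samples = [0.02, 0.10, 0.20, 0.20, 0.02]
print(detect_stomps(samples, dt=0.05))  # -> [4]
```

In a real installation this logic would run per-frame inside the game engine against the tracking system’s pose stream, but the core idea, thresholding vertical velocity near the floor, is the same.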
The Full-body Interaction Lab is excited to offer its projective XR facility to EMIL partners wishing to conduct research or development using such a system. The lab is also beginning to develop the space into a benchmarking facility for inside-out tracking, taking advantage of the Vive tracking system as ground truth, and the projectors for controlled, diffuse lighting. EMIL partners interested in developing and testing projective VR games or novel inside-out tracking methods should consider the space and the associated research teams as an important asset. If you are interested in using the space for your project, please contact Narcis Pares, the PI of the UPF EMIL node.