
Aalto University

Smart Garments


Designing smart garments and wearable haptic devices to enhance immersive user experiences in VR.

Hand touching a display. Photo by Killian Cartignies

Context

Addressing the sense of touch with haptic feedback in augmented and virtual reality applications has recently been shown to enhance the immersive experience, and the topic has attracted considerable attention from both academia and industry.

Haptics as a research field has also grown rapidly in recent years, as shown by the wide range of haptic devices researched and developed to render haptic feedback for purposes such as accessibility, motor training, education and entertainment.

Photo by Mikko Raskinen

Haptic feedback aims to mimic various human sensations, including weight, texture, temperature and even pain. For VR development in particular, simulating realistic tactile and kinesthetic interactions with virtual environments and objects is crucial for immersive user experiences.

Commercial haptic gloves are mainly designed to provide force feedback and, in some cases, texture sensation in specific VR applications such as simulation-based pilot training. With regard to texture sensation, existing devices have focused on simulating the feel of solid objects in VR.

However, research on liquid-based and thermal haptic experiences is still in its infancy, with only a few examples available.

VR demonstration

Project Goals

Aalto University’s Wearable Systems Lab aims to explore and simulate multimodal thermal and liquid experiences and their effects on emotions, immersion and embodiment in VR by creating textile-based wearable haptic interfaces. The project objectives are:

1. Explore the multimodal perception of the human haptic sense to achieve novel, rich haptic sensations.

2. Simulate thermal and liquid sensations through wearable thermal and tactile actuators across the body.

3. Understand how emotions, the feeling of immersion and embodiment can be enhanced through such haptic interfaces.

Taking human experience into account

When designing thermal and haptic feedback, we take human experience into account. In our first study, we interviewed individuals about their thermal sensations in order to identify common experiential qualities of heat experiences that can support the design process.

Thermal Body Map by a study participant describing their heat experience of sauna

Open API and Electronics Design

As part of the lighthouse project, we design a configurable, modular haptic device with an API that supports controlling the device both on a per-actuator/sensor level and as a whole. The modules, which include Peltier elements (capable of generating warm and cold stimuli), vibration motors (ERM) and temperature sensors, can be set to produce, for example, a desired temperature or vibration intensity. With the published designs, it is possible to build a wearable device from the modules, or to adapt them and add your own actuators and sensors.
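
To illustrate the two levels of control described above, here is a minimal Python sketch. All names in it (Module, HapticDevice, set_temperature, set_vibration, set_all_temperatures) are hypothetical stand-ins rather than the project's actual API; see the published specifications and the repository linked below for the real interface.

```python
# Hypothetical sketch of per-module and whole-device control. The class and
# method names are illustrative only, not the published wearable-device API.

from dataclasses import dataclass, field


@dataclass
class Module:
    """One garment module: a Peltier element, an ERM motor, a temp sensor."""
    module_id: int
    target_temp_c: float | None = None   # None = Peltier off
    vibration: float = 0.0               # 0.0 (off) .. 1.0 (full intensity)


@dataclass
class HapticDevice:
    """Whole-garment view: address modules individually or all at once."""
    modules: dict[int, Module] = field(default_factory=dict)

    def set_temperature(self, module_id: int, temp_c: float) -> None:
        # Per-actuator control: drive one Peltier element to a target temperature.
        self.modules.setdefault(module_id, Module(module_id)).target_temp_c = temp_c

    def set_vibration(self, module_id: int, intensity: float) -> None:
        # Clamp the intensity to the valid range before applying it.
        module = self.modules.setdefault(module_id, Module(module_id))
        module.vibration = max(0.0, min(1.0, intensity))

    def set_all_temperatures(self, temp_c: float) -> None:
        # Whole-device control: the same warm/cold stimulus on every module.
        for module in self.modules.values():
            module.target_temp_c = temp_c


# Example: a warm patch on module 3 with a gentle vibration overlay.
device = HapticDevice({i: Module(i) for i in range(8)})
device.set_temperature(3, 36.0)   # degrees Celsius
device.set_vibration(3, 0.4)
```

The two levels mirror the description above: individual modules can be addressed for localized stimuli, while whole-device calls apply a uniform stimulus across the garment.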

Controlling the device and its thermo-haptic components, by adjusting intensity, duration and location, opens up a wide range of haptic and thermal patterns: the gentle touch of a soft breeze, the rhythmic pulses of a heartbeat, or the soothing warmth of a summer sun. This level of customization lets developers and designers tailor rich, multisensory interactions for virtual reality, gaming, therapy and practical applications such as remote communication.
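
As a concrete example, a pattern such as the heartbeat mentioned above can be expressed as a timed sequence of actuator commands. The following sketch again uses hypothetical names (Step, play, and a caller-supplied setter) rather than the project's actual API:

```python
# Hypothetical sketch of a thermo-haptic pattern: each step names a module,
# a vibration intensity and a duration. The format is illustrative only.

import time
from typing import NamedTuple


class Step(NamedTuple):
    module_id: int
    intensity: float      # 0.0 (off) .. 1.0 (full)
    duration_s: float     # how long to hold this step


# A lub-dub heartbeat: two quick pulses on one module, then a pause.
HEARTBEAT = [
    Step(module_id=0, intensity=0.8, duration_s=0.10),  # "lub"
    Step(module_id=0, intensity=0.0, duration_s=0.10),
    Step(module_id=0, intensity=0.5, duration_s=0.10),  # "dub"
    Step(module_id=0, intensity=0.0, duration_s=0.55),  # rest of the beat
]


def play(pattern: list[Step], set_vibration, repeats: int = 3) -> None:
    """Drive an actuator through the pattern via a caller-supplied setter."""
    for _ in range(repeats):
        for step in pattern:
            set_vibration(step.module_id, step.intensity)
            time.sleep(step.duration_s)


# For demonstration, log the commands instead of driving real hardware.
play(HEARTBEAT, lambda mid, level: print(f"module {mid} -> {level:.1f}"))
```

Swapping the print-based setter for a real per-actuator call would drive the garment itself, and temperature steps could be sequenced in the same way.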


Documents and Files

The hardware and API specifications of the smart garments, together with instructions on how to control them, are available for download here.


The open-source code is available here: https://version.aalto.fi/gitlab/vikbere2/wearable-device-api

Publications


Tim Moesgen, Ramyah Gowrishankar, and Yu Xiao. 2024. Designing Beyond Hot and Cold – Exploring Full-Body Heat Experiences in Sauna. In Proceedings of the Eighteenth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '24). Association for Computing Machinery, New York, NY, USA, Article 16, 1–14. https://doi.org/10.1145/3623509.3633364

T. A. Pham, T. Moesgen, S. Siltanen, J. Bergström and Y. Xiao, "ARiana: Augmented Reality Based In-Situ Annotation of Assembly Videos," in IEEE Access, vol. 10, pp. 111704-111724, 2022, doi: https://doi.org/10.1109/ACCESS.2022.3216015

T. A. Pham, J. Wang, R. Iyengar, Y. Xiao, P. Pillai, R. Klatzky and M. Satyanarayanan, "Ajalon: Simplifying the Authoring of Wearable Cognitive Assistants," in Software: Practice and Experience, 25 pages, 2021, doi: https://doi.org/10.1002/spe.2987

N. Savela, A. Oksanen, M. Kaakinen, M. Noreikis and Y. Xiao, "Does Augmented Reality Affect Sociability, Entertainment, and Learning? A Field Experiment," in Applied Sciences, vol. 10, no. 4, art. no. 1392, 15 pages, February 2020, doi: https://doi.org/10.3390/app10041392

C. F. S. Leite and Y. Xiao, "Optimal sensor channel selection for resource-efficient deep activity recognition," in Proceedings of the International Conference on Information Processing in Sensor Networks (IPSN 2021), May 2021, art. no. 10, 13 pages, doi: https://doi.org/10.1145/3412382.3458278

C. F. S. Leite and Y. Xiao, "Improving Cross-Subject Activity Recognition via Adversarial Learning," in IEEE Access, vol. 8, pp. 90542-90554, May 2020, doi: https://doi.org/10.1109/ACCESS.2020.2993818

E. Pouta and J. Mikkonen, "Hand Puppet As Means for eTextile Synthesis," in Proceedings of the 13th International Conference on Tangible, Embedded, and Embodied Interaction (TEI ’19). ACM, New York, NY, USA, 415-421. https://doi.org/10.1145/3294109.3300987
