Abstract
Studies in virtual reality (VR) have introduced numerous multisensory simulation techniques for more immersive VR experiences. However, these techniques primarily focus on expanding the range of sensory types or improving the quality of individual senses, and there is little consensus on how to design appropriate interactions between different sensory stimuli. This paper explores how the congruence between auditory and visual (AV) stimuli, the sensory stimuli typically provided by VR devices, affects the cognition and experience of VR users as a critical interaction factor in promoting multisensory integration. We defined the types of (in)congruence between AV stimuli and designed 12 virtual spaces with different types or degrees of AV congruence. We then evaluated presence, immersion, motion sickness, and cognitive changes in each space. We observed the following key findings: 1) there is a limit to the degree of temporal or spatial incongruence that can be tolerated, with few negative effects on user experience until that limit is exceeded; 2) users are tolerant of semantic incongruence; and 3) simulations that account for synesthetic congruence contribute to the user's sense of immersion and presence. Based on these insights, we identified essential considerations for designing sensory simulations in VR and proposed future research directions.
| Original language | English |
| --- | --- |
| Pages (from-to) | 2080-2090 |
| Number of pages | 11 |
| Journal | IEEE Transactions on Visualization and Computer Graphics |
| Volume | 28 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - 2022 May 1 |
Bibliographical note
Publisher Copyright: © 1995-2012 IEEE.
All Science Journal Classification (ASJC) codes
- Software
- Signal Processing
- Computer Vision and Pattern Recognition
- Computer Graphics and Computer-Aided Design