Real-time auditory-visual distance rendering for a virtual reaching task

Abstract
This paper reports on a study of the perception and rendering of distance in multimodal virtual environments. A model for binaural sound synthesis is discussed, and its integration into a real-time system with motion tracking and visual rendering is presented. Results from a validation experiment show that the model effectively simulates the relevant auditory cues for distance perception under dynamic conditions. The model is then used in a subsequent experiment on the perception of egocentric distance. The design and preliminary results of this experiment are discussed.
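The abstract only names the binaural rendering model; as an illustrative sketch (not the paper's actual implementation), the snippet below shows the simplest dynamic auditory distance cue such a system must reproduce: an inverse-distance amplitude law, updated per audio block from the motion-tracked listener and source positions. The function names and parameters here are hypothetical.

```python
import numpy as np

def distance_gain(src_pos, listener_pos, r_ref=1.0, r_min=0.1):
    """Inverse-distance amplitude law: gain = r_ref / r, clamped near the head.

    src_pos, listener_pos: 3-vectors in metres (e.g. from a motion tracker).
    r_ref: reference distance at which gain is unity.
    r_min: lower clamp to avoid divergence when the source reaches the head.
    """
    r = np.linalg.norm(np.asarray(src_pos) - np.asarray(listener_pos))
    return r_ref / max(r, r_min)

# Per-block update, as a real-time renderer would do each audio callback.
fs = 44100
block = np.sin(2 * np.pi * 440 * np.arange(512) / fs)  # dummy audio block
gain = distance_gain(src_pos=[0.0, 0.0, 2.0], listener_pos=[0.0, 0.0, 0.0])
rendered = gain * block  # source at 2 m is attenuated by 1/2 relative to 1 m
```

A full binaural model would add direction-dependent filtering (e.g. HRTFs) and further distance cues such as the direct-to-reverberant energy ratio; the intensity law above is only the dominant first-order cue.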