Real-time auditory-visual distance rendering for a virtual reaching task
- 5 November 2007
- proceedings article
- Published by Association for Computing Machinery (ACM)
- pp. 179-182
- https://doi.org/10.1145/1315184.1315217
Abstract
This paper reports on a study of the perception and rendering of distance in multimodal virtual environments. A model for binaural sound synthesis is discussed, and its integration into a real-time system with motion tracking and visual rendering is presented. Results from a validation experiment show that the model effectively simulates the relevant auditory cues for distance perception in dynamic conditions. The model is then used in a subsequent experiment on the perception of egocentric distance. The design and preliminary results of this experiment are discussed.
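The dominant auditory cue for egocentric distance is intensity, which in free-field conditions falls off inversely with source distance (roughly -6 dB per doubling). As a minimal sketch of this cue only (not the paper's binaural synthesis model, which also handles near-field head-related effects), a distance-dependent gain might look like:

```python
import math

def distance_gain(distance_m, ref_distance_m=1.0, min_distance_m=0.1):
    """Inverse-distance amplitude scaling (1/r law): the gain halves,
    i.e. drops ~6 dB, for each doubling of source distance.
    min_distance_m clamps the gain to avoid blow-up near the head."""
    d = max(distance_m, min_distance_m)
    return ref_distance_m / d

def distance_gain_db(distance_m, ref_distance_m=1.0):
    """Same attenuation expressed in decibels relative to ref_distance_m."""
    return 20.0 * math.log10(distance_gain(distance_m, ref_distance_m))
```

In a real-time renderer this gain would be recomputed per audio block from the tracked listener-source distance; the function names and the clamping distance here are illustrative assumptions.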