
Investigation of auditory-visual integration in VR Environments

Dellepiane M;
2007

Abstract

Investigating the temporal and spatial constraints under which visual and auditory stimuli are perceived as a single percept, or as spatially coincident, has been the topic of numerous studies. However, these findings have so far been derived in extremely simplified stimulation contexts, consisting of combinations of elementary auditory and visual stimuli usually presented in dark and anechoic conditions. The present experiment is conducted in a VR environment using a passive stereoscopic display and binaural audio rendering. Subjects have to indicate the point of subjective spatial alignment (PSSA) between a horizontally moving visual stimulus that crosses the direction of a stationary sound. Auditory stimuli are presented over headphones using individualized head-related transfer functions, and the visual stimulus is embedded in a visual background texture in order to convey perspective. Two types of audio stimuli are used to evaluate the influence of auditory localisation acuity on auditory-visual integration: periodic white noise bursts, which provide optimal localisation cues, and periodic 1 kHz tone bursts. The present study will indicate whether previous findings (Lewald et al., Behavioural Brain Research, 2001) still hold in more complex audio-visual contexts such as those offered by cutting-edge VR environments.
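As a rough illustration of the binaural rendering and the two stimulus types described above, the following Python sketch builds periodic white noise bursts and periodic 1 kHz tone bursts, then renders one of them by convolving it with left/right head-related impulse responses (HRIRs). The sample rate, burst timing, and the toy HRIRs are all assumptions for illustration; the abstract gives no such parameters, and the actual experiment used measured, individualized HRTFs.

```python
import numpy as np

FS = 44100  # sample rate in Hz (assumed; not stated in the abstract)

def burst_train(signal_fn, burst_dur=0.05, period=0.25, total_dur=1.0, fs=FS):
    """Periodic burst train: `burst_dur` s of signal every `period` s.
    All durations are illustrative placeholders."""
    n_total = int(total_dur * fs)
    n_burst = int(burst_dur * fs)
    out = np.zeros(n_total)
    for start in range(0, n_total, int(period * fs)):
        seg = signal_fn(min(n_burst, n_total - start), fs)
        out[start:start + len(seg)] = seg
    return out

def white_noise(n, fs):
    # Fixed seed so every burst repeats the same noise token.
    return np.random.default_rng(0).standard_normal(n)

def tone_1khz(n, fs):
    t = np.arange(n) / fs
    return np.sin(2 * np.pi * 1000.0 * t)

def binaural_render(mono, hrir_left, hrir_right):
    """Binaural rendering of a mono stimulus at a fixed direction:
    convolve it with the listener's left/right head-related impulse
    responses and stack the results into a two-channel signal."""
    return np.stack([np.convolve(mono, hrir_left),
                     np.convolve(mono, hrir_right)])

# Toy HRIRs standing in for the measured, individualized ones: a pure
# interaural time and level difference for a source off to one side.
itd_samples = 30                      # ~0.7 ms interaural delay at 44.1 kHz
hrir_l = np.zeros(64); hrir_l[0] = 1.0
hrir_r = np.zeros(64); hrir_r[itd_samples] = 0.6

noise = burst_train(white_noise)      # broadband: strong localisation cues
tone = burst_train(tone_1khz)         # narrowband 1 kHz: weaker cues
stereo = binaural_render(noise, hrir_l, hrir_r)
```

In the real setup the HRIRs would be the subject's own measured responses and the rendering would run in real time in the VR environment; this sketch only shows the offline signal path.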
Istituto di Scienza e Tecnologie dell'Informazione "Alessandro Faedo" - ISTI
Neuroscience
Crossmodal
Perception
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/85910