It has been demonstrated that recalibration of audio-visual asynchrony is likely to occur in sensory processing rather than in higher cognitive domains of the brain. The aim of the present study was to investigate recalibration of time perception in judging the simultaneity of auditory and visual inputs using a virtual environment (VE). A virtual corridor built for this experiment provides depth of field and includes six light sources (light-emitting diodes, LEDs) affixed to a computer monitor so that they appear to be situated at different distances. Subjects in the VE were presented with flashes of the LEDs together with associated bursts of white noise at random stimulus onset asynchronies (SOAs). Even though the auditory and visual stimuli were presented from the same distance on the display device, subjects showed different time recalibration effects (TREs) depending on their tendency toward immersion in the VE. The results suggest that the differences in TREs can be explained by subject-specific tendencies such as absorption in stimuli, which can construct subjective reality through top-down processing. Future research on the neural substrates of recalibration for simultaneity will contribute to our understanding of how the brain creates representations of spatiotemporal coherence.