How do we perceive objects when what we see and what we touch are not in the same place? In a virtual environment, we observed that spatial delocation promotes visual dominance when participants judge the rotation angle of a hand-operated handle. The delocation of perceptual information thus appears to increase the weight of the dominant sense considerably, at the expense of the other. We relate this result to the design of teleoperation and virtual reality systems, in which visual and haptic sensory information typically originates in spatially distinct devices.
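
To make the notion of sensory "weight" concrete, here is a minimal sketch assuming the standard reliability-weighted cue-combination framework, which the abstract does not itself specify: the perceived rotation angle is modeled as a weighted average of the visual and haptic estimates, with each weight proportional to the inverse variance of that cue. All function names and numbers below are hypothetical, chosen only to illustrate how downweighting one cue pulls the combined estimate toward the other.

    # Hypothetical illustration of reliability-weighted visual-haptic
    # cue combination; the numbers are illustrative, not from the study.
    def combined_estimate(theta_visual, theta_haptic, var_visual, var_haptic):
        """Weighted average of two angle estimates (degrees). Each cue's
        weight is its inverse variance, normalized so the weights sum to 1."""
        w_visual = (1 / var_visual) / (1 / var_visual + 1 / var_haptic)
        w_haptic = 1 - w_visual
        return w_visual * theta_visual + w_haptic * theta_haptic

    # Colocated cues with equal reliability: both senses contribute equally.
    print(combined_estimate(30.0, 40.0, var_visual=4.0, var_haptic=4.0))   # 35.0
    # If delocation effectively degrades the haptic cue (larger variance),
    # the combined estimate shifts toward the visual angle.
    print(combined_estimate(30.0, 40.0, var_visual=4.0, var_haptic=36.0))  # 31.0

Under this assumed model, the reported increase in visual dominance would correspond to the visual weight approaching 1 when the two cues are spatially delocated.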
