Abstract
This paper presents both an analysis of the requirements for user control over simulated locomotion and a new control technique designed to meet those requirements. The goal is to allow the user to move through virtual environments in a manner as similar as possible to walking through the real world. We approach this problem by examining the interrelationships between motion control and the other actions people use to act on, sense, and react to their environment. If the interactions between control actions and sensory feedback can be made comparable to those of actions in the real world, then there is hope of constructing an effective new technique. Once the analysis is developed, candidate solutions are reviewed against it. The analysis leads to a promising new design for a sensor-based virtual locomotion technique called Gaiter. The new technique allows users to direct their movement through virtual environments by stepping in place. The movement of a person's legs is sensed, and in-place walking is treated as a gesture indicating that the user intends to take a virtual step. More specifically, the movement of the user's legs determines the direction, extent, and timing of their movement through the virtual environment. Tying virtual locomotion to leg motion allows a person to step in any direction and to control the stride length and cadence of their virtual steps. The user can walk straight, turn in place, and turn while advancing. Motion is expressed in a body-centric coordinate system similar to that of actual stepping. The system can discriminate between gestural and actual steps, so both types of steps can be intermixed.
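To make the abstract's description more concrete, the sketch below illustrates one way a step gesture could be classified and mapped to body-centric virtual motion. It is a minimal illustration only, not the algorithm the paper specifies: the tracker interface, the threshold values, and the swing-direction mapping are all assumptions introduced here for clarity.

```python
import math
from dataclasses import dataclass

@dataclass
class FootSample:
    """One tracker reading for a foot: position in metres, time in seconds."""
    x: float
    y: float
    z: float  # height above the floor
    t: float

# Illustrative thresholds; not values taken from the paper.
LIFT_HEIGHT = 0.05      # foot counts as lifted above this height (m)
IN_PLACE_RADIUS = 0.15  # max horizontal travel for an in-place step (m)

def classify_step(samples: list[FootSample]) -> str:
    """Classify one lift-to-plant foot trajectory.

    Returns "virtual" when the foot rises but lands near where it left
    the floor (a stepping-in-place gesture), "actual" when the foot is
    displaced horizontally, and "none" when no lift occurred.
    """
    if not samples or max(s.z for s in samples) < LIFT_HEIGHT:
        return "none"
    start, end = samples[0], samples[-1]
    travel = math.hypot(end.x - start.x, end.y - start.y)
    return "virtual" if travel < IN_PLACE_RADIUS else "actual"

def virtual_step_vector(samples: list[FootSample], body_heading: float,
                        stride: float = 0.7) -> tuple[float, float]:
    """Map an in-place step to a body-centric displacement.

    The direction of the foot's swing at its highest point, taken
    relative to the body heading (radians), sets the direction of the
    virtual step; `stride` sets its length.
    """
    peak = max(samples, key=lambda s: s.z)
    dx, dy = peak.x - samples[0].x, peak.y - samples[0].y
    swing = math.atan2(dy, dx) - body_heading
    return (stride * math.cos(swing), stride * math.sin(swing))
```

Because classification depends only on the net horizontal travel of the foot, actual steps (which displace the foot) and gestural steps (which return it to roughly the same spot) can be intermixed, as the abstract describes; the specific thresholds and stride scaling would have to be tuned for a real sensing setup.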