The perception and control of stance are frequently studied in virtual environments, where computer-generated videographic displays are used to simulate the optical consequences of body sway. Generally, the intent of such studies is to understand how posture is controlled outside of virtual environments (i.e., in daily life). Accordingly, the validity of such studies depends on the extent to which postural responses to videographically generated optical flow resemble postural responses to optical flow that is generated in other ways. We conducted a direct test of postural responses to optical flow generated using two technologies: physical displacement of the visible surroundings (using a moving room), and videographic projection of computer-generated graphics. We attempted to make the two displays as similar as possible in terms of visual angle, optical texture, and the amplitude and frequency of oscillation. The results revealed several significant differences in postural responses to the two types of stimuli. These differences raise questions about the extent to which findings on postural control in virtual environments can be generalized to postural control in the real world.
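The matching of the two displays described above can be made concrete with a minimal sketch. The abstract does not report the actual stimulus parameters, so the amplitude, frequency, display width, and viewing distance below are purely hypothetical; the code only illustrates the idea of driving both displays with an identical oscillation signal and equating the visual angle they subtend.

```python
import math

def oscillation_timeseries(amplitude_m, frequency_hz, duration_s, sample_rate_hz=60.0):
    """Sinusoidal anterior-posterior displacement of a display over time (in meters)."""
    n = int(duration_s * sample_rate_hz)
    return [amplitude_m * math.sin(2 * math.pi * frequency_hz * t / sample_rate_hz)
            for t in range(n)]

def visual_angle_deg(stimulus_width_m, viewing_distance_m):
    """Full horizontal visual angle subtended by a flat display of a given width."""
    return 2 * math.degrees(math.atan(stimulus_width_m / (2 * viewing_distance_m)))

# Hypothetical parameters: 0.05 m amplitude at 0.2 Hz; a 2 m wide display viewed at 1 m.
room_motion = oscillation_timeseries(0.05, 0.2, duration_s=10.0)
screen_motion = oscillation_timeseries(0.05, 0.2, duration_s=10.0)

# Both displays receive the identical driving signal...
assert room_motion == screen_motion
# ...and subtend the same visual angle (here, 90 degrees).
print(round(visual_angle_deg(2.0, 1.0), 1))
```

Even with the driving signals and visual angles equated in this way, the study reports that postural responses to the two display technologies still differed.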