Stephen Webb
Presence: Teleoperators and Virtual Environments (2005) 14 (5): 501–510.
Published: 01 October 2005
Abstract
Immersive, multiprojector systems are a compelling alternative to traditional head-mounted displays and have been growing steadily in popularity. However, the vast majority of these systems have been confined to laboratories or other special-purpose facilities and have had little impact on general human-computer and human-human communication models. Cost, infrastructure requirements, and maintenance are all obstacles to the widespread deployment of immersive displays. We address these issues in the design and implementation of the Metaverse. The Metaverse system focuses on a multiprojector scalable display framework that supports automatic detection of devices as they are added to or removed from the display environment. Multiple cameras support calibration over wide fields of view for immersive applications with little or no input from the user. The approach is demonstrated on a 24-projector display environment that can be scaled on the fly, reconfigured, and redeployed according to user needs. Using our method, subpixel calibration is possible with little or no user input. Because little effort is required to install or reconfigure the projectors, rapid deployment of large, immersive displays in relatively unconstrained environments is feasible.
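Camera-assisted projector calibration of the kind the abstract describes is typically built on estimating a planar homography between each projector's pixel grid and a shared display coordinate frame, using matched point correspondences observed by the cameras. The abstract does not include the authors' calibration code, so the following is only an illustrative sketch of the standard direct linear transform (DLT) step; the function names, and the simplification to a single plain homography per projector, are assumptions for illustration rather than the Metaverse implementation:

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src points to dst points
    (e.g. projector pixels to display-frame coordinates) via DLT.

    Needs at least 4 correspondences; subpixel-accurate matches yield
    a correspondingly accurate mapping."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on H.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the right singular vector of A with the
    # smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2,2] == 1

def apply_homography(H, pts):
    """Map 2D points through H using homogeneous coordinates."""
    pts = np.asarray(pts, dtype=float)
    ph = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return ph[:, :2] / ph[:, 2:3]
```

In a multiprojector setting, composing each projector-to-camera homography with a camera-to-display homography chains every device into one display frame, which is how overlapping projectors can be blended and reconfigured without manual alignment.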