Larry Gritz
Journal Articles
Publisher: Journals Gateway
Presence: Teleoperators and Virtual Environments (1998) 7 (1): 67–77.
Published: 01 February 1998
Abstract
Sounds are often the result of motions of virtual objects in a virtual environment. Therefore, sounds and the motions that caused them should be treated in an integrated way. When sounds and motions do not have the proper correspondence, the resultant confusion can lessen the effects of each. In this paper, we present an integrated system for modeling, synchronizing, and rendering sounds for virtual environments. The key idea of the system is the use of a functional representation of sounds, called timbre trees. This representation is used to model sounds that are parameterizable. These parameters can then be mapped to the parameters associated with the motions of objects in the environment. This mapping allows the correspondence of motions and sounds in the environment. Representing arbitrary sounds using timbre trees is a difficult process that we do not address in this paper. We describe approaches for creating some timbre trees, including the use of genetic algorithms. Rendering the sounds in an aural environment is achieved by attaching special environmental nodes, which represent the attenuation and delay as well as the listener effects, to the timbre trees. These trees are then evaluated to generate the sounds. The system that we describe runs in parallel in real time on an eight-processor SGI Onyx. We see the main contribution of the present system as a conceptual framework in which to consider sound and motion in an integrated virtual environment.
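The timbre-tree idea lends itself to a small illustration. The following is a minimal sketch, assuming a node interface in which every node is evaluated as a function of time and a dictionary of motion-driven parameters; the class names (Sine, Scale, Attenuate) and the parameter names are hypothetical and not taken from the paper.

```python
import math

# Hypothetical sketch of a timbre tree: each node is evaluated as a function
# of time t and a dictionary of motion-driven parameters. Names are
# illustrative only, not the paper's actual API.

class Sine:
    """Leaf node: a sine oscillator whose frequency is a motion parameter."""
    def __init__(self, freq_param):
        self.freq_param = freq_param
    def evaluate(self, t, params):
        return math.sin(2.0 * math.pi * params[self.freq_param] * t)

class Scale:
    """Interior node: scales a child signal by a motion-driven amplitude."""
    def __init__(self, amp_param, child):
        self.amp_param = amp_param
        self.child = child
    def evaluate(self, t, params):
        return params[self.amp_param] * self.child.evaluate(t, params)

class Attenuate:
    """Environmental node: distance attenuation and propagation delay,
    attached above the timbre tree for a given listener."""
    def __init__(self, child, speed_of_sound=343.0):
        self.child = child
        self.c = speed_of_sound
    def evaluate(self, t, params):
        d = params["distance"]          # listener-to-source distance (m)
        delay = d / self.c              # propagation delay (s)
        gain = 1.0 / max(d, 1.0)        # simple 1/r attenuation
        return gain * self.child.evaluate(t - delay, params)

# A collision event might map impact speed to amplitude and pitch:
tree = Attenuate(Scale("impact_amp", Sine("impact_pitch")))
params = {"impact_amp": 0.8, "impact_pitch": 440.0, "distance": 5.0}
samples = [tree.evaluate(n / 44100.0, params) for n in range(44100)]
```

Because the tree is just a composition of functions, the same structure can be re-evaluated with different parameter values as the motion evolves, which is what ties the sound to the motion that caused it.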
Journal Articles
Publisher: Journals Gateway
Presence: Teleoperators and Virtual Environments (1993) 2 (4): 353–360.
Published: 01 November 1993
Abstract
Virtual environment research involves a number of related problems from a variety of domains. A joint research effort at the George Washington University and the Naval Research Laboratory is bringing together issues from these domains to study the factors that contribute to an integrated virtual environment. The research can be divided into three general categories: human factors, motion control, and sound synchronization. Human factors issues involve the development of new paradigms for movement and navigation, essential for the performance of general tasks in virtual spaces. Novel approaches to motion control are being explored to help users of virtual environments interact with and control virtual objects. This involves both interactive control and automation through evolutionary approaches. The sounds generated as a result of these motions are modeled with compositional techniques to parameterize and synchronize them to the events in the environment. The research is being approached both from a fundamental point of view typical of an academic environment and from an application-oriented point of view of interest to the Navy. The cooperative relationship has benefited both the George Washington University and the Naval Research Laboratory.
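To illustrate the synchronization side described above, the following is a minimal sketch of how a motion simulation might emit sound events whose parameters are derived from the motion state. The SoundEvent type, the bouncing-ball dynamics, and the specific parameter mappings are assumptions for illustration, not the project's actual interfaces.

```python
from dataclasses import dataclass

# Hypothetical sketch: a simulation step detects a collision and emits a
# sound event whose parameters come from the motion state, so the sound
# starts at the moment of (and is shaped by) the impact.

@dataclass
class SoundEvent:
    start_time: float      # simulation time at which the sound begins (s)
    impact_amp: float      # amplitude derived from impact speed
    impact_pitch: float    # pitch derived from impact speed

def step(t, ball_y, ball_vy, events, dt=1.0 / 60.0, g=-9.8):
    """Advance a bouncing ball one frame; queue a sound event on impact."""
    ball_vy += g * dt
    ball_y += ball_vy * dt
    if ball_y <= 0.0 and ball_vy < 0.0:          # collision with the floor
        speed = abs(ball_vy)
        events.append(SoundEvent(start_time=t + dt,
                                 impact_amp=min(1.0, speed / 10.0),
                                 impact_pitch=440.0 + 20.0 * speed))
        ball_y, ball_vy = 0.0, -0.7 * ball_vy    # restitution
    return ball_y, ball_vy

events = []
y, vy, t = 2.0, 0.0, 0.0
for _ in range(600):                             # ten seconds at 60 Hz
    y, vy = step(t, y, vy, events)
    t += 1.0 / 60.0
```

Each queued event could then drive a parameterized sound model (such as the timbre trees sketched above), keeping the audible result in step with the motion that produced it.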