Search results for Kees van den Doel (1–3 of 3)
Presence: Teleoperators and Virtual Environments (2007) 16 (6): 643–654.
Published: 01 December 2007
Abstract
We describe a methodology for virtual reality designers to capture and resynthesize the variations in sound made by objects when we interact with them through contact such as touch. The timbre of contact sounds can vary greatly, depending on both the listener’s location relative to the object and the interaction point on the object itself. We believe that an accurate rendering of this variation greatly enhances the feeling of immersion in a simulation. To do this, we model the variation with an efficient algorithm based on modal synthesis. This model contains a vector field that is defined on the product space of contact locations and listening positions around the object. The modal data are sampled on this high-dimensional space using an automated measuring platform. A parameter-fitting algorithm is presented that recovers the parameters from a large set of sound recordings around objects and creates a continuous timbre field by interpolation. The model is subsequently rendered in a real-time simulation with integrated haptic, graphic, and audio display. We describe our experience with an implementation of this system and an informal evaluation of the results.
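
The modal model underlying this approach can be made concrete with a short sketch. The code below is not from the paper; it is a minimal Python illustration of modal synthesis with per-mode gains interpolated over measurement points, using a hypothetical inverse-distance scheme in place of the paper's fitted continuous timbre field. All names and parameters are placeholders.

    import numpy as np

    def modal_impulse_response(freqs, dampings, gains, sr=44100, dur=1.0):
        # Standard modal model: a sum of exponentially damped sinusoids.
        t = np.arange(int(sr * dur)) / sr
        y = np.zeros_like(t)
        for f, d, a in zip(freqs, dampings, gains):
            y += a * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
        return y

    def interpolated_gains(samples, query, k=3):
        # samples: list of (point, gain_vector) pairs, where each point lies
        # in the product space of contact location and listening position.
        # query: a point in that same space. Inverse-distance weighting over
        # the k nearest measurements stands in for the paper's timbre field.
        pts = np.array([p for p, _ in samples])
        gains = np.array([g for _, g in samples])
        dist = np.linalg.norm(pts - np.asarray(query), axis=1)
        if dist.min() < 1e-9:            # query coincides with a measurement
            return gains[dist.argmin()]
        idx = np.argsort(dist)[:k]
        w = 1.0 / dist[idx] ** 2
        return (w[:, None] * gains[idx]).sum(axis=0) / w.sum()

Feeding the interpolated gain vector into modal_impulse_response then yields a contact sound whose timbre varies continuously with both the contact point and the listening position, which is the effect the abstract describes.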
Presence: Teleoperators and Virtual Environments (2004) 13 (1): 99–111.
Published: 01 February 2004
Abstract
We demonstrate a method for efficiently rendering the audio generated by graphical scenes with a large number of sounding objects. This is achieved by using modal synthesis for rigid bodies and rendering only those modes that we judge to be audible to a user observing the scene. We show how excitations of modes can be estimated and inaudible modes eliminated based on the masking characteristics of the human ear. We describe a novel technique for generating contact events by performing closed-form particle simulation and collision detection with the aid of programmable graphics hardware. The effectiveness of our system is shown in the context of suitably complex simulations.
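
A rough sense of the mode-culling step can be given in a few lines. This is not the paper's masking model; it is a hypothetical simplification that keeps a mode only if its excited amplitude is within a fixed dB window of the strongest mode and above an absolute floor, whereas the paper derives thresholds from the masking characteristics of the human ear.

    import numpy as np

    def audible_mode_indices(excitations, mask_db=30.0, floor=1e-6):
        # excitations: estimated per-mode excitation amplitudes for one event.
        # A mode is kept if it is loud enough in absolute terms and not
        # plausibly masked by the strongest mode (crude stand-in rule).
        amps = np.abs(np.asarray(excitations, dtype=float))
        rel_db = 20.0 * np.log10(amps / amps.max() + 1e-30)
        keep = (amps > floor) & (rel_db > -mask_db)
        return np.flatnonzero(keep)

With many sounding objects in a scene, synthesizing only the surviving modes is what keeps the per-sample cost bounded.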
Presence: Teleoperators and Virtual Environments (1998) 7 (4): 382–395.
Published: 01 August 1998
Abstract
We propose a general framework for the simulation of sounds produced by colliding physical objects in a virtual reality environment. The framework is based on the vibration dynamics of bodies. The computed sounds depend on the material of the body, its shape, and the location of the contact. This simulation of sounds allows the user to obtain important auditory cues about the objects in the simulation, as well as about the locations of the collisions on the objects. Specifically, we show how to compute (1) the spectral signature of each body (its natural frequencies), which depends on the material and the shape, (2) the “timbre” of the vibration (the relative amplitudes of the spectral components) generated by an impulsive force applied to the object at a grid of locations, (3) the decay rates of the various frequency components, which correlate with the type of material through its internal friction parameter, and finally (4) the mapping of sounds onto the object's geometry for real-time rendering of the resulting sound. The framework has been implemented in a Sonic Explorer program, which simulates a room with several objects such as a chair, tables, and rods. After a preprocessing stage, the user can hit the objects at different points to interactively produce realistic sounds.
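
The four quantities the abstract enumerates combine naturally into one small synthesis routine. The sketch below is our paraphrase, not the Sonic Explorer code; in particular, the internal-friction relation used here (decay rate proportional to frequency) is a common simplification that we assume, not a formula quoted from the paper.

    import numpy as np

    def contact_sound(freqs, amps_at_point, friction, sr=44100, dur=0.5):
        # freqs:         natural frequencies (the body's spectral signature)
        # amps_at_point: relative mode amplitudes for one impact location
        #                (the "timbre" at that grid point)
        # friction:      internal-friction parameter of the material; the
        #                decay rate of mode n is assumed here to be
        #                pi * f_n * friction
        t = np.arange(int(sr * dur)) / sr
        y = np.zeros_like(t)
        for f, a in zip(freqs, amps_at_point):
            decay = np.pi * f * friction
            y += a * np.exp(-decay * t) * np.sin(2 * np.pi * f * t)
        return y

Mapping such per-point amplitude vectors onto the object's geometry then lets a real-time renderer look up the timbre for any hit location, which is the fourth step the abstract describes.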