Abstract
We present a generalizable method for unified multitouch detection and response on various nonparametric and parametric surfaces to support interactive physical-virtual experiences. The method employs multiple infrared (IR) cameras, one or more projectors, IR light sources, and a rear-projection surface. IR light reflected off human fingers is captured by the cameras through matched IR pass filters, enabling the detection and localization of multiple simultaneous finger-touch events. Processing of these events is tightly coupled with the rendering system, which produces auditory and visual responses displayed on the surface by the projector(s), yielding a responsive, interactive physical-virtual experience. We demonstrate the method on two nonparametric head-shaped surfaces and a planar surface, and we illustrate its applicability in an interactive medical training scenario, using one of the head-shaped surfaces to support hands-on, touch-sensitive training with dynamic physical-virtual patient behavior.
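For intuition, the per-camera touch-detection step summarized above can be sketched as a bright-blob pipeline: fingers near the surface reflect IR light and appear as compact bright regions in the filtered camera image. The following minimal, illustrative example uses OpenCV; the threshold and blob-area bounds are placeholder values, not those of the actual system.

import cv2

# Illustrative constants; real values depend on the camera, IR source, and surface.
THRESHOLD = 200                # intensity above which IR reflections are kept
MIN_AREA, MAX_AREA = 30, 400   # blob-area bounds to reject noise and palms

def detect_touches(ir_frame_gray):
    """Return (x, y) centroids of candidate finger-touch blobs."""
    _, binary = cv2.threshold(ir_frame_gray, THRESHOLD, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    touches = []
    for c in contours:
        area = cv2.contourArea(c)
        if MIN_AREA <= area <= MAX_AREA:
            m = cv2.moments(c)
            touches.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return touches

cap = cv2.VideoCapture(0)  # stands in for one IR camera with a matched IR pass filter
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for x, y in detect_touches(gray):
        # In the full system, these events would drive projector-rendered
        # auditory and visual responses on the surface.
        print(f"touch at image coordinates ({x:.1f}, {y:.1f})")

Note that this sketch covers only 2D detection in the image plane; on the nonparametric head-shaped surfaces, the detected coordinates would additionally need to be mapped onto the surface geometry.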