With the introduction of minimally invasive techniques, surgeons must learn skills and procedures that are radically different from those of traditional open surgery. Training methods that were adequate when techniques and instrumentation changed relatively slowly may be neither efficient nor effective for teaching these substantially new procedures. Virtual environments are a promising new medium for such training.
This paper describes a testbed developed at the San Francisco, Berkeley, and Santa Barbara campuses of the University of California for research in understanding, assessing, and training surgical skills. The testbed includes virtual environments for training perceptual motor skills, spatial skills, and critical steps of surgical procedures. Novel technical elements of the testbed include a four-degree-of-freedom (DOF) haptic interface, a fast collision detection algorithm for detecting contact between rigid and deformable objects, and parallel processing of physical modeling and rendering. The major technical challenge in surgical simulation to be investigated using the testbed is the development of accurate, real-time methods for modeling deformable tissue behavior. Several simulations have been implemented in the testbed, including environments for assessing performance of basic perceptual motor skills, training the use of an angled laparoscope, and teaching critical steps of laparoscopic cholecystectomy, a common minimally invasive procedure. The major challenges of extending and integrating these tools for training are discussed.
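To make the modeling challenge concrete, the following is a minimal sketch of one common approach to real-time deformable tissue simulation: a mass-spring network integrated with semi-implicit Euler, plus a simple node-level contact test between a rigid spherical tool tip and the deformable mesh. All class and function names, parameter values, and the choice of a mass-spring model are illustrative assumptions for exposition; they are not the testbed's actual algorithms or interfaces.

```python
import math

def _dist(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

class MassSpringTissue:
    """Illustrative mass-spring tissue patch (not the testbed's implementation).

    Nodes carry mass; springs connect node pairs and pull toward the
    rest lengths recorded from the initial, undeformed mesh.
    """

    def __init__(self, positions, springs, stiffness=50.0, damping=2.0,
                 mass=1.0, fixed=()):
        self.pos = [list(p) for p in positions]
        self.vel = [[0.0, 0.0, 0.0] for _ in positions]
        # Each spring stores its rest length from the initial configuration.
        self.springs = [(i, j, _dist(positions[i], positions[j]))
                        for i, j in springs]
        self.k, self.c, self.m = stiffness, damping, mass
        self.fixed = set(fixed)  # node indices anchored in place

    def step(self, dt):
        forces = [[0.0, 0.0, 0.0] for _ in self.pos]
        for i, j, rest in self.springs:
            d = [pj - pi for pi, pj in zip(self.pos[i], self.pos[j])]
            length = math.sqrt(sum(x * x for x in d)) or 1e-9
            # Hooke's law along the spring axis; positive when stretched.
            f = self.k * (length - rest)
            for a in range(3):
                forces[i][a] += f * d[a] / length
                forces[j][a] -= f * d[a] / length
        for i in range(len(self.pos)):
            if i in self.fixed:
                continue
            for a in range(3):
                # Viscous damping on velocity; semi-implicit (symplectic)
                # Euler: velocity is updated first, then position uses it.
                acc = forces[i][a] / self.m - self.c * self.vel[i][a]
                self.vel[i][a] += acc * dt
                self.pos[i][a] += self.vel[i][a] * dt

def sphere_contacts(center, radius, positions):
    """Rigid spherical tool tip vs. tissue nodes: indices of penetrating nodes."""
    return [i for i, p in enumerate(positions) if _dist(p, center) < radius]
```

The trade-off this sketch illustrates is the one the paper identifies: mass-spring models are cheap enough for real-time haptic update rates but only approximate continuum tissue mechanics, and explicit integration constrains the usable stiffness and time step, which is why accurate real-time deformable modeling remains the central open problem.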