This paper presents the creation of an assembly simulation environment with multisensory (auditory and visual) feedback, and evaluates the effects of auditory and visual feedback on task performance in the context of assembly simulation in a virtual environment (VE). The VE experimental platform integrates complex technologies, such as constraint-based assembly simulation, optical motion tracking, and real-time 3D sound generation, around a virtual reality workbench and a common software platform. A peg-in-a-hole task and a Sener electronic box assembly task were used as the test cases for a human factors experiment with sixteen participants. Both objective performance data (i.e., task completion time, TCT, and human performance error rate, HPER) and subjective opinions (i.e., questionnaires) on the use of auditory and visual feedback in a virtual assembly environment (VAE) were gathered from the experiment. Results showed that introducing auditory and/or visual feedback into the VAE improved assembly task performance. They also indicated that integrated feedback (auditory plus visual) yielded better assembly task performance than either form of feedback used in isolation. Most participants preferred integrated feedback to either individual feedback (auditory or visual) or no feedback. Participants' comments indicated that unrealistic or inappropriate feedback degraded task performance and easily caused frustration.