Many applications can be imagined for a system that processes sensory information collected during telemanipulation tasks in order to automatically identify properties of the remote environment. These applications include generating model-based simulations for training operators in critical procedures and improving real-time performance in unstructured environments or when time delays are large. This paper explores the research issues involved in developing such an identification system, focusing on properties that can be identified from remote manipulator motion and force data. As a case study, a simple block-stacking task, performed with a teleoperated two-fingered planar hand, is considered. An algorithm is presented that automatically segments the data collected during the task, given only a general description of the temporal sequence of task events. Using the segmented data, the algorithm then successfully estimates the weight, width, height, and coefficient of friction of the two blocks handled during the task. These estimates are used to calibrate a virtual model incorporating visual and haptic feedback. The case study highlights the broader research issues that must be addressed in automatic property identification.
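The segment-then-estimate idea can be illustrated with a minimal sketch. The snippet below is purely hypothetical and is not the paper's algorithm: it segments a synthetic vertical force trace with a simple contact threshold (whereas the paper's method is driven by a described sequence of task events), then estimates a block's weight as the mean vertical force over the held phase. All signal names, thresholds, and data are invented for illustration.

```python
import numpy as np

def segment_hold_phase(fz, threshold=0.5):
    """Crude segmentation: return (start, end) indices of the span where the
    vertical force magnitude exceeds a contact threshold. Hypothetical stand-in
    for event-sequence-based segmentation."""
    idx = np.flatnonzero(np.abs(fz) > threshold)
    if idx.size == 0:
        return None
    return int(idx[0]), int(idx[-1])

def estimate_weight(fz, span):
    """Estimate weight as the mean vertical force over the held phase."""
    start, end = span
    return float(np.mean(np.abs(fz[start:end + 1])))

# Synthetic trace: no contact, then holding a 2.0 N block, then release,
# with small additive sensor noise.
rng = np.random.default_rng(0)
fz = np.concatenate([np.zeros(100), np.full(200, 2.0), np.zeros(100)])
fz += rng.normal(0.0, 0.02, fz.size)

span = segment_hold_phase(fz)
weight = estimate_weight(fz, span)
print(f"held phase: {span}, estimated weight: {weight:.2f} N")
```

In practice a bare threshold is fragile; the appeal of the approach described in the abstract is that a known temporal ordering of task events constrains where each phase boundary can fall, making segmentation robust to noise and brief force transients.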
