Since its early use in the remote manipulation of radioactive materials, the field of teleoperation has expanded its scope to include manipulation at different scales and in virtual worlds. By manipulating a proximal master device, an operator controls the action of a distant or inaccessible slave device. Applications are expected in space and undersea exploration and servicing, forestry and mining, microsurgery and microassembly, and computer-user interfaces.
The goal of teleoperation is to achieve transparency by mimicking human motor and sensory functions. Within the relatively narrow scope of manipulating a tool, transparency is achieved if the operator cannot distinguish between maneuvering the master device and maneuvering the actual tool.
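One common way to formalize this notion (an assumption of standard teleoperation analysis, not stated explicitly above) is in terms of mechanical impedances: if $Z_e(s)$ is the impedance of the environment felt at the slave and $Z_t(s)$ is the impedance transmitted to the operator's hand through the master, ideal transparency requires

\[
Z_t(s) = Z_e(s),
\]

equivalently, that the positions and forces at the two ends coincide, $x_m = x_s$ and $f_m = f_s$, so the operator feels exactly what the tool feels.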
The ability of a teleoperation system to provide transparency depends largely on the performance of the master, a computer-controlled electromechanical interface. When this interface provides kinesthetic or tactile feedback to the user, it is called a haptic interface. Ideally, the haptic interface should be able to emulate any environment encountered by the tool, from free space to infinitely stiff obstacles. Its performance depends on its electromechanical design and the algorithms used to control it.
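As a minimal sketch of what such emulation involves, the classic "virtual wall" illustrates the two extremes mentioned above: in free space the interface commands zero force, while inside a stiff obstacle it renders a spring-damper penalty force each servo tick. All names here are hypothetical, not drawn from any particular device API.

```python
def wall_force(x, v, k=2000.0, b=5.0):
    """Force (N) to command for device position x (m) and velocity v (m/s).

    The virtual wall occupies x < 0. Stiffness k (N/m) and damping b
    (N*s/m) are illustrative values; achievable stiffness is limited by
    the device's electromechanical design and servo rate.
    """
    if x >= 0.0:
        # Outside the wall: free space, so no force is rendered.
        return 0.0
    # Inside the wall: stiff spring pushing out, plus damping to
    # dissipate energy and keep the rendering stable.
    return -k * x - b * v
```

In a real haptic loop this function would run at a high, fixed servo rate (commonly on the order of 1 kHz), reading the encoder-derived position and velocity and writing the resulting force to the motor amplifiers.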