This paper proposes a method to present information from visual and laser sensors simultaneously and coherently through an augmented reality visualization interface, further enhanced by stereoscopic viewing. Graphical objects are used to represent proximity measurements, which are superimposed on and suitably aligned with the video stream through image processing. This methodology enables an operator to quickly comprehend scene layout and dynamics and to respond in an accurate and timely manner; the resulting human-robot interaction is therefore expected to be intuitive, accurate, and fast. The use of graphical elements to assist teleoperation, occasionally discussed in the literature, is here developed following a systematic approach and builds on the authors' previous work on stereoscopic teleoperation. The approach is tested on a real telerobotic system in which a user operates a robot located approximately 3,000 kilometers away. The results of a pilot test were very encouraging: they demonstrate the simplicity and effectiveness of the proposed approach and provide a basis for further investigation.
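The core operation the abstract describes, aligning proximity measurements with the camera view and painting them as graphical overlays, can be sketched as follows. This is an illustrative example only, not the paper's implementation: the pinhole camera parameters, the simplifying assumption that the laser and camera frames coincide, and the near/far colour mapping are all hypothetical.

```python
import numpy as np

def project_laser_to_image(ranges, angles, fx, fy, cx, cy, sensor_height=0.0):
    """Project planar laser points (range, bearing) into image pixels with a
    pinhole model. Assumes the laser and camera frames coincide, which is a
    simplification; a real system needs an extrinsic calibration step."""
    # Camera coordinates: x right, y down, z forward (optical axis).
    x = ranges * np.sin(angles)
    z = ranges * np.cos(angles)
    y = np.full_like(x, sensor_height)
    valid = z > 0.1                       # discard points behind/too close
    u = fx * x[valid] / z[valid] + cx
    v = fy * y[valid] / z[valid] + cy
    return np.stack([u, v], axis=1), z[valid]

def overlay_markers(frame, pixels, depths, near=0.5, far=5.0):
    """Superimpose proximity markers on an RGB frame: red for near
    obstacles fading to green for far ones, drawn as 3x3 squares."""
    h, w, _ = frame.shape
    t = np.clip((depths - near) / (far - near), 0.0, 1.0)
    for (u, v), s in zip(pixels.astype(int), t):
        if 1 <= u < w - 1 and 1 <= v < h - 1:
            color = np.array([int(255 * (1 - s)), int(255 * s), 0],
                             dtype=np.uint8)
            frame[v - 1:v + 2, u - 1:u + 2] = color
    return frame
```

In the paper's setting the aligned overlay would be rendered per eye for stereoscopic viewing; the sketch above covers only the monocular projection and colour-coding step.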
|Title:||Augmented Reality Stereoscopic Visualization for Intuitive Robot Teleguide|
|Publication date:||2010|
|Appears in type:||4.1 Contribution in conference proceedings|