A recent article published in the International Journal of Social Robotics introduced a novel robotic avatar system that allows a human operator to remotely control a humanoid robot. This system aims to provide the operator with sensory feedback, enabling them to experience the robot's surroundings in real time.
Background
Robotic avatar systems are robots that operate in synchronization with their human operators. The operator and the robot communicate via a network, allowing the robot to replicate the operator's voice and movements in real time. These systems have many applications, including performing tasks in hazardous or inaccessible locations, assisting individuals with disabilities, and enhancing social interactions and entertainment. A key challenge in developing these systems, however, is delivering intuitive teleoperation and a convincing sense of telepresence at the same time.
About the Research
In this research, a novel robotic avatar system was developed, centered on a custom humanoid robot named TOCABI. This system aims to enhance immersive teleoperation and telepresence experiences for users. It is composed of two primary components: the operator station and the avatar robot itself.
Operator Station
The operator station is designed for remote control of TOCABI. It is equipped with a head-mounted display (HMD) that provides visual and auditory feedback to the operator, immersing them in the robot's environment. Haptic feedback devices are included to give force feedback, and haptic gloves are used to deliver tactile and kinesthetic feedback, enhancing the sense of touch. Additionally, pedals are incorporated to control the robot's mobile base. This setup allows the operator to control TOCABI’s upper body, hands, and mobility while receiving comprehensive sensory feedback, facilitating a rich interaction with the environment.
Avatar Robot - TOCABI
TOCABI, the humanoid robot, is a central piece of this system. It boasts 33 degrees of freedom and stands at a height of 1.65 meters. The robot is equipped with two head cameras that provide stereo vision, which is relayed back to the operator through the HMD. A wrist-mounted camera helps in object recognition tasks, such as identifying stones. TOCABI’s hands are designed with four fingers each, featuring force sensors and silicone fingertips to mimic human touch and interaction capabilities.
Voice communication is enabled through an integrated speaker and microphone, allowing direct interaction with people in the vicinity of the robot. Four mecanum wheels support TOCABI's mobility, allowing omnidirectional movement. The platform also includes a chair the robot can use for sitting tasks and a battery that supports prolonged operation.
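A four-mecanum-wheel base like TOCABI's achieves omnidirectional motion by combining individual wheel speeds. As a rough illustration of the standard inverse kinematics for such a base (the wheel radius and half-wheelbase values below are illustrative placeholders, not TOCABI's actual geometry, which the article does not specify):

```python
def mecanum_wheel_speeds(vx, vy, wz, r=0.075, lx=0.25, ly=0.25):
    """Inverse kinematics for a four-mecanum-wheel base.

    vx, vy: desired forward/sideways body velocities (m/s).
    wz: desired yaw rate (rad/s).
    r: wheel radius; lx, ly: half wheelbase/track (illustrative values).
    Returns angular speeds (rad/s) for the front-left, front-right,
    rear-left, and rear-right wheels.
    """
    k = lx + ly
    fl = (vx - vy - k * wz) / r
    fr = (vx + vy + k * wz) / r
    rl = (vx + vy - k * wz) / r
    rr = (vx - vy + k * wz) / r
    return fl, fr, rl, rr
```

Driving straight ahead spins all four wheels equally, while sideways motion spins diagonal wheel pairs in opposite directions, which is what gives mecanum wheels their omnidirectional capability.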
Connectivity and Software
The connection between the operator station and TOCABI is managed wirelessly via a router, ensuring agile and responsive control. Data transmission across this system is handled through the Transmission Control Protocol/Internet Protocol (TCP/IP) and is facilitated by the Robot Operating System (ROS), which is standard in robotic applications for managing device control and feedback. The system also incorporates several software tools to boost its capabilities: Unity3D for real-time visualization, OpenCV for advanced image processing, TurboJPEG for efficient image compression, Mumble for clear voice communication, and Win-Ros for seamless cross-platform data exchange.
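The article does not detail the exact wire format used over TCP/IP, but a common pattern for sending compressed camera frames or sensor packets across a TCP stream is length-prefixed framing, since TCP itself delivers an unbroken byte stream with no message boundaries. A minimal sketch of that generic technique (not the authors' actual protocol):

```python
import struct

def frame_message(payload: bytes) -> bytes:
    """Prefix a payload (e.g., a JPEG-compressed frame) with its
    4-byte big-endian length so the receiver can recover message
    boundaries from the TCP byte stream."""
    return struct.pack(">I", len(payload)) + payload

def read_frames(stream: bytes) -> list[bytes]:
    """Split a received byte buffer back into complete messages,
    ignoring any trailing partial frame."""
    frames, offset = [], 0
    while offset + 4 <= len(stream):
        (length,) = struct.unpack_from(">I", stream, offset)
        offset += 4
        if offset + length > len(stream):
            break  # incomplete frame; wait for more bytes
        frames.append(stream[offset:offset + length])
        offset += length
    return frames
```

In practice, middleware such as ROS handles this serialization and transport for the developer; the sketch simply shows what message framing over TCP involves at the byte level.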
This system represents a significant advancement in robotic teleoperation. It aims to bridge the gap between remote operators and physical environments, providing a comprehensive and interactive experience that could be adapted for various applications, such as remote assistance, hazardous environment exploration, and interactive learning.
Research Findings
The study evaluated a new robotic avatar system, assessing its performance through user studies and a competition demonstration. The user studies included 10 participants who performed three tasks: activating a switch, lifting a drill, and recognizing a stone. Results indicated that the system offered intuitive interfaces, reliable and low-latency communication, and realistic, immersive sensory feedback. Participants reported high levels of satisfaction, presence, and immersion.
In the ANA Avatar XPRIZE Finals, the developers demonstrated their system by attempting ten missions within an hour. These missions were designed to assess the system's mobility, dexterity, and perception. The team successfully completed eight of the ten missions, achieving an eighth-place finish among 17 finalists. Judges provided positive feedback, commending the system's mobility, stability, and feedback capabilities.
Applications
The presented robotic avatar system has significant implications across various domains, including disaster response, space exploration, healthcare, education, and entertainment. It can enable operators to remotely control a humanoid robot with a mobile base and experience the robot's surroundings through diverse sensory feedback.
This capability can allow operators to perform tasks demanding mobility, dexterity, and perception, such as operating tools, manipulating objects, and recognizing features. Furthermore, the system can enhance social interaction and collaboration by allowing the operator to assist, guide, or provide companionship to people near the robot.
Conclusion
In summary, the novel robotic avatar system effectively allowed the human operator to control the robot with ease and precision while providing a fully immersive experience in the robot's environment. It secured eighth place in the ANA Avatar XPRIZE competition by successfully completing eight of the ten missions.
Looking ahead, the developers plan to enhance the system with new features such as gesture recognition, facial expression, and emotion detection. They believe these improvements could greatly advance robotic avatar technology, opening up new possibilities for human-robot interaction and collaboration.
Journal Reference
Park, B., Kim, D., Lim, D. et al. Intuitive and Interactive Robotic Avatar System for Tele-Existence: TEAM SNU in the ANA Avatar XPRIZE Finals. Int J of Soc Robotics (2024). DOI: 10.1007/s12369-024-01152-y, https://link.springer.com/article/10.1007/s12369-024-01152-y