
Telepresence Breakthrough: Remote-Control Humanoid Robot

A recent article published in the International Journal of Social Robotics introduced a novel robotic avatar system that allows a human operator to remotely control a humanoid robot. The system provides the operator with visual, auditory, and haptic feedback, enabling them to experience the robot's surroundings in real time.


Background

Robotic avatar systems are robots that operate in synchronization with their human operators. The operator and the robot communicate via a network, allowing the robot to replicate the operator's voice and movements in real time. These systems have many applications, including performing tasks in hazardous or inaccessible locations, assisting individuals with disabilities, and enhancing social interactions and entertainment. A key challenge in developing them, however, is achieving both seamless teleoperation (precise remote control of the robot) and telepresence (the operator's sense of actually being in the robot's environment).

About the Research

In this research, a novel robotic avatar system was developed, centered on a custom humanoid robot named TOCABI. The system aims to deliver an immersive teleoperation and telepresence experience and is composed of two primary components: the operator station and the avatar robot itself.

Operator Station

The operator station is designed for remote control of TOCABI. It is equipped with a head-mounted display (HMD) that provides visual and auditory feedback to the operator, immersing them in the robot's environment. Haptic feedback devices are included to give force feedback, and haptic gloves are used to deliver tactile and kinesthetic feedback, enhancing the sense of touch. Additionally, pedals are incorporated to control the robot's mobile base. This setup allows the operator to control TOCABI’s upper body, hands, and mobility while receiving comprehensive sensory feedback, facilitating a rich interaction with the environment.

Avatar Robot - TOCABI

TOCABI, the humanoid robot, is a central piece of this system. It boasts 33 degrees of freedom and stands at a height of 1.65 meters. The robot is equipped with two head cameras that provide stereo vision, which is relayed back to the operator through the HMD. A wrist-mounted camera helps in object recognition tasks, such as identifying stones. TOCABI’s hands are designed with four fingers each, featuring force sensors and silicone fingertips to mimic human touch and interaction capabilities.

Voice communication is enabled through an integrated speaker and microphone, allowing direct interaction with people in the robot's vicinity. Four mecanum wheels give TOCABI omnidirectional mobility. The platform also includes a chair that the robot can use for sitting tasks and a battery that ensures prolonged operational time.
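Mecanum wheels achieve omnidirectional movement because each wheel's angled rollers let the four wheel speeds be combined into forward, lateral, and rotational motion. The sketch below shows the standard inverse-kinematics relation for such a base; it is a generic illustration, not TOCABI's actual controller, and the geometry values (wheelbase, track width, wheel radius) are placeholder assumptions:

```python
def mecanum_wheel_speeds(vx, vy, wz, half_length=0.25, half_width=0.20, radius=0.1):
    """Map a commanded body velocity to the four wheel angular velocities.

    vx: forward velocity (m/s), vy: lateral velocity (m/s, left positive),
    wz: yaw rate (rad/s). Geometry parameters are illustrative placeholders,
    not TOCABI's real dimensions. Returns (front-left, front-right,
    rear-left, rear-right) wheel speeds in rad/s, using the standard
    X-configuration mecanum inverse kinematics.
    """
    k = half_length + half_width  # combined lever arm for rotation
    fl = (vx - vy - k * wz) / radius
    fr = (vx + vy + k * wz) / radius
    rl = (vx + vy - k * wz) / radius
    rr = (vx - vy + k * wz) / radius
    return fl, fr, rl, rr
```

Driving straight ahead spins all four wheels equally, while a pure lateral command spins the diagonal pairs in opposite directions, which is how the base strafes sideways without turning.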

Connectivity and Software

The connection between the operator station and TOCABI is managed wirelessly via a router, enabling responsive, low-latency control. Data is transmitted over the Transmission Control Protocol/Internet Protocol (TCP/IP) and coordinated by the Robot Operating System (ROS), a standard middleware in robotics for managing device control and feedback. The system also incorporates several software tools: Unity3D for real-time visualization, OpenCV for image processing, TurboJPEG for efficient image compression, Mumble for voice communication, and Win-Ros for cross-platform data exchange.

This system represents a significant advancement in robotic teleoperation. It aims to bridge the gap between remote operators and physical environments, providing a comprehensive and interactive experience that could be adapted for various applications, such as remote assistance, hazardous environment exploration, and interactive learning.

Research Findings

The study evaluated a new robotic avatar system, assessing its performance through user studies and a competition demonstration. The user studies included 10 participants who performed three tasks: activating a switch, lifting a drill, and recognizing a stone. Results indicated that the system offered intuitive interfaces, reliable and low-latency communication, and realistic, immersive sensory feedback. Participants reported high levels of satisfaction, presence, and immersion.

In the ANA Avatar XPRIZE Finals, the developers demonstrated the system by attempting ten missions within an hour, designed to assess its mobility, dexterity, and perception. The team successfully completed eight of the ten missions, finishing eighth among 17 finalists. Judges provided positive feedback, commending the system's mobility, stability, and feedback capabilities.

Applications

The presented robotic avatar system has significant implications across various domains, including disaster response, space exploration, healthcare, education, and entertainment. It can enable operators to remotely control a humanoid robot with a mobile base and experience the robot's surroundings through diverse sensory feedback.

This capability can allow operators to perform tasks demanding mobility, dexterity, and perception, such as operating tools, manipulating objects, and recognizing features. Furthermore, the system can enhance social interaction and collaboration by allowing the operator to assist, guide, or provide companionship to people near the robot.

Conclusion

In summary, the novel robotic avatar system allowed a human operator to control the robot with ease and precision while providing an immersive experience of the robot's environment. It secured eighth place in the ANA Avatar XPRIZE competition by successfully completing eight of its ten missions.

Looking ahead, the developers plan to enhance the system with new features such as gesture recognition, facial expression, and emotion detection. They believe these improvements could greatly advance robotic avatar technology, opening up new possibilities for human-robot interaction and collaboration.

Journal Reference

Park, B., Kim, D., Lim, D. et al. Intuitive and Interactive Robotic Avatar System for Tele-Existence: TEAM SNU in the ANA Avatar XPRIZE Finals. Int J of Soc Robotics (2024). DOI: 10.1007/s12369-024-01152-y, https://link.springer.com/article/10.1007/s12369-024-01152-y


Written by

Muhammad Osama

Muhammad Osama is a full-time data analytics consultant and freelance technical writer based in Delhi, India. He specializes in transforming complex technical concepts into accessible content. He has a Bachelor of Technology in Mechanical Engineering with specialization in AI & Robotics from Galgotias University, India, and he has extensive experience in technical content writing, data science and analytics, and artificial intelligence.

Citations

Please use one of the following formats to cite this article in your essay, paper or report:

  • APA

    Osama, Muhammad. (2024, July 12). Telepresence Breakthrough: Remote-Control Humanoid Robot. AZoRobotics. Retrieved on September 06, 2024 from https://www.azorobotics.com/News.aspx?newsID=15071.


