A Carnegie Mellon University research group is developing the next generation of explorers—robots.
The Autonomous Exploration Research Team has created a collection of robotic systems and planners that let robots explore more quickly, push deeper into uncharted territory, and produce more accurate and thorough maps. The methods allow robots to do all of this autonomously, navigating and building a map without human assistance.
You can set it in any environment, like a department store or a residential building after a disaster, and off it goes. It builds the map in real-time, and while it explores, it figures out where it wants to go next. You can see everything on the map. You don’t even have to step into the space. Just let the robots explore and map the environment.
Ji Zhang, System Scientist, Robotics Institute, Carnegie Mellon University
The team has been working on exploration systems for more than three years. They have explored and mapped several underground mines, a parking garage, the Cohon University Center on the CMU campus, and a number of other indoor and outdoor spaces.
The system’s processors and sensors can be mounted on almost any robotic platform, turning it into a modern explorer. The team used drones and a customized motorized wheelchair for most of its testing.
Using the group’s technology, robots can explore in three modes. In the first, a person guides and controls the robot’s motions while autonomous systems keep it from colliding with walls, ceilings, or other objects.
In the second, a user points to a location on a map and the robot navigates to it. The third mode is pure exploration: the robot sets off on its own, explores the entire area, and builds a map as it goes.
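To make the three modes concrete, here is a minimal, hypothetical sketch of how a mode switch might route commands each control cycle. The function names, the tuple velocity command, and the collision check are illustrative assumptions, not the team’s actual software.

```python
from enum import Enum, auto

class Mode(Enum):
    ASSISTED_TELEOP = auto()  # human steers; autonomy only blocks collisions
    GOAL_DIRECTED = auto()    # user picks a point on the map; robot drives there
    AUTONOMOUS = auto()       # robot chooses its own goals until the map is done

def choose_target(mode, human_cmd=None, user_goal=None,
                  obstacle_ahead=False, next_frontier=None):
    """Return the command or navigation goal for one control cycle (illustrative)."""
    if mode is Mode.ASSISTED_TELEOP:
        # Pass the operator's command through unless it would cause a collision.
        return (0.0, 0.0) if obstacle_ahead else human_cmd
    if mode is Mode.GOAL_DIRECTED:
        return user_goal            # drive to the user-selected map point
    return next_frontier            # pure exploration: pursue a self-chosen goal

# Example: assisted teleoperation with an obstacle directly ahead stops the robot.
print(choose_target(Mode.ASSISTED_TELEOP, human_cmd=(0.5, 0.0), obstacle_ahead=True))
```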
This is a very flexible system to use in many applications, from delivery to search-and-rescue.
Howie Choset, Professor, Robotics Institute, Carnegie Mellon University
The team paired an exploration algorithm with a 3D scanning lidar, a forward-looking camera, and inertial measurement unit (IMU) sensors so the robot can determine where it is, where it has been, and where it should go next.
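As a rough illustration of the “where it should go next” step, here is a minimal frontier-based goal selection sketch on a 2D occupancy grid. It is not the team’s TARE planner; the grid encoding, function names, and nearest-frontier heuristic are simplifying assumptions.

```python
import numpy as np

def find_frontiers(occupancy):
    """Return free cells that border at least one unknown cell.

    occupancy: 2D array with 0 = free, 1 = occupied, -1 = unknown.
    """
    frontiers = []
    rows, cols = occupancy.shape
    for r in range(rows):
        for c in range(cols):
            if occupancy[r, c] != 0:
                continue
            neighborhood = occupancy[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if (neighborhood == -1).any():
                frontiers.append((r, c))
    return frontiers

def next_goal(frontiers, robot_pos):
    """Pick the closest frontier cell as the next exploration goal."""
    if not frontiers:
        return None  # nothing left to explore
    dists = [np.hypot(r - robot_pos[0], c - robot_pos[1]) for r, c in frontiers]
    return frontiers[int(np.argmin(dists))]

# Example: a small map whose right half is still unexplored.
grid = np.full((5, 8), -1)
grid[:, :4] = 0          # left half already mapped as free space
grid[2, 3] = 1           # one known obstacle
goal = next_goal(find_frontiers(grid), robot_pos=(2, 0))
print("next exploration goal:", goal)
```

The team’s planner works on richer 3D representations built from lidar data and, per the TARE paper cited below, uses a hierarchical framework, but the basic loop of building a map and choosing the next place to look is the core idea of autonomous exploration.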
The resulting systems are far more effective than earlier methods, producing maps that are more comprehensive while cutting the algorithm run time in half.
The new technologies operate in low-light, hazardous environments with intermittent communication, such as caverns, tunnels, and abandoned structures. A version of the group’s exploration system powered Team Explorer, the CMU and Oregon State University entry in DARPA’s Subterranean Challenge.
Team Explorer finished fourth in the final competition but received the Most Sectors Explored Award for charting more of the course than any other team.
All of our work is open-sourced. We are not holding anything back. We want to strengthen society with the capabilities of building autonomous exploration robots. It is a fundamental capability. Once you have it, you can do a lot more.
Chao Cao, Ph.D. Student, Robotics Institute, Carnegie Mellon University
The team’s most recent study was published online in Science Robotics under the title “Representation Granularity Enables Time-Efficient Autonomous Exploration in Large, Complex Worlds.” The group’s earlier work has won top awards at prominent robotics conferences.
At the Robotics: Science and Systems conference in 2021, “TARE: A Hierarchical Framework for Efficiently Exploring Complex 3D Environments” earned both the Best Paper and Best Systems Paper awards.
It was the first paper in the conference’s history to win both prizes. At the International Conference on Intelligent Robots and Systems in 2022, “FAR Planner: Fast, Attemptable Route Planner Using Dynamic Visibility Update” received the Best Student Paper Award.
Autonomous Exploration Summary Video for Science Robotics Article
Video Credit: Carnegie Mellon University
Journal Reference
Cao, C., et al. (2023) Representation granularity enables time-efficient autonomous exploration in large, complex worlds. Science Robotics. doi:10.1126/scirobotics.adf0970