Researchers from Delft University of Technology were inspired by biological discoveries on how ants use their vision and step counting to navigate safely back home. They developed an autonomous navigation method for small, lightweight robots based on these insect behaviors. Their findings were published in the journal Science Robotics.
The strategy enables these robots to return home after long journeys, requiring minimal computation and memory (1.16 kilobytes per 100 m). In the future, tiny autonomous robots could have a wide range of applications, from monitoring stock in warehouses to detecting gas leaks in industrial sites.
Sticking Up for the Little Guy
Small robots, weighing from tens to a few hundred grams, have a wide range of intriguing practical uses. Thanks to their low weight, they are safe even if they accidentally bump into someone, and they can maneuver through confined spaces.
Additionally, if they are inexpensive to produce, they can be deployed in large numbers to swiftly cover a wide area, for example for the early detection of pests or diseases in greenhouses. However, because tiny robots have far fewer resources than larger ones, building them so that they can operate independently is challenging.
A major obstacle to using tiny robots is that, for real-world applications, they must be able to navigate autonomously. Robots can get help here from external infrastructure: they can use indoor wireless communication beacons or outdoor GPS satellites to approximate their location.
However, relying on this kind of infrastructure is often undesirable. GPS is unavailable indoors and can become highly inaccurate in cluttered environments such as urban canyons. Moreover, installing and maintaining beacons in indoor spaces can be very expensive or simply impossible, as in search-and-rescue scenarios.
The artificial intelligence needed for navigation with only onboard resources has been developed with large robots, such as self-driving cars, in mind. Some approaches require bulky, energy-hungry sensors such as LiDAR laser rangers, which tiny robots cannot carry or power.
Other methods take advantage of vision, a very power-efficient sensor that offers a wealth of environmental data. Nevertheless, these methods usually aim to produce extremely comprehensive three-dimensional environment maps. Large amounts
of processing and memory are needed for this, and only computers that are too big and power-hungry for tiny robots can supply them.
Counting Steps and Visual Breadcrumbs
For this reason, several researchers have turned to nature for inspiration. Insects are particularly interesting because they operate over distances relevant to many real-world applications while using very scarce sensing and computation resources.
Biologists are gaining an ever better understanding of the underlying strategies insects use. Specifically, insects combine keeping track of their own motion (“odometry”) with visually guided behaviors based on their low-resolution, nearly omnidirectional visual system (“view memory”).
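Odometry in this sense amounts to dead reckoning: continuously integrating estimates of one’s own motion into a position estimate. The short Python sketch below illustrates one such update in two dimensions; the function and variable names are illustrative assumptions, not the paper’s onboard code.

```python
import numpy as np

def integrate_odometry(position, heading, velocity_body, yaw_rate, dt):
    """One dead-reckoning step: rotate the body-frame velocity estimate
    into the world frame and integrate it into the position estimate.
    Small errors accumulate with distance ("drift"), which is why a
    visual correction is eventually needed."""
    heading += yaw_rate * dt
    c, s = np.cos(heading), np.sin(heading)
    world_velocity = np.array([c * velocity_body[0] - s * velocity_body[1],
                               s * velocity_body[0] + c * velocity_body[1]])
    return position + world_velocity * dt, heading
```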
Whereas odometry is increasingly well understood, even down to the neural level, the exact mechanisms underlying view memory remain far less clear. There are therefore several competing theories on how insects use vision for navigation. One of the earliest is the “snapshot” model, which proposes that an insect such as an ant occasionally takes pictures of its surroundings.
Later, when the insect comes close to a snapshot location, it can move so as to minimize the differences between its current visual perception and the stored snapshot. This allows it to navigate, or “home,” to the snapshot location, removing any drift that inevitably builds up when relying on odometry alone.
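A snapshot-homing step can be sketched in code: compare the current low-resolution omnidirectional view against the stored snapshot and keep moving in whatever direction makes the two more similar. The sketch below is a simplified illustration, not the authors’ controller; the difference measure, the greedy search, and the robot interface are all assumptions.

```python
import numpy as np

def image_difference(a, b):
    """Sum of absolute pixel differences between two small grayscale
    omnidirectional images (equal-shaped uint8 numpy arrays)."""
    return int(np.abs(a.astype(np.int16) - b.astype(np.int16)).sum())

def home_to_snapshot(robot, snapshot, step=0.2, close_enough=1000):
    """Crude greedy homing: step forward, keep the heading while the
    current view becomes more similar to the stored snapshot, and turn
    to try a new heading whenever it becomes less similar. `robot` is a
    hypothetical interface with capture(), forward(meters) and
    turn(radians) methods."""
    prev = image_difference(robot.capture(), snapshot)
    while prev > close_enough:
        robot.forward(step)
        diff = image_difference(robot.capture(), snapshot)
        if diff > prev:                    # view got less similar: wrong way
            robot.turn(np.deg2rad(45.0))   # pick a new heading and retry
        prev = diff
```

In principle any image-difference measure that shrinks as the robot approaches the snapshot location would do; the region around that location where such a descent actually converges is the snapshot’s “catchment area” mentioned below.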
Snapshot-based navigation can be compared to how Hansel tried not to get lost in the fairy tale of Hansel and Gretel. When Hansel threw stones on the ground, he could find his way back home. However, when he threw breadcrumbs that were eaten by the birds, Hansel and Gretel got lost. In our case, the stones are the snapshots.
Tom van Dijk, Study First Author, Delft University of Technology
Tom van Dijk continued, “As with a stone, for a snapshot to work, the robot has to be close enough to the snapshot location. If the visual surroundings differ too much from those at the snapshot location, the robot may move in the wrong direction and never get back. Hence, one has to use enough snapshots, or, in the case of Hansel, drop a sufficient number of stones. On the other hand, dropping stones too close to each other would deplete Hansel’s stones too quickly.”
Van Dijk said, “In the case of a robot, using too many snapshots leads to large memory consumption. Previous works in this field typically placed the snapshots very close together, so that the robot could first visually home to one snapshot and then to the next.”
The main insight underlying our strategy is that you can space snapshots much further apart, if the robot travels between snapshots based on odometry. Homing will work as long as the robot ends up close enough to the snapshot location, i.e., as long as the robot’s odometry drift falls within the snapshot’s catchment area. This also allows the robot to travel much further, as the robot flies much slower when homing to a snapshot than when flying from one snapshot to the next based on odometry.
Guido de Croon, Full Professor and Study Co-Author, Delft University of Technology
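Putting the two ingredients together, a route can be stored as a short list of snapshots plus the odometry vector of each leg between them, and the return trip then alternates fast odometry legs with slow homing phases. The following is a minimal sketch of that loop, reusing the hypothetical robot interface and home_to_snapshot function above; fly_relative is likewise an assumed method, not part of the authors’ system.

```python
def follow_route_home(robot, route):
    """Return along a recorded route. `route` is a list of
    (leg_vector, snapshot) pairs recorded on the way out, with snapshots
    spaced widely apart. Each leg is retraced quickly by odometry, after
    which visual homing cancels the drift built up on that leg; this
    only works if the drift stays within the snapshot's catchment area."""
    for leg_vector, snapshot in reversed(route):
        robot.fly_relative(-leg_vector)    # odometry leg: fast but drifts
        home_to_snapshot(robot, snapshot)  # homing: slow but removes drift
```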
With just 1.16 kilobytes, a 56-gram “CrazyFlie” drone equipped with an omnidirectional camera was able to cover routes of up to 100 m using the proposed insect-inspired navigation strategy. All visual processing runs on a tiny computer known as a microcontroller, which is found in many inexpensive electronic devices.
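To get a feel for that budget: 1.16 kilobytes is roughly 1,160 bytes. Purely for illustration (the spacing here is an assumption, not a figure from the paper), if snapshots were placed about 10 m apart, a 100 m route would contain around 10 of them, leaving on the order of 116 bytes for each snapshot together with its odometry leg, which is orders of magnitude less than a single frame from a conventional camera.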
Putting the Robot Technology to Work
The proposed insect-inspired navigation strategy is an important step on the way to applying tiny autonomous robots in the real world. The functionality of the proposed strategy is more limited than that provided by state-of-the-art navigation methods. It does not generate a map and only allows the robot to come back to the starting point. Still, for many applications this may be more than enough.
Guido de Croon, Full Professor and Study Co-Author, Delft University of Technology
De Croon said, “For instance, for stock tracking in warehouses or crop monitoring in greenhouses, drones could fly out, gather data, and then return to the base station. They could store mission-relevant images on a small SD card for post-processing by a server. But they would not need them for navigation itself.”
Visual Route-following for Tiny Autonomous Robots - Science Robotics
Video Credit: Delft University of Technology
Journal Reference:
van Dijk, T., et al. (2024) Visual route following for tiny autonomous robots. Science Robotics. doi.org/10.1126/scirobotics.adk0310.