According to a study published in the journal Aerospace Science and Technology, UPV/EHU researcher Julián Estévez has created a low-cost autonomous navigation system that prevents midair collisions between two or more drones whose routes intersect. The technology relies on nothing more than on-board sensors and cameras, and the results obtained are positive and encouraging.
Despite its low cost, the solution we have developed has been successfully validated on commercial drones. Using simple, inexpensive equipment and an algorithm based on artificial vision and color identification, we have built a robust technology that satisfactorily prevents collisions between drones and can be easily extended to most commercial and research aerial robots. We have also made the complete software code for the solution publicly available.
Julián Estévez, Professor, University of the Basque Country
Most drones are remotely piloted, even when they are out of the operator's sight. To be fully autonomous, a drone must be able to make flight decisions independently, without human involvement: avoiding collisions, holding its course in the face of wind gusts, controlling flight speed, steering clear of buildings and trees, and so on.
“This work is a small step towards fully autonomous navigation, without any human intervention, so that drones can decide which maneuver to perform, which direction to take, thus preventing collisions with each other or with other airborne obstacles. If we assume that, in the future, our airspace will be much more populated by commercial services performed by these drones, our work is a small contribution in this respect,” Estévez added.
He further stated, “Our approach to preventing collisions does not require drones to exchange information with each other; instead, they rely solely on their on-board sensors and cameras. We get the signal from the camera on board the drones, and by processing the images, we adjust the reactions of the robots so that they fly smoothly and accurately.”
In the experiments, the team attempted to mimic realistic flight conditions: scenarios that could occur in a typical urban area, with uncontrolled lighting and drones flying in different directions. Although the work began in the laboratory, the contributions are aimed at real-world applications.
Color-Based Algorithms
“We equipped each drone with a red card that allows the software algorithm to detect the presence of an approaching drone and measure its proximity. Our proposal is very simple: each drone is equipped with an on-board camera, the screen of which is divided into two halves (left and right). This camera always seeks out the color red of the cards mentioned above,” Estévez added.
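The red-card detection described above can be sketched with a simple per-frame measurement: classify pixels as "red" and compute how much red falls in each half of the image. This is a minimal illustration, not the authors' released code; the channel thresholds and the use of numpy (rather than a full OpenCV pipeline) are assumptions made for the sketch.

```python
import numpy as np

def red_fractions(frame: np.ndarray) -> tuple[float, float]:
    """Return the fraction of 'red' pixels in the left and right halves
    of an RGB frame (H x W x 3, uint8).

    The channel thresholds below are illustrative placeholders,
    not values taken from the paper.
    """
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    # A pixel counts as red when the red channel clearly dominates.
    mask = (r > 150) & (r > g + 60) & (r > b + 60)
    mid = frame.shape[1] // 2
    left = mask[:, :mid].mean()   # fraction of red pixels, left half
    right = mask[:, mid:].mean()  # fraction of red pixels, right half
    return float(left), float(right)
```

A real implementation would likely threshold in HSV space (e.g. with OpenCV) for robustness to lighting, but the principle, red coverage per half-frame, is the same.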
He further noted, “Through simple image processing, we can find out what percentage of the camera is occupied by the color red, and whether most of this red is on the left- or right-hand side of the screen. If most of the red zone is on the left-hand side of the screen, the drone will fly to the right to avoid collision. If the red zone is on the right, it will move to the left. And this happens with all airborne drones.”
“When the percentage of the color red on the screen increases, it means that the drones are approaching each other head-on. So, when a threshold is exceeded, the robot knows that it has to perform the avoidance maneuver. All this happens autonomously, without the human operator intervening. It is a simple way to prevent collisions, and can be performed by low-cost sensors and equipment,” Estévez highlighted.
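The avoidance rule described in the two quotes above can be summarized in a few lines: if the total red coverage exceeds a threshold, steer away from the side holding more red. This is a hedged sketch of that logic; the function name, command strings, and threshold value are illustrative and not taken from the paper.

```python
def avoidance_command(left_red: float, right_red: float,
                      threshold: float = 0.05) -> str:
    """Map the red coverage in each half-frame to a steering reaction.

    left_red / right_red are fractions in [0, 1]; the threshold value
    is a placeholder, not the one used in the study.
    """
    if left_red + right_red < threshold:
        # Too little red on screen: no drone close enough to react to.
        return "hold_course"
    # Steer away from the side where the approaching drone appears.
    return "steer_right" if left_red >= right_red else "steer_left"
```

Because every drone runs the same rule, two drones meeting head-on each see the other's card grow on one side of the frame and break away in opposite directions, with no communication between them.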
It is similar to a person walking down the street who notices someone approaching on the left and shifts to the right so the two do not collide.
Journal Reference:
Estévez, J., et al. (2024) A low-cost vision system for online reciprocal collision avoidance with UAVs. Aerospace Science and Technology. doi:10.1016/j.ast.2024.109190