
Advancing Drone Autonomy with AI Algorithms

Sophisticated algorithms powered by artificial intelligence (AI) may eventually allow drones to pilot themselves without human assistance, using visual cues to navigate from one location to another.


University of Missouri students spent a month at Yuma Proving Ground in Arizona, one of the largest military installations in the world, collecting visible and infrared video data with custom-built drones. Their work helped lay the foundation for this two-year project supported by the US Army Engineer Research and Development Center. Image Credit: US Department of Defense.

This is the goal of a two-year project headed by researchers at the University of Missouri and funded by a $3.3 million grant from the US Army Corps of Engineers' premier research and development organization, the Engineer Research and Development Center (ERDC).

Kannappan Palaniappan, principal investigator on the project and Curators' Distinguished Professor of Electrical Engineering and Computer Science, said that the ability to operate autonomously becomes critical in situations where the GPS navigation signal is interrupted or lost, such as after a natural disaster or in military settings.

This typically occurs in the aftermath of natural disasters, from occlusions in the built environment and terrain, or from human-involved intervention. Most drones operating today require GPS navigation to fly, so when they lose that signal, they are not able to find their way around and will typically just land wherever they are. Unlike ground-based GPS navigation apps, which can reroute you if you miss a turn, there’s currently no option for airborne drones to re-route in these situations.

Kannappan Palaniappan, Professor and Principal Investigator, Department of Electrical Engineering and Computer Science, MU College of Engineering
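The failsafe behavior Palaniappan describes can be illustrated with a short sketch. The article does not detail the team's software; the state names and selection logic below are hypothetical, showing only the contrast between today's land-in-place default and the vision-guided rerouting the project aims for.

```python
from enum import Enum, auto


class NavState(Enum):
    GPS_GUIDED = auto()
    VISION_GUIDED = auto()
    FAILSAFE_LAND = auto()


def select_nav_state(gps_ok: bool, vision_ok: bool) -> NavState:
    """Pick a navigation mode given sensor health.

    Today's typical failsafe: no GPS means land in place.
    The project's goal is the middle branch, where onboard
    vision keeps the drone navigating toward its objective.
    """
    if gps_ok:
        return NavState.GPS_GUIDED
    if vision_ok:
        return NavState.VISION_GUIDED  # reroute using visual cues
    return NavState.FAILSAFE_LAND      # current default behavior


if __name__ == "__main__":
    # Simulate a mid-flight GPS dropout.
    print(select_nav_state(gps_ok=True, vision_ok=True))    # GPS_GUIDED
    print(select_nav_state(gps_ok=False, vision_ok=True))   # VISION_GUIDED
    print(select_nav_state(gps_ok=False, vision_ok=False))  # FAILSAFE_LAND
```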

Currently, a drone pilot must operate the device manually and maintain a high degree of situational awareness, keeping the drone within line of sight and away from obstacles such as buildings, trees, mountains, bridges, and signs.

Palaniappan and colleagues are developing software that will enable drones to fly autonomously, using optical sensors and algorithms to independently detect and interact with their surroundings as they accomplish predetermined goals or objectives.
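As a rough illustration of such a perception-and-planning loop, here is a minimal Python sketch. Every name in it (detect_obstacles, plan_heading, navigation_step) is a hypothetical stand-in rather than the team's code; a real system would run trained vision models on live camera or lidar frames.

```python
import random


def detect_obstacles(frame) -> list[tuple[float, float]]:
    """Stand-in for the perception step; returns (bearing, distance) pairs.

    Placeholder only: an actual pipeline would run a detector on
    camera or lidar frames instead of sampling random values.
    """
    return [(random.uniform(-45, 45), random.uniform(5, 50)) for _ in range(2)]


def plan_heading(goal_bearing: float, obstacles: list[tuple[float, float]]) -> float:
    """Steer toward the goal, nudging away from any close obstacle ahead."""
    heading = goal_bearing
    for bearing, distance in obstacles:
        if distance < 10 and abs(bearing - heading) < 20:
            heading += 30 if bearing <= heading else -30  # simple avoidance nudge
    return heading


def navigation_step(frame, goal_bearing: float) -> float:
    """One perceive-then-plan cycle of the control loop."""
    obstacles = detect_obstacles(frame)
    return plan_heading(goal_bearing, obstacles)


if __name__ == "__main__":
    print(f"commanded heading: {navigation_step(frame=None, goal_bearing=0.0):.1f} deg")
```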

We want to take the range of skills, attributes, contextual scene knowledge, mission planning, and other capacities that drone pilots possess and incorporate them along with weather conditions into the drone’s software so it can make all of those decisions independently.

Kannappan Palaniappan, Professor and Principal Investigator, Department of Electrical Engineering and Computer Science, MU College of Engineering

Advancing Intelligent Scene Perception

Drones can already carry out some limited advanced tasks, including object detection and visual recognition, thanks to recent developments in visual sensor technology such as light detection and ranging (lidar) and thermal imaging.
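As a toy example of what such a detection step operates on, the sketch below thresholds a synthetic thermal frame with NumPy to flag hot regions. This is an assumption-laden stand-in: actual pipelines would feed lidar and thermal imagery to trained deep networks rather than apply a fixed temperature cutoff.

```python
import numpy as np


def hot_regions(thermal_frame: np.ndarray, threshold_c: float = 35.0) -> np.ndarray:
    """Return a boolean mask of pixels hotter than the threshold.

    Crude stand-in for detection: real systems would use a
    learned model, not a hand-picked threshold.
    """
    return thermal_frame > threshold_c


if __name__ == "__main__":
    # Synthetic 8x8 thermal frame (degrees C) with one warm object.
    frame = np.full((8, 8), 20.0)
    frame[3:5, 3:5] = 37.0  # e.g., a person or a running engine
    mask = hot_regions(frame)
    ys, xs = np.nonzero(mask)
    print(f"hot pixels at rows {sorted(set(ys))}, cols {sorted(set(xs))}")
```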

When paired with the team's algorithms, which are driven by machine learning and deep learning, two subsets of AI, drones could also support advanced 3D and 4D imaging for mapping and monitoring applications.

As humans, we have been incorporating 3D models and dynamical knowledge of movement patterns in our surroundings using our visual system since we were little kids. Now, we are trying to decode the salient features of the human visual system and build those capabilities into autonomous vision-based aerial and ground-based navigation algorithms.

Kannappan Palaniappan, Professor and Principal Investigator, Department of Electrical Engineering and Computer Science, MU College of Engineering

Developing these advanced imaging capabilities requires computing resources, such as processing power, memory, and time, beyond what the software typically available on board a drone can provide.

To find a possible solution, the MU-led team is exploring how to combine the strengths of cloud, high-performance, and edge computing.

Palaniappan said, “After a severe storm or a natural disaster, there will be damage to buildings, waterways, and other forms of infrastructure, and a 3D reconstruction of the area could help everyone from first responders to government officials understand how much damage has taken place. By allowing the drone to collect the raw data and transmit that information to the cloud, the cloud-supporting high-performance computing software can complete the analysis and develop the 3D digital twin model without the need for additional software to be physically installed and accessible on the drone.”
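The division of labor Palaniappan describes, lightweight capture on the drone and heavy reconstruction in the cloud, might look something like the hypothetical sketch below. The function names and data shapes are illustrative only; a real service would run structure-from-motion and multi-view stereo on high-performance compute nodes.

```python
from dataclasses import dataclass


@dataclass
class FlightData:
    frames: list  # raw visible/infrared frames captured on board
    poses: list   # camera pose estimates, one per frame


def onboard_capture() -> FlightData:
    """Drone side: collect raw imagery only; no heavy processing."""
    return FlightData(frames=["frame0", "frame1"], poses=["pose0", "pose1"])


def cloud_reconstruct(data: FlightData) -> str:
    """Cloud side: HPC photogrammetry turns frames into a 3D digital twin.

    Placeholder: a real service would run structure-from-motion and
    multi-view stereo on GPU nodes, not return a string.
    """
    return f"digital_twin({len(data.frames)} frames)"


if __name__ == "__main__":
    data = onboard_capture()         # lightweight, runs on the drone
    model = cloud_reconstruct(data)  # heavy, runs in the cloud
    print(model)
```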

The MU team includes Prasad Calyam, Filiz Bunyak, and Joshua Fraser, along with researchers from Saint Louis University, the University of California, Berkeley, and the University of Florida.
