June 6, 2013
Engineers in the Coordinated Robotics Lab at the University of California, San Diego, have developed new image processing techniques for rapid exploration and characterization of structural fires by small Segway-like robotic vehicles.
A sophisticated on-board software system takes the thermal data recorded by the robot’s small infrared camera and maps it onto a 3D scene constructed from the images taken by a pair of stereo RGB cameras.
This allows a small mobile robot, as it drives through a burning building, to create a virtual reality picture, a 3D map overlaid with temperature data, that first responders can use immediately.
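The article does not spell out the projection math, but the standard pipeline for this kind of sensor fusion is to triangulate a depth map from the stereo pair, back-project it into a 3D point cloud, and then look up a temperature for each point in the thermal image. The Python/OpenCV sketch below is a minimal illustration under assumed conditions: a rectified stereo pair, calibrated intrinsics (K_rgb, K_th), and a known thermal-to-RGB extrinsic pose (R, t). All function names and matcher parameters are illustrative assumptions, not the UC San Diego implementation.

```python
# Minimal sketch of thermal-onto-3D fusion (illustrative, not the UCSD code).
# Assumed inputs: a rectified grayscale stereo pair, RGB and thermal camera
# intrinsics K_rgb and K_th (3x3), the thermal camera's pose (R, t) relative
# to the left RGB camera, and a radiometric thermal image (deg C per pixel).
import numpy as np
import cv2

def depth_from_stereo(left_gray, right_gray, focal_px, baseline_m):
    """Estimate per-pixel depth in meters from a rectified stereo pair."""
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                    blockSize=7)  # parameters are assumptions
    disp = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disp[disp <= 0] = np.nan                 # mark invalid matches
    return focal_px * baseline_m / disp      # z = f * B / d

def thermal_point_cloud(depth, K_rgb, K_th, R, t, thermal_img):
    """Back-project RGB pixels to 3D, then sample a temperature per point."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.ravel()
    ok = np.isfinite(z)
    # Back-project through the RGB intrinsics.
    x = (u.ravel() - K_rgb[0, 2]) * z / K_rgb[0, 0]
    y = (v.ravel() - K_rgb[1, 2]) * z / K_rgb[1, 1]
    pts = np.stack([x, y, z], axis=1)[ok]
    # Move points into the thermal camera frame and project with K_th.
    pts_th = pts @ R.T + t
    u_th = K_th[0, 0] * pts_th[:, 0] / pts_th[:, 2] + K_th[0, 2]
    v_th = K_th[1, 1] * pts_th[:, 1] / pts_th[:, 2] + K_th[1, 2]
    th_h, th_w = thermal_img.shape
    inside = ((pts_th[:, 2] > 0) & (u_th >= 0) & (u_th < th_w) &
              (v_th >= 0) & (v_th < th_h))
    temps = np.full(len(pts), np.nan)        # NaN where no thermal coverage
    temps[inside] = thermal_img[v_th[inside].astype(int),
                                u_th[inside].astype(int)]
    return pts, temps                        # 3D points + temperature labels
```

In a full system, point clouds like these would be accumulated across frames by a mapping back end to build the building-scale picture the article describes.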
The research is part of a plan to develop novel robotic scouts that can assist firefighters in residential and commercial blazes. Researchers will present their results at the International Conference on Robotics and Automation, to be held from May 31 to June 5, 2014, in Hong Kong.
The robots will map and photograph the interior of burning buildings by using stereo vision. They will use data gathered from various sensors to characterize the state of a fire, including temperatures, volatile gases, and structural integrity, all while looking for survivors. Working both collaboratively and autonomously, a number of such vehicles would quickly develop an accurate augmented virtual reality picture of the building interior. They would then provide it in near real time to rescuers, who could better assess the structure and plan their firefighting and rescue activities. Thomas Bewley's dynamics and control team has already built the first vehicle prototype, essentially a self-righting Segway-like vehicle that can climb stairs.
Video: 3D Thermal RGB Mapping for Firefighting Robot Applications
“These robot scouts will be small, inexpensive, agile, and autonomous,” said Thomas Bewley, a professor of mechanical engineering at the Jacobs School of Engineering at UC San Diego. “Firefighters arriving at the scene of a fire have 1000 things to do. To be useful, the robotic scouts need to work like well-trained hunting dogs, dispatching quickly and working together to achieve complex goals while making all necessary low-level decisions themselves along the way to get the job done.”
This project is a collaboration between researchers at the Jacobs School of Engineering and the University of Illinois at Urbana-Champaign, the San Diego Fire-Rescue Department, and local corporate partners. The team has applied for large block funding from the National Science Foundation's National Robotics Initiative to sustain the effort, though significant preliminary work was conducted to get the research in motion even before the proposal was submitted. This interdisciplinary effort also involves:
- computer scientists in Prof. Yoav Freund’s lab developing software for robotic mapping in smoky and partially obstructed spaces,
- nanoengineers in Prof. Deli Wang’s lab developing an “electronic nose” capable of detecting volatile organic compounds, and
- researchers at the University of Illinois addressing the associated human-interface and obstacle-avoidance problems.
Engineers at San Diego-based ATA Engineering, L-P3, and Brain Corporation are also closely involved.