Measurable Virtual Reality Visualizes Robot's Perceptions

In a darkened, hangar-like space inside MIT's Building 41, a small, Roomba-like robot is trying to make up its mind.

Standing in its path is an obstacle — a human pedestrian who's pacing back and forth. To get to the other side of the room, the robot has to first determine where the pedestrian is, then choose the optimal route to avoid a close encounter.

As the robot considers its options, its "thoughts" are projected on the ground: A large pink dot appears to follow the pedestrian — a symbol of the robot's perception of the pedestrian's position in space. Lines, each representing a possible route for the robot to take, radiate across the room in meandering patterns and colors, with a green line signifying the optimal route. The lines and dots shift and adjust as the pedestrian and the robot move.
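
The article doesn't describe the planner's internals, but the projected overlay maps naturally onto a simple candidate-scoring step. Below is a minimal, illustrative Python sketch (the names, numbers, and cost function are assumptions, not the lab's actual code): each candidate route is scored by its length plus a penalty for passing too close to the estimated pedestrian position, and the cheapest route plays the role of the "green line."

```python
import math

def path_cost(path, pedestrian, safe_radius=1.0, penalty=10.0):
    """Score a candidate path: total length plus a penalty for
    passing too close to the estimated pedestrian position."""
    length = sum(math.dist(a, b) for a, b in zip(path, path[1:]))
    clearance = min(math.dist(p, pedestrian) for p in path)
    if clearance < safe_radius:
        length += penalty * (safe_radius - clearance)
    return length

# Candidate routes as lists of (x, y) waypoints on the floor (meters).
candidates = [
    [(0.0, 0.0), (1.0, 0.5), (2.0, 0.5), (3.0, 0.0)],
    [(0.0, 0.0), (1.0, -0.8), (2.0, -0.8), (3.0, 0.0)],
]
pedestrian_estimate = (1.5, 0.4)   # e.g., from motion capture

best = min(candidates, key=lambda p: path_cost(p, pedestrian_estimate))
# 'best' would be drawn in green; the other candidates in other colors.
```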

This new visualization system combines ceiling-mounted projectors with motion-capture technology and animation software to project a robot's intentions in real time. The researchers have dubbed the system "measurable virtual reality" (MVR) — a spin on conventional virtual reality that's designed to visualize a robot's "perceptions and understanding of the world," says Ali-akbar Agha-mohammadi, a postdoc in MIT's Aerospace Controls Lab.

"Normally, a robot may make some decision, but you can't quite tell what's going on in its mind — why it's choosing a particular path," Agha-mohammadi says. "But if you can see the robot's plan projected on the ground, you can connect what it perceives with what it does to make sense of its actions."

Agha-mohammadi says the system may help speed up the development of self-driving cars, package-delivering drones, and other autonomous, route-planning vehicles.

"As designers, when we can compare the robot's perceptions with how it acts, we can find bugs in our code much faster," Agha-mohammadi says. "For example, if we fly a quadrotor, and see something go wrong in its mind, we can terminate the code before it hits the wall, or breaks."

The system was developed by Shayegan Omidshafiei, a graduate student, and Agha-mohammadi. They and their colleagues, including Jonathan How, a professor of aeronautics and astronautics, will present details of the visualization system at the American Institute of Aeronautics and Astronautics' SciTech conference in January.

Seeing into the mind of a robot

The researchers initially conceived of the visualization system in response to feedback from visitors to their lab. During demonstrations of robotic missions, it was often difficult for people to understand why robots chose certain actions.

"Some of the decisions almost seemed random," Omidshafiei recalls.

The team developed the system as a way to visually represent the robots' decision-making process. The engineers mounted 18 motion-capture cameras on the ceiling to track multiple robotic vehicles simultaneously. They then developed computer software that visually renders "hidden" information, such as a robot's possible routes, and its perception of an obstacle's position. They projected this information on the ground in real time, as physical robots operated.
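
The article describes this pipeline only at a high level. A key piece of any such setup is mapping floor coordinates measured by the motion-capture system into projector pixels. Here is a minimal sketch of that step, assuming a pre-calibrated planar homography (the matrix values and function names are invented for illustration):

```python
import numpy as np

# Homography H mapping floor coordinates (meters) to projector pixels.
# In practice H would be calibrated, e.g., by projecting known markers
# and measuring them with the motion-capture system; these values are
# made up for the example.
H = np.array([
    [200.0,   0.0, 640.0],
    [  0.0, 200.0, 360.0],
    [  0.0,   0.0,   1.0],
])

def world_to_pixels(points_xy):
    """Apply the homography to an (N, 2) array of floor points."""
    pts = np.hstack([points_xy, np.ones((len(points_xy), 1))])
    proj = pts @ H.T
    return proj[:, :2] / proj[:, 2:3]

# Each frame: take the latest motion-capture estimates and convert them
# to pixel coordinates for the ceiling projectors to draw.
pedestrian = np.array([[1.5, 0.4]])
route = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 0.5], [3.0, 0.0]])
print(world_to_pixels(pedestrian))  # the pink dot's position, in pixels
print(world_to_pixels(route))       # polyline vertices for a route
```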

The researchers soon found that by projecting the robots' intentions, they were able to spot problems in the underlying algorithms, and make improvements much faster than before.

"There are a lot of problems that pop up because of uncertainty in the real world, or hardware issues, and that's where our system can significantly reduce the amount of effort spent by researchers to pinpoint the causes," Omidshafiei says. "Traditionally, physical and simulation systems were disjointed. You would have to go to the lowest level of your code, break it down, and try to figure out where the issues were coming from. Now we have the capability to show low-level information in a physical manner, so you don't have to go deep into your code, or restructure your vision of how your algorithm works. You could see applications where you might cut down a whole month of work into a few days."

Bringing the outdoors in

The group has explored a few such applications using the visualization system. In one scenario, the team is looking into the role of drones in fighting forest fires. Such drones may one day be used both to survey and to squelch fires — first observing a fire's effect on various types of vegetation, then identifying and putting out those fires that are most likely to spread.

To make fire-fighting drones a reality, the team is first testing the possibility virtually. In addition to projecting a drone's intentions, the researchers can also project landscapes to simulate an outdoor environment. In test scenarios, the group has flown physical quadrotors over projections of forests, shown from an aerial perspective to simulate a drone's view, as if it were flying over treetops. The researchers projected fire on various parts of the landscape, and directed quadrotors to take images of the terrain — images that could eventually be used to "teach" the robots to recognize signs of a particularly dangerous fire.
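
As a rough illustration of that test setup (not the team's actual software), one could represent the projected landscape as a grid, flag the cells where fire is projected, and generate hover-and-capture waypoints over each burning cell:

```python
import numpy as np

# A projected "landscape": a grid where some cells are flagged as
# burning. Entirely illustrative; sizes and threshold are arbitrary.
rng = np.random.default_rng(0)
fire = rng.random((8, 8)) > 0.85           # True where fire is projected
cell_size = 0.5                            # meters per projected cell

# Direct the quadrotor to hover over each burning cell and take an image.
waypoints = [((c + 0.5) * cell_size, (r + 0.5) * cell_size)
             for r, c in zip(*np.nonzero(fire))]
for x, y in waypoints:
    print(f"capture image at ({x:.2f}, {y:.2f})")
```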

Going forward, Agha-mohammadi says, the team plans to use the system to test drone performance in package-delivery scenarios. Toward this end, the researchers will simulate urban environments by creating street-view projections of cities, similar to zoomed-in perspectives on Google Maps.

"Imagine we can project a bunch of apartments in Cambridge," Agha-mohammadi says. "Depending on where the vehicle is, you can look at the environment from different angles, and what it sees will be quite similar to what it would see if it were flying in reality."

Because the Federal Aviation Administration has placed restrictions on outdoor testing of quadrotors and other autonomous flying vehicles, Omidshafiei points out, testing such robots in a virtual environment may be the next best thing. And the sky's the limit when it comes to the types of virtual environments the new system can project.

"With this system, you can design any environment you want, and can test and prototype your vehicles as if they're fully outdoors, before you deploy them in the real world," Omidshafiei says.

This work was supported by Boeing.


