Rescue Drones Fly by Vision, Inertial Sensing

PALO ALTO, Calif. — MIT and the Charles Stark Draper Laboratory have developed vision-aided navigation technology for rescue or reconnaissance drones flying in environments where GPS doesn't reach.

Unlike previous, unsuccessful attempts to solve this problem with other navigation technologies such as lidar, Draper's approach doesn't rely on orienting structures, motion-capture systems, or prior maps. Those alternatives have also been hindered at drone flight speeds of up to 45 mph, which outpace the onboard communication links many of them depend on.

The team's solution for first responders or soldiers operating unmanned aerial vehicles (UAVs) in GPS-denied locations — like underground, under dense forest canopies, or inside urban "canyons" — lets UAVs autonomously maneuver through unknown environments.

A team from MIT and the Charles Stark Draper Laboratory equipped an unmanned aerial vehicle (UAV) with vision so it can navigate without GPS in environments such as urban canyons, forests, and underground locations.
Source: Charles Stark Draper Laboratory

Funded by DARPA's Fast Lightweight Autonomy program, the sensor- and camera-loaded quadcopter was tested in a mix of indoor and outdoor situations, both cluttered and open, to simulate conditions UAVs may encounter when quickly navigating unknown environments without a remote pilot.

The team says its navigation method, developed by Draper, combines vision with inertial sensing in a new approach to state estimation: estimating a vehicle's position, orientation, and velocity. Called smoothing and mapping with inertial state estimation (SAMWISE), the method accumulates error more slowly than either sensing modality would alone.
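
Draper hasn't published SAMWISE's implementation here, but the intuition behind fusing the two modalities can be sketched in a few lines. In the toy simulation below (all rates, noise levels, and gains are illustrative assumptions, not Draper's values), position is dead-reckoned from a drifting inertial velocity estimate and periodically corrected with noisy but unbiased vision fixes; the fused estimate's error stays bounded while the inertial-only estimate drifts away.

```python
# Minimal sketch of vision/inertial fusion for state estimation.
# NOT Draper's SAMWISE algorithm: rates, noise levels, and the
# correction gain are illustrative assumptions only.

import numpy as np

DT = 0.005             # IMU step, s (200 Hz, assumed)
VISION_EVERY = 20      # one vision fix per 20 IMU steps (10 Hz, assumed)
GAIN = 0.3             # blend factor for vision corrections (assumed)

rng = np.random.default_rng(0)

true_pos = np.zeros(2)
vel = np.array([20.0, 0.0])   # ~45 mph forward flight
imu_only = np.zeros(2)        # dead-reckoned estimate, no vision
fused = np.zeros(2)           # estimate with vision corrections
drift = np.zeros(2)           # slowly accumulating IMU velocity error

for step in range(4000):
    true_pos += vel * DT

    # Inertial propagation drifts: the integrated velocity picks up
    # a random-walk bias, so error grows without bound on its own.
    drift += rng.normal(0.0, 0.01, size=2)
    measured_vel = vel + drift
    imu_only += measured_vel * DT
    fused += measured_vel * DT

    # Vision fix: noisy but unbiased, so blending it in periodically
    # pulls the fused estimate back toward the true position.
    if step % VISION_EVERY == 0:
        vision_fix = true_pos + rng.normal(0.0, 0.05, size=2)
        fused += GAIN * (vision_fix - fused)

print("IMU-only error (m):", np.linalg.norm(true_pos - imu_only))
print("fused error    (m):", np.linalg.norm(true_pos - fused))
```

A full smoother such as SAMWISE jointly re-estimates position, orientation, and velocity over a window of past states rather than correcting only the latest position, but the error-bounding effect of the vision corrections is the same idea.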

Draper's smoothing and mapping with inertial state estimation (SAMWISE) sensor-fusion algorithm enables drones to fly 45 mph in unmapped, GPS-denied environments.
Source: Charles Stark Draper Laboratory

Using a monocular camera paired with IMU-centric navigation, along with varying configurations of sensors and algorithms, the test drone maintained precise position estimates while dodging trees, locating building entrances, and entering and exiting buildings. The technology may also be applicable to other GPS-denied locations on the ground or underwater.

At MIT, engineers have spent several years working on autonomous robot and UAV navigation using a range of techniques, including lidar, RFID, and simultaneous localization and mapping (SLAM).

