Autonomous vehicles that rely on GPS position fixes can crash or stray off course if GPS signals become unavailable. Without accurate, timely positioning information, a vehicle does not know where it is, so it cannot map or accurately follow a course to another location.
To help autonomous vehicles navigate reliably and accurately in GPS-denied areas, Draper developed a robust algorithm called SAMWISE (Smoothing And Mapping With Inertial State Estimation). SAMWISE fuses data from an inertial sensor with one or more aiding navigation sensors to estimate a vehicle’s attitude, velocity and relative position, enabling closed-loop autonomous vehicle motion and mapping in GPS-denied areas.
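The core idea of fusing high-rate inertial predictions with occasional aiding-sensor fixes can be sketched in one dimension with a simple Kalman-style predict/update loop. This is an illustrative toy, not Draper's implementation; all numbers and noise values are assumptions for the example.

```python
# Toy 1-D sensor fusion: an inertial "predict" step grows uncertainty,
# an aiding-sensor "update" step shrinks it. Illustrative only.

def predict(x, v, p_var, accel, dt, accel_var):
    """Propagate the position estimate by integrating acceleration (inertial step)."""
    v = v + accel * dt
    x = x + v * dt
    p_var = p_var + accel_var * dt * dt  # uncertainty grows without aiding
    return x, v, p_var

def update(x, p_var, z, z_var):
    """Correct the prediction with an aiding sensor's position measurement."""
    k = p_var / (p_var + z_var)   # Kalman gain: how much to trust the measurement
    x = x + k * (z - x)           # blend prediction and measurement
    p_var = (1.0 - k) * p_var     # uncertainty shrinks after aiding
    return x, p_var

# One second of 100 Hz inertial propagation, then a single aiding fix
x, v, p_var = 0.0, 1.0, 0.01
for _ in range(100):
    x, v, p_var = predict(x, v, p_var, accel=0.0, dt=0.01, accel_var=0.5)
x, p_var = update(x, p_var, z=1.02, z_var=0.05)  # e.g., a camera/LiDAR position fix
```

The aiding update is what keeps the inertial solution from wandering: each fix pulls the estimate back toward an external reference and caps the growth of its uncertainty.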
SAMWISE can work in combination with a variety of sensors and data sources, including GPS, Light Detection and Ranging (LiDAR) and cameras. Combining data from multiple sources enhances state-estimation accuracy and reliability. SAMWISE can even use new information to update previous movement measurements and correct stored position data. The flexible design of SAMWISE also allows it to work as an independent inertial navigation system (INS), making vehicles robust to error conditions such as temporary sensor faults and dropouts.
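The idea of correcting previously stored positions when new information arrives can be illustrated with a toy retroactive correction: when a fresh absolute fix reveals how much the dead-reckoned trajectory has drifted, the error is redistributed over the stored poses. This is a hypothetical sketch, much simpler than a real smoother.

```python
# Toy retroactive trajectory correction (hypothetical, not SAMWISE's smoother):
# spread the end-point error linearly back over the stored position log.

def correct_trajectory(poses, true_end):
    """Linearly redistribute the end-point error over all stored poses."""
    error = true_end - poses[-1]
    n = len(poses) - 1
    return [p + error * (i / n) for i, p in enumerate(poses)]

dead_reckoned = [0.0, 1.1, 2.2, 3.3, 4.4]  # drifting position log (made-up values)
corrected = correct_trajectory(dead_reckoned, true_end=4.0)
# the final pose now matches the fix; earlier poses shift proportionally
```

A real smoother weighs each correction by the uncertainty of every measurement along the trajectory rather than spreading it linearly, but the effect is the same: past estimates improve as new data arrives.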
Draper demonstrated SAMWISE with a team from MIT for the Defense Advanced Research Projects Agency (DARPA) as part of the Fast Lightweight Autonomy (FLA) program. During phase 1 of the program in 2017, SAMWISE guided a drone as it dodged trees, found building entrances and entered/exited buildings, all the while maintaining precise position estimates — at speeds of up to 10 meters per second (~22 miles per hour) in cluttered areas and 20 meters per second (~45 miles per hour) in open areas. During a phase 2 field demonstration for DARPA in 2018, SAMWISE enabled the team’s drone to maneuver around buildings, over fences and under tree branches at speeds of up to 20 miles per hour — without GPS — instead using a monocular camera, a laser altimeter and an inertial measurement unit (IMU) as sensors.
On its own, each sensing approach is limited. IMUs can measure high rates of acceleration and rotation in six degrees of freedom, but used alone they drift over time due to accumulated integration error. Vision-aided navigation (VAN) uses image-processing techniques to track features in the environment and estimate the relative position and orientation of a camera, but it is typically computationally intensive, also drifts over time, and its accuracy and robustness depend heavily on the quality and viewpoint of the camera imagery.
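The IMU drift problem comes from double integration: even a tiny constant accelerometer bias, integrated once into velocity and again into position, produces a position error that grows quadratically with time. A minimal sketch, with an assumed bias value, makes this concrete:

```python
# Why a standalone IMU drifts: a small constant accelerometer bias,
# double-integrated, grows position error quadratically. Values are illustrative.

bias = 0.01   # m/s^2 accelerometer bias (assumed)
dt = 0.01     # 100 Hz IMU sample interval

v_err, p_err = 0.0, 0.0
errors = []
for step in range(1, 6001):        # 60 seconds of pure dead reckoning
    v_err += bias * dt             # velocity error grows linearly
    p_err += v_err * dt            # position error grows quadratically
    if step % 1000 == 0:           # log the error every 10 seconds
        errors.append(round(p_err, 2))
# errors -> [0.5, 2.0, 4.5, 8.0, 12.5, 18.0] meters at 10 s intervals
```

This matches the analytic result, error ≈ ½·bias·t², and explains why an INS needs periodic aiding from an external sensor to remain usable.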
The SAMWISE navigation system accumulates error more slowly over time than either an INS or VAN alone. It achieves this through the algorithm’s incremental smoother and a novel measurement-buffering approach to estimating the vehicle trajectory, which allows it to fuse delayed (or high-latency) sensor data. SAMWISE enables high-speed flight in two ways: it produces low-latency position and velocity outputs, which supports agile obstacle-avoidance maneuvers, and it tracks visual features without requiring detailed mapping, which allows the algorithm to run within the onboard computational constraints of small autonomous vehicles.
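One way to picture measurement buffering, sketched below under assumptions of my own (this is not SAMWISE's actual data structure): high-latency measurements arrive stamped with the earlier time they were taken, and the buffer keeps them ordered by that timestamp so the estimator can fuse each one at the correct point in the trajectory instead of dropping or misplacing it.

```python
# Hypothetical measurement buffer: measurements may arrive out of order
# (e.g., a slow camera pipeline), but are drained in timestamp order.

import heapq

class MeasurementBuffer:
    def __init__(self):
        self._heap = []  # min-heap keyed on measurement timestamp

    def add(self, timestamp, measurement):
        """Insert a measurement, even if it arrives after later-stamped ones."""
        heapq.heappush(self._heap, (timestamp, measurement))

    def drain_until(self, t):
        """Pop all measurements taken at or before time t, in time order."""
        out = []
        while self._heap and self._heap[0][0] <= t:
            out.append(heapq.heappop(self._heap))
        return out

buf = MeasurementBuffer()
buf.add(0.30, "imu")
buf.add(0.10, "camera")          # camera frame processed late, stamped earlier
ready = buf.drain_until(0.50)    # -> [(0.10, "camera"), (0.30, "imu")]
```

Replaying buffered measurements in timestamp order is what lets a smoother incorporate slow sensor outputs without corrupting the low-latency state estimate used for control.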
In the future, Draper could apply SAMWISE to self-driving cars, augmented reality systems, landing vehicles on Mars or the Moon, and other ground, marine and underwater systems.