CAMBRIDGE, MA—NASA’s OSIRIS-REx spacecraft orbited the asteroid Bennu before descending to gather surface rocks as old as the solar system. The spacecraft carried no crew, and it had no GPS to tell it how to navigate to the surface to collect the sample.
That wasn’t a problem, according to Courtney Mario, a perception and autonomy engineer at Draper. She builds vision-aided navigation systems for cars, spacecraft and unmanned aerial vehicles.
“The spacecraft has been collecting data about Bennu for almost two years now and the mission science team used that data to build an impressive asteroid shape model,” Mario said. “We used that shape model as essentially a map for the onboard computer so that the spacecraft could navigate itself to the surface.”
That data about Bennu feeds a navigation system onboard OSIRIS-REx called Natural Feature Tracking (NFT), which allowed the spacecraft to navigate autonomously to the asteroid surface to collect the sample. NFT was designed by Lockheed Martin and developed for this mission with contributions from Draper. Much as people determine their location every day by identifying familiar landmarks around them, NFT matches expected surface features from the Bennu shape model against camera images collected onboard during descent to determine the spacecraft’s location.
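The core idea of matching an expected feature from a map against a camera image can be illustrated with a toy normalized cross-correlation search. This is only a minimal sketch of image-to-map matching in general, not the actual NFT implementation: the real system renders expected feature patches from the shape model under predicted lighting and geometry, which this example does not model.

```python
import numpy as np

def match_feature(image, template):
    """Slide a feature template over a camera image and return the
    (row, col) offset with the highest normalized cross-correlation
    score, plus that score. A simplified stand-in for map-to-image
    feature matching; not the flight NFT algorithm."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    best_score, best_pos = -1.0, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.linalg.norm(p) * t_norm
            if denom == 0:
                continue  # featureless patch: no usable correlation
            score = float(np.dot(p.ravel(), t.ravel()) / denom)
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score

# Toy "camera image" containing one bright, crater-like blob.
image = np.zeros((20, 20))
image[12:16, 5:9] = np.array([[1, 2, 2, 1],
                              [2, 4, 4, 2],
                              [2, 4, 4, 2],
                              [1, 2, 2, 1]], dtype=float)
# The expected appearance of that feature, taken from the "map".
template = image[12:16, 5:9].copy()
pos, score = match_feature(image, template)
print(pos, score)  # feature located at (12, 5) with a perfect score
```

Given where the feature was found in the image and where the shape model says it sits on the asteroid, a navigation filter can then back out the spacecraft's position relative to the surface.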
“The challenge, though,” said Mario, “is that these algorithms cannot yet replicate all that the human brain does to filter out bad information, especially in complex environments. So we need to make sure we are providing NFT with the best, most robust set of shape model features to ensure NFT can correctly identify them in the images.”
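One hypothetical way to quantify whether a candidate feature is "robust" in this sense is to check how easily it could be confused with other terrain: a patch that correlates strongly with some other region of the map is ambiguous and risks a false match during descent. The metric below is an illustrative assumption, not Lockheed Martin's or Draper's actual feature-selection criterion.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a.ravel(), b.ravel()) / denom) if denom else 0.0

def ambiguity(map_img, r0, c0, size):
    """Highest correlation of the patch at (r0, c0) against every
    other same-size patch in the map. Values near 0 mean the feature
    is distinctive; values near 1.0 mean similar-looking terrain
    exists elsewhere and the feature could be misidentified."""
    template = map_img[r0:r0 + size, c0:c0 + size]
    worst = -1.0
    h, w = map_img.shape
    for r in range(h - size + 1):
        for c in range(w - size + 1):
            if abs(r - r0) <= 1 and abs(c - c0) <= 1:
                continue  # skip the feature's own neighborhood
            worst = max(worst, ncc(template, map_img[r:r + size, c:c + size]))
    return worst

rng = np.random.default_rng(0)
# Richly textured terrain: its patches rarely resemble one another.
terrain = rng.normal(size=(24, 24))
# A map where the exact same 5x5 patch of terrain appears twice.
map2 = np.zeros((24, 24))
patch = rng.normal(size=(5, 5))
map2[2:7, 2:7] = patch
map2[15:20, 15:20] = patch
print(ambiguity(terrain, 8, 8, 5))  # low: a distinctive feature
print(ambiguity(map2, 2, 2, 5))    # near 1.0: a dangerously ambiguous one
```

Screening candidates this way before loading them onboard is one plausible route to the "best, most robust set of shape model features" the passage describes, since the onboard matcher never has to disambiguate look-alike terrain in the first place.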
Mario set out to identify Bennu terrain characteristics that yield strong navigation features and worked with Lockheed Martin engineers to incorporate the data into the feature selection process. As feature models were built and tested against available Bennu images, she dug into the data to determine the strongest terrain features as well as areas where models needed additional work. She then shared her findings with the scientists on the team who specialize in building shape models. Those models were further refined and tested, said Mario, “leading to a final set of features that enabled NFT to navigate to the surface of Bennu.”
NFT is a relatively new optical-based autonomous navigation system that wasn’t even in the original plans for OSIRIS-REx. The team had baselined a LiDAR system to navigate to a touchdown site anticipated to be 164 feet in diameter. Upon arrival at Bennu, though, the team discovered that the surface was much rockier than expected and the largest safe area was only 52 feet wide. So NFT, which provides a more precise navigation solution than LiDAR, was promoted from a backup system to the primary navigation system for the sample collection.
“This touch-and-go mission to Bennu is a big step in deep space autonomous exploration, and developing NFT truly was a team effort among Lockheed Martin, Goddard Space Flight Center, University of Arizona, KinetX Aerospace, Draper and the science team,” said Mario, a member of the Natural Feature Tracking team for OSIRIS-REx. “It’s exciting to work on a navigation system that can enable a spacecraft to understand its environment and target touchdown locations in a challenging environment like Bennu, just as those systems are beginning to do the same for self-driving cars, robots and other autonomous systems right here on Earth.”
Lockheed Martin designed and built the OSIRIS-REx spacecraft, asteroid sampling system and sample return capsule, performs spacecraft flight operations, and led development of the Natural Feature Tracking system. NASA Goddard Space Flight Center provides overall mission management, systems engineering and safety and mission assurance for OSIRIS-REx. The University of Arizona leads the mission and the science team.