Solving Problems with Satellite Data

Computers still no match for human intuition

CAMBRIDGE, MA – The Cyber Grand Challenge (CGC), sponsored by the Defense Advanced Research Projects Agency, made history last week when competitors proved “that machine speed, scalable cyber defense is indeed possible,” according to an Agency release. While this may advance the autonomous computer reasoning needed to ultimately eliminate many of the advantages networked attackers currently enjoy, much work remains before computers can match human intuition in the world of big data analytics.

Draper also concluded a competition of its own last week, called Chronos, designed to determine whether algorithms alone could place approximately 300 sets of images, each set spanning a week at one location, in chronological order.

“An event like the CGC has the potential to advance autonomy significantly. But, cyber and code are far more constrained environments than the world in which we live,” explained Troy Lau, a senior computer scientist at Draper. “What we proved with Chronos is that computers still lack the ability to take the unlimited and unconstrained data in large-scale visual imagery and apply context to it.”

Draper sponsored the online Chronos competition to find new tools and approaches for image analysis. After eight weeks, the winners of a $75,000 prize pool were selected from more than 215 teams. Competitors submitted more than 260 scripts for online tools that can be used to analyze image datasets, such as the sets of satellite images that will become available as growing numbers of small satellites are launched in the coming years.

Competitors’ approaches to the challenge included machine learning, pure brainpower, and, most often, human brainpower combined with computers to speed up the process. As competitors’ scores began to rise, debate arose in Kaggle’s online forums over man versus machine. Competitors argued over what would lead to success: strict, methodical computer approaches or the human brain and its ability to solve complex puzzles. A central question emerged: Can the human brain reliably uncover trends in persistent imagery faster than machine learning?
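One way to picture the “strict, methodical computer approach” side of that debate is a script that orders a set of images by a single measurable cue. The sketch below is purely illustrative and not drawn from any competitor’s entry; the choice of mean pixel brightness as a proxy for sun angle or season is an assumption made for the example.

```python
# Illustrative only: order a set of images by one scalar cue.
# The brightness feature is an assumed stand-in for whatever cues
# competitors actually engineered; it is not from the contest.
import numpy as np
from PIL import Image

def brightness(path):
    """Mean grayscale intensity of one image file."""
    return np.asarray(Image.open(path).convert("L"), dtype=float).mean()

def machine_order(paths):
    """Return the image paths sorted by the brightness cue,
    i.e. a purely algorithmic guess at chronological order."""
    return sorted(paths, key=brightness)

# Hypothetical usage with one set of five images:
# machine_order(["set1_img1.png", ..., "set1_img5.png"])
```

A single cue like this is exactly the kind of rigid heuristic the forum debate centered on: fast and repeatable, but blind to the subtle feature changes a human eye picks up.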

While many of the 400 posted algorithms indeed sped up processing, none of them beat the human brain’s ability to discover subtle feature changes. Using hand annotations and algorithms, the top three winners developed distinct solutions; they hail from Slovakia, Spain and France. David Duris, the third-place winner, a mathematician in Paris, France, submitted a solution so compelling that Draper has invited him to speak at its Cambridge, Mass., headquarters.

David’s approach involved a mathematical framework for aggregating any number of rules into a ‘possibility space,’ a construct modeling a real-world process. As humans or machines added rules, the possibility space shrank. What remained was reality: in this case, the only possible ordering of each set of images.
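A minimal sketch of that idea, assuming each set holds five images and that rules take the pairwise form “image a was captured before image b” (the names and rule format here are illustrative, not taken from Duris’s actual framework): the possibility space starts as every permutation of the images, and each added rule prunes it.

```python
# Illustrative sketch of a "possibility space" pruned by rules.
# Assumes five images per set and pairwise before/after rules;
# these choices are for the example, not Duris's framework.
from itertools import permutations

def prune(possibility_space, rule):
    """Keep only the orderings consistent with one rule (a before b)."""
    a, b = rule
    return [order for order in possibility_space
            if order.index(a) < order.index(b)]

# Start with every possible chronological ordering of the five images.
space = list(permutations(range(5)))      # 120 candidate orderings

# Rules contributed by humans or algorithms: (a, b) means "a precedes b".
rules = [(0, 2), (2, 1), (1, 4), (4, 3)]  # hypothetical annotations

for rule in rules:
    space = prune(space, rule)

print(space)  # [(0, 2, 1, 4, 3)] -- one ordering left: "reality"
```

Here four rules happen to chain together into a total order, so the space collapses from 120 candidates to exactly one; with fewer or weaker rules, the surviving set would simply be larger.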

The winners, and all participants, contributed to a deeper understanding of how to process and analyze the images. Draper engineers are now working to combine this approach with the human and machine learning techniques developed during the competition into a new technology for better analyzing imagery.

Chronos was hosted on kaggle.com, a platform for predictive modeling and analytics competitions. Contestants had to devise repeatable tools to put the images in chronological order. Chronos was the first Kaggle contest to allow contestants to submit solutions that included both algorithms and hand annotations.

One image in a set of aerial pictures of San Diego, California, taken over months and used in the Chronos Data Science Contest sponsored by Draper and hosted online by Kaggle.