Santa Fe’s Descartes Labs lands $7.2M contract for satellite imagery project
Descartes Labs is working on the next generation of satellite imagery analysis, the business that brought the relatively young Santa Fe software company commercial success.
Descartes, one of several firms assembled for the project by the Defense Advanced Research Projects Agency, better known as DARPA, won a contract in September worth $7.2 million to work over 18 months on Geospatial Cloud Analytics.
Spun off in 2015 from Los Alamos National Laboratory, the tech firm recently moved into a newer, larger home downtown at North Guadalupe and West Water streets.
Descartes and other firms are working to gather millions of gigabytes of satellite data into a form that Defense Department analysts and “warfighters” can easily process and access, according to Descartes CEO Mark Johnson and DARPA geospatial program manager Joe Evans.
For example, in a video game such as SimCity, players build imaginary, digital communities from the ground up. Geospatial Cloud Analytics would gather satellite data to create models based on real life, Evans said.
The concept sounds like a good idea, said Robert Radigan, associate director of geospatial and population studies at the University of New Mexico.
In some cases, analysts need up-to-date imagery; in other cases, they need imagery of the highest resolution, he said. Having that data compiled in a central repository would make the analysts’ job much easier.
Johnson said the project could yield complex, dynamic models of systems on a planetary scale.
“Part of the idea of having these folks build models is that those models should be useful in building more complicated models,” Johnson said. “So now I’m not just trying to model how much production is happening in the U.S. economy, actually we want to start modeling something like a whole economy.”
DARPA is trying to create a layered, digital view of huge swathes of the planet, rather than the view through a drinking straw currently available, he said. Information sources might range from Google Maps to Landsat images.
“Traditionally, it’s been difficult to bring those different types of data together — maps and radar and other types of data — and access it in one place and then write programs that process it,” Evans said. “What we’re trying to do is knock down the barriers to bring that data together.”
Descartes is one of three firms building a digital platform — a system for collecting and organizing data from multiple satellite sources. On those platforms, other firms are contracted to build models based on that data. Like other DARPA projects, from GPS to the internet, this one could have commercial applications if successful.
“Originally, the program was motivated by all the new activity in the commercial space industry and, in particular, new satellite constellations collecting new data, new types of data,” Evans said Thursday. “It’s really a revolution in the commercial geospatial data industry.”
Competition is built into the DARPA approach, Evans said. By creating three parallel platforms, government agencies have a choice. They can “vote with their feet” and choose the platform that works best for them, he said. The project also calls for computer-driven analysis of those models.
Descartes is one of three companies — DigitalGlobe and BAE Systems are the others — working on data-aggregating platforms. Beyond that, Descartes also has a contract to build analytical models in three areas: food security, fracking and tracking oceangoing vessels, including illegal fishing.
The firm’s work on an earlier DARPA project to monitor wheat production across North Africa and the Middle East helped earn it a place on the cloud analytics project, Evans said. It’s the kind of technical challenge, “DARPA-hard,” that Johnson said the company was created to take on.
“Oh, absolutely,” he said. “At the very, very beginning it was sort of an abstract idea that there’s a lot of data out there, and deriving intelligence from that data is hard. And then it evolved a little bit and we realized that in order to enable modeling of these complex systems, you can’t just take some of the data, you need all of the data.”