An Uncompromised View: Fusing Satellite Data Streams for Continuous Spaceborne Insights

By David Potere

Satellite imagery forces a tradeoff between quality and frequency. Either you get a low-resolution, continuous look at the ground, or a high-resolution, isolated snapshot – but not both, given the limitations of different satellite constellations. Imagine how disruptive this tradeoff would be in any other experience. You can, say, have your music one of two ways: complete in its runtime, but with every instrument and singer muffled and distorted; or a couple of crisp seconds of sound for every verse, chorus, and bridge. Forget having an opinion on a whole album – try making sense of a single song!

Nowhere is this tradeoff felt as strongly as in agricultural remote sensing. To understand the progress and health of crops across a field, a farm, or an entire region, we need a daily, detailed view of the ground. The Moderate Resolution Imaging Spectroradiometer (MODIS) provides the first half of that equation, offering “daily” imagery. But its spatial resolution (at best 250 meters) is too coarse for field-scale work. On the other hand, the Harmonized Landsat Sentinel-2 (HLS) product provides fine enough spatial resolution (30 meters) for detailed looks at crop health, soil moisture, and irrigation activity, but it only delivers observations on a weekly basis (or less often, once clouds and shadows are accounted for). Growers rely on frequent updates for proactive decision-making, especially during the growing season, which makes this tradeoff unacceptable.

[Image: side-by-side NDVI tiles from MODIS and HLS]

Imagery from MODIS (left) and HLS (right). The metric shown is plant greenness measured through NDVI, the Normalized Difference Vegetation Index, which correlates with the overall biomass and health of the crops within each field.
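
For readers new to the index, NDVI is a simple ratio of the near-infrared and red reflectance bands, (NIR - Red) / (NIR + Red), which ranges from -1 to 1, with dense, healthy vegetation pushing toward 1. Below is a minimal sketch in Python; the array names and synthetic reflectance values are illustrative, not taken from our pipeline.

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype("float32")
    red = red.astype("float32")
    denom = nir + red
    # Mask pixels with a zero denominator (e.g., nodata) instead of dividing by it.
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(denom != 0, (nir - red) / denom, np.nan)

# Synthetic surface-reflectance values on a 0-1 scale.
nir_band = np.array([[0.45, 0.50], [0.30, 0.60]], dtype="float32")
red_band = np.array([[0.08, 0.10], [0.20, 0.05]], dtype="float32")
print(ndvi(nir_band, red_band))  # High values (above ~0.6) suggest vigorous vegetation.
```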

The GeoInnovation team knew that we had to resolve this. So we began a “data fusion” effort of sorts: synthesizing data from three satellite constellations (MODIS, Landsat, and Sentinel-2) to generate a single stream of information. The data would be detailed enough to know what’s playing out on the ground, and delivered frequently enough to track changes. After all of the modeling, we arrived at a spaceborne timelapse without the tradeoff; in the video below, you can see the 2018 growing season play out just south of Burley, Idaho. Look closely for the center-pivot irrigation activity, crop green-up, and then harvest – all of it captured from 400 miles or more above the Earth’s surface.
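
The post does not detail the fusion model itself, and the real pipeline is certainly more involved, but the core idea behind many spatiotemporal fusion approaches (STARFM is a well-known published example) can be sketched simply: keep the spatial detail of the most recent clear HLS scene, and carry it forward in time using the day-to-day change observed in the coarse, daily MODIS signal. The toy sketch below assumes NDVI grids as NumPy arrays and an integer resampling factor; the function name, array names, and nearest-neighbor upsampling are all illustrative assumptions, not Indigo’s method.

```python
import numpy as np

def fuse_daily(hls_fine: np.ndarray,
               modis_on_hls_date: np.ndarray,
               modis_today: np.ndarray,
               scale: int = 8) -> np.ndarray:
    """Toy spatiotemporal fusion: propagate a fine-resolution HLS snapshot
    forward in time using the change seen in coarse, daily MODIS pixels.

    hls_fine           -- last clear HLS NDVI scene (fine grid, H*scale x W*scale)
    modis_on_hls_date  -- MODIS NDVI coincident with that HLS scene (coarse grid, H x W)
    modis_today        -- MODIS NDVI for today (coarse grid, H x W)
    scale              -- number of fine pixels spanning one coarse pixel
    """
    # Temporal change observed at coarse resolution since the HLS acquisition.
    delta_coarse = (modis_today - modis_on_hls_date).astype("float32")
    # Nearest-neighbor upsample that change onto the fine grid.
    delta_fine = np.kron(delta_coarse, np.ones((scale, scale), dtype="float32"))
    # Add the coarse-scale change to the detailed snapshot, clipping to the NDVI range.
    return np.clip(hls_fine + delta_fine, -1.0, 1.0)
```

Published fusion methods go further, weighting each coarse pixel’s contribution by spectral and spatial similarity rather than applying it uniformly, and handling cloud masks, sensor differences, and the non-integer ratio between the 250-meter and 30-meter grids.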

The “data fusion” project will not stop here: the team is exploring additional remote sensing and weather inputs that could give an even more complete look at the ground. Our field-scale yield modeling, which we have been running for the U.S., Brazil, and Argentina over the past year at Indigo, will improve with this work. Stay tuned for the latest off our Atlas lab bench as we build out the capabilities of Indigo’s living map of the world’s food system.


Across analytics, engineering, and agronomics, learn more about how you can join GeoInnovation.
Or, just explore other open positions at Indigo.