The Importance of Knowing Google Dynamic World: An Undiscovered Gem in the EO Industry

Matthias Sammer
3 min read · Feb 11, 2023


It is quite surprising how many people within the Earth Observation (EO) industry are not aware of Google Dynamic World (at least among the subset of people I have talked to). It is an application launched by Google that has flown under the radar, without creating much noise or buzz in the industry.

The Dynamic World app computes class probabilities for land use/land cover in near real-time from Sentinel-2 Level-2A products. Each pixel is classified as one of nine classes: water, trees, grass, crops, flooded vegetation, shrub and scrub, built area, bare ground, or snow and ice. The app rests on a fully convolutional neural network, which essentially transforms the optical bands into a discrete probability distribution over these classes. The labels used to train the model were generated by experts (4,000 images) and non-experts (20,000 images), with the expert labels used for training and for measuring the accuracy of the non-expert labels.
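If you want a quick look at the underlying data yourself, the collection is published in the Earth Engine catalog as GOOGLE/DYNAMICWORLD/V1. Here is a minimal sketch (it assumes an already authenticated Earth Engine session; the setup is covered below) that prints the per-class probability bands:

import ee

ee.Initialize()  # assumes you have already authenticated (see below)

# The collection carries one probability band per class, plus a 'label'
# band holding the index (0-8) of the most likely class
dw = ee.ImageCollection('GOOGLE/DYNAMICWORLD/V1')
first = dw.filterDate('2022-01-01', '2022-02-01').first()
print(first.bandNames().getInfo())
# ['water', 'trees', 'grass', 'flooded_vegetation', 'crops',
#  'shrub_and_scrub', 'built', 'bare', 'snow_and_ice', 'label']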

In short, Google Dynamic World has the potential to become the go-to near-real-time LULC classification product. Actually, it already is one, but classification robustness over time is still an issue (from my point of view). But let's skip that discussion for now.

So, why is it that this tool has not gained more recognition and popularity within the EO community? It could be due to Google’s quiet launch of the application or the industry’s lack of exposure to it. Regardless, the fact remains that Google Dynamic World is a valuable resource that should not be overlooked.

If you are keen to find out more, here is a quick guide on how to download Dynamic World data for an AOI to your machine using Python. Let's get started!

AOI

Prerequisite: we have a GeoJSON file that defines our AOI as a relatively complex polygon.
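If you don't have an AOI file at hand, here is a hypothetical stand-in: a minimal one-polygon FeatureCollection written to disk with the geojson library (the coordinates and the aoi.geojson path are made up for illustration):

import geojson

# Hypothetical AOI: a small triangle (made-up coordinates)
aoi = geojson.FeatureCollection([
    geojson.Feature(geometry=geojson.Polygon([[
        (13.3, 52.4), (13.6, 52.4), (13.5, 52.6), (13.3, 52.4)
    ]]))
])

path_to_geojson = 'aoi.geojson'
with open(path_to_geojson, 'w') as f:
    geojson.dump(aoi, f)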

The first thing we have to do is compute a rectangular bounding box that covers the entire polygon.

import geojson
import numpy as np

def get_bounding_box(geometry):
    # Collect every vertex and return (xmin, xmax, ymin, ymax)
    coords = np.array(list(geojson.utils.coords(geometry)))
    return coords[:, 0].min(), coords[:, 0].max(), coords[:, 1].min(), coords[:, 1].max()

with open(path_to_geojson) as igj:
    data = geojson.load(igj)

geoms = [feat["geometry"] for feat in data["features"]]

# get_bounding_box returns (xmin, xmax, ymin, ymax), while Earth Engine
# expects the corners as (xmin, ymin, xmax, ymax) — simply sorting the
# values does not do the trick, so reorder them explicitly
xmin, xmax, ymin, ymax = get_bounding_box(geoms)
bbox = [xmin, ymin, xmax, ymax]

Initialize Earth Engine API

Now that we have the bounding box, we can start downloading the data. Let's start by loading the required libraries.

import ee
import numpy as np
import geemap

First, we need to authorize access to Earth Engine via OAuth2:

ee.Authenticate()

Then we initialize the Earth Engine API:

ee.Initialize()

Finally, we transform the bounding box into a format suitable for the Earth Engine API.

poly = ee.Geometry.Rectangle(bbox)

Note that the authentication flow is easiest to complete in a Jupyter Notebook, and it only needs to be done once per machine. You also need a Google Earth Engine account (check this source for further information).
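As an aside, you are not limited to a rectangle here: ee.Geometry also accepts a GeoJSON geometry dictionary directly, so you could use the exact polygon as the region instead. A sketch, reusing the geoms list from the bounding-box step:

# Alternative: pass the exact polygon instead of its bounding box
poly = ee.Geometry(geoms[0])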

Downloading Time Series Data

With everything in place, we can request a monthly Dynamic World time series for 2022 and export each image to Google Drive:

start_date = '2022-01-01'
end_date = '2022-12-01'
frequency = 'month'
return_type = 'class'

# Build a monthly Dynamic World time series over the AOI
images_time_series = geemap.dynamic_world_timeseries(
    poly, start_date, end_date, frequency=frequency, return_type=return_type
)
images_time_series_list = images_time_series.toList(images_time_series.size())

for i in range(images_time_series_list.size().getInfo()):
    img = ee.Image(images_time_series_list.get(i))

    # Not every image carries the 'system:date' property, so fall back
    # to the raw 'system:time_start' timestamp if it is missing
    try:
        img_date = img.getInfo()['properties']['system:date']
        postfix = '_class'
    except KeyError:
        img_date = str(img.getInfo()['properties']['system:time_start'])
        postfix = '_hillshade'

    # Export each scene to Google Drive as a GeoTIFF at 10 m resolution
    geemap.ee_export_image_to_drive(
        img, description=img_date + postfix, folder='googledynamic',
        region=poly, scale=10
    )

That's it! Each export now runs as a task on Google's servers, and every scene of your AOI will be saved to the googledynamic folder in your Google Drive as a .tif file. (You can monitor the tasks in the Earth Engine Code Editor's Tasks tab.)
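If your AOI is small, you could also skip the Drive round-trip and fetch each scene straight to your machine with geemap.ee_export_image. A sketch (note that Earth Engine caps the size of direct downloads, so this will fail for large areas):

# Drop-in replacement for the export call inside the loop above:
# saves the scene directly to the local working directory
geemap.ee_export_image(
    img, filename=img_date + postfix + '.tif', region=poly, scale=10
)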

If you found this interesting or would like to share your thoughts, drop me a message in the comment section below or via LinkedIn.

Stay tuned,

this is Matthias
