Google DeepMind has debuted AlphaEarth Foundations, an AI model that treats Earth like a living dataset, tracking crop cycles, coastlines, urban expansion, melting ice, and much, much more. AlphaEarth weaves together disparate data streams, from satellite imagery and sensor data to geotagged Wikipedia entries, into a unified digital representation that scientists can probe to uncover patterns unfolding worldwide.
AlphaEarth produces a 64-dimensional “embedding” for every 10-by-10-meter cell of the planet annually from 2017 to 2024, capturing both the raw imagery and the relationships in the underlying data. An embedding is a dense numeric summary of a place’s key features, making locations directly comparable. This approach cuts storage needs sixteenfold while preserving fine spatial and temporal detail. Altogether, the system generates over 1.4 trillion embeddings per year.
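To make “directly comparable” concrete, here is a minimal sketch of how two embedded locations could be scored against each other with cosine similarity. The 64-dimensional vectors below are randomly generated placeholders, not actual AlphaEarth output, and the code uses only NumPy rather than any Google API.

```python
import numpy as np

# Placeholder stand-ins for the annual embeddings of two 10-by-10-meter cells.
# AlphaEarth embeddings are 64-dimensional; these values are random, for illustration only.
rng = np.random.default_rng(seed=42)
embedding_a = rng.normal(size=64)
embedding_b = rng.normal(size=64)

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Return a similarity score in [-1, 1]; values near 1 suggest similar surface conditions."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(f"similarity between the two cells: {cosine_similarity(embedding_a, embedding_b):.3f}")
```

Comparing a reference cell’s embedding against other cells in this way is one simple route to finding locations with similar surface conditions without inspecting raw imagery.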
Detailed snapshots of year-round surface conditions will prove valuable in a wide range of fields, including planetary analysis, urban planning, ecosystem tracking, wildlife conservation, and wildfire risk management.
Digital Embeddings of Earth
A key challenge in building the model was handling the messy sprawl of geospatial data itself. Traditional satellites capture large volumes of information-rich images and measurements that can be difficult to connect and efficiently analyze.
The AlphaEarth Foundations team told IEEE Spectrum that one limitation in Earth observation is the inherent irregularity and sparsity of the data. Unlike a continuous video feed, satellite data is a collection of intermittent snapshots with frequent gaps caused by factors like persistent cloud cover.
To ensure consistent performance, the model needed a wide net of training data: a global sample of images covering more than 5 million locations, acquired from the Google Earth Engine public data catalog and including optical imagery, radar, climate models, topographic maps, lidar, gravitational field strength, and surface temperature measurements. To enrich the dataset, the team also incorporated Wikipedia articles on landmarks and other features.
That diversity makes the model’s representations richly detailed yet broad enough to remain relevant across different regions and scientific tasks. In Ecuador, for example, embeddings let analysts see through persistent cloud cover, revealing agricultural plots at various stages of development.
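As a contrast with those cloud-penetrating embeddings, here is a hedged sketch of what working with the raw catalog can look like: using the Earth Engine Python client to hunt for the least cloudy Sentinel-2 scene over a point near Quito. The dataset ID, coordinates, and date range are illustrative choices, not details from the AlphaEarth work, and running it requires an authenticated Earth Engine account.

```python
import ee

# Requires prior authentication (ee.Authenticate()) and a registered Cloud project.
ee.Initialize()

# Arbitrary illustrative point (longitude, latitude) near Quito, Ecuador.
point = ee.Geometry.Point([-78.5, -0.2])

# Sentinel-2 surface-reflectance collection from the Earth Engine public data catalog.
collection = (
    ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
    .filterBounds(point)
    .filterDate("2023-01-01", "2023-12-31")
    .sort("CLOUDY_PIXEL_PERCENTAGE")  # put the least cloudy scenes first
)

# Print the ID of the least cloudy scene found for that point and year.
least_cloudy = collection.first()
print(least_cloudy.get("system:index").getInfo())
```

With optical imagery, an analyst has to search for usable, low-cloud scenes like this; the appeal of a unified embedding is that such gaps are already reconciled across sensors and time.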
“Given we were aiming to integrate this data into a unified digital representation to provide scientists with a more complete and consistent picture of our planet’s evolution, we had to grapple with petabytes of multi-source, multi-resolution imagery and other geospatial datasets,” says Chris Brown, a senior research engineer at Google DeepMind.
The team first had to get data pipelines and modeling infrastructure to a place where working on petabyte scales was feasible. “We prioritized respecting the nuances of geospatial data, such as…
Read full article: Google DeepMind’s AlphaEarth Tracks Earth’s Changes

The post “Google DeepMind’s AlphaEarth Tracks Earth’s Changes” by Shannon Cuthrell was published on 09/15/2025 by spectrum.ieee.org