AI image processing aboard satellites in space has been a goal of the Earth observation industry for years. Now it has finally been achieved. Planet Labs, based in California, released an image captured by its Pelican-4 multispectral satellite showing an airport in Alice Springs, Australia. More than a dozen aircraft are scattered across the tarmac, each highlighted in a neat green box by an AI model running aboard the satellite.
Planet Labs’ engineers spent 18 months developing reliable autonomous object classification from space. They hope the technology will dramatically accelerate Earth observation, enabling autonomous tasking and real-time sharing of insights with users on the ground.
“The entire remote-sensing industry has been known to put exotic sensors in space,” said Kiruthika Devaraj, vice president of engineering at Planet Labs. “We have very good eyes in space looking at everything that’s going on. But then, we collect so much data and have to wait six to 12 hours to get the information out. So, you’re essentially looking at the past.”
Planet Labs currently operates a constellation of several hundred Dove and SuperDove CubeSats, each only 30 centimeters long. These low-cost space cameras scan the entire surface of Earth multiple times a day at a resolution of around 5 meters. The company is also building up a fleet of 32 larger satellites, called Pelicans, which image the planet’s surface in 30-centimeter detail. The fourth of these, deployed into orbit in 2025, ran the airplane-recognition algorithm.
All of Planet’s satellites combined generate 30 terabytes of data per day, equivalent to 10,000 hours of high-definition video. The data is beamed to the ground for processing and analysis via dozens of ground stations scattered around the world.
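As a quick sanity check on the comparison, 30 terabytes spread across 10,000 hours works out to a bitrate in the typical range for compressed high-definition video. The figures below come straight from the article; only the unit conversions are added:

```python
# Sanity-check the article's data-volume comparison:
# 30 TB/day vs. 10,000 hours of HD video.
DAILY_BYTES = 30e12          # 30 terabytes
VIDEO_HOURS = 10_000

daily_bits = DAILY_BYTES * 8
seconds = VIDEO_HOURS * 3600

implied_bitrate_mbps = daily_bits / seconds / 1e6
print(f"Implied video bitrate: {implied_bitrate_mbps:.1f} Mbit/s")
# Roughly 6.7 Mbit/s, a plausible bitrate for compressed HD video,
# so the two figures are consistent with each other.
```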
Transferring the downlinked data into the cloud for processing and subsequent AI analysis takes hours. That delay could mean a wildfire is noticed only after it has grown too large to contain quickly.
“Minutes matter in some sectors,” Devaraj said. “And real-time insights really enable us to provide answers to problems as they’re unfolding.”
The AI image-recognition algorithms developed by Devaraj and her team analyze a single Pelican image, comprising 16,000 pixels, in half a second using onboard GPUs. The results can reach users within minutes of the moment the image was taken.
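Onboard detectors typically cannot ingest a full satellite scene at once; a common pattern is to tile the scene into fixed-size chips, run the model on each chip, and map detections back into full-scene coordinates. The sketch below illustrates that general pattern only; the tile size, band count, and stub detector are assumptions for illustration, not details of Planet's actual pipeline:

```python
import numpy as np

TILE = 512  # chip size, an assumed value typical for detection models


def tile_scene(img, tile=TILE):
    """Split a large multispectral scene into fixed-size chips, zero-padding the edges."""
    h, w = img.shape[:2]
    padded = np.pad(img, ((0, (-h) % tile), (0, (-w) % tile), (0, 0)))
    chips = []
    for y in range(0, padded.shape[0], tile):
        for x in range(0, padded.shape[1], tile):
            chips.append(((y, x), padded[y:y + tile, x:x + tile]))
    return chips


def detect_stub(chip):
    """Stand-in for the onboard model; returns (y0, x0, y1, x1) boxes in chip coordinates."""
    return [(10, 10, 50, 50)] if chip.mean() > 0 else []


def run_scene(img):
    """Run detection per chip and translate boxes into full-scene coordinates."""
    detections = []
    for (oy, ox), chip in tile_scene(img):
        for (y0, x0, y1, x1) in detect_stub(chip):
            detections.append((oy + y0, ox + x0, oy + y1, ox + x1))
    return detections


scene = np.ones((1024, 1536, 4), dtype=np.uint8)  # toy 4-band scene
boxes = run_scene(scene)
print(f"{len(boxes)} detections")  # one per chip in this toy example
```

Tiling keeps per-inference memory bounded, which matters on power- and memory-constrained hardware like an embedded GPU module.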
So far, only the Pelican satellites are fitted with AI-capable processors—the Nvidia Jetson Orin GPU modules frequently used in autonomous drones. But Devaraj says Planet plans to augment the SuperDove constellation with a new type of satellite, called the Owl. The satellite will provide daily revisits at a higher resolution of up to 1 meter and will also carry Nvidia Jetson processors capable of running AI detection onboard.
The new fleet would enable the company to begin working on…
The post “AI Earth Observation In Space Delivers Live Insight” by Tereza Pultarova was published on 05/01/2026 by spectrum.ieee.org