📆 Project Period | April – July 2025
👤 CIN Visiting Researcher
Project Summary
Tip and Cue is a novel Earth observation strategy in which multiple space assets collaborate to enhance observation. In its canonical form, it involves two satellites: a tip satellite, equipped with a wide field-of-view sensor, scans for anomalies and tasks a second, cueing satellite, equipped with a high-resolution, low-swath imager, to zoom in.
This project aims to lay the foundation for integrating AI into this concept by developing mission simulation software capable of modelling the distortions caused by off-nadir acquisitions, together with benchmarks that demonstrate its potential on a whale-detection use case.
Figure 1: The Tip and Cue strategy with a wide field-of-view satellite (Tip) on the right, followed by a complementary high spatial resolution sensor (Cue) on the left.
The main outcomes of the collaboration are:
- Development of software to model geometric and radiometric effects at off-nadir viewing angles.
- Generation of a Cue satellite dataset under off-nadir conditions using a whale-detection use case.
- Fine-tuning of a deep learning-based detection model on the custom dataset.
- Assessment of how radiance model settings impact AI detection performance.
- Establishment of a performance benchmark for AI-based Tip and Cue systems.
- Integration of radiance modelling, onboard AI, and orbital simulation into a single framework.
- Simulation of Tip and Cue missions to determine the maximum feasible off-nadir angle.
- Comparative evaluation of different satellite configurations and cueing delays.
Development Tools
- Mitsuba: physics-based rendering engine to model radiance and geometric effects at off-nadir viewing angles.
- SMARTS (Simple Model of the Atmospheric Radiative Transfer of Sunshine): to simulate incident solar spectral irradiance based on solar geometry, atmosphere, and surface characteristics.
- PASEOS: to simulate satellite orbits and propagate satellite positions using orbital elements.
- Linux/Bash: for executing model training workflows on remote systems.
- GitHub: for version control and collaborative development.
- VS Code / LaTeX: for technical writing and documentation.
Development Outputs
- Ongoing: The first Tip and Cue mission simulation software that integrates orbital modelling, off-nadir effects, and onboard AI performance evaluation.
- Ongoing: A dataset of off-nadir satellite images, including sun glint and geometric distortions, generated from actual orbital geometry and physics-based rendering.
- Planned: A publication outlining the prior work, performance trade-offs, and application framework of AI-based Tip and Cue systems, and providing a full description of the proposed benchmark.
- Ongoing: Public release of the full mission simulation software, datasets, and benchmarks as open-source repositories. A first version is available at GitHub - AI-Based Tip and Cue.
Project Description
Introduction
Earth observation systems typically face trade-offs between spatial resolution, revisit time, and coverage. The Tip and Cue strategy can overcome these limitations: one satellite (Tip) with a wide field of view detects anomalies, while a second satellite (Cue) follows up with targeted high-resolution imaging. Traditional Tip and Cue approaches rely on ground-based processing and manual intervention, which introduces latency; integrating onboard AI can significantly reduce this latency and improve autonomy.
However, current concepts exploring AI-based Tip and Cue overlook the effects of off-nadir imaging, i.e., geometric and radiometric distortions that can significantly degrade detection performance. For example, an increased ground sampling distance or sun glint can alter pixel intensity and perspective, as illustrated in Figure 2. This project addresses that gap by introducing off-nadir effects into a full mission simulation and evaluating how they affect onboard AI. The main objective is to establish an AI-based Tip and Cue simulation and benchmarking framework to evaluate performance across different satellite configurations, orbital parameters, and AI models.
Figure 2: Simulated off-nadir effects showing a) a change in viewing perspective and thus an increased ground sampling distance, and b) sun glint effects forming a glittery path on the ocean. Satellite image © 2022 Maxar Technologies.
Source: a) Maxar Technologies, Whales from Space dataset, b) Joseph A. Shaw and Michael Vollmer
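To give a rough sense of the geometric effect, a flat-Earth approximation (an illustrative assumption, not the model used in the rendering pipeline) predicts that the ground sampling distance grows roughly with 1/cos(θ) across the look direction and 1/cos²(θ) along it, where θ is the off-nadir angle:

```python
import math

def gsd_off_nadir(gsd_nadir_m: float, off_nadir_deg: float) -> tuple:
    """Approximate GSD growth at an off-nadir angle (flat-Earth assumption).

    The slant range grows as 1/cos(theta); in the look (tilt) direction the
    ground footprint of a pixel stretches by another 1/cos(theta), so the GSD
    scales as 1/cos(theta)**2 along the look direction and 1/cos(theta)
    across it.
    """
    theta = math.radians(off_nadir_deg)
    gsd_across_look = gsd_nadir_m / math.cos(theta)
    gsd_along_look = gsd_nadir_m / math.cos(theta) ** 2
    return gsd_across_look, gsd_along_look

# Example: a 0.31 m nadir-GSD sensor (WorldView-3 class) tilted 30 deg off nadir.
print(gsd_off_nadir(0.31, 30.0))  # roughly (0.36, 0.41) metres
```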
Off-Nadir Effects
The first stage focused on modelling the geometric and radiometric distortions caused by off-nadir viewing angles. A rendering pipeline was developed to transform satellite images to off-nadir perspectives based on viewing geometry and lighting conditions. A synthetic Digital Elevation Model (DEM) of ocean surface waves was generated, as shown in Figure 3, and loaded into Mitsuba, a physically based rendering engine, to account for varying surface angles and wave heights. Incident solar spectral irradiance was calculated using SMARTS (Simple Model of the Atmospheric Radiative Transfer of Sunshine), considering atmospheric and surface parameters.
Figure 3: A synthetically generated Digital Elevation Model (DEM) based on a combination of sine and cosine functions, with a) its amplitude plot and b) the resulting 3D .obj file, used to simulate wave patterns.
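As a minimal sketch of this step, assuming illustrative grid size, amplitude, and wavelength values (the project's actual parameters are not reproduced here), the wave height field can be built as a superposition of sine and cosine terms on a regular grid and later triangulated into an .obj mesh for Mitsuba:

```python
import numpy as np

def synthetic_wave_dem(nx=256, ny=256, extent_m=100.0,
                       amplitude_m=0.5, wavelength_m=12.0):
    """Synthetic ocean-wave height field as a sum of sine and cosine terms.

    Grid size, amplitude, and wavelength are illustrative values only.
    Returns the X/Y coordinate grids and the height field Z in metres.
    """
    x = np.linspace(0.0, extent_m, nx)
    y = np.linspace(0.0, extent_m, ny)
    X, Y = np.meshgrid(x, y)
    k = 2.0 * np.pi / wavelength_m
    # Two crossing wave trains plus a longer swell component.
    Z = (amplitude_m * np.sin(k * X)
         + 0.5 * amplitude_m * np.cos(k * (0.6 * X + 0.8 * Y))
         + 0.3 * amplitude_m * np.sin(0.25 * k * Y))
    return X, Y, Z

X, Y, Z = synthetic_wave_dem()
print(Z.shape, float(Z.min()), float(Z.max()))
```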
Rendering was then performed for different off-nadir angles, with the Sun and satellite positions taken from orbital simulations. A Bidirectional Reflectance Distribution Function (BRDF) was applied to simulate sun glint. For each pixel, the incoming and outgoing angles, material roughness, and spectral wavelength were used to compute the additional radiance contributed by sun glint. After rendering separate RGB channels and converting radiance values into digital numbers, the sun glint component was added back to the geometrically transformed image. With that, a Cue satellite image at an off-nadir perspective could be generated; the result is shown in Figure 4.
Figure 4: Artificially generated sun glint image with a) the additional computed solar radiance component, b) original image, c) combined result. Satellite image © 2022 Maxar Technologies.
Source: b) and c) Maxar Technologies, Whales from Space dataset
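A minimal sketch of the final compositing step, assuming a simple linear radiometric calibration with illustrative gain, offset, and bit-depth values (not calibrated sensor coefficients): the rendered glint radiance is converted to digital numbers and added to the geometrically transformed image.

```python
import numpy as np

def radiance_to_dn(radiance, gain=0.05, offset=0.0, bit_depth=11):
    """Invert a linear calibration L = gain * DN + offset to get digital numbers.

    The gain, offset, and bit depth are illustrative placeholders, not
    calibrated sensor coefficients.
    """
    dn = (radiance - offset) / gain
    return np.clip(np.round(dn), 0, 2 ** bit_depth - 1).astype(np.uint16)

def add_glint(image_dn, glint_radiance, gain=0.05, bit_depth=11):
    """Add a rendered sun-glint radiance component to an existing DN image."""
    glint_dn = radiance_to_dn(glint_radiance, gain=gain, bit_depth=bit_depth)
    combined = image_dn.astype(np.int32) + glint_dn.astype(np.int32)
    return np.clip(combined, 0, 2 ** bit_depth - 1).astype(np.uint16)

# Example with random arrays standing in for one rendered channel.
rng = np.random.default_rng(0)
base = rng.integers(0, 600, size=(256, 256), dtype=np.uint16)
glint = rng.random((256, 256)) * 5.0  # assumed spectral radiance of the glint lobe
out = add_glint(base, glint)
print(out.dtype, int(out.max()))
```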
Onboard AI
To evaluate detection performance under realistic conditions, a dataset of off-nadir whale images is being generated with the rendering pipeline. After converting the dataset to COCO format, the DEIM-RT-DETR detection model, which uses a transformer-based architecture, will be fine-tuned on the custom data. The goal is to evaluate how the accuracy of the radiance modelling affects detection performance and to determine the maximum off-nadir angle at which AI detection remains reliable.
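A minimal sketch of the conversion step, with hypothetical file names, box coordinates, and a single "whale" category (the dataset's actual annotation schema may differ): each rendered tile and its bounding boxes are written into the standard COCO dictionary layout.

```python
import json

def to_coco(samples, out_path="whales_off_nadir_coco.json"):
    """Write (file_name, width, height, boxes) tuples to a COCO-format file.

    Boxes are [x, y, w, h] in pixels; 'whale' is the single object category.
    File names and coordinates below are hypothetical.
    """
    coco = {"images": [], "annotations": [],
            "categories": [{"id": 1, "name": "whale"}]}
    ann_id = 1
    for img_id, (file_name, width, height, boxes) in enumerate(samples, start=1):
        coco["images"].append({"id": img_id, "file_name": file_name,
                               "width": width, "height": height})
        for x, y, w, h in boxes:
            coco["annotations"].append({"id": ann_id, "image_id": img_id,
                                        "category_id": 1, "bbox": [x, y, w, h],
                                        "area": w * h, "iscrowd": 0})
            ann_id += 1
    with open(out_path, "w") as f:
        json.dump(coco, f)

# Hypothetical example: one rendered off-nadir tile containing a single whale.
to_coco([("tile_0001_offnadir30.png", 1024, 1024, [(412, 655, 38, 21)])])
```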
Simulation and Integration
The second part of the project, currently nearing completion, integrates orbital simulations, off-nadir rendering, and onboard AI processing into a single simulation environment. A baseline scenario was defined: a 700 km Sun-synchronous orbit (SSO) with 98.19° inclination, based on Sentinel-2 (Tip) and WorldView-3 (Cue) characteristics. For each timestep (a geometric feasibility sketch follows the list):
- Orbits of both satellites are propagated using classical orbital elements.
- The Tip satellite checks whether an Area of Interest (AOI) is within its footprint.
- If detected, the Cue satellite attempts to image the same AOI after a configurable time delay for processing and camera stabilization.
- Whale movement is modelled using a randomized swimming speed and heading.
- Cueing feasibility is checked based on off-nadir angle and target position.
- If successful, the off-nadir image is generated and processed by the detection AI model.
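As a minimal sketch of the cueing-feasibility check, assuming a spherical Earth, ECEF position vectors, and an illustrative 30° slew limit (the PASEOS-based propagation itself is not reproduced here), the off-nadir angle is the angle between the satellite's nadir direction and the line of sight to the target:

```python
import numpy as np

R_EARTH_M = 6_371_000.0  # mean Earth radius, spherical approximation

def off_nadir_angle_deg(sat_ecef_m, target_ecef_m):
    """Angle between the satellite's nadir direction and the line of sight
    to the target, both given as ECEF position vectors in metres."""
    sat = np.asarray(sat_ecef_m, dtype=float)
    tgt = np.asarray(target_ecef_m, dtype=float)
    nadir = -sat / np.linalg.norm(sat)             # points towards Earth's centre
    los = (tgt - sat) / np.linalg.norm(tgt - sat)  # line of sight to the target
    return float(np.degrees(np.arccos(np.clip(np.dot(nadir, los), -1.0, 1.0))))

def cueing_feasible(sat_ecef_m, target_ecef_m, max_off_nadir_deg=30.0):
    """Per-timestep feasibility check; the 30 deg slew limit is illustrative."""
    return off_nadir_angle_deg(sat_ecef_m, target_ecef_m) <= max_off_nadir_deg

# Example: Cue satellite at 700 km altitude over the equator, AOI 3 deg away.
sat = np.array([R_EARTH_M + 700_000.0, 0.0, 0.0])
aoi = R_EARTH_M * np.array([np.cos(np.radians(3.0)), np.sin(np.radians(3.0)), 0.0])
print(round(off_nadir_angle_deg(sat, aoi), 1), cueing_feasible(sat, aoi))
```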
All modules (geometric transformation, radiation effects rendering, satellite propagation, and AI evaluation) are implemented in Python and will be integrated into one coherent simulation.
Benchmarking
A benchmark will be defined to evaluate the full Tip and Cue system, accounting for the following performance metrics:
- AI model performance: mean average precision (mAP), precision, recall, F1-score.
- Coverage: number of unique AOIs cued per day.
- Observation time per target: how long targets remain under observation.
- Latency: time between Tip detection and Cue acquisition.
- Observation efficiency: the number of satellites utilized.
Building on these metrics, AI-based Tip and Cue performance can be quantified in two ways: one directly quantitative, the other oriented toward practical mission design.
- The number of valid events correctly handled per day and per satellite, where a valid event is a complete Tip - Cue - AI detection sequence (see the sketch after this list).
- The number of satellites that can be saved relative to a baseline constellation while maintaining a threshold level of performance.
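As a sketch of the first metric (function name and inputs are illustrative), the score is simply the count of complete Tip - Cue - AI detection sequences normalised by simulation duration and constellation size:

```python
def valid_events_per_day_per_satellite(n_valid_events: int,
                                       sim_duration_days: float,
                                       n_satellites: int) -> float:
    """Complete Tip - Cue - AI detection sequences handled per day and per
    satellite; the inputs below are illustrative placeholders."""
    return n_valid_events / (sim_duration_days * n_satellites)

# Example: 42 valid events over a 7-day simulation with a Tip/Cue pair.
print(valid_events_per_day_per_satellite(42, 7.0, 2))  # -> 3.0
```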
Different experiments will be conducted to evaluate:
- Impact of varying time delays (5 to 30 minutes) between Tip and Cue.
- Effect of orbit altitude on Field of View, latency, and observation time.
- Multi-satellite scenarios with one or more Cue satellites or full T&C constellations.
Application Framework
To select the target use case for this study, i.e., whale detection, a rigorous analysis was conducted to identify and prioritise potential applications of AI-based Tip and Cue. More specifically, we developed a structured application framework based on stakeholder type (governmental, institutional, commercial) and thematic category (security, environment, industry, disaster response, and space). This resulted in over fifty use cases, ranging from illegal fishing and deforestation to rip current detection, space debris monitoring, and wind turbine inspection, as highlighted in Figure 5. Whale detection was prioritised given the uncertainty in target location and the requirement for high-resolution imaging. Each category reflects different performance requirements, for example low latency for search and rescue, or high spatial resolution for infrastructure monitoring. The framework is planned to be published as part of our journal publication.
Figure 5: Examples of satellite imagery for various applications with a) harmful algal bloom detection, b) rip current detection, c) solar event prediction, d) marine litter detection, e) lightning research, f) wildlife monitoring, g) archaeological site detection, h) wind turbine inspection.
Source: a) NASA/USGS Landsat-8, b) Google Earth, collected by A. de Silva, et al., c) ESA & NASA/Solar Orbiter/EUI team; Data processing: E. Kraaikamp (ROB), d) Marine Debris Dataset from PlanetScope Imagery, e) DTU Space, Mount Visual / Daniel Schmelling, f) WorldView-4, Maxar Technologies, g) Father Antoine Poidebard, h) Sentinel 2-B; Data processing: Equinor
Future Steps
- Verify the off-nadir radiance modelling software and generate the Cue satellite dataset for whale detection.
- Fine-tune the DEIM-RT-DETR detection model on the custom off-nadir dataset.
- Formalize the performance benchmark for AI-based Tip and Cue systems.
- Integrate the radiance modelling, onboard AI, and orbital simulation into a unified simulation framework.
- Run mission simulations to determine the maximum feasible off-nadir angle and compare performance across satellite configurations.
- Release the full mission simulation software, datasets, and performance benchmark as open-source repositories on the GitHub page, and publish the simulation results.
With that, this work aims to provide a technical foundation for future AI-based Tip and Cue missions and to enhance Earth observation system capabilities.