A flexible multi-temporal and multi-modal framework for Sentinel-1 and Sentinel-2 analysis ready data

Research output: Contribution to journal › Article › peer-review



The rich, complementary data provided by the Sentinel-1 and Sentinel-2 satellite constellations hold considerable potential to transform Earth observation (EO) applications. However, substantial effort and infrastructure are still required to generate analysis-ready data (ARD) from the low-level products provided by the European Space Agency (ESA). Here, a flexible Python framework is detailed that generates a range of consistent ARD aligned with the ESA-recommended processing pipeline. Sentinel-1 Synthetic Aperture Radar (SAR) data are radiometrically calibrated, speckle-filtered and terrain-corrected, and Sentinel-2 multi-spectral data are resampled to harmonise the spatial resolution between the two streams and to allow stacking with multiple scene classification masks. The global coverage and flexibility of the framework allow users to define a specific region of interest (ROI) and time window to create geo-referenced Sentinel-1 and Sentinel-2 images, or a combination of both with the closest temporal alignment. The framework can be applied to any location and is user-centric and versatile in generating multi-modal and multi-temporal ARD. Finally, the framework automatically handles the inherent challenges in processing Sentinel data, such as boundary regions with missing values within Sentinel-1 and the filtering of Sentinel-2 scenes based on ROI cloud coverage.
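The closest-temporal-alignment step described above can be sketched as a simple nearest-in-time pairing between the two acquisition streams. This is a minimal illustration only, not the paper's implementation; the function name `pair_closest` and the example dates are hypothetical.

```python
from datetime import datetime

def pair_closest(s1_times, s2_times):
    """For each Sentinel-1 acquisition time, find the Sentinel-2
    acquisition closest in time and return the matched pairs."""
    pairs = []
    for t1 in s1_times:
        # Pick the Sentinel-2 time with the smallest absolute offset.
        t2 = min(s2_times, key=lambda t: abs((t - t1).total_seconds()))
        pairs.append((t1, t2))
    return pairs

# Hypothetical acquisition dates over one ROI.
s1 = [datetime(2021, 6, 1), datetime(2021, 6, 13)]
s2 = [datetime(2021, 5, 30), datetime(2021, 6, 4), datetime(2021, 6, 14)]
for t1, t2 in pair_closest(s1, s2):
    print(t1.date(), "->", t2.date())
```

A real pipeline would pair full scene records (orbit, tile, footprint) rather than bare timestamps, and would typically also enforce a maximum allowed time gap between the paired acquisitions.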
Original language: English
Article number: 1120
Number of pages: 20
Journal: Remote Sensing
Issue number: 5
Early online date: 24 Feb 2022
Publication status: Published - 24 Feb 2022


  • Sentinel-1
  • Sentinel-2
  • analysis ready data
  • multi-modal
  • multi-temporal

