
Abstract

Cloud cover remains a significant limitation for a broad range of applications relying on optical remote sensing imagery, including crop identification and yield prediction, climate monitoring, and land cover classification. A common approach to cloud removal treats the problem as an inpainting task and imputes optical data in the cloud-affected regions, either by mosaicing historical data or by drawing on sensing modalities unaffected by cloud obstruction, such as synthetic aperture radar (SAR). Recently, deep learning approaches have been explored for these applications; however, the majority of reported solutions rely on external learning, i.e., models trained on fixed datasets. Although such models perform well within the context of a particular dataset, they carry a significant risk of spatial and temporal overfitting when applied in different locations or at different times. Here, cloud removal was implemented within an internal learning regime through an inpainting technique based on the deep image prior. The approach was evaluated on both a synthetic dataset with exact ground truth and real samples. The ability to inpaint the cloud-affected regions under varying weather conditions across a whole year, with no prior training, was demonstrated, and the performance of the approach was characterised.
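The internal-learning idea described above can be sketched in a few lines: a small convolutional network is fitted to a single image, with the loss computed only on cloud-free pixels, so the network's architectural prior alone regularises the fill-in under the cloud mask. The snippet below is a toy illustration assuming PyTorch and random stand-in data; the network size, optimiser settings, and iteration count are illustrative assumptions, not the authors' published configuration.

```python
# Minimal deep-image-prior (DIP) inpainting sketch (toy data, not the paper's setup).
import torch
import torch.nn as nn

torch.manual_seed(0)

H = W = 32
y = torch.rand(1, 3, H, W)            # stand-in "observed" optical image
mask = torch.ones(1, 1, H, W)
mask[..., 10:22, 10:22] = 0.0          # 0 marks the cloud-affected region

# Small ConvNet f_theta; DIP fits it to this one image, with no external training set.
net = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1), nn.Sigmoid(),
)
z = torch.rand(1, 3, H, W)             # fixed random input code
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

losses = []
for step in range(200):
    opt.zero_grad()
    out = net(z)
    # Loss is computed only on cloud-free pixels; the convolutional
    # architecture itself regularises the reconstruction under the mask.
    loss = ((mask * (out - y)) ** 2).mean()
    loss.backward()
    opt.step()
    losses.append(loss.item())

# Composite: keep observed pixels, take the network output under the cloud mask.
inpainted = mask * y + (1 - mask) * net(z).detach()
```

Because the optimisation is run per image, the method involves no pretraining and hence no dataset-specific overfitting, at the cost of a full fit at inference time.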
Original language: English
Article number: 1342
Number of pages: 18
Journal: Remote Sensing
Volume: 14
Issue number: 6
Early online date: 10 Mar 2022
DOIs
Publication status: Published - 10 Mar 2022

Keywords

  • cloud removal
  • Sentinel-1
  • Sentinel-2
  • deep image prior
  • internal learning
  • image inpainting

Title: Deep internal learning for inpainting of cloud-affected regions in satellite imagery