A domain adaptation neural network for change detection with heterogeneous optical and SAR remote sensing images

Francesca Cigna;
2022

Abstract

Change detection from heterogeneous remote sensing sources, combining optical and SAR data and their joint all-time, all-weather observation capability, offers a reliable and promising solution for a wide range of applications. State-of-the-art supervised methods typically adopt a two-stage strategy that suffers from the loss of original image features and the introduction of noise in the transferred images. This paper proposes a domain adaptation-based multi-source change detection network (DA-MSCDNet) suitable for processing heterogeneous optical and SAR images. DA-MSCDNet employs feature-level transformation to align the inconsistent deep feature spaces of heterogeneous data. Feature-space transformation and change detection are bridged within the network to encourage communication between the two tasks. Experiments are conducted on two public datasets, based on Sentinel-1A and Landsat-8 imagery acquired over Sacramento, Yuba, and Sutter Counties (California, USA) and QuickBird-2 and TerraSAR-X imagery over Gloucester (UK), as well as one new large-scale dataset of Sentinel-2 and COSMO-SkyMed imagery over Wuhan (China). Compared with six other supervised and unsupervised approaches, the proposed method achieves the highest performance, with an average precision of 80.81%, recall of 84.39%, mIoU of 73.67%, and F1 score of 82.58%, surpassing the state-of-the-art method by 5.42% in F1 score with a tenfold reduction in training time on the large-scale change detection task.
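
The abstract describes the architecture only at a high level. The following is a minimal, hypothetical PyTorch sketch of the general idea it outlines: a pseudo-Siamese pair of modality-specific encoders (optical and SAR), a feature-alignment term that pulls the two feature spaces together, and a change head trained jointly with that alignment so the two tasks share one network. Layer sizes, the specific alignment loss, and the use of label-derived unchanged pixels are illustrative assumptions, not the DA-MSCDNet implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class Encoder(nn.Module):
        """Small convolutional encoder; in_ch differs per modality (e.g. 3 optical bands, 1 SAR band)."""
        def __init__(self, in_ch, feat_ch=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(in_ch, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(inplace=True),
                nn.Conv2d(32, feat_ch, 3, padding=1), nn.BatchNorm2d(feat_ch), nn.ReLU(inplace=True),
            )

        def forward(self, x):
            return self.net(x)

    class PseudoSiameseCD(nn.Module):
        """Two modality-specific encoders whose feature spaces are aligned, plus a per-pixel change head."""
        def __init__(self, opt_ch=3, sar_ch=1, feat_ch=64):
            super().__init__()
            self.enc_opt = Encoder(opt_ch, feat_ch)
            self.enc_sar = Encoder(sar_ch, feat_ch)
            self.head = nn.Sequential(
                nn.Conv2d(2 * feat_ch, 64, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(64, 2, 1),  # change / no-change logits per pixel
            )

        def forward(self, x_opt, x_sar):
            f_opt = self.enc_opt(x_opt)
            f_sar = self.enc_sar(x_sar)
            logits = self.head(torch.cat([f_opt, f_sar], dim=1))
            return logits, f_opt, f_sar

    def alignment_loss(f_opt, f_sar, unchanged_mask):
        """Pull optical and SAR features together only on (pseudo-)unchanged pixels (illustrative choice)."""
        mask = unchanged_mask.unsqueeze(1).float()            # (B, 1, H, W)
        diff = (f_opt - f_sar) ** 2 * mask
        return diff.sum() / (mask.sum() * f_opt.shape[1] + 1e-6)

    # Joint objective: supervised change loss + feature-space alignment term (weight 0.1 is arbitrary).
    model = PseudoSiameseCD()
    x_opt = torch.randn(2, 3, 64, 64)                         # optical patch (pre-change)
    x_sar = torch.randn(2, 1, 64, 64)                         # SAR patch (post-change)
    labels = torch.randint(0, 2, (2, 64, 64))                 # per-pixel change labels
    logits, f_opt, f_sar = model(x_opt, x_sar)
    loss = F.cross_entropy(logits, labels) + 0.1 * alignment_loss(f_opt, f_sar, labels == 0)
    loss.backward()

Training both terms in a single backward pass is what "bridging" feature-space transformation and change detection means in practice: gradients from the change head shape the aligned feature space, and vice versa.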
Istituto di Scienze dell'Atmosfera e del Clima - ISAC
Heterogeneous change detection
Feature alignment
Siamese network
Domain adaptation
Image fusion
Feature transformation
Satellite imagery
Files in this product:

File: Zhang et al 2022 JAG_opt.pdf (open access)
Type: Published version (PDF)
License: Creative Commons
Size: 2.8 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/444497
Citations
  • PMC: not available
  • Scopus: 31
  • Web of Science (ISI): not available