
Multi-scale feature learning via residual dynamic graph convolutional network for hyperspectral image classification

Vivone, Gemine
2024

Abstract

Thanks to their rich spectral-spatial information, hyperspectral images (HSIs) have been widely exploited in Earth observation. Recently, graph convolutional networks (GCNs) have attracted increasing attention in HSI classification due to their advantages in processing non-Euclidean structured data. Unlike convolutional neural networks (CNNs), which perform convolution operations on regular square regions, GCNs can work directly on graph-structured data to extract the relationships among adjacent land covers. However, extracting meaningful and deeply discriminative spectral-spatial features from HSIs is still a challenging task. In this article, a novel residual dynamic graph convolutional network for multi-scale feature learning is designed for HSI classification, which can extract large-scale contextual spatial structures at the superpixel level and local spectral-spatial information at the pixel level, significantly improving the performance of HSI classification. Unlike existing GCN-based methods that operate on a graph with a fixed neighbourhood size, multiple graphs with diverse neighbourhood scales are built to comprehensively leverage spectral-spatial information and relationships at multiple scales, and these graphs are dynamically updated during the convolution process to generate more discriminative features (via dynamic GCN). Moreover, to fully use the multi-scale features extracted from HSIs, a multi-scale feature fusion module is developed to emphasize important features and suppress irrelevant ones. Extensive experiments carried out on three benchmark data sets demonstrate the superiority of the proposed approach over other state-of-the-art methods.
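To make the graph-convolution step the abstract refers to concrete, the sketch below implements the standard GCN propagation rule (symmetric normalization of the adjacency with self-loops, followed by a linear transform and ReLU). This is an illustrative toy in NumPy, not the paper's residual dynamic multi-scale architecture; the function name, graph, and dimensions are all made up for the example.

```python
import numpy as np

def gcn_layer(adj, feats, weight):
    """One GCN propagation step: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W).

    Illustrative sketch only; the paper's method builds multiple
    superpixel-level graphs and updates them dynamically, which is
    not reproduced here.
    """
    a_hat = adj + np.eye(adj.shape[0])                         # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))              # D^-1/2 diagonal
    norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]   # symmetric normalization
    return np.maximum(norm @ feats @ weight, 0.0)              # aggregate, transform, ReLU

# Toy graph: 4 nodes (e.g. superpixels) on a chain, 3 spectral features each.
rng = np.random.default_rng(0)
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = rng.normal(size=(4, 3))
weight = rng.normal(size=(3, 2))
out = gcn_layer(adj, feats, weight)
print(out.shape)  # (4, 2): one 2-dim feature vector per node
```

Operating on graphs of this form, rather than fixed square windows, is what lets the method aggregate information across irregularly shaped adjacent land covers.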
Istituto di Metodologie per l'Analisi Ambientale - IMAA
Graph convolutional network (GCN)
Hyperspectral image classification
Convolutional neural networks
Multi-resolution analysis
Remote sensing
Files in this product:
interactcadsample_GV.pdf — 2.18 MB, Adobe PDF (authorized users only)
License: NOT PUBLIC - private/restricted access

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/526225
Citations
  • PMC: not available
  • Scopus: not available
  • Web of Science: not available