Efficient Mamba-Attention Network for Remote Sensing Image Super-Resolution
Vivone, Gemine
2025
Abstract
Lightweight remote sensing image super-resolution (RSISR) methods aim to reconstruct remote sensing images (RSIs) while reducing computational complexity. Previous lightweight model development has primarily focused on the design of convolutional neural networks (CNNs). While CNNs excel at capturing local features, they are limited in establishing long-range dependencies. Mamba, as a model for long-range modeling, has linear computational complexity, making it a viable option for lightweight models. Based on these considerations, this article proposes an efficient Mamba-attention network (EMAN) that can efficiently capture both the intricate details and the broader semantic information in RSIs. Specifically, we designed a multiscale detail extraction unit (MDEU) and a multidimensional Mamba-attention (MDMA) module. In MDEU, we introduced a multiscale mechanism and local variance to focus on structural information in RSIs. In MDMA, we integrated spatial expansion and an atrous-based selective scan mechanism to design an efficient scanning method. This method preserves the lightweight nature of the model while establishing global correlations. Additionally, MDMA establishes interchannel correlations to enhance information exchange. We conducted a comprehensive evaluation of the proposed method on two remote sensing datasets and five benchmark super-resolution (SR) datasets. Extensive experiments demonstrate that our method achieves superior performance while maintaining a model complexity similar to that of other lightweight models.
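The abstract states that MDEU uses local variance to emphasize structural information. The exact formulation is not given here, but the underlying statistic is the standard windowed variance E[x²] − E[x]², which peaks at edges and textures and is near zero in flat regions. The sketch below illustrates that idea only; the window size `k` and the box-filter implementation are illustrative assumptions, not the authors' design.

```python
import numpy as np

def local_variance(img, k=3):
    """Per-pixel local variance over a k x k window, computed as
    E[x^2] - E[x]^2 (illustrative sketch, not the paper's MDEU).
    High values mark edges/texture, i.e., structural information."""
    pad = k // 2
    x = np.pad(img.astype(np.float64), pad, mode="reflect")
    h, w = img.shape
    mean = np.empty((h, w))
    mean_sq = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            win = x[i:i + k, j:j + k]       # k x k neighborhood of pixel (i, j)
            mean[i, j] = win.mean()
            mean_sq[i, j] = (win ** 2).mean()
    return mean_sq - mean ** 2

img = np.zeros((8, 8))
img[:, 4:] = 1.0                            # vertical step edge
v = local_variance(img)
# v is ~0 in the flat halves and positive along the edge columns
```

A map like `v` can serve as a spatial weight that steers the network's capacity toward structured regions, which is consistent with the abstract's stated motivation.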

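The atrous-based selective scan in MDMA is described only at a high level in the abstract. One plausible reading is that a dilated (atrous) sub-sampling splits the H×W pixel grid into d² interleaved sub-sequences, so each Mamba selective-scan pass processes a sequence d² times shorter while the sub-sequences together still cover the whole image, retaining global context. The sketch below shows that index construction only; the function name and dilation `d` are hypothetical, not the authors' implementation.

```python
import numpy as np

def atrous_scan_order(h, w, d=2):
    """Split an h x w grid into d*d interleaved scan sub-sequences
    (illustrative sketch of an atrous/dilated scan, not the paper's code).
    Each sub-sequence samples every d-th pixel in both dimensions, so a
    selective-scan pass sees a d*d-times shorter sequence, while the
    union of sub-sequences covers every pixel."""
    idx = np.arange(h * w).reshape(h, w)    # flat index of each pixel
    return [idx[r::d, c::d].ravel()         # one dilated sub-grid per offset
            for r in range(d) for c in range(d)]

orders = atrous_scan_order(4, 4, d=2)
# 4 sub-sequences of length 4; together they index all 16 pixels exactly once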

