Fully Transformer Network for Remote Sensing Image Change Detection

Abstract: Change Detection (CD) is a crucial task in remote sensing, aiming to identify significant changes between multi-temporal images of the same area. While deep learning has shown promise in CD, challenges remain in fully utilizing semantic information, preserving boundary details, and incorporating temporal information. This paper proposes a novel Fully Transformer Network (FTN) for remote sensing image CD. FTN leverages the power of Transformers for global feature extraction and introduces a pyramid structure with a Progressive Attention Module (PAM) to refine boundary perception. Additionally, a deeply-supervised learning strategy with multiple boundary-aware loss functions is employed to address irregular boundary issues. Extensive experiments on four public CD benchmarks demonstrate that FTN outperforms state-of-the-art methods, highlighting its effectiveness in remote sensing image CD.

Introduction: Change detection (CD) in remote sensing images plays a vital role in various applications, including land-use monitoring, disaster management, and urban planning. The task involves identifying 'changed' regions within images acquired at different times over the same geographical area. Traditional methods often struggle with the complexities of high-resolution imagery and the need for accurate boundary delineation.

Deep learning, particularly Convolutional Neural Networks (CNNs), has revolutionized remote sensing image analysis. However, CNNs often fall short in capturing global context and precisely segmenting changed regions.

This paper introduces a novel Fully Transformer Network (FTN) designed to address these limitations. FTN leverages the strengths of Transformers, known for their ability to capture long-range dependencies, for effective global feature extraction. Furthermore, a pyramid structure incorporating a Progressive Attention Module (PAM) enhances the network's capacity to discern intricate boundaries.
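The paper does not include implementation details, but the long-range dependency modeling that motivates the Transformer backbone can be illustrated with a minimal scaled dot-product self-attention sketch. This is a generic illustration in NumPy, not FTN's actual encoder; the projection matrices and token layout here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(feats, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a flattened feature map.

    feats: (n_tokens, d) array -- each token is one spatial location.
    w_q, w_k, w_v: (d, d) projection matrices (hypothetical, randomly
    initialized here).  Every output token attends to every input token,
    which is how a Transformer captures global (long-range) context that
    a fixed-size convolution kernel cannot.
    """
    q, k, v = feats @ w_q, feats @ w_k, feats @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])   # (n_tokens, n_tokens)
    attn = softmax(scores, axis=-1)           # each row sums to 1
    return attn @ v

rng = np.random.default_rng(0)
d = 8
tokens = rng.standard_normal((16, d))         # e.g. a 4x4 grid of patch features
out = self_attention(tokens, *(rng.standard_normal((d, d)) for _ in range(3)))
print(out.shape)  # (16, 8): one globally-contextualized vector per location
```

Because the attention matrix is dense over all token pairs, each spatial location can aggregate evidence from the entire image, which is what the paper credits for improved global feature extraction over CNNs.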

Key Contributions: This work makes the following significant contributions:

- Novel Framework: Proposes FTN, a novel learning framework for remote sensing image CD, integrating Transformers for global feature representation and a pyramid structure with PAM for refined boundary perception.
- Enhanced Boundary Awareness: Introduces a pyramid structure with PAM to improve feature representation and address the challenge of irregular boundaries often encountered in CD.
- Deeply-Supervised Learning: Utilizes a deeply-supervised learning strategy with multiple boundary-aware loss functions to further enhance the network's ability to accurately delineate changed regions.
- State-of-the-Art Performance: Comprehensive experiments on four publicly available CD benchmarks demonstrate that FTN achieves superior performance compared to existing state-of-the-art methods.
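The deeply-supervised strategy can be sketched as a per-level loss summed over the pyramid's intermediate predictions. The specific loss terms below (binary cross-entropy plus a soft Dice term as a stand-in for the boundary-aware component) and the level weights are assumptions for illustration; the paper's exact loss formulation may differ.

```python
import numpy as np

def bce(pred, target, eps=1e-7):
    # Binary cross-entropy averaged over pixels.
    p = np.clip(pred, eps, 1 - eps)
    return float(-(target * np.log(p) + (1 - target) * np.log(1 - p)).mean())

def dice_loss(pred, target, eps=1e-7):
    # Soft Dice loss: penalizes region-overlap errors, which concentrate
    # at object boundaries -- used here as a stand-in boundary-aware term.
    inter = (pred * target).sum()
    return float(1 - (2 * inter + eps) / (pred.sum() + target.sum() + eps))

def deeply_supervised_loss(preds, target, weights=None):
    """Sum a (BCE + Dice) loss over every pyramid level.

    preds: list of predicted change maps at different decoder levels, each
           already resized to ground-truth resolution, values in [0, 1].
    weights: optional per-level weights (hypothetical; uniform by default).
    """
    weights = weights or [1.0] * len(preds)
    return sum(w * (bce(p, target) + dice_loss(p, target))
               for w, p in zip(weights, preds))

rng = np.random.default_rng(0)
gt = (rng.random((32, 32)) > 0.5).astype(float)          # toy change mask
preds = [np.clip(gt + 0.1 * rng.standard_normal(gt.shape), 0, 1)
         for _ in range(3)]                              # 3 pyramid levels
loss = deeply_supervised_loss(preds, gt)
```

Supervising every level, rather than only the final output, gives the shallow, high-resolution stages a direct gradient signal, which is the mechanism behind the sharper boundary delineation claimed above.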

**Conclusion:** The proposed FTN framework presents a significant advancement in remote sensing image change detection. By effectively capturing global context, refining boundary details, and incorporating a robust training strategy, FTN achieves impressive results on benchmark datasets. Future work will explore the application of FTN to other remote sensing tasks and investigate further improvements in computational efficiency.

