Publications

Guo, DZ; Li, ZH; Gao, X; Gao, ML; Yu, C; Zhang, CL; Shi, WZ (2025). RealFusion: A reliable deep learning-based spatiotemporal fusion framework for generating seamless fine-resolution imagery. REMOTE SENSING OF ENVIRONMENT, 321, 114689.

Abstract
Spatiotemporal fusion of multisource remote sensing data offers a viable way to achieve precise and dynamic Earth monitoring. However, existing methods struggle to fuse data reliably in two commonly occurring yet complex scenarios: drastic surface changes, such as those caused by natural disasters and human activities, and poor image quality caused by thick cloud cover, cloud shadows, haze, and noise. To address these challenges, this study proposes a Reliable deep learning-based spatiotemporal Fusion framework (RealFusion), designed to blend Landsat and MODIS imagery to generate daily, seamless Landsat-like imagery. RealFusion enhances fusion reliability through several advancements: (1) integrating diverse input data with complementary information, (2) implementing task-decoupled architectures, (3) developing advanced restoration and fusion networks, (4) adopting an adaptive training strategy, and (5) establishing a comprehensive accuracy assessment framework. Extensive experiments, comprising 25 trials in three distinct areas, demonstrate that RealFusion outperforms four recently proposed methods (the Object-Level Hybrid SpatioTemporal Fusion Method, OLHSTFM; the Enhanced Deep Convolutional Spatiotemporal Fusion Network, EDCSTFN; the Generative Adversarial Network-based SpatioTemporal Fusion Model, GAN-STFM; and Multilevel Feature Fusion with Generative Adversarial Network, MLFF-GAN). Notably, RealFusion is the only model in these experiments that robustly and accurately reconstructs information in areas with drastic surface changes and poor image quality. RealFusion thus enables the reliable reconstruction of high-quality images in complex scenarios, marking a meaningful advancement in spatiotemporal fusion techniques.
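
The abstract's task-decoupled design (restore degraded fine-resolution inputs first, then fuse temporal change from the coarse sensor) can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the function names, the nearest-neighbour upsampling, and the simple gap-filling and change-propagation stand-ins are not RealFusion's method, which uses trained deep restoration and fusion networks for both stages.

```python
# Minimal sketch of a two-stage restore-then-fuse pipeline (illustrative only;
# RealFusion replaces both stages with learned deep networks).
import numpy as np

def upsample(coarse: np.ndarray, factor: int) -> np.ndarray:
    """Nearest-neighbour upsampling of a coarse (MODIS-like) image to the fine grid."""
    return np.kron(coarse, np.ones((factor, factor)))

def restore(fine: np.ndarray, mask: np.ndarray, coarse_up: np.ndarray) -> np.ndarray:
    """Stage 1 (restoration): fill cloud/shadow gaps in the fine image.
    Stand-in logic: copy upsampled coarse values into masked pixels."""
    out = fine.copy()
    out[mask] = coarse_up[mask]
    return out

def fuse(fine_t1: np.ndarray, coarse_up_t1: np.ndarray, coarse_up_t2: np.ndarray) -> np.ndarray:
    """Stage 2 (fusion): propagate the coarse-scale temporal change (t1 -> t2)
    onto the restored fine image to predict the fine image at t2."""
    return fine_t1 + (coarse_up_t2 - coarse_up_t1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    factor = 16                            # hypothetical fine/coarse resolution ratio
    coarse_t1 = rng.random((4, 4))
    coarse_t2 = coarse_t1 + 0.1            # simulated surface change between dates
    fine_t1 = upsample(coarse_t1, factor) + 0.02 * rng.standard_normal((64, 64))
    cloud = np.zeros((64, 64), dtype=bool)
    cloud[10:30, 20:40] = True             # simulated thick-cloud gap in the fine image

    restored = restore(fine_t1, cloud, upsample(coarse_t1, factor))
    predicted_t2 = fuse(restored, upsample(coarse_t1, factor), upsample(coarse_t2, factor))
    print(predicted_t2.shape)              # (64, 64): seamless fine-resolution prediction
```

Decoupling the two tasks, as the abstract describes, lets each stage handle one failure mode: restoration addresses poor image quality, while fusion handles temporal change between acquisition dates.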

DOI:
10.1016/j.rse.2025.114689

ISSN:
1879-0704