Publications

Tan, ZY; Gao, ML; Yuan, J; Jiang, LC; Duan, HT (2022). A Robust Model for MODIS and Landsat Image Fusion Considering Input Noise. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 60, 5407217.

Abstract
Significant progress has been made in spatiotemporal fusion for remote sensing images; however, most models require inputs to be free of clouds and without missing data, considerably limiting their practical application. Recent advances in deep learning provide powerful modeling capabilities that can be leveraged to address this problem. This article proposes a novel architecture named the robust spatiotemporal fusion network (RSFN), based on a generative adversarial network and an attention mechanism with dual temporal references, to automatically handle input noise. The RSFN needs only one coarse-resolution image on the prediction date and two fine-resolution reference images before and after the prediction date as model inputs. Most notably, no special restriction is placed on the data quality of the reference images. Comparisons with other models demonstrate the effectiveness of the RSFN quantitatively and visually in four study areas using MODIS and Landsat images. Two main conclusions can be drawn from the experiments. First, input data noise hardly affects the prediction results of the RSFN, which achieves comparable or even higher accuracy, whereas the other methods show only limited resistance to input noise. Second, the RSFN with cloud-contaminated references outperforms the other models with cloud-free references after data filtering in the same study area during the same period. Because satellite data quality usually varies significantly, model robustness and fault tolerance are critical for real-world applications. The RSFN is a simple end-to-end deep model with high accuracy and fault tolerance designed for spatiotemporal fusion with imperfect data inputs, showing promising prospects in practical applications.
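
The abstract specifies the RSFN's input-output contract: one coarse-resolution image on the prediction date plus two fine-resolution reference images bracketing that date, producing a fine-resolution prediction. The sketch below is purely illustrative and is not taken from the paper; the module name DualReferenceFusionSketch, the band count, the layer sizes, and the simple attention-style weighting are all assumptions intended only to show how such dual-reference inputs might be wired together in PyTorch.

# Hypothetical sketch of the dual-reference fusion contract described in the
# abstract. Names, channel counts, and layer choices are assumptions, not the
# authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DualReferenceFusionSketch(nn.Module):
    """Toy generator: fuses one coarse image at the prediction date with two
    fine-resolution reference images acquired before and after that date."""

    def __init__(self, bands: int = 6, width: int = 32):
        super().__init__()
        # Three inputs stacked along the channel axis: upsampled coarse + 2 references.
        self.encode = nn.Sequential(
            nn.Conv2d(3 * bands, width, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(width, width, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Per-pixel weights deciding how much to trust each reference; a stand-in
        # for the paper's attention mechanism, whose actual design is not shown here.
        self.attention = nn.Conv2d(width, 2, kernel_size=1)
        self.decode = nn.Conv2d(width + bands, bands, kernel_size=3, padding=1)

    def forward(self, coarse_t, fine_before, fine_after):
        # Bring the coarse image (e.g., MODIS) up to the fine (e.g., Landsat) grid.
        coarse_up = F.interpolate(
            coarse_t, size=fine_before.shape[-2:], mode="bilinear", align_corners=False
        )
        feats = self.encode(torch.cat([coarse_up, fine_before, fine_after], dim=1))
        w = torch.softmax(self.attention(feats), dim=1)  # weights sum to 1 per pixel
        blended = w[:, 0:1] * fine_before + w[:, 1:2] * fine_after
        # Residual-style correction of the blended references toward the target date.
        return blended + self.decode(torch.cat([feats, coarse_up], dim=1))


if __name__ == "__main__":
    bands = 6
    model = DualReferenceFusionSketch(bands=bands)
    coarse = torch.randn(1, bands, 30, 30)     # coarse image on the prediction date
    before = torch.randn(1, bands, 240, 240)   # fine reference before the date
    after = torch.randn(1, bands, 240, 240)    # fine reference after the date
    print(model(coarse, before, after).shape)  # torch.Size([1, 6, 240, 240])

In the paper's setting, the generator would be trained adversarially against a discriminator and the references may be cloud-contaminated; none of that machinery is reproduced in this sketch.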

DOI:
10.1109/TGRS.2022.3145086

ISSN:
1558-0644