Publications

Jiang, X. L.; Huang, B. (2022). Unmixing-Based Spatiotemporal Image Fusion Accounting for Complex Land Cover Changes. IEEE Transactions on Geoscience and Remote Sensing, 60, 5623010.

Abstract
Spatiotemporal reflectance fusion has received considerable attention in recent decades. However, despite varying levels of success, several challenges remain, especially in recovering spatial details under complex land cover changes. Taking the blending of Landsat and Moderate Resolution Imaging Spectroradiometer (MODIS) images as an example, this article presents a locally weighted unmixing-based spatiotemporal image fusion model (LWU-STFM) that focuses on recovering complex land cover changes. The core idea is to redefine the land use class of each pixel that undergoes land cover change at the prediction date. The spatial unmixing process is enhanced with a proposed geographically spectrum-weighted regression (GSWR), and the selection of similar neighboring pixels is then optimized for the final weighted prediction. Experiments are conducted on semisimulated and actual time-series Landsat-MODIS datasets to compare the proposed LWU-STFM against the classic spatial and temporal adaptive reflectance fusion model (STARFM), flexible spatiotemporal data fusion (FSDAF), two enhanced FSDAF models (SFSDAF and FSDAF 2.0), and a virtual image pair-based spatiotemporal fusion model with spatial weighting (VIPSTF-SW). The results reveal that LWU-STFM achieves the best quantitative accuracy among the six models: in terms of the relative dimensionless global error (ERGAS) index, the errors of Landsat-like images generated with LWU-STFM are 2.8%-63.4% lower than those of the other models. Visual comparisons show that LWU-STFM predictions deliver encouraging improvements in recovering the spatial details of pixels with complex land cover changes in heterogeneous landscapes, thus advancing applications of spatiotemporal image fusion for continuous, fine-scale land surface monitoring.
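
For context, two standard formulations underpin the abstract; neither is reproduced from the article itself. First, unmixing-based fusion methods such as LWU-STFM conventionally invert a linear spectral mixing model within a moving window (the GSWR weighting is the article's contribution and is not detailed here); a generic statement of that model is

R(B) \approx \sum_{c=1}^{C} f_c \, \bar{r}_c(B), \qquad \sum_{c=1}^{C} f_c = 1, \quad f_c \ge 0,

where R(B) is the coarse (MODIS) pixel reflectance in band B, f_c is the fraction of land cover class c inside the coarse pixel, and \bar{r}_c(B) is the mean fine-scale (Landsat-like) reflectance of class c, estimated by (weighted) least squares. Second, the ERGAS index cited above is, in its standard definition (Wald),

\mathrm{ERGAS} = 100 \, \frac{h}{l} \sqrt{\frac{1}{N} \sum_{k=1}^{N} \frac{\mathrm{RMSE}_k^{2}}{\mu_k^{2}}},

where h/l is the ratio of fine to coarse pixel sizes, N is the number of bands, \mathrm{RMSE}_k is the root-mean-square error of band k, and \mu_k is the mean of reference band k; lower values indicate better fusion quality.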

DOI:
10.1109/TGRS.2022.3173172

ISSN:
1558-0644