Publications

Sun, H. X.; Xiao, W. (2022). Similarity Weight Learning: A New Spatial and Temporal Satellite Image Fusion Framework. IEEE Transactions on Geoscience and Remote Sensing, 60, 5408617.

Abstract
Spatiotemporal fusion is a topical framework for resolving the trade-off between the spatial and temporal resolution of satellite images. We pioneer an approach that replaces the similarity measurement steps in spatiotemporal fusion algorithms with convolutional neural networks (CNNs), building a bridge between weight-function-based models and learning-based models. Specifically, we propose a nonlocal form that separates the relational computation from the value representation, and we construct a CNN-based similarity weight learning block that learns normalized weights. The block can be inserted into the spatial and temporal adaptive reflectance fusion model (STARFM) to replace the manually designed weight calculation rules common in weight-function-based methods, or into the CNN model StfNet to better exploit neighboring high-resolution images. The trained model outputs a high-resolution prediction from each base-date image pair, and the final result combines the two predictions; to this end, we propose standard deviation-based weights for merging the two prediction results. Four experiments are performed on Landsat and Moderate Resolution Imaging Spectroradiometer (MODIS) image pairs to assess: 1) the performance of the model on the target training date; 2) the generalization of the model within the target training period; and 3) the generalization of the model to different dates and different geographical locations, each considering the cases in which one or two pairs of known images are given. Experimental results demonstrate the superiority of the similarity weight learning block and the standard deviation-based weights. In particular, STARFM with the similarity weight learning block exhibits strong generalization, which testifies to the practical value of our model.
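The abstract describes two mechanisms: a nonlocal similarity weight learning block that separates the relational computation (embedded similarity) from the value representation, and a standard deviation-based rule for merging the two base-date predictions. The sketch below is a minimal, illustrative PyTorch reading of those ideas, not the authors' released implementation: the names SimilarityWeightBlock, embed_dim, window, and combine_by_std are hypothetical, the unfold-based neighborhood handling is an assumption, and the inverse-standard-deviation combination is only one plausible interpretation of "standard deviation-based weights."

```python
# Illustrative sketch only; see caveats in the paragraph above.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimilarityWeightBlock(nn.Module):
    """Learns normalized similarity weights over a local window, separating
    the relational computation (query/key embeddings) from the value
    representation, in the spirit of a nonlocal block."""

    def __init__(self, in_channels: int, embed_dim: int = 16, window: int = 5):
        super().__init__()
        self.window = window
        # Relational part: 1x1 convs embed pixels before similarity is computed.
        self.query = nn.Conv2d(in_channels, embed_dim, kernel_size=1)
        self.key = nn.Conv2d(in_channels, embed_dim, kernel_size=1)
        # Value part: kept separate from the relational computation.
        self.value = nn.Conv2d(in_channels, in_channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        k, pad = self.window, self.window // 2
        q = self.query(x)                                  # (b, e, h, w)
        key = self.key(x)                                  # (b, e, h, w)
        val = self.value(x)                                # (b, c, h, w)
        # Gather each pixel's k*k neighborhood for keys and values.
        key_n = F.unfold(key, k, padding=pad).view(b, -1, k * k, h * w)
        val_n = F.unfold(val, k, padding=pad).view(b, c, k * k, h * w)
        q = q.view(b, -1, 1, h * w)
        # Similarity logits per neighbor, normalized to weights by softmax;
        # this is the learned stand-in for a hand-crafted weight function
        # such as STARFM's.
        logits = (q * key_n).sum(dim=1)                    # (b, k*k, h*w)
        weights = torch.softmax(logits, dim=1)
        out = (weights.unsqueeze(1) * val_n).sum(dim=2)    # (b, c, h*w)
        return out.view(b, c, h, w)

def combine_by_std(pred1, pred2, std1, std2, eps=1e-8):
    """Hypothetical standard deviation-based combination: each base-date
    prediction is weighted inversely by an associated standard deviation
    map. The paper's exact statistic may differ."""
    w1 = 1.0 / (std1 + eps)
    w2 = 1.0 / (std2 + eps)
    return (w1 * pred1 + w2 * pred2) / (w1 + w2)
```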

DOI:
10.1109/TGRS.2022.3161070

ISSN:
1558-0644