Publications

Shang, C; Li, XY; Yin, ZX; Li, XD; Wang, LH; Zhang, YH; Du, Y; Ling, F (2022). Spatiotemporal Reflectance Fusion Using a Generative Adversarial Network. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 60, 5400915.

Abstract
Spatiotemporal reflectance fusion blends images with high temporal but low spatial resolution with low-temporal, high-spatial-resolution counterparts previously acquired by different satellite sensors. A wide variety of learning-based solutions have recently been developed, but challenges remain. These solutions usually require two sets of data acquired before and after the prediction time, making them unsuitable for near-real-time prediction. They are also typically trained band by band and thus ignore the correlation between spectral bands. Moreover, the network structures used make it difficult to reconstruct high-resolution temporal changes accurately, which lowers the accuracy of the fused result. To address these problems, this study proposes a novel spatiotemporal adaptive reflectance fusion model using a generative adversarial network (GASTFN). In GASTFN, an end-to-end network consisting of a generative network and a discriminative network is trained simultaneously for all spectral bands. The proposed model can be applied to the one-pair case, accounts for the spectral correlation among bands, and improves the super-resolution process by applying the discriminative network to image reflectance values rather than to temporal changes in reflectance. The model was verified on two real satellite data sets acquired over heterogeneous landscapes and areas with abrupt changes and compared with state-of-the-art methods. The results show that GASTFN generates the most accurate fusion images, with more detailed textures and more realistic spatial shapes, demonstrating that it is effective for near-real-time prediction of changes in image reflectance while preserving the most valuable spatial information.
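The abstract gives no implementation details, but the design it describes (a generator that fuses one prior fine/coarse image pair with the coarse image at the prediction time, and a discriminator applied to reflectance values rather than to temporal changes, with all spectral bands trained jointly) can be sketched as below. This is a minimal illustrative sketch in PyTorch: the layer choices, loss terms, loss weighting, and all names (Generator, Discriminator, train_step, BANDS) are assumptions for illustration, not the architecture published in the paper.

    # Illustrative sketch of a GAN-based spatiotemporal fusion training step.
    # All architectural choices here are assumptions, not the paper's GASTFN.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    BANDS = 6  # assumed number of reflectance bands (e.g., Landsat-like)

    class Generator(nn.Module):
        """Maps (coarse image at t2, coarse image at t1, fine image at t1)
        to a predicted fine image at t2, processing all bands jointly."""
        def __init__(self, bands=BANDS):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3 * bands, 64, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(64, bands, 3, padding=1),
            )
        def forward(self, coarse_t2, coarse_t1, fine_t1):
            return self.net(torch.cat([coarse_t2, coarse_t1, fine_t1], dim=1))

    class Discriminator(nn.Module):
        """Scores reflectance images directly (real fine image vs. fused one),
        rather than scoring temporal changes in reflectance."""
        def __init__(self, bands=BANDS):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(bands, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
                nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2, inplace=True),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, 1),
            )
        def forward(self, x):
            return self.net(x)

    def train_step(G, D, opt_g, opt_d, coarse_t1, fine_t1, coarse_t2, fine_t2,
                   adv_weight=1e-3):
        bce = nn.BCEWithLogitsLoss()
        # Discriminator update: real fine image vs. detached generated image.
        fake = G(coarse_t2, coarse_t1, fine_t1).detach()
        real_logit, fake_logit = D(fine_t2), D(fake)
        d_loss = (bce(real_logit, torch.ones_like(real_logit)) +
                  bce(fake_logit, torch.zeros_like(fake_logit)))
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()
        # Generator update: L1 content loss plus a small adversarial term
        # (the weighting is an assumption).
        pred = G(coarse_t2, coarse_t1, fine_t1)
        g_logit = D(pred)
        g_loss = (F.l1_loss(pred, fine_t2) +
                  adv_weight * bce(g_logit, torch.ones_like(g_logit)))
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()
        return d_loss.item(), g_loss.item()

    # Toy usage with random 64x64 patches; real training would iterate
    # over co-registered coarse/fine image pairs.
    G, D = Generator(), Discriminator()
    opt_g = torch.optim.Adam(G.parameters(), lr=1e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-4)
    x = lambda: torch.rand(2, BANDS, 64, 64)
    train_step(G, D, opt_g, opt_d, x(), x(), x(), x())

Note that at prediction time only the generator is needed, and it consumes a single prior fine/coarse pair plus the coarse image at the prediction time, matching the one-pair, near-real-time use case described in the abstract.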

DOI:
10.1109/TGRS.2021.3065418

ISSN:
1558-0644