Publications

Liu, Q; Meng, XC; Shao, F; Li, ST (2022). PSTAF-GAN: Progressive Spatio-Temporal Attention Fusion Method Based on Generative Adversarial Network. IEEE Transactions on Geoscience and Remote Sensing, 60, 5408513.

Abstract
Spatio-temporal fusion aims to integrate multisource remote sensing images with complementary high spatial and high temporal resolutions, so as to obtain time-series fused images at high spatial resolution. Deep learning (DL)-based spatio-temporal fusion methods have recently received broad attention. However, on the one hand, most existing DL-based methods train the model band by band, ignoring the correlations among bands. On the other hand, the coarse spatio-temporal changes computed in the pixel domain from low spatial resolution images (e.g., MODIS) cannot fully cover the fine spatio-temporal changes in high spatial resolution images (e.g., Landsat), owing to complex surface features and the generally large spatial-resolution ratio between fine and coarse images. Moreover, existing DL-based spatio-temporal fusion methods explore multiscale information insufficiently, merely stacking convolutional kernels of different sizes. To alleviate these challenges, we propose a progressive spatio-temporal attention fusion model trained in a multiband manner and based on a generative adversarial network (PSTAF-GAN). Specifically, we design a flexible multiscale feature extraction architecture to extract multiscale feature hierarchies. Spatio-temporal changes are then computed in the feature domain at each hierarchy level. In addition, a spatio-temporal attention fusion architecture is proposed to fuse the spatio-temporal changes and ground details in a coarse-to-fine manner, which exploits multiscale information more thoroughly and gradually recovers the target image. Quantitative and qualitative experiments on two publicly available benchmark datasets show that the proposed PSTAF-GAN achieves the best performance compared with state-of-the-art methods.
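The coarse-to-fine fusion idea described in the abstract can be illustrated with a toy NumPy sketch. This is a hypothetical illustration, not the authors' implementation: `extract_features` stands in for the paper's learned multiscale feature extractor (here, simple average pooling), and `attention_fuse` replaces the learned attention modules with a fixed sigmoid weighting. The key points it mirrors are that temporal changes are computed in the feature domain at each hierarchy level, and the prediction is refined progressively from the coarsest level up.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(img, n_scales=3):
    """Toy multiscale hierarchy: 2x2 average-pool the image at
    successively coarser scales (stand-in for a learned extractor)."""
    feats = [img]
    for _ in range(n_scales - 1):
        h, w = feats[-1].shape
        pooled = feats[-1].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        feats.append(pooled)
    return feats

def attention_fuse(fine_feat, change, detail):
    """Toy attention fusion: weight the temporal change by a sigmoid
    attention map derived from the ground-detail features."""
    attn = 1.0 / (1.0 + np.exp(-detail))   # attention weights in (0, 1)
    return fine_feat + attn * change       # inject the weighted change

# Fine (Landsat-like) image at t1 and a coarse (MODIS-like) pair at t1/t2.
fine_t1   = rng.random((8, 8))
coarse_t1 = rng.random((8, 8))
coarse_t2 = coarse_t1 + 0.1               # uniform temporal change

# Spatio-temporal change computed in the feature domain, per level.
f_fine  = extract_features(fine_t1)
f_c1    = extract_features(coarse_t1)
f_c2    = extract_features(coarse_t2)
changes = [a - b for a, b in zip(f_c2, f_c1)]

# Coarse-to-fine: start at the coarsest level, upsample, and refine.
pred = attention_fuse(f_fine[-1], changes[-1], f_fine[-1])
for lvl in range(len(f_fine) - 2, -1, -1):
    up   = np.kron(pred, np.ones((2, 2)))  # nearest-neighbor upsample
    pred = 0.5 * up + 0.5 * attention_fuse(f_fine[lvl], changes[lvl],
                                           f_fine[lvl])

print(pred.shape)  # (8, 8): the fused prediction at fine resolution
```

In the actual PSTAF-GAN, the pooling, attention maps, and fusion weights are all learned, and a GAN discriminator supervises the generator; this sketch only shows the data flow of progressive feature-domain fusion.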

DOI:
10.1109/TGRS.2022.3161563

ISSN:
1558-0644