Publications

Zhang, Y; Yang, YB; Pan, X; Ding, Y; Hu, J; Dai, Y (2023). Multiinformation Fusion Network for Mapping Gapless All-Sky Land Surface Temperature Using Thermal Infrared and Reanalysis Data. IEEE Transactions on Geoscience and Remote Sensing, 61, 5001915.

Abstract
Gapless land surface temperature (LST) data are essential for multidisciplinary research under global warming, yet the spatiotemporal discontinuities in LST products caused by cloud contamination still challenge such research. Data fusion (DF), which seeks the best compromise among multiple data sources, plays a key role in providing gapless LST data. However, most models use only information on the variation of LST over time or on the differences between LST products, without effectively combining the two pieces of information. Deep-learning methods, with their rapidly advancing modeling capabilities, can address this problem. This article proposes a novel multiinformation fusion network (MIFN) based on convolutional neural networks (CNNs) and attention mechanisms (AMs) to map gapless all-sky LST, taking both temporal-changing (TC) and data-differentiated (DD) information into consideration. Temporal normalization (TN) is used as a preprocessing step to match the observation times of moderate resolution imaging spectroradiometer (MODIS) and European Centre for Medium-Range Weather Forecasts (ECMWF) Reanalysis Fifth-Generation (ERA5) LSTs. The MIFN extracts multiscale, multitemporal TC and DD features through network constraints. The weights of the target LST features reconstructed from the TC and DD features are then assigned through an AM to fuse them reasonably. Finally, we design a loss function that combines TC, DD, and LST reconstruction terms to further improve the accuracy of LST prediction. We performed comprehensive data experiments to validate the performance of the new method. Our technique outperforms four state-of-the-art methods and two CNN models that each use a single piece of information in generating MODIS-like LSTs.
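The attention-based fusion of the two feature streams and the combined loss described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the branch feature maps, the 1x1 scoring weights, and the loss term weights are hypothetical stand-ins for the network's learned components.

```python
import numpy as np

def softmax(x, axis=0):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_fuse(feat_tc, feat_dd, score_w):
    """Fuse temporal-changing (TC) and data-differentiated (DD) feature
    maps with per-pixel attention weights.

    feat_tc, feat_dd: (C, H, W) feature maps from the two branches.
    score_w: (C,) weights of a hypothetical 1x1 scoring convolution.
    """
    s_tc = np.tensordot(score_w, feat_tc, axes=(0, 0))  # (H, W) score map
    s_dd = np.tensordot(score_w, feat_dd, axes=(0, 0))
    attn = softmax(np.stack([s_tc, s_dd]), axis=0)      # weights sum to 1 per pixel
    return attn[0] * feat_tc + attn[1] * feat_dd

def combined_loss(lst_pred, lst_true, tc_pred, tc_true, dd_pred, dd_true,
                  weights=(1.0, 0.5, 0.5)):
    """Loss combining LST reconstruction with TC and DD terms.
    The term weights here are illustrative, not taken from the paper."""
    mse = lambda a, b: float(np.mean((a - b) ** 2))
    return (weights[0] * mse(lst_pred, lst_true)
            + weights[1] * mse(tc_pred, tc_true)
            + weights[2] * mse(dd_pred, dd_true))

# Toy shapes: 8 channels on a 4x4 spatial grid
rng = np.random.default_rng(0)
f_tc = rng.normal(size=(8, 4, 4))
f_dd = rng.normal(size=(8, 4, 4))
fused = attention_fuse(f_tc, f_dd, rng.normal(size=8))
print(fused.shape)  # (8, 4, 4)
```

The softmax guarantees that the two branch weights sum to one at every pixel, so the fused map is a convex combination of the TC- and DD-reconstructed features, which matches the abstract's description of assigning weights through an attention mechanism.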

DOI:
10.1109/TGRS.2023.3269622

ISSN:
1558-0644