Publications

Shangguan, YL; Min, XX; Shi, Z (2023). Gap Filling of the ESA CCI Soil Moisture Data Using a Spatiotemporal Attention-Based Residual Deep Network. IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 16, 5344-5354.

Abstract
As an essential climate variable, soil moisture (SM) exerts an indispensable influence on numerous disciplines. However, various degrees of data gaps exist in current microwave SM products. Therefore, this article proposed a spatiotemporal attention-based residual deep network (STARN) to reconstruct gaps in the daily SM data from the Climate Change Initiative program of the European Space Agency (ESA CCI) over the Qinghai-Tibet Plateau (QTP) during unfrozen seasons (May to September) from 2001 to 2021. The developed model is an end-to-end residual network embedded with three attention modules to comprehensively consider the potential relationship between SM and surface variables. Evaluation results revealed that the proposed model reconstructed SM gaps well, with overall median R and unbiased RMSE (ubRMSE) values of 0.52 and 0.054 m³/m³, while the overall median R and ubRMSE values for the ESA CCI SM were 0.41 and 0.058 m³/m³. In addition, comparison with five baseline methods (i.e., the artificial neural network, convolutional neural network, extreme gradient boosting, long short-term memory, and DCT-PLS models) indicated that the STARN model had certain advantages over the five baseline models, with higher correlation and more reasonable distribution patterns. The R/ubRMSE values for the five models were 0.38/0.057, 0.34/0.058, 0.40/0.058, 0.41/0.056, and 0.41/0.058, respectively. Pretraining with the ERA5-Land SM data further improved the accuracy of the generated seamless SM data, since the ERA5-Land and ESA CCI SM complemented each other to a certain extent on the QTP. In summary, by leveraging spatiotemporal information and attention modules, the STARN model showed great potential for SM gap filling.
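The abstract does not specify STARN's internal layout, so the following is only a minimal, hypothetical sketch of the general idea it names: an attention module that reweights feature channels, wrapped in a residual skip connection. The weights here are random placeholders, not trained parameters, and the module structure is assumed, not taken from the paper.

```python
import numpy as np

def channel_attention(feat, reduction=2, seed=0):
    """Squeeze-and-excitation-style channel attention on a (C, H, W) feature map.

    This is an illustrative stand-in for an attention module; the MLP weights
    are random (untrained) and exist only to make the sketch runnable.
    """
    c, _, _ = feat.shape
    desc = feat.mean(axis=(1, 2))                       # squeeze: per-channel global average
    rng = np.random.default_rng(seed)
    w1 = rng.standard_normal((c // reduction, c)) * 0.1  # placeholder excitation weights
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    hidden = np.maximum(w1 @ desc, 0.0)                  # ReLU
    gate = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))          # sigmoid gate in (0, 1)
    return feat * gate[:, None, None]                    # reweight each channel

def attention_residual_block(feat):
    """Residual block: output = input + attention-reweighted features."""
    return feat + channel_attention(feat)

out = attention_residual_block(np.ones((4, 8, 8)))
```

The residual skip lets the block pass its input through unchanged when the attention branch contributes little, which is what makes deep stacks of such blocks trainable.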

DOI:
10.1109/JSTARS.2023.3284841

ISSN:
2151-1535