Publications

Yu, WT; Li, J; Liu, QH; Zhao, J; Dong, YD; Wang, C; Lin, SR; Zhu, XR; Zhang, H (2022). Spatial-Temporal Prediction of Vegetation Index With Deep Recurrent Neural Networks. IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 19, 2501105.

Abstract
Vegetation index (VI) derived from remotely sensed images is a proxy of terrestrial vegetation information and is widely used in land monitoring and global change studies. Recently, the prediction of vegetation properties has attracted interest in related communities. With the accumulation of satellite records over the past few decades, the spatial-temporal prediction of VI has become feasible. In this letter, we developed deep recurrent neural networks (RNNs) with long short-term memory (LSTM) and gated recurrent units (GRUs) to predict short-term VI based on historical observations. The pixel-based fully connected networks GRU and LSTM (FCGRU and FCLSTM) and patch-based convolutional networks (ConvGRU and ConvLSTM) were established and compared with the traditional multilayer perceptron (MLP) model. Moderate Resolution Imaging Spectroradiometer (MODIS) and Sentinel-2 normalized difference VI (NDVI) data sets were used in the experiments. The prediction performance was evaluated globally across different regions, vegetation types, and growing seasons. Results demonstrate that the RNN models can predict VI with high accuracy (average root mean square error (RMSE) around 0.03), outperforming the MLP model. In general, the pixel-based RNN models performed better than the patch-based models, especially in regions with a larger proportion of outliers. The prediction accuracy was also stable over different vegetation types and growing seasons.
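To illustrate the pixel-based recurrent approach described above, the following is a minimal sketch (not the authors' code) of a single GRU cell unrolled over a per-pixel NDVI history, with a linear readout predicting the next value. All weights, sizes, and the example NDVI sequence are hypothetical; in the paper the networks are trained on MODIS and Sentinel-2 time series.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Standard GRU cell: update gate z, reset gate r, candidate state n."""
    def __init__(self, input_size, hidden_size):
        s = 1.0 / np.sqrt(hidden_size)
        self.Wz = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uz = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.Wr = rng.uniform(-s, s, (hidden_size, input_size))
        self.Ur = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.Wn = rng.uniform(-s, s, (hidden_size, input_size))
        self.Un = rng.uniform(-s, s, (hidden_size, hidden_size))

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)        # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)        # reset gate
        n = np.tanh(self.Wn @ x + self.Un @ (r * h))  # candidate state
        return (1 - z) * n + z * h

hidden = 8
cell = GRUCell(input_size=1, hidden_size=hidden)
head = rng.uniform(-0.5, 0.5, (1, hidden))  # linear readout to NDVI

# Hypothetical per-pixel NDVI history (e.g., 16-day composites over a season)
ndvi_history = [0.21, 0.25, 0.34, 0.48, 0.61, 0.68, 0.66, 0.58]

h = np.zeros(hidden)
for v in ndvi_history:
    h = cell.step(np.array([v]), h)

# Untrained weights: the point is the data flow, not the predicted value.
ndvi_next = sigmoid(head @ h)[0]  # squash into a (0, 1) NDVI-like range
print(round(float(ndvi_next), 3))
```

The patch-based variants (ConvGRU/ConvLSTM) replace the matrix-vector products inside the gates with convolutions over image patches, so each hidden state is a feature map rather than a vector.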

DOI:
10.1109/LGRS.2021.3064814

ISSN:
1558-0571