Liao, CH; Wang, JF; Pritchard, I; Liu, JG; Shang, JL (2017). A Spatio-Temporal Data Fusion Model for Generating NDVI Time Series in Heterogeneous Regions. REMOTE SENSING, 9(11), 1125.
Abstract
Time-series vegetation indices with high spatial resolution and high temporal frequency are important for crop growth monitoring and management, but technical constraints and cloud contamination make such datasets difficult to obtain. Most existing fusion methods either have limited accuracy when predicting NDVI in heterogeneous regions or depend on computationally intensive steps and land cover maps. In this study, a spatio-temporal vegetation index image fusion model (STVIFM) was developed to generate high-spatial-resolution Normalized Difference Vegetation Index (NDVI) time-series images with higher accuracy. STVIFM predicts the fine-resolution NDVI by estimating the contribution of each fine-resolution pixel to the total NDVI change, which is calculated from coarse-resolution images acquired on two dates. The model accounts for both the different relationships between fine- and coarse-resolution images on different dates and the different NDVI change rates at different growing stages, and it requires neither a search for similar pixels nor a land cover map. Landsat-8 and MODIS data acquired over three test sites with different landscapes were used to evaluate the spatial and temporal performance of the proposed model. Compared with the spatial and temporal adaptive reflectance fusion model (STARFM), the enhanced STARFM (ESTARFM), and the flexible spatiotemporal data fusion (FSDAF) method, STVIFM outperforms STARFM and ESTARFM at all three study sites and at different growth stages when the land cover or NDVI changes are captured by the two pairs of fine- and coarse-resolution images, and it is more robust and less computationally intensive than FSDAF.
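The core idea described above, predicting a fine-resolution NDVI image from the NDVI change observed between two coarse-resolution dates, can be illustrated with a minimal sketch. This is not the authors' STVIFM: it computes NDVI from near-infrared and red reflectance and then distributes the coarse change uniformly over each coarse pixel's block of fine pixels, whereas STVIFM estimates each fine pixel's individual contribution to the total change. All function names and the uniform-split assumption are illustrative.

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), the standard definition."""
    return (nir - red) / (nir + red)

def fuse_fine_ndvi(fine_t1, coarse_t1, coarse_t2, scale):
    """Predict fine-resolution NDVI at t2 from the coarse NDVI change.

    Each coarse pixel covers a `scale` x `scale` block of fine pixels.
    Here the coarse change (coarse_t2 - coarse_t1) is spread uniformly
    over the block -- a simplifying assumption; STVIFM instead weights
    each fine pixel by its contribution to the total NDVI change.
    """
    delta = coarse_t2 - coarse_t1                         # coarse NDVI change
    delta_fine = np.kron(delta, np.ones((scale, scale)))  # upsample to fine grid
    return fine_t1 + delta_fine

# Example: a 4x4 fine image under a 2x2 coarse grid (scale factor 2).
fine_t1 = np.full((4, 4), 0.30)
coarse_t1 = np.full((2, 2), 0.30)
coarse_t2 = np.full((2, 2), 0.40)   # NDVI rose by 0.10 everywhere
fine_t2 = fuse_fine_ndvi(fine_t1, coarse_t1, coarse_t2, scale=2)
```

With a spatially uniform change, the uniform split is exact and every fine pixel ends at 0.40; in heterogeneous regions the split must vary per pixel, which is the problem STVIFM addresses.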
DOI: 10.3390/rs9111125
ISSN: 2072-4292