Publications

Guo, DZ; Shi, WZ; Zhang, H; Hao, M (2022). A Flexible Object-Level Processing Strategy to Enhance the Weight Function-Based Spatiotemporal Fusion Method. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 60, 4414811.

Abstract
Spatiotemporal fusion provides a cost-efficient way to obtain dense time-series observations. Among the categories of spatiotemporal fusion methods, weight function-based methods have attracted considerable attention. However, these methods select similar pixels within a regular window without considering the distribution of land-surface features, which weakens their ability to preserve structural information. Moreover, weight function-based methods perform pixel-by-pixel fusion, which is computationally inefficient. To address these issues, a flexible object-level (OL) processing strategy is proposed in this article. Three popular spatiotemporal fusion methods, the spatial and temporal adaptive reflectance fusion model (STARFM), the enhanced STARFM (ESTARFM), and the three-step method (Fit-FC), were selected as examples to analyze and validate the effectiveness of the OL processing strategy. Four study sites with different surface landscapes and change patterns were adopted for the experiments. The results indicate that the OL versions of STARFM, ESTARFM, and Fit-FC better preserve structural information and were 102.89-113.71, 92.77-115.73, and 30.51-36.15 times faster than the original methods, respectively. Remarkably, the OL version of Fit-FC outperformed all competing methods in the one-pair fusion experiments, especially in the Poyang Lake wetland (PY) area (root mean square error (RMSE) of 0.0343 versus 0.0380 and correlation coefficient (r) of 0.7469 versus 0.6986, compared with Fit-FC). The OL processing strategy can also be adopted to enhance other methods that combine information from similar adjacent pixels. The program and test data are available at https://github.com/Andy-cumt.
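The speed-up described in the abstract comes from replacing a per-pixel similar-pixel search inside a moving window with a single computation per segmented object. The sketch below illustrates that idea in minimal form; it is not the authors' implementation (see their GitHub repository for that), and the function name, the use of a precomputed label map, and the per-object mean-change model are all simplifying assumptions for illustration.

```python
import numpy as np

def object_level_fusion(fine_t1, coarse_t1, coarse_t2, labels):
    """Illustrative object-level (OL) fusion sketch (not the paper's method).

    Predicts a fine-resolution image at time t2 from the fine image at t1
    plus the mean temporal change observed in the coarse images, applied
    once per object rather than once per pixel.

    fine_t1, coarse_t1, coarse_t2 : 2-D float arrays, co-registered and
        resampled to the same grid.
    labels : 2-D int array of object IDs from any prior segmentation;
        pixels sharing an ID are assumed spectrally similar.
    """
    prediction = fine_t1.astype(float).copy()
    change = coarse_t2.astype(float) - coarse_t1.astype(float)
    for obj_id in np.unique(labels):
        mask = labels == obj_id
        # One mean change per object replaces a weighted sum per pixel,
        # which is where the OL strategy gains its efficiency.
        prediction[mask] += change[mask].mean()
    return prediction
```

Because each object is visited once, the cost scales with the number of objects rather than with (number of pixels × window size), consistent with the order-of-magnitude speed-ups reported in the abstract.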

DOI:
10.1109/TGRS.2022.3212474

ISSN:
1558-0644