Publications

Zhang, M; Zhang, HQ; Li, XY; Liu, Y; Cai, YT; Lin, H (2020). Classification of Paddy Rice Using a Stacked Generalization Approach and the Spectral Mixture Method Based on MODIS Time Series. IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 13, 2264-2275.

Abstract
Paddy rice is a major staple food, accounting for about 20% of the world's food supply, and rice paddies, an important type of artificial wetland, play an important role in the regional ecological environment. This study proposes a stacked generalization and spectral mixture approach to map paddy rice using coarse-spatial-resolution Moderate Resolution Imaging Spectroradiometer (MODIS) images. In this method, time-series MODIS enhanced vegetation index images, phenological variables, the land surface water index, elevation, and slope are combined to produce an optimal feature set, which is then used to map paddy rice with the stacking algorithm. A validation experiment using data from the Dongting Lake area showed that the proposed method improves on the overall accuracy of single classifiers, including the support vector machine, random forest, k-nearest neighbor (kNN), extreme gradient boosting (XGB), and decision tree. Stacking (XGB) achieves the highest overall accuracy (90.3%) and Kappa coefficient (0.86), which are 2.8% and 0.03 higher than those of the single kNN classifier. Furthermore, its user accuracies for distinguishing double-cropping rice and single-season rice are 92.5% and 90.0%, respectively. In terms of paddy rice classification accuracy, the stacking model is also superior to the single classifiers. Moreover, the MODIS-derived rice area obtained by the stacked generalization approach and the spectral mixture method agrees closely with government statistical data (coefficient of determination R² = 0.9975). The results demonstrate the potential of the proposed method for large-scale paddy rice mapping with coarse-spatial-resolution images.
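The stacked generalization scheme described in the abstract (several base classifiers whose predictions feed a meta-learner) can be sketched with scikit-learn's `StackingClassifier`. This is a minimal illustration, not the paper's implementation: the feature matrix is synthetic rather than MODIS EVI/LSWI/phenology/terrain features, and `GradientBoostingClassifier` stands in for XGBoost so the example needs only scikit-learn.

```python
# Minimal stacking sketch on synthetic data (assumption: not the paper's code).
# Base learners mirror those named in the abstract: SVM, random forest, kNN,
# a gradient-boosting stand-in for XGB, and a decision tree.
import numpy as np
from sklearn.ensemble import (GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))           # stand-in for per-pixel feature vectors
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # stand-in for rice / non-rice labels

base_learners = [
    ("svm", SVC(probability=True)),
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("knn", KNeighborsClassifier()),
    ("gb", GradientBoostingClassifier(random_state=0)),  # XGB stand-in
    ("dt", DecisionTreeClassifier(random_state=0)),
]

# The meta-learner combines cross-validated base predictions (cv=5),
# which is the core idea of stacked generalization.
stack = StackingClassifier(estimators=base_learners,
                           final_estimator=LogisticRegression(),
                           cv=5)
stack.fit(X, y)
print(f"training accuracy: {stack.score(X, y):.2f}")
```

The meta-learner is trained on out-of-fold predictions of the base models, which is what lets stacking exceed the accuracy of any single classifier, as the abstract reports for the Dongting Lake experiment.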

DOI:
10.1109/JSTARS.2020.2994335

ISSN:
1939-1404