Publications

Liu, S; Wang, HD; Hu, Y; Zhang, MT; Zhu, YX; Wang, ZB; Li, DY; Yang, MY; Wang, F (2023). Land Use and Land Cover Mapping in China Using Multimodal Fine-Grained Dual Network. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 61, 4405219.

Abstract
With the advancement of geo-systems and the increased availability of satellite data, a plethora of land-use and land-cover (LULC) products have been developed. Most existing LULC products rely primarily on time-series imagery classified by pixel-based classifiers, which allows local analysis and accurate boundary detection. With the advent of deep learning, however, the field has shifted toward patch-based CNN models for generating land cover maps. In this article: 1) we create a training dataset for China using a voting strategy based on three off-the-shelf LULC products, avoiding labor-intensive manual annotation. 2) We design a novel CNN-based model for the LULC task, the multimodal fine-grained dual network (Dual-Net), which takes dual-date images to generate the final maps and reduces the need for gap-free temporal sequences or separate cloud detection. To leverage the correlation among location, date, and category, we embed modal information (dates and geo-locations) into the model. Furthermore, by incorporating low-level constraints and applying pseudolabel refinement, we further improve performance and achieve more refined segmentation. 3) Because no suitable validation dataset exists for China, we create a new validation dataset, the China Sentinel2 Validation Dataset (CSVD), by manually annotating 733 finely labeled 1024 × 1024 pixel Sentinel-2 images of China. 4) Extensive experiments demonstrate that our model outperforms existing LULC products and produces more fine-grained segmentation results, comparable to other patch-based products. Finally, we release annual LULC maps of China for 2020-2022 and make our model accessible online for real-time result export.
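
The label-generation step mentioned in the abstract (a voting strategy over three existing LULC products) can be illustrated with a minimal sketch. The snippet below shows one plausible pixel-wise majority vote, assuming the three products have already been reprojected onto a common grid and remapped to a shared class legend; the function name, array inputs, and the tie-handling rule (marking fully disagreeing pixels as ignored) are illustrative assumptions, not the authors' released code.

```python
import numpy as np

def majority_vote_labels(product_a, product_b, product_c, ignore_value=255):
    """Pixel-wise majority vote over three co-registered LULC label maps.

    Each input is a 2-D integer array sharing the same class legend.
    Pixels where all three products disagree are set to `ignore_value`
    so they can be excluded from training (one possible tie rule).
    """
    # Pairwise agreement masks.
    ab = product_a == product_b
    ac = product_a == product_c
    bc = product_b == product_c

    voted = np.full(product_a.shape, ignore_value, dtype=product_a.dtype)
    # A label is kept only where at least two products agree.
    voted[ab | ac] = product_a[ab | ac]                  # A agrees with B or C
    voted[bc & ~ab & ~ac] = product_b[bc & ~ab & ~ac]    # only B and C agree
    return voted
```

In practice such a vote would be run tile by tile over the harmonized products, and only the resulting confident pixels would be sampled as training labels for the patch-based network.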

DOI:
10.1109/TGRS.2023.3285912

ISSN:
1558-0644