Publications

Zhang, J.J.; Zhao, L.F.; Yang, H. (2025). A Dual-Branch U-Net for Staple Crop Classification in Complex Scenes. Remote Sensing, 17(4), 726.

Abstract
Accurate information on crop planting and spatial distribution is critical for understanding and tracking long-term land use change. Deep learning (DL) methods for extracting crop information have been applied successfully to large-scale datasets and plain areas. However, current crop classification methods face challenges in complex scenes, such as poor temporal continuity of imagery, difficult data acquisition, rugged terrain, fragmented plots, and diverse planting conditions. In this study, we propose the Complex Scene Crop Classification U-Net (CSCCU), which aims to improve the mapping accuracy of staple crops in complex scenes by combining multi-spectral bands with spectral features. CSCCU has a dual-branch structure: the main branch concentrates on image feature extraction, while the auxiliary branch focuses on spectral features. Our method uses a hierarchical feature-level fusion mechanism: through the shallow feature fusion (SFF) and deep feature fusion (DFF) modules, feature learning is optimized and model performance is improved. We conducted experiments on GaoFen-2 (GF-2) images of Xiuwen County, Guizhou Province, China, and built a dataset of 1000 image patches of 256 × 256 pixels covering seven categories. Our method achieves corn and rice accuracies of 89.72% and 88.61%, respectively, and a mean intersection over union (mIoU) of 85.61%, higher than the compared models (U-Net, SegNet, and DeepLabv3+). Our method offers a novel solution for classifying staple crops in complex scenes from high-resolution images and can help obtain accurate staple crop information over larger regions in the future.

DOI:
10.3390/rs17040726

ISSN:
2072-4292
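
Illustrative sketch
The abstract describes a dual-branch architecture with shallow and deep feature fusion but does not include implementation details here. The PyTorch sketch below is only a minimal illustration of that idea under stated assumptions: the layer widths, the auxiliary spectral-feature branch (e.g. vegetation-index maps), and the 1x1-convolution fusion blocks standing in for SFF/DFF are hypothetical choices, not the authors' published implementation.

```python
# Minimal sketch of a dual-branch U-Net-style network in the spirit of CSCCU.
# All widths, the spectral branch, and the fusion blocks are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    """Two 3x3 conv + BN + ReLU layers, as in a standard U-Net stage."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )


class FeatureFusion(nn.Module):
    """Placeholder for the SFF/DFF idea: resize the spectral features to the image
    feature resolution, concatenate, and mix with a 1x1 convolution."""
    def __init__(self, img_ch, spec_ch):
        super().__init__()
        self.mix = nn.Sequential(
            nn.Conv2d(img_ch + spec_ch, img_ch, 1), nn.BatchNorm2d(img_ch), nn.ReLU(inplace=True)
        )

    def forward(self, img_feat, spec_feat):
        spec_feat = F.interpolate(spec_feat, size=img_feat.shape[-2:],
                                  mode="bilinear", align_corners=False)
        return self.mix(torch.cat([img_feat, spec_feat], dim=1))


class DualBranchUNet(nn.Module):
    """Main branch: U-Net encoder/decoder on the multi-spectral image patch.
    Auxiliary branch: lightweight convolutions on spectral-feature maps,
    fused at one shallow and one deep encoder stage (SFF/DFF analogues)."""
    def __init__(self, in_ch=4, spec_ch=3, n_classes=7):
        super().__init__()
        self.enc1, self.enc2, self.enc3 = conv_block(in_ch, 64), conv_block(64, 128), conv_block(128, 256)
        self.pool = nn.MaxPool2d(2)
        self.spec1, self.spec2 = conv_block(spec_ch, 32), conv_block(32, 64)
        self.sff = FeatureFusion(64, 32)    # shallow fusion
        self.dff = FeatureFusion(256, 64)   # deep fusion
        self.up2, self.dec2 = nn.ConvTranspose2d(256, 128, 2, stride=2), conv_block(256, 128)
        self.up1, self.dec1 = nn.ConvTranspose2d(128, 64, 2, stride=2), conv_block(128, 64)
        self.head = nn.Conv2d(64, n_classes, 1)

    def forward(self, img, spec):
        s1 = self.spec1(spec)
        s2 = self.spec2(self.pool(s1))
        e1 = self.sff(self.enc1(img), s1)
        e2 = self.enc2(self.pool(e1))
        e3 = self.dff(self.enc3(self.pool(e2)), s2)
        d2 = self.dec2(torch.cat([self.up2(e3), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)


if __name__ == "__main__":
    model = DualBranchUNet()
    img = torch.randn(1, 4, 256, 256)   # GF-2-like 4-band patch, 256 x 256 pixels
    spec = torch.randn(1, 3, 256, 256)  # hypothetical spectral-feature maps
    print(model(img, spec).shape)       # torch.Size([1, 7, 256, 256])
```

The sketch keeps the two branches separate through the encoder and injects the spectral features at a shallow and a deep stage, mirroring the hierarchical feature-level fusion the abstract attributes to SFF and DFF; the seven output channels correspond to the seven dataset categories mentioned above.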