Article

A Dual-Branch U-Net for Staple Crop Classification in Complex Scenes

by Jiajin Zhang, Lifang Zhao and Hua Yang

1 State Forestry and Grassland Administration Key Laboratory of Forest Resources & Environmental Management, Beijing Forestry University, Beijing 100083, China
2 Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing 100094, China
* Author to whom correspondence should be addressed.
Remote Sens. 2025, 17(4), 726; https://doi.org/10.3390/rs17040726
Submission received: 26 December 2024 / Revised: 10 February 2025 / Accepted: 17 February 2025 / Published: 19 February 2025

Abstract

Accurate information on crop planting areas and their spatial distribution is critical for understanding and tracking long-term land use change. Deep learning (DL) methods for extracting crop information have been applied successfully to large-scale datasets and plain areas. In complex scenes, however, crop classification still faces challenges such as poor temporal continuity of imagery, difficult data acquisition, rugged terrain, fragmented plots, and diverse planting conditions. In this study, we propose the Complex Scene Crop Classification U-Net (CSCCU), which aims to improve the mapping accuracy of staple crops in complex scenes by combining multi-spectral bands with spectral features. CSCCU has a dual-branch structure: the main branch concentrates on image feature extraction, while the auxiliary branch focuses on spectral features. The two branches are combined through a hierarchical feature-level fusion mechanism: a shallow feature fusion (SFF) module and a deep feature fusion (DFF) module optimize feature learning and improve model performance. We conducted experiments using GaoFen-2 (GF-2) imagery of Xiuwen County, Guizhou Province, China, and established a dataset of 1000 image patches of 256 × 256 pixels covering seven categories. Our method achieves accuracies of 89.72% for corn and 88.61% for rice and a mean intersection over union (mIoU) of 85.61%, exceeding the compared models (U-Net, SegNet, and DeepLabv3+). Our method provides a novel solution for classifying staple crops in complex scenes from high-resolution images and can help obtain accurate information on staple crops over larger regions in the future.
Keywords: crop classification; complex scene; deep learning; high-resolution image
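The abstract does not give implementation details, so the following is a minimal PyTorch sketch of the dual-branch idea only: a U-Net-style image branch and an auxiliary spectral branch fused at a shallow and a deep scale. The names (DualBranchUNet, FusionModule), the layer widths, the fusion design standing in for SFF/DFF, the four image bands, and the three spectral-feature channels are all illustrative assumptions, not the authors' implementation.

# Minimal sketch of a dual-branch U-Net with shallow/deep feature fusion.
# All layer widths and module designs are assumptions for illustration.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # two 3x3 convolutions, as in a standard U-Net encoder stage
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
    )

class FusionModule(nn.Module):
    """Fuses image-branch and spectral-branch features at one scale
    (a simple stand-in for the paper's SFF/DFF modules)."""
    def __init__(self, ch):
        super().__init__()
        self.fuse = nn.Conv2d(2 * ch, ch, 1)  # 1x1 conv after channel concat
    def forward(self, img_feat, spec_feat):
        return self.fuse(torch.cat([img_feat, spec_feat], dim=1))

class DualBranchUNet(nn.Module):
    def __init__(self, img_bands=4, spec_feats=3, n_classes=7):
        super().__init__()
        # main branch: multi-spectral image patches (e.g., GF-2 bands)
        self.img_enc1 = conv_block(img_bands, 64)
        self.img_enc2 = conv_block(64, 128)
        # auxiliary branch: per-pixel spectral features (e.g., vegetation indices)
        self.spec_enc1 = conv_block(spec_feats, 64)
        self.spec_enc2 = conv_block(64, 128)
        self.pool = nn.MaxPool2d(2)
        self.sff = FusionModule(64)    # shallow feature fusion
        self.dff = FusionModule(128)   # deep feature fusion
        # decoder: upsample deep features, combine with the fused shallow skip
        self.up = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec = conv_block(128, 64)
        self.head = nn.Conv2d(64, n_classes, 1)

    def forward(self, img, spec):
        i1, s1 = self.img_enc1(img), self.spec_enc1(spec)
        f1 = self.sff(i1, s1)                    # shallow fusion (full resolution)
        i2 = self.img_enc2(self.pool(i1))
        s2 = self.spec_enc2(self.pool(s1))
        f2 = self.dff(i2, s2)                    # deep fusion (half resolution)
        d = self.up(f2)
        d = self.dec(torch.cat([d, f1], dim=1))  # skip connection from SFF output
        return self.head(d)                      # per-pixel class logits

# usage on one 256x256 patch with seven output classes, as in the dataset
model = DualBranchUNet()
logits = model(torch.randn(1, 4, 256, 256), torch.randn(1, 3, 256, 256))
print(logits.shape)  # torch.Size([1, 7, 256, 256])

The sketch keeps only two encoder scales so the two fusion points are easy to see; a full model would repeat the encode-fuse pattern at more depths before decoding.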

Share and Cite

MDPI and ACS Style

Zhang, J.; Zhao, L.; Yang, H. A Dual-Branch U-Net for Staple Crop Classification in Complex Scenes. Remote Sens. 2025, 17, 726. https://doi.org/10.3390/rs17040726

AMA Style

Zhang J, Zhao L, Yang H. A Dual-Branch U-Net for Staple Crop Classification in Complex Scenes. Remote Sensing. 2025; 17(4):726. https://doi.org/10.3390/rs17040726

Chicago/Turabian Style

Zhang, Jiajin, Lifang Zhao, and Hua Yang. 2025. "A Dual-Branch U-Net for Staple Crop Classification in Complex Scenes" Remote Sensing 17, no. 4: 726. https://doi.org/10.3390/rs17040726

APA Style

Zhang, J., Zhao, L., & Yang, H. (2025). A Dual-Branch U-Net for Staple Crop Classification in Complex Scenes. Remote Sensing, 17(4), 726. https://doi.org/10.3390/rs17040726

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
