Soybean Lodging Classification and Yield Prediction Using Multimodal UAV Data Fusion and Deep Learning
Abstract
1. Introduction
2. Materials and Methods
2.1. Experimental Area Profile
2.2. Data Collection
2.2.1. UAV Image Acquisition and Preprocessing
2.2.2. Ground Data Collection
2.3. Feature Extraction
2.3.1. Construction of Vegetation Indices
2.3.2. First-Order Differential
2.4. Soybean Lodging Grade Classification Based on RGB and Hyperspectral Images with SMOTE-ResNet
2.4.1. Representation Learning and Feature Fusion Based on ResNet18
2.4.2. Category Balancing of Feature Vectors Based on the SMOTE Strategy
2.4.3. Lodging Classification and Performance Evaluation
2.5. The Construction and Validation of the Yield Estimation Model
3. Results
3.1. Spectral Changes in Soybean Canopy Under Lodging Stress
3.2. Lodging Classification
3.3. Sample Balancing Strategy Based on the SMOTE Module
3.4. Yield Estimation Optimization Based on Lodging Information
4. Discussion
4.1. The Impact of Hyperspectral Wavelengths on Lodging Classification
4.2. Comparison of Different Modeling Methods
4.3. Performance of the SMOTE Module
4.4. The Impact of Lodging on Yield Prediction
5. Conclusions
- (1) At all growth stages examined in this study, the feature vectors encoded by ResNet18 consistently outperformed manually extracted image features in lodging classification, underscoring the effectiveness of deep-learning-based automated feature extraction for accurate soybean lodging monitoring.
- (2) With imbalanced lodging class samples, the minority class initially showed low classification accuracy. Incorporating the SMOTE module into the framework markedly improved minority-class accuracy, producing more balanced classification outcomes across lodging levels and strengthening the model's ability to identify lodged samples.
- (3) Incorporating ground-truth lodging grades into the multimodal intermediate-level fusion strategy improved yield estimation accuracy (R2) from 0.62 to 0.65. When lodging grades predicted by SMOTE-ResNet classification were used instead, accuracy reached 0.63, comparable to the result obtained with ground-truth grades. Including lodging information thus improves yield estimation accuracy and offers useful support for yield management in precision agriculture.
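Conclusion (2) rests on SMOTE-style oversampling of the ResNet18 feature vectors. As an illustration only (a minimal pure-NumPy sketch of the core SMOTE interpolation idea; function and parameter names are my own, not the paper's implementation):

```python
import numpy as np

def smote_oversample(X_min, n_new, k=5, rng=None):
    """Generate n_new synthetic minority-class samples by interpolating
    each seed sample toward one of its k nearest minority neighbors
    (the core idea of SMOTE)."""
    rng = np.random.default_rng(rng)
    n = len(X_min)
    k = min(k, n - 1)
    # pairwise Euclidean distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # a sample is not its own neighbor
    nbrs = np.argsort(d, axis=1)[:, :k]          # k nearest neighbors per sample
    seeds = rng.integers(0, n, size=n_new)       # minority sample to start from
    picks = nbrs[seeds, rng.integers(0, k, size=n_new)]
    gap = rng.random((n_new, 1))                 # interpolation factor in (0, 1)
    return X_min[seeds] + gap * (X_min[picks] - X_min[seeds])

# toy minority-class feature vectors (stand-ins for ResNet18 embeddings)
X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
X_syn = smote_oversample(X_min, n_new=4, rng=0)
```

Each synthetic point lies on a segment between a minority sample and one of its neighbors, so the oversampled class stays inside its original feature-space region.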
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Pagano, M.C.; Miransari, M. The importance of soybean production worldwide. In Abiotic and Biotic Stresses in Soybean Production; Elsevier: Amsterdam, The Netherlands, 2016; pp. 1–26. [Google Scholar]
- Liu, Z.; Ying, H.; Chen, M.; Bai, J.; Xue, Y.; Yin, Y.; Batchelor, W.D.; Yang, Y.; Bai, Z.; Du, M. Optimization of China’s maize and soy production can ensure feed sufficiency at lower nitrogen and carbon footprints. Nat. Food 2021, 2, 426–433. [Google Scholar] [CrossRef] [PubMed]
- Xiao, S.; Ye, Y.; Fei, S.; Chen, H.; Cai, Z.; Che, Y.; Wang, Q.; Ghafoor, A.; Bi, K.; Shao, K. High-throughput calculation of organ-scale traits with reconstructed accurate 3D canopy structures using a UAV RGB camera with an advanced cross-circling oblique route. ISPRS J. Photogramm. Remote Sens. 2023, 201, 104–122. [Google Scholar] [CrossRef]
- Sun, G.; Zhang, Y.; Chen, H.; Wang, L.; Li, M.; Sun, X.; Fei, S.; Xiao, S.; Yan, L.; Li, Y. Improving soybean yield prediction by integrating UAV nadir and cross-circling oblique imaging. Eur. J. Agron. 2024, 155, 127134. [Google Scholar] [CrossRef]
- Han, L.; Yang, G.; Feng, H.; Zhou, C.; Yang, H.; Xu, B.; Li, Z.; Yang, X. Quantitative Identification of Maize Lodging-Causing Feature Factors Using Unmanned Aerial Vehicle Images and a Nomogram Computation. Remote Sens. 2018, 10, 1528. [Google Scholar] [CrossRef]
- Wang, J.-J.; Ge, H.; Dai, Q.; Ahmad, I.; Dai, Q.; Zhou, G.; Qin, M.; Gu, C. Unsupervised discrimination between lodged and non-lodged winter wheat: A case study using a low-cost unmanned aerial vehicle. Int. J. Remote Sens. 2018, 39, 2079–2088. [Google Scholar] [CrossRef]
- Dai, J.; Zhang, G.; Guo, P.; Zeng, T.; Cui, M.; Xue, J. Information extraction of cotton lodging based on multi-spectral image from UAV remote sensing. Trans. CSAE 2019, 35, 63–70. [Google Scholar]
- Chu, T.; Starek, M.J.; Brewer, M.J.; Murray, S.C.; Pruter, L.S. Assessing lodging severity over an experimental maize (Zea mays L.) field using UAS images. Remote Sens. 2017, 9, 923. [Google Scholar]
- Yang, M.-D.; Huang, K.-S.; Kuo, Y.-H.; Tsai, H.P.; Lin, L.-M. Spatial and spectral hybrid image classification for rice lodging assessment through UAV imagery. Remote Sens. 2017, 9, 583. [Google Scholar] [CrossRef]
- Rajapaksa, S.; Eramian, M.; Duddu, H.; Wang, M.; Shirtliffe, S.; Ryu, S.; Josuttes, A.; Zhang, T.; Vail, S.; Pozniak, C. Classification of crop lodging with gray level co-occurrence matrix. In Proceedings of the 2018 IEEE Winter Conference on Applications of Computer Vision (WACV), Lake Tahoe, NV, USA, 12–15 March 2018. [Google Scholar]
- Han, L.; Yang, G.; Yang, X.; Song, X.; Xu, B.; Li, Z.; Wu, J.; Yang, H.; Wu, J. An explainable XGBoost model improved by SMOTE-ENN technique for maize lodging detection based on multi-source unmanned aerial vehicle images. Comput. Electron. Agric. 2022, 194, 106804. [Google Scholar] [CrossRef]
- Sarkar, S.; Zhou, J.; Scaboo, A.; Zhou, J.; Aloysius, N.; Lim, T.T. Assessment of Soybean Lodging Using UAV Imagery and Machine Learning. Plants 2023, 12, 2893. [Google Scholar] [CrossRef]
- Chemchem, A.; Alin, F.; Krajecki, M. Combining SMOTE sampling and machine learning for forecasting wheat yields in France. In Proceedings of the 2019 IEEE Second International Conference on Artificial Intelligence and Knowledge Engineering (AIKE), Sardinia, Italy, 3–5 June 2019. [Google Scholar]
- Zhu, W.; Li, S.; Zhang, X.; Li, Y.; Sun, Z. Estimation of winter wheat yield using optimal vegetation indices from unmanned aerial vehicle remote sensing. Trans. Chin. Soc. Agric. Eng. 2018, 34, 78–86. [Google Scholar]
- Gong, Y.; Duan, B.; Fang, S.; Zhu, R.; Wu, X.; Ma, Y.; Peng, Y. Remote estimation of rapeseed yield with unmanned aerial vehicle (UAV) imaging and spectral mixture analysis. Plant Methods 2018, 14, 1–14. [Google Scholar] [CrossRef]
- Fei, S.; Hassan, M.A.; Xiao, Y.; Su, X.; Chen, Z.; Cheng, Q.; Duan, F.; Chen, R.; Ma, Y. UAV-based multi-sensor data fusion and machine learning algorithm for yield prediction in wheat. Precis. Agric. 2023, 24, 187–212. [Google Scholar] [CrossRef] [PubMed]
- Yang, Q.; Shi, L.; Han, J.; Zha, Y.; Zhu, P. Deep convolutional neural networks for rice grain yield estimation at the ripening stage using UAV-based remotely sensed images. Field Crops Res. 2019, 235, 142–153. [Google Scholar] [CrossRef]
- Zhong, L.; Hu, L.; Zhou, H.; Tao, X. Deep learning based winter wheat mapping using statistical data as ground references in Kansas and northern Texas, US. Remote Sens. Environ. 2019, 233, 111411. [Google Scholar] [CrossRef]
- Hao, X.; Jia, J.; Khattak, A.M.; Zhang, L.; Guo, X.; Gao, W.; Wang, M. Growing period classification of Gynura bicolor DC using GL-CNN. Comput. Electron. Agric. 2020, 174, 105497. [Google Scholar] [CrossRef]
- Karmakar, P.; Teng, S.W.; Murshed, M.; Pang, S.; Li, Y.; Lin, H. Crop monitoring by multimodal remote sensing: A review. Remote Sens. Appl. Soc. Environ. 2024, 33, 101093. [Google Scholar] [CrossRef]
- Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599. [Google Scholar] [CrossRef]
- Zhang, Y.; Yang, Y.; Zhang, Q.; Duan, R.; Liu, J.; Qin, Y.; Wang, X. Toward multi-stage phenotyping of soybean with multimodal UAV sensor data: A comparison of machine learning approaches for leaf area index estimation. Remote Sens. 2022, 15, 7. [Google Scholar] [CrossRef]
- Chauhan, S.; Darvishzadeh, R.; Boschetti, M.; Pepe, M.; Nelson, A. Remote sensing-based crop lodging assessment: Current status and perspectives. ISPRS J. Photogramm. Remote Sens. 2019, 151, 124–140. [Google Scholar] [CrossRef]
- Fischer, R.; Stapper, M. Lodging effects on high-yielding crops of irrigated semidwarf wheat. Field Crops Res. 1987, 17, 245–258. [Google Scholar] [CrossRef]
- Kendall, S.; Holmes, H.; White, C.; Clarke, S.; Berry, P. Quantifying lodging-induced yield losses in oilseed rape. Field Crops Res. 2017, 211, 106–113. [Google Scholar] [CrossRef]
- Berry, P.M.; Spink, J. Predicting yield losses caused by lodging in wheat. Field Crops Res. 2012, 137, 19–26. [Google Scholar] [CrossRef]
- Zhang, N.; Zhang, X.; Yang, G.; Zhu, C.; Huo, L.; Feng, H. Assessment of defoliation during the Dendrolimus tabulaeformis Tsai et Liu disaster outbreak using UAV-based hyperspectral images. Remote Sens. Environ. 2018, 217, 323–339. [Google Scholar] [CrossRef]
- Shu, M.; Zhou, L.; Gu, X.; Ma, Y.; Sun, Q.; Yang, G.; Zhou, C. Monitoring of maize lodging using multi-temporal Sentinel-1 SAR data. Adv. Space Res. 2020, 65, 470–480. [Google Scholar] [CrossRef]
- Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
- Nguyen, C.; Sagan, V.; Skobalski, J.; Severo, J.I. Early detection of wheat yellow rust disease and its impact on terminal yield with multi-spectral UAV-imagery. Remote Sens. 2023, 15, 3301. [Google Scholar] [CrossRef]
- Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293. [Google Scholar] [CrossRef]
- Woebbecke, D.M.; Meyer, G.E.; Von Bargen, K.; Mortensen, D.A. Color indexes for weed identification under various soil, residue, and lighting conditions. Trans. Asae 1995, 38, 259–269. [Google Scholar] [CrossRef]
- Steward, B.L.; Tian, L.F. Real-time machine vision weed-sensing. In Proceedings of the ASAE Annual International Meeting, Orlando, FL, USA, 12–16 July 1998; p. 11. [Google Scholar]
- Rasmussen, J.; Ntakos, G.; Nielsen, J.; Svensgaard, J.; Poulsen, R.N.; Christensen, S. Are vegetation indices derived from consumer-grade cameras mounted on UAVs sufficiently reliable for assessing experimental plots? Eur. J. Agron. 2016, 74, 75–92. [Google Scholar] [CrossRef]
- Verrelst, J.; Schaepman, M.E.; Koetz, B.; Kneubühler, M. Angular sensitivity analysis of vegetation indices derived from CHRIS/PROBA data. Remote Sens. Environ. 2008, 112, 2341–2353. [Google Scholar] [CrossRef]
- Metternicht, G. Vegetation indices derived from high-resolution airborne videography for precision crop management. Int. J. Remote Sens. 2003, 24, 2855–2877. [Google Scholar] [CrossRef]
- Kataoka, T.; Kaneko, T.; Okamoto, H.; Hata, S. Crop growth estimation system using machine vision. In Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003), Kobe, Japan, 20–24 July 2003. [Google Scholar]
- Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87. [Google Scholar] [CrossRef]
- Guijarro, M.; Pajares, G.; Riomoros, I.; Herrera, P.; Burgos-Artizzu, X.; Ribeiro, A. Automatic segmentation of relevant textures in agricultural images. Comput. Electron. Agric. 2011, 75, 75–83. [Google Scholar] [CrossRef]
- Gamon, J.A.; Surfus, J.S. Assessing leaf pigment content and activity with a reflectometer. New Phytol. 1999, 143, 105–117. [Google Scholar] [CrossRef]
- Hague, T.; Tillett, N.D.; Wheeler, H. Automated crop and weed monitoring in widely spaced cereals. Precis. Agric. 2006, 7, 21–32. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282. [Google Scholar] [CrossRef]
- Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
- Daughtry, C.S.T.; Walthall, C.; Kim, M.; De Colstoun, E.B.; McMurtrey Iii, J. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
- Gong, P.; Pu, R.; Biging, G.S.; Larrieu, M.R. Estimation of forest leaf area index using vegetation indices derived from Hyperion hyperspectral data. IEEE Trans. Geosci. Remote Sens. 2003, 41, 1355–1362. [Google Scholar] [CrossRef]
- Chen, Y.; Gillieson, D. Evaluation of Landsat TM vegetation indices for estimating vegetation cover on semi-arid rangelands: A case study from Australia. Can. J. Remote Sens. 2009, 35, 435–446. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Viña, A.; Verma, S.B.; Rundquist, D.C.; Arkebauer, T.J.; Keydan, G.; Leavitt, B.; Ciganda, V.; Burba, G.G.; Suyker, A.E. Relationship between gross primary production and chlorophyll content in crops: Implications for the synoptic monitoring of vegetation productivity. J. Geophys. Res.-Atmos. 2006, 111, D08S11. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Merzlyak, M.N. Remote estimation of chlorophyll content in higher plant leaves. Int. J. Remote Sens. 1997, 18, 2691–2697. [Google Scholar] [CrossRef]
- Rouse, J.W., Jr.; Haas, R.H.; Deering, D.; Schell, J.; Harlan, J.C. Monitoring the Vernal Advancement and Retrogradation (Green Wave Effect) of Natural Vegetation. 1974. Available online: https://ntrs.nasa.gov/citations/19740022555 (accessed on 16 April 2025).
- Chen, H.; Huang, W.; Li, W.; Niu, Z.; Zhang, L.; Xing, S. Estimation of LAI in Winter Wheat from Multi-Angular Hyperspectral VNIR Data: Effects of View Angles and Plant Architecture. Remote Sens. 2018, 10, 1630. [Google Scholar] [CrossRef]
- Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107. [Google Scholar] [CrossRef]
- Roujean, J.L.; Breon, F.M. Estimating par absorbed by vegetation from bidirectional reflectance measurements. Remote Sens. Environ. 1995, 51, 375–384. [Google Scholar] [CrossRef]
- Tian, Y.; Li, Y.; Feng, W.; Tian, Y.; Yao, X.; Cao, W. Monitoring leaf nitrogen in rice using canopy reflectance spectra. In Proceedings of the International Symposium on Intelligent Information Technology in Agriculture, Online, 21–23 November 2007. [Google Scholar]
- Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sens. Environ. 2002, 81, 416–426. [Google Scholar] [CrossRef]
- Qi, J.; Chehbouni, A.; Huete, A.R.; Kerr, Y.H.; Sorooshian, S. A modified soil adjusted vegetation index. Remote Sens. Environ. 1994, 48, 119–126. [Google Scholar] [CrossRef]
- Broge, N.H.; Leblanc, E. Comparing prediction power and stability of broadband and hyperspectral vegetation indices for estimation of green leaf area index and canopy chlorophyll density. Remote Sens. Environ. 2001, 76, 156–172. [Google Scholar] [CrossRef]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1097–1105. [Google Scholar] [CrossRef]
- Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556. [Google Scholar]
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015. [Google Scholar]
- Srivastava, R.K.; Greff, K.; Schmidhuber, J. Training very deep networks. arXiv 2015, arXiv:1507.06228. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016. [Google Scholar]
- Meng, Z.; Li, L.; Tang, X.; Feng, Z.; Jiao, L.; Liang, M. Multipath residual network for spectral-spatial hyperspectral image classification. Remote Sens. 2019, 11, 1896. [Google Scholar] [CrossRef]
- Ioffe, S.; Szegedy, C. Batch normalization: Accelerating deep network training by reducing internal covariate shift. arXiv 2015, arXiv:1502.03167. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Identity mappings in deep residual networks. In Proceedings of the Computer Vision–ECCV 2016: 14th European Conference, Amsterdam, The Netherlands, 11–14 October 2016; Proceedings, Part IV 14. Springer: Berlin/Heidelberg, Germany, 2016. [Google Scholar]
- Batista, G.E.; Prati, R.C.; Monard, M.C. A study of the behavior of several methods for balancing machine learning training data. ACM SIGKDD Explor. Newsl. 2004, 6, 20–29. [Google Scholar] [CrossRef]
- Chicco, D.; Warrens, M.J.; Jurman, G. The coefficient of determination R-squared is more informative than SMAPE, MAE, MAPE, MSE and RMSE in regression analysis evaluation. Peerj Comput. Sci. 2021, 7, e623. [Google Scholar] [CrossRef]
- Tian, B.; Luan, S.; Zhang, L.; Liu, Y.; Zhang, L.; Li, H. Penalties in yield and yield associated traits caused by stem lodging at different developmental stages in summer and spring foxtail millet cultivars. Field Crops Res. 2018, 217, 104–112. [Google Scholar] [CrossRef]
- Zhang, X.; Liu, D.; Ma, J.; Wang, X.; Li, Z.; Zheng, D. Visible Near-Infrared Hyperspectral Soil Organic Matter Prediction Based on Combinatorial Modeling. Agronomy 2024, 14, 789. [Google Scholar] [CrossRef]
- Tian, M.; Ban, S.; Yuan, T.; Ji, Y.; Ma, C.; Li, L. Assessing rice lodging using UAV visible and multispectral image. Int. J. Remote Sens. 2021, 42, 8840–8857. [Google Scholar] [CrossRef]
- Zhang, T.-X.; Su, J.-Y.; Liu, C.-J.; Chen, W.-H. Potential bands of sentinel-2A satellite for classification problems in precision agriculture. Int. J. Autom. Comput. 2019, 16, 16–26. [Google Scholar] [CrossRef]
- Zhan, Z.; Qin, Q.; Ghulan, A.; Wang, D. NIR-red spectral space based new method for soil moisture monitoring. Sci. China Ser. D Earth Sci. 2007, 50, 283–289. [Google Scholar] [CrossRef]
- Laroche-Pinel, E.; Albughdadi, M.; Duthoit, S.; Chéret, V.; Rousseau, J.; Clenet, H. Understanding vine hyperspectral signature through different irrigation plans: A first step to monitor vineyard water status. Remote Sens. 2021, 13, 536. [Google Scholar] [CrossRef]
- Lu, Y.; Lu, R. Detection of surface and subsurface defects of apples using structured-illumination reflectance imaging with machine learning algorithms. Trans. ASABE 2018, 61, 1831–1842. [Google Scholar] [CrossRef]
- Marsland, S. Machine Learning: An Algorithmic Perspective; Chapman and Hall/CRC: London, UK, 2011. [Google Scholar]
- Easson, D.; White, E.; Pickles, S. The effects of weather, seed rate and cultivar on lodging and yield in winter wheat. J. Agric. Sci. 1993, 121, 145–156. [Google Scholar] [CrossRef]
- Lang, Y.-Z.; Yang, X.-D.; Wang, M.-E.; Zhu, Q.-S. Effects of lodging at different filling stages on rice yield and grain quality. Rice Sci. 2012, 19, 315–319. [Google Scholar] [CrossRef]
- Mi, C.; Zhang, X.; Li, S.; Yang, J.; Zhu, D.; Yang, Y. Assessment of environment lodging stress for maize using fuzzy synthetic evaluation. Math. Comput. Model. 2011, 54, 1053–1060. [Google Scholar] [CrossRef]
| Vegetation Index | Definition | Reference |
|---|---|---|
| The value of each band | / | / |
| EXR | / | [31] |
| EXG | / | [32] |
| EXGR | / | [33] |
| MGRVI | / | [29] |
| NGRDI | / | [34] |
| RGRI | / | [35] |
| PPRb | / | [36] |
| CIVE | / | [37] |
| VARI | / | [38] |
| WI | / | [32] |
| GLA | / | [39] |
| RGBVI | / | [40] |
| VEG | / | [41] |
| COM | / | [39] |
| COM2 | / | [39] |
| CI | / | [42] |
| DVI | nir − r | [43] |
| GNDVI | (nir − g)/(nir + g) | [44] |
| GRVI | / | [41,45] |
| MCARI | / | [45] |
| MNVI | / | [46] |
| MSR | / | [47] |
| MTCI | (nir − re)/(re − r) | [48] |
| NDRE | (nir − re)/(nir + re) | [49] |
| NDVI | (nir − r)/(nir + r) | [50] |
| NLI | / | [51] |
| OSAVI | / | [52] |
| RDVI | (nir − r)/sqrt(nir + r) | [53] |
| RVI1 | / | [43] |
| RVI2 | / | [54] |
| TO | / | [55] |
| SAVI | / | [56] |
| TVI | / | [57] |
| Date | ResNet Accuracy | ResNet Recall | ResNet F1 | SMOTE-ResNet Accuracy | SMOTE-ResNet Recall | SMOTE-ResNet F1 |
|---|---|---|---|---|---|---|
| 55 DAE | 0.69 | 0.68 | 0.67 | 0.72 | 0.73 | 0.71 |
| 65 DAE | 0.76 | 0.76 | 0.73 | 0.77 | 0.76 | 0.77 |
| 76 DAE | 0.66 | 0.64 | 0.61 | 0.73 | 0.72 | 0.70 |
| 85 DAE | 0.72 | 0.71 | 0.70 | 0.75 | 0.72 | 0.71 |
| 95 DAE | 0.66 | 0.66 | 0.64 | 0.70 | 0.70 | 0.69 |
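Accuracy, recall, and F1 as reported above can be computed from class predictions. A pure-Python sketch (macro-averaged recall and F1 are a common choice for multi-class lodging grades, though the paper's exact averaging scheme is not stated here):

```python
def classification_metrics(y_true, y_pred):
    """Overall accuracy plus macro-averaged recall and F1 across classes."""
    classes = sorted(set(y_true) | set(y_pred))
    acc = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    recalls, f1s = [], []
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        rec = tp / (tp + fn) if tp + fn else 0.0
        prec = tp / (tp + fp) if tp + fp else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        recalls.append(rec)
        f1s.append(f1)
    n = len(classes)
    return acc, sum(recalls) / n, sum(f1s) / n

# toy 3-class lodging-grade labels (hypothetical values)
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
acc, macro_rec, macro_f1 = classification_metrics(y_true, y_pred)
```

Macro averaging weights every lodging grade equally, which is why it pairs naturally with the SMOTE balancing discussed in the paper.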
| Date | Metric | ResNet-EF | ResNet-EF + Measured Lodging | ResNet-EF + Estimated Lodging | ResNet-MF | ResNet-MF + Measured Lodging | ResNet-MF + Estimated Lodging |
|---|---|---|---|---|---|---|---|
| 55 DAE | R2 | 0.19 | 0.22 | 0.21 | 0.20 | 0.22 | 0.21 |
| 55 DAE | RMSE | 798.01 | 786.45 | 790.72 | 795.33 | 784.98 | 789.18 |
| 65 DAE | R2 | 0.40 | 0.42 | 0.41 | 0.40 | 0.44 | 0.43 |
| 65 DAE | RMSE | 677.63 | 662.63 | 670.03 | 676.48 | 653.92 | 655.13 |
| 76 DAE | R2 | 0.47 | 0.50 | 0.49 | 0.48 | 0.51 | 0.49 |
| 76 DAE | RMSE | 636.13 | 615.23 | 624.27 | 630.20 | 612.41 | 622.94 |
| 85 DAE | R2 | 0.59 | 0.63 | 0.60 | 0.62 | 0.65 | 0.63 |
| 85 DAE | RMSE | 564.97 | 541.25 | 559.04 | 547.18 | 529.56 | 539.75 |
| 95 DAE | R2 | 0.49 | 0.53 | 0.52 | 0.50 | 0.54 | 0.53 |
| 95 DAE | RMSE | 625.59 | 600.69 | 606.48 | 618.34 | 594.25 | 599.53 |
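The R2 and RMSE values reported above follow their standard definitions. A minimal NumPy sketch (the yield values below are hypothetical, not from the study):

```python
import numpy as np

def r2_rmse(y_true, y_pred):
    """Coefficient of determination and root-mean-square error."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)           # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)    # total sum of squares
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return r2, rmse

# toy plot-level soybean yields (illustrative values only)
y_true = [3200.0, 2800.0, 3500.0, 3000.0]
y_pred = [3100.0, 2900.0, 3300.0, 3100.0]
r2, rmse = r2_rmse(y_true, y_pred)
```

RMSE carries the units of the yield variable, which is why the table pairs each dimensionless R2 with an RMSE on the yield scale.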
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Xu, X.; Fang, Y.; Sun, G.; Zhang, Y.; Wang, L.; Chen, C.; Ren, L.; Meng, L.; Li, Y.; Qiu, L.; et al. Soybean Lodging Classification and Yield Prediction Using Multimodal UAV Data Fusion and Deep Learning. Remote Sens. 2025, 17, 1490. https://doi.org/10.3390/rs17091490