Fusion of UAV-Acquired Visible Images and Multispectral Data by Applying Machine-Learning Methods in Crop Classification
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Area
2.2. UAV-Based Remote-Sensing Data Acquisition
2.3. Data Preprocessing
2.4. Research Methodology
2.4.1. Image Segmentation
2.4.2. Feature Extraction
2.4.3. Experimental Protocol
2.4.4. Machine-Learning Modeling
2.4.5. Accuracy Evaluation
User accuracy:

$$UA_i = \frac{x_{ii}}{x_{+i}}$$

where:
- $UA_i$, user accuracy for class $i$;
- $x_{ii}$, the number of pixels in the confusion matrix that actually belong to class $i$ and are correctly classified as class $i$ (diagonal elements);
- $x_{+i}$, the number of all pixels predicted to be of class $i$ (i.e., the sum of all elements in column $i$ of the confusion matrix).

Producer accuracy:

$$PA_i = \frac{x_{ii}}{x_{i+}}$$

where:
- $PA_i$, producer accuracy of class $i$;
- $x_{ii}$, the number of pixels in the confusion matrix that actually belong to class $i$ and are correctly classified as class $i$ (diagonal elements);
- $x_{i+}$, the number of all pixels that actually belong to class $i$ (i.e., the sum of all elements in row $i$ of the confusion matrix).

Overall accuracy:

$$OA = \frac{\sum_{i=1}^{n} x_{ii}}{N}$$

where:
- $x_{ii}$, the number of samples that actually belong to class $i$ and are correctly classified (diagonal elements of the confusion matrix);
- $n$, the total number of classes;
- $N$, the total number of samples, i.e., the sum of all elements in the confusion matrix;
- $\sum_{i=1}^{n} x_{ii}$, the total number of correctly classified samples across all classes.
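The metrics defined above can be computed directly from a confusion matrix. The sketch below is a minimal illustration (not the authors' code; the paper's analysis used R packages, shown here in Python), following the row/column convention stated above (rows = actual class, columns = predicted class), and also computing the kappa coefficient reported in the results:

```python
def accuracy_metrics(cm):
    """Per-class user/producer accuracy, overall accuracy, and kappa
    from a square confusion matrix (rows = actual, columns = predicted)."""
    n = len(cm)                                    # number of classes
    N = sum(sum(row) for row in cm)                # total number of samples
    diag = [cm[i][i] for i in range(n)]            # correctly classified samples
    col_sums = [sum(cm[r][i] for r in range(n)) for i in range(n)]  # predicted totals
    row_sums = [sum(cm[i]) for i in range(n)]                       # actual totals
    ua = [diag[i] / col_sums[i] for i in range(n)]  # user accuracy per class
    pa = [diag[i] / row_sums[i] for i in range(n)]  # producer accuracy per class
    oa = sum(diag) / N                              # overall accuracy
    # Cohen's kappa: chance-corrected agreement derived from the same matrix
    pe = sum(row_sums[i] * col_sums[i] for i in range(n)) / (N * N)
    kappa = (oa - pe) / (1 - pe)
    return ua, pa, oa, kappa

# Two-class toy example (hypothetical counts, not the paper's data)
cm = [[50, 3], [2, 45]]
ua, pa, oa, kappa = accuracy_metrics(cm)
```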
3. Results
3.1. Optimal Split Ratio
3.2. Classification Accuracy
3.3. Feature Importance
3.4. Crop-Distribution Map and Confusion Matrix
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Kwak, G.-H.; Park, N.-W. Impact of Texture Information on Crop Classification with Machine Learning and UAV Images. Appl. Sci. 2019, 9, 643. [Google Scholar] [CrossRef]
- Kim, Y.; Park, N.-W.; Lee, K.-D. Self-Learning Based Land-Cover Classification Using Sequential Class Patterns from Past Land-Cover Maps. Remote Sens. 2017, 9, 921. [Google Scholar] [CrossRef]
- Al-Awar, B.; Awad, M.M.; Jarlan, L.; Courault, D. Evaluation of Nonparametric Machine-Learning Algorithms for an Optimal Crop Classification Using Big Data Reduction Strategy. Remote Sens. Earth Syst. Sci. 2022, 5, 141–153. [Google Scholar] [CrossRef]
- Song, T.Q.; Zhang, X.Y.; Li, J.X.; Fan, H.S.; Sun, Y.Y.; Zong, D.; Liu, T.X. Research on application of deep learning in multi-temporal greenhouse extraction. Comput. Eng. Appl. 2020, 5, 12. [Google Scholar]
- Navalgund, R.R.; Jayaraman, V.; Roy, P.S. Remote Sensing Applications: An Overview. Curr. Sci. 2007, 93, 1747–1766. [Google Scholar]
- Seelan, S.K.; Laguette, S.; Casady, G.M.; Seielstad, G.A. Remote sensing applications for precision agriculture: A learning community approach. Remote Sens. Environ. 2003, 88, 157–169. [Google Scholar] [CrossRef]
- Hufkens, K.; Melaas, E.K.; Mann, M.L.; Foster, T.; Ceballos, F.; Robles, M.; Kramer, B. Monitoring crop phenology using a smartphone based near-surface remote sensing approach. Agric. For. Meteorol. 2019, 265, 327–337. [Google Scholar] [CrossRef]
- Sivakumar, M.V.K.; Roy, P.S.; Harmsen, K.; Saha, S.K. Satellite remote sensing and gis applications in agricultural meteorology. In Proceedings of the Training Workshop, Dehradun, India, 7–11 July 2003. [Google Scholar]
- Li, Z.M.; Zhao, J.; Lan, Y.B.; Cui, X.; Yang, H.B. Crop classification based on UAV visible image. J. Northwest A F Univ. (Nat. Sci. Ed.) 2019, 11, 27. [Google Scholar]
- Malamiri, H.R.G.; Aliabad, F.A.; Shojaei, S.; Morad, M.; Band, S.S. A study on the use of UAV images to improve the separation accuracy of agricultural land areas. Comput. Electron. Agric. 2021, 184, 106079. [Google Scholar] [CrossRef]
- Rodríguez, J.; Lizarazo, I.; Prieto, F.A.; Morales, V.D.A. Assessment of potato late blight from UAV-based multispectral imagery. Comput. Electron. Agric. 2021, 184, 106061. [Google Scholar] [CrossRef]
- Candiago, S.; Remondino, F.; De Giglio, M.; Dubbini, M.; Gattelli, M. Evaluating Multispectral Images and Vegetation Indices for Precision Farming Applications from UAV Images. Remote Sens. 2015, 7, 4026–4047. [Google Scholar] [CrossRef]
- Wan, L.; Li, Y.; Cen, H.; Zhu, J.; Yin, W.; Wu, W.; Zhu, H.; Sun, D.; Zhou, W.; He, Y. Combining UAV-based vegetation indices and image classification to estimate flower number in oilseed rape. Remote Sens. 2018, 10, 1484. [Google Scholar] [CrossRef]
- Li, L.; Mu, X.; Macfarlane, C.; Song, W.; Chen, J.; Yan, K.; Yan, G. A half-Gaussian fitting method for estimating fractional vegetation cover of corn crops using unmanned aerial vehicle images. Agric. For. Meteorol. 2018, 262, 379–390. [Google Scholar] [CrossRef]
- Marcaccio, J.V.; Markle, C.E.; Chow-Fraser, P. Unmanned Aerial Vehicles Produce High-Resolution, Seasonally-Relevant Imagery for Classifying Wetland Vegetation. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2015, XL-1/W4, 249–256. [Google Scholar] [CrossRef]
- Yu, J.; Shan, L.; Li, F. Application of multi-source image fusion technology in UAV. Radio Eng. J. 2019, 49, 581–586. [Google Scholar]
- Torres-Sánchez, J.; López-Granados, F.; Peña, J.M. An automatic object-based method for optimal thresholding in UAV images: Application for vegetation detection in herbaceous crops. Comput. Electron. Agric. 2015, 114, 43–52. [Google Scholar] [CrossRef]
- Shackelford, A.K.; Davis, C.H. A combined fuzzy pixel-based and object-based approach for classification of high-resolution multispectral data over urban areas. IEEE Trans. Geosci. Remote Sens. 2003, 41, 2354–2363. [Google Scholar] [CrossRef]
- Rodriguez-Galiano, V.F.; Ghimire, B.; Rogan, J.; Chica-Olmo, M.; Rigol-Sánchez, J.P. An assessment of the effectiveness of a random forest classifier for land-cover classification. ISPRS J. Photogramm. Remote Sens. 2012, 67, 93–104. [Google Scholar] [CrossRef]
- Du, P.; Samat, A.; Waske, B.; Liu, S.; Li, Z. Random Forest and Rotation Forest for fully polarized SAR image classification using polarimetric and spatial features. ISPRS J. Photogramm. Remote Sens. 2015, 105, 38–53. [Google Scholar] [CrossRef]
- Radočaj, D.; Jurišić, M.; Gašparović, M.; Plaščak, I.; Antonić, O. Cropland Suitability Assessment Using Satellite-Based Biophysical Vegetation Properties and Machine Learning. Agronomy 2021, 11, 1620. [Google Scholar] [CrossRef]
- Li, M.; Ma, L.; Blaschke, T.; Cheng, L.; Tiede, D. A systematic comparison of different object-based classification techniques using high spatial resolution imagery in agricultural environments. Int. J. Appl. Earth Obs. Geoinf. 2016, 49, 87–98. [Google Scholar] [CrossRef]
- Lottes, P.; Khanna, R.; Pfeifer, J.; Siegwart, R.; Stachniss, C. UAV-based crop and weed classification for smart farming. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 3024–3031. [Google Scholar] [CrossRef]
- Peñá-Barragán, J.M.; Ngugi, M.K.; Plant, R.E.; Six, J. Object-based crop identification using multiple vegetation indices, textural features and crop phenology. Remote Sens. Environ. 2011, 115, 1301–1316. [Google Scholar] [CrossRef]
- Duke, O.P.; Alabi, T.; Neeti, N.; Adewopo, J.B. Comparison of UAV and SAR performance for Crop type classification using machine learning algorithms: A case study of humid forest ecology experimental research site of West Africa. Int. J. Remote Sens. 2022, 43, 4259–4286. [Google Scholar] [CrossRef]
- Zhang, X.; Zhang, F.; Qi, Y.; Deng, L.; Wang, X.; Yang, S. New research methods for vegetation information extraction based on visible light remote sensing images from an unmanned aerial vehicle (UAV). Int. J. Appl. Earth Obs. Geoinf. 2019, 78, 215–226. [Google Scholar] [CrossRef]
- Barrero, O.; Perdomo, S.A. RGB and multispectral UAV image fusion for Gramineae weed detection in rice fields. Precis. Agric. 2018, 19, 809–822. [Google Scholar] [CrossRef]
- Cimtay, Y.; Özbay, B.; Yilmaz, G.; Bozdemir, E. A New Vegetation Index in Short-Wave Infrared Region of Electromagnetic Spectrum. IEEE Access 2021, 9, 148535–148545. [Google Scholar] [CrossRef]
- Fan, Y.-G.; Feng, H.-K.; Liu, Y.; Bian, M.-B.; Zhao, Y.; Yang, G.-J.; Qian, J.-G. Estimation of potato plant nitrogen content using UAV multi-source sensor information. Spectrosc. Spect. Anal. 2022, 42, 3217–3225. [Google Scholar]
- Ma, Y.; Bian, M.; Fan, Y.; Chen, Z.; Yang, G.; Feng, H. Estimation of potassium content of potato plants based on UAV RGB images. Trans. Chin. Soc. Agric. Mach. 2023, 54, 196–203+233. [Google Scholar] [CrossRef]
- Bauer, S.D.; Korč, F.; Förstner, W. The potential of automatic methods of classification to identify leaf diseases from multispectral images. Precis. Agric. 2011, 12, 361–377. [Google Scholar] [CrossRef]
- Huete, A.R.; Liu, H.; de Lira, G.R.; Batchily, K.; Escadafal, R. A soil color index to adjust for soil and litter noise in vegetation index imagery of arid regions. In Proceedings of the IGARSS’94—1994 IEEE International Geoscience and Remote Sensing Symposium, Pasadena, CA, USA, 8–12 August 1994; Volume 2, pp. 1042–1043. [Google Scholar] [CrossRef]
- Miura, T.; Huete, A.R.; Yoshioka, H. Evaluation of sensor calibration uncertainties on vegetation indices for MODIS. IEEE Trans. Geosci. Remote Sens. 2000, 38, 1399–1409. [Google Scholar] [CrossRef]
- Meyer, G.E.; Mehta, T.; Kocher, M.F.; Mortensen, D.A.; Samal, A. Textural imaging and discriminant analysis for distinguishing weeds for spot spraying. Trans. ASABE 1998, 41, 1189–1197. [Google Scholar] [CrossRef]
- Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293. [Google Scholar] [CrossRef]
- Meyer, G.E.; Hindman, T.W.; Laksmi, K. Machine vision detection parameters for plant species identification. In Proceedings of the Precision Agriculture and Biological Quality, Boston, MA, USA, 3–4 November 1998. [Google Scholar]
- Daughtry, C.S.T.; Gallo, K.; Goward, S.N.; Prince, S.D.; Kustas, W.P. Spectral estimates of absorbed radiation and phytomass production in corn and soybean canopies. Remote Sens. Environ. 1992, 39, 141–152. [Google Scholar] [CrossRef]
- Chen, J.M.; Cihlar, J. Retrieving leaf area index of boreal conifer forests using Landsat TM images. Remote Sens. Environ. 1996, 55, 153–162. [Google Scholar]
- Lyon, J.G.; Yuan, D.; Lunetta, R.; Elvidge, C.D. A change detection experiment using vegetation indices. Photogramm. Eng. Remote Sens. 1998, 64, 143–150. [Google Scholar]
- Rouse, J.W.; Haas, R.H.; Deering, D.W.; Schell, J.A.; Harlan, J.C. Monitoring the Vernal Advancement and Retrogradation (Green Wave Effect) of Natural Vegetation; Great Plains Corridor; NTRS-NASA Technical Reports Server: Washington, DC, USA, 1973. [Google Scholar]
- Verrelst, J.; Schaepman, M.E.; Koetz, B.; Kneubühler, M. Angular sensitivity analysis of vegetation indices derived from CHRIS/PROBA data. Remote Sens. Environ. 2008, 112, 2341–2353. [Google Scholar] [CrossRef]
- Clay, D.E.; Kim, K.-I.; Chang, J.; Clay, S.A.; Dalsted, K. Characterizing Water and Nitrogen Stress in Corn Using Remote Sensing. Agron. J. 2006, 98, 579–587. [Google Scholar] [CrossRef]
- Kross, A.; McNairn, H.; Lapen, D.; Sunohara, M.; Champagne, C. Assessment of Rapid Eye vegetation indices for estimation of leaf area index and biomass in corn and soybean crops. Int. J. Appl. Earth Obs. Geoinf. 2015, 34, 235–248. [Google Scholar]
- Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
- Baloloy, A.B.; Blanco, A.C.; Candido, C.G.; Argamosa, R.J.L.; Dumalag, J.B.L.C.; Dimapilis, L.L.C.; Paringit, E.C. Estimation of mangrove forest aboveground biomass using multispectral bands, vegetation indices and biophysical variables derived from optical satellite imageries: Rapid eye, planet scope and sentinel-2. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 4, 29–36. [Google Scholar] [CrossRef]
- Penuelas, J.; Frederic, B.; Filella, I. Semi-Empirical Indices to Assess Carotenoids/Chlorophyll-a Ratio from Leaf Spectral Reflectance. Photosynthetica 1995, 31, 221–230. [Google Scholar]
- Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352. [Google Scholar] [CrossRef]
- Tomás, A.d.; Nieto, H.; Guzinski, R.; Mendiguren, G.; Sandholt, I.; Berline, P. Multi-scale approach of the surface temperature/vegetation index triangle method for estimating evapotranspiration over heterogeneous landscapes. Geophys. Res. Abstr. 2012, 14, EGU2012-697. [Google Scholar]
- Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87. [Google Scholar] [CrossRef]
- Ling, C.; Liu, H.; Ji, P.; Hu, H.; Wang, X.; Hou, R. Estimation of Vegetation Coverage Based on VDVI Index of UAV Visible Image—Using the Shelterbelt Research Area as An Example. For. Eng. 2021, 2, 57–66. [Google Scholar]
- Wen, L.; Yang, B.; Cui, C.; You, L.; Zhao, M. Ultrasound-Assisted Extraction of Phenolics from Longan (Dimocarpus longan Lour.) Fruit Seed with Artificial Neural Network and Their Antioxidant Activity. Food Anal. Methods 2012, 5, 1244–1251. [Google Scholar] [CrossRef]
- Gitelson, A.A. Wide Dynamic Range Vegetation Index for remote quantification of biophysical characteristics of vegetation. J. Plant Physiol. 2004, 161, 165–173. [Google Scholar] [CrossRef]
- Guo, Q.; Zhang, J.; Guo, S.; Ye, Z.; Deng, H.; Hou, X.; Zhang, H. Urban Tree Classification Based on Object-oriented Approach and Random Forest Algorithm Using Unmanned Aerial Vehicle (UAV) Multispectral Imagery. Remote Sens. 2022, 14, 3885. [Google Scholar] [CrossRef]
- Garg, R.; Kumar, A.; Prateek, M.; Pandey, K.; Kumar, S. Land Cover Classification of Spaceborne Multifrequency SAR and Optical Multispectral Data Using Machine Learning. Adv. Space Res. 2022, 69, 1726–1742. [Google Scholar] [CrossRef]
- Haralick, R.M.; Shanmugam, K.; Dinstein, I. Textural Features for Image Classification. IEEE Trans. Syst. Man Cybern. 1973, SMC-3, 610–621. [Google Scholar] [CrossRef]
- Ajayi, O.G.; Opaluwa, Y.D.; Ashi, J.; Zikirullahi, W.M. Applicability of artificial neural network for automatic crop type classification on UAV-based images. Environ. Technol. Sci. J. 2022, 13. [Google Scholar] [CrossRef]
- Antoniadis, A.; Cugliari, J.; Fasiolo, M.; Goude, Y.; Poggi, J.M. Random Forests. In Statistical Learning Tools for Electricity Load Forecasting. Statistics for Industry, Technology, and Engineering; Birkhäuser: Cham, Switzerland, 2024. [Google Scholar]
- Liu, B.; Shi, Y.; Duan, Y.; Wu, W. UAV-Based Crops Classification with Joint Features from Orthoimage and DSM Data. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, XLII-3, 1023–1028. [Google Scholar] [CrossRef]
- Guo, W.; Gong, Z.; Gao, C.; Yue, J.; Fu, Y.; Sun, H.; Zhang, H.; Zhou, L. An accurate monitoring method of peanut southern blight using unmanned aerial vehicle remote sensing. Precis. Agric. 2024, 25, 1857–1876. [Google Scholar] [CrossRef]
- Lan, Y.; Huang, Z.; Deng, X.; Zhu, Z.; Huang, H.; Zheng, Z.; Lian, B.; Zeng, G.; Tong, Z. Comparison of machine learning methods for citrus greening detection on UAV multispectral images. Comput. Electron. Agric. 2020, 171, 105234. [Google Scholar] [CrossRef]
- Krzywinski, M.; Altman, N. Classification and regression trees. Nat. Methods 2017, 14, 757–758. [Google Scholar] [CrossRef]
- Xu, W.C.; Lan, Y.B.; Li, Y.H.; Luo, Y.F.; He, Z.Y. Classification method of cultivated land based on UAV visible light remote sensing. Int. J. Agric. Biol. Eng. 2019, 12, 103–109. [Google Scholar] [CrossRef]
- Peng, X.; Xue, W.; Luo, Y.; Zhao, Z. Precise classification of cultivated land based on visible remote sensing image of UAV. Int. J. Agric. Biol. Eng. 2019, 21, 79–86. [Google Scholar]
- Deng, H.; Zhang, W.; Zheng, X.; Zhang, H. Crop Classification Combining Object-Oriented Method and Random Forest Model Using Unmanned Aerial Vehicle (UAV) Multispectral Image. Agriculture 2024, 14, 548. [Google Scholar] [CrossRef]
- Yang, D.J.; Zhao, J.; Lan, Y.B.; Wen, Y.T.; Pan, F.J.; Cao, D.L.; Hu, C.X.; Guo, J.K. Research on farmland crop classification based on UAV multispectral remote sensing images. Int. J. Precis Agric. Aviat. 2021, 4, 29–35. [Google Scholar] [CrossRef]
- Yang, M.D.; Huang, K.S.; Kuo, Y.H.; Hui, T.; Lin, L.M. Spatial and spectral hybrid image classification for rice lodging assessment through UAV imagery. Remote Sens. 2017, 9, 583. [Google Scholar] [CrossRef]
- Hunt, E.R., Jr.; Hively, W.D.; Fujikawa, S.J.; Linden, D.S.; Daughtry, C.S.T.; McCarty, G.W. Acquisition of NIR-green–blue digital photographs from unmanned aircraft for crop monitoring. Remote Sens. 2010, 2, 290–305. [Google Scholar] [CrossRef]
- Liu, T.; Abd-Elrahman, A. Multi-view object-based classification of wetland land covers using unmanned aircraft system images. Remote Sens. Environ. 2018, 216, 122–138. [Google Scholar] [CrossRef]
- Feng, Q.; Liu, J.; Gong, J. UAV Remote sensing for urban vegetation mapping using random forest and texture analysis. Remote Sens. 2015, 7, 1074–1094. [Google Scholar] [CrossRef]
- Allu, A.R.; Mesapam, S. Fusion of different multispectral band combinations of Sentinel-2A with UAV imagery for crop classification. J. Appl. Remote Sens. 2024, 18, 016511. [Google Scholar] [CrossRef]
- Belgiu, M.; Drăguţ, L. Random forest in remote sensing: A review of applications and future directions. ISPRS J. Photogramm. Remote Sens. 2016, 114, 24–31. [Google Scholar] [CrossRef]
- Biau, G.; Scornet, E. A random forest guided tour. Test 2016, 25, 197–227. [Google Scholar] [CrossRef]
- Fradkin, D.; Muchnik, I. Support vector machines for classification. DIMACS Ser. Discrete. Math. Theor. Comput. Sci. 2006, 70, 13–20. [Google Scholar]
- Jannoura, R.; Brinkmann, K.; Uteau, D.; Bruns, C.; Joergensen, R.G. Monitoring of crop biomass using true colour aerial photographs taken from a remote controlled hexacopter. Biosyst. Eng. 2019, 129, 341–351. [Google Scholar] [CrossRef]
- Zhu, J.; Pan, Z.; Wang, H.; Huang, P.; Sun, J.; Qin, F.; Liu, Z. An Improved Multi-temporal and Multi-feature Tea Plantation Identification Method Using Sentinel-2 Imagery. Sensors 2019, 19, 2087. [Google Scholar] [CrossRef]
- Huang, X. High Resolution Remote Sensing Image Classification Based on Deep Transfer Learning and Multi Feature Network. IEEE Access 2023, 11, 110075–110085. [Google Scholar] [CrossRef]
- Thakur, R.; Panse, P. Classification Performance of Land Use from Multispectral Remote Sensing Images Using Decision Tree, K-Nearest Neighbor, Random Forest and Support Vector Machine Using EuroSAT Data. Int. J. Intell. Syst. Appl. Eng. 2022, 10, 67–77. [Google Scholar]
- Wang, R.; Shi, W.; Kronzucker, H.; Li, Y. Oxygenation Promotes Vegetable Growth by Enhancing P Nutrient Availability and Facilitating a Stable Soil Bacterial Community in Compacted Soil. Soil. Tillage Res. 2023, 230, 105686. [Google Scholar] [CrossRef]
- Sharma, R.C. Dominant Species-Physiognomy-Ecological (DSPE) System for the Classification of Plant Ecological Communities from Remote Sensing Images. Ecologies 2022, 3, 25. [Google Scholar] [CrossRef]
UAV and Sensors | Technical Specifications | Specific Values |
---|---|---|
UAV | Total weight (including battery and lens) | 7.1 kg |
 | Maximum horizontal flight speed (automatic mode) | 17 m/s |
 | Maximum flight time | 55 min |
 | Symmetrical motor axial distance | 895 mm |
DJI Zenmuse P1 | Lens | 35 mm, FOV 63.5° |
 | Image size | 3:2 (8192 × 5460) |
 | Effective pixels | 45 MP |
 | Aperture range | f/2.8–f/16 |
 | Spectral range | R: 620–750 nm; G: 490–570 nm; B: 450–490 nm |
MS600 Pro | Effective pixels | 1.2 MP |
 | Spectral channels | 450 nm @ 35 nm, 555 nm @ 27 nm, 660 nm @ 22 nm, 720 nm @ 10 nm, 750 nm @ 10 nm, 840 nm @ 30 nm |
 | Image format | 16-bit RAW TIFF & 8-bit reflectance JPEG |
 | Coverage width | 110 m × 83 m @ 120 m flight altitude |
Vegetation Index | Abbreviation | Brief Description | Source |
---|---|---|---|
Green Chlorophyll Index | CIg | Evaluates chlorophyll content in leaves | * |
Red-Edge Chlorophyll Index | CIre | Evaluates chlorophyll content in leaves | * |
Color Index of Vegetation Extraction | CIVE | Reflects the color characteristics of vegetation; can be used to identify vegetation types and estimate biomass | [32] |
Enhanced Vegetation Index | EVI | Reduces atmospheric and soil noise, providing a stable response to vegetation condition | [33] |
Excess Green Index | EXG | Used for detecting vegetation | [34] |
Excess Green Minus Red Index | EXGR | Effectively distinguishes green vegetation from non-vegetated areas in complex backgrounds | [35] |
Excess Red Index | EXR | Used for detecting non-vegetated areas | [36] |
Green Normalized Difference Vegetation Index | GNDVI | Enhances the detection and analysis of green vegetation | [37] |
Modified Simple Ratio Index | MSRI | Enhances vegetation signals while reducing atmospheric interference and soil background noise | [38] |
Normalized Difference Greenness Index | NDGI | Evaluates the greenness of vegetation | [39] |
Normalized Difference Vegetation Index | NDVI | Measures vegetation coverage and health status | [40] |
Red-Edge Normalized Difference Vegetation Index | NDVIre | Evaluates the health status of vegetation | * |
Normalized Green–Blue Difference Index | NGBDI | Identifies vegetated areas and reflects their health status | [41] |
Normalized Pigment Chlorophyll Index | NPCI | Assesses chlorophyll content and photosynthetic capacity of plants | [42] |
Red-Edge Triangular Vegetation Index | RTVICore | Evaluates leaf area index and biomass | [43] |
Soil-Adjusted Vegetation Index | SAVI | Reduces soil background influence | [44] |
Structure-Insensitive Pigment Index | SIPI | Monitors vegetation health, detects plant physiological stress, and supports crop-yield analysis | [45] |
Simple Ratio Index | SR | Common vegetation index for assessing vegetation quantity | [46] |
Transformed Chlorophyll Absorption Reflectance Index | TCARI | Evaluates chlorophyll content and plant health | [47] |
Triangular Vegetation Index | TVI | Relates the radiant energy absorbed by vegetation to reflectance in the red, green, and near-infrared bands; useful for monitoring crop biomass | [48] |
Visible Atmospherically Resistant Index | VARI | Reduces the impact of atmospheric conditions on vegetation index calculations | [49] |
Visible-Band Difference Vegetation Index | VDVI | Uses differences in the visible spectrum to assess vegetation health and coverage | [50] |
Vegetative Index | VEG | RGB-based index used to estimate grassland vegetation coverage | [51] |
Wide Dynamic Range Vegetation Index | WDRVI | Overcomes the loss of NDVI sensitivity at medium-to-high-density green biomass | [52] |
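A few of the listed indices can be computed directly from per-band reflectance. The formulas below are the standard definitions from the cited literature (the exact variants used in the paper may differ slightly, so treat this as an illustrative sketch):

```python
def ndvi(nir, r):
    # Normalized Difference Vegetation Index [40]
    return (nir - r) / (nir + r)

def gndvi(nir, g):
    # Green Normalized Difference Vegetation Index [37]
    return (nir - g) / (nir + g)

def exg(r, g, b):
    # Excess Green Index [34], on normalized RGB channels
    return 2 * g - r - b

def vdvi(r, g, b):
    # Visible-Band Difference Vegetation Index [50], visible channels only
    return (2 * g - r - b) / (2 * g + r + b)

# Example with typical reflectances for a healthy green canopy
dense_canopy = ndvi(0.45, 0.05)   # high value indicates dense green vegetation
```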
Number | Features | Source of Features |
---|---|---|
P1 | SF | Visible light |
P2 | GEOM | Visible light |
P3 | GLCM | Visible light |
P4 | SF + GEOM | Visible light |
P5 | SF + GLCM | Visible light |
P6 | GEOM + GLCM | Visible light |
P7 | ALL | Visible light |
P8 | SF | Multispectral |
P9 | GEOM | Multispectral |
P10 | GLCM | Multispectral |
P11 | SF + GEOM | Multispectral |
P12 | SF + GLCM | Multispectral |
P13 | GEOM + GLCM | Multispectral |
P14 | ALL | Multispectral |
P15 | SF | Visible light + multispectral |
P16 | GEOM | Visible light + multispectral |
P17 | GLCM | Visible light + multispectral |
P18 | SF + GEOM | Visible light + multispectral |
P19 | SF + GLCM | Visible light + multispectral |
P20 | GEOM + GLCM | Visible light + multispectral |
P21 | ALL | Visible light + multispectral |
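The 21 protocols follow a regular pattern: every non-empty combination of the three feature groups (spectral features, geometric features, GLCM texture features), repeated for each of the three data sources. A hypothetical reconstruction of that grid (illustrative only, not the authors' code):

```python
from itertools import combinations

groups = ["SF", "GEOM", "GLCM"]
sources = ["Visible light", "Multispectral", "Visible light + multispectral"]

protocols = []
for source in sources:
    # Enumerate 1-, 2-, and 3-group combinations; the full set is labeled "ALL"
    for k in (1, 2, 3):
        for combo in combinations(groups, k):
            name = "ALL" if k == 3 else " + ".join(combo)
            protocols.append((f"P{len(protocols) + 1}", name, source))

# 7 combinations per source × 3 sources = 21 protocols, matching P1–P21
```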
Class | Number of Training Sample Plots | Number of Random Points |
---|---|---|
Background | 159 | 610 |
Corn | 46 | 435 |
Red bean | 106 | 1549 |
Poplar | 48 | 357 |
Soybean | 66 | 518 |
Wheat | 14 | 109 |
Rice | 66 | 1329 |
Road | 30 | 239 |
Models | Package in R 4.2.1 | Key Parameters |
---|---|---|
RF | randomForest | max tree number = 500; min sample count = 1; max categories = 16; mtry = 3 |
SVM | e1071 | kernel = linear; C = 2; gamma = 0.1 |
KNN | class | K = 1 |
CART | rpart | max categories = 16; minsplit = 20; maxdepth = 10; complexity parameter = 0.01 |
ANN | nnet | size = 10; maxit = 500; decay = 0.01; learning rate = 0.01 |
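As one concrete illustration of the parameter choices above, the K = 1 setting for KNN reduces classification to a single nearest-neighbour lookup. A minimal pure-Python sketch (the paper used the R `class` package; the feature vectors and class labels below are hypothetical):

```python
def knn_predict(train_X, train_y, x, k=1):
    """Classify x by majority vote among its k nearest training samples
    (squared Euclidean distance); k = 1 returns the nearest neighbour's label."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(tx, x)), ty)
        for tx, ty in zip(train_X, train_y)
    )
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

# Hypothetical 2-D feature vectors for three training plots
X = [(0.10, 0.20), (0.90, 0.80), (0.15, 0.25)]
y = ["wheat", "rice", "wheat"]
label = knn_predict(X, y, (0.12, 0.22))   # nearest training plot is a wheat plot
```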
Experimental Protocol | RF OA | RF Kappa | SVM OA | SVM Kappa | KNN OA | KNN Kappa | CART OA | CART Kappa | ANN OA | ANN Kappa |
---|---|---|---|---|---|---|---|---|---|---|
P1 | 92.48% | 0.91 | 80.96% | 0.78 | 89.66% | 0.88 | 85.90% | 0.84 | 92.36% | 0.90 |
P2 | 54.17% | 0.47 | 46.52% | 0.39 | 44.30% | 0.36 | 50.06% | 0.43 | 55.17% | 0.47 |
P3 | 81.08% | 0.78 | 37.18% | 0.21 | 75.44% | 0.72 | 67.33% | 0.63 | 80.59% | 0.78 |
P4 | 93.54% | 0.93 | 83.78% | 0.78 | 93.74% | 0.91 | 88.13% | 0.86 | 92.89% | 0.92 |
P5 | 92.60% | 0.91 | 74.15% | 0.67 | 79.08% | 0.76 | 88.37% | 0.87 | 91.24% | 0.91 |
P6 | 82.37% | 0.80 | 38.76% | 0.25 | 72.50% | 0.68 | 76.73% | 0.73 | 83.76% | 0.81 |
P7 | 93.07% | 0.92 | 73.13% | 0.67 | 78.65% | 0.74 | 88.25% | 0.86 | 92.17% | 0.91 |
P8 | 93.42% | 0.92 | 58.34% | 0.55 | 68.50% | 0.64 | 87.66% | 0.86 | 91.25% | 0.90 |
P9 | 65.10% | 0.60 | 58.34% | 0.49 | 55.58% | 0.49 | 53.11% | 0.46 | 66.28% | 0.61 |
P10 | 87.90% | 0.86 | 47.85% | 0.39 | 61.27% | 0.59 | 77.44% | 0.74 | 85.49% | 0.83 |
P11 | 94.60% | 0.94 | 57.56% | 0.51 | 68.73% | 0.59 | 88.25% | 0.86 | 93.66% | 0.92 |
P12 | 96.83% | 0.96 | 39.86% | 0.29 | 50.13% | 0.47 | 93.42% | 0.92 | 97.23% | 0.96 |
P13 | 88.01% | 0.86 | 60.76% | 0.51 | 70.11% | 0.68 | 76.26% | 0.73 | 89.23% | 0.88 |
P14 | 96.83% | 0.96 | 33.86% | 0.22 | 45.27% | 0.43 | 94.71% | 0.94 | 96.25% | 0.95 |
P15 | 95.30% | 0.95 | 64.50% | 0.58 | 74.38% | 0.71 | 85.31% | 0.83 | 94.56% | 0.93 |
P16 | 66.16% | 0.61 | 59.72% | 0.51 | 63.16% | 0.61 | 54.17% | 0.47 | 67.25% | 0.62 |
P17 | 92.60% | 0.91 | 54.43% | 0.56 | 53.25% | 0.51 | 80.14% | 0.77 | 93.05% | 0.91 |
P18 | 96.36% | 0.96 | 64.77% | 0.61 | 71.26% | 0.69 | 94.36% | 0.93 | 94.89% | 0.93 |
P19 | 97.06% | 0.97 | 36.69% | 0.25 | 46.28% | 0.44 | 89.42% | 0.88 | 96.85% | 0.95 |
P20 | 93.65% | 0.93 | 64.07% | 0.55 | 69.27% | 0.63 | 85.08% | 0.83 | 94.25% | 0.93 |
P21 | 97.77% | 0.97 | 42.80% | 0.31 | 51.75% | 0.49 | 93.30% | 0.92 | 96.28% | 0.95 |
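Reading across the P21 row (all features, fused visible + multispectral data), RF attains the highest overall accuracy among the five models. A quick check using the OA values copied from the table:

```python
# Overall accuracy (%) of each model for protocol P21, from the table above
p21_oa = {"RF": 97.77, "SVM": 42.80, "KNN": 51.75, "CART": 93.30, "ANN": 96.28}

# Pick the model with the highest overall accuracy on the fused all-feature set
best_model = max(p21_oa, key=p21_oa.get)
```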
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zheng, Z.; Yuan, J.; Yao, W.; Kwan, P.; Yao, H.; Liu, Q.; Guo, L. Fusion of UAV-Acquired Visible Images and Multispectral Data by Applying Machine-Learning Methods in Crop Classification. Agronomy 2024, 14, 2670. https://doi.org/10.3390/agronomy14112670