Article

Accuracy of Vegetation Indices in Assessing Different Grades of Grassland Desertification from UAV

1 Ministry of Education Key Laboratory of Ecology and Resource Use of the Mongolian Plateau, School of Ecology and Environment, Inner Mongolia University, Hohhot 010021, China
2 Collaborative Innovation Center for Grassland Ecological Security (Jointly Supported by the Ministry of Education of China and Inner Mongolia Autonomous Region), Hohhot 010021, China
* Author to whom correspondence should be addressed.
Int. J. Environ. Res. Public Health 2022, 19(24), 16793; https://doi.org/10.3390/ijerph192416793
Submission received: 9 November 2022 / Revised: 11 December 2022 / Accepted: 12 December 2022 / Published: 14 December 2022

Abstract

Grassland desertification has become one of the most serious environmental problems in the world, and grasslands are a focus of desertification research because of their ecological vulnerability. Visible-band vegetation indices are widely used to monitor vegetation, but their applicability to different grades of grassland desertification remains poorly understood. Therefore, in this study, 19 vegetation indices were calculated for 30 unmanned aerial vehicle (UAV) visible light images covering five grades of grassland desertification in the Mu Us Sandy Land. Fractional Vegetation Coverage (FVC) with high accuracy was obtained through Support Vector Machine (SVM) classification, and the results were used as the reference values. Based on the FVC, grassland desertification was divided into five grades: severe (FVC < 5%), high (FVC: 5–20%), moderate (FVC: 21–50%), slight (FVC: 51–70%), and non-desertification (FVC: 71–100%). The accuracy of the vegetation indices was assessed by the overall accuracy (OA), the kappa coefficient (k), and the relative error (RE). Our results showed that SVM-supervised classification assessed each grassland desertification grade with high accuracy. The Excess Green Red Blue Difference Index (EGRBDI), Visible Band Modified Soil Adjusted Vegetation Index (V-MSAVI), Green Leaf Index (GLI), Color Index of Vegetation (CIVE), Red Green Blue Vegetation Index (RGBVI), and Excess Green (EXG) accurately assessed grassland desertification at the severe, high, moderate, and slight grades. In addition, the Red Green Ratio Index (RGRI) and Combined 2 (COM2) were accurate in assessing severe desertification. All 19 indices assessed the non-desertification grade with low accuracy. This study emphasizes that the applicability of vegetation indices varies with the degree of grassland desertification and aims to provide scientific guidance for more accurate grassland desertification assessment.

1. Introduction

Grassland desertification is a serious ecological and environmental problem in arid and semi-arid areas [1]. Thus, how to effectively mitigate and combat grassland desertification has become the focus of global ecological and environmental studies [2]. China has nearly 400 million hectares of natural grasslands, accounting for 41.7% of the country’s land area. However, 90% of these grasslands show different degrees of degradation [3]. Grassland desertification is one of the most important aspects of grassland degradation [4]. Grassland desertification seriously affects the balance of ecosystems in a region and leads to the deterioration of the ecological environment, such as local soil erosion and fertility decline [5]. Therefore, assessing the degree of grassland desertification plays an important role in grassland restoration and continued healthy development.
Grassland desertification assessment is important for effective desertification management [6]. Remote sensing is the main method used to assess the extent of desertification in large-scale grasslands [7]. Initially, scholars used empirical visual interpretation to assess desertification, but this is usually labor-intensive and time-consuming and is not suitable for assessing large areas [8]. Moreover, the results of visual interpretation can be very inaccurate [9]. The development of remote sensing indices has facilitated desertification assessment, and a single remote sensing index is widely used in research [10,11,12,13]. However, since a single remote sensing index is greatly influenced by atmospheric and soil factors, its assessment results have certain limitations. For example, the Normalized Difference Vegetation Index (NDVI) is related to humidity, and the Enhanced Vegetation Index (EVI) is affected by soil moisture [14]. It is more reasonable to combine a single remote sensing index with various other remote sensing data, such as precipitation and soil data, to monitor desertification [15].
Fractional Vegetation Coverage (FVC) and its changes can directly reflect changes in the ecological environment of desertification areas and indicate their climate, hydrology, and ecology [16]. Thus, FVC is an important indicator of the degree of desertification [17]. Traditionally, vegetation cover has been obtained by field investigation, but this process consumes considerable labor and economic resources [18]. The supervised classification method consumes considerable storage space and time, so its use is limited in large-scale research [19]. With the continuous optimization of vegetation indices, calculating a vegetation index from remote sensing images can more quickly and effectively reflect vegetation coverage at any scale on the surface [10,20,21]. Therefore, methods based on vegetation indices have been used for estimating vegetation cover [20].
Some vegetation indices are constructed based on ratios such as the NDVI, Soil-Adjusted Vegetation Index (SAVI), Modified SAVI (MSAVI), Green Red Vegetation Index (GRVI), and EVI [20,22,23,24,25,26]. Some vegetation indices are constructed in the form of linear combinations, such as the Perpendicular Vegetation Index (PVI), Weighted Difference Vegetation Index (WDVI), Tasseled Cap Greenness (TCG), and Transformed Vegetation Index (TVI) [27,28,29]. Some vegetation indices are constructed based on soil information, such as Soil-Adjusted and Atmospherically Resistant Vegetation Index (SARVI) and Modified Non-Linear Vegetation Index (MNLI) [30,31]. A series of vegetation indices have been shown to be closely related to desertification and can quickly assess regional desertification [32]. Thus, this study assessed the applicability of these vegetation indices in desertification to provide a scientific basis for grassland desertification assessment [33].
Some studies have found that the applicability of vegetation indices is usually related to datasets, sensors, soil type, atmosphere, and other factors, which lead to large variation in their applicability [30,34]. Although many studies have conducted grassland desertification assessments using vegetation indices, to our knowledge, no research has explored the application of vegetation indices to different grades of grassland desertification. Desertification is usually classified into five grades: severe, high, moderate, slight, and non-desertification; the lower the FVC, the more serious the desertification. With the advantages of low cost, simple operation, and high spatial resolution, UAVs have been widely used in many fields, such as vegetation monitoring and environmental assessment [35,36,37,38]. However, few studies have conducted desertification assessments based on UAV visible light images [39].
In this study, we selected 19 vegetation indices for evaluating grassland desertification based on UAV visible light images in a typical area of the Mu Us Sandy Land. We assessed the applicability of the vegetation indices at different grades of grassland desertification. The objectives of this study were to calculate visible-band vegetation indices from UAV images, to assess the accuracy of the vegetation indices under five grassland desertification grades, and to find the vegetation indices suitable for assessing desertification at each grade.

2. Materials and Methods

2.1. Study Area

The study area is located in a typical area of the Mu Us Sandy Land (108°17′–109°40′ E, 37°38′–39°23′ N), which has a temperate continental monsoon climate with a mean annual temperature of 6–9 °C. Mean annual precipitation increases from approximately 250 mm in the northwest to approximately 440 mm in the southeast. The altitude of the area is 1200–1600 m, and the terrain slopes from northwest to southeast [40,41]. The main soils are Kastanozems, Arenosols, Histosols, and Solonchaks. The zonal vegetation in the study area is dominated by Stipa bungeana and Thymus serpyllum [42].

2.2. UAV Visible Light Image Acquisition

This study was conducted during the vegetation growing season, from 28 August to 5 September 2021. Images were taken from 11 a.m. to 3 p.m. daily, and the weather was good throughout this period. During image acquisition, the average wind speed was 8 km/h, the average visibility was 25 km, and the average relative humidity was 31%. Thus, the obtained UAV images were little affected by meteorological and other factors. A DJI Air 2s drone (with a 20-megapixel camera and a 1-inch image sensor) was used to take pictures in different areas with large differences in grassland desertification grade (Figure 1). Imagery from a consumer-grade visible light UAV provides an economical and efficient method for monitoring and evaluating surface vegetation, so this UAV was selected for grassland desertification assessment. The UAV images were taken with the camera pointing vertically downward at a flight height of 10 m, giving a spatial resolution of approximately 0.20 cm/pixel; this low flight height was chosen to obtain clearer images and minimize image distortion. Because this study did not involve the central wavelength position or range of each band, radiometric correction of the UAV images was not needed [43]. The DJI Air 2s does not support waypoint flight, so single images were used.

2.3. Supervised Classification and Desertification Grades

In this paper, we conducted 15 flights at a total of 15 sites (Figure 1), and 2 images were randomly selected from each site, giving a total of 30 images covering different grassland desertification grades. According to previous studies [44,45], 30 images meet the minimum statistical sample size. From each image, we selected 60 vegetated and 60 non-vegetated regions of interest (ROIs) as the training set for supervised classification by support vector machine (SVM) to obtain the desertification grade of the image. The kernel function was a radial basis function, the gamma of the kernel function was 0.333, and the classification probability threshold was the default value of 0. For each image, we selected 15 vegetated and 15 non-vegetated ROIs as the test set. We calculated the confusion matrix to derive the overall accuracy (OA) and the kappa coefficient (k) and thus verify the accuracy of the supervised classification results. After supervised classification, we counted each image's vegetation and non-vegetation pixels to obtain the FVC, which was used as the reference value. The above processing was performed in ENVI 5.3 [46,47].
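The classification itself was run in ENVI 5.3, but the per-pixel logic can be illustrated with a short scikit-learn sketch; only the RBF kernel and gamma = 0.333 come from the text above, and the function and variable names are hypothetical:

```python
# Minimal sketch: train an RBF-kernel SVM on ROI pixels and classify every pixel
# of one UAV image; labels 1 = vegetation, 0 = non-vegetation.
import numpy as np
from sklearn.svm import SVC

def classify_image(train_pixels, train_labels, image_rgb):
    """train_pixels: (n, 3) RGB samples from the ROIs; image_rgb: (H, W, 3) array."""
    svm = SVC(kernel="rbf", gamma=0.333)   # kernel and gamma reported in Section 2.3
    svm.fit(train_pixels, train_labels)
    h, w, _ = image_rgb.shape
    labels = svm.predict(image_rgb.reshape(-1, 3))
    return labels.reshape(h, w)

def fractional_vegetation_cover(class_map):
    """FVC = vegetation pixels / total pixels; used as the reference value."""
    return float((class_map == 1).sum()) / class_map.size
```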
The confusion matrix is an effective descriptive tool for accuracy assessment and is the most basic, intuitive, and computationally simplest measure of classification accuracy [48]. Therefore, we chose the confusion matrix to evaluate the classification accuracy. OA is one of the most common evaluation metrics and provides an intuitive measure of a model's performance; the higher the OA, the better the classifier. k indicates whether the model's results are consistent with the actual results: k = 1 indicates that the results are completely consistent, k ≥ 0.75 is considered satisfactory, and k < 0.4 is considered not ideal [49].
$OA = \frac{TP + TN}{TP + FN + FP + TN}$
$k = \frac{N \sum_{i=1}^{r} x_{ii} - \sum_{i=1}^{r} (x_{i+} \cdot x_{+i})}{N^{2} - \sum_{i=1}^{r} (x_{i+} \cdot x_{+i})}$
where TP denotes the number of pixels correctly classified as positive samples, TN the number of pixels correctly classified as negative samples, FP the number of negative-sample pixels incorrectly classified as positive, and FN the number of positive-sample pixels incorrectly classified as negative. In the equation for k, r is the total number of categories in the confusion matrix; N is the total number of pixels used for accuracy evaluation; $x_{ii}$ is the number of correctly classified pixels on the diagonal of the confusion matrix; and $x_{i+}$ and $x_{+i}$ are the total numbers of pixels in row i and column i of the confusion matrix, respectively.
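As a sketch, both measures follow directly from the confusion matrix; the code below simply implements the two equations above for an r × r matrix, and the function names and example counts are hypothetical:

```python
# OA and kappa from a confusion matrix (rows = reference classes, columns = predicted classes).
import numpy as np

def overall_accuracy(cm):
    cm = np.asarray(cm, dtype=float)
    return np.trace(cm) / cm.sum()            # (TP + TN) / all pixels in the 2-class case

def kappa(cm):
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    diag = np.trace(cm)                                  # sum of x_ii
    chance = (cm.sum(axis=1) * cm.sum(axis=0)).sum()     # sum of x_i+ * x_+i
    return (n * diag - chance) / (n ** 2 - chance)

# Hypothetical 2-class example: 480 TP, 5 FN, 3 FP, 512 TN
cm = [[480, 5], [3, 512]]
print(round(overall_accuracy(cm), 4), round(kappa(cm), 4))
```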
Desertification grades were classified according to the classification standard used in the 1:100,000 Distribution Atlas of Chinese Deserts published by the Environmental & Ecological Science Data Center for West China, National Natural Science Foundation of China (http://westdc.westgis.ac.cn), and China's national standard Technical Code of Practice on the Sandified Land Monitoring (GB/T 24255-2009). Based on the FVC from the supervised classification, desertification was classified as severe, high, moderate, slight, or non-desertification (Table 1) [50].
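The grade assignment itself reduces to a threshold lookup on FVC; a minimal sketch using the Table 1 thresholds (the function name is illustrative):

```python
def desertification_grade(fvc_percent):
    """Map an image's FVC (%) to a desertification grade using the Table 1 thresholds."""
    if fvc_percent < 5:
        return "Severe"
    elif fvc_percent <= 20:
        return "High"
    elif fvc_percent <= 50:
        return "Moderate"
    elif fvc_percent <= 70:
        return "Slight"
    return "Non-desertification"
```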

2.4. Vegetation Index Assessment of Desertification

To assess grassland desertification, we selected 19 vegetation indices to calculate the FVC of the UAV visible light images (Table 2). Vegetation and soil were separated by the vegetation indices, and the vegetation and non-vegetation areas in each image were segmented by the Otsu thresholding method to obtain a binary image [51]. The numbers of vegetation and non-vegetation pixels were then counted to obtain the FVC of the image. This method uses a threshold to separate the image into two parts, target and background (vegetation and non-vegetation), based on the gray-level distribution of the image. The larger the between-class variance of target and background, the larger the difference between these two parts. When part of the target is misclassified as background, or part of the background is misclassified as target, the between-class variance decreases. Maximizing the between-class variance therefore minimizes misclassification and omission in the segmentation and extracts the target information effectively [52]. The method performs well on images where the target and background have distinctly different grayscale features; it is simple and fast to compute and is not affected by image brightness and contrast [53]. Therefore, this study selected this method to segment the vegetation and non-vegetation areas in each image. The above calculations were performed using Python 3.7.
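A minimal sketch of this pipeline for one index (ExG from Table 2), assuming an 8-bit BGR image loaded with OpenCV; the rescaling of the index image before Otsu thresholding is an implementation choice, not a step specified in the paper:

```python
# Compute an index image (ExG as an example), binarize it with Otsu's threshold,
# and count vegetation pixels to obtain the index-based FVC.
import cv2
import numpy as np

def exg_fvc(image_bgr):
    img = image_bgr.astype(np.float64)
    b, g, r = img[..., 0], img[..., 1], img[..., 2]
    total = r + g + b + 1e-9                      # avoid division by zero
    rn, gn, bn = r / total, g / total, b / total  # standardized channels (Table 2 note)
    exg = 2 * gn - rn - bn                        # Excess Green index
    # Rescale to 8-bit so OpenCV's Otsu implementation can be applied
    exg8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, binary = cv2.threshold(exg8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return (binary == 255).sum() / binary.size    # vegetation (high-ExG) fraction

# fvc = exg_fvc(cv2.imread("uav_image.jpg"))      # hypothetical file name
```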

2.5. Accuracy Verification and Statistical Analysis

In this study, we used two methods to evaluate the accuracy of the vegetation indices for assessing grassland desertification. First, 80 vegetated and 80 non-vegetated ground-truth ROIs were randomly selected as references from each image, and the confusion matrix was calculated to derive the OA and k [46]. These ROIs did not overlap with the training and test sets described above and covered approximately 12% of each image.
Assessing the accuracy of vegetation indices with ground-truth ROIs takes advantage of the ease of visually identifying vegetation and non-vegetation information in high-resolution images [66]. However, although the vegetated and non-vegetated ROIs were selected evenly across each image, they still validate only part of the image, and human error cannot be avoided [63]. This study therefore also used the Relative Error (RE) as a second accuracy measure to reduce this error. The supervised classification results of the UAV visible light images were taken as the reference values of surface vegetation cover, and the RE of the FVC derived from each vegetation index was calculated; the smaller the RE, the higher the accuracy of the vegetation index [67]. The average of the accuracy measures over the 6 images of each desertification grade was used as the final result for that grade.
$RE = \frac{\left| V_{SUP} - V_{VI} \right|}{V_{SUP}}$
where $RE$ is the relative error of the FVC, $V_{SUP}$ is the FVC obtained by supervised classification, and $V_{VI}$ is the FVC obtained by the vegetation index.
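A one-line sketch of this measure (the absolute value reflects the convention above that a smaller RE means higher accuracy; the function name is illustrative):

```python
def relative_error(fvc_supervised, fvc_index):
    """RE of an index-derived FVC against the supervised-classification reference."""
    return abs(fvc_supervised - fvc_index) / fvc_supervised
```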
To assess the applicability of the vegetation indices under different grades of grassland desertification, all statistical analyses in this study were performed by one-way analysis of variance (ANOVA) using Python 3.7. Duncan’s method was used to test for significant differences between the three accuracy verification indicators of different vegetation indices in five desertification grades (p < 0.05) [60].
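The ANOVA step can be sketched with SciPy; Duncan's multiple range test is not part of SciPy, so only the omnibus test is shown here, and re_by_index is a hypothetical mapping from each of the 19 index names to its six per-image values for one grade:

```python
# One-way ANOVA of an accuracy metric (e.g., RE) across the 19 vegetation indices
# within a single desertification grade (6 images per grade, so df = 18 and 95).
from scipy.stats import f_oneway

def anova_across_indices(re_by_index):
    groups = list(re_by_index.values())   # one group of six values per vegetation index
    f_stat, p_value = f_oneway(*groups)
    return f_stat, p_value
```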
This study only distinguished vegetation and non-vegetation areas. The images taken by the UAV had a high resolution of 0.2 cm, which was able to monitor most of the vegetation features. Therefore, the validation samples were directly selected on the image without field measurements.

3. Results

3.1. UAV Visible Light Image Supervised Classification

The OA of the supervised classification for all 30 images was above 98%, and the k was above 0.96, so the supervised classification results were accurate (Table 3). Therefore, the FVC calculated after supervised classification was taken as the true value for each image. Based on the FVC of the 30 images, the five grades of grassland desertification were classified as severe, high, moderate, slight, and non-desertification.

3.2. Visible Light Vegetation Index Accuracy Assessment

3.2.1. Vegetation Index Accuracy Assessment of the Severe Desertification Grade

There were significant differences between the 19 vegetation indices under the three accuracy indicators of the severe desertification grade (Table 4). The accuracies of EGRBDI, V-MSAVI, GLI, RGBVI, CIVE, EXG, COM2, NGBDI, EXB, RGRI, GBRI, and VEG were significantly higher than those of COM, DEVI, and g in the OA (Figure 2A). The accuracies of V-MSAVI, EGRBDI, GLI, RGBVI, CIVE, EXG, RGRI, and COM2 in the k were significantly higher than those of EXR, COM, g, and DEVI (Figure 2B). The accuracies of the remaining indices in the RE were significantly higher than those of DEVI, g, and COM (Figure 2C). Therefore, V-MSAVI, EGRBDI, GLI, RGBVI, CIVE, EXG, RGRI, and COM2 were more accurate and stable for severe desertification grade assessment.

3.2.2. Vegetation Index Accuracy Assessment of the High Desertification Grade

There were significant differences between the 19 vegetation indices under the three accuracy indicators of the high desertification grade (Table 5). The accuracies of EGRBDI, V-MSAVI, CIVE, GLI, RGBVI, and EXG were significantly higher than those of DEVI, EXR, GBRI, and COM in the OA (Figure 3A). The accuracies of EGRBDI, V-MSAVI, GLI, CIVE, RGBVI, EXG, and COM2 in the k were significantly higher than those of g, DEVI, EXR, GBRI, and COM (Figure 3B). The accuracies of the remaining indices in the RE were significantly higher than those of DEVI and COM (Figure 3C). Therefore, EGRBDI, V-MSAVI, GLI, CIVE, RGBVI, and EXG were more accurate and stable for high desertification grade assessment.

3.2.3. Vegetation Index Accuracy Assessment of the Moderate Desertification Grade

There were significant differences between the 19 indices under the three accuracy indicators of the moderate desertification grade (Table 6). The accuracies of the remaining indices were significantly higher than those of DEVI, g, and GBRI in the OA (Figure 4A). The accuracies of EGRBDI, GLI, V-MSAVI, CIVE, RGBVI, and EXG in the k were significantly higher than those of DEVI, g, and GBRI (Figure 4B). The accuracies of RGBVI, GLI, EXGR, EXR, NGRDI, MGRVI, COM, V-MSAVI, EGRBDI, and VEG in the RE were significantly higher than those of the other indices (Figure 4C). Therefore, EGRBDI, GLI, V-MSAVI, CIVE, RGBVI, and EXG were more accurate and stable for moderate desertification grade assessment.

3.2.4. Vegetation Index Accuracy Assessment of the Slight Desertification Grade

There were significant differences between the 19 indices under the three accuracy indicators of the slight desertification grade (Table 7). The accuracies of EGRBDI, V-MSAVI, RGBVI, GLI, CIVE, and EXG were significantly higher than those of GBRI, COM2, COM, DEVI, and g in the OA (Figure 5A). The accuracies of EGRBDI, V-MSAVI, RGBVI, GLI, CIVE, EXG, VEG, NGBDI, and EXB in the k were significantly higher than those of GBRI, COM2, COM, DEVI, and g (Figure 5B). g was significantly better than COM2 in the RE, but its OA and k were lower (Figure 5C). Therefore, EGRBDI, V-MSAVI, RGBVI, GLI, CIVE, and EXG were more accurate and stable for slight desertification grade assessment.

3.2.5. Vegetation Index Accuracy Assessment of the Non-Desertification Grade

Significant differences between the 19 vegetation indices were observed in the k at the non-desertification grade, whereas no significant differences were observed in the OA and RE (Table 8). The accuracies of EGRBDI, RGBVI, GLI, EXG, RGRI, V-MSAVI, CIVE, NGRDI, MGRVI, EXGR, EXR, and GBRI in the k were significantly higher than those of the other indices (Figure 6B). The RE of VEG and DEVI was smaller (Figure 6C), but their OA and k were lower (Figure 6A,B), so the accuracy of VEG and DEVI was low. The FVC derived from the vegetation indices was similar to the supervised classification values, but the OA and k of all indices were low, indicating misclassification within the images. Therefore, none of the 19 vegetation indices was applicable to assessment at the non-desertification grade.

4. Discussion

Previous studies have shown that supervised classification is an effective tool for identifying vegetation information [68,69]. In this study, SVM supervised classification of the 30 UAV visible light images was accurate in assessing each grassland desertification grade (Table 3). Therefore, using the FVC obtained by SVM supervised classification as the reference value for dividing desertification grades is reliable. Ma et al. showed that SVM is accurate for planting structures of varying complexity, which is consistent with our results [70]. The reason is that SVM is based on the Vapnik–Chervonenkis dimension theory of statistical learning and the principle of structural risk minimization, providing a good balance between model complexity and learning ability [70]. In addition, SVM transforms a complex learning problem into a simplified high-dimensional linear problem, which can improve reliability and classification ability [70,71].
Our study found that V-MSAVI, EGRBDI, GLI, RGBVI, CIVE, and EXG are suitable for the assessment of severe, high, moderate, and slight desertification, whereas RGRI and COM2 are suitable only for the assessment of severe desertification. In addition, no vegetation index was suitable for assessment at the non-desertification grade. This indicates that the applicability of the vegetation indices differs among desertification grades. Our findings are similar to the results of Lima-Cueto et al., who used vegetation indices to quantify olive grove cover [72]. Their study indicated that the vegetation indices were affected by olive grove coverage and that the indices were less accurate at high coverage, mainly because the ability of the vegetation indices to distinguish between vegetation and soil is weakened in areas with very high coverage heterogeneity. Our results are also similar to those of Zhao et al., who extracted maize vegetation cover from visible light UAV images [73]. They concluded that maize cover affects the accuracy of the vegetation indices and that the accuracy of the indices decreases with increasing maize cover, mainly because vegetation leaves reflect sunlight and the shaded area grows as vegetation cover increases.
The accuracy of the vegetation indices varies across grades of grassland desertification, which may be due to the following reasons. In the visible light range, healthy green vegetation reflects strongly in the green band and absorbs strongly in the blue and red bands [74]. V-MSAVI, EGRBDI, GLI, RGBVI, and EXG use the square of the green band or twice the green band to further enhance the strong reflection of vegetation in the green band, improving the ability of these indices to identify green vegetation. Because of the spectral reflection characteristics of vegetation and soil, there is usually spectral overlap between vegetation and bare soil in the red and green bands [43]. Therefore, vegetation indices built from a single band, or from only the green-red or red-blue bands, do not separate vegetated and non-vegetated areas well [75]. CIVE and the above five vegetation indices combine the red, green, and blue bands to extract vegetation information effectively. The accuracy of RGRI and COM2 is high at the severe desertification grade, possibly because there is very little green vegetation under severe desertification and most of the area is bare soil, which weakens the influence of vegetation and soil spectral overlap on the extraction of vegetation information by these two indices. In the other indices, these factors reduce accuracy and make it difficult to identify vegetation in desert grassland areas. Therefore, the six vegetation indices above are better suited for desertification assessment than the other vegetation indices.
Our study found that none of the vegetation indices is applicable to assessment at the non-desertification grade. This is mainly because the UAV photography was conducted during the peak of the vegetation growing season, and vegetation coverage was very high in some areas. High vegetation coverage often means high spatial heterogeneity [76]. Generally, the greenness of desert grassland vegetation is lower than that of other grassland types, and some vegetation appears brown or yellowish brown. The color of non-vegetated desert grassland surfaces is closer to yellow, making vegetation and non-vegetation difficult to distinguish at the color level [74]. Sufficient remote sensing information is needed to reflect vegetation characteristics; therefore, it is difficult to distinguish vegetation from non-vegetation using vegetation indices constructed from visible light alone [77]. This study empirically demonstrated that supervised classification was more effective for grassland desertification assessment at the non-desertification grade (Figure 6). In the future, new methods, such as near-infrared and visible band combinations, color mixture analysis, image texture, and neural network machine learning, will help improve the accuracy of grassland desertification assessment [55,78].

5. Conclusions

Our study collected UAV visible light images of grassland at different grades of desertification during the plant growing season and evaluated the accuracy of 19 vegetation indices across 5 grades of grassland desertification. V-MSAVI, EGRBDI, GLI, RGBVI, CIVE, and EXG have high accuracy in assessing severe-, high-, moderate-, and slight-grade desertification, whereas RGRI and COM2 have high accuracy only in assessing severe-grade desertification. All vegetation indices have low accuracy at the non-desertification grade. This study emphasizes that the applicability of vegetation indices is controlled by the grade of desertification. Therefore, we suggest that the desertification grade should be considered when using vegetation indices to assess grassland desertification.

Author Contributions

Conceptualization, Q.Z.; methodology, X.X. and Q.Z.; software, X.X., L.L. and P.H.; validation, X.X. and Q.Z.; formal analysis, X.X. and Q.Z.; investigation, X.X., L.L., X.G. and P.H.; resource, X.X. and Q.Z.; data curation, X.X.; writing—original draft preparation, X.X.; writing—review and editing, Q.Z.; visualization, X.X., L.L. and P.H.; supervision, X.X., Q.Z. and X.G.; project administration, Q.Z.; funding acquisition, Q.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This study was supported by the Major Program of Inner Mongolia (2021ZD0008), the Cooperation project of science and technology promotion in Inner Mongolia (2022EEDSKJXM002-1), the Key Science and Technology Program of Inner Mongolia (2019ZD007), and the National Natural Science Foundation of China (32071582).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors are thankful for the financial support provided by the Major Program of Inner Mongolia, the Cooperation project of science and technology promotion in Inner Mongolia, the Key Science and Technology Program of Inner Mongolia, and the National Natural Science Foundation of China. In addition, special thanks to Feng Gang for support in writing the article and Yongzhi Yan for support in data analysis.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zhang, G.; Biradar, C.M.; Xiao, X.; Dong, J.; Zhou, Y.; Qin, Y.; Zhang, Y.; Liu, F.; Ding, M.; Thomas, R.J. Exacerbated grassland degradation and desertification in Central Asia during 2000–2014. Ecol. Appl. 2018, 28, 442–456. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Liu, S.; Wang, T.; Kang, W.; David, M. Several challenges in monitoring and assessing desertification. Environ. Earth Sci. 2015, 73, 7561–7570. [Google Scholar] [CrossRef]
  3. Qian, K.; Yuan, Q.-Z.; Han, J.-C.; Leng, R.; Wang, Y.-S.; Zhu, K.-H.; Lin, S.; Ren, P. A remote sensing monitoring method for alpine grasslands desertification in the eastern Qinghai-Tibetan Plateau. J. Mt. Sci. 2020, 17, 1423–1437. [Google Scholar] [CrossRef]
  4. Reynolds, J.F.; Stafford Smith, D.M.; Lambin, E.F.; Turner, B.L.; Mortimore, M.; Batterbury, S.P.J.; Downing, T.E.; Dowlatabadi, H.; Fernandez, R.J.; Herrick, J.E.; et al. Global desertification: Building a science for dryland development. Science 2007, 316, 847–851. [Google Scholar] [CrossRef] [Green Version]
  5. Rubio, J.L.; Bochet, E. Desertification indicators as diagnosis criteria for desertification risk assessment in Europe. J. Arid. Environ. 1998, 39, 113–120. [Google Scholar] [CrossRef]
  6. Li, J.; Xu, B.; Yang, X.; Qin, Z.; Zhao, L.; Jin, Y.; Zhao, F.; Guo, J. Historical grassland desertification changes in the Horqin Sandy Land, Northern China (1985–2013). Sci. Rep. 2017, 7, 3009. [Google Scholar] [CrossRef] [Green Version]
  7. Collado, A.D.; Chuvieco, E.; Camarasa, A. Satellite remote sensing analysis to monitor desertification processes in the crop-rangeland boundary of Argentina. J. Arid. Environ. 2002, 52, 121–133. [Google Scholar] [CrossRef]
  8. Song, X.; Wang, T.; Xue, X.; Yan, C.; Li, S. Monitoring and analysis of aeolian desertification dynamics from 1975 to 2010 in the Heihe River Basin, northwestern China. Environ. Earth Sci. 2015, 74, 3123–3133. [Google Scholar] [CrossRef]
  9. Du, Y.; Song, W.; He, Q.; Huang, D.; Liotta, A.; Su, C. Deep learning with multi-scale feature fusion in remote sensing for automatic oceanic eddy detection. Inf. Fusion 2019, 49, 89–99. [Google Scholar] [CrossRef] [Green Version]
  10. Sternberg, T.; Tsolmon, R.; Middleton, N.; Thomas, D. Tracking desertification on the Mongolian steppe through NDVI and field-survey data. Int. J. Digit. Earth 2011, 4, 50–64. [Google Scholar] [CrossRef]
  11. Liu, Q.; Zhang, Q.; Yan, Y.; Zhang, X.; Niu, J.; Svenning, J.-C. Ecological restoration is the dominant driver of the recent reversal of desertification in the Mu Us Desert (China). J. Clean. Prod. 2020, 268, 122241. [Google Scholar] [CrossRef]
  12. Zhang, X.; Liao, C.; Li, J.; Sun, Q. Fractional vegetation cover estimation in arid and semi-arid environments using HJ-1 satellite hyperspectral data. Int. J. Appl. Earth Obs. Geoinf. 2013, 21, 506–512. [Google Scholar] [CrossRef]
  13. Fensholt, R.; Langanke, T.; Rasmussen, K.; Reenberg, A.; Prince, S.D.; Tucker, C.; Scholes, R.J.; Le, Q.B.; Bondeau, A.; Eastman, R.; et al. Greenness in semi-arid areas across the globe 1981–2007—An Earth Observing Satellite based analysis of trends and drivers. Remote Sens. Environ. 2012, 121, 144–158. [Google Scholar] [CrossRef]
  14. Bari, E.; Nipa, N.J.; Roy, B. Association of vegetation indices with atmospheric & biological factors using MODIS time series products. Environ. Chall. 2021, 5, 100376. [Google Scholar] [CrossRef]
  15. Chen, A.; Yang, X.; Guo, J.; Xing, X.; Yang, D.; Xu, B. Synthesized remote sensing-based desertification index reveals ecological restoration and its driving forces in the northern sand-prevention belt of China. Ecol. Indic. 2021, 131, 108230. [Google Scholar] [CrossRef]
  16. Jiapaer, G.; Chen, X.; Bao, A.M. A comparison of methods for estimating fractional vegetation cover in arid regions. Agric. For. Meteorol. 2011, 151, 1698–1710. [Google Scholar] [CrossRef]
  17. Qi, J.; Marsett, R.C.; Moran, M.S.; Goodrich, D.C.; Heilman, P.; Kerr, Y.H.; Dedieu, G.; Chehbouni, A.; Zhang, X.X. Spatial and temporal dynamics of vegetation in the San Pedro River basin area. Agric. For. Meteorol. 2000, 105, 55–68. [Google Scholar] [CrossRef] [Green Version]
  18. Tucker, C.J.; Dregne, H.E.; Newcomb, W.W. Expansion and contraction of the sahara desert from 1980 to 1990. Science 1991, 253, 299–300. [Google Scholar] [CrossRef]
  19. Li, X.B.; Chen, Y.H.; Yang, H.; Zhang, Y.X. Improvement, comparison, and application of field measurement methods for grassland vegetation fractional coverage. J. Integr. Plant Biol. 2005, 47, 1074–1083. [Google Scholar] [CrossRef]
  20. Purevdorj, T.; Tateishi, R.; Ishiyama, T.; Honda, Y. Relationships between percent vegetation cover and vegetation indices. Int. J. Remote Sens. 1998, 19, 3519–3535. [Google Scholar] [CrossRef]
  21. Song, B.; Park, K. Detection of Aquatic Plants Using Multispectral UAV Imagery and Vegetation Index. Remote Sens. 2020, 12, 387. [Google Scholar] [CrossRef] [Green Version]
  22. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87. [Google Scholar] [CrossRef] [Green Version]
  23. Huete, A.R. A Soil-adjusted vegetation index SAVI. Remote Sens. Environ. 1988, 25, 295–310. [Google Scholar] [CrossRef]
  24. Kaufman, Y.J.; Tanre, D. Atmospherically resistant vegetation index (ARVI) for EOS-MODIS. IEEE Trans. Geosci. Remote Sens. 1992, 30, 261–270. [Google Scholar] [CrossRef]
  25. Huete, A.R.; Liu, H.Q.; Batchily, K.; Van Leeuwen, W. A comparison of vegetation indices over a global set of TM image for EOS-MODIS. Remote Sens. Environ. 1997, 59, 440–451. [Google Scholar] [CrossRef]
  26. Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213. [Google Scholar] [CrossRef]
  27. Naji, T.A.-H. Study of vegetation cover distribution using DVI, PVI, WDVI indices with 2D-space plot. J. Phys. Conf. Ser. IOP Publ. 2018, 1003, 012083. [Google Scholar] [CrossRef]
  28. Dos Santos Galvanin, E.A.; Alves da Silva Neves, S.M.; Madureira Cruz, C.B.; Neves, R.J.; Hack de Jesus, P.H.; Kreitlow, J.P. Evaluation of vegetation indexes NDVI, SR and TVI in the discrimination of vegetation types of environments of ‘pantanal’ in Caceres, Mato Grosso State. Cienc. Florest. 2014, 24, 707–715. [Google Scholar]
  29. Pickell, P.D.; Hermosilla, T.; Frazier, R.J.; Coops, N.C.; Wulder, M.A. Forest recovery trends derived from Landsat time series for North American boreal forests. Int. J. Remote Sens. 2016, 37, 138–149. [Google Scholar] [CrossRef]
  30. Wu, W. The Generalized Difference Vegetation Index (GDVI) for Dryland Characterization. Remote Sens. 2014, 6, 1211–1233. [Google Scholar] [CrossRef] [Green Version]
  31. Li, X.; Zhang, Y.; Luo, J.; Jin, X.; Xu, Y.; Yang, W. Quantification winter wheat LAI with HJ-1CCD image features over multiple growing seasons. Int. J. Appl. Earth Obs. Geoinf. 2016, 44, 104–112. [Google Scholar] [CrossRef]
  32. Becerril-Pina, R.; Diaz-Delgado, C.; Mastachi-Loza, C.A.; Gonzalez-Sosa, E. Integration of remote sensing techniques for monitoring desertification in Mexico. Hum. Ecol. Risk Assess. 2016, 22, 1323–1340. [Google Scholar] [CrossRef]
  33. Marcial-Pablo, M.D.; Gonzalez-Sanchez, A.; Jimenez-Jimenez, S.I.; Ontiveros-Capurata, R.E.; Ojeda-Bustamante, W. Estimation of vegetation fraction using RGB and multispectral images from UAV. Int. J. Remote Sens. 2019, 40, 420–438. [Google Scholar] [CrossRef]
  34. Higginbottom, T.P.; Symeonakis, E. Assessing Land Degradation and Desertification Using Vegetation Index Data: Current Frameworks and Future Directions. Remote Sens. 2014, 6, 9552–9575. [Google Scholar] [CrossRef] [Green Version]
  35. Anderson, K.; Gaston, K.J. Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Front. Ecol. Environ. 2013, 11, 138–146. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  36. Miranda, V.; Pina, P.; Heleno, S.; Vieira, G.; Mora, C.; Schaefer, C.E.G.R. Monitoring recent changes of vegetation in Fildes Peninsula (King George Island, Antarctica) through satellite imagery guided by UAV surveys. Sci. Total Environ. 2020, 704, 135295. [Google Scholar] [CrossRef]
  37. Feng, Q.L.; Liu, J.T.; Gong, J.H. UAV Remote Sensing for Urban Vegetation Mapping Using Random Forest and Texture Analysis. Remote Sens. 2015, 7, 1074–1094. [Google Scholar] [CrossRef] [Green Version]
  38. Getzin, S.; Wiegand, K.; Schoning, I. Assessing biodiversity in forests using very high-resolution images and unmanned aerial vehicles. Methods Ecol. Evol. 2012, 3, 397–404. [Google Scholar] [CrossRef]
  39. Tu, M.; Lu, H.; Shang, M. Monitoring Grassland Desertification in Zoige County Using Landsat and UAV Image. Pol. J. Environ. Stud. 2021, 30, 5789–5799. [Google Scholar] [CrossRef]
  40. Zhang, L.; Hong, G.Y.; Li, Z.F.; Gao, X.W.; Wu, Y.Z.; Wang, X.J.; Wang, P.P.; Yang, J. Assessment of the Ecosystem Service Function of Sandy Lands at Different Times Following Aerial Seeding of an Endemic Species. Sustainability 2018, 10, 14. [Google Scholar] [CrossRef] [Green Version]
  41. Liu, L.; Xu, X.; Wu, J.; Jarvie, S.; Li, F.; Han, P.; Zhang, Q. Comprehensive evaluation and scenario simulation of carrying capacity of water resources in Mu Us Sandy Land, China. Water Supply 2022, 22, 7256–7271. [Google Scholar] [CrossRef]
  42. Guo, Z.C.; Wang, T.; Liu, S.L.; Kang, W.P.; Chen, X.; Feng, K.; Zhang, X.Q.; Zhi, Y. Biomass and vegetation coverage survey in the Mu Us sandy land—Based on unmanned aerial vehicle RGB images. Int. J. Appl. Earth Obs. Geoinf. 2021, 94, 13. [Google Scholar] [CrossRef]
  43. Gao, Y.; Lin, Y.; Wen, X.; Jian, W.; Gong, Y. Vegetation information recognition in visible band based on UAV images. Nongye Gongcheng Xuebao/Trans. Chin. Soc. Agric. Eng. 2020, 36, 178–189. [Google Scholar] [CrossRef]
  44. Wu, J.; Yang, G.; Yang, X.; Xu, B.; Han, L.; Zhu, Y. Automatic Counting of in situ Rice Seedlings from UAV Images Based on a Deep Fully Convolutional Neural Network. Remote Sens. 2019, 11, 691. [Google Scholar] [CrossRef] [Green Version]
  45. Fahmi, F.; Triandal, D.; Andayani, U.; Siregar, B. Image processing analysis of geospatial uav orthophotos for palm oil plantation monitoring. In Proceedings of the 2nd International Conference on Computing and Applied Informatics. J. Phys. Conf. Ser. IOP Publ. 2018, 11, 28–30. [Google Scholar]
  46. Hartling, S.; Sagan, V.; Maimaitijiang, M. Urban tree species classification using UAV-based multi-sensor data fusion and machine learning. GISci. Remote Sens. 2021, 58, 1250–1275. [Google Scholar] [CrossRef]
  47. Cuneo, P.; Jacobson, C.R.; Leishman, M.R. Landscape-scale detection and mapping of invasive African Olive (Olea europaea L. ssp cuspidata Wall ex G. Don Ciferri) in SW Sydney, Australia using satellite remote sensing. Appl. Veg. Sci. 2009, 12, 145–154. [Google Scholar] [CrossRef]
  48. Stehman, S.V. Selecting and interpreting measures of thematic classification accuracy. Remote Sens. Environ. 1997, 62, 77–89. [Google Scholar] [CrossRef]
  49. Wang, Y.; Li, S.; Teng, F.; Lin, Y.; Wang, M.; Cai, H. Improved Mask R-CNN for Rural Building Roof Type Recognition from UAV High-Resolution Images: A Case Study in Hunan Province, China. Remote Sens. 2022, 14, 265. [Google Scholar] [CrossRef]
  50. Jianhua, W.; Yimou, W.; Changzhen, Y.; Yuan, Q. 1:100,000 Desert (Sand) Distribution Dataset in China; National Tibetan Plateau Data Center: Lhasa, China, 2013. [Google Scholar] [CrossRef]
  51. Otsu, N. A threshold selection method from gray level histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef] [Green Version]
  52. Vala, H.J. A Review on Otsu Image Segmentation Algorithm Miss. Int. J. Adv. Res. Comput. Eng. Technol. 2013, 2, 387–389. [Google Scholar]
  53. Xue, J.-H.; Titterington, D.M. t-Tests, F-Tests and Otsu’s Methods for Image Thresholding. Ieee Trans. Image Process. 2011, 20, 2392–2396. [Google Scholar] [CrossRef] [PubMed]
  54. Louhaichi, M.; Borman, M.M.; Johnson, D.E. Spatially Located Platform and Aerial Photography for Documentation of Grazing Impacts on Wheat. Geocarto Int. 2001, 16, 65–70. [Google Scholar] [CrossRef]
  55. Woebbecke, D.M.; Meyer, G.E.; Bargen, K.V.; Mortensen, D.A. Color indices for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 1994, 38, 259–269. [Google Scholar] [CrossRef]
  56. Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293. [Google Scholar] [CrossRef]
  57. Guijarro, M.; Pajares, G.; Riomoros, I.; Herrera, P.J.; Burgos-Artizzu, X.P.; Ribeiro, A. Automatic segmentation of relevant textures in agricultural images. Comput. Electron. Agric. 2011, 75, 75–83. [Google Scholar] [CrossRef] [Green Version]
  58. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  59. Sellaro, R.; Crepy, M.; Trupkin, S.A.; Karayekov, E.; Buchovsky, A.S.; Rossi, C.; Casal, J.J. Cryptochrome as a Sensor of the Blue/Green Ratio of Natural Radiation in Arabidopsis. Plant Physiol. 2010, 154, 401–409. [Google Scholar] [CrossRef] [Green Version]
  60. Verrelst, J.; Schaepman, M.E.; Koetz, B.; Kneubuhler, M. Angular sensitivity analysis of vegetation indices derived from CHRIS/PROBA data. Remote Sens. Environ. 2008, 112, 2341–2353. [Google Scholar] [CrossRef]
  61. Kataoka, T.; Kaneko, T.; Okamoto, H.; Hata, S. Crop growth estimation system using machine vision. In Proceedings of the IEEE/ASME International Conference on Advanced Intelligent Mechatronics, Kobe, Japan, 20–24 July 2003; pp. 1079–1083. [Google Scholar]
  62. Hague, T.; Tillett, N.D.; Wheeler, H. Automated crop and weed monitoring in widely spaced cereals. Precis. Agric. 2006, 7, 21–32. [Google Scholar] [CrossRef]
  63. Zhou, T.; Hu, Z.; Han, J.; Zhang, H. Green vegetation extraction based on visible light image of UAV. China Environ. Sci. 2021, 41, 2380–2390. [Google Scholar]
  64. Zaiming, Z.; Yanming, Y.; Benqing, C. Study on the extraction of exotic species spartina alterniflora from UAV visible images. J. Subtrop. Resour. Environ. 2017, 12, 90–95. [Google Scholar]
  65. Guerrero, J.M.; Pajares, G.; Montalvo, M.; Romeo, J.; Guijarro, M. Support Vector Machines for crop/weeds identification in maize fields. Expert Syst. Appl. 2012, 39, 11149–11155. [Google Scholar] [CrossRef]
  66. Daryaei, A.; Sohrabi, H.; Atzberger, C.; Immitzer, M. Fine-scale detection of vegetation in semi-arid mountainous areas with focus on riparian landscapes using Sentinel-2 and UAV data. Comput. Electron. Agric. 2020, 177, 13. [Google Scholar] [CrossRef]
  67. Zhao, Y.H.; Chen, N.H.; Chen, J.Y.; Hu, C.Q. Automatic extraction of yardangs using Landsat 8 and UAV images: A case study in the Qaidam Basin, China. Aeolian Res. 2018, 33, 53–61. [Google Scholar] [CrossRef]
  68. Munyati, C. Wetland change detection on the Kafue Flats, Zambia, by classification of a multitemporal remote sensing image dataset. Int. J. Remote Sens. 2000, 21, 1787–1806. [Google Scholar] [CrossRef]
  69. Sankey, T.T.; McVay, J.; Swetnam, T.L.; McClaran, M.P.; Heilman, P.; Nichols, M. UAV hyperspectral and lidar data and their fusion for arid and semi-arid land vegetation monitoring. Remote Sens. Ecol. Conserv. 2018, 4, 20–33. [Google Scholar] [CrossRef]
  70. Ma, Q.; Han, W.T.; Huang, S.J.; Dong, S.D.; Li, G.; Chen, H.P. Distinguishing Planting Structures of Different Complexity from UAV Multispectral Images. Sensors 2021, 21, 22. [Google Scholar] [CrossRef]
  71. Wu, J.; Liu, Y.; Wang, J.; He, T. Application of Hyperion data to land degradation mapping in the Hengshan region of China. Int. J. Remote Sens. 2010, 31, 5145–5161. [Google Scholar] [CrossRef]
  72. Lima-Cueto, F.J.; Blanco-Sepulveda, R.; Gomez-Moreno, M.L.; Galacho-Jimenez, F.B. Using Vegetation Indices and a UAV Imaging Platform to Quantify the Density of Vegetation Ground Cover in Olive Groves (Olea Europaea L.) in Southern Spain. Remote Sens. 2019, 11, 2564. [Google Scholar] [CrossRef] [Green Version]
  73. Jing, Z.; Huanbo, Y.; Yubin, L.; Liqun, L.; Peng, J.; Zhiming, L. Extraction Method of Summer Corn Vegetation Coverage Based on Visible Light Image of Unmanned Aerial Vehicle. J. Trans. Chin. Soc. Agric. Mach. 2019, 50, 232–240. [Google Scholar]
  74. Zhang, C.-M.; Zhang, J.-M. Research on the Spectral Characteristics of Grassland in Arid Regions Based on Hyperspectral Image. Spectrosc. Spectr. Anal. 2012, 32, 445–448. [Google Scholar] [CrossRef]
  75. Xue, J.; Su, B. Significant Remote Sensing Vegetation Indices: A Review of Developments and Applications. J. Sens. 2017, 2017, 1353691. [Google Scholar] [CrossRef] [Green Version]
  76. Zhang, Q.; Niu, J.; Buyantuyev, A.; Zhang, J.; Ding, Y.; Dong, J. Productivity-species richness relationship changes from unimodal to positive linear with increasing spatial scale in the Inner Mongolia steppe. Ecol. Res. 2011, 26, 649–658. [Google Scholar] [CrossRef]
  77. Yan, G.J.; Li, L.Y.; Coy, A.; Mu, X.H.; Chen, S.B.; Xie, D.H.; Zhang, W.M.; Shen, Q.F.; Zhou, H.M. Improving the estimation of fractional vegetation cover from UAV RGB imagery by colour unmixing. ISPRS-J. Photogramm. Remote Sens. 2019, 158, 23–34. [Google Scholar] [CrossRef]
  78. Pi, W.; Du, J.; Liu, H.; Zhu, X. Desertification Glassland Classification and Three-Dimensional Convolution Neural Network Model for Identifying Desert Grassland Landforms with Unmanned Aerial Vehicle Hyperspectral Remote Sensing Images. J. Appl. Spectrosc. 2020, 87, 309–318. [Google Scholar] [CrossRef]
Figure 1. Location of the study area.
Figure 2. OA, k, and RE differences of the 19 vegetation indices at the severe desertification grade. (A) One-way ANOVA results for OA; (B) one-way ANOVA results for k; (C) one-way ANOVA results for RE. Values with the same lowercase letters within vegetation indices are not significantly different at p < 0.05.
Figure 3. OA, k, and RE differences of the 19 vegetation indices at the high desertification grade. (A) One-way ANOVA results for OA; (B) one-way ANOVA results for k; (C) one-way ANOVA results for RE. Values with the same lowercase letters within vegetation indices are not significantly different at p < 0.05.
Figure 4. OA, k, and RE differences of the 19 vegetation indices at the moderate desertification grade. (A) One-way ANOVA results for OA; (B) one-way ANOVA results for k; (C) one-way ANOVA results for RE. Values with the same lowercase letters within vegetation indices are not significantly different at p < 0.05.
Figure 5. OA, k, and RE differences of the 19 vegetation indices at the slight desertification grade. (A) One-way ANOVA results for OA; (B) one-way ANOVA results for k; (C) one-way ANOVA results for RE. Values with the same lowercase letters within vegetation indices are not significantly different at p < 0.05.
Figure 6. OA, k, and RE differences of the 19 vegetation indices at the non-desertification grade. (A) One-way ANOVA results for OA; (B) one-way ANOVA results for k; (C) one-way ANOVA results for RE. Values with the same lowercase letters within vegetation indices are not significantly different at p < 0.05.
Table 1. Desertification grades.

Desertification Grade | FVC | Desertification Area in the Image | UAV Visible Light Image
Severe | <5% | ≥95 m² | (sample image)
High | 5–20% | 80–94 m² | (sample image)
Moderate | 21–50% | 50–79 m² | (sample image)
Slight | 51–70% | 30–49 m² | (sample image)
Non-desertification | >70% | <30 m² | (sample image)
Note: The sample images are all from the 30 images in this study.
Table 2. Vegetation indices.

Vegetation Index | Full Name | Equation
GLI [54] | Green Leaf Index | (2 × G − R − B)/(2 × G + R + B)
ExG [55] | Excess Green | 2g − r − b
ExR [56] | Excess Red | 1.4r − g
ExB [57] | Excess Blue | 1.4b − g
NGBDI [56] | Normalized Green Blue Difference Index | (G − B)/(G + B)
NGRDI [56] | Normalized Green Red Difference Index | (G − R)/(G + R)
ExGR [56] | Excess Green Minus Excess Red | ExG − ExR
MGRVI [58] | Modified Green Red Vegetation Index | (G² − R²)/(G² + R²)
RGBVI [58] | Red Green Blue Vegetation Index | (G² − B × R)/(G² + B × R)
GBRI [59] | Green Blue Ratio Index | b/g
RGRI [60] | Red Green Ratio Index | r/g
CIVE [61] | Color Index of Vegetation | 0.441r − 0.881g + 0.385b + 18.78745
VEG [62] | Vegetative | g/(r^α × b^(1−α))
DEVI [63] | Difference Excess Vegetation Index | G/(3G) + R/(3G) + B/(3G)
EGRBDI [43] | Excess Green Red Blue Difference Index | ((2G)² − B × R)/((2G)² + B × R)
V-MSAVI [64] | Visible Band Modified Soil Adjusted Vegetation Index | (2 × G + 1 − √((2 × G + 1)² − 8 × (2 × G − R − B)))/2
g [55] | Green Chromatic Coordinate | G/(R + G + B)
COM [57] | Combined | 0.25ExG + 0.3ExGR + 0.33CIVE + 0.12VEG
COM2 [65] | Combined 2 | 0.36ExG + 0.47CIVE + 0.17VEG
Note: R: red channel; G: green channel; B: blue channel; r, g, b: standardized red, green, and blue channels, where r = R/(R + G + B), g = G/(R + G + B), b = B/(R + G + B); α = 0.667.
Table 3. Supervised classification results of 30 images.

Image Number | Latitude | Longitude | Altitude (m) | FVC (%) | Desertification Grade | OA (%) | k
1 | 39°20′3871″ N | 109°04′4434″ E | 1269.7 | 4.3243 | Severe | 99.1168 | 0.9821
2 | 38°30′3617″ N | 108°04′0653″ E | 1351.9 | 1.5050 | Severe | 99.011 | 0.9783
3 | 39°20′3910″ N | 109°04′4392″ E | 1352.1 | 4.0864 | Severe | 99.6525 | 0.9922
4 | 38°50′5350″ N | 108°44′2955″ E | 1356.0 | 3.3052 | Severe | 99.9252 | 0.9966
5 | 38°50′5348″ N | 108°44′2929″ E | 1356.0 | 3.3717 | Severe | 98.7145 | 0.9642
6 | 38°30′3629″ N | 108°46′0643″ E | 1269.9 | 2.8890 | Severe | 99.7404 | 0.9939
7 | 39°15′3308″ N | 109°00′1219″ E | 1267.6 | 13.2679 | High | 98.9788 | 0.9671
8 | 39°15′3364″ N | 109°00′1315″ E | 1267.5 | 18.3795 | High | 99.5751 | 0.9725
9 | 38°09′4113″ N | 108°38′1296″ E | 1247.7 | 6.7185 | High | 99.5239 | 0.9879
10 | 38°25′4054″ N | 108°42′2405″ E | 1293.8 | 16.2495 | High | 99.7815 | 0.9956
11 | 38°25′4082″ N | 108°42′2359″ E | 1293.9 | 15.9524 | High | 99.54 | 0.9902
12 | 38°09′4144″ N | 108°38′1247″ E | 1247.5 | 9.4557 | High | 98.8213 | 0.9764
13 | 38°38′4889″ N | 108°56′4260″ E | 1270.6 | 32.6411 | Moderate | 99.8111 | 0.9676
14 | 39°07′1405″ N | 108°53′2209″ E | 1295.8 | 43.2160 | Moderate | 99.9583 | 0.99
15 | 39°07′1527″ N | 108°53′2101″ E | 1296.9 | 31.7365 | Moderate | 99.5985 | 0.9747
16 | 38°33′1085″ N | 108°43′5874″ E | 1314.8 | 45.6735 | Moderate | 99.833 | 0.982
17 | 38°33′1040″ N | 108°44′0091″ E | 1314.5 | 31.8236 | Moderate | 99.6005 | 0.9892
18 | 38°38′4745″ N | 108°56′4154″ E | 1270.3 | 26.3766 | Moderate | 99.5415 | 0.9894
19 | 38°52′5931″ N | 108°44′2805″ E | 1344.1 | 53.0043 | Slight | 99.5556 | 0.9855
20 | 38°52′5931″ N | 108°44′2816″ E | 1344.1 | 55.3886 | Slight | 99.6876 | 0.9937
21 | 38°57′1514″ N | 109°25′2166″ E | 1266.0 | 51.5241 | Slight | 99.931 | 0.9985
22 | 38°57′1522″ N | 109°25′2124″ E | 1266.1 | 51.5331 | Slight | 99.8963 | 0.9928
23 | 38°40′7309″ N | 108°37′4289″ E | 1287.4 | 54.7885 | Slight | 99.6876 | 0.9937
24 | 38°40′4888″ N | 108°37′4914″ E | 1287.3 | 51.5424 | Slight | 99.9765 | 0.9946
25 | 38°46′4981″ N | 108°31′0358″ E | 1284.9 | 86.2322 | Non-desertification | 99.874 | 0.9606
26 | 38°46′5164″ N | 108°31′3291″ E | 1284.8 | 96.5132 | Non-desertification | 99.8873 | 0.9619
27 | 38°56′4931″ N | 109°17′3098″ E | 1261.4 | 83.4444 | Non-desertification | 99.4857 | 0.9808
28 | 38°56′5034″ N | 109°17′2680″ E | 1261.3 | 89.2501 | Non-desertification | 99.9279 | 0.9674
29 | 38°11′5005″ N | 108°52′2792″ E | 1266.9 | 71.6581 | Non-desertification | 99.5638 | 0.9833
30 | 38°11′5022″ N | 108°52′2811″ E | 1266.8 | 76.3327 | Non-desertification | 99.8111 | 0.9676
Note: OA is the overall accuracy. k is the kappa coefficient.
Table 4. OA, k, and RE differences of 19 vegetation indices on the severe desertification grade.

Metric | Source | SS | df | MS | F | p
OA (%) | Between Groups | 14,840.249 | 18 | 824.458 | 5.562 | 0.000
OA (%) | Within Groups | 14,083.024 | 95 | 148.242
OA (%) | Total | 28,923.272 | 113
k | Between Groups | 5.895 | 18 | 0.328 | 5.430 | 0.000
k | Within Groups | 5.730 | 95 | 0.060
k | Total | 11.625 | 113
RE | Between Groups | 1459.716 | 18 | 81.095 | 4.710 | 0.000
RE | Within Groups | 1635.607 | 95 | 17.217
RE | Total | 3095.324 | 113
Table 5. OA, k, and RE differences of 19 vegetation indices on the high desertification grade.

Metric | Source | SS | df | MS | F | p
OA (%) | Between Groups | 3454.562 | 18 | 191.920 | 3.550 | 0.000
OA (%) | Within Groups | 5136.480 | 95 | 54.068
OA (%) | Total | 8591.042 | 113
k | Between Groups | 3.914 | 18 | 0.217 | 7.829 | 0.000
k | Within Groups | 2.639 | 95 | 0.028
k | Total | 6.553 | 113
RE | Between Groups | 23.606 | 18 | 1.311 | 2.584 | 0.002
RE | Within Groups | 48.222 | 95 | 0.508
RE | Total | 71.828 | 113
Table 6. OA, k, and RE differences of 19 vegetation indices on the moderate desertification grade.

Metric | Source | SS | df | MS | F | p
OA (%) | Between Groups | 6957.370 | 18 | 386.521 | 8.413 | 0.000
OA (%) | Within Groups | 4364.848 | 95 | 45.946
OA (%) | Total | 11,322.218 | 113
k | Between Groups | 4.774 | 18 | 0.265 | 16.451 | 0.000
k | Within Groups | 1.531 | 95 | 0.016
k | Total | 6.305 | 113
RE | Between Groups | 2.998 | 18 | 0.167 | 4.671 | 0.000
RE | Within Groups | 3.387 | 95 | 0.036
RE | Total | 6.385 | 113
Table 7. OA, k, and RE differences of 19 vegetation indices on the slight desertification grade.

Metric | Source | SS | df | MS | F | p
OA (%) | Between Groups | 11,608.489 | 18 | 644.916 | 23.374 | 0.000
OA (%) | Within Groups | 2621.216 | 95 | 27.592
OA (%) | Total | 14,229.705 | 113
k | Between Groups | 4.392 | 18 | 0.244 | 26.617 | 0.000
k | Within Groups | 0.871 | 95 | 0.009
k | Total | 5.262 | 113
RE | Between Groups | 1.785 | 18 | 0.099 | 13.981 | 0.000
RE | Within Groups | 0.674 | 95 | 0.007
RE | Total | 2.459 | 113
Table 8. OA, k, and RE differences of 19 vegetation indices on the non-desertification grade.

Metric | Source | SS | df | MS | F | p
OA (%) | Between Groups | 18,799.030 | 18 | 1044.391 | 1.009 | 0.457
OA (%) | Within Groups | 98,285.865 | 95 | 1034.588
OA (%) | Total | 117,084.895 | 113
k | Between Groups | 5.465 | 18 | 0.304 | 1.750 | 0.044
k | Within Groups | 16.485 | 95 | 0.174
k | Total | 21.950 | 113
RE | Between Groups | 2.203 | 18 | 0.122 | 1.692 | 0.054
RE | Within Groups | 6.871 | 95 | 0.072
RE | Total | 9.074 | 113
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
