Article

Land Use/Cover Classification of Large Conservation Areas Using a Ground-Linked High-Resolution Unmanned Aerial Vehicle

by Lazaro J. Mangewa 1,2,*,†, Patrick A. Ndakidemi 1,†, Richard D. Alward 1,3,†, Hamza K. Kija 4, Emmanuel R. Nasolwa 1 and Linus K. Munishi 1,†

1 School of Life Sciences and Bio-Engineering (LISBE), Nelson Mandela African Institution of Science and Technology (NM-AIST), Arusha P.O. Box 447, Tanzania
2 College of Forestry, Wildlife, and Tourism (CFWT), Sokoine University of Agriculture (SUA), Morogoro P.O. Box 3009, Tanzania
3 Aridlands, LLC, Grand Junction, CO 81507, USA
4 Conservation Information Monitoring Section (CIMS), Tanzania Wildlife Research Institute (TAWIRI), Arusha P.O. Box 661, Tanzania
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Resources 2024, 13(8), 113; https://doi.org/10.3390/resources13080113
Submission received: 2 May 2024 / Revised: 31 July 2024 / Accepted: 14 August 2024 / Published: 22 August 2024

Abstract

High-resolution remote sensing platforms are crucial for mapping land use/cover (LULC) types. Unmanned aerial vehicle (UAV) technology has been widely used in the northern hemisphere, addressing the challenges facing low- to medium-resolution satellite platforms. This study establishes the scalability of Sentinel-2 LULC classification guided by ground-linked UAV orthoimages to large African ecosystems, particularly the Burunge Wildlife Management Area in Tanzania. It involved UAV flights in 19 ground-surveyed plots, followed by upscaling the orthoimages to a 10 m × 10 m resolution to guide the Sentinel-2 LULC classification. The results were compared with an unguided Sentinel-2 classification using the best of three classifiers: random forest (RFC), support vector machine (SVM), and maximum likelihood classification (MLC). The guided classification approach, with an overall accuracy (OA) of 94% and a kappa coefficient (k) of 0.92, outperformed the unguided classification approach (OA = 90%; k = 0.87). It registered grasslands (55.2%) as the major vegetated class, followed by woodlands (7.6%) and shrublands (4.7%). The unguided approach registered grasslands (43.3%), followed by shrublands (27.4%) and woodlands (1.7%). Powerful ground-linked UAV-based training samples and the RFC improved the performance. The area size, heterogeneity, pre-UAV flight ground data, and UAV-based detection of woody plant encroachment contribute to the study’s novelty. The findings are useful in conservation planning and rangeland management and are therefore recommended for similar conservation areas.

1. Introduction

The sustainability of wildlife conservation worldwide is threatened by the loss, degradation, and fragmentation of wildlife habitats, as exemplified in Africa [1,2,3,4,5,6,7,8]. These threats have been linked to anthropogenic pressure [9,10,11]. In Tanzania, these threats have been reported in many places around core protected areas, leading to blockages and the loss of corridors, buffer zones, and dispersal areas [8,12,13,14,15,16,17,18]. Tanzania is an African country where reliable high-resolution land use/cover (LULC) classification maps are needed to portray the status and trends of habitats in protected areas and entire landscapes/ecosystems. This information would adequately inform ecologists and wildlife managers to appropriately address the habitat-related challenges in protected areas, including the Burunge Community Wildlife Management Area (BWMA). Burunge is one of the 21 operational wildlife management areas (WMAs) in Tanzania, established on village lands, managed by communities, and overseen by the Wildlife Division in the Ministry of Natural Resources and Tourism (MNRT) to safeguard wildlife habitats, including migratory corridors, buffers, and dispersal areas. Other practical uses of LULC classification maps include detecting and monitoring wildlife habitats [19,20,21].
Depending on the purpose, timely, highly accurate, high-resolution LULC classification maps are essential for effectively managing and monitoring wildlife habitats [22,23,24,25,26,27,28,29]. Nevertheless, many LULC maps have been produced using space-borne remote sensing platforms such as Landsat [30,31,32] and Sentinel-2 [26,29,32], which have lower spatial resolutions than UAV platforms [33,34,35,36]. For example, the spatial resolution of MODIS is 250 m, 500 m, and 1 km, that of Landsat is 30 m, and that of Sentinel-2 is 10 m, 20 m, and 60 m. In contrast, UAV-based imagery can yield resolutions down to a few millimeters, depending on the sensor and flight characteristics. Low spatial resolutions reduce the accuracy of predicting LULC classes [37]. Extracting appropriate habitat-related information in complex and heterogeneous conservation settings is much more challenging using LULC maps produced from low-resolution imagery [35,38].
High-quality, high-spatial-resolution LULC classification maps can be produced from UAV-based remote sensing products, as opposed to satellite products with a >5 m resolution. Yet, the use of UAV platforms is still limited in African countries [39], including Tanzania. UAV-derived images have been used more extensively in the northern than the southern hemisphere (Table 1 and Table 2). The existing studies covered small- to medium-sized and less complex areas [40], areas with only a few land cover types [41,42], experimental study sites [43,44], and agricultural and grazing areas [45]. For instance, a land cover classification study conducted in Serbia used UAV-derived orthoimages but focused on only four small sites comprising a total of 12 ha [23]. Another study, conducted in Bangladesh, successfully used multispectral UAV images to map agricultural land use and land cover [46]. A study by Daryaei et al. [47] in Iran focused on riparian landscapes, whereas an experimental study by Duke et al. [48] in Ghana focused on crop type classification. Many studies did not start with ground surveys to ensure that the UAV flight missions were planned and executed in the true LULC classes to generate highly reliable UAV-derived orthoimages. A key question remained: to what extent do high-resolution UAV-based RGB orthoimages, aided by pre-UAV flight ground data in sampled LULC types, reliably guide Sentinel-2 imagery to detect, classify, and assess the LULC classes in large and heterogeneous conservation areas in comparison with the unguided Sentinel-2 LULC classification model? This question formed the basis of this study, which scales up the ground-linked UAV-based LULC classification approach to large and highly heterogeneous conservation areas in the southern hemisphere, taking a case study from the Burunge Community Wildlife Management Area (BWMA) in Northern Tanzania.
In the ground-linked UAV-guided Sentinel-2 LULC classification approach, ground surveys are conducted in sampled land cover types, followed by UAV flight mission planning and execution; the resulting UAV RGB images are processed into orthoimages that guide the Sentinel-2 LULC classification model. In contrast, the unguided Sentinel-2 LULC classification approach refers to classification based solely on the widely used Sentinel-2 satellite imagery, without UAV-derived training samples. We therefore hypothesized that upscaling the ground-linked UAV-derived RGB orthoimages to the Sentinel-2 resolution and obtaining powerful training samples depicting the remote sensing signature of a true LULC class would improve the Sentinel-2-based LULC classification process, producing almost-perfect products. We used the random forest classifier (RFC), support vector machine (SVM), and maximum likelihood classification (MLC) algorithms, which are commonly used in the literature [49], to obtain the best LULC classification product based on classification accuracies and quality, as well as a clear delineation of different classes, providing practically useful information for wildlife habitat management. All of these machine learning algorithms (MLAs) have been reported to be effective and efficient in LULC classification [50]. The RFC algorithm has been reported to be effective in handling high-dimensional data [51]. Despite its effectiveness, the SVM requires many parameters to be adjusted, which imposes some barriers to automation [50]. The MLC is highly practical and can delineate different cover classes [52].
Hence, this study is anchored on the rationale that attaining sustainable wildlife conservation objectives requires adequate and timely information, mostly extracted from high-quality LULC classification maps. Deficiencies in such data may lead to wildlife loss due to delayed actions to address the drivers of habitat change [12,29]. High-resolution LULC classification maps are useful in conservation planning, rangeland management, poverty reduction, improving tourism development in protected areas, and informing policy reviews for sustainable biodiversity conservation. In turn, these enhance biodiversity financing, partly by avoiding future conservation costs likely to be attributed to inadequate relevant information for managing protected areas and ecosystems/landscapes. The information would also help with effective and efficient delivery in wildlife management practice. Ultimately, it contributes to achieving the targets of the United Nations Sustainable Development Goals (SDGs), such as SDG 15, and the Kunming-Montreal Global Biodiversity Framework. SDG 15 focuses on protecting, restoring, and promoting the sustainable use of terrestrial ecosystems. The Goal also urges the sustainable management of forests, combating desertification, halting and reversing land degradation, and halting biodiversity loss. The contributions of scientific and environmental information derived from UAV-based remote sensing imagery have been underscored [53,54].
Table 1. Examples of similar studies conducted in the northern hemisphere.

UAV Type | Satellite System | Research and Major Findings | Location, Size, and Habitats | Scope | Reference
mX-SIGHT, Germany | - | Analyzed and revealed the potential use of UAV-derived imagery to measure areas of land plots for monitoring land policies | Spain: experimental sites (0.3–29 ha of crops) | Limited to land plots | [35]
DJI Phantom 2 with a spatial resolution of 2.8 cm | Pleiades-1B | Compared and established the higher capability of the UAV over the satellite in mapping mangroves in terms of image quality: accuracies, area coverage, and costs (time and user). Reported that better spectral resolution gives Pleiades-1B an advantage over UAV-derived RGB orthoimages for assessing health and biomass. | Setiu wetland in Malaysia: mangroves (4.18 km2) | Focused on small areas of mangroves | [55]
Bormatec MAJA (Bormatec, Mooswiesen, Ravensburg, Germany) | Satellite tracking tool | Assessed and demonstrated the usefulness of combining UAV imagery and satellite tracking of individual animals (e.g., proboscis monkeys) for detecting key conservation issues such as deforestation and influencing policy reviews | Sabah, Malaysian Borneo: riparian habitats (273.51 ha) | Riparian habitats of a proboscis monkey | [56]
Octocopter (OktoXL, HiSystems GmbH) | Sentinel-2 | Developed a methodological framework for estimating the fractional coverage (FC%) of an invasive shrub species, Ulex europaeus (common gorse) | Chiloé Island (south-central Chile): ten flown sites, each 50 ha | Selected areas invaded with shrubs | [57]
Parrot Bluegrass quadcopter and DJI Phantom 4 Pro | Sentinel-2 | Assessed and quantitatively demonstrated the improvements of a multispectral UAV mapping technique for higher-resolution images used for advanced mapping and assessing coastal land cover. It also compared UAV and satellite capabilities in the same area. | Indian River Lagoon along the central Atlantic coast of Florida, USA | Coastal habitats | [58]
Fixed-wing senseFly eBee with S.O.D.A. and Parrot Sequoia cameras | - | Evaluated the potential of UAVs for the collection of ultra-high-spatial-resolution imagery for mapping tree-line ecotone land covers, showing a higher efficiency | Norway: 32 tree-line ecotone sites | Alpine tree-line ecotone | [59]
Phantom 4 Pro with a MicaSense RedEdge-M multispectral camera system | WorldView-4 | Utilized high-spatial-resolution drone and WorldView-4 satellite data to map and monitor grazing land cover change and pasture quality pre- and post-flooding. The two platforms were found to be useful in detecting grazing land cover change at a finer scale. | Cheatham County, middle Tennessee, USA | Cattle grazing land | [60]
DJI Inspire 1 quad-rotor with Zenmuse X5 onboard cameras | - | Quantified the spatial pattern distributions of dominant vegetation along the elevation gradient | Luntai County, China: 22 sample plots | Field experimental plots | [34]
DJI Inspire 1 v2 (Shenzhen, China) with a MicaSense RedEdge camera | WorldView-3, Sentinel-2 | Investigated using UAV and satellite platforms to monitor and classify aquatic vegetation in irrigation channels. The UAV was found to be effective for intensive monitoring of weeds in small areas of irrigation channels. | Murrumbidgee Irrigation Area (MIA), Australia: 38.5 km2 | Irrigation channels | [61]
senseFly eBee with multispectral Parrot Sequoia and RGB sensors | - | Examined object-based classification accuracies for different cover types and vegetation species using data from UAV-based multispectral cameras | Trent University campus, Central Ontario, Canada: 10 ha | Small, mixed forest and agricultural area | [62]
Octocopter (University of Tehran) with a MAPIR Survey1 visible-light camera (San Diego, CA, USA) | Sentinel-2 | Assessed and proved the suitability of integrating UAV-obtained RGB images, Sentinel-2 data, and ML models for estimating forest canopy cover (FCC), intended for precise and fast mapping at the landscape scale. | Kheyrud Experimental Forest, Northern Iran: four flown plots of 20 ha, 15 ha, 17 ha, and 19 ha | Canopy cover in a forest | [63]
DJI Phantom 4 Pro (DJI, Shenzhen, China) | Sentinel-2 | Assessed and revealed that UAV-based RGB orthophotos and CHM data have a very good ability to detect and classify scattered trees and different land covers along a narrow river. | Chaharmahal-va-Bakhtiari province of Iran: five plots | Riparian landscape adjoining a narrow river | [47]
Table 2. Examples of similar studies conducted in the southern hemisphere.

UAV Type | Satellite System | Research and Major Findings | Location, Size, and Habitats | Scope | Reference
DJI Inspire, eBee (senseFly SA, Cheseaux-sur-Lausanne, Switzerland), and Parrot Disco (Parrot, Paris, France) | Sentinel-1 SAR and Sentinel-2 | Used UAV-based imagery to create a ground-truthing dataset for mapping cropped areas, establishing a higher potential use of UAVs compared to satellite platforms. | Rwanda: small mono-cropped fields, intercropped fields, and natural vegetation (80 ha per location) | Crops, mixed crops and grassland, small tree stands, woodlands, and small forests | [64]
senseFly eBee X with a Parrot Sequoia+ multispectral camera | Synthetic aperture radar (SAR) | Assessed the synergistic approach of a multispectral UAV-based dataset and SAR for understanding the spectral features of intended objects. Used SVM and RFC. | Nigeria: International Institute of Tropical Agriculture (IITA) agricultural fields | Experimental plots | [48]
DJI Mavic Pro micro-quadcopter with a Parrot Sequoia multispectral sensor | - | Explored whether fractional vegetation component (FVC) estimates vary with different classification approaches (pixel- and segment-based random forest classifiers) applied to very high-resolution small-UAV-derived imagery. | Botswana: Chobe Enclave, Southern African dryland savanna: nine sites | Savanna cover: grass-, shrub-, and tree-dominated sites | [65]
Micro-quadcopter with a multispectral sensor (MicaSense) | - | Assessed the efficacy of UAS imagery for monitoring vegetation structural characteristics in a mixed savanna woodland. | Botswana: Chobe Enclave, grass, shrub, and tree sites (9) | Savanna cover and woody vegetation structure | [38]
eBee X fixed-wing (Airinov multiSPEC 4C sensor) | - | Successfully mapped the spatial extent of banana farmland mixed with buildings, bareland, and other areas of vegetation in four villages in Rwanda. | Rwanda: small-holder farmland | Small plots of banana farmland | [66]
DJI Phantom 4 Pro | Sentinel-2 | Assessed coastal shoreline changes using multi-platform data (drones, a shore-based camera, Sentinel satellite images, and a dumpy level) for effective monitoring. The UAV and local video cameras were more effective than Sentinel-2. | Elmina Bay, Ghana: 1.5 m beach | Beach area | [67]

2. Materials and Methods

2.1. Study Area

This study was conducted in the Burunge WMA, which covers approximately 300 km2, with a core area of 243 km2. It is located between Lake Manyara and Tarangire National Parks in Northern Tanzania, between latitudes 3°38′31.22″ and 3°56′55.78″ S and longitudes 35°43′43.98″ and 36°34.65″ E (Figure 1). The WMA was established in 2003 with the participation of ten villages (Minjingu, Mwada, Vilima Vitatu, Sangaiwe, Magara, Manyara, Maweni, Ngolei, Kakoi, and Olasiti). It is both a buffer around the two national parks and a part of the large Kwakuchinja wildlife migratory corridor (approximately 1280 km2) connecting them. Its average elevation is 1000 m [68]. The monthly average temperature ranges from 8 °C to 33 °C [69]. The area experiences low and unpredictable annual rainfall [70]. The critical migratory animals found in the Burunge WMA include wildebeest (Connochaetes taurinus), zebra (Equus quagga), and African elephants (Loxodonta africana). Other large mammals found in the area include wild dogs (Lycaon pictus), buffalo (Syncerus caffer), waterbuck (Kobus ellipsiprymnus), bushbuck (Tragelaphus scriptus), giraffes (Giraffa camelopardalis), lions (Panthera leo), and leopards (Panthera pardus).

2.2. Data Collection

2.2.1. Ground Survey for the Determination of Land Cover Types

Before conducting UAV flight mission planning, pre-UAV flight ground data were collected for each land cover type between July and August 2021. To obtain a broad range of ground-based cover types, we surveyed plots along a gradient from the Tarangire National Park boundary on the eastern side of the Burunge WMA to the Lake Manyara National Park boundary on the western side [18,72,73]. We used the definitions of LULC classes provided by researchers in the ecosystem and other similar areas (Table 3). Based on the high resolution of Google Earth images (https://www.google.com/earth/about/versions/) accessed on 8 July 2021 and direct observations, we identified and purposively selected uniform areas for each cover type, reflecting their likely physiognomy and floristic composition [72,73]. We then used a ground sampling scheme following Braun-Blanquet [72]. After identifying a uniform area for the land cover type, plots of 50 × 50 m were randomly selected for vegetation assessments, focusing on the percentage canopy cover and species composition. The tree percentage canopy cover was estimated in the 50 × 50 m plots, whereas shrubs were estimated in 20 × 20 m sub-plots nested within the tree plot, and grasses/herbs were assessed in 2 × 2 m sub-plots nested within the shrub plot.
The ground survey determined the major vegetation cover types (grasslands, shrublands, woodlands, riverine, and mosaic) (Table 4). Since the small, forested area in the extreme western part of the Burunge WMA was not accessed for ground or UAV-based surveys due to accessibility logistical challenges, we relied on Google Earth for identification. Other LULC types recorded on the ground were water, settlement and cultivation, and bareland.

2.2.2. UAV-Based Survey

UAV Used and Flight Mission Planning

We used a DJI Phantom 3 UAV outfitted with an RGB camera (DJI, Shenzhen, China) and a built-in global positioning system (GPS) to collect field data from July to August 2021. The camera had a 1/2.3″ CMOS sensor producing 12 MP images of 4000 × 3000 pixels. We planned 19 flight missions for the eight LULC types determined during the ground survey (Figure 1). Each mission was designed with a flight altitude of 120 m above ground level (AGL), a velocity of 5 m/s, and the camera set to nadir. During the reconnaissance phase, forward (75%) and side (65%) overlaps without ground control points (GCPs) did not produce high-quality orthoimages. This necessitated increasing the forward and side overlaps to 85% and 75%, respectively, and using GCPs in each plot. Some researchers have used overlaps ranging from 60 to 80% [47,49,57,62,79], while others suggest higher forward and side overlaps of 85% or above [38,65,80,81,82]. However, higher overlaps require substantial flight endurance and computer processing times to generate orthoimages [83].
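For readers who want to sanity-check such a plan, the short sketch below estimates the ground sampling distance, image footprint, and photo spacing implied by the stated altitude, speed, and overlaps. The sensor dimensions and focal length are nominal Phantom 3 values assumed for illustration only, not figures reported in this study.

```python
# Rough ground sampling distance (GSD) and photo-spacing check for the flight
# plan described above (120 m AGL, 5 m/s, 85%/75% overlaps).
# Sensor size and focal length are assumed nominal Phantom 3 values.
ALT_M = 120.0          # flight altitude above ground level (m)
SPEED_MS = 5.0         # flight speed (m/s)
FWD_OVERLAP = 0.85     # forward overlap
SIDE_OVERLAP = 0.75    # side overlap
SENSOR_W_MM, SENSOR_H_MM = 6.17, 4.55   # assumed 1/2.3" sensor dimensions
FOCAL_MM = 3.61                          # assumed focal length
IMG_W_PX, IMG_H_PX = 4000, 3000

gsd_m = (SENSOR_W_MM / 1000) * ALT_M / ((FOCAL_MM / 1000) * IMG_W_PX)
footprint_w = gsd_m * IMG_W_PX                     # across-track footprint (m)
footprint_h = gsd_m * IMG_H_PX                     # along-track footprint (m)
photo_spacing = footprint_h * (1 - FWD_OVERLAP)    # distance between exposures (m)
line_spacing = footprint_w * (1 - SIDE_OVERLAP)    # distance between flight lines (m)
trigger_s = photo_spacing / SPEED_MS               # shutter interval (s)

print(f"GSD ~ {gsd_m*100:.1f} cm/px, footprint ~ {footprint_w:.0f} x {footprint_h:.0f} m")
print(f"photo spacing ~ {photo_spacing:.1f} m, line spacing ~ {line_spacing:.1f} m, "
      f"trigger every ~ {trigger_s:.1f} s")
```

With these assumed optics, the nominal GSD at 120 m is on the order of 5 cm/pixel, consistent in magnitude with the ~4.3 cm/pixel orthoimage resolution reported below.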
The total ground area captured from each flight mission ranged from 16.3 to 47.9 ha. The wind speed measured using an anemometer ranged from 0 to 2 m/s. The sky cloud cover varied during flight missions. It was measured using the Oktas scale, as described by Ahmad et al. [84]. The sky was covered by a few clouds, defined as FEW, 1/8, and 2/8 coverage at 1 and 2 Oktas, and scattered clouds (SCT), defined as 3/8 and 4/8 coverage at 3 and 4 Oktas.

Image Processing to Create RGB Orthoimages

The UAV RGB images were processed into georectified orthomosaics using the Pix4Dmapper software package, version 4.8.4, with the standard option. The Linear Standard option was selected in the linear rolling shutter model to correct for any rolling shutter effects, followed by optimization of the camera’s internal and external parameters. The number of GCPs ranged from 5 to 10 per plot, in line with other researchers who used 5–12 GCPs per 100 ha [47,59,85,86,87,88]. A standard calibration method was selected within the ‘All Prior’ option in Pix4Dmapper. After completing the next three steps (point cloud and mesh, DSM and orthomosaic, and index processing), UAV RGB orthoimages were created with an average resolution of 0.043 m/pixel. The GCPs and rolling shutter correction significantly improved the accuracy, as indicated by the very low georeferencing mean RMS error values (0.004–0.02 m) obtained for all the orthoimages produced.
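As a hedged illustration of how such a figure is derived, the mean RMS error can be computed from the planimetric residuals between surveyed GCP coordinates and their positions in the orthomosaic. The coordinates below are made-up placeholders, not project data.

```python
import numpy as np

# Surveyed GCP coordinates (easting, northing) and the same points as measured
# in the orthomosaic; values are illustrative placeholders only.
surveyed = np.array([[821450.20, 9568210.50],
                     [821530.80, 9568160.10],
                     [821600.30, 9568235.70]])
measured = surveyed + np.array([[0.01, -0.01],
                                [0.00,  0.02],
                                [-0.01, 0.01]])   # simulated small offsets (m)

residuals = measured - surveyed                      # per-GCP offsets (m)
rms_per_gcp = np.sqrt((residuals ** 2).sum(axis=1))  # planimetric error per GCP
mean_rms = rms_per_gcp.mean()
print(f"mean RMS error ~ {mean_rms:.3f} m")          # study reports 0.004-0.02 m
```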

Upscaling Orthoimages to Sentinel-2 Grid Cell Resolutions

The derived UAV RGB orthoimages were classified using the MLC, SVM, and RFC algorithms to compare the accuracies at the plot level based on the kappa coefficient (k). The UAV RGB orthoimages and classified plot-level images were then upscaled to 10 m × 10 m grid cells, the resolution of the overlaid Sentinel-2 imagery [47,64]. The dominant pixels in the classified plot-level orthoimages determined the cover type of each grid cell (Figure 2 and Figure 3). Before upscaling, the alignment of the orthomosaics was also visually checked against satellite imagery.
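A minimal sketch of this dominant-pixel ("majority") aggregation is shown below, assuming the classified orthoimage is available as a NumPy array of integer class codes and that a 10 m Sentinel-2 cell spans roughly 200–230 orthoimage pixels at the reported resolution. The class codes and demo array are illustrative only; the study additionally applies a dominance threshold (≥75%, see Figure 3) that could be added on top of the modal rule.

```python
import numpy as np

def majority_upscale(classified: np.ndarray, block: int) -> np.ndarray:
    """Aggregate a classified raster by taking the modal class in each block."""
    rows = (classified.shape[0] // block) * block
    cols = (classified.shape[1] // block) * block
    clipped = classified[:rows, :cols]
    blocks = clipped.reshape(rows // block, block, cols // block, block)
    blocks = blocks.transpose(0, 2, 1, 3).reshape(rows // block, cols // block, -1)
    out = np.empty(blocks.shape[:2], dtype=classified.dtype)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.bincount(blocks[i, j]).argmax()  # dominant class in block
    return out

# Example: a ~0.05 m/px classified plot upscaled to 10 m cells (block = 200 px).
demo = np.random.randint(1, 6, size=(1000, 1000), dtype=np.int64)  # 5 fake classes
coarse = majority_upscale(demo, block=200)
print(coarse.shape)  # (5, 5) grid of 10 m cells
```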

2.3. Collection of Training and Validation Sample Points

The upscaled UAV orthoimages were used to create a training dataset (Figure 2 and Figure 3). The dataset was split into 70% training data and 30% testing data (Table 5). This dataset was used for the ground-linked UAV-guided Sentinel-2 LULC classification approach. The accuracy of the classified images from each classifier was assessed post-classification using a confusion matrix to generate the producer accuracy (PA), user accuracy (UA), overall accuracy (OA), and kappa coefficient index (k). A similar approach was used in Iran [47].
For the unguided Sentinel-2 LULC classification approach, the training samples were generated from Sentinel-2 RGB images. Validation sample points for the accuracy assessment of the unguided Sentinel-2 classification were derived from Google Earth imagery from the same date. For each classification approach, the training dataset was split into 70% and 30% for the training and testing datasets, respectively (Table 5).
In both classification approaches, we determined the number of training samples for each LULC class as suggested by ref. [89]. For less frequent land cover types (water, settlements, and forests), additional training samples were obtained from Google Earth at a 10 m × 10 m scale. We used the kappa coefficient to select the best classified LULC map [28,89,90] for comparative agreement and statistical tests between the UAV-guided and unguided Sentinel-2 classification approaches.
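The sketch below illustrates a 70/30 stratified split and the derivation of per-class producer and user accuracies from an error matrix, in the spirit of the workflow described above. The feature and label arrays are placeholders, and scikit-learn here only stands in for the ArcGIS tooling actually used in the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 6))            # e.g., B2, B3, B4, B8, NDVI, NDWI per point
y = rng.integers(0, 9, size=600)         # nine placeholder LULC class codes

# 70% training / 30% testing, stratified by class as in the described design.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.30, stratify=y, random_state=42)

# ... train a classifier on X_train, y_train and predict on X_test ...
y_pred = y_test.copy()                   # stand-in for real model predictions

cm = confusion_matrix(y_test, y_pred)    # rows = reference, cols = predicted
pa = np.diag(cm) / cm.sum(axis=1)        # producer accuracy per class
ua = np.diag(cm) / cm.sum(axis=0)        # user accuracy per class
print(pa.round(2), ua.round(2))
```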
Figure 2. Examples of UAV RGB orthoimages in the upper rows (A), panels (i–v), for different land cover types, classified using the MLC, SVM, and RFC algorithms to determine the different cover types at the plot level, compared with the Sentinel-2-based classification presented in the respective lower rows (B). Plots of different cover types are presented: grassland (i), shrubland (ii,iii), woodland (iv), and riverine vegetation with bare areas and alternating patches of shrubs, trees, and mosaic (v).
Figure 3. Examples of sections from upscaled heterogeneous UAV-RGB orthoimages mapped to Sentinel-2 grid cells of a 10 m × 10 m resolution in each land cover type: mosaic cover with a fairly close mixture of trees, shrubs, and bushes (A,E), water pond in a woodland plot (B), woodland (C,I–K), cultivated land (D), grassland (F,H), shallow muddy water ponds in a shrubland plot (G), and bareland (L). The dominant cover (≥75%) in each grid cell determined its corresponding land cover type [47,64].

2.4. Satellite Image Acquisition and Pre-Processing

We downloaded Level-2A Sentinel-2 images, which are processed to atmospherically corrected bottom-of-atmosphere reflectance. The scene identifier included processing baseline N0301 and relative orbit R092. The effects of vegetation phenology were minimized by downloading dry season images from August 2021. The downloaded image had 13 spectral bands. For this classification scheme, we used bands 2 (blue), 3 (green), 4 (red), and 8 (near-infrared, NIR). We also calculated the normalized difference vegetation index (NDVI) and the normalized difference water index (NDWI). The bands were clipped to the Burunge WMA boundary and projected to the Arc 1960 datum, which is commonly used for East Africa.
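The band math involved is straightforward: NDVI = (NIR − Red)/(NIR + Red) and the McFeeters NDWI = (Green − NIR)/(Green + NIR). The following sketch computes both indices, assuming the selected bands have been exported as single-band 10 m GeoTIFFs; the file names are placeholders for illustration.

```python
import numpy as np
import rasterio

def read_band(path):
    """Read a single-band raster as float32 together with its profile."""
    with rasterio.open(path) as src:
        return src.read(1).astype("float32"), src.profile

green, profile = read_band("B03_10m.tif")   # placeholder file names
red, _ = read_band("B04_10m.tif")
nir, _ = read_band("B08_10m.tif")

eps = 1e-6                                   # avoid division by zero
ndvi = (nir - red) / (nir + red + eps)       # normalized difference vegetation index
ndwi = (green - nir) / (green + nir + eps)   # McFeeters NDWI (green vs. NIR)

profile.update(count=1, dtype="float32")
with rasterio.open("ndvi_10m.tif", "w", **profile) as dst:
    dst.write(ndvi, 1)
```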

2.5. Image Classification

We used ArcGIS version 10.7.1 spatial analysis tools (segmentation and classification) with the MLC, SVM, and RFC algorithms to classify the UAV orthoimages at the plot level, the UAV-guided Sentinel-2 imagery, and the unguided Sentinel-2 imagery. For the RFC, we used 500 as the maximum number of trees, 300 as the maximum tree depth, and 10,000 as the maximum number of samples per class. The number of active variables used to split an RFC node was set to the square root of the number of input variables [91]. The number of trees and the tree depth were selected for their ability to ensure stability, robustness, and classification accuracy [50,92]. For the SVM, we used a maximum of 1000 samples per class in the pixel-based classification. For both models, the selected number of samples per class helped balance the respective training dataset, providing adequate examples to enhance the algorithm’s learning.
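The classifiers were run through ArcGIS spatial analysis tools; the scikit-learn sketch below only illustrates the stated RFC and SVM settings under that assumption and is not the implementation used in the study.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

# Hedged analogue of the reported RFC settings: 500 trees, tree depth 300,
# and sqrt(number of input variables) active variables per split.
rfc = RandomForestClassifier(
    n_estimators=500,        # maximum number of trees
    max_depth=300,           # maximum tree depth
    max_features="sqrt",     # active variables per split = sqrt(n inputs)
    n_jobs=-1,
    random_state=42,
)

# The ArcGIS SVM caps samples per class (1000); in scikit-learn that would be
# handled by subsampling the training set before fitting.
svm = SVC(kernel="rbf")

# rfc.fit(X_train, y_train); y_pred = rfc.predict(X_test)   # as in the split above
```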
The MLC is a standard and widely used parametric algorithm whose use requires the assumption that the data for each LULC class are normally distributed [93,94]. Similar UAV-based images have been classified for LULC in Bangladesh [46]. The SVM is a non-parametric, supervised machine learning algorithm that does not assume a normal data distribution [94]. The RFC is a non-parametric and robust machine-learning classification algorithm with a high classification accuracy [95]. It can assign missing values and rank variables in order of importance, allowing for a reliable assessment [95].

2.6. Accuracy Assessment

The agreement between the validation data and the classified map was assessed using an error matrix, with 30% of the samples used as the test dataset [96,97]. The overall classification accuracy (OA) and kappa coefficient (k) were computed. The kappa coefficient measures how well the classified map and the reference data match [98]. The generally accepted threshold is an overall classification accuracy of at least 85%, with a minimum of 75% accuracy for each LULC class [99,100].
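As a worked example of these metrics, the snippet below computes the overall accuracy and Cohen's kappa from a small, purely illustrative error matrix; the values are not the study's results.

```python
import numpy as np

cm = np.array([[50,  2,  1],
               [ 3, 40,  4],
               [ 1,  2, 47]])            # rows = reference, cols = classified

n = cm.sum()
p_o = np.trace(cm) / n                    # observed agreement = overall accuracy
p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2   # chance agreement
kappa = (p_o - p_e) / (1 - p_e)           # Cohen's kappa
print(round(p_o, 3), round(kappa, 3))
```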

2.7. Combining UAV-Guided and Unguided Sentinel-2 LULC Classification Maps

We combined the two LULC classification maps (RFC) to show where the two classification approaches matched or mismatched in predicting the same LULC class for each pixel. Zero raster values were first recoded to non-zero integers. For each pixel, if both approaches assigned the pixel to the same LULC class, that pixel was given a value of 2 and a blue color code. Where a pixel in the unguided Sentinel-2 LULC classification map did not match the UAV-guided classification map, the pixel was given a value of zero (0) and a red color code.
We used the Combine tool in ArcGIS 10.7.1 to match the UAV-guided and unguided Sentinel-2 LULC classification maps pixel-to-pixel [48]. We used 29 plots, including the UAV-flown plots, in the combined map to test the agreement between the two classification approaches. We calculated the agreement ratio (AR) between the UAV-guided (AO) and unguided Sentinel-2 (BO) LULC classification maps following Duke et al. [48], as indicated in Table 6 below. The following values were calculated for the computation: the matched pixels (AOBO) of the two classification approaches and the unmatched pixels for the UAV-guided (A1) and unguided Sentinel-2 (B1) maps.
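A hedged sketch of this pixel-to-pixel comparison is given below: matched pixels are coded 2 and mismatches 0, mirroring the map codes described above. The two small arrays are stand-ins for the UAV-guided and unguided RFC maps, and the exact agreement-ratio formula follows Table 6 and Duke et al. [48] rather than the simplified matched-pixel fraction printed here.

```python
import numpy as np

guided = np.array([[1, 1, 3], [2, 2, 4], [5, 5, 1]])     # UAV-guided class codes
unguided = np.array([[1, 2, 3], [2, 2, 4], [5, 1, 1]])   # unguided class codes

combined = np.where(guided == unguided, 2, 0)            # 2 = match, 0 = mismatch

matched = (combined == 2).sum()
total = combined.size
print(combined)
print(f"matched pixels: {matched}/{total} ({100 * matched / total:.1f}%)")

# A per-class agreement can be read the same way, e.g. for class 1:
cls = 1
cls_match = ((guided == cls) & (unguided == cls)).sum()
print(f"class {cls}: {cls_match} matched of {(unguided == cls).sum()} unguided pixels")
```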
A methodological flowchart guided the research team in accomplishing various components of the research (Figure 4).

3. Results

3.1. Accuracy Assessment for Ground-Linked UAV-Guided Sentinel-2 LULC Classification Approach

The RFC was the best algorithm for the ground-linked UAV-guided Sentinel-2 LULC classification approach. It registered the highest overall accuracy (94%) and kappa coefficient (0.91), followed by SVM (OA = 91%; k = 0.89) and MLC (OA = 90%; k = 0.88) (Table 7 and Tables S1–S3). All three algorithms attained UA values above 0.90 for grassland and water cover classes. The RFC attained higher UA values of shrublands (0.96) and woodlands (0.95) than the other algorithms.

3.2. Accuracy Assessment for Unguided Sentinel-2 LULC Classification Approach

The RFC algorithm gave the best unguided Sentinel-2 LULC classification when compared with the SVM and MLC. Its accuracy (OA = 90%; k = 0.87) was higher than those registered by the SVM (OA = 87%; k = 0.85) and MLC (OA = 80%; k = 0.77) (Table 8 and Supplementary Materials Tables S4–S6). The RFC registered the largest UA values compared to the SVM and MLC for the woodland, shrubland, bareland, water, forest, and cultivation LULC classes. Its UA value for grasslands was second to that registered by the SVM.

3.3. Comparative Extent and Spatial Distribution Patterns of LULC Classes Derived from UAV-Guided and Unguided Sentinel-2 Classification Approaches

With the RFC algorithm as the best classifier, the two classification approaches showed clear differences. The ground-linked UAV-guided LULC classification map registered the largest proportional area coverage for grasslands (55.20%), followed by water (24.28%), woodlands (7.59%), shrublands (4.69%), and bareland (3.96%). The proportional coverage areas of all these cover classes, except shrublands and bareland, were higher than those registered by the unguided Sentinel-2 LULC classification approach. The UAV-guided classification approach also showed lower proportional coverage areas for cultivation (1.6%), settlements (0.01%), and mosaic (1.05%) than those generated by the unguided Sentinel-2 LULC classification approach (3.30%, 0.03%, and 1.90%, respectively) (Figure 5 and Figure 6 and Table S7).
On the other hand, the unguided Sentinel-2 classification approach registered grassland (43.33%) as the largest cover type, followed by shrubland (27.37%), water (15.34%), bareland (6.02%), and cultivation (3.3%) (Figure 5 and Table S7). It showed the proportional coverage areas for shrublands, bareland, cultivation, and mosaic as larger than those registered by the ground-linked UAV-based LULC classification approach (Figure 5 and Table S5).

3.4. Agreement of UAV-Guided and Unguided Sentinel-2 LULC Classification Maps—RFC

The agreement ratio (%) of pixels matched between the UAV-guided and unguided Sentinel-2 LULC classification maps was determined using 29 plots in the combined classification map (Figure 7 and Figure 8, Table S8). The first agreement test considered the proportions (%) of each LULC class identified and predicted by the unguided Sentinel-2 LULC classification approach (RFC) that matched with the UAV-guided Sentinel-2 classification map (Figure 8A). The results showed a high agreement in the LULC classes of forests (83.8%), grasslands (75.5%), and water (55.2%). The lowest agreement was recorded in settlement LULC (0.4%), followed by cultivation (8.9%), shrublands (11.2%), and woodlands (26.2%). The agreement ratios for the bareland, riverine, and mosaic cover classes were also below 50%. Similar agreements were revealed based on the coverage areas (ha) determined by each platform (Figure 8B). The combined map clearly showed where the predictions of one LULC pixel class by the two platforms matched and where they did not. A t-test affirmed a statistically significant difference in the means for grasslands, shrublands, woodlands, water, and cultivation (Table 9).
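Assuming the reported test compared per-plot class proportions from the two approaches across the 29 plots, a paired t-test along the following lines would reproduce the analysis; the values are placeholders and the pairing design is an assumption, not a detail stated in the text.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
guided_grass = rng.uniform(0.3, 0.8, size=29)                 # UAV-guided, 29 plots
unguided_grass = guided_grass - rng.normal(0.08, 0.05, 29)    # unguided estimates

# Paired t-test on per-plot grassland proportions from the two approaches.
t_stat, p_value = stats.ttest_rel(guided_grass, unguided_grass)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```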

4. Discussion

Precise detection, prediction, and mapping of LULC classes in protected areas are central to sustainable wildlife conservation, including habitat management practices. This study sought to establish the scalability of the ground-linked UAV-guided Sentinel-2 LULC classification approach to large and complex conservation areas in the southern hemisphere. Our results suggest that undertaking ground surveys in different LULC types followed by high-resolution UAV flights generates reliable orthoimages for each cover type. Upscaling these products to a 10 m × 10 m resolution and obtaining effective training samples from them for Sentinel-2 LULC classification improves the performance accuracies. In this study, focusing on the Burunge WMA, we demonstrate that this classification approach using the RFC algorithm generated LULC classification maps with higher accuracies than those produced by the unguided Sentinel-2 LULC classification approach.
This study has provided interesting insights regarding the proportional area coverages of LULC classes generated using the UAV-guided LULC classification approach compared with the unguided Sentinel-2 LULC classification approach. The UAV-guided approach effectively delineated the LULC classes, which is mainly attributed to the flight missions being executed in ground-surveyed cover types. The observed reliable performance of the ground-linked UAV-guided LULC classification approach is explained by the UAV-based training samples, which led to both high user and overall accuracy values [47]. For instance, the unguided Sentinel-2 LULC classification approach attained lower user accuracy values for the riverine, woodland, and shrubland classes than the ground-linked UAV-guided LULC classification approach, incorrectly assigning pixels of one class to other classes due to its medium resolution (Tables S7 and S8). This is particularly expected in highly heterogeneous habitats such as riverine vegetation [101,102]. The riverine cover class in our study area was highly heterogeneous, composed of alternating and mixed vegetation classes, with both open- and closed-canopy woodlands, shrublands, mosaics, bareland, and dry riverbeds.
The large proportions of the grassland (75.5%) and forest (83.8%) cover classes predicted by the unguided Sentinel-2 approach that matched those predicted by the UAV-guided Sentinel-2 classification approach provide insights into the ability of Sentinel-2 to remain reliable for these classes (Figure 8A). This is well supported by the high user and producer accuracy values attained using the RFC (Tables S7 and S8). The Sentinel-2 bands are well recognized for improving LULC classifications of classes such as grasslands and forests [103,104]. For the water class (55.2%), the 29 plots from the combined LULC classification map did not include any open water. Hence, the lower agreement can be explained by problems identifying shallow water [105,106,107], such as lake shores, water ponds, reservoirs, and small streams, which were easily discriminated from other classes in the ground-linked UAV-guided classification approach. The ability of the UAV-guided Sentinel-2 LULC classification approach to detect small and sparse cultivated areas and the few settlements and tourism facilities in the area can be explained by the low flight heights and the resulting high resolution, leading to high user accuracies. The unguided Sentinel-2 LULC classification approach experienced detection confusion between settlements and other classes, leading to relatively lower user accuracies, as indicated in the confusion matrix tables (Table S7).
The recorded agreement gaps between the LULC classification maps produced from the two classification approaches make the ground-linked UAV-guided classification approach useful as the ground truth for the unguided Sentinel-2 LULC classification approach. A similar agreement gap between UAV-based and Sentinel-2 classification was reported in the Chobe Enclave, Botswana [65]. The UAV-guided classification approach registered a significantly smaller proportional area coverage of the cultivation LULC class compared to the unguided approach, which was quite similar to the reality on the ground. The agreement ratio test confirmed this result. Few pixels predicted by the unguided Sentinel-2 classification approach matched those predicted by the UAV-guided approach. The lack of a statistically significant difference in the proportional area coverages detected using the two classification approaches for the forest and riverine cover types would be explained by the bands in Sentinel-2 that improve its capacity to detect well-established vegetated cover types [108].
A study in the same landscape that used Landsat TM5 and Landsat-8 imagery to evaluate the performance of the Kwakuchinja wildlife corridor (1280 km2), including the Burunge WMA, reported that grasslands dominate (33.34%), followed by shrublands (22.11%) and woodlands (11.95%), with declining woodlands and increasing shrublands and cultivation from 2008 to 2018 [109]. Another study in the Kwakuchinja wildlife corridor, covering 407 km2 and including parts of the Burunge WMA, delineated only three classes (agriculture, woodlands, and mixed) and reported an increasing trend of agriculture and declining woodland and mixed cover classes from 2002 to 2018 [13]. These results differ from our findings, in that the ground-linked high-resolution UAV-guided training samples assured a higher accuracy in predicting and delineating LULC classes. Furthermore, while underscoring the possible limitations of the platforms used in those studies, our findings revealed the power of the ground-linked high-resolution UAV-guided Sentinel-2 LULC classification approach in providing practically useful information for appropriate wildlife habitat management practices.
Although our study scope did not include the trends of LULC change, the UAV-guided Sentinel-2 classification approach revealed that the woodland cover class was the second vegetated cover type after grasslands. On the other hand, the unguided Sentinel-2 LULC classification approach registered shrublands as second after grasslands. The detected extent of the woodland and cultivated LULC classes revealed by the UAV-guided Sentinel-2 LULC classification approach reflected the reality on the ground. The differences between the two classification approaches would be explained by the confusion matrix, where some woodland pixels detected by the unguided Sentinel-2 were confused with shrublands. The UAV-guided Sentinel-2 LULC classification approach provided reliable information and was hence useful for appropriate decision-making.
This work provides the first scientific evidence identifying woodlands as the second-largest vegetated cover class inside the WMA. Forage improvement for browsers and grazers is one of the primary goals of establishing the WMA. The findings reflect the effectiveness of the Burunge WMA in protecting wildlife habitats. There have been improvements in vegetation cover following integrated conservation efforts involving tourism-related investors in collaboration with communities, the Ministry of Natural Resources and Tourism (MNRT), and other stakeholders since the WMA’s establishment in 2003. Unlike the protected Burunge WMA, a large part of the Kwakuchinja wildlife corridor is not protected, and the village land use plans are inadequate to guide the different land uses.
In the context of biodiversity conservation, the second position of woodland cover after grassland registered by the UAV-guided classification approach, contrary to the unguided Sentinel-2 classification approach and the other studies reported in the area [13,109], provides an early warning sign of its potential excessive expansion in the WMA. Such expansion would reduce grassland cover unnoticed. It informs WMA management, ecologists, researchers, and other stakeholders to consider interventions that ensure balanced grazing and browsing forage in the WMA. As woody plants increase in cover, density, and biomass, other cover types such as grasslands and shrublands decline [110]. A study in South Africa reported that expanding woodland cover in the grassy biome threatens the productivity of its rangelands [111]. In the Serengeti ecosystem, small and medium-sized prey species declined in some woodland areas because woody plants, especially young trees, provided dense cover for lions to ambush from [112]. A study in the Maswa Game Reserve in Tanzania reported woody plant encroachment and its negative impact on other plant species, cover types, and grazers [113]. This study therefore provides a novel contribution to advancing LULC classification in large and heterogeneous African conservation ecosystems using UAV technology. The novelty includes the size and heterogeneous complexity of the study area, the pre-UAV flight ground data, and the UAV-based detection of woody plant encroachment. In addition, this is the first UAV-based detection of a proportional woodland area coverage that closely matches the ground situation in the area, reflecting expansion into the grasslands. Unlike other studies in the southern hemisphere that used UAV platforms in small and less heterogeneous landscapes, this study explored and revealed the feasibility and utility of the ground-linked UAV-guided Sentinel-2 LULC classification of large conservation areas in generating highly reliable information.
The role of UAV technology in generating highly reliable information and data in large environments, and its contributions to the SDGs, has been underscored [53,114]. Likewise, its usefulness in mapping rangelands has been emphasized [115], contributing to the SDGs. The high-resolution LULC classification maps (RFC) with clearly delineated LULC classes generated using the ground-linked UAV-guided approach provide useful information for effective conservation planning, sustainable management, and monitoring of wildlife habitats in the Burunge WMA. The information is useful as input to the review process of the general management plan (GMP) for the Burunge WMA, particularly for the zoning scheme that assigns the type and level of resource utilization. As a community-based conservation area, livestock grazing is allowed at a limit of acceptable use (LAU) in the earmarked zones. The generated information would be integrated with pastoralists’ indigenous knowledge for appropriate community rangeland management and to reduce land resource use conflicts in the WMA. The potential usefulness of our study findings contributes to the SDGs.
For instance, SDG 15 urges countries to “Protect, restore and promote sustainable use of terrestrial ecosystems, sustainably manage forests, combat desertification, and halt and reverse land degradation and halt biodiversity loss”. This study’s findings regarding effective habitat management and conservation operations would enhance wildlife populations and biodiversity, encouraging further tourism investment and other sustainable socioeconomic activities and providing worthwhile employment opportunities (SDG 8). The achievements from these activities would contribute to poverty reduction (SDG 1) and to combating climate change and its impacts (SDG 13). Since the Kunming-Montreal Global Biodiversity Framework (GBF) needs urgent action over the decade to 2030, the methodological approach demonstrated in this study will support countries in producing timely and reliable information contributing to appropriate actions toward reaching the GBF targets. The stated contributions to achieving the SDGs are reflected in the practical values of the LULC analysis in landscapes and ecosystems. Many other practical values are underscored, for instance, (i) detecting, monitoring, and predicting any changes for timely interventions [116], and (ii) integrating the status of LULC and habitats for wildlife populations, which helps us to link anthropogenic activities with the degradation of habitats [117]. Hence, this study has established the first UAV-based baseline data of the LULC classification map of the Burunge WMA with well-delineated LULC classes showing detailed information on its current status. It will be useful for monitoring and predicting changes in the future using a similar approach.
Although UAV technology is promising for large areas, the large dataset generated for this study required substantial processing time. Similar observations have been reported [49]. The use of appropriate training samples [118], the licensed Pix4Dmapper software, a robust classification algorithm (RFC) selected over the SVM and MLC based on accuracies, and a large number of trees for the RFC produced reliable LULC classification products with high performance accuracies. A similar conclusion was drawn by Bhatt and Maclean [49], whereby the RFC algorithm, appropriate training samples, and an ancillary dataset improved their classification performance. Interestingly, their study achieved high accuracies for the RFC algorithm: overall accuracies of 87.3–93.7% and kappa values of 0.83–0.92, categorized as “almost perfect”, similar to those achieved in this study (Table 8). Hence, the methodological flow demonstrated in this study assured the scalability of the ground-linked UAV-guided Sentinel-2 LULC classification approach to community wildlife management areas. However, this study did not cover extra-large protected areas.

5. Conclusions and Recommendations

5.1. Conclusions

This study has established the scalability of the ground-linked high-resolution UAV-guided Sentinel-2 LULC classification approach to large, heterogeneous conservation areas in Africa, taking the case study of the Burunge WMA in Tanzania. UAV technology has previously been used mainly in small- to medium-sized areas with less heterogeneous habitats. High-quality LULC classification maps for conservation areas are needed for appropriate management practices. This requires powerful training samples and a strong machine learning classifier. The ground-linked UAV-guided classification approach using the RFC algorithm outperformed the unguided Sentinel-2 LULC classification approach. It produced high-quality LULC classification maps with almost perfectly delineated classes, supported by higher overall accuracy and Cohen’s kappa coefficient values than those obtained from the unguided Sentinel-2 LULC classification approach. It also revealed the interesting finding that woodlands are the second most prominent vegetated cover class after grasslands, followed by shrublands, in proportional area coverage. In contrast, the unguided Sentinel-2 LULC classification approach ranked shrublands second after grasslands, followed by woodlands; a similar unguided LULC analysis using Landsat in the same landscape reported the same ordering. Hence, this is the first UAV-based record of its kind in the area to almost perfectly map the proportional area coverages of grasslands, woodlands, and shrublands. These findings were attributed to the powerful training samples obtained from the ground-linked UAV-guided Sentinel-2 LULC classification approach and the RFC algorithm used.
The comparative pixel-wise agreement ratio test on the combined LULC classification map (RFC) confirmed that the ground-linked UAV-guided Sentinel-2 LULC classification outperformed the unguided approach in predicting the same LULC class for each pixel. High agreement ratios were obtained only for the grassland, water, and forest cover classes. This high performance makes the LULC classification products generated using this approach reliable ground-truth data. Hence, the approach has strong potential contributions to the Sustainable Development Goals, particularly SDG 15, followed by SDGs 1, 8, and 13. The methodological approach would support countries in generating timely and reliable information for effective monitoring and management of habitats, appropriate decision-making, and, ultimately, effective actions towards reaching the Kunming-Montreal Global Biodiversity Framework (GBF) targets.
This study has established the first UAV-based baseline LULC classification map of the Burunge WMA, with well-delineated LULC classes showing detailed information on its current status. It will be useful for monitoring and predicting future changes using a similar approach. It has also provided alerting management implications regarding the woodland and shrubland cover classes, together urging close monitoring to safeguard the grassland areas for grazers. Linking the ground-linked high-resolution UAV data to the Sentinel-2 data has thus provided a novel dual-validation approach for LULC classification and a novel contribution to advancing LULC classification in large and heterogeneous African conservation ecosystems using UAV technology. The novelty of this study lies mainly in the size and heterogeneous complexity of the study area in Africa, the pre-UAV flight ground data, and the UAV-based detection of woody plant encroachment. Researchers and conservation practitioners interested in LULC classification in Africa and beyond could use the approach to generate information for sustainable wildlife habitat management, rangeland assessment and management, and conservation planning, such as general management plans of conservation areas. Generally, a research gap remains for extra-large protected areas.

5.2. Recommendations

  • Scale up this approach to the entire Kwakuchinja wildlife corridor (1280 km2), which is less protected than the Burunge WMA (~300 km2) that forms part of this important corridor. The two studies conducted in the entire corridor used Landsat, which has a medium resolution. This calls for applying the approach deployed in this study to the entire corridor, since the corridor contains lands with different levels of legal protection.
  • Scale up this approach to other community wildlife management areas in the country whose sizes range from 61 to 5372 km2 for updating their LULC maps. Using the same approach would generate high-resolution baseline information for future assessments of any LULC changes. For significantly large core protected areas such as national parks, further studies are necessary regarding how to address key challenges: costs (time and resources), the magnitude of heterogeneity, and levels of LULC classes (e.g., intact and disturbed forests with canopy gaps and regenerating ecosystems recovering from disturbances).
  • Conduct a follow-up study in the study area to assess woody plant expansion into other vegetation types, mainly grasslands, to inform management, government, and other key players about appropriate interventions.

Supplementary Materials

The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/resources13080113/s1, Tables S1–S3: Error matrices for ground-linked UAV-guided LULC classification (RFC, SVM, and MLC, respectively); Tables S4–S6: Error matrices for unguided Sentinel-2 LULC classification (RFC, SVM, and MLC, respectively); Table S7: Comparative proportional coverage area (%) of LULC classes derived from UAV-guided and unguided Sentinel-2 classification approaches (total area = 299.93 km2); Table S8: Agreement test between the ground-linked UAV-guided Sentinel-2 and unguided Sentinel-2 classification approaches (RFC).

Author Contributions

Conceptualization, L.J.M., L.K.M. and P.A.N.; methodology, L.J.M., L.K.M. and H.K.K.; software, R.D.A.; validation, L.K.M. and P.A.N.; formal analysis, L.J.M. and H.K.K.; investigation, L.J.M., L.K.M., R.D.A., H.K.K. and E.R.N.; resources, L.J.M., R.D.A., L.K.M., P.A.N. and H.K.K.; data curation, L.J.M. and H.K.K.; writing—original draft preparation, L.J.M., L.K.M., P.A.N., R.D.A., H.K.K. and E.R.N.; writing—review and editing, L.J.M., L.K.M., P.A.N., R.D.A., H.K.K. and E.R.N.; visualization, L.J.M. and H.K.K.; supervision, L.K.M., P.A.N. and R.D.A.; project administration, L.K.M., P.A.N. and R.D.A.; funding acquisition, L.J.M., L.K.M., P.A.N. and R.D.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the African Centre for Research, Agricultural Advancement, Teaching Excellence and Sustainability (CREATES), which is funded by the World Bank’s African Centers of Excellence (ACE II) initiative, grant number P15847. Sokoine University of Agriculture (SUA) provided complementing financial support for field costs.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors upon request.

Acknowledgments

The authors acknowledge non-financial support provided in the research and production of this paper: (i) Tanzania Wildlife Research Institute, Commission for Science and Technology, Permanent Secretary President’s Office Regional Administration and Local Government, and the Local Government Authority in Babati District for providing the relevant research permits; (ii) Tanzania Civil Aviation Authority and the Ministry of Defense and National Service for providing legal permits to fly a drone for the research field data collections; (iii) Joely Efraim for his extra support with data analysis using R software; (iv) Paul Julius for additional technical support in the visualization part; (v) Burunge WMA Management and Game Rangers for logistical arrangements and safety support in the field during the data collection phase; (vi) Chem Chem Safari Lodge for supporting the field car and fuel; and (vii) Emeritus Tamera Minnick of Colorado Mesa University, USA, for editorial review.

Conflicts of Interest

The authors declare that this study received funding from the African Centre for Research, Agricultural Advancement, Teaching Excellence and Sustainability (CREATES) and Sokoine University of Agriculture (SUA). The funders were not involved in the study design, collection, analysis, interpretation of data, the writing of this article, or the decision to submit it for publication. The authors declare no conflicts of interest.

References

  1. Mmbaga, N.E.; Munishi, L.K.; Treydte, A.C. How dynamics and drivers of land use/land cover change impact elephant conservation and agricultural livelihood development in Rombo, Tanzania. J. Land Use Sci. 2017, 12, 168–181. [Google Scholar] [CrossRef]
  2. Kidane, Y.; Stahlmann, R.; Beierkuhnlein, C. Vegetation dynamics, and land use and land cover change in the Bale Mountains, Ethiopia. Environ. Monit. Assess. 2012, 184, 7473–7489. [Google Scholar] [CrossRef] [PubMed]
  3. Hamilton, C.M.; Martinuzzi, S.; Plantinga, A.J.; Radeloff, V.C.; Lewis, D.J.; Thogmartin, W.E.; Heglund, P.J.; Pidgeon, A.M. Current and future land use around a nationwide protected area network. PLoS ONE 2013, 8, e55737. [Google Scholar] [CrossRef]
  4. Ndegwa Mundia, C.; Murayama, Y. Analysis of land use/cover changes and animal population dynamics in a wildlife sanctuary in East Africa. Remote Sens. 2009, 1, 952–970. [Google Scholar] [CrossRef]
  5. Jewitt, D.; Goodman, P.S.; Erasmus, B.F.; O’Connor, T.G.; Witkowski, E.T. Systematic land-cover change in KwaZulu-Natal, South Africa: Implications for biodiversity. S. Afr. J. Sci. 2015, 111, 1–9. [Google Scholar] [CrossRef] [PubMed]
  6. Mashapa, C.; Gandiwa, E.; Muboko, N.; Mhuriro-Mashapa, P. Land use and land cover changes in a human-wildlife mediated landscape of Save Valley Conservancy, south-eastern lowveld of Zimbabwe. J. Anim. Plant Sci 2021, 31, 583–595. [Google Scholar]
  7. Kiffner, C.; Wenner, C.; LaViolet, A.; Yeh, K.; Kioko, J. From savannah to farmland: Effects of land-use on mammal communities in the Tarangire–Manyara ecosystem, Tanzania. Afr. J. Ecol. 2015, 53, 156–166. [Google Scholar] [CrossRef]
  8. Hadfield, L.A.; Durrant, J.O.; Jensen, R.R.; Melubo, K.; Weisler, L.; Martin, E.H.; Hardin, P.J. Protected Areas in Northern Tanzania: Local Communities, Land Use Change, and Management Challenges; Springer: Berlin/Heidelberg, Germany, 2020. [Google Scholar]
  9. Rahmonov, O.; Szypuła, B.; Sobala, M.; Islamova, Z.B. Environmental and Land-Use Changes as a Consequence of Land Reform in the Urej River Catchment (Western Tajikistan). Resources 2024, 13, 59. [Google Scholar] [CrossRef]
  10. Twisa, S.; Mwabumba, M.; Kurian, M.; Buchroithner, M.F. Impact of land-use/land-cover change on drinking water ecosystem services in Wami River Basin, Tanzania. Resources 2020, 9, 37. [Google Scholar] [CrossRef]
  11. Sharma, R.; Rimal, B.; Baral, H.; Nehren, U.; Paudyal, K.; Sharma, S.; Rijal, S.; Ranpal, S.; Acharya, R.P.; Alenazy, A.A. Impact of land cover change on ecosystem services in a tropical forested landscape. Resources 2019, 8, 18. [Google Scholar] [CrossRef]
  12. Kideghesho, J.R.; Nyahongo, J.W.; Hassan, S.N.; Tarimo, T.C.; Mbije, N.E. Factors and ecological impacts of wildlife habitat destruction in the Serengeti ecosystem in northern Tanzania. Afr. J. Environ. Assess. Manag. 2006, 11, 17–32. [Google Scholar]
  13. Martin, E.H.; Jensen, R.R.; Hardin, P.J.; Kisingo, A.W.; Shoo, R.A.; Eustace, A. Assessing changes in Tanzania’s Kwakuchinja Wildlife Corridor using multitemporal satellite imagery and open source tools. Appl. Geogr. 2019, 110, 102051. [Google Scholar] [CrossRef]
  14. Jones, T.; Bamford, A.J.; Ferrol-Schulte, D.; Hieronimo, P.; McWilliam, N.; Rovero, F. Vanishing wildlife corridors and options for restoration: A case study from Tanzania. Trop. Conserv. Sci. 2012, 5, 463–474. [Google Scholar] [CrossRef]
  15. Riggio, J.; Caro, T. Structural connectivity at a national scale: Wildlife corridors in Tanzania. PLoS ONE 2017, 12, e0187407. [Google Scholar] [CrossRef] [PubMed]
  16. Mangewa, L.J.; Kikula, I.S.; Lyimo, J.G. Ecological Viability of the Upper Kitete-Selela Wildlife Corridor in the Tarangire-Manyara Ecosystem: Implications to African Elephants and Buffalo Movements. ICFAI J. Environ. Econ. 2009, 7, 62–73. [Google Scholar]
  17. Debonnet, G.; Nindi, S. Technical Study on Land Use and Tenure Options and Status of Wildlife Corridors in Tanzania: An Input to the Preparation of Corridor; USAID: Washington, DC, USA, 2017.
  18. Mtui, D.T. Evaluating Landscape and Wildlife Changes over Time in Tanzania’s Protected Areas. Ph.D. Thesis, University of Hawai’i at Manoa, Honolulu, HI, USA, 2014. [Google Scholar]
  19. Yadav, P.; Kapoor, M.; Sarma, K. Land use land cover mapping, change detection and conflict analysis of Nagzira-Navegaon Corridor, Central India using geospatial technology. Int. J. Remote Sens. 2012, 1, 90–98. [Google Scholar]
  20. Kiffner, C.; Kioko, J.; Baylis, J.; Beckwith, C.; Brunner, C.; Burns, C.; Chavez-Molina, V.; Cotton, S.; Glazik, L.; Loftis, E. Long-term persistence of wildlife populations in a pastoral area. Ecol. Evol. 2020, 10, 10000–10016. [Google Scholar] [CrossRef] [PubMed]
  21. Zhi, X.; Du, H.; Zhang, M.; Long, Z.; Zhong, L.; Sun, X. Mapping the habitat for the moose population in Northeast China by combining remote sensing products and random forests. Glob. Ecol. Conserv. 2022, 40, e02347. [Google Scholar] [CrossRef]
  22. Ishida, T.; Kurihara, J.; Viray, F.A.; Namuco, S.B.; Paringit, E.C.; Perez, G.J.; Takahashi, Y.; Marciano Jr, J.J. A novel approach for vegetation classification using UAV-based hyperspectral imaging. Comput. Electron. Agric. 2018, 144, 80–85. [Google Scholar] [CrossRef]
  23. Ivošević, B.; Lugonja, P.; Brdar, S.; Radulović, M.; Vujić, A.; Valente, J. UAV-based land cover classification for hoverfly (Diptera: Syrphidae) habitat condition assessment: A case study on Mt. Stara Planina (Serbia). Remote Sens. 2021, 13, 3272. [Google Scholar] [CrossRef]
  24. Kija, H.K.; Ogutu, J.O.; Mangewa, L.J.; Bukombe, J.; Verones, F.; Graae, B.J.; Kideghesho, J.R.; Said, M.Y.; Nzunda, E.F. Land use and land cover change within and around the greater Serengeti ecosystem, Tanzania. Am. J. Remote Sens. 2020, 8, 1–19. [Google Scholar] [CrossRef]
  25. Seefeldt, S.S.; Booth, D.T. Measuring plant cover in sagebrush steppe rangelands: A comparison of methods. Environ. Manag. 2006, 37, 703–711. [Google Scholar] [CrossRef] [PubMed]
  26. Sumari, N.; Shao, Z.; Huang, M.; Sanga, C.; Van Genderen, J. Urban expansion: A geo-spatial approach for temporal monitoring of loss of agricultural land. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 1349–1355. [Google Scholar] [CrossRef]
  27. Al-Ali, Z.; Abdullah, M.; Asadalla, N.; Gholoum, M. A comparative study of remote sensing classification methods for monitoring and assessing desert vegetation using a UAV-based multispectral sensor. Environ. Monit. Assess. 2020, 192, 389. [Google Scholar] [CrossRef]
  28. Ge, G.; Shi, Z.; Zhu, Y.; Yang, X.; Hao, Y. Land use/cover classification in an arid desert-oasis mosaic landscape of China using remote sensed imagery: Performance assessment of four machine learning algorithms. Glob. Ecol. Conserv. 2020, 22, e00971. [Google Scholar] [CrossRef]
  29. Tsuyuki, S. Completing yearly land cover maps for accurately describing annual changes of tropical landscapes. Glob. Ecol. Conserv. 2018, 13, e00384. [Google Scholar]
  30. Singh, R.K.; Singh, P.; Drews, M.; Kumar, P.; Singh, H.; Gupta, A.K.; Govil, H.; Kaur, A.; Kumar, M. A machine learning-based classification of LANDSAT images to map land use and land cover of India. Remote Sens. Appl. Soc. Environ. 2021, 24, 100624. [Google Scholar] [CrossRef]
  31. Mwabumba, M.; Yadav, B.K.; Rwiza, M.J.; Larbi, I.; Twisa, S. Analysis of land use and land-cover pattern to monitor dynamics of Ngorongoro world heritage site (Tanzania) using hybrid cellular automata-Markov model. Curr. Res. Environ. Sustain. 2022, 4, 100126. [Google Scholar] [CrossRef]
  32. Sekertekin, A.; Marangoz, A.; Akcin, H. Pixel-based classification analysis of land use land cover using Sentinel-2 and Landsat-8 data. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 91–93. [Google Scholar] [CrossRef]
  33. Schirrmann, M.; Giebel, A.; Gleiniger, F.; Pflanz, M.; Lentschke, J.; Dammer, K.-H. Monitoring agronomic parameters of winter wheat crops with low-cost UAV imagery. Remote Sens. 2016, 8, 706. [Google Scholar] [CrossRef]
  34. Zhang, H.; Feng, Y.; Guan, W.; Cao, X.; Li, Z.; Ding, J. Using unmanned aerial vehicles to quantify spatial patterns of dominant vegetation along an elevation gradient in the typical Gobi region in Xinjiang, Northwest China. Glob. Ecol. Conserv. 2021, 27, e01571. [Google Scholar] [CrossRef]
  35. Mesas-Carrascosa, F.J.; Notario-García, M.D.; de Larriva, J.E.M.; de la Orden, M.S.; Porras, A.G.-F. Validation of measurements of land plot area using UAV imagery. Int. J. Appl. Earth Obs. Geoinf. 2014, 33, 270–279. [Google Scholar] [CrossRef]
  36. Yu, T.; Ni, W.; Zhang, Z.; Liu, Q.; Sun, G. Regional sampling of forest canopy covers using UAV visible stereoscopic imagery for assessment of satellite-based products in Northeast China. J. Remote Sens. 2022, 2022, 9806802. [Google Scholar] [CrossRef]
  37. Mishra, V.N.; Rai, P.K.; Kumar, P.; Prasad, R. Evaluation of land use/land cover classification accuracy using multi-resolution remote sensing images. Forum Geogr. 2016, 15, 45–53. [Google Scholar] [CrossRef]
  38. Kolarik, N.E.; Gaughan, A.E.; Stevens, F.R.; Pricope, N.G.; Woodward, K.; Cassidy, L.; Salerno, J.; Hartter, J. A multi-plot assessment of vegetation structure using a micro-unmanned aerial system (UAS) in a semi-arid savanna environment. ISPRS J. Photogramm. Remote Sens. 2020, 164, 84–96. [Google Scholar] [CrossRef]
  39. Ouattara, T.A.; Sokeng, V.-C.J.; Zo-Bi, I.C.; Kouamé, K.F.; Grinand, C.; Vaudry, R. Detection of forest tree losses in Côte d’Ivoire using drone aerial images. Drones 2022, 6, 83. [Google Scholar] [CrossRef]
  40. Campos, J.; García-Ruíz, F.; Gil, E. Assessment of vineyard canopy characteristics from vigour maps obtained using UAV and satellite imagery. Sensors 2021, 21, 2363. [Google Scholar] [CrossRef]
  41. Gbiri, I.A.; Idoko, I.A.; Okegbola, M.O.; Oyelakin, L.O. Analysis of Forest Vegetal Characteristics of Akure Forest Reserve from Optical Imageries and Unmanned Aerial Vehicle Data. Eur. J. Eng. Technol. Res. 2019, 4, 57–61. [Google Scholar]
  42. Gonzalez Musso, R.F.; Oddi, F.J.; Goldenberg, M.G.; Garibaldi, L.A. Applying unmanned aerial vehicles (UAVs) to map shrubland structural attributes in northern Patagonia, Argentina. Can. J. For. Res. 2020, 50, 615–623. [Google Scholar] [CrossRef]
  43. Vinci, A.; Brigante, R.; Traini, C.; Farinelli, D. Geometrical characterization of hazelnut trees in an intensive orchard by an unmanned aerial vehicle (UAV) for precision agriculture applications. Remote Sens. 2023, 15, 541. [Google Scholar] [CrossRef]
  44. Dash, J.P. On the Detection and Monitoring of Invasive Exotic Conifers in New Zealand Using Remote Sensing. Ph.D. Thesis, University of Canterbury, Christchurch, New Zealand, 2020. [Google Scholar]
  45. Michez, A.; Philippe, L.; David, K.; Sébastien, C.; Christian, D.; Bindelle, J. Can low-cost unmanned aerial systems describe the forage quality heterogeneity? Insight from a timothy pasture case study in southern Belgium. Remote Sens. 2020, 12, 1650. [Google Scholar] [CrossRef]
  46. Mollick, T.; Azam, M.G.; Karim, S. Geospatial-based machine learning techniques for land use and land cover mapping using a high-resolution unmanned aerial vehicle image. Remote Sens. Appl. Soc. Environ. 2023, 29, 100859. [Google Scholar] [CrossRef]
  47. Daryaei, A.; Sohrabi, H.; Atzberger, C.; Immitzer, M. Fine-scale detection of vegetation in semi-arid mountainous areas with focus on riparian landscapes using Sentinel-2 and UAV data. Comput. Electron. Agric. 2020, 177, 105686. [Google Scholar] [CrossRef]
  48. Duke, O.P.; Alabi, T.; Neeti, N.; Adewopo, J. Comparison of UAV and SAR performance for Crop type classification using machine learning algorithms: A case study of humid forest ecology experimental research site of West Africa. Int. J. Remote Sens. 2022, 43, 4259–4286. [Google Scholar] [CrossRef]
  49. Bhatt, P.; Maclean, A.L. Comparison of high-resolution NAIP and unmanned aerial vehicle (UAV) imagery for natural vegetation communities classification using machine learning approaches. GIScience Remote Sens. 2023, 60, 2177448. [Google Scholar] [CrossRef]
  50. Rodriguez-Galiano, V.F.; Ghimire, B.; Rogan, J.; Chica-Olmo, M.; Rigol-Sanchez, J.P. An assessment of the effectiveness of a random forest classifier for land-cover classification. ISPRS J. Photogramm. Remote Sens. 2012, 67, 93–104. [Google Scholar] [CrossRef]
  51. Bui, D.H.; Mucsi, L. From land cover map to land use map: A combined pixel-based and object-based approach using multi-temporal landsat data, a random forest classifier, and decision rules. Remote Sens. 2021, 13, 1700. [Google Scholar] [CrossRef]
  52. Ahmad, A.; Quegan, S. Analysis of maximum likelihood classification technique on Landsat 5 TM satellite data of tropical land covers. In Proceedings of the 2012 IEEE international Conference on Control System, Computing and Engineering, Penang, Malaysia, 23–25 November 2012; pp. 280–285. [Google Scholar]
  53. Varotsos, C.A.; Cracknell, A.P. Remote Sensing Letters contribution to the success of the Sustainable Development Goals-UN 2030 agenda. Remote Sens. Lett. 2020, 11, 715–719. [Google Scholar] [CrossRef]
  54. Christensen, M.; Jokar Arsanjani, J. Stimulating implementation of sustainable development goals and conservation action: Predicting future land use/cover change in Virunga National Park, Congo. Sustainability 2020, 12, 1570. [Google Scholar] [CrossRef]
  55. Ruwaimana, M.; Satyanarayana, B.; Otero, V.; Muslim, A.M.; Syafiq, A.; Ibrahim, S.M.; Raymaekers, D.; Koedam, N.; Dahdouh-Guebas, F. The advantages of using drones over space-borne imagery in the mapping of mangrove forests. PLoS ONE 2018, 13, e0200288. [Google Scholar] [CrossRef]
  56. Stark, D.J.; Vaughan, I.P.; Evans, L.J.; Kler, H.; Goossens, B. Combining drones and satellite tracking as an effective tool for informing policy change in riparian habitats: A proboscis monkey case study. Remote Sens. Ecol. Conserv. 2018, 4, 44–52. [Google Scholar] [CrossRef]
  57. Graenzig, T.; Fassnacht, F.E.; Kleinschmit, B.; Foerster, M. Mapping the fractional coverage of the invasive shrub Ulex europaeus with multi-temporal Sentinel-2 imagery utilizing UAV orthoimages and a new spatial optimization approach. Int. J. Appl. Earth Obs. Geoinf. 2021, 96, 102281. [Google Scholar] [CrossRef]
  58. Yang, B.; Hawthorne, T.L.; Torres, H.; Feinman, M. Using object-oriented classification for coastal management in the east central coast of Florida: A quantitative comparison between UAV, satellite, and aerial data. Drones 2019, 3, 60. [Google Scholar] [CrossRef]
  59. Mienna, I.M.; Klanderud, K.; Ørka, H.O.; Bryn, A.; Bollandsås, O.M. Land cover classification of treeline ecotones along a 1100 km latitudinal transect using spectral-and three-dimensional information from UAV-based aerial imagery. Remote Sens. Ecol. Conserv. 2022, 8, 536–550. [Google Scholar] [CrossRef]
  60. Akumu, C.E.; Amadi, E.O.; Dennis, S. Application of drone and WorldView-4 satellite data in mapping and monitoring grazing land cover and pasture quality: Pre-and post-flooding. Land 2021, 10, 321. [Google Scholar] [CrossRef]
  61. Brinkhoff, J.; Hornbuckle, J.; Barton, J.L. Assessment of aquatic weed in irrigation channels using UAV and satellite imagery. Water 2018, 10, 1497. [Google Scholar] [CrossRef]
  62. Ahmed, O.S.; Shemrock, A.; Chabot, D.; Dillon, C.; Williams, G.; Wasson, R.; Franklin, S.E. Hierarchical land cover and vegetation classification using multispectral data acquired from an unmanned aerial vehicle. Int. J. Remote Sens. 2017, 38, 2037–2052. [Google Scholar] [CrossRef]
  63. Nasiri, V.; Darvishsefat, A.A.; Arefi, H.; Griess, V.C.; Sadeghi, S.M.M.; Borz, S.A. Modeling forest canopy cover: A synergistic use of Sentinel-2, aerial photogrammetry data, and machine learning. Remote Sens. 2022, 14, 1453. [Google Scholar] [CrossRef]
  64. Hegarty-Craver, M.; Polly, J.; O’Neil, M.; Ujeneza, N.; Rineer, J.; Beach, R.H.; Lapidus, D.; Temple, D.S. Remote crop mapping at scale: Using satellite imagery and UAV-acquired data as ground truth. Remote Sens. 2020, 12, 1984. [Google Scholar] [CrossRef]
  65. Gaughan, A.E.; Kolarik, N.E.; Stevens, F.R.; Pricope, N.G.; Cassidy, L.; Salerno, J.; Bailey, K.M.; Drake, M.; Woodward, K.; Hartter, J. Using Very-High-Resolution Multispectral Classification to Estimate Savanna Fractional Vegetation Components. Remote Sens. 2022, 14, 551. [Google Scholar] [CrossRef]
  66. Kilwenge, R.; Adewopo, J.; Sun, Z.; Schut, M. UAV-based mapping of banana land area for village-level decision-support in Rwanda. Remote Sens. 2021, 13, 4985. [Google Scholar] [CrossRef]
  67. Angnuureng, D.B.; Brempong, K.; Jayson-Quashigah, P.; Dada, O.; Akuoko, S.; Frimpomaa, J.; Mattah, P.; Almar, R. Satellite, drone and video camera multi-platform monitoring of coastal erosion at an engineered pocket beach: A showcase for coastal management at Elmina Bay, Ghana (West Africa). Reg. Stud. Mar. Sci. 2022, 53, 102437. [Google Scholar] [CrossRef]
  68. Lee, D.E. Evaluating conservation effectiveness in a Tanzanian community wildlife management area. J. Wildl. Manag. 2018, 82, 1767–1774. [Google Scholar] [CrossRef]
  69. Kicheleri, R.P.; Treue, T.; Kajembe, G.C.; Mombo, F.M.; Nielsen, M.R. Power struggles in the management of wildlife resources: The case of Burunge wildlife management area, Tanzania. In Wildlife Management-Failures, Successes and Prospects; IntechOpen: London, UK, 2018. [Google Scholar]
  70. Prins, H.H.; Loth, P.E. Rainfall patterns as background to plant phenology in northern Tanzania. J. Biogeogr. 1988, 15, 451–463. [Google Scholar] [CrossRef]
  71. Mangewa, L.J.; Ndakidemi, P.A.; Alward, R.D.; Kija, H.K.; Bukombe, J.K.; Nasolwa, E.R.; Munishi, L.K. Comparative assessment of UAV and sentinel-2 NDVI and GNDVI for preliminary diagnosis of habitat conditions in Burunge wildlife management area, Tanzania. Earth 2022, 3, 769–787. [Google Scholar] [CrossRef]
  72. Braun-Blanquet, J. Plant sociology. In The Study of Plant Communities, 1st ed.; W.H. Freeman & Co. Ltd.: New York, NY, USA, 1932. [Google Scholar]
  73. Jennings, M.; Loucks, O.; Glenn-Lewin, D.; Peet, R.; Faber-Langendoen, D.; Grossman, D.; Damman, A.; Barbour, M.; Pfister, R.; Walker, M. Guidelines for Describing Associations and Alliances of the US National Vegetation Classification; The Ecological Society of America Vegetation Classification Panel: Washington, DC, USA, 2004; Volume 46. [Google Scholar]
  74. Friedl, M.A.; McIver, D.K.; Hodges, J.C.; Zhang, X.Y.; Muchoney, D.; Strahler, A.H.; Woodcock, C.E.; Gopal, S.; Schneider, A.; Cooper, A. Global land cover mapping from MODIS: Algorithms and early results. Remote Sens. Environ. 2002, 83, 287–302. [Google Scholar] [CrossRef]
  75. Mtui, D.T.; Lepczyk, C.A.; Chen, Q.; Miura, T.; Cox, L.J. Assessing multi-decadal land-cover–land-use change in two wildlife protected areas in Tanzania using Landsat imagery. PLoS ONE 2017, 12, e0185468. [Google Scholar] [CrossRef]
  76. Bukombe, J.; Senzota, R.B.; Fryxell, J.M.; Kittle, A.; Kija, H.; Hopcraft, J.G.C.; Mduma, S.; Sinclair, A.R. Do animal size, seasons and vegetation type influence detection probability and density estimates of Serengeti ungulates? Afr. J. Ecol. 2016, 54, 29–38. [Google Scholar] [CrossRef]
  77. Tekle, K.; Hedlund, L. Land cover changes between 1958 and 1986 in Kalu District, southern Wello, Ethiopia. Mt. Res. Dev. 2000, 20, 42–51. [Google Scholar] [CrossRef]
  78. Bennett, A.F.; Radford, J.Q.; Haslem, A. Properties of land mosaics: Implications for nature conservation in agricultural environments. Biol. Conserv. 2006, 133, 250–264. [Google Scholar] [CrossRef]
  79. Bhatt, P.; Edson, C.; Maclean, A. Image Processing in Dense Forest Areas using Unmanned Aerial System (UAS); Michigan Tech Publications: Houghton, MI, USA, 2022. [Google Scholar]
  80. Eskandari, R.; Mahdianpari, M.; Mohammadimanesh, F.; Salehi, B.; Brisco, B.; Homayouni, S. Meta-analysis of unmanned aerial vehicle (UAV) imagery for agro-environmental monitoring using machine learning and statistical models. Remote Sens. 2020, 12, 3511. [Google Scholar] [CrossRef]
  81. Seifert, E.; Seifert, S.; Vogt, H.; Drew, D.; Van Aardt, J.; Kunneke, A.; Seifert, T. Influence of drone altitude, image overlap, and optical sensor resolution on multi-view reconstruction of forest images. Remote Sens. 2019, 11, 1252. [Google Scholar] [CrossRef]
  82. Torres-Sánchez, J.; López-Granados, F.; Borra-Serrano, I.; Peña, J.M. Assessing UAV-collected image overlap influence on computation time and digital surface model accuracy in olive orchards. Precis. Agric. 2018, 19, 115–133. [Google Scholar] [CrossRef]
  83. Flores-de-Santiago, F.; Valderrama-Landeros, L.; Rodríguez-Sobreyra, R.; Flores-Verdugo, F. Assessing the effect of flight altitude and overlap on orthoimage generation for UAV estimates of coastal wetlands. J. Coast. Conserv. 2020, 24, 35. [Google Scholar] [CrossRef]
  84. Ahmad, L.; Habib Kanth, R.; Parvaze, S.; Sheraz Mahdi, S. Measurement of Cloud Cover. In Experimental Agrometeorology: A Practical Manual; Springer: Berlin/Heidelberg, Germany, 2017; pp. 51–54. [Google Scholar]
  85. Lim, S. Geospatial Information Data Generation Using Unmanned Aerial Photogrammetry and Accuracy Assessment; Department of Civil Engineering, Graduate School Chungnam National University: Daejeon, Republic of Korea, 2016. [Google Scholar]
  86. Yun, B.-Y.; Sung, S.-M. Location accuracy of unmanned aerial photogrammetry results according to change of number of ground control points. J. Korean Assoc. Geogr. Inf. Stud. 2018, 21, 24–33. [Google Scholar]
  87. James, M.R.; Robson, S.; d’Oleire-Oltmanns, S.; Niethammer, U. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment. Geomorphology 2017, 280, 51–66. [Google Scholar] [CrossRef]
  88. Dash, J.P.; Pearse, G.D.; Watt, M.S. UAV multispectral imagery can complement satellite data for monitoring forest health. Remote Sens. 2018, 10, 1216. [Google Scholar] [CrossRef]
  89. Congalton, R.G. A review of assessing the accuracy of classifications of remotely sensed data. Remote Sens. Environ. 1991, 37, 35–46. [Google Scholar] [CrossRef]
  90. Shiraishi, T.; Motohka, T.; Thapa, R.B.; Watanabe, M.; Shimada, M. Comparative assessment of supervised classifiers for land use–land cover classification in a tropical region using time-series PALSAR mosaic data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 1186–1199. [Google Scholar] [CrossRef]
  91. Avcı, C.; Budak, M.; Yağmur, N.; Balçık, F. Comparison between random forest and support vector machine algorithms for LULC classification. Int. J. Eng. Geosci. 2023, 8, 1–10. [Google Scholar] [CrossRef]
  92. Zhang, F.; Yang, X. Improving land cover classification in an urbanized coastal area by random forests: The role of variable selection. Remote Sens. Environ. 2020, 251, 112105. [Google Scholar] [CrossRef]
  93. Hansen, M.; Dubayah, R.; Defries, R. Classification trees: An alternative to traditional land cover classifiers. Int. J. Remote Sens. 1996, 17, 1075–1081. [Google Scholar] [CrossRef]
  94. Taati, A.; Sarmadian, F.; Mousavi, A.; Pour, C.T.H.; Shahir, A.H.E. Land use classification using support vector machine and maximum likelihood algorithms by Landsat 5 TM images. Walailak J. Sci. Technol. 2015, 12, 681–687. [Google Scholar]
  95. Frakes, R.A.; Belden, R.C.; Wood, B.E.; James, F.E. Landscape Analysis of Adult Florida Panther Habitat. PLoS ONE 2015, 10, e0133044. [Google Scholar] [CrossRef] [PubMed]
  96. Lu, D.; Weng, Q.; Moran, E.; Li, G.; Hetrick, S. Remote Sensing Image Classification; CRC Press/Taylor and Francis: Boca Raton, FL, USA, 2011. [Google Scholar]
  97. Lillesand, T.; Kiefer, R.; Chipman, J. Digital image interpretation and analysis. Remote Sens. Image Interpret. 2008, 6, 545–581. [Google Scholar]
  98. Olofsson, P.; Foody, G.M.; Herold, M.; Stehman, S.V.; Woodcock, C.E.; Wulder, M.A. Good practices for estimating area and assessing accuracy of land change. Remote Sens. Environ. 2014, 148, 42–57. [Google Scholar] [CrossRef]
  99. Foody, G.M. Harshness in image classification accuracy assessment. Int. J. Remote Sens. 2008, 29, 3137–3158. [Google Scholar] [CrossRef]
  100. Wulder, M.A.; Franklin, S.E.; White, J.C.; Linke, J.; Magnussen, S. An accuracy assessment framework for large-area land cover classification products derived from medium-resolution satellite data. Int. J. Remote Sens. 2006, 27, 663–683. [Google Scholar] [CrossRef]
  101. Nguyen, U.; Glenn, E.; Dang, T.; Pham, L. Mapping vegetation types in semi-arid riparian regions using random forest and object-based image approach: A case study of the Colorado River Ecosystem, Grand Canyon, Arizona. Ecol. Inform. 2019, 50, 43–50. [Google Scholar] [CrossRef]
  102. Van Iersel, W.; Straatsma, M.; Middelkoop, H.; Addink, E. Multitemporal classification of river floodplain vegetation using time series of UAV images. Remote Sens. 2018, 10, 1144. [Google Scholar] [CrossRef]
  103. Phiri, D.; Simwanda, M.; Salekin, S.; Nyirenda, V.R.; Murayama, Y.; Ranagalage, M. Sentinel-2 data for land cover/use mapping: A review. Remote Sens. 2020, 12, 2291. [Google Scholar] [CrossRef]
  104. Otunga, C.; Odindi, J.; Mutanga, O.; Adjorlolo, C. Evaluating the potential of the red edge channel for C3 (Festuca spp.) grass discrimination using Sentinel-2 and Rapid Eye satellite image data. Geocarto Int. 2019, 34, 1123–1143. [Google Scholar] [CrossRef]
  105. Van Leeuwen, B.; Tobak, Z.; Kovács, F. Machine learning techniques for land use/land cover classification of medium resolution optical satellite imagery focusing on temporary inundated areas. J. Environ. Geogr. 2020, 13, 43–52. [Google Scholar] [CrossRef]
  106. Komarkova, J.; Sedlak, P.; Pešek, R.; Čermáková, I. Small water bodies identification by means of remote sensing. In Proceedings of the 7th International Conference on Cartography and GIS, Sozopol, Bulgaria, 18–23 June 2018; Volume 1, p. 2. [Google Scholar]
  107. Psychalas, C.; Vlachos, K.; Moumtzidou, A.; Gialampoukidis, I.; Vrochidis, S.; Kompatsiaris, I. Towards a Paradigm Shift on Mapping Muddy Waters with Sentinel-2 Using Machine Learning. Sustainability 2023, 15, 13441. [Google Scholar] [CrossRef]
  108. Waśniewski, A.; Hościło, A.; Zagajewski, B.; Moukétou-Tarazewicz, D. Assessment of Sentinel-2 satellite images and random forest classifier for rainforest mapping in Gabon. Forests 2020, 11, 941. [Google Scholar] [CrossRef]
  109. TAWIRI; United States Agency for International Development (USAID). Ecological viability assessment to support piloting implementation of wildlife corridor regulations in the proposed kwakuchinja wildlife corridor. In Kwakuchinja Wildlife Corridor Ecological Viability Assessment; USAID: Washington, DC, USA, 2019. [Google Scholar]
  110. Van Auken, O. Causes and consequences of woody plant encroachment into western North American grasslands. J. Environ. Manag. 2009, 90, 2931–2942. [Google Scholar] [CrossRef]
  111. Skowno, A.L.; Thompson, M.W.; Hiestermann, J.; Ripley, B.; West, A.G.; Bond, W.J. Woodland expansion in South African grassy biomes based on satellite observations (1990–2013): General patterns and potential drivers. Glob. Change Biol. 2017, 23, 2358–2369. [Google Scholar] [CrossRef]
  112. Sinclair, A.R.; Mduma, S.A.; Hopcraft, J.G.C.; Fryxell, J.M.; Hilborn, R.; Thirgood, S. Long-term ecosystem dynamics in the Serengeti: Lessons for conservation. Conserv. Biol. 2007, 21, 580–590. [Google Scholar] [CrossRef]
  113. Kimaro, H.; Asenga, A.; Munishi, L.; Treydte, A. Woody encroachment extent and its associated impacts on plant and herbivore species occurrence in Maswa Game Reserve, Tanzania. Environ. Nat. Resour. Res. 2019, 9. [Google Scholar] [CrossRef]
  114. Kitonsa, H.; Kruglikov, S.V. Significance of drone technology for achievement of the United Nations sustainable development goals. R-Econ. 2018, 4, 115–120. [Google Scholar] [CrossRef]
  115. Laliberte, A.S.; Rango, A.; Herrick, J. Unmanned aerial vehicles for rangeland mapping and monitoring: A comparison of two systems. In Proceedings of the ASPRS Annual Conference Proceedings, Tampa, FL, USA, 7–11 May 2007; pp. 1–10. [Google Scholar]
  116. Gambo, J.; Shafri, H.M.; Shaharum, N.S.N.; Abidin, F.A.Z.; Rahman, M.T.A. Monitoring and predicting land use-land cover (LULC) changes within and around krau wildlife reserve (KWR) protected area in Malaysia using multi-temporal landsat data. Geoplanning J. Geomat. Plan. 2018, 5, 17–34. [Google Scholar] [CrossRef]
  117. Brink, A.B.; Martínez-López, J.; Szantoi, Z.; Moreno-Atencia, P.; Lupi, A.; Bastin, L.; Dubois, G. Indicators for assessing habitat values and pressures for protected areas—An integrated habitat and land cover change approach for the Udzungwa Mountains National Park in Tanzania. Remote Sens. 2016, 8, 862. [Google Scholar] [CrossRef]
  118. Lim, T.Y.; Kim, J.; Kim, W.; Song, W. A Study on Wetland Cover Map Formulation and Evaluation Using Unmanned Aerial Vehicle High-Resolution Images. Drones 2023, 7, 536. [Google Scholar] [CrossRef]
Figure 1. Map of Burunge WMA with a terrain model background indicating the survey sample plots. The blue point inside the small red circle in the inset map of Tanzania shows the location of the Burunge WMA; reproduced with permission from [71], published by MDPI, 2022.
Figure 4. Overall methodological flowchart, adapted and modified from other researchers [47,48].
Figure 5. Proportional area coverage (%) of LULC classes identified and predicted by UAV-guided Sentinel-2 (brown bars) and unguided Sentinel-2 (blue bars).
Figure 6. Ground-linked UAV-guided Sentinel-2 LULC classification maps using MLC, SVM, and RFC (A), showing the spatial distribution of the predicted cover types, compared with unguided Sentinel-2 LULC classification maps (B) using the same algorithms. Since the RFC algorithm showed the highest overall classification accuracy and kappa coefficient (k) values, the RFC-based thematic maps for the two classification approaches were selected for subsequent comparisons.
Figure 7. Agreement map of the Burunge WMA obtained by combining the UAV-guided and unguided Sentinel-2 LULC classification maps produced with the RFC algorithm. The combined map’s blue and red color codes visually show matching and mismatching pixels for particular classes at a 10 m × 10 m grid cell resolution. This is revealed in the given examples of pixels from high-resolution UAV RGB images (left column) and the respective pixels from UAV-guided (middle column) and unguided (right column) LULC classification maps. BWMA_01 is from the grassland in Figure 2i; BWMA_02 is from a woodland plot with mixed palms, Vachellia tortilis, and shrubs in Figure 3I; BWMA_14 is from a shallow water pond located in the woodland plot in Figure 2iv and Figure 3B; BWMA_20 is from riverine vegetation near the river bank in Figure 2v.
Figure 8. (A) Proportions (%) of class-wise pixels from the unguided classification map that matched the same class of pixels in the ground-linked UAV-guided Sentinel-2 LULC classification map, based on the combined map of 29 plots. (B) Comparison of their area coverages (ha). Cover areas (ha) of each LULC class marked with different letters for the UAV-guided and unguided Sentinel-2 LULC classification approaches are significantly different.
Table 3. Definitions of LULC classes determined in BWMA.
| LULC Class | Description | Reference |
|---|---|---|
| Bare land | Exposed soil, sand, or rocks; vegetation cover of <2% | [18,74,75,76] |
| Cultivation/Agriculture | Characterized by a clear farm pattern covered by crops, harvested or with bare soil. Includes perennial woody crops cultivated inside or adjacent to protected areas. | [18,24,74,75,77] |
| Settlement/Built-up areas | Houses (scattered or clustered) inside and adjacent to the protected area. May include trees, shrubs, grasses, and roads, each with various proportions. | [18,74,75,76,77] |
| Water bodies | Rivers, streams, lakes, ponds, and impoundments composed of water, grasses, forbs, sedges, and reeds. | [18,24,74,75] |
| Grasslands | Dominated by grasses and herbs. Includes savanna grassland (widely scattered trees and shrub cover ≤ 2%) and wooded grassland (scattered tree and shrub cover < 10%). | [18,74,75,76] |
| Shrublands | Woody vegetation (evergreen or deciduous) composed of shrubs (multi-stemmed woody plants ≤ 5 m tall) and trees ≤ 2 m tall; combined canopy cover of 10–60%. | [18,24,74,75,77] |
| Woodlands | Woody vegetation (evergreen or deciduous) comprising trees > 2 m tall. Includes open woodland/woody savanna (canopy cover 20–60%) and closed woodland (60–100% cover with canopy not thickly interlaced). The understory consists of small proportions of grasses, shrubs, and forbs. | [74,76,77] |
| Mosaic | Plant community characterized by relatively similar proportions of two or more LULC classes. | [78] |
| Riverine vegetation | Trees dominate vegetation along rivers. Includes mixtures of riverine forests, riverine woodlands, and dense shrubs. | [77] |
| Forests | Trees forming closed or nearly closed canopies. May comprise an upper story of trees with heights of 40–50 m, a lower story (8–15 m), an understory (2–8 m), and vines. Degraded open/patched forests may look like intact open woodland. | [23] |
Table 4. Average percentage cover of plant layers in each vegetation cover type.
| Land Cover Type | Tree Height (m) | Tree Cover (%) | Shrub Height (m) | Shrub Cover (%) | Herbaceous Cover (%) | Bare Cover (%) |
|---|---|---|---|---|---|---|
| Plain grassland | 0 | 0 | 0 | 0 | 93.7 | 6.3 |
| Wooded grassland | 7.9 | 8.5 | 1.8 | 21 | 40.4 | 30.1 |
| Shrubland | 6.7 | 2.8 | 2.1 | 72.9 | 18.4 | 5.9 |
| Palm woodland | 7.1 | 26.6 | 2.9 | 25.8 | 37.5 | 10.1 |
| Acacia woodland | 6.8 | 15.5 | 1.5 | 13.1 | 50.8 | 20.6 |
| Riverine vegetation | 8.5 | 9.8 | 2.8 | 39.0 | 43.2 | 8.0 |
| Mosaic | 7 | 12.9 | 1.8 | 13.8 | 45.2 | 28.1 |
Table 5. Training and testing samples for LULC classification using the MLC, SVM, and RFC classifiers. Columns (i) and (ii) are for the ground-linked UAV-guided Sentinel-2 and unguided Sentinel-2 LULC classification approaches, respectively.
| LULC Class | MLC Training (i) | MLC Training (ii) | MLC Testing (i) | MLC Testing (ii) | SVM Training (i) | SVM Training (ii) | SVM Testing (i) | SVM Testing (ii) | RFC Training (i) | RFC Training (ii) | RFC Testing (i) | RFC Testing (ii) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Grassland | 234 | 132 | 101 | 81 | 244 | 115 | 105 | 87 | 265 | 363 | 114 | 156 |
| Shrubland | 74 | 89 | 32 | 38 | 50 | 101 | 22 | 31 | 143 | 67 | 61 | 29 |
| Woodland | 139 | 99 | 59 | 65 | 92 | 113 | 40 | 22 | 111 | 57 | 47 | 24 |
| Bareland | 91 | 102 | 39 | 25 | 63 | 98 | 27 | 12 | 39 | 46 | 17 | 22 |
| Water | 39 | 58 | 17 | 20 | 39 | 32 | 17 | 23 | 27 | 30 | 12 | 11 |
| Riverine | 32 | 76 | 14 | 15 | 53 | 26 | 23 | 35 | 25 | 31 | 10 | 13 |
| Forest | 22 | 56 | 9 | 20 | 34 | 36 | 15 | 18 | 23 | 34 | 10 | 14 |
| Cultivation | 19 | 25 | 12 | 3 | 49 | 47 | 21 | 23 | 31 | 35 | 9 | 11 |
| Settlement | 25 | 31 | 8 | 8 | 38 | 59 | 16 | 28 | 13 | 15 | 10 | 10 |
| Mosaic | 25 | 32 | 10 | 25 | 37 | 73 | 16 | 21 | 23 | 22 | 10 | 10 |
| Total | 700 | 700 | 300 | 300 | 700 | 700 | 300 | 300 | 700 | 700 | 300 | 300 |
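The per-class training/testing counts in Table 5 correspond to an approximately 70/30 split of the labelled reference pixels within each class. A minimal stratified-split sketch in base R is given below; the label vector, class proportions, and the 0.7 fraction are illustrative assumptions, not the authors' exact sampling procedure.

```r
# Stratified ~70/30 split of labelled reference pixels by LULC class.
# Illustrative sketch only; 'labels' and 'train_frac' are assumptions.
set.seed(42)

stratified_split <- function(labels, train_frac = 0.7) {
  per_class <- split(seq_along(labels), labels)          # indices grouped by class
  train_idx <- unlist(lapply(per_class, function(idx) {
    sample(idx, size = round(length(idx) * train_frac))  # sample within each class
  }), use.names = FALSE)
  list(train = sort(train_idx),
       test  = sort(setdiff(seq_along(labels), train_idx)))
}

# Hypothetical label vector of 1000 reference pixels across the ten classes
classes <- c("Grassland", "Shrubland", "Woodland", "Bareland", "Water",
             "Riverine", "Forest", "Cultivation", "Settlement", "Mosaic")
labels  <- sample(classes, size = 1000, replace = TRUE,
                  prob = c(0.35, 0.12, 0.14, 0.09, 0.05, 0.05, 0.05, 0.05, 0.05, 0.05))

split_idx <- stratified_split(labels)
table(labels[split_idx$train])  # per-class training counts (~70%)
table(labels[split_idx$test])   # per-class testing counts (~30%)
```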
Table 6. Template table with examples of data for three LULC classes used to compute and compare the agreement ratios (AR) between the two LULC classification approaches. Adapted and customized from Duke et al. [48].
| LULC Class | A0 | % | B0 | % | Matched Pixels (A0B0) | Unmatched A1 | Unmatched B1 | AR1 | AR2 |
|---|---|---|---|---|---|---|---|---|---|
| Grassland | 60,221 | 73 | 33,403 | 57.4 | 25,216 | 35,005 | 8187 | 75.5 | 30.4 |
| Shrubland | 5820 | 7 | 18,980 | 32.6 | 2121 | 3699 | 16,859 | 11.2 | 2.6 |
| Woodland | 16,782 | 20 | 5786 | 9.9 | 1516 | 15,266 | 4270 | 26.2 | 1.8 |
| Total | 82,823 | | 58,169 | | 28,853 | 53,970 | 29,316 | | 34.8 |

AR1 = (A0B0/B0) × 100 is the proportion of matched pixels of unguided Sentinel-2; AR2 = (A0B0/A0) × 100 is the agreement ratio for total pixels.
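The agreement ratios in Table 6 follow directly from the pixel counts. The base-R sketch below reproduces the tabulated AR1 and AR2 values for the three example classes; the vector names are ours, and the AR2 values in the table correspond to dividing the matched pixels by the total guided pixel count (82,823) rather than by the per-class A0.

```r
# Agreement ratios between the UAV-guided (A) and unguided (B) RFC maps,
# computed from the pixel counts in Table 6.
A0      <- c(Grassland = 60221, Shrubland = 5820,  Woodland = 16782)  # guided map
B0      <- c(Grassland = 33403, Shrubland = 18980, Woodland = 5786)   # unguided map
matched <- c(Grassland = 25216, Shrubland = 2121,  Woodland = 1516)   # same class in both maps

A1 <- A0 - matched                      # unmatched guided pixels
B1 <- B0 - matched                      # unmatched unguided pixels

AR1 <- matched / B0 * 100               # proportion of matched pixels of unguided Sentinel-2
AR2 <- matched / sum(A0) * 100          # agreement relative to all guided pixels

round(AR1, 1)                           # Grassland 75.5, Shrubland 11.2, Woodland 26.2
round(AR2, 1)                           # Grassland 30.4, Shrubland  2.6, Woodland  1.8
round(sum(matched) / sum(A0) * 100, 1)  # overall agreement, 34.8
```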
Table 7. Classification accuracies of the ground-linked UAV-guided Sentinel-2 LULC approach.
| LULC Class | MLC UA | MLC PA | SVM UA | SVM PA | RFC UA | RFC PA |
|---|---|---|---|---|---|---|
| Grassland | 0.98 | 0.97 | 0.94 | 0.98 | 0.94 | 0.98 |
| Woodland | 0.83 | 0.88 | 0.80 | 0.87 | 0.95 | 0.94 |
| Shrubland | 0.88 | 0.90 | 0.91 | 0.91 | 0.96 | 0.94 |
| Bareland | 0.92 | 0.87 | 0.92 | 0.86 | 0.89 | 0.86 |
| Water | 0.93 | 0.82 | 1.00 | 0.93 | 1.00 | 0.87 |
| Riverine | 0.79 | 0.82 | 0.78 | 0.81 | 0.89 | 0.89 |
| Forest | 0.90 | 0.86 | 0.90 | 0.88 | 0.85 | 0.85 |
| Cultivation | 0.81 | 0.86 | 0.96 | 0.87 | 0.90 | 0.87 |
| Settlement | 0.89 | 0.91 | 1.00 | 0.80 | 0.91 | 0.86 |
| Mosaic | 0.79 | 0.81 | 0.75 | 0.84 | 0.82 | 0.90 |
| OA (%) | 90 | | 91 | | 94 | |
| k | 0.88 | | 0.89 | | 0.92 | |

UA = user’s accuracy; PA = producer’s accuracy; OA = overall accuracy; k = kappa coefficient.
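The user’s accuracy (UA), producer’s accuracy (PA), overall accuracy (OA), and kappa coefficient (k) reported in Tables 7 and 8 are all derived from error matrices (Supplementary Tables S1–S6). The base-R sketch below shows the standard computations on a hypothetical 3 × 3 error matrix; the matrix values are illustrative only, not the study’s data.

```r
# Accuracy metrics from an error matrix (rows = classified, columns = reference).
# The 3x3 matrix below is hypothetical, for illustration only.
cm <- matrix(c(50,  2,  3,
                4, 40,  1,
                1,  3, 46),
             nrow = 3, byrow = TRUE,
             dimnames = list(classified = c("Grassland", "Shrubland", "Woodland"),
                             reference  = c("Grassland", "Shrubland", "Woodland")))

n  <- sum(cm)
UA <- diag(cm) / rowSums(cm)   # user's accuracy: correct / total classified per class
PA <- diag(cm) / colSums(cm)   # producer's accuracy: correct / total reference per class
OA <- sum(diag(cm)) / n        # overall accuracy

pe    <- sum(rowSums(cm) * colSums(cm)) / n^2   # expected chance agreement
kappa <- (OA - pe) / (1 - pe)                   # kappa coefficient

round(UA, 2); round(PA, 2); round(OA, 2); round(kappa, 2)
```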
Table 8. Accuracies for the unguided Sentinel-2 LULC classification approach.
| LULC Class | MLC UA | MLC PA | SVM UA | SVM PA | RFC UA | RFC PA |
|---|---|---|---|---|---|---|
| Grassland | 0.85 | 0.92 | 0.97 | 0.96 | 0.93 | 0.99 |
| Woodland | 0.77 | 0.75 | 0.78 | 0.86 | 0.84 | 0.83 |
| Shrubland | 0.75 | 0.80 | 0.82 | 0.82 | 0.86 | 0.76 |
| Bareland | 0.80 | 0.76 | 0.90 | 0.78 | 0.92 | 0.84 |
| Water | 0.87 | 0.76 | 0.86 | 0.91 | 0.98 | 0.84 |
| Riverine | 0.75 | 0.74 | 0.84 | 0.84 | 0.75 | 0.88 |
| Forest | 0.78 | 0.85 | 0.90 | 0.83 | 0.96 | 0.92 |
| Cultivation | 0.84 | 0.76 | 0.79 | 0.90 | 0.91 | 0.80 |
| Settlement | 0.79 | 0.75 | 0.76 | 0.78 | 0.78 | 0.76 |
| Mosaic | 0.77 | 0.75 | 0.75 | 0.77 | 0.82 | 0.75 |
| OA (%) | 80 | | 87 | | 90 | |
| k | 0.77 | | 0.85 | | 0.87 | |
Table 9. Statistical tests comparing the mean area coverage of each LULC class determined using the two classification approaches. p-values marked with the significance code “*” indicate significant differences (p ≤ 0.05).
| LULC Class | t | df | p-Value |
|---|---|---|---|
| Grassland | 2.0938 | 66.5710 | 0.0426 * |
| Shrubland | 2.5890 | 28.7600 | 0.0149 * |
| Woodland | 2.4134 | 41.8260 | 0.0128 * |
| Bareland | 0.0281 | 59.1220 | 0.9777 |
| Water | 2.8099 | 8.2062 | 0.0171 * |
| Riverine | 1.1118 | 12.7630 | 0.1084 |
| Forest | 1.6851 | 17.2380 | 0.1100 |
| Cultivation | 2.2809 | 51.4020 | 0.0267 * |
| Settlement | 1.9667 | 7.3144 | 0.1226 |
| Mosaic | 1.2129 | 25.2310 | 0.2364 |
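The fractional degrees of freedom in Table 9 are consistent with Welch’s unequal-variance two-sample t-test applied to area estimates from the two classification approaches. The base-R sketch below illustrates the form of such a test on two hypothetical vectors of class area values (ha); the input numbers are invented for illustration only.

```r
# Welch two-sample t-test comparing class area estimates from the two approaches.
# 'guided' and 'unguided' are hypothetical per-sample area values (ha),
# not the study's data; t.test() defaults to the unequal-variance (Welch) test.
set.seed(1)
guided   <- rnorm(30, mean = 120, sd = 25)   # e.g., grassland area per sample unit
unguided <- rnorm(30, mean = 100, sd = 35)

res <- t.test(guided, unguided)   # var.equal = FALSE by default
res$statistic                     # t value
res$parameter                     # fractional df (Welch-Satterthwaite)
res$p.value                       # two-sided p-value
```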
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
