Review

Meta-analysis of Unmanned Aerial Vehicle (UAV) Imagery for Agro-environmental Monitoring Using Machine Learning and Statistical Models

1 Department of Surveying and Geospatial Engineering, College of Engineering, University of Tehran, Tehran 1417466191, Iran
2 C-CORE, 1 Morrissey Rd, St. John's, NL A1B 3X5, Canada
3 Department of Electrical and Computer Engineering, Memorial University of Newfoundland, St. John's, NL A1B 3X5, Canada
4 College of Environmental Science and Forestry, Department of Environmental Resources Engineering, State University of New York, Syracuse, NY 13210, USA
5 The Canada Centre for Mapping and Earth Observation, Ottawa, ON K1S 5K2, Canada
6 Institut National de la Recherche Scientifique (INRS), Centre Eau Terre Environnement, Quebec City, QC G1K 9A9, Canada
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(21), 3511; https://doi.org/10.3390/rs12213511
Submission received: 11 September 2020 / Revised: 15 October 2020 / Accepted: 15 October 2020 / Published: 26 October 2020

Abstract

Unmanned Aerial Vehicle (UAV) imaging systems have recently gained significant attention from researchers and practitioners as a cost-effective means for agro-environmental applications. In particular, machine learning algorithms have been applied to UAV-based remote sensing data to enhance UAV capabilities across various applications. This systematic review presents a statistical meta-analysis of UAV applications combined with machine learning algorithms in agro-environmental monitoring. To this end, a total of 163 peer-reviewed articles published in 13 high-impact remote sensing journals over the past 20 years were reviewed, focusing on several features, including study area, application, sensor type, platform type, and spatial resolution. The meta-analysis revealed that 62% and 38% of the studies applied regression and classification models, respectively. Visible sensors were the most frequently used sensor technology and achieved the highest overall accuracy among classification articles. Among regression models, linear regression and random forest were the most frequently applied models in UAV remote sensing imagery processing. Agriculture, forestry, and grassland mapping were the top three UAV applications in this review, appearing in 42%, 22%, and 8% of the studies, respectively. Finally, the results of this study confirm that applying machine learning approaches to UAV imagery produces fast and reliable results.


1. Introduction

Agricultural and environmental monitoring has a direct impact on the management of natural resources and the agricultural industry by improving our understanding of hydrological processes, optimizing water distribution, and aiding natural disaster prediction and prevention [1]. By collecting data of high spatial and temporal resolution, remote sensing tools play a key role in agro-environmental monitoring, where the remoteness and vastness of observation sites make conventional data collection approaches laborious and costly [2]. Therefore, remote sensing techniques have been widely used by scientists and researchers for various agricultural and environmental applications over the past four decades [3].
Remote sensing data are collected by three types of platforms: spaceborne, airborne, and terrestrial (mobile mapping) systems [4]. Large area coverage is the main advantage of satellite data for environmental applications. However, these applications typically require data of both high spatial and high temporal resolution, which satellite imagery often cannot provide. Furthermore, fixed acquisition timing [5] and environmental conditions, such as cloud coverage, further limit the capability of satellite images for agro-environmental studies. Compared to satellite-based platforms, airborne platforms collect higher spatial resolution images and offer flexibility in adjusting acquisition parameters during flight (e.g., observation angle and flight route).
Additionally, flights can be planned and controlled under different weather conditions. Despite these benefits, collecting data using such systems (e.g., manned aircraft) is costly, making them impractical for large-scale applications [6]. The Mobile Mapping System (MMS) is a category that integrates multi-sensor data acquisition and processing technology for terrestrial applications and is therefore not an appropriate choice for environmental monitoring [7].
Unmanned Aerial Vehicle (UAV) platforms, also called remotely piloted aerial systems (RPAS) or drones, are aircraft controlled by ground operators. UAV platforms have drawn attention in the remote sensing community for several environmental applications because they combine the capabilities of both satellite and airborne systems [8]. Ultra-high spatial resolution imaging, low acquisition cost, low maintenance, and live data transmission are further advantages of UAV platforms over satellite and airborne systems [9].
Recently, a wide range of studies has investigated the unique capability of Earth Observation (EO) UAV platforms in various remote sensing applications. From an agricultural perspective, UAVs have been frequently applied to estimate several crop characteristics, such as crop water content, plant height, and canopy breadth. These are useful for monitoring crop growth and estimating the final yield [10,11,12,13,14,15,16,17,18,19,20]. In addition to agricultural applications, the remote sensing community benefits from dynamic and flexible UAV platforms in other environmental applications, including wetland vegetation mapping [21], water quality monitoring [22], sea ice classification [23], coastline monitoring [24], oil spill detection [25], mineral mapping [26], soil water content mapping [27,28,29], natural hazard mapping [30], disease detection and nutrient management, and forest mapping [31,32,33,34,35].
The advancement of Machine Learning (ML) and statistical models for processing remote sensing data plays an essential role in a variety of UAV applications due to their capability to address linear and non-linear problems and to handle large numbers of inputs. In general, classification, clustering, regression, and dimension reduction are the main areas of ML application [36]. Classification is the most commonly used technique for remote sensing data processing. Several studies demonstrated the success of support vector machine, k-nearest neighbor, and random forest for the classification of UAV imagery in various applications, including agriculture and crop mapping [37], species classification [38], land cover mapping [39], wetland classification [40], and tree detection in forestry [41]. Furthermore, the success of UAV data for crop monitoring [42], water resource management, and mineral exploration [43] has been reported in several studies using regression techniques. Other applications include crop water stress estimation [44], vineyard variability assessment [45], and soil salinity estimation [46].
There are different types of UAV platforms, such as rotary- and fixed-wing, which are recommended for varying agro-environmental applications based on several factors, including user experience, required payload, available flight control software, and sensor type. Importantly, the remote sensing community was quick to adopt advances in UAV platform technologies with the introduction of miniature and low-cost versions of satellite sensors, such as multispectral (e.g., R, G, B, NIR, Red-edge), hyperspectral, short/mid-wave range (e.g., thermal), and lightweight LiDAR. Despite several advantages of UAVs over satellite and airborne platforms, there are still substantial challenges to obtaining high-quality UAV images in several remote sensing tasks. Due to payload weight limitations, UAV platforms are typically equipped with lightweight, small, non-metric cameras, resulting in several problems, such as camera geometry and rolling shutter errors. The payload weight limitation also constrains the onboard power, which has a direct impact on flight time and, consequently, complicates mission planning for large-scale agro-environmental monitoring. A small field of view (FOV), large data volumes, relief displacement, weather condition limitations, platform instability, and vibration effects are other challenges of collecting data using UAVs. Therefore, more creative practices and studies are needed to find smart solutions that alleviate the challenges associated with UAV data collection.
Several authors have already provided surveys concerning UAVs and their applications in remote sensing, summarized in Table 1. These include several comprehensive reviews of UAV remote sensing applications in agro-environmental monitoring. In particular, a number of reviews focus on the use of UAVs in precision agriculture, such as vegetated area monitoring [47], natural and agricultural ecosystem monitoring [1], and aboveground biomass estimation [48]. Several surveys review the capability of UAV imagery for forestry applications. For example, [49] reviews UAV-based forestry applications and the regulatory framework for UAV operation in the European Union, evaluating technologies and scientific applications in the forest sector. The affordability of both UAV and sensor technologies, from the perspectives of photogrammetric processing and hardware development, was discussed in [49,50]. A review of remote sensing data processing and applications for UAV-based photogrammetric surveying was presented in [51]. The selection of camera models and parameters, platforms, and sensors, and their suitability for the user's technical needs, were discussed in [52]. In addition, a meta-analysis of studies discussing the diversity of UAV data processing procedures and techniques among the many possible strategies was presented in [53]. A comprehensive framework was defined in [54] to optimize data collection procedures and obtain high-quality products from UAV imagery and passive sensors. The framework contains five interconnected steps: study design, pre-flight fieldwork, flight mission, UAV data processing, and data quality assessment [54]. Despite the diversity of environmental UAV reviews, less attention has been paid to advances in UAV data processing techniques using machine learning algorithms; an overview of UAV data processing from a machine learning perspective in agro-environmental applications is therefore lacking. This paper is the first review to address this gap and open the door to further research on UAV remote sensing applications using machine learning algorithms for agro-environmental monitoring.
The main objective of this systematic review is to provide readers with a comprehensive treatment of current studies on UAV data processing steps for several agro-environmental applications using machine learning and statistical models. For this purpose, 163 peer-reviewed journal papers were reviewed, and a database was built based on the extracted information for platforms, sensors, algorithms, and accuracy results. This database was analyzed to identify: (1) the agro-environmental applications of UAVs in remote sensing; (2) the most frequently used UAV platform, sensor, and software employed in different remote sensing applications; (3) the essential features and requirements for the application of UAV in different agro-environmental studies; (4) the trend in using UAV data for different agro-environmental applications, such as crop classification and wetland mapping; (5) the most reliable remote sensing techniques for processing UAV imagery; and (6) the accuracy of 3D reconstruction in different applications according to the imaging system parameters. Furthermore, the effects of technical factors, such as aircraft maximum speed and flight time, overlapping ratio, and weather conditions on the quantity of photogrammetric data, aerial coverage, data volume, and accuracy were also investigated. Based on quantitative results of this systematic review, several trends and challenges for UAV image processing by machine learning and statistical models were reported and, thus, the potential avenues for future research to overcome the current challenges were introduced.

2. Data Processing Workflow

Figure 1 summarizes a general workflow for processing UAV imagery using machine learning and statistical models. The workflow contains the following steps: (1) UAV data collection, considering pre-flight preparation, mission planning, and system characteristics; (2) UAV data processing, comprising image pre-processing and photogrammetric processing; and (3) machine learning and statistical modeling, including classification and regression methods according to the study goals, and the accuracy assessment of the final products.

2.1. UAV Data Collection

Several strategies must be considered to ensure accurate UAV data collection and a safe survey. There are three primary stages: pre-flight preparation, mission planning, and selection of platform and sensor characteristics. Before an actual UAV flight, four major factors should be considered, namely UAV regulations, study area characteristics, weather conditions, and field data collection. Attending to these factors is essential for accurate UAV data collection and leads to a safely operated survey without collection mistakes.
Mission planning helps to establish a successful and safe UAV data collection campaign by determining detailed information, such as flight altitude, flight direction, the number of flight lines, and the interior orientation of the mounted camera. It depends on several factors, including the control system, ground control points (GCPs), overlap, and flight planning software. Flight height, sensor pixel size, and focal length are examples of parameters that need to be considered in camera settings. The combination of these parameters determines the final ground sampling distance (GSD) of the sensor: the GSD grows with flight height and pixel size and shrinks with focal length. A higher flight altitude increases the field of view (FOV), which decreases spatial resolution and can consequently affect feature delineation [60].
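To make the interaction of these parameters concrete, the following minimal sketch computes the GSD and single-frame ground footprint from flight height, focal length, and pixel size under the standard pinhole model; the numeric values are illustrative placeholders, not taken from any reviewed study.

```python
def ground_sampling_distance_cm(flight_height_m: float,
                                focal_length_mm: float,
                                pixel_size_um: float) -> float:
    """GSD = flight height * physical pixel size / focal length (pinhole model)."""
    gsd_m = flight_height_m * (pixel_size_um * 1e-6) / (focal_length_mm * 1e-3)
    return gsd_m * 100.0  # metres -> centimetres


def image_footprint_m(gsd_cm: float, width_px: int, height_px: int):
    """Ground coverage (width, height) of a single frame at the given GSD."""
    return (gsd_cm / 100.0 * width_px, gsd_cm / 100.0 * height_px)


# Illustrative, hypothetical sensor values:
gsd = ground_sampling_distance_cm(flight_height_m=100,
                                  focal_length_mm=8.8,
                                  pixel_size_um=2.4)
print(f"GSD: {gsd:.1f} cm/px, footprint: {image_footprint_m(gsd, 5472, 3648)} m")
```

Doubling the flight height doubles the GSD (halves the resolution) while quadrupling the ground area covered per frame, which is the trade-off mission planning must balance.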

2.2. UAV Data Processing

Images collected from UAV platforms often require pre-processing, such as radiometric and geometric corrections, to ensure their usefulness for further processing. Image color adjustment, noise elimination, vignetting correction, and blur removal are different steps of radiometric calibration, which can be applied using spectral targets of known reflectance in the field. Some external factors, such as atmospheric effects (e.g., spectral variability of the surface materials, absorption, and scattering), lead to spectral image degradation [61]. A general workflow for generating spectro-radiometrically consistent data is presented in [62], which includes (1) sensor spectral and radiometric calibration; (2) scene reflectance generation; (3) scene reflectance correction; (4) radiometric validation; and (5) metadata generation. Geometric calibration is the process of correcting and compensating for the lens distortions and intrinsic parameters (i.e., focal length, principal point, and radial and tangential distortion coefficients) of a conventional camera. Image distortions (pincushion, barrel, and mustache distortion), which are influenced by the camera lens and the shooting angle, can alter object geometry. However, by applying the required corrections, images can be orthorectified.
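As a sketch of what this geometric correction looks like in practice, the snippet below removes lens distortion with OpenCV's standard pinhole camera model. The camera matrix and distortion coefficients are placeholder values that would normally come from a laboratory or self-calibration, and the file names are hypothetical.

```python
import cv2
import numpy as np

img = cv2.imread("uav_frame.jpg")  # hypothetical input frame

# Intrinsics from a prior calibration (placeholder values):
# focal lengths fx, fy and principal point cx, cy, all in pixels.
K = np.array([[3666.0,    0.0, 2736.0],
              [   0.0, 3666.0, 1824.0],
              [   0.0,    0.0,    1.0]])

# OpenCV distortion coefficient order: k1, k2, p1, p2, k3
# (radial k-terms and tangential p-terms); placeholder values.
dist = np.array([-0.12, 0.05, 0.001, 0.0005, -0.01])

# Remap the image so straight world lines project to straight image lines.
undistorted = cv2.undistort(img, K, dist)
cv2.imwrite("uav_frame_undistorted.jpg", undistorted)
```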
In aerial photogrammetry, camera locations can be refined within the SfM algorithm, a process called self-calibration. Images can thus be used in the SfM algorithm to generate mapping products, including point clouds, digital terrain models (DTMs), digital surface models (DSMs), 3D models, and orthoimages [63]. The general workflow followed by SfM-based software is demonstrated in Figure 2 and includes (1) photo alignment and sparse point cloud generation using the calculated camera locations; (2) dense point cloud generation; (3) mesh generation; (4) texture model generation; and (5) tiled model generation. A geo-referenced orthomosaic and DSM are the final products of photogrammetric data processing. As shown in Figure 2, the GCPs are used to perform aerial triangulation and produce georeferenced images.
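Photo alignment hinges on detecting and matching tie points between overlapping frames. A minimal sketch of that first step, using OpenCV's SIFT detector with Lowe's ratio test (the image file names are hypothetical, and production SfM packages use their own, typically proprietary, matching pipelines):

```python
import cv2

img1 = cv2.imread("frame_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_002.jpg", cv2.IMREAD_GRAYSCALE)

# Detect scale-invariant keypoints and compute their descriptors.
sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Brute-force matching with Lowe's ratio test keeps only distinctive
# correspondences, which become tie points for bundle adjustment.
matcher = cv2.BFMatcher()
matches = matcher.knnMatch(des1, des2, k=2)
tie_points = [m for m, n in matches if m.distance < 0.75 * n.distance]
print(f"{len(tie_points)} candidate tie points between the two frames")
```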

2.3. Machine Learning and Statistical Models

Machine learning is generally categorized as supervised, unsupervised, and semi-supervised (or reinforcement) learning. In supervised methods, prediction models are developed from a training dataset that contains input data and labelled responses, and validation of the model is performed on a test dataset. The prediction problems can be categorized as classification, regression, and ranking. In contrast, unsupervised approaches discover an internal representation using only input data without labelled responses. Clustering, segmentation, dimension reduction, and association mining are the four main types of unsupervised methods [36]. Semi-supervised learning is a combination of supervised and unsupervised learning: semi-supervised methods employ a few labelled and many unlabelled samples as part of the training process. They exploit the information contained in the unlabelled data to generate predictive models that perform better than those using labelled data alone.
Implementing machine learning methods on UAV data enhances the capability of data processing and prediction in various applications, such as crop mapping [8] and wetland classification [40]. Classification and regression are the two main prediction problems commonly addressed in UAV-based applications. The main difference between them is that in classification the predicted variable is categorical, whereas in regression it is quantitative.
Several classification methods, such as Support Vector Machine (SVM), Classification and Regression Trees (CART) [64], Decision Tree (DT) [65], Random Forest (RF) [66], Artificial Neural Networks (ANNs) [67], and k-nearest neighbour (K-NN) [68], have been extensively used on UAV imagery. For example, [69] illustrated the superiority of the RF algorithm over SVM and ANNs for leaf area index retrieval. Similarly, several regression methods, such as random forest regression (RF) [70] and support vector regression (SVR) [12], have been used on UAV imagery.
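As an illustration of how such classifiers are typically applied to UAV-derived features, the sketch below trains a random forest on per-pixel spectral features with scikit-learn. The feature matrix and labels are synthetic stand-ins for real training samples (e.g., band values and indices extracted at labelled field plots):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in: 1000 samples x 5 features (e.g., R, G, B, NIR, NDVI).
rng = np.random.default_rng(42)
X = rng.random((1000, 5))
y = rng.integers(0, 4, 1000)  # four hypothetical land-cover classes

# Hold out 30% of the samples for accuracy assessment.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)
print("Overall accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```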
Machine learning algorithms can be implemented using either open-source or commercial software. Open-source languages such as Python and R are freely available and may be redistributed and modified. On the other hand, tools such as MATLAB and SAS are developed and maintained commercially.
In contrast to machine learning algorithms, statistical models use parametric approaches and offer a high level of interpretability. Several regression methods are categorized as statistical models, such as linear regression [71], polynomial regression [11], stepwise linear regression (SWL) [72], multiple linear regression (MLR) [42], and partial least squares regression (PLSR) [73].

2.4. Accuracy Assessment

For the accuracy assessment of machine learning and statistical methods, sampling has a significant effect on the result. Samples are collected both for training and for testing [74]. When the sample size is small, cross-validation uses multiple partitions so that each sample can be used multiple times for different purposes. There are four sample selection methods, namely simple random, stratified random, deliberative, and disproportional stratified random sampling, and three cross-validation tuning methods, namely k-fold, leave-one-out, and Monte Carlo [75].
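These three cross-validation variants map directly onto standard library utilities; a minimal scikit-learn sketch (the classifier and data are placeholders for a real model and real samples):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import (KFold, LeaveOneOut, ShuffleSplit,
                                     cross_val_score)

rng = np.random.default_rng(0)
X, y = rng.random((40, 4)), rng.integers(0, 2, 40)  # placeholder samples
clf = RandomForestClassifier(n_estimators=50, random_state=0)

schemes = [
    ("k-fold (k=5)", KFold(n_splits=5, shuffle=True, random_state=0)),
    ("leave-one-out", LeaveOneOut()),
    # Monte Carlo CV: repeated random train/test splits.
    ("Monte Carlo", ShuffleSplit(n_splits=20, test_size=0.25, random_state=0)),
]
for name, cv in schemes:
    scores = cross_val_score(clf, X, y, cv=cv)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```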
Classification accuracy can be evaluated using several metrics derived from a confusion matrix. These accuracy assessment indices include the overall accuracy (OA), producer's accuracy (PA), user's accuracy (UA), and Kappa coefficient [76]:
$$\mathrm{OA} = \frac{\sum_{i=1}^{C} n_{ii}}{n}$$

$$\mathrm{Kappa} = \frac{n \sum_{i=1}^{C} n_{ii} - \sum_{i=1}^{C} n_{i+}\, n_{+i}}{n^{2} - \sum_{i=1}^{C} n_{i+}\, n_{+i}}$$
where $n$ is the total number of test samples, $C$ is the total number of classes, $n_{ii}$ is the diagonal element of the confusion matrix, $n_{i+}$ is the sum of row $i$, and $n_{+i}$ is the sum of column $i$. OA represents the ratio between the number of samples that are correctly recognized by the classification algorithm and the total number of test samples. The Kappa coefficient is a measure based on the difference between the actual agreement in the confusion matrix (as indicated by the main diagonal) and the chance agreement, which is indicated by the row and column totals. The Kappa coefficient is widely adopted because it also uses the off-diagonal elements of the error matrix and compensates for chance agreement.
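These two formulas translate directly into a few lines of NumPy; the confusion matrix below is an arbitrary illustrative example, not data from any reviewed study:

```python
import numpy as np

# Example 3-class confusion matrix (rows: reference, columns: predicted).
cm = np.array([[50,  3,  2],
               [ 4, 45,  6],
               [ 1,  5, 40]])

n = cm.sum()                      # total number of test samples
oa = np.trace(cm) / n             # sum of diagonal / total

# Chance agreement from row and column totals: sum(n_i+ * n_+i) / n^2.
chance = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n**2
kappa = (oa - chance) / (1 - chance)
print(f"OA = {oa:.3f}, kappa = {kappa:.3f}")
```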
In regression models, the most commonly used evaluation metrics are r-squared (R2), root mean square error (RMSE), residual standard error (RSE), and mean absolute error (MAE). RMSE measures the model's average error in predicting the outputs. Mathematically, it is the square root of the mean squared error (MSE), which is the average squared difference between the observed output values and the values predicted by the model.
$$\mathrm{MSE} = \mathrm{mean}\big((\mathrm{observed} - \mathrm{predicted})^{2}\big)$$

$$\mathrm{RMSE} = \sqrt{\mathrm{MSE}}$$
Similar to the RMSE, Mean Absolute Error (MAE) measures the prediction error. It is the average absolute difference between observed and predicted outcomes.
$$\mathrm{MAE} = \mathrm{mean}\big(\lvert \mathrm{observed} - \mathrm{predicted} \rvert\big)$$
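Both metrics are equally direct to compute; a minimal sketch with illustrative values:

```python
import numpy as np

observed = np.array([2.1, 3.4, 1.8, 4.0, 2.9])   # illustrative values
predicted = np.array([2.3, 3.1, 2.0, 3.6, 3.0])

mse = np.mean((observed - predicted) ** 2)
rmse = np.sqrt(mse)
mae = np.mean(np.abs(observed - predicted))
print(f"RMSE = {rmse:.3f}, MAE = {mae:.3f}")
```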

3. Method

In this study, a systematic review and meta-analysis of remote sensing studies that applied machine learning and statistical models to Unmanned Aerial Vehicle (UAV) data was conducted according to the PRISMA protocol [77]. Relevant literature was searched in the ISI Web of Science database using the following terms: ("UAV" OR "drone" OR "UAS" OR "unmanned aerial vehicle" OR "unmanned aerial system" in [Title]) AND ("remote sensing" in [Topic]). The search was restricted to full-length English-language articles published over the past two decades (i.e., 2000 to 2019) in 13 high-impact remote sensing journals, including Remote Sensing of Environment, Sensors, Remote Sensing, ISPRS Journal of Photogrammetry and Remote Sensing, IEEE Transactions on Geoscience and Remote Sensing, IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, International Journal of Applied Earth Observation and Geoinformation, Remote Sensing Letters, International Journal of Remote Sensing, and Photogrammetric Engineering and Remote Sensing. Studies classified as review papers, book chapters, reports, and Ph.D. theses were not considered in this systematic review. Figure 3 demonstrates the selection process. An analysis of keyword frequency is shown in Figure 4. The most frequently used keywords include "unmanned aerial vehicle," "UAV," "remote sensing," "mapping," "classification," and "regression."
In this review, only studies applying machine learning and statistical methods to UAV data processing were included. Studies concerning the development of UAV remote sensing techniques for urban applications were excluded from the database, since the overarching aim of this review was to provide readers with a general view of UAV remote sensing in agro-environmental applications. The review also did not cover studies on the development of fundamental photogrammetric and remote sensing methods for UAV imagery; a separate review remains to be conducted for researchers interested in such studies. For the meta-analysis of UAV remote sensing applications, a database with 34 fields was constructed (summarized in Table A1 in Appendix A). These fields comprise general information, such as title and author name, as well as technical information, such as platform type and sensor specifications. Documenting complete information from the reviewed studies enables a precise discussion of UAV imagery in different remote sensing applications.

4. Results and Discussion

Following the method described in Section 3, relevant data were extracted by reviewing a total of 163 publications that used UAV-based remote sensing data within a machine learning or statistical modeling framework (i.e., regression and classification). A detailed review of the meta-analysis results, covering the general characteristics of the articles and the remote sensing data analysis, is presented in the following sections.

4.1. General Characteristics of Studies

The primary source of information in this study was articles published in 10 high-ranked journals. Only journals that published more than three papers are shown in Figure 5. As seen, the top five journals for agro-environmental monitoring using machine learning and statistical models were Remote Sensing (n = 87, 54%), Sensors (16, 10%), International Journal of Remote Sensing (16, 10%), International Journal of Applied Earth Observation and Geo-information (13, 8%), and ISPRS Journal of Photogrammetry and Remote Sensing (12, 7%).
Figure 6 illustrates the worldwide distribution of the 39 countries represented. As shown, most studies pertaining to UAV imagery for agro-environmental monitoring are mainly located in North America, Asia, and Europe. Countries with more than ten studies are China (45), United States (26), Canada (13), Italy (12), and Germany (11). In addition, Australia (9), Finland (9), Spain (6), Netherlands (5), Japan (5), South Africa (4), and Brazil (4) are also worthy of note, as their number of publications exceeds four.
The publication trends for the five platform types reviewed (fixed-wing, helicopter, quadcopter, hexacopter, and octocopter) are illustrated in Figure 7. While the number of related articles did not noticeably increase from 2012 to 2015, growth has accelerated since 2015. Fixed-wing UAVs were the most common platform from 2015 to 2018. However, an increasing trend in the use of hexacopter and quadcopter platforms is apparent from 2018 to 2019. Multirotors have become the more popular platforms because of their unique characteristics, including vertical take-off and landing, recovery capability, and low cost, compared to fixed-wings, which are a suitable choice for larger-scale surveys because of their energy efficiency [60]. However, multirotors often carry lower-grade Global Positioning System (GPS) receivers, which may decrease positioning accuracy, especially in mountainous areas with weaker GPS reception. Harsh environments, such as heavy winds and cold temperatures, can also influence the accuracy of UAV products at the edges and cause battery drainage and compass (magnetic declination) errors [78]. The use of a compact and transferable ground station with a gimbal can help overcome weather threats [23,78]. During pre-flight planning, fixed-wing platforms can be considered for data collection because of their stability in crosswind flights [79]. Inaccuracies of UAV products at the edges can be reduced by expanding the survey beyond the area of interest, thereby providing sufficient overlap [80].

4.2. UAV and Agro-environmental Applications

A wide variety of agro-environmental applications using UAV imagery based on machine learning and statistical models is shown in Figure 8. In total, approximately 62% of the studies focused on regression methods, whereas the remainder used classification methods. Among all the studies, the largest numbers of articles concerned agriculture (68) and forestry (36). Studies focusing on grassland (13), land use/land cover (LULC) (10), and wetlands (9) are the next most abundant applications, and these mostly applied classification. Coastal management was the subject of eight research articles, with an equal number of studies for classification and regression. Other topics were related to mining (7), soil (6), water (4), and sea ice (3). The association of machine learning and statistical models with UAV imagery has proved its efficiency in several fields, such as crop monitoring and forestry applications, including irrigation and water management [81], soil moisture prediction [82], and forest metrics estimation [83].

4.3. UAV and Sensor Types

In terms of sensor technology, visible sensors are the most widely adopted (93 studies, 51%), being derived from the consumer market and easily installed on UAV platforms, followed by multispectral (40, 22%), hyperspectral (26, 14%), and LiDAR (11, 6%) technology (see Figure 9b).
High spatial resolution UAV imagery with a resolution between 0 and 10 cm is the most frequently employed data source for machine learning approaches (Figure 9a); more than 90% of the case studies fall in this resolution range, mainly involving visible and multispectral images. For medium resolutions (10 to 20 cm), the data sources mainly come from multispectral images; in addition, six case studies adopted hyperspectral images and eight employed visible imagery at medium resolution. At low resolutions (more than 20 cm), studies that adopted hyperspectral images outnumbered the others. It should be noted that the final quality of the DSM and orthomosaic, and the calibration method used, are profoundly affected by spatial image resolution: higher sensor spatial resolution increases the need for proper calibration.
High-resolution visible sensors are used for a broad range of applications, including agriculture (29 studies), forestry (18), grassland (10), LULC (9), and coastal management (8). Other sensors, such as multispectral and LiDAR, are increasingly used in several applications since their miniaturization has encouraged their uptake in UAV surveys. However, they have yet to reach the popularity of visible sensors due to their special characteristics and increased cost. Moreover, hyperspectral sensors are heavy, and their miniaturization is a challenging procedure [84]. Despite this, the use of hyperspectral sensors on UAV platforms is growing. For instance, Yuan et al. [69] used a hyperspectral snapshot camera (Cubert GmbH, Ulm, Baden-Wurttemberg, Germany) onboard a hexacopter for crop mapping. Thermal infrared sensors were also used in four studies. Thermal infrared sensors have lower resolution than visible or multispectral sensors, and their radiometric calibration is challenging. Nevertheless, they are still adequate for monitoring brightness temperature differences in applications such as land use/land cover mapping [69].
Based on commercial brands, the categories of sensors used in the surveyed studies are shown in Figure 10. Sensor brands with frequencies of less than three were excluded. As shown, Canon (23%, 30 studies) is the most frequently used sensor brand, followed by Sony (20%, 27), MicaSense (21%, 21), and Tetracam (12%, 16). The majority of the articles used visible sensors, the most popular being the Canon PowerShot (14) and Sony NEX (9) series, followed by multispectral cameras, with the commercial brands MicaSense Parrot Sequoia (12) and MicaSense RedEdge (9), and LiDAR sensors of the Velodyne brand. Hyperspectral sensor brands, such as Headwall and Cubert, were used in a minority of the surveyed studies.
The popularity of visible sensors is mainly due to their low cost and high-resolution images, which make them sufficient for many applications, such as vegetation classification, LULC mapping, and river surveys. Vegetation classification can also be improved with modified near-infrared (NIR) cameras [16]. Elsner et al. [85] suggested using a Canon PowerShot visible sensor with a resolution of 16 megapixels and a flight height of 70 m for beach surveys.
Figure 11 categorizes the surveyed studies on the basis of the commercial brands of the UAV platforms used. DJI [86] was the prevailing multirotor brand, with a 45% share in the surveyed literature. The second most predominant platform was the fixed-wing senseFly brand, with a 17% share. In particular, the DJI Matrice M600 Pro was the most frequently used multirotor, followed by the senseFly eBee and the Mikrokopter OktoXL. The popularity of multirotors stems from their reasonable price, easy transport, and convenient, safe launching and landing. However, fixed-wing UAVs offer some advantages, with increased flight control and longer flight duration.

4.4. UAV and Image Overlapping

There are optimal overlap thresholds for specific vegetation types, depending on the specific characteristics of the surveyed area in different studies. As indicated in Figure 12 and Figure 13, agriculture and grassland area types have the same medians of forward and side overlap, at about 80% and 70%, respectively. The forestry boxplot shows higher medians of forward and side overlap (85%, 73%), and wetland shows lower medians (75%, 70%).
Image overlap influences the success of the SfM algorithm, and achieving an appropriate forward and side overlap depends on several factors, such as flight height and study area characteristics (scene heterogeneity/homogeneity). Carrascosa et al. [87] recommended forward and side overlaps of at least 70% and 40% for agricultural land cover, assuming identifiable tie points in the images and a flight height of 60–90 m. Jayathunga et al. [31] used a forward overlap above 95% and an 80% side overlap in forestry land cover, while Honkavaara et al. [26] employed a minimal overlap of 30% for mining. Using high forward and side overlap percentages increases the number of images required to cover the study area and the flight time, and consequently the data volume and computation time. Study area types such as agricultural fields, grasslands, and forests, with low feature diversity and relatively flat topography, require higher overlap percentages so that enough tie points can be extracted for the SfM algorithm. Sensor technology is another factor that influences the required image overlap; for example, thermal images, with small sensor resolution compared to visible sensors, require image overlap above 80%. Considering all these aspects, achieving optimal forward and side overlap for UAV flights is a challenging task [88,89].
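The trade-off described above can be made concrete: given a single-frame footprint, the forward and side overlap percentages fix the photo spacing and flight-line separation, and hence the number of images per unit area. A minimal sketch with illustrative numbers (not drawn from any reviewed study):

```python
def photo_spacing_m(footprint_along_m: float, forward_overlap: float) -> float:
    """Distance flown between exposures for a given forward overlap."""
    return footprint_along_m * (1.0 - forward_overlap)


def line_spacing_m(footprint_across_m: float, side_overlap: float) -> float:
    """Separation between adjacent flight lines for a given side overlap."""
    return footprint_across_m * (1.0 - side_overlap)


# Illustrative frame footprint of 100 m x 150 m (along x across track):
for fwd, side in [(0.80, 0.70), (0.70, 0.40)]:
    along = photo_spacing_m(100, fwd)
    across = line_spacing_m(150, side)
    images_per_km2 = 1e6 / (along * across)
    print(f"{fwd:.0%}/{side:.0%} overlap -> ~{images_per_km2:.0f} images/km^2")
```

With these example numbers, raising the overlap from 70%/40% to 80%/70% roughly triples the image count, which is exactly the data volume and computation cost discussed above.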

4.5. UAV Image Processing Software

Figure 14 illustrates the percentages of case studies in terms of SfM software. More than 92% of the studies used Agisoft PhotoScan or Pix4D, both commercial products, in contrast to open-source photogrammetric software such as VisualSfM and Bundler. Of the case studies that reported the processing software used (140 studies), 62% used Agisoft PhotoScan to process UAV remote sensing data (80 studies), followed by Pix4D with about 30% (39 studies). Moreover, 11 studies are categorized as "others," since other packages were used, such as Bundler, NGATE, and GerMAP (Figure 14). In this study, Agisoft was found to be the most popular SfM software.
Several studies compared the performance of Agisoft and Pix4D, in which the same workflow is used for creating orthomosaics and DSMs. For example, [90] compared the orthomosaic accuracies generated from the same survey using these two popular SfM packages; higher error in the x and y positions was observed when using Agisoft, while higher z error was observed when using Pix4D. Agisoft takes advantage of automatic image quality assessment, excluding low-quality images from the photogrammetric processing since they can cause significant errors in SfM products [91]. Agisoft also uses manual identification of GCPs in two consecutive images and then filters the remaining images that contain the same GCPs. This process can significantly improve geo-referencing accuracy after optimizing the camera alignment, in which the estimated camera coordinates and GCPs are updated to reduce the georeferencing error. The quality of final SfM products has improved in recent years through the introduction of different algorithms, such as Monte Carlo methods for optimizing UAV surveys and the deployment of GCPs [92]. Nevertheless, reporting the processing parameters, such as the dense point cloud reconstruction parameters, the key point and tie point limits, and the accuracy level, is highly recommended regardless of software choice [54].
The machine learning and statistical software used in the surveyed studies is shown in Figure 15. Of the case studies that reported this software (85 studies), 25% used Matlab, followed by R with about 20%. Moreover, the commercial packages ENVI and eCognition are the next most common, with shares of 17% and 13%, respectively.

4.6. UAV and Flight Height

A positive correlation between spatial resolution and flight height can be seen in the studies that provided details of UAV flights. As shown in Figure 16, in more than 80% of cases, the data were acquired at a GSD between 0.3 and 10 cm and a flight height between 10 and 200 m. Helicopter platforms have the highest average flight height (245 m) and GSD (19.8 cm), followed by fixed-wing (227 m, 10 cm), hexacopter (101 m, 6.5 cm), quadcopter (83 m, 6 cm), and octocopter (80 m, 5 cm) platforms. Higher flight heights were used in classification studies more often than in regression studies.
Decreasing the flight height reduces the GSD, increases the spatial image resolution, and consequently decreases the coverage area. Thus, a larger number of flight missions, under variable environmental conditions, is required to cover the study area. This variability across flight missions may result in low spectral accuracy and complicated radiometric correction. Therefore, high flight heights are recommended to cover large areas while maintaining relatively constant environmental conditions, such as sun angle, radiance, and cloudlessness. Sensor capture time is another factor that affects flight height: sensors like the Tetracam, with a longer capture time (2 s), may require a higher flight height for the carrying platform to ensure that sufficient overlap is achieved. Fixed-wing platforms, with their high average flight height, are less commonly used to carry this type of sensor [53]. In general, a high flight height may affect feature delineation because of the lower spatial resolution resulting from the larger field of view [60]. In contrast, a low flight height may produce a more irregularly shaped DSM [56].
Radiometric calibration and correction still play an essential role in UAV image processing, even though atmospheric effects are reduced when images are captured at a low flying height. However, only 38% of the studies in this review mentioned radiometric correction, among which the empirical line method [93] was the most frequently used approach. Other radiometric problems, such as vignetting effects and radial, tangential, and decentring distortions, which may arise from the sensors or platforms themselves, can propagate as errors into vegetation indices or classifications if not corrected in the pre-processing stages [53].
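The empirical line method fits a linear (gain/offset) relationship between the image digital numbers observed over calibration targets and the targets' known field reflectance, and then applies that relationship band-wise; a minimal sketch with hypothetical panel values:

```python
import numpy as np

# Mean digital numbers (DN) extracted over calibration panels, and the
# panels' known reflectance (hypothetical values for illustration only).
panel_dn = np.array([22.0, 80.0, 150.0, 210.0])
panel_reflectance = np.array([0.03, 0.22, 0.44, 0.64])

# Fit reflectance = gain * DN + offset (np.polyfit returns slope first).
gain, offset = np.polyfit(panel_dn, panel_reflectance, deg=1)


def dn_to_reflectance(dn_band):
    """Apply the fitted empirical line correction to an entire image band."""
    return gain * np.asarray(dn_band) + offset


print(f"gain = {gain:.5f}, offset = {offset:.4f}")
```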
Geometric correction is another essential step in image processing. In this review, only 76 of the 163 studies (46%) provided details on geo-referencing and geometric correction. Thirty-two studies (20%) adopted GPS RTK, 17 studies (11%) used an onboard differential global navigation satellite system (DGNSS), and only two studies (1%) used GPS PPK techniques. DGNSS techniques, which require two GNSS receivers, estimate position at the centimeter accuracy level [94]. In [95], the traditional method of geo-referencing based on GCPs was described as accurate and cost-efficient, although it is time-consuming and requires considerable fieldwork. A possible reason for the popularity of the RTK method is its efficiency in surveying inaccessible areas, and its results are not affected by onboard flight patterns or data post-processing [54].

4.7. UAV and Ancillary Data

Ancillary data sampling is another critical step in UAV image processing. Ancillary data collection strategies and types are highly dependent on the research application and environment type. Among all studies in this review, 103 (62%) utilized ancillary data in their processing. As shown in Figure 17, ancillary data are more prevalent in regression studies (62%) than in classification studies (33%). Moreover, in-situ measurements are more common in regression, with a 62% share (64 studies), than in classification, with only 24 studies. On the other hand, satellite images were used only for classification, with a 4% share. For example, canopy height and width were measured in [64] for olive tree crown parameter assessment using 3D reconstruction. A hand-held seawater salinity refractometer was used in [96] to evaluate tidal and meteorological influences on wetlands. In the case of classification, airborne bathymetric LiDAR surface intensity was used in [97] for coral reef mapping.
In-situ measurements, such as biomass sampling, are conventional reference data for biomass estimation in vegetation monitoring [42], winter wheat crop monitoring [11], and the biomass retrieval of aquatic plants [72]. LAI measurement is another type of field data, used in [17] for grain yield prediction in rice and in [98] for leaf area index mapping of mangrove forests. Terrestrial laser scanning surveys can also serve as a reference, as in [99] for terrain relief determination and in [100] for creek monitoring in an alpine environment. Combining in-situ measurements and UAV hyperspectral imagery parameters within a machine learning framework yielded an accurate estimation of soil moisture content in [72]. WorldView-2 images and in-situ measurements were used for object-based classification and validation in [58]. Fusing vertical information, such as airborne LiDAR data, with spectral imagery can improve classification accuracy [70,80].

4.8. Classification Performance

About 38% of the studies in this review used classification methods to extract information from UAV remote sensing imagery. The overall accuracy of these techniques is assessed in the following subsections with respect to different parameters, such as spatial resolution, sensor type, classifier type, and classification strategy.
The boxplots for different sensor types are illustrated in Figure 18 to determine the effects of this parameter on overall accuracy. The highest median overall accuracy was achieved by visible sensors (92.9%). Three studies utilized LiDAR data, achieving the lowest median overall accuracy (70%). The hyperspectral boxplot covers a broader range of overall classification accuracy, from 68% to 100%, with a lower median of about 85%, compared to the multispectral boxplot, with minimum, maximum, and median overall accuracies of 69%, 99%, and 89.4%, respectively. The studies reviewed in this paper illustrate that the visible sensor (n = 93) is a popular sensor technology that achieved very accurate classification results. For instance, overall accuracy increased by up to 15% for vegetation mapping when a high-resolution visible sensor was used [101].
Visible cameras can generate higher spatial resolution imagery with larger coverage than multispectral and hyperspectral cameras. The visible information can be used for more accurate classification in agricultural applications at a much higher spatial/temporal resolution [9]. This imaging technology plays an important role in improving the ability to differentiate tree species composition, and high-quality visible cameras ensure low-noise data and good photogrammetric products for classification [36]. On the other hand, UAV hyperspectral images provide more spatial and spectral detail, which improves classification accuracy. For instance, UAV-derived DSM data provided structural and spatial information that increased the capability to separate different mangrove species in [102], and the effectiveness of UAV hyperspectral images in mangrove species classification was verified in [27]. Relative height information, such as airborne LiDAR data, also plays an important role in classification accuracy, and classification experiments indicate that the best feature set is a combination of both geometric and spectral information [67,69].
Figure 19 summarizes the relationship between classifier type and overall accuracy across all 62 classification studies. The median overall accuracies for all classifiers are higher than 85%. Deep learning methods achieved the highest median overall accuracy (94.8%), followed by Maximum Likelihood Classification (MLC), Support Vector Machine (SVM), and Random Forest (RF), with median overall classification accuracies of 91.28%, 90.08%, and 89%, respectively. The KNN method showed a smaller interquartile range (IQR) of overall accuracy than the other methods. The SAM and RF methods showed approximately equal medians of overall accuracy, although the variance of RF was higher. Some classifiers, such as Bhattacharya, K-means, spatial-spectral-location fusion based on conditional random fields (SSLF-CRF), generalized linear models (GLM), and taxon-based classifiers, were excluded from the comparison because they appeared in fewer than three studies.
Recent research findings demonstrate the success of deep learning methods for image classification in a wide range of applications compared to the traditional machine learning methods employed in the remote sensing community [103,104]. RF and SVM classifiers have also gained increasing attention in several studies since 2011 for their accurate classification results [105]. The SVM method has several advantages: it provides a separable pattern by mapping the input data into a higher-dimensional feature space, obtains an optimal solution by finding the global minimum of a convex cost function via quadratic programming, and works well with small quantities of training data [106]. On the other hand, the RF method, which utilizes bootstrap aggregating for image classification, is easy to train, has fewer tuning parameters, and is less sensitive to training data quality [39,40]. An RF-supervised classifier was identified as the most suitable for object-based image analysis (OBIA) in a systematic review analyzing the performance of different classifiers [107].
Moreover, several studies reported that RF can increase overall classification accuracy, even though small quantities of training data may cause misclassification [66]; this is a possible reason for the large variance of the RF boxplot in Figure 19. The KNN method, adopted in only three of the reviewed studies, is known for several advantages, such as requiring few parameters to tune and easy implementation. However, KNN also has drawbacks, such as sensitivity to noise and unbalanced data, which make its distance-based decisions less reliable and may explain the low median accuracy in the KNN boxplot (Figure 19).
Different strategies for implementing remote sensing data classification are shown in Figure 20. Pixel-based classification methods are in the majority but have a lower median overall accuracy (88.82%) than object-based approaches (94.45%). However, the IQR and the difference between the maximum and minimum overall accuracies of the pixel-based boxplot are larger than those of the object-based one.
In pixel-based methods, only the spectral properties of individual pixels are analyzed, while in object-based methods (OBIA), information on the texture of pixels and their neighbours is integrated, pixels are grouped into segments, and the segments are then assigned to classes. The low classification accuracies of pixel-based methods are caused by the "salt and pepper" noise that accompanies the rise of high spatial resolution sensors [108]. Thus, object-based methods outperform pixel-based ones for high-resolution UAV imagery [109]. The superiority of object-based approaches over pixel-based ones in terms of classification performance was demonstrated in [110]. Researchers working with UAV imagery are therefore more inclined to use OBIA methods than traditional pixel-based LULC classification [71].
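The segmentation step at the heart of OBIA can be sketched with scikit-image's SLIC superpixels, after which per-segment statistics replace per-pixel values as classification features. The image path below is hypothetical, and SLIC stands in here for the proprietary segmentation algorithms of packages such as eCognition:

```python
import numpy as np
from skimage import io
from skimage.segmentation import slic

image = io.imread("uav_orthomosaic.png")  # hypothetical RGB orthomosaic

# Group pixels into spectrally homogeneous segments (the "objects").
segments = slic(image, n_segments=500, compactness=10, start_label=1)

# Per-object features: mean value of each band within every segment.
features = np.array([image[segments == s].mean(axis=0)
                     for s in np.unique(segments)])
print(f"{features.shape[0]} objects x {features.shape[1]} spectral features")
```

The resulting per-object feature matrix can then be fed to any of the classifiers discussed above, in place of per-pixel samples.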
From a temporal scope perspective, the median overall accuracy of the multi-temporal boxplot (94.9%) is higher than that of the single-date boxplot (89.9%). However, the variance and IQR of the single-date boxplot are narrower than those of the multi-temporal one (Figure 20). In many studies, multi-temporal UAV images provided better classification accuracy than single-date image classification.

4.9. Regression Performance

Regression problems in remote sensing can be addressed by two complementary families of methods: machine learning and statistical models. Machine learning predicts future events with more sophisticated models, while statistical models can be used for feature selection, provide the statistical significance of regression coefficients, and show the relationships between the data points.
Among all the studies reviewed, 101 papers used regression models with UAV imagery. Figure 21 shows the percentage of different regression models used in these studies. The linear regression model (LRM) and its extensions, such as multiple linear regression (MLR), least squares regression (LSR), and stepwise linear regression (SWL), were the most frequently used models, with an 83% share, followed by the random forest regression (RFR) model with an 11% share.
The linear regression model has several advantages, such as easy implementation and fast computation, which make it popular compared to the other models. The RFR model, adopted in only 11 studies, suffers from the complexity of its algorithm and its long running time [111]; however, it is robust against non-linearity without requiring an assumption about the target distribution. The MLR model, with only a 4% share, is less common; a possible reason is its limitations in predicting and modeling relationships between dependent and independent variables [54].
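For illustration, the sketch below fits the two most common model families side by side on synthetic data, standing in for, e.g., vegetation-index features predicting a biomass-like target; neither the data nor the settings come from any reviewed study:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic stand-in: 4 features with a partly non-linear target.
rng = np.random.default_rng(1)
X = rng.random((300, 4))
y = 2.0 * X[:, 0] + np.sin(3 * X[:, 1]) + 0.1 * rng.standard_normal(300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
for model in (LinearRegression(),
              RandomForestRegressor(n_estimators=200, random_state=1)):
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{type(model).__name__}: RMSE={rmse:.3f}, "
          f"R2={r2_score(y_te, pred):.3f}")
```

The linear model stays interpretable (its coefficients are directly readable), while the random forest typically captures the non-linear term at the cost of longer training, mirroring the trade-off discussed above.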

5. Conclusions

In this study, we conducted a meta-analysis review of 163 articles in the area of remote sensing of UAV and machine learning and statistical models. A summary of our findings is as follows:
  • China and the USA account for the bulk of the UAV research with 27% and 16% usage shares, respectively. However, new opportunities for the processing of UAV data are being provided across the world, particularly in northern European countries.
  • The use of machine learning and statistical models in UAV remote sensing applications has increased since 2014. In particular, most of them were published in 2018-2019 with a 59% share. From the perspective of platform type, hexacopters were the most popular platform with a 30% share, followed by quadcopters, fixed-wings, and octocopters with approximately equal shares of about 25%, 24%, and 19%, respectively.
  • Various remote sensing applications have been used to combine UAV image processing and machine learning and statistical models due to the advantages of these algorithms. The top three UAV applications were agriculture (42%), forestry (22%), and grassland mapping (8%).
  • In terms of sensor type, visible sensor technology (53%) was the most commonly used sensor with the highest overall accuracy (92.9%) among classification articles. Canon was the most popular brand used in this review.
  • From an image overlap perspective, agriculture and grassland applications have the same medians of forward and side overlap, at about 80% and 70%, respectively. The forestry boxplot showed higher medians of forward and side overlap (85%, 73%), and wetland showed lower medians (75%, 70%).
  • Of the case studies presenting utilized processing software (140 studies), 62% used Agisoft PhotoScan to process UAV remote sensing data (80 studies), followed by Pix4D at about 30% (39 studies).
  • Among all studies in this review, 103 studies (62%) utilized ancillary data in their processing.
  • In-situ measurements are common in regression applications with a 62% share (64 studies), compared to classification ones with only 24 studies. On the other hand, satellite images only used for classification accounted for a 4% share.
  • Classification using deep learning methods achieved the highest median overall accuracy (94.8%), followed by MLC, SVM, and RF with 91.28%, 90.08%, and 89% overall accuracy, respectively.
  • Visible sensors achieved the highest median overall accuracy, at about 92.9%, followed by multispectral (89%), hyperspectral (85%), and LiDAR (70%) sensors.
  • Pixel-based classification methods were in the majority but had a lower median overall accuracy (88.82%) than object-based approaches (94.45%). From a temporal scope perspective, multi-temporal studies achieved a higher median overall accuracy (about 94.9%) than single-date studies (about 89.9%).
  • Regression was the primary method used in the reviewed studies, with a 62% share. The most common regression model was linear regression (68%), followed by RF (11%).

Author Contributions

Conceptualization, M.M.; Supervision, M.M.; Formal Analysis, R.E., M.M.; Data Collection, R.E.; Visualization, R.E., M.M.; Writing—Original Draft Preparation, R.E., M.M.; Writing—Review & Editing, R.E., M.M., F.M., B.S., B.B., S.H. All authors have read and agreed to the published version of the manuscript.

Funding

This research was undertaken with financial support of the Canada Centre for Mapping and Earth Observation (CCMEO), Natural Resources Canada (NRCan).

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Parameters cataloged in the meta-analysis.
No.AttributeDescriptionCategories
1TitleTitle of the article
2First author
3Affiliation
4JournalRefereed journal
5Year of publication
6Citation
7ApplicationDisciplinary topic Agriculture; Forestry; Grassland; Soil; Sea ice; Wetland; Water; Marine; Mining; Land cover/Land use; Coastal management; Disaster management
8Method Classification; Regression
9Study areaGeographical location of study area
10Ancillary dataIncluding field measurement or additional data
11Extracted featureFeatures used for classification such as spectral or texture indices
12# extracted featuresNumber of features
13Processing unitClassification processing unitPixel; object
14Assessment indicesClassification or regression accuracy assessment Overall Accuracy (OA); User Accuracy (UA); Producer’s Accuracy (PA); Kappa coefficient; RMSE, R2
15Processing environmentSoftware used for photogrammetry processingPix4DMapper; Agisoft Photoscan
16ML environmentSoftware used for Machine learning and statistical analysisMatlab; R; ENVI; eCognition; Python; SAS, SPSS; ArcGIS
17Control systemFlight planning apps and ground control systems
18Platform NameManufacturer, make, and model
19Platform type Fixed-wing; Helicopter; Quadcopter; Hexacopter; Octocopter
20Platform weightMeasured in kg
21Flight heightMeasured in m
22Temporal Scope Single date; Multi-temporal
23Sensor NameManufacturer, make, and model
24Sensor type Visible; LiDAR; Multispectral; Hyperspectral; Thermal
25Focal length (mm) Distance between the lens and the image sensor
26Image resolution (Pixel)Number of pixels in an image
27Pixel size (µm)The size of each Pixel measured in Micron
28Frame rate (fps)Frame frequency
29Field of view (degree)The angular extent of a given scene that is imaged by a camera
30Forward/Side overlap (%)Image overlap percentage
31GSD (cm)Distance between two consecutive pixel centers measured on the ground
32# GCPs Number of collected ground control points
33GPSGlobal positioning systemDGPS; GPS RTK; GPS PPK
34Calibration methodProcedures for accurate location of images
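
Several of the flight and sensor parameters cataloged in Table A1 (flight height, focal length, pixel size, and GSD) are linked by the standard photogrammetric ground sampling distance relation, GSD = flight height × pixel size / focal length. The sketch below computes it; the example values are assumptions roughly typical of a small-format visible camera, not figures taken from any reviewed study.

```python
# Minimal sketch of the GSD relation linking the Table A1 parameters:
# GSD = flight height x pixel size / focal length (after unit conversion).
def gsd_cm(flight_height_m: float, pixel_size_um: float, focal_length_mm: float) -> float:
    """Ground sampling distance in cm, i.e., the distance between two
    consecutive pixel centers measured on the ground (Table A1, row 31)."""
    # Convert all inputs to millimetres, then the result to centimetres.
    gsd_mm = (flight_height_m * 1000.0) * (pixel_size_um / 1000.0) / focal_length_mm
    return gsd_mm / 10.0

# Assumed example values: 100 m flight height, 2.4 um pixel size,
# 8.8 mm focal length.
print(f"GSD = {gsd_cm(100.0, 2.4, 8.8):.1f} cm")  # -> GSD = 2.7 cm
```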

References

  1. Manfreda, S.; McCabe, M.F.; Miller, P.E.; Lucas, R.; Pajuelo Madrigal, V.; Mallinis, G.; Ben-Dor, E.; Helman, D.; Estes, L.; Ciraolo, G.; et al. On the Use of Unmanned Aerial Systems for Environmental Monitoring. Remote Sens. 2018, 10, 641. [Google Scholar] [CrossRef] [Green Version]
  2. Dabrowska-Zielinska, K.; Budzynska, M.; Malek, I.; Bojanowski, J.; Bochenek, Z.; Lewinski, S. Assessment of crop growth conditions for agri–environment ecosystem for modern landscape management. In Remote Sensing for a Changing Europe, Proceedings of the 28th Symposium of the European Association of Remote Sensing Laboratories, Istanbul, Turkey, 2–5 June 2008; IOS Press: Amsterdam, The Netherlands, 2009. [Google Scholar]
  3. Mulla, D.J. Twenty five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosyst. Eng. 2013, 114, 358–371. [Google Scholar] [CrossRef]
  4. Lillesand, T.; Kiefer, R.W.; Chipman, J. Remote Sensing and Image Interpretation; John Wiley and Sons: Hoboken, NJ, USA, 2015. [Google Scholar]
  5. Müllerová, J.; Brůna, J.; Bartaloš, T.; Dvořák, P.; Vítková, M.; Pyšek, P. Timing Is Important: Unmanned Aircraft vs. Satellite Imagery in Plant Invasion Monitoring. Front. Plant Sci. 2017, 8, 887. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  6. Matese, A.; Toscano, P.; Di Gennaro, S.F.; Genesio, L.; Vaccari, F.P.; Primicerio, J.; Belli, C.; Zaldei, A.; Bianconi, R.; Gioli, B. Intercomparison of UAV, Aircraft and Satellite Remote Sensing Platforms for Precision Viticulture. Remote Sens. 2015, 7, 2971–2990. [Google Scholar] [CrossRef] [Green Version]
  7. Tao, C.V. Mobile mapping technology for road network data acquisition. J. Geospat. Eng. 2000, 2, 1–14. [Google Scholar]
  8. Hunt, J.E.R.; Hively, W.D.; Fujikawa, S.J.; Linden, D.S.; Daughtry, C.S.T.; Mccarty, G.W. Acquisition of NIR-Green-Blue Digital Photographs from Unmanned Aircraft for Crop Monitoring. Remote Sens. 2010, 2, 290–305. [Google Scholar] [CrossRef] [Green Version]
  9. Pádua, L.; Vanko, J.; Hruška, J.; Adão, T.; Sousa, J.J.; Peres, E.; Morais, R. UAS, sensors, and data processing in agroforestry: A review towards practical applications. Int. J. Remote Sens. 2017, 38, 2349–2391. [Google Scholar] [CrossRef]
  10. Zheng, H.; Li, W.; Jiang, J.; Liu, Y.; Cheng, T.; Tian, Y.; Zhu, Y.; Cao, W.; Zhang, Y.; Yao, X. A Comparative Assessment of Different Modeling Algorithms for Estimating Leaf Nitrogen Content in Winter Wheat Using Multispectral Images from an Unmanned Aerial Vehicle. Remote Sens. 2018, 10, 2026. [Google Scholar] [CrossRef] [Green Version]
  11. Song, Y.; Wang, J. Winter Wheat Canopy Height Extraction from UAV-Based Point Cloud Data with a Moving Cuboid Filter. Remote Sens. 2019, 11, 1239. [Google Scholar] [CrossRef] [Green Version]
  12. Zhang, S.; Zhao, G.; Lang, K.; Su, B.; Chen, X.; Xi, X.; Zhang, H. Integrated Satellite, Unmanned Aerial Vehicle (UAV) and Ground Inversion of the SPAD of Winter Wheat in the Reviving Stage. Sensors 2019, 19, 1485. [Google Scholar] [CrossRef] [Green Version]
  13. Buchaillot, M.L.; Gracia-Romero, A.; Vergara-Diaz, O.; Zaman-Allah, M.A.; Tarekegne, A.; Cairns, J.E.; Prasanna, B.M.; Araus, J.L.; Kefauver, S.C. Evaluating Maize Genotype Performance under Low Nitrogen Conditions Using RGB UAV Phenotyping Techniques. Sensors 2019, 19, 1815. [Google Scholar] [CrossRef] [Green Version]
  14. Thorp, K.; Thompson, A.L.; Harders, S.J.; French, A.; Ward, R.W. High-Throughput Phenotyping of Crop Water Use Efficiency via Multispectral Drone Imagery and a Daily Soil Water Balance Model. Remote Sens. 2018, 10, 1682. [Google Scholar] [CrossRef] [Green Version]
  15. Schirrmann, M.; Giebel, A.; Gleiniger, F.; Pflanz, M.; Lentschke, J.; Dammer, K.-H. Monitoring Agronomic Parameters of Winter Wheat Crops with Low-Cost UAV Imagery. Remote Sens. 2016, 8, 706. [Google Scholar] [CrossRef] [Green Version]
  16. Ampatzidis, Y.; Partel, V. UAV-Based High Throughput Phenotyping in Citrus Utilizing Multispectral Imaging and Artificial Intelligence. Remote Sens. 2019, 11, 410. [Google Scholar] [CrossRef] [Green Version]
  17. Zhou, X.; Zheng, H.; Xu, X.; He, J.; Ge, X.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.; Tian, Y. Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. ISPRS J. Photogramm. Remote Sens. 2017, 130, 246–255. [Google Scholar] [CrossRef]
  18. Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, X.; et al. Unmanned Aerial Vehicle Remote Sensing for Field-Based Crop Phenotyping: Current Status and Perspectives. Front. Plant Sci. 2017, 8, 1111. [Google Scholar] [CrossRef]
  19. Enciso, J.; Avila, C.A.; Jung, J.; Elsayed-Farag, S.; Chang, A.; Yeom, J.; Landivar, J.; Maeda, M.; Chavez, J.C. Validation of agronomic UAV and field measurements for tomato varieties. Comput. Electron. Agric. 2019, 158, 278–283. [Google Scholar] [CrossRef]
  20. Thenkabail, P.S.; Smith, R.B.; De Pauw, E. Hyperspectral Vegetation Indices and Their Relationships with Agricultural Crop Characteristics. Remote Sens. Environ. 2000, 71, 158–182. [Google Scholar] [CrossRef]
  21. Mohammadimanesh, F.; Salehi, B.; Mahdianpari, M.; Brisco, B.; Motagh, M. Wetland Water Level Monitoring Using Interferometric Synthetic Aperture Radar (InSAR): A Review. Can. J. Remote Sens. 2018, 44, 247–262. [Google Scholar] [CrossRef]
  22. Su, T.-C. A study of a matching pixel by pixel (MPP) algorithm to establish an empirical model of water quality mapping, as based on unmanned aerial vehicle (UAV) images. Int. J. Appl. Earth Obs. Geoinf. 2017, 58, 213–224. [Google Scholar] [CrossRef]
  23. Jonassen, M.O.; Tisler, P.; Altstädter, B.; Scholtz, A.; Vihma, T.; Lampert, A.; König-Langlo, G.; Lüpkes, C. Application of remotely piloted aircraft systems in observing the atmospheric boundary layer over Antarctic sea ice in winter. Polar Res. 2015, 34, 25651. [Google Scholar] [CrossRef] [Green Version]
  24. Gonçalves, J.; Henriques, R. UAV photogrammetry for topographic monitoring of coastal areas. ISPRS J. Photogramm. Remote Sens. 2015, 104, 101–111. [Google Scholar] [CrossRef]
  25. Mahdianpari, M.; Salehi, B.; Mohammadimanesh, F.; Larsen, G.; Peddle, D.R. Mapping land-based oil spills using high spatial resolution unmanned aerial vehicle imagery and electromagnetic induction survey data. J. Appl. Remote Sens. 2018, 12, 1. [Google Scholar] [CrossRef]
  26. Honkavaara, E.; Eskelinen, M.A.; Pölönen, I.; Saari, H.; Ojanen, H.; Mannila, R.; Holmlund, C.; Hakala, T.; Litkey, P.; Rosnell, T.; et al. Remote Sensing of 3-D Geometry and Surface Moisture of a Peat Production Area Using Hyperspectral Frame Cameras in Visible to Short-Wave Infrared Spectral Ranges Onboard a Small Unmanned Airborne Vehicle (UAV). IEEE Trans. Geosci. Remote Sens. 2016, 54, 5440–5454. [Google Scholar] [CrossRef] [Green Version]
  27. Ge, X.; Wang, J.; Ding, J.; Cao, X.; Zhang, Z.; Liu, J.; Li, X. Combining UAV-based hyperspectral imagery and machine learning algorithms for soil moisture content monitoring. PeerJ 2019, 7, e6926. [Google Scholar] [CrossRef]
  28. Peng, J.; Biswas, A.; Jiang, Q.; Zhao, R.; Hu, J.; Hu, B.; Shi, Z. Estimating soil salinity from remote sensing and terrain data in southern Xinjiang Province, China. Geoderma 2019, 337, 1309–1319. [Google Scholar] [CrossRef]
  29. Wang, S.; Garcia, M.; Ibrom, A.; Jakobsen, J.; Köppl, C.J.; Mallick, K.; Looms, M.C.; Bauer-Gottwein, P. Mapping Root-Zone Soil Moisture Using a Temperature–Vegetation Triangle Approach with an Unmanned Aerial System: Incorporating Surface Roughness from Structure from Motion. Remote Sens. 2018, 10, 1978. [Google Scholar] [CrossRef] [Green Version]
  30. Al-Rawabdeh, A.; He, F.; Moussa, A.; El-Sheimy, N.; Habib, A. Using an Unmanned Aerial Vehicle-Based Digital Imaging System to Derive a 3D Point Cloud for Landslide Scarp Recognition. Remote Sens. 2016, 8, 95. [Google Scholar] [CrossRef] [Green Version]
  31. Jayathunga, S.; Owari, T.; Tsuyuki, S. The use of fixed–wing UAV photogrammetry with LiDAR DTM to estimate merchantable volume and carbon stock in living biomass over a mixed conifer–broadleaf forest. Int. J. Appl. Earth Obs. Geoinf. 2018, 73, 767–777. [Google Scholar] [CrossRef]
  32. Yuan, C.; Zhang, Y.; Liu, Z. A survey on technologies for automatic forest fire monitoring, detection, and fighting using unmanned aerial vehicles and remote sensing techniques. Can. J. For. Res. 2015, 45, 783–792. [Google Scholar] [CrossRef]
  33. Michez, A.; Piégay, H.; Lisein, J.; Claessens, H.; Lejeune, P. Classification of riparian forest species and health condition using multi-temporal and hyperspatial imagery from unmanned aerial system. Environ. Monit. Assess. 2016, 188, 1–19. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  34. Garza, B.N.; Ancona, V.; Enciso, J.; Perotto-Baldiviezo, H.; Kunta, M.; Simpson, C. Quantifying Citrus Tree Health Using True Color UAV Images. Remote Sens. 2020, 12, 170. [Google Scholar] [CrossRef] [Green Version]
  35. Garcia-Ruiz, F.; Sankaran, S.; Maja, J.M.; Lee, W.S.; Rasmussen, J.; Ehsani, R. Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Comput. Electron. Agric. 2013, 91, 106–115. [Google Scholar] [CrossRef]
  36. Holloway, J.; Mengersen, K. Statistical Machine Learning Methods and Remote Sensing for Sustainable Development Goals: A Review. Remote Sens. 2018, 10, 1365. [Google Scholar] [CrossRef] [Green Version]
  37. Chlingaryan, A.; Sukkarieh, S.; Whelan, B. Machine learning approaches for crop yield prediction and nitrogen status estimation in precision agriculture: A review. Comput. Electron. Agric. 2018, 151, 61–69. [Google Scholar] [CrossRef]
  38. Lu, B.; He, Y. Species classification using Unmanned Aerial Vehicle (UAV)-acquired high spatial resolution imagery in a heterogeneous grassland. ISPRS J. Photogramm. Remote Sens. 2017, 128, 73–85. [Google Scholar] [CrossRef]
  39. Gislason, P.O.; Benediktsson, J.A.; Sveinsson, J.R. Random Forests for land cover classification. Pattern Recognit. Lett. 2006, 27, 294–300. [Google Scholar] [CrossRef]
  40. Mahdianpari, M.; Salehi, B.; Mohammadimanesh, F.; Motagh, M. Random forest wetland classification using ALOS-2 L-band, RADARSAT-2 C-band, and TerraSAR-X imagery. ISPRS J. Photogramm. Remote Sens. 2017, 130, 13–31. [Google Scholar] [CrossRef]
  41. Nevalainen, O.; Honkavaara, E.; Tuominen, S.; Viljanen, N.; Hakala, T.; Yu, X.; Hyyppä, J.; Saari, H.; Pölönen, I.; Imai, N.N.; et al. Individual Tree Detection and Classification with UAV-Based Photogrammetric Point Clouds and Hyperspectral Imaging. Remote Sens. 2017, 9, 185. [Google Scholar] [CrossRef] [Green Version]
  42. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  43. Lovitt, J.; Rahman, M.M.; McDermid, G.J. Assessing the Value of UAV Photogrammetry for Characterizing Terrain in Complex Peatlands. Remote Sens. 2017, 9, 715. [Google Scholar] [CrossRef] [Green Version]
  44. Bian, J.; Zhang, Z.; Chen, J.; Chen, H.; Cui, C.; Li, X.; Chen, S.; Fu, Q. Simplified Evaluation of Cotton Water Stress Using High Resolution Unmanned Aerial Vehicle Thermal Imagery. Remote Sens. 2019, 11, 267. [Google Scholar] [CrossRef] [Green Version]
  45. Vanegas, F.; Bratanov, D.; Powell, K.; Weiss, J.; Gonzalez, F. A Novel Methodology for Improving Plant Pest Surveillance in Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data. Sensors 2018, 18, 260. [Google Scholar] [CrossRef] [Green Version]
  46. Hu, J.; Peng, J.; Zhou, Y.; Xu, D.; Zhao, R.; Jiang, Q.; Fu, T.; Wang, F.; Shi, Z. Quantitative Estimation of Soil Salinity Using UAV-Borne Hyperspectral and Satellite Multispectral Images. Remote Sens. 2019, 11, 736. [Google Scholar] [CrossRef] [Green Version]
  47. Salamí, E.; Barrado, C.; Pastor, E. UAV Flight Experiments Applied to the Remote Sensing of Vegetated Areas. Remote Sens. 2014, 6, 11051–11081. [Google Scholar] [CrossRef] [Green Version]
  48. Poley, L.G.; McDermid, G.J. A Systematic Review of the Factors Influencing the Estimation of Vegetation Aboveground Biomass Using Unmanned Aerial Systems. Remote Sens. 2020, 12, 1052. [Google Scholar] [CrossRef] [Green Version]
  49. Torresan, C.; Berton, A.; Carotenuto, F.; Di Gennaro, S.F.; Gioli, B.; Matese, A.; Miglietta, F.; Vagnoli, C.; Zaldei, A.; Wallace, L. Forestry applications of UAVs in Europe: A review. Int. J. Remote Sens. 2016, 38, 2427–2447. [Google Scholar] [CrossRef]
  50. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J.J. Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef] [Green Version]
  51. Yao, H.; Wen, B.; Chen, X. Unmanned Aerial Vehicle for Remote Sensing Applications—A Review. Remote Sens. 2019, 11, 1443. [Google Scholar] [CrossRef] [Green Version]
  52. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712. [Google Scholar] [CrossRef]
  53. Singh, K.K.; Frazier, A.E. A meta-analysis and review of unmanned aircraft system (UAS) imagery for terrestrial applications. Int. J. Remote Sens. 2018, 39, 5078–5098. [Google Scholar] [CrossRef]
  54. Tmušić, G.; Manfreda, S.; Aasen, H.; James, M.R.; Gonçalves, G.; Ben Dor, E.; Brook, A.; Polinova, M.; Arranz, J.J.; Mészáros, J.; et al. Current Practices in UAS-based Environmental Monitoring. Remote Sens. 2020, 12, 1001. [Google Scholar] [CrossRef] [Green Version]
  55. Watts, A.C.; Ambrosia, V.G.; Hinkley, E.A. Unmanned Aircraft Systems in Remote Sensing and Scientific Research: Classification and Considerations of Use. Remote Sens. 2012, 4, 1671–1692. [Google Scholar] [CrossRef] [Green Version]
  56. Colomina, I.; Molina, P. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2014, 92, 79–97. [Google Scholar] [CrossRef] [Green Version]
  57. Toth, C.; Jóźków, G. Remote sensing platforms and sensors: A survey. ISPRS J. Photogramm. Remote Sens. 2016, 115, 22–36. [Google Scholar] [CrossRef]
  58. Bhardwaj, A.; Sam, L.; Akanksha; Martín-Torres, F.J.; Kumar, R. UAVs as remote sensing platform in glaciology: Present applications and future prospects. Remote Sens. Environ. 2016, 175, 196–204. [Google Scholar] [CrossRef]
  59. Gaffey, C.; Bhardwaj, A. Applications of Unmanned Aerial Vehicles in Cryosphere: Latest Advances and Prospects. Remote Sens. 2020, 12, 948. [Google Scholar] [CrossRef] [Green Version]
  60. Mesas-Carrascosa, F.-J.; Torres-Sánchez, J.; Clavero-Rumbao, I.; García-Ferrer, A.; Peña, J.-M.; Borra-Serrano, I.; López-Granados, F. Assessing Optimal Flight Parameters for Generating Accurate Multispectral Orthomosaicks by UAV to Support Site-Specific Crop Management. Remote Sens. 2015, 7, 12793–12814. [Google Scholar] [CrossRef] [Green Version]
  61. Tu, Y.-H.; Phinn, S.; Johansen, K.; Robson, A. Assessing Radiometric Correction Approaches for Multi-Spectral UAS Imagery for Horticultural Applications. Remote Sens. 2018, 10, 1684. [Google Scholar] [CrossRef] [Green Version]
  62. Aasen, H.; Honkavaara, E.; Lucieer, A.; Zarco-Tejada, P.J. Quantitative Remote Sensing at Ultra-High Resolution with UAV Spectroscopy: A Review of Sensor Technology, Measurement Procedures, and Data Correction Workflows. Remote Sens. 2018, 10, 1091. [Google Scholar] [CrossRef] [Green Version]
  63. Mlambo, R.; Woodhouse, I.H.; Gerard, F.; Anderson, K. Structure from Motion (SfM) Photogrammetry with Drone Data: A Low Cost Method for Monitoring Greenhouse Gas Emissions from Forests in Developing Countries. Forests 2017, 8, 68. [Google Scholar] [CrossRef] [Green Version]
  64. Díaz-Varela, R.A.; De La Rosa, R.; León, L.; Zarco-Tejada, P.J. High-Resolution Airborne UAV Imagery to Assess Olive Tree Crown Parameters Using 3D Photo Reconstruction: Application in Breeding Trials. Remote Sens. 2015, 7, 4213–4232. [Google Scholar] [CrossRef] [Green Version]
  65. De Castro, A.I.; Peña, J.-M.; Torres-Sánchez, J.; Jiménez-Brenes, F.M.; Valencia-Gredilla, F.; Recasens, J.; López-Granados, F. Mapping Cynodon Dactylon Infesting Cover Crops with an Automatic Decision Tree-OBIA Procedure and UAV Imagery for Precision Viticulture. Remote Sens. 2019, 12, 56. [Google Scholar] [CrossRef] [Green Version]
  66. Belgiu, M.; Drăguţ, L. Random forest in remote sensing: A review of applications and future directions. ISPRS J. Photogramm. Remote Sens. 2016, 114, 24–31. [Google Scholar] [CrossRef]
  67. Hassan-Esfahani, L.; Torres-Rua, A.; Jensen, A.; McKee, M. Assessment of Surface Soil Moisture Using High-Resolution Multi-Spectral Imagery and Artificial Neural Networks. Remote Sens. 2015, 7, 2627–2646. [Google Scholar] [CrossRef] [Green Version]
  68. Tavus, M.R.; Eker, M.E.; Şenyer, N.; Karabulut, B. Plant counting by using k-NN classification on UAVs images. In Proceedings of the 23rd Signal Processing and Communications Applications Conference (SIU), Malatya, Turkey, 16–19 May 2015; pp. 1058–1061. [Google Scholar]
  69. Yuan, H.; Yang, G.; Li, C.; Wang, Y.; Liu, J.; Yu, H.; Feng, H.; Xu, B.; Zhao, X.; Yang, X. Retrieving Soybean Leaf Area Index from Unmanned Aerial Vehicle Hyperspectral Remote Sensing: Analysis of RF, ANN, and SVM Regression Models. Remote Sens. 2017, 9, 309. [Google Scholar] [CrossRef] [Green Version]
  70. Jaakkola, A.; Hyyppä, J.; Yu, X.; Kukko, A.; Kaartinen, H.; Liang, X.; Hyyppä, H.; Wang, Y. Autonomous Collection of Forest Field Reference—The Outlook and a First Step with UAV Laser Scanning. Remote Sens. 2017, 9, 785. [Google Scholar] [CrossRef] [Green Version]
  71. Chen, J.; Yi, S.; Qin, Y.; Wang, X. Improving estimates of fractional vegetation cover based on UAV in alpine grassland on the Qinghai–Tibetan Plateau. Int. J. Remote Sens. 2016, 37, 1922–1936. [Google Scholar] [CrossRef]
  72. Jing, R.; Gong, Z.; Zhao, W.; Pu, R.; Deng, L. Above-bottom biomass retrieval of aquatic plants with regression models and SfM data acquired by a UAV platform—A case study in Wild Duck Lake Wetland, Beijing, China. ISPRS J. Photogramm. Remote Sens. 2017, 134, 122–134. [Google Scholar] [CrossRef]
  73. Yuan, W.; Li, J.; Bhatta, M.; Shi, Y.; Baenziger, P.S.; Ge, Y. Wheat Height Estimation Using LiDAR in Comparison to Ultrasonic Sensor and UAS. Sensors 2018, 18, 3731. [Google Scholar] [CrossRef] [Green Version]
  74. Congalton, R.G. Remote sensing and geographic information system data integration: Error sources and Research Issues. Photogramm. Eng. Remote Sens. 1991, 57, 677–687. [Google Scholar]
  75. Ramezan, C.A.; Warner, T.A.; Maxwell, A.E. Evaluation of Sampling and Cross-Validation Tuning Strategies for Regional-Scale Machine Learning Classification. Remote Sens. 2019, 11, 185. [Google Scholar] [CrossRef] [Green Version]
  76. Congalton, R.G. A review of assessing the accuracy of classifications of remotely sensed data. Remote Sens. Environ. 1991, 37, 35–46. [Google Scholar] [CrossRef]
  77. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Ann. Intern. Med. 2009, 151, 264–269. [Google Scholar] [CrossRef] [Green Version]
  78. Kraaijenbrink, P.D. High-resolution insights into the dynamics of Himalayan debris-covered glaciers. In Proceedings of the AGU Fall Meeting Abstracts, Washington, DC, USA, 10–14 December 2018. [Google Scholar]
  79. Barnas, A.F.; Felege, C.J.; Rockwell, R.F.; Ellis-Felege, S.N. A pilot(less) study on the use of an unmanned aircraft system for studying polar bears (Ursus maritimus). Polar Biol. 2018, 41, 1055–1062. [Google Scholar] [CrossRef]
  80. Rossini, M.; Di Mauro, B.; Garzonio, R.; Baccolo, G.; Cavallini, G.; Mattavelli, M.; De Amicis, M.; Colombo, R. Rapid melting dynamics of an alpine glacier with repeated UAV photogrammetry. Geomorphology 2018, 304, 159–172. [Google Scholar] [CrossRef]
  81. Romero, M.; Luo, Y.; Su, B.; Fuentes, S. Vineyard water status estimation using multispectral imagery from an UAV platform and machine learning algorithms for irrigation scheduling management. Comput. Electron. Agric. 2018, 147, 109–117. [Google Scholar] [CrossRef]
  82. Aboutalebi, M.; Allen, L.N.; Torres-Rua, A.F.; McKee, M.; Coopmans, C. Estimation of soil moisture at different soil levels using machine learning techniques and unmanned aerial vehicle (UAV) multispectral imagery. In Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV; International Society for Optics and Photonics: Baltimore, MD, USA, 2019; Volume 11008, p. 110080S. [Google Scholar]
  83. Brieger, F.; Herzschuh, U.; Pestryakova, L.A.; Bookhagen, B.; Zakharov, E.S.; Kruse, S. Advances in the Derivation of Northeast Siberian Forest Metrics Using High-Resolution UAV-Based Photogrammetric Point Clouds. Remote Sens. 2019, 11, 1447. [Google Scholar] [CrossRef] [Green Version]
  84. Stuart, M.B.; Mcgonigle, A.J.S.; Willmott, J.R. Hyperspectral Imaging in Environmental Monitoring: A Review of Recent Developments and Technological Advances in Compact Field Deployable Systems. Sensors 2019, 19, 3071. [Google Scholar] [CrossRef] [Green Version]
  85. Elsner, P.; Dornbusch, U.; Thomas, I.; Amos, D.; Bovington, J.; Horn, D. Coincident beach surveys using UAS, vehicle mounted and airborne laser scanner: Point cloud inter-comparison and effects of surface type heterogeneity on elevation accuracies. Remote Sens. Environ. 2018, 208, 15–26. [Google Scholar] [CrossRef]
  86. DJI—The Market Leader in Civil Drone and Aerial Imaging Technology. Available online: https://www.dji.com/de (accessed on 8 June 2020).
  87. Mesas-Carrascosa, F.; Rumbao, I.C.; Torres-Sánchez, J.; García-Ferrer, A.; Peña, J.M.; Granados, F.L. Accurate ortho-mosaicked six-band multispectral UAV images as affected by mission planning for precision agriculture proposes. Int. J. Remote Sens. 2016, 38, 2161–2176. [Google Scholar] [CrossRef]
  88. Torres-Sánchez, J.; López-Granados, F.; Borra-Serrano, I.; Peña, J.M. Assessing UAV-collected image overlap influence on computation time and digital surface model accuracy in olive orchards. Precis. Agric. 2017, 19, 115–133. [Google Scholar] [CrossRef]
  89. Seifert, E.; Seifert, S.; Vogt, H.; Drew, D.M.; Van Aardt, J.; Kunneke, A.; Seifert, T. Influence of Drone Altitude, Image Overlap, and Optical Sensor Resolution on Multi-View Reconstruction of Forest Images. Remote. Sens. 2019, 11, 1252. [Google Scholar] [CrossRef] [Green Version]
  90. Isacsson, M. Snow layer mapping by remote sensing from Unmanned Aerial Vehicles: A mixed method study of sensor applications for research in Arctic and Alpine environments. Master's Thesis, Royal Institute of Technology, Stockholm, Sweden, 2018. [Google Scholar]
  91. Agisoft LLC. Agisoft Metashape User Manual: Professional Edition, Version 1.5; Agisoft LLC: St. Petersburg, Russia, 2019. [Google Scholar]
  92. James, M.; Robson, S.; D’Oleire-Oltmanns, S.; Niethammer, U. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment. Geomorphology 2017, 280, 51–66. [Google Scholar] [CrossRef] [Green Version]
  93. Wang, C.; Myint, S.W. A Simplified Empirical Line Method of Radiometric Calibration for Small Unmanned Aircraft Systems-Based Remote Sensing. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 8, 1876–1885. [Google Scholar] [CrossRef]
  94. Stöcker, C.; Nex, F.; Koeva, M.; Gerke, M. Quality Assessment of Combined Imu/Gnss Data for Direct Georeferencing in the Context Of Uav-Based Mapping. ISPRS Int. Arch. Photogramm. Remote Sens. 2017, 42, 355–361. [Google Scholar] [CrossRef] [Green Version]
  95. Padró, J.-C.; Muñoz, F.-J.; Planas, J.; Pons, X. Comparison of four UAV georeferencing methods for environmental monitoring purposes focusing on the combined use with airborne and satellite remote sensing platforms. Int. J. Appl. Earth Obs. Geoinf. 2019, 75, 130–140. [Google Scholar] [CrossRef]
  96. Zhu, X.; Meng, L.; Zhang, Y.; Weng, Q.; Morris, J. Tidal and Meteorological Influences on the Growth of Invasive Spartina alterniflora: Evidence from UAV Remote Sensing. Remote Sens. 2019, 11, 1208. [Google Scholar] [CrossRef] [Green Version]
  97. Collin, A.; Ramambason, C.; Pastol, Y.; Casella, E.; Rovere, A.; Thiault, L.; Espiau, B.; Siu, G.; Lerouvreur, F.; Nakamura, N.; et al. Very high resolution mapping of coral reef state using airborne bathymetric LiDAR surface-intensity and drone imagery. Int. J. Remote Sens. 2018, 39, 5676–5688. [Google Scholar] [CrossRef] [Green Version]
  98. Tian, J.; Wang, L.; Li, X.; Gong, H.; Shi, C.; Zhong, R.; Liu, X. Comparison of UAV and WorldView-2 imagery for mapping leaf area index of mangrove forest. Int. J. Appl. Earth Obs. Geoinf. 2017, 61, 22–31. [Google Scholar] [CrossRef]
  99. Gruszczyński, W.; Matwij, W.; Ćwiąkała, P. Comparison of low-altitude UAV photogrammetry with terrestrial laser scanning as data-source methods for terrain covered in low vegetation. ISPRS J. Photogramm. Remote Sens. 2017, 126, 168–179. [Google Scholar] [CrossRef]
  100. Seier, G.; Stangl, J.; Schöttl, S.; Sulzer, W.; Sass, O. UAV and TLS for monitoring a creek in an alpine environment, Styria, Austria. Int. J. Remote Sens. 2017, 38, 2903–2920. [Google Scholar] [CrossRef]
  101. Komárek, J.; Klouček, T.; Prosek, J. The potential of Unmanned Aerial Systems: A tool towards precision classification of hard-to-distinguish vegetation types? Int. J. Appl. Earth Obs. Geoinf. 2018, 71, 9–19. [Google Scholar] [CrossRef]
  102. Cruzan, M.B.; Weinstein, B.G.; Grasty, M.R.; Kohrn, B.F.; Hendrickson, E.C.; Arredondo, T.M.; Thompson, P.G. Small Unmanned Aerial Vehicles (Micro-Uavs, Drones) in Plant Ecology. Appl. Plant Sci. 2016, 4, 1600041. [Google Scholar] [CrossRef]
  103. Mahdianpari, M.; Salehi, B.; Rezaee, M.; Mohammadimanesh, F.; Zhang, Y. Very Deep Convolutional Neural Networks for Complex Land Cover Mapping Using Multispectral Remote Sensing Imagery. Remote Sens. 2018, 10, 1119. [Google Scholar] [CrossRef] [Green Version]
  104. Rezaee, M.; Mahdianpari, M.; Zhang, Y.; Salehi, B. Deep Convolutional Neural Network for Complex Wetland Classification Using Optical Remote Sensing Imagery. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 3030–3039. [Google Scholar] [CrossRef]
  105. Duro, D.C.; Franklin, S.E.; Dubé, M.G. A comparison of pixel-based and object-based image analysis with selected machine learning algorithms for the classification of agricultural landscapes using SPOT-5 HRG imagery. Remote Sens. Environ. 2012, 118, 259–272. [Google Scholar] [CrossRef]
  106. Mountrakis, G.; Im, J.; Ogole, C. Support vector machines in remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 2011, 66, 247–259. [Google Scholar] [CrossRef]
  107. Li, M.; Ma, L.; Blaschke, T.; Cheng, L.; Tiede, D. A systematic comparison of different object-based classification techniques using high spatial resolution imagery in agricultural environments. Int. J. Appl. Earth Obs. Geoinf. 2016, 49, 87–98. [Google Scholar] [CrossRef]
  108. Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16. [Google Scholar] [CrossRef] [Green Version]
  109. Guo, Q.; Kelly, M.; Gong, P.; Liu, D. An Object-Based Classification Approach in Mapping Tree Mortality Using High Spatial Resolution Imagery. GISci. Remote Sens. 2007, 44, 24–47. [Google Scholar] [CrossRef]
  110. Pérez-Ortiz, M.; Gutiérrez, P.A.; Peña, J.M.; Torres-Sánchez, J.; Hervás-Martínez, C.; López-Granados, F. An Experimental Comparison for the Identification of Weeds in Sunflower Crops via Unmanned Aerial Vehicles and Object-Based Analysis. In Lecture Notes in Computer Science; Springer Science and Business Media LLC: Berlin, Germany, 2015; Volume 9094, pp. 252–262. [Google Scholar]
  111. Forkuor, G.; Hounkpatin, O.K.L.; Welp, G.; Thiel, M. High Resolution Mapping of Soil Properties Using Remote Sensing Variables in South-Western Burkina Faso: A Comparison of Machine Learning and Multiple Linear Regression Models. PLoS ONE 2017, 12, e0170478. [Google Scholar] [CrossRef]
Figure 1. Data processing workflow.
Figure 2. Photogrammetric processing workflow.
Figure 3. PRISMA flow diagram for case study selection.
Figure 4. Frequency of keywords used within the total searched papers (tag cloud). The bigger the font size, the higher the frequency of use.
Figure 5. The number of relevant published articles per journal. Only journals with more than three publications are shown. Remote Sens.: Remote Sensing; IJRS: International Journal of Remote Sensing; IJAEOG: International Journal of Applied Earth Observation and Geoinformation; ISPRS JPRS: ISPRS Journal of Photogrammetry and Remote Sensing; IEEE J-STARS: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing; IJGI: ISPRS International Journal of Geo-Information; Remote Sens. of Environ.: Remote Sensing of Environment; IEEE TGRS: IEEE Transactions on Geoscience and Remote Sensing; Remote Sens. Lett.: Remote Sensing Letters.
Figure 6. World distribution of relevant studies.
Figure 7. Frequency of platform types used in articles that combined machine learning or statistical models with UAV imagery over the period 2000–2019.
Figure 8. The number of studies per agro-environmental monitoring application. Purple bars represent studies with classification purposes; green bars represent studies with regression purposes.
Figure 9. (a) Image spatial resolution vs. sensor type; (b) number of studies per sensor technology.
Figure 10. Preferred brands of sensors used in the surveyed studies, with percentages and counts for each brand.
Figure 11. Preferred brands of UAV platforms used in the surveyed studies, with percentages and counts for each brand.
Figure 12. Forward overlap distribution of UAV imagery for different applications.
Figure 13. Side overlap distribution of UAV imagery for different applications.
Figure 14. Percentage of case studies by photogrammetry processing software.
Figure 15. Number of studies by machine learning and statistical software.
Figure 16. Ground sampling distance (GSD) vs. flight height for different platform types.
Figure 17. Percentages of ancillary data types in regression and classification studies.
Figure 18. Overall accuracy vs. sensor type.
Figure 19. Overall accuracy of different classifiers.
Figure 20. Overall accuracy vs. remote sensing strategies. The red bar shows the median overall accuracy per classification strategy.
Figure 21. Regression models used in 101 studies.
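
As Figure 21 and the conclusions indicate, linear regression and random forest were the dominant regression models among the reviewed studies. The following minimal, hypothetical sketch (synthetic data; all settings are assumptions) shows how two such models are commonly compared using the RMSE and R2 indices cataloged in Table A1.

```python
# Minimal hypothetical sketch: comparing the two most common regression
# models in this review (linear regression, random forest) with RMSE and R2.
# Synthetic data throughout; nothing here reproduces a reviewed study.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(500, 5))  # stand-in for spectral indices
y = 3.0 * X[:, 0] + np.sin(6.0 * X[:, 1]) + rng.normal(0.0, 0.2, 500)  # stand-in target (e.g., biomass)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

for name, model in [("Linear regression", LinearRegression()),
                    ("Random forest", RandomForestRegressor(n_estimators=200, random_state=0))]:
    y_pred = model.fit(X_train, y_train).predict(X_test)
    rmse = mean_squared_error(y_test, y_pred) ** 0.5
    print(f"{name}: RMSE = {rmse:.3f}, R2 = {r2_score(y_test, y_pred):.3f}")
```
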
Table 1. Related review studies on Unmanned Aerial Vehicle (UAV) remote sensing.

| No. | Title | Ref | Year | Journal | Content |
| --- | --- | --- | --- | --- | --- |
| 1 | UAS, sensors, and data processing in agroforestry: A review towards practical applications | [9] | 2017 | IJRS | A review of technological advancements in UAVs and imaging sensors for agroforestry applications |
| 2 | Unmanned Aircraft Systems in Remote Sensing and Scientific Research: Classification and Considerations of Use | [55] | 2012 | Remote Sens. | A review of UAV platform characteristics, applications, and regulations |
| 3 | Unmanned aerial systems for photogrammetry and remote sensing: A review | [56] | 2014 | ISPRS JPRS | A review of recent developments in unmanned aircraft, sensing, navigation, orientation, and general data processing for UAS photogrammetry and remote sensing |
| 4 | UAV Flight Experiments Applied to the Remote Sensing of Vegetated Areas | [47] | 2014 | Remote Sens. | A review of UAV remote sensing applications in vegetated-area monitoring |
| 5 | Remote sensing platforms and sensors: A survey | [57] | 2015 | ISPRS JPRS | A review of remote sensing technologies, platforms, and sensors |
| 6 | UAVs as remote sensing platform in glaciology: Present applications and future prospects | [58] | 2016 | Remote Sens. of Environ. | A review of polar and alpine applications of UAVs |
| 7 | A meta-analysis and review of unmanned aircraft system (UAS) imagery for terrestrial applications | [53] | 2017 | IJRS | A meta-analysis of techniques and procedures used in terrestrial remote sensing applications |
| 8 | Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry | [50] | 2017 | Remote Sens. | A review of UAV-based hyperspectral remote sensing for agriculture and forestry |
| 9 | Forestry applications of UAVs in Europe: A review | [49] | 2017 | IJRS | A review of UAV-based forestry applications and the regulatory framework for UAV operation in the European Union |
| 10 | On the Use of Unmanned Aerial Systems for Environmental Monitoring | [1] | 2018 | Remote Sens. | An overview of applications of UAS in natural and agricultural ecosystem monitoring |
| 11 | Unmanned Aerial Vehicle for Remote Sensing Applications—A Review | [51] | 2019 | Remote Sens. | A review of UAV remote sensing data processing and applications |
| 12 | A Systematic Review of the Factors Influencing the Estimation of Vegetation Aboveground Biomass Using Unmanned Aerial Systems | [48] | 2020 | Remote Sens. | A systematic review of UAS-borne passive sensors for vegetation AGB estimation |
| 13 | Current Practices in UAS-based Environmental Monitoring | [54] | 2020 | Remote Sens. | A review of studies in UAV-based environmental mapping using passive sensors |
| 14 | Applications of Unmanned Aerial Vehicles in Cryosphere: Latest Advances and Prospects | [59] | 2020 | Remote Sens. | A review of applications of UAVs within glaciology, snow, permafrost, and polar research |

IJRS: International Journal of Remote Sensing; Remote Sens.: Remote Sensing; ISPRS JPRS: ISPRS Journal of Photogrammetry and Remote Sensing; Remote Sens. of Environ.: Remote Sensing of Environment.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
