Review

Review: Cost-Effective Unmanned Aerial Vehicle (UAV) Platform for Field Plant Breeding Application

1 Department of Biosystems & Biomaterials Science and Engineering, College of Agriculture and Life Sciences, Seoul National University, Seoul 08826, Korea
2 Department of Plant Resources and Environment, Jeju National University, Jeju 63243, Korea
3 Seeds Research, Syngenta Crop Protection LLC, Research Triangle Park, NC 27703, USA
4 Plant Bioscience, School of Applied Biosciences, Kyungpook National University, Daegu 41566, Korea
5 National Institute of Agricultural Sciences, Rural Development Administration (RDA), Jeonju 54874, Korea
6 School of Computer Information and Communication Engineering, Kunsan National University, Kunsan 54150, Korea
* Author to whom correspondence should be addressed.
These authors also contributed equally to this work.
Remote Sens. 2020, 12(6), 998; https://doi.org/10.3390/rs12060998
Submission received: 25 February 2020 / Revised: 11 March 2020 / Accepted: 19 March 2020 / Published: 20 March 2020
(This article belongs to the Section Remote Sensing in Agriculture and Vegetation)

Abstract
Remote sensing represents a new wave of modern agriculture, accelerating plant breeding and research and improving farming practices and farm management. High-throughput phenotyping is a key advanced agricultural technology and has been rapidly adopted in plant research. However, technology adoption is not easy due to cost limitations in academia. This article reviews various commercial unmanned aerial vehicle (UAV) platforms as a high-throughput phenotyping technology for plant breeding. It compares known commercial UAV platforms that are cost-effective and manageable in field settings and demonstrates a general workflow for high-throughput phenotyping, including data analysis. The authors expect this article to create opportunities for academics to access new technologies and utilize the information for their research and breeding programs in more workable ways.

Graphical Abstract

1. Introduction

Plant breeding research in the twenty-first century is a combination of genotyping, phenotyping, and computational data analytics [1]. Over the last two decades, genotyping technology has improved markedly in terms of cost, turnaround time from sample to data, and the change from assay-based to sequence-based technology. Innovation in computational data analytics has been exponential, along with advances such as algorithms for analyzing large data sets and cloud computing. Phenotyping has also advanced dramatically with emerging technologies, such as automation, sensing, imaging, robotics, and high-speed networks.
In the last twenty years, sensing technology has become one of the most promising high-throughput phenotyping technologies. It provides a non-destructive measurement of crop performance in both controlled and field environments. With various sensing technologies, the popularity of aerial platforms has substantially increased, especially for studies in field settings. Aerial platforms include satellites, manned aircraft, and unmanned aerial vehicles (UAVs). Satellite and manned aircraft remote sensing platforms can monitor plant behaviors in large areas and measure various data concurrently, based on fine spectral data and wide wavelengths. However, their low spatial and temporal resolution and high equipment costs are critical bottlenecks [2]. In addition, their low payloads create limitations in sensor mounting and low battery capacity, resulting in short flight times [3]. Given the above limitations, the UAV is a valid technology and a suitable aerial platform for breeders, with the advantages of high spatial resolution, derived from its low altitude for capturing images, and high temporal resolution, derived from its flexible image capturing capability, even in overcast sky conditions [4,5,6,7,8].
Phenotyping methods must balance accuracy, speed, and cost. This article reviews various methodologies and applications for phenotyping using commercial UAV platforms. It focuses on cost-effective equipment and end-to-end workflow to improve the accuracy and speed of utilizing collected datasets.

2. Overview of Cost-Effective Commercial UAV Platforms

UAV platforms are classified based on various specifications, such as speed, payload, mounted sensor, and maximum flight time, with a wide range of market prices. Thus, to obtain the maximum benefit, a UAV platform should be chosen by considering the target phenotypes, the workforce available to operate it, and the available budget. To balance the demands, assembling UAVs is one option for adapting commercial flight controllers, such as the Pixhawk, Naza, and Navio series [9]. However, they are labor-intensive to assemble, and handling the electronic components of a UAV platform is difficult for researchers who have limited knowledge of mechanics. Another option is purchasing commercially available industrial UAV platforms, which provide high payload and flight time and can mount various sensors and acquire images over a wide field; however, they are less economical and require professional skills to operate.
To overcome the limitations of both the hand-assembled and industrial options, the authors propose instead cost-effective commercial UAV platforms, which are significantly less expensive and easier to operate than industrial platforms. Cost-effective commercial UAV platforms (lower than USD 5000) are summarized in Table 1. Key features of the UAV platforms are flight time, spatial resolution, and sensors able to be mounted (mostly Red Green Blue sensors), acquiring visible spectral images with a 90° adjustable lens [10,11]. Mavic, Phantom, and Inspire are the commercial drone series from DJI (SZ DJI Technology Co., Ltd.) on which an RGB camera is mounted by default. Among the DJI series, the Inspire series has the highest specification and price, while Mavic has the lowest specification and price. The Parrot Drone series (ANAFI Work, ANAFI Thermal, and Bluegrass) is differentiated by the sensor mounted on the UAV. Yuneec drones have two series: Mantis (a quadcopter) and Typhoon (a hexacopter). Walkera supplies the Vitus with an RGB camera, the Vitus Starlight with illuminance correction, and the VOYAGER with RGB, thermal, and night vision as selectable options. The HolyStone drone series is the most cost-effective, although it also has relatively low specifications.

3. Proposed End-to-End Workflow of High Throughput Phenotyping Using Cost-Effective Commercial UAVs

Figure 1 shows a proposed end-to-end workflow of a remote sensing research methodology. It is based on determining relationships or constructing models between electromagnetic energy from a specific band and the actual biophysical properties of the target plant [25]. The first step is to identify the traits of the target plant species that will be observed by the remote sensor; the second is to determine the sensors to be mounted on the UAV platform that could be expected to demonstrate a high correlation between target traits and captured images. To improve prediction accuracy, we propose measuring the actual traits of the plants on the ground simultaneously with capturing aerial images using the UAV platform. These ground measurements will be used to construct a proper model.
The proposed workflow is suitable for a commercial UAV platform in terms of ease of operation and work efficiency. Section 4, Section 5, Section 6, Section 7 provide details of technical principles, the method for acquiring data, image processing and applications.

4. Considerations before Phenotyping Operation

Before starting operation, the hardware system mounted on the UAV platform should be calibrated for the safety of the vehicle and the performance of the planned mission-based waypoints. The inertial measurement unit (IMU) and global navigation satellite system (GNSS) are critical for calibration. Most commercial UAV platform manufacturers provide a manageable calibration tool in a free remote-control application for manual control (Table 2). The flight should be planned to capture overlapped aerial images with a regular overlap ratio and uniform spatial scale [26,27,28]. Images captured using a UAV platform can be processed to generate an orthoimage through image mosaicking techniques and a digital surface model (DSM) through photogrammetric techniques for extracting image variables. A ground control station (GCS) supporting flight planning software enables the collection of such images by fixing the travel route, time interval of image acquisition, and altitude of UAV flight with automated flying. A sensor perspective position-based autopilot mobile application, so-called waypoint navigation, has recently become available for use with commercial UAV platforms for capturing images for agriculture, construction, and environmental surveys. A list of commercial flight planning applications is given in Table 3, with advantages and disadvantages.
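The relationship between flight altitude, camera geometry, and the planned overlap ratio can be sketched numerically. The camera parameters below are illustrative placeholders, not the specification of any UAV in the tables; the formulas are the standard pinhole-camera approximations for a nadir-pointing sensor.

```python
# Sketch: ground sample distance (GSD) and waypoint spacing for a planned
# forward-overlap ratio. All camera parameters here are hypothetical.

def gsd_cm_per_px(sensor_width_mm, focal_length_mm, altitude_m, image_width_px):
    """Ground sample distance in cm/pixel for a nadir-pointing camera."""
    return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

def waypoint_spacing_m(footprint_m, overlap_ratio):
    """Distance between image centres that yields the requested forward overlap."""
    return footprint_m * (1.0 - overlap_ratio)

# Example: a 13.2 mm wide sensor, 8.8 mm lens, 4000 px wide image, flown at 50 m.
gsd = gsd_cm_per_px(13.2, 8.8, 50.0, 4000)   # cm per pixel
footprint = gsd * 4000 / 100.0               # ground footprint width in metres
spacing = waypoint_spacing_m(footprint, 0.8) # spacing for 80% forward overlap
```

Flying lower or choosing a higher-resolution camera shrinks the GSD, while a higher overlap ratio shrinks the waypoint spacing and lengthens the mission.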

5. Sensors Mounted on UAVs

Sensors can be grouped into active and passive sensing systems. An active sensor system transmits energy from its own electrical source and measures the returned signal, while a passive sensor measures naturally available energy, such as reflected sunlight, without emitting its own. Both systems can be used in a field environment for phenotyping, such as assessing biomass [35] and the nitrogen status of maize [36].
Sensor selection should be based on the target traits and the purpose of the study. Here, sensor types used in the remote sensing method are grouped in the following manner: visible and reflective infrared, thermal infrared, and microwave types [37]. Visible and reflective infrared types include RGB digital cameras and spectral cameras to obtain reflectance data from the plant [38]. Thermal infrared types, also called thermal sensors, detect the wavelength of thermal radiation emitted from the plant, depending on its temperature [39]. A light detection and ranging (LIDAR) sensor, which is representative of microwave types, measures the distance to one point of a target plant by illuminating the plant and analyzing the reflected light. LIDAR sensors construct a three-dimensional structure with the distances to several parts of the plant [40]. LIDAR sensors are based on a Time of Flight (ToF) method requiring pulse modulation for emitting light with a rotating, relatively heavy sensor [41]. Thus, LIDAR sensors are not suitable for mounting on a commercial UAV platform, and we do not recommend them.
Commercial UAV platforms are less expensive and relatively easy to operate, but their major limitation is the low payload, restricting both the number and weight of sensors mounted on the UAV. Since the highest payload of a commercial UAV platform is 450 g, only sensors below 450 g are described in each of the following sections. Cutting-edge technology, known as non-orthogonal multiple access (NOMA), is now available for UAV operation with 5G networks [42]. Detailed information on RGB digital cameras, spectral sensors, and thermal sensors is given below.

5.1. RGB Digital Cameras

The low-cost RGB digital camera is widely used in remote sensing techniques, providing a high spatial resolution of radiation values in the red (~600 nm), green (~550 nm), and blue (~450 nm) spectral bands. Most commercial UAV platforms provide RGB sensors with differing spatial resolution determining the image quality (Table 1). Visible images, such as plant coverage, plant height, and color indices, can be extracted by processing the aerial image from an RGB camera [43].
Plant coverage is the proportion of the total field image that is occupied solely by plants [44]. It is highly correlated with indicators of biophysical status, such as biomass, plant vigor, leaf area index, and yield [43,45,46,47]. To acquire accurate plant coverage information in an orthoimage, plant pixels must be segmented from non-plant, background pixels [48]. The segmentation method is based on applying a color index, thresholding, or a learning method [49]. Plants can be discriminated from non-plants using Hue histogram conversion from the RGB image [50,51]. A red, green, blue image can be converted to hue–saturation–value (HSV) space or to a grayscale color index for precise plant analysis, and an optimal threshold value can then be applied to the converted image to segment plant from background [52,53,54]. For example, vine canopy segmentation with minimized shadow effects has been achieved by combining an RGB camera with Hue histogram conversion [55,56,57,58]. Classification learning methods, such as K-means, artificial neural networks (ANN), random forest (RForest), and spectral indices (SI), have been applied to analyzing vines and trees, with ANN and SI methods delivering high accuracy.
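A minimal sketch of color-index thresholding is shown below, using the Excess Green (ExG) index from Table 4 with a fixed threshold. The threshold value of 0.1 is an illustrative assumption; in practice it would be tuned per dataset or chosen automatically (e.g., by Otsu's method).

```python
import numpy as np

# Sketch: segment plant from background with the Excess Green (ExG) colour
# index, assuming an RGB image as a float array with values in [0, 1].

def excess_green(rgb):
    """ExG = 2g - r - b on chromaticity-normalised channels."""
    total = rgb.sum(axis=-1)
    total[total == 0] = 1.0  # avoid division by zero on black pixels
    r, g, b = (rgb[..., i] / total for i in range(3))
    return 2 * g - r - b

def plant_mask(rgb, threshold=0.1):
    """Boolean mask: True where the pixel is classified as plant."""
    return excess_green(rgb) > threshold
```

Plant coverage then follows directly as `plant_mask(img).mean()`, the fraction of pixels classified as plant.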
Plant height is the vertical distance between a ground reference and the upper boundary of the plants [59]. It is a meaningful indicator for estimating plant health, growth rate, biomass, and yield [60,61]. As mentioned above, regularly overlapped two-dimensional images captured at a steady altitude are required for elevation data. Each image with its GPS location is captured along the planned path from an RGB digital camera mounted on the UAV platform. Images are captured to generate a DSM with photogrammetric techniques based on a structure from motion (SfM) algorithm. The plant height can be calculated by subtracting a digital terrain model (DTM), which represents the field with no plants, from the DSM, which includes the plants [59,62,63]. However, the estimated plant height can show a low correlation with the actual plant height due to the spatial resolution of RGB sensors and the spiky form of the target plants [17,64]. Photogrammetric techniques based on an SfM algorithm are highly dependent on the image resolution and the overlapping proportions between acquired aerial images [8]. Thus, having a higher resolution camera or setting a lower altitude and slow and stable flight speed for the UAV platform should be seriously considered [65]. It is also important to calibrate the height using the actual height of plants or some structure for estimating plant height to minimize this low correlation issue [66].
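The DSM-minus-DTM step can be sketched as a simple raster subtraction. The 95th-percentile summary used here is one common, illustrative choice for reducing the effect of the spiky canopy forms mentioned above; it is not prescribed by the review.

```python
import numpy as np

# Sketch: plant height as the per-pixel difference between a digital surface
# model (DSM, with plants) and a digital terrain model (DTM, bare ground),
# both assumed to be co-registered elevation rasters in metres.

def canopy_height_model(dsm, dtm):
    chm = np.asarray(dsm, float) - np.asarray(dtm, float)
    return np.clip(chm, 0.0, None)  # negative differences are noise, not plants

def plot_height(chm, percentile=95):
    """Robust per-plot height: a high percentile resists spiky outliers."""
    return float(np.percentile(chm, percentile))
```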
Color indices are acquired through algebraic calculation with reflectance values from the R (red), G (green), and B (blue) bands, respectively. While color indices can be used to segment plants from the overall image, as mentioned above, their main value is in facilitating the prediction or estimation of biophysical properties of target plants, such as biomass, leaf area index, and yield [67,68,69]. RGB digital cameras have advantages in their high resolution and low price, but the sensors also have limitations in the overlapping of the red, green, and blue wavelength spectra (Table 4).

5.2. Spectral Sensors

Green plants have the highest rate of reflectance in near-infrared (NIR) wavelengths (700–1300 nm) and have a relatively low reflection at wavelengths beyond 1300 nm [70]. Their reflectance varies significantly due to biotic and abiotic stresses that cause physical or physiological disorders in the plant [71,72,73]. Therefore, high-throughput phenotyping technology applies spectral sensors that can detect various wavelengths, not only in the visible spectrum (400–700 nm), such as RGB digital cameras, but also in the invisible, NIR spectrum (700–1300 nm). Spectral sensors acquire a spectral signature from radiance energy in each pixel by collecting the radiance reflected, emitted, and transmitted from the target plant. Spectral data is then processed and analyzed to estimate target traits [38,64,74].
Spectral sensors are divided into multispectral and hyperspectral sensors [75]. The criteria for distinguishing them are the number of spectral bands and the width of each band. A multispectral sensor generally detects five to twelve spectral bands in each pixel. A hyperspectral sensor can acquire imagery data with hundreds or thousands of spectral bands in each pixel through narrow widths (5–10 nm) in the visible–infrared region [76]. However, we do not recommend hyperspectral sensors because they must be integrated with more devices than multispectral sensors to operate properly on UAV platforms [77]: these other devices include a battery, frame grabber, data storage device, and GPS inertial navigation system (INS). The heavier, larger hyperspectral sensor, together with these extra devices, results in a heavy and large UAV. Moreover, in practice a multispectral sensor can produce image data suitable for vegetation indices, even though hyperspectral sensors produce more precise image data. Therefore, the multispectral sensor is a suitable tool for a cost-effective UAV platform due to its efficiency.
Vegetation indices (VIs) are algebraically calculated by combining radiation values from the individual bands of a multispectral sensor to highlight a specific characteristic of the target plant. A VI can be used in designing a model for estimating biophysical and biochemical traits, such as health status, chlorophyll content, water stress, vegetation vigor, and canopy biomass [78], with better performance than an individual spectral channel [79]. Equations and features of VIs using multispectral images are presented in Table 4, and each index has its own advantages and disadvantages. The Normalized Difference Vegetation Index (NDVI) is a well-known VI whose values range from −1.0 to 1.0: negative values represent water, values from −0.1 to 0.1 correspond to soil, and positive values indicate target plants due to their greenness [16,22,27]. However, the NDVI is limited by errors due to atmospheric influence and soil reflectance [80]. The Green Normalized Difference Vegetation Index (GNDVI) is calculated in the same way as the NDVI, but the radiation value of the green band is substituted for that of the red band, so it relates more to chlorophyll concentration than the NDVI. The Soil Adjusted Vegetation Index (SAVI) is calculated similarly to the NDVI but has a constant (L) for correction of soil reflectance, representing the coverage of target plants [77]. The Transformed Chlorophyll Absorption in Reflectance Index (TCARI)/Optimized Soil-Adjusted Vegetation Index (OSAVI) ratio is typically used in predicting water stress [81,82].
Table 4. Examples of vegetation indices (VIs) which can be extracted using multispectral sensors.
| Index | Sensors | Formula | Features | Reference |
|---|---|---|---|---|
| Excess Green (ExG) | RGB | 2g − r − b | Vegetation classification | [83] |
| Excess Red (ExR) | RGB | 1.4r − g | Vegetation classification | [84] |
| Photochemical Reflectance Index (PRI) | RGB | (R531 − R470)/(R531 + R470) | Plant stress measure | [75] |
| Modified Green Red Vegetation Index (MGRVI) | RGB | ((R550)² − (R660)²)/((R550)² + (R660)²) | Biomass and plant height prediction | [85] |
| Normalized Difference Vegetation Index (NDVI) | RGB and Infrared | (NIR − RED)/(NIR + RED) | Crop health status measurement | [83] |
| Green Normalized Difference Vegetation Index (GNDVI) | RGB and Infrared | (NIR − Green)/(NIR + Green) | Crop health status measurement related to chlorophyll concentration | [86] |
| Soil Adjusted Vegetation Index (SAVI) | RGB and Infrared | ((NIR − RED)/(NIR + RED + L)) × (1 + L) | Soil influences on canopy spectra are minimized by the soil brightness correction factor L | [87] |
| Modified Soil Adjusted Vegetation Index (MSAVI) | RGB and Infrared | 0.5 × [2R800 + 1 − √((2R800 + 1)² − 8(R800 − R670))] | Developed for a more reliable and simpler calculation of the soil brightness correction factor than the SAVI | [88] |
| Transformed Chlorophyll Absorption in Reflectance Index/Optimized Soil-Adjusted Vegetation Index (TCARI/OSAVI) | RGB and Infrared | [3 × ((R700 − R670) − 0.2 × (R700 − R550) × (R700/R670))] / [(1 + 0.16) × (R800 − R670)/(R800 + R670 + 0.16)] | Chlorophyll content, water status prediction, and plant stress identification | [89] |
| Ratio Vegetation Index (RVI) | RGB and Infrared | RED/NIR | High-density vegetation coverage and biomass | [90] |
| Difference Vegetation Index (DVI) | RGB and Infrared | NIR − RED | Developed for vegetation monitoring by distinguishing soil from vegetation, but does not account for the effects of atmosphere or shadow | [91] |
| Perpendicular Vegetation Index (PVI) | RGB and Infrared | √((Rsoil − Rveg)²RED + (Rsoil − Rveg)²NIR) | Leaf area index estimation, vegetation identification, and classification | [91] |
| Atmospherically Resistant Vegetation Index (ARVI) | RGB and Infrared | (NIR − RB)/(NIR + RB) | Vegetation status measurement with elimination of the atmospheric effect | [92] |
| Normalized Difference Red Edge Index (NDREI) | RGB and Infrared | (R790 − R735)/(R790 + R735) | Estimation of green leaf area during senescence | [93] |
| Enhanced Normalized Difference Vegetation Index (ENDVI) | RGB and Infrared | ((R790 + R550) − 2R660)/((R790 + R550) + 2R660) | Produces better discrimination than the NDVI by additionally using the green channel | [94] |
| Renormalized Difference Vegetation Index (RDVI) | RGB and Infrared | (R790 − R660)/√(R790 + R660) | Crop health status measurement with insensitivity to the effects of soil and sun | [95] |
| Green Chlorophyll Index (CLG) | RGB and Infrared | (R790/R550) − 1 | Chlorophyll content estimation | [96] |
| Chlorophyll Vegetation Index (CVI) | RGB and Infrared | (R790 × R660)/(R550)² | Chlorophyll content estimation | [97] |

Rn = reflectance in the corresponding spectrum; NIR = near infrared (770–895 nm); Green = green (510–580 nm); RED = red (630–690 nm); Blue = blue (450–510 nm); RB = |RED − Blue|; Rsoil = soil reflectance; Rveg = vegetation reflectance.
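A few of the indices above can be sketched directly as per-pixel band arithmetic. The snippet assumes reflectance bands already calibrated to [0, 1]; the small `eps` term guarding against division by zero and the default L = 0.5 for SAVI are conventional choices, not values prescribed by the table.

```python
import numpy as np

# Sketch: NDVI, GNDVI, and SAVI from Table 4, computed per pixel from
# co-registered reflectance bands (float arrays in [0, 1]).

def ndvi(nir, red, eps=1e-9):
    return (nir - red) / (nir + red + eps)

def gndvi(nir, green, eps=1e-9):
    return (nir - green) / (nir + green + eps)

def savi(nir, red, L=0.5):
    # L is the soil brightness correction factor; L = 0.5 is a common default.
    return (nir - red) / (nir + red + L) * (1.0 + L)
```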
A multispectral sensor can collect radiation data from spectral bands with almost no overlapping [26,69]. Further, it can include data of near-infrared wavelength. Hence, it can result in more accurate and precise data for VIs compared to color indices using an RGB digital camera. Although a multispectral sensor has the disadvantages of higher price and lower spatial resolution than an RGB digital camera, it has great advantages, such as the ability to detect the invisible physiological status of the plant, which can be highly correlated with target traits. The multispectral sensors listed in Table 5 for commercial UAV platforms are relatively less expensive than others on the market.

5.3. Thermal Sensors

Plant surface temperature is an important parameter directly related to the physiological response to various environmental stresses [45]. All bodies emit electromagnetic energy in the infrared (IR) wavelength range depending on temperature, according to the principle of black body radiation. A thermal sensor detects this invisible energy (wavelengths of 3–14 μm) and converts it to visible images showing the temperature of the target [69]. A thermal sensor is prone to errors owing to fluctuating environmental conditions in the air and other objects emitting or reflecting thermal infrared radiation. Thus, periodic calibration of thermal sensors is crucial for collecting accurate data. The calibration of thermal sensors can be conducted in the laboratory with a black body or other reference targets of known, accurate temperature [44].
One example of thermal sensor application is using an index such as the Crop Water Stress Index (CWSI) to predict water stress of target plants subjected to environmental effects [44,98,99]. The CWSI is calculated with wet and dry reference temperatures, which are the lower and upper bounds for plant surface temperature. Specifically, the wet temperature means that the stomata are opened in a fully transpiring state; the dry temperature means that the stomata are closed and not transpiring, which is the water-stressed state. Methods for computing CWSI are addressed in many studies [80,100]. Several additional thermal indices are listed in Table 6.
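The CWSI definition above can be sketched as a small function of the canopy temperature and the wet and dry reference temperatures. Clamping the result to [0, 1] is an illustrative choice for handling measurement noise outside the reference bounds, not part of the original definition.

```python
# Sketch: Crop Water Stress Index (CWSI) from canopy, wet-reference, and
# dry-reference temperatures in °C; values near 1 indicate water stress.

def cwsi(t_canopy, t_wet, t_dry):
    if t_dry <= t_wet:
        raise ValueError("dry reference must be warmer than wet reference")
    value = (t_canopy - t_wet) / (t_dry - t_wet)
    return min(max(value, 0.0), 1.0)  # clamp noisy readings to [0, 1]
```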
A thermal sensor is one of the best options for collecting plant surface temperature data, but its spatial resolution is lower than that of an RGB digital camera. The low resolution of the thermal sensor makes it difficult to extract the surface temperature of a targeted plant from the background of the whole image. However, this problem can be resolved by including RGB or other spectral sensors in the imaging process. Spectral image data concurrently obtained with the thermal data enables the segmentation of plant pixels: the original thermal image, containing plant and non-plant together, is overlaid with the plant image extracted from the background, and the plant pixels can then be extracted from the thermal image [101]. Therefore, weight and resolution should be considered when selecting more than one sensor for mounting on the UAV; applicable thermal sensors are summarized in Table 7. Other issues that might occur during the above process include unwanted detector gain and offset non-uniformity in the registered temperature data. These can be solved as suggested in Mesas-Carrascosa et al. [102].
Table 6. Examples of thermal indices which can be applied with thermal sensors.
| Index | Formula | Feature | Reference |
|---|---|---|---|
| Crop water stress index (CWSI) | (Tcanopy − Twet)/(Tdry − Twet) | Values range from 0 to 1; values close to 1 are related to high levels of stress | [103] |
| Jones index (IG) | (Tdry − Tcanopy)/(Tcanopy − Twet) | Positive correlation with the stomatal conductance | [100,104] |
| Jones index (I3) | (Tcanopy − Twet)/(Tdry − Tcanopy) | Positive correlation with the stomatal resistance | |

6. Pre-Processing of Acquired Images

Remote sensing using aerial platforms is highly affected by environmental conditions, including incoming light, atmospheric conditions, and climatic factors [105]. Extracting valid data from image variables in aerial images captured by satellite platforms requires radiometric and geometric calibration. There are several approaches to minimizing atmospheric influences using atmospheric models such as MODTRAN, ACORN, and FLAASH [23]. By contrast, commercial UAV platforms do not need such atmospheric calibration because they obtain data from lower altitudes with higher resolution [75]. Illumination correction, however, should still be carried out.
The sensor digital number (DN) depends on the amount of light at each given moment, so the same target cannot yield the same spectral data under different light conditions. The DN should therefore be converted to a reference radiation value using empirical line calibration, based on reference radiation data from a brightness calibration tarp or Spectralon panel [106]. In addition, in the case of a thermal sensor, accurate plant surface temperature should be calculated as an absolute temperature based on empirical line calibration, as discussed above [107].
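The empirical line method described above amounts to fitting a linear DN-to-reflectance mapping from reference panels of known reflectance. The panel values below are illustrative; any two or more panels with distinct reflectances would work.

```python
import numpy as np

# Sketch: empirical line calibration. Known reflectances of reference panels
# and their measured digital numbers (DNs) define a linear DN-to-reflectance
# mapping, fitted here by least squares.

def fit_empirical_line(panel_dns, panel_reflectances):
    """Return (gain, offset) so that reflectance ≈ gain * DN + offset."""
    gain, offset = np.polyfit(panel_dns, panel_reflectances, deg=1)
    return gain, offset

def calibrate(dn_image, gain, offset):
    """Apply the fitted line to a whole DN image (scalar or array)."""
    return gain * np.asarray(dn_image, float) + offset
```

The same fitted line is then applied band by band to every image of a flight captured under the same illumination.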
Before extracting aerial images, a georeferencing process using ground control points (GCPs) is required in site-specific image processing to take into consideration the influence of the unstable flight of the UAV platform [8]. The GCPs should be installed on the experimental field for the identification of captured images with visible markers to increase the accuracy of GPS data. In a photogrammetric procedure based on the SfM algorithm, the process of converting captured images to accurate geometric data is conducted prior to the orthomosaic image production [38]. To reduce the number of GCPs for correcting the errors of spatial data, an automatic georeferencing method could be used based on navigation data and camera lens distortion [21,108,109]. Nonetheless, image sensors with GPS and INS mounted on a UAV platform can collect appropriate data for the photogrammetric process.

7. Image Processing Software

Using post-processing software, aerial images from UAV platforms are converted to extract image variables. Photogrammetry and mapping software can generate orthoimages and DSMs. The image variables mentioned above can be extracted from the generated orthoimage and DSM using geographic information system (GIS) software. A program to extract image variables can also be easily constructed using library modules in programming languages such as Python, C++, and MATLAB. Extracted image variables are used for determining relationships or constructing models with linear or non-linear methods, using actual traits acquired simultaneously with the captured aerial images. Table 8 includes examples of phenotyping traits using commercial UAVs and various sensors and suggested methods. In addition, commercial photogrammetry and mapping software and GIS software are listed in Table 9, with advantages and disadvantages.
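The variable-extraction step can be sketched without any GIS package: given a raster of a computed image variable and a co-registered raster labelling each experimental plot, a per-plot summary is a simple grouped mean (the zonal statistics that GIS software provides). The label-raster layout is an assumption for illustration.

```python
import numpy as np

# Sketch: per-plot mean of an image variable (e.g., NDVI) from an orthoimage,
# using a plot-label raster; both arrays are assumed to be co-registered.

def zonal_mean(variable, plot_labels):
    """Mean of `variable` for each non-zero plot id in `plot_labels`."""
    out = {}
    for plot_id in np.unique(plot_labels):
        if plot_id == 0:  # 0 marks background, not a plot
            continue
        out[int(plot_id)] = float(variable[plot_labels == plot_id].mean())
    return out
```

The resulting per-plot values are what would be regressed against ground-measured traits when constructing the prediction models discussed above.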

8. Conclusions

Efficient, high-throughput phenotyping methods can be implemented only when data accuracy, process speed, and cost are well balanced within the permitted limits. It is clear that limitations of the technology exist, such as low payload and a narrow area for image collection. However, we have demonstrated that the utilization of cost-effective commercial UAV platforms for phenotyping and methodologies for high-throughput phenotyping accelerate plant breeding cycles. Introducing suitable selection criteria and combining devices properly selected for the UAV platform promise efficient pathways within the limits of current technologies. Low-priced commercial UAVs, which provide safe flight and flight mission planning software, and free image processing software, such as QGIS, are recommended for users who seek RGB image data of crops with low budgets and unskilled flight control techniques. Users who want additional spectral data or data under specific environmental conditions will have a fuller choice, depending on the purpose. For instance, Parrot Bluegrass Field for NIR image data acquisition, Walkera VOYAGER for thermal data and night vision, and hexacopters, such as Yuneec Typhoon, for specific environments, are typical choices. There are multiple options for obtaining various physiological and morphological traits with cost-effective sensors and UAVs.
Cost-effectiveness and the ease of operation of a commercial UAV platform is a huge advantage compared to handcrafted or industrial drones. A commercial UAV platform will be a very good option for users who lack engineering knowledge and want high-throughput phenotyping with a tight budget. We are pleased to introduce such platforms to readers who are interested in cost-effective research on their target traits.

Author Contributions

Conceptualization, H.-J.K. and Y.S.C.; investigation, Y.K., K.-H.K., and C.W.L.; resources, Y.K., K.-H.K., and C.W.L.; writing—original draft preparation, G.J. and J.K.; writing—review and editing, J.-K.Y., D.-W.K., and Y.S.C.; supervision, D.-W.K. and Y.S.C.; project administration, H.-J.K. and Y.S.C.; funding acquisition, Y.S.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (2019R1A6A1A11052070).

Acknowledgments

We thank the Sustainable Agriculture Research Institute (SARI) at the Jeju National University for providing the experimental facilities.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Rasheed, A.; Hao, Y.; Xia, X.; Khan, A.; Xu, Y.; Varshney, R.K.; He, Z. Crop Breeding Chips and Genotyping Platforms: Progress, Challenges, and Perspectives. Mol. Plant. 2017, 10, 1047–1064. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  2. Xiang, H.; Tian, L. Development of a low-cost agricultural remote sensing system based on an autonomous unmanned aerial vehicle (UAV). Biosyst. Eng. 2011, 108, 174–190. [Google Scholar] [CrossRef]
  3. Hardin, P.J.; Jensen, R.R. Small-scale unmanned aerial vehicles in environmental remote sensing: Challenges and opportunities. Gisci. Remote Sens. 2011, 48, 99–111. [Google Scholar] [CrossRef]
  4. Matese, A.; Toscano, P.; Di Gennaro, S.; Genesio, L.; Vaccari, F.; Primicerio, J.; Belli, C.; Zaldei, A.; Bianconi, R.; Gioli, B. Intercomparison of UAV, Aircraft and Satellite Remote Sensing Platforms for Precision Viticulture. Remote Sens. 2015, 7, 2971–2990. [Google Scholar] [CrossRef] [Green Version]
  5. Erena, M.; Montesinos, S.; Portillo, D.; Alvarez, J.; Marin, C.; Fernandez, L.; Henarejos, J.M.; Ruiz, L.A. Configuration and Specifications of an Unmanned Aerial Vehicle for Precision Agriculture. PLoS ONE 2013, 8, e58210. [Google Scholar]
  6. Hristov, G.V.; Zahariev, P.Z.; Beloev, I.H. A review of the characteristics of modern unmanned aerial vehicles. Acta Technol. Agric. 2016, 19, 33–38. [Google Scholar] [CrossRef] [Green Version]
  7. Muchiri, N.; Kimathi, S. A Review of Applications and Potential Applications of UAV. In Proceedings of the 2016 Annual Conference on Sustainable Research and Innovation, Nairobi, Kenya, 4–6 May 2016; pp. 280–283. [Google Scholar]
  8. Shi, Y.; Thomasson, J.A.; Murray, S.C.; Pugh, N.A.; Rooney, W.L.; Shafian, S.; Rajan, N.; Rouze, G.; Morgan, C.L.; Neely, H.L.; et al. Unmanned Aerial Vehicles for High-Throughput Phenotyping and Agronomic Research. PLoS ONE 2016, 11, e0159781. [Google Scholar] [CrossRef] [Green Version]
  9. Selecting a Drone Flight Controller. Available online: https://dojofordrones.com/drone-flight-controller/ (accessed on 31 March 2019).
  10. Guan, S.; Fukami, K.; Matsunaka, H.; Okami, M.; Tanaka, R.; Nakano, H.; Sakai, T.; Nakano, K.; Ohdan, H.; Takahashi, K. Assessing Correlation of High-Resolution NDVI with Fertilizer Application Level and Yield of Rice and Wheat Crops using Small UAVs. Remote Sens. 2019, 11, 112. [Google Scholar] [CrossRef] [Green Version]
  11. Dedicated mounting PARROT SEQUOIA+ or RedEdge camera for Yuneec H520 Drone. Available online: https://aeromind.pl/product-eng-11195-Dedicated-mounting-PARROT-SEQUOIA-or-RedEdge-camera-for-Yuneec-H520-Drone.html/ (accessed on 25 February 2020).
  12. Chen, A.; Orlov-Levin, V.; Meron, M. Applying high-resolution visible-channel aerial scan of crop canopy to precision irrigation management. Agric. Water Manag. 2018, 2, 335. [Google Scholar] [CrossRef] [Green Version]
  13. Kolarik, N.E.; Ellis, G.; Gaughan, A.E.; Stevens, F.R. Describing seasonal differences in tree crown delineation using multispectral UAS data and structure from motion. Remote Sens. Lett. 2019, 10, 864–873. [Google Scholar] [CrossRef]
  14. Potena, C.; Khanna, R.; Nieto, J.; Siegwart, R.; Nardi, D.; Pretto, A. AgriColMap: Aerial-Ground Collaborative 3D Mapping for Precision Farming. IEEE Robot Autom. Lett. 2019, 4, 1085–1092. [Google Scholar] [CrossRef] [Green Version]
  15. Harwin, S.; Lucieer, A. Assessing the Accuracy of Georeferenced Point Clouds Produced via Multi-View Stereopsis from Unmanned Aerial Vehicle (UAV) Imagery. Remote Sens. 2012, 4, 1573–1599. [Google Scholar] [CrossRef] [Green Version]
  16. Ni, J.; Yao, L.; Zhang, J.; Cao, W.; Zhu, Y.; Tai, X. Development of an Unmanned Aerial Vehicle-Borne Crop-Growth Monitoring System. Sensors 2017, 17, 502. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  17. Ajayi, O.G.; Salubi, A.A.; Angbas, A.F.; Odigure, M.G. Generation of accurate digital elevation models from UAV acquired low percentage overlapping images. Int. J. Remote Sens. 2017, 38, 3113–3134. [Google Scholar] [CrossRef]
  18. Possoch, M.; Bieker, S.; Hoffmeister, D.; Bolten, A.; Schellberg, J.; Bareth, G. Multi-Temporal Crop Surface Models Combined with the Rgb Vegetation Index from Uav-Based Images for Forage Monitoring in Grassland. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41, 991–998. [Google Scholar] [CrossRef]
  19. Watanabe, T.; Raju, A.; Hiraga, Y.; Sugimura, K. Development of Geospatial Model for Preparing Distribution of Rare Plant Resources Using UAV/Drone. Indian J. Pharm. Educ. 2018, 52, S146–S150. [Google Scholar] [CrossRef] [Green Version]
  20. Liu, T.; Li, R.; Zhong, X.; Jiang, M.; Jin, X.; Zhou, P.; Liu, S.; Sun, C.; Guo, W. Estimates of rice lodging using indices derived from UAV visible and thermal infrared images. Agric. Meteorol. 2018, 252, 144–154. [Google Scholar] [CrossRef]
  21. Fuldain González, J.; Varón Hernández, F. NDVI Identification and Survey of a Roman Road in the Northern Spanish Province of Álava. Remote Sens. 2019, 11, 725. [Google Scholar] [CrossRef] [Green Version]
  22. Marín, J.; Parra, L.; Rocher, J.; Sendra, S.; Lloret, J.; Mauri, P.V.; Masaguer, A. Urban Lawn Monitoring in Smart City Environments. J. Sens. 2018, 2018, 8743179. [Google Scholar] [CrossRef] [Green Version]
  23. Yang, B.; Hawthorne, T.L.; Torres, H.; Feinman, M. Using Object-Oriented Classification for Coastal Management in the East Central Coast of Florida: A Quantitative Comparison between UAV, Satellite, and Aerial Data. Drones 2019, 3, 60. [Google Scholar] [CrossRef] [Green Version]
  24. Safonova, A.; Tabik, S.; Alcaraz-Segura, D.; Rubtsov, A.; Maglinets, Y.; Herrera, F. Detection of Fir Trees (Abies sibirica) Damaged by the Bark Beetle in Unmanned Aerial Vehicle Images with Deep Learning. Remote Sens. 2019, 11, 643. [Google Scholar] [CrossRef] [Green Version]
  25. Jensen, J.R.; Lulla, K. Introductory Digital Image Processing: A Remote Sensing Perspective, 4th ed.; Prentice Hall: Upper Saddle River, NJ, USA, 1987. [Google Scholar]
  26. Nex, F.; Remondino, F. UAV for 3D mapping applications: A review. Appl. Geomat. 2013, 6, 1–15. [Google Scholar] [CrossRef]
  27. Mesas-Carrascosa, F.J.; Torres-Sánchez, J.; Clavero-Rumbao, I.; García-Ferrer, A.; Peña, J.M.; Borra-Serrano, I.; López-Granados, F. Assessing optimal flight parameters for generating accurate multispectral orthomosaicks by UAV to support site-specific crop management. Remote Sens. 2015, 7, 12793–12814. [Google Scholar] [CrossRef] [Green Version]
  28. Mesas-Carrascosa, F.; Rumbao, I.; Berrocal, J.; Porras, A. Positional quality assessment of orthophotos obtained from sensors onboard multi-rotor UAV platforms. Sensors 2014, 14, 22394–22407. [Google Scholar] [CrossRef] [PubMed]
  29. DJI GS Pro Home Page. Available online: https://www.dji.com/ground-station-pro (accessed on 9 March 2020).
  30. DroneDeploy: Drone & UAV Mapping Platform Home Page. Available online: https://www.dronedeploy.com/ (accessed on 9 March 2020).
  31. Litchi for DJI Mavic/Phantom/Inspire/Spark Home Page. Available online: https://flylitchi.com/ (accessed on 9 March 2020).
  32. Pix4Dcapture: Free drone flight planning mobile app Home Page. Available online: https://www.pix4d.com/product/pix4dcapture (accessed on 9 March 2020).
  33. AeroPoints—Propeller Aero Home Page. Available online: https://www.propelleraero.com/aeropoints/ (accessed on 9 March 2020).
  34. Maps Made Easy Home Page. Available online: https://www.mapsmadeeasy.com/ (accessed on 9 March 2020).
  35. Freeman, K.W.; Girma, K.; Arnall, D.B.; Mullen, R.W.; Martin, K.L.; Teal, R.K.; Raun, W.R. By-plant prediction of corn forage biomass and nitrogen uptake at various growth stages using remote sensing and plant height. Agron. J. 2007, 99, 530–536. [Google Scholar] [CrossRef] [Green Version]
  36. Guo, J.; Wang, X.; Meng, Z.; Zhao, C.; Yu, Z.; Chen, L. Study on diagnosing nitrogen nutrition status of corn using GreenSeeker and SPAD meter. Plant Nutr. Fertil. Sci. 2018, 1, 43–47. [Google Scholar]
  37. Sheng, H.; Chao, H.; Coopmans, C.; Han, J.; McKee, M.; Chen, Y. Low-cost UAV-based thermal infrared remote sensing: Platform, calibration and applications. In Proceedings of the 2010 IEEE/ASME International Conference on Mechatronic and Embedded Systems and Applications, Qingdao, China, 16–17 July 2010; pp. 38–43. [Google Scholar]
  38. Aasen, H.; Honkavaara, E.; Lucieer, A.; Zarco-Tejada, P. Quantitative Remote Sensing at Ultra-High Resolution with UAV Spectroscopy: A Review of Sensor Technology, Measurement Procedures, and Data Correction Workflows. Remote Sens. 2018, 10, 1091. [Google Scholar] [CrossRef] [Green Version]
  39. Jones, H.G. Application of Thermal Imaging and Infrared Sensing in Plant Physiology and Ecophysiology. Adv. Bot. Res. 2004, 41, 107–163. [Google Scholar]
  40. Lin, Y. LiDAR: An important tool for next-generation phenotyping technology of high potential for plant phenomics? Comput. Electron. Agr. 2015, 119, 61–73. [Google Scholar] [CrossRef]
  41. Vazquez-Arellano, M.; Griepentrog, H.W.; Reiser, D.; Paraforos, D.S. 3-D Imaging Systems for Agricultural Applications-A Review. Sensors 2016, 16, 618. [Google Scholar] [CrossRef] [Green Version]
  42. Fu, S.; Fang, F.; Zhao, L.; Ding, Z.; Jian, X. Joint Transmission Scheduling and Power Allocation in Non-Orthogonal Multiple Access. IEEE Trans. Commun. 2019, 67, 8137–8150. [Google Scholar] [CrossRef]
  43. Liu, J.; Pattey, E. Retrieval of leaf area index from top-of-canopy digital photography over agricultural crops. Agric. Meteorol. 2010, 150, 1485–1490. [Google Scholar] [CrossRef]
  44. Chen, J.; Yi, S.; Qin, Y.; Wang, X. Improving estimates of fractional vegetation cover based on UAV in alpine grassland on the Qinghai–Tibetan Plateau. Int. J. Remote Sens. 2016, 37, 1922–1936. [Google Scholar] [CrossRef]
  45. Córcoles, J.I.; Ortega, J.F.; Hernández, D.; Moreno, M.A. Estimation of leaf area index in onion (Allium cepa L.) using an unmanned aerial vehicle. Biosyst. Eng. 2013, 115, 31–42. [Google Scholar]
  46. Ballesteros, R.; Ortega, J.F.; Hernandez, D.; Moreno, M.A. Onion biomass monitoring using UAV-based RGB imaging. Precis. Agric. 2018, 19, 840–857. [Google Scholar] [CrossRef]
  47. Kim, S.L.; Chung, Y.S.; Ji, H.; Lee, H.; Choi, I.; Kim, N.; Lee, E.; Oh, J.; Kang, D.-Y.; Baek, J.; et al. New Parameters for Seedling Vigor Developed via Phenomics. Appl. Sci. 2019, 9, 1752. [Google Scholar] [CrossRef] [Green Version]
  48. Kataoka, T.; Kaneko, T.; Okamoto, H. Crop Growth Estimation System Using Machine Vision. In Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Kobe, Japan, 20–24 July 2003; pp. b1079–b1083. [Google Scholar]
  49. Hamuda, E.; Glavin, M.; Jones, E. A survey of image processing techniques for plant extraction and segmentation in the field. Comput. Electron. Agr. 2016, 125, 184–199. [Google Scholar] [CrossRef]
  50. Lee, K.J.; Lee, B.W. Estimation of rice growth and nitrogen nutrition status using color digital camera image analysis. Eur. J. Agron. 2013, 48, 57–65. [Google Scholar] [CrossRef]
  51. Torres-Sánchez, J.; Peña, J.M.; de Castro, A.I.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113. [Google Scholar]
  52. Hassanein, M.; Lari, Z.; El-Sheimy, N. A New Vegetation Segmentation Approach for Cropped Fields Based on Threshold Detection from Hue Histograms. Sensors 2018, 18, 1253. [Google Scholar] [CrossRef] [Green Version]
  53. Kim, D.W.; Yun, H.; Jeong, S.J.; Kwon, Y.S.; Kim, S.G.; Lee, W.; Kim, H.J. Modeling and Testing of Growth Status for Chinese Cabbage and White Radish with UAV-Based RGB Imagery. Remote Sens. 2018, 10, 563. [Google Scholar] [CrossRef] [Green Version]
  54. Gitelson, A.A.; Kaufman, Y.J.; Robert, S.; Rundquist, D. Novel algorithms for remote estimation of vegetation fraction. Remote Sens. Environ. 2002, 80, 76–87. [Google Scholar] [CrossRef] [Green Version]
  55. Chen, Y.; Ribera, J.; Boomsma, C.; Delp, E.J. Plant leaf segmentation for estimating phenotypic traits. In Proceedings of the 2017 IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017; pp. 3884–3888. [Google Scholar]
  56. Jin, X.; Liu, S.; Baret, F.; Hemerlé, M.; Comar, A. Estimates of plant density of wheat crops at emergence from very low altitude UAV imagery. Remote Sens. Environ. 2017, 198, 105–114. [Google Scholar] [CrossRef] [Green Version]
  57. Poblete-Echeverría, C.; Olmedo, G.; Ingram, B.; Bardeen, M. Detection and Segmentation of Vine Canopy in Ultra-High Spatial Resolution RGB Imagery Obtained from Unmanned Aerial Vehicle (UAV): A Case Study in a Commercial Vineyard. Remote Sens. 2017, 9, 268. [Google Scholar] [CrossRef] [Green Version]
  58. Kerkech, M.; Hafiane, A.; Canals, R. Deep leaning approach with colorimetric spaces and vegetation indices for vine diseases detection in UAV images. Comput. Electron. Agr. 2018, 155, 237–243. [Google Scholar] [CrossRef]
  59. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef] [Green Version]
  60. Madec, S.; Baret, F.; de Solan, B.; Thomas, S.; Dutartre, D.; Jezequel, S.; Hemmerle, M.; Colombeau, G.; Comar, A. High-Throughput Phenotyping of Plant Height: Comparing Unmanned Aerial Vehicles and Ground LiDAR Estimates. Front. Plant Sci. 2017, 8, 2002–2015. [Google Scholar] [CrossRef] [Green Version]
  61. Han, X.; Thomasson, J.A.; Bagnall, G.C.; Pugh, N.A.; Horne, D.W.; Rooney, W.L.; Jung, J.; Chang, A.; Malambo, L.; Popescu, S.C.; et al. Measurement and Calibration of Plant-Height from Fixed-Wing UAV Images. Sensors 2018, 18, 4092. [Google Scholar] [CrossRef] [Green Version]
  62. Bendig, J.; Bolten, A.; Bareth, G. UAV-based Imaging for Multi-Temporal, very high Resolution Crop Surface Models to monitor Crop Growth Variability. Photogramm. Fernerkun. 2013, 2013, 551–562. [Google Scholar] [CrossRef]
  63. Qiu, R.; Wei, S.; Zhang, M.; Li, H.; Sun, H.; Liu, G.; Li, M. Sensors for measuring plant phenotyping: A review. Int. J. Agric. Biol. Eng. 2018, 11, 1–17. [Google Scholar] [CrossRef] [Green Version]
  64. Watanabe, K.; Guo, W.; Arai, K.; Takanashi, H.; Kajiya-Kanegae, H.; Kobayashi, M.; Yano, K.; Tokunaga, T.; Fujiwara, T.; Tsutsumi, N.; et al. High-Throughput Phenotyping of Sorghum Plant Height Using an Unmanned Aerial Vehicle and Its Application to Genomic Prediction Modeling. Front Plant Sci. 2017, 8, 421–431. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  65. Von Bueren, S.K.; Burkart, A.; Hueni, A.; Rascher, U.; Tuohy, M.P.; Yule, I.J. Deploying four optical UAV-based sensors over grassland: Challenges and limitations. Biogeosciences 2015, 9, 163–175. [Google Scholar] [CrossRef] [Green Version]
  66. Hu, P.; Chapman, S.C.; Wang, X.; Potgieter, A.; Duan, T.; Jordan, D.; Guo, Y.; Zheng, B. Estimation of plant height using a high throughput phenotyping platform based on unmanned aerial vehicle and self-calibration: Example for sorghum breeding. Eur. J. Agron. 2018, 95, 24–32. [Google Scholar] [CrossRef]
  67. Schirrmann, M.; Giebel, A.; Gleiniger, F.; Pflanz, M.; Lentschke, J.; Dammer, K.-H. Monitoring Agronomic Parameters of Winter Wheat Crops with Low-Cost UAV Imagery. Remote Sens. 2016, 8, 706. [Google Scholar] [CrossRef] [Green Version]
  68. Zhou, X.; Zheng, H.B.; Xu, X.Q.; He, J.Y.; Ge, X.K.; Yao, X.; Cheng, T.; Zhu, Y.; Cao, W.X.; Tian, Y.C. Predicting grain yield in rice using multi-temporal vegetation indices from UAV-based multispectral and digital imagery. Isprs J. Photogramm. Remote Sens. 2017, 130, 246–255. [Google Scholar] [CrossRef]
  69. Lussem, U.; Bolten, A.; Gnyp, M.L.; Jasper, J.; Bareth, G. Evaluation of Rgb-Based Vegetation Indices from Uav Imagery to Estimate Forage Yield in Grassland. In Proceedings of the International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 2018, Beijing, China, 7–10 May 2018; pp. 1215–1219. [Google Scholar]
  70. Broge, N.H.; Mortensen, J.V. Deriving green crop area index and canopy chlorophyll density of winter wheat from spectral reflectance data. Remote Sens. Environ. 2002, 81, 45–57. [Google Scholar] [CrossRef]
  71. Calderón, R.; Montes-Borrego, M.; Landa, B.B.; Navas-Cortés, J.A.; Zarco-Tejada, P.J. Detection of downy mildew of opium poppy using high-resolution multi-spectral and thermal imagery acquired with an unmanned aerial vehicle. Precis. Agric. 2014, 15, 639–661. [Google Scholar] [CrossRef]
  72. Di Gennaro, S.F.; Battiston, E.; Di Marco, S.; Facini, O.; Matese, A.; Nocentini, M.; Palliotti, A.; Mugnai, L. Unmanned Aerial Vehicle (UAV)-based remote sensing to monitor grapevine leaf stripe disease within a vineyard affected by esca complex. Phytopathol. Mediterr. 2016, 55, 262–275. [Google Scholar]
  73. Zaman-Allah, M.; Vergara, O.; Araus, J.L.; Tarekegne, A.; Magorokosho, C.; Zarco-Tejada, P.J.; Hornero, A.; Albà, A.H.; Das, B.; Craufurd, P.; et al. Unmanned aerial platform-based multi-spectral imaging for field phenotyping of maize. Plant Methods 2015, 11, 35. [Google Scholar] [CrossRef] [Green Version]
  74. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef] [Green Version]
  75. Gamon, J.; Penuelas, J.; Field, C. A narrow-waveband spectral index that tracks diurnal changes in photosynthetic efficiency. Remote Sens. Environ. 1992, 41, 35–44. [Google Scholar] [CrossRef]
  76. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J. Hyperspectral Imaging: A Review on UAV-Based Sensors, Data Processing and Applications for Agriculture and Forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef] [Green Version]
  77. Nackaerts, K.; Delauré, B.; Everaerts, J.; Michiels, B.; Holmlund, C.; Mäkynen, J.; Saari, H. Evaluation of a lightweight UAS-prototype for hyperspectral imaging. In Proceedings of the International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences 2010, Newcastle upon Tyne, UK, 21–24 June 2010; pp. 478–483. [Google Scholar]
  78. Candiago, S.; Remondino, F.; De Giglio, M.; Dubbini, M.; Gattelli, M. Evaluating Multispectral Images and Vegetation Indices for Precision Farming Applications from UAV Images. Remote Sens. 2015, 7, 4026–4047. [Google Scholar] [CrossRef] [Green Version]
  79. Bannari, A.; Morin, D.; Bonn, F.; Huete, A.R. A review of vegetation indices. Remote Sens. Rev. 1995, 13, 95–120. [Google Scholar] [CrossRef]
  80. He, H.J.; Wu, D.; Sun, D.W. Potential of hyperspectral imaging combined with chemometric analysis for assessing and visualising tenderness distribution in raw farmed salmon fillets. J. Food Eng. 2014, 126, 156–164. [Google Scholar] [CrossRef]
  81. Berni, J.A.J.; Zarco-Tejada, P.J.; Suárez, L.; González-Dugo, V.; Fereres, E. Remote sensing of vegetation from UAV platforms using lightweight multispectral and thermal imaging sensors. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2009, 38, 6–11. [Google Scholar]
  82. Baluja, J.; Diago, M.P.; Balda, P.; Zorer, R.; Meggio, F.; Morales, F.; Tardaguila, J. Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV). Irrig. Sci. 2012, 30, 511–522. [Google Scholar] [CrossRef]
  83. Rouse Jr, J.W.; Haas, R.H.; Schell, J.; Deering, D. Monitoring the vernal advancement and retrogradation (green wave effect) of natural vegetation. In NASA/GSFC, Type III, Final Report; Texas A & M University: College Station, TX, USA, 1974; pp. 309–317. [Google Scholar]
  84. Meyer, G.E.; Neto, J.C. Verification of color vegetation indices for automated crop imaging applications. Comput. Electron. Agric. 2008, 63, 282–293. [Google Scholar] [CrossRef]
  85. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  86. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  87. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  88. Qi, J.; Chehbouni, A.; Huete, A.; Kerr, Y.; Sorooshian, S. A modified soil adjusted vegetation index. Remote Sens. Environ. 1994, 48, 119–126. [Google Scholar] [CrossRef]
  89. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107. [Google Scholar] [CrossRef]
  90. Pearson, R.L.; Miller, L.D. Remote mapping of standing crop biomass for estimation of the productivity of the shortgrass prairie. Remote Sens. Environ. 1972, 8, 1357–1381. [Google Scholar]
  91. Richardson, A.D.; Duigan, S.P.; Berlyn, G.P. An evaluation of noninvasive methods to estimate foliar chlorophyll content. New Phytol. 2002, 153, 185–194. [Google Scholar] [CrossRef] [Green Version]
  92. Kaufman, Y.J.; Tanre, D. Atmospherically resistant vegetation index (ARVI) for EOS-MODIS. IEEE Trans. Geosci. Remote Sens. 1992, 30, 261–270. [Google Scholar] [CrossRef]
  93. Gitelson, A.; Merzlyak, M.N. Quantitative estimation of chlorophyll-a using reflectance spectra: Experiments with autumn chestnut and maple leaves. J. Photoch. Photobio. B. 1994, 22, 247–252. [Google Scholar] [CrossRef]
  94. Rasmussen, J.; Ntakos, G.; Nielsen, J.; Svensgaard, J.; Poulsen, R.N.; Christensen, S. Are vegetation indices derived from consumer-grade cameras mounted on UAVs sufficiently reliable for assessing experimental plots? Eur. J. Agron. 2016, 74, 75–92. [Google Scholar] [CrossRef]
  95. Roujean, J.L.; Breon, F.M. Estimating PAR absorbed by vegetation from bidirectional reflectance measurements. Remote Sens. Environ. 1995, 51, 375–384. [Google Scholar] [CrossRef]
  96. Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282. [Google Scholar] [CrossRef]
  97. Vincini, M.; Frazzi, E.; D’Alessio, P. A broad-band leaf chlorophyll vegetation index at the canopy scale. Precis. Agric. 2008, 9, 303–319. [Google Scholar] [CrossRef]
  98. Alchanatis, V.; Cohen, Y.; Cohen, S.; Moller, M.; Sprinstin, M.; Meron, M.; Tsipris, J.; Saranga, Y.; Sela, E. Evaluation of different approaches for estimating and mapping crop water status in cotton with thermal imaging. Precis. Agric. 2010, 11, 27–41. [Google Scholar] [CrossRef]
  99. Ben-Gal, A.; Agam, N.; Alchanatis, V.; Cohen, Y.; Yermiyahu, U.; Zipori, I.; Presnov, E.; Sprintsin, M.; Dag, A. Evaluating water stress in irrigated olives: Correlation of soil water status, tree water status, and thermal imagery. Irrig. Sci. 2009, 27, 367–376. [Google Scholar] [CrossRef]
  100. Jones, H.G. Use of infrared thermometry for estimation of stomatal conductance as a possible aid to irrigation scheduling. Agric. For. Meteorol. 1999, 95, 139–149. [Google Scholar] [CrossRef]
  101. Li, L.; Zhang, Q.; Huang, D. A review of imaging techniques for plant phenotyping. Sensors 2014, 14, 20078–20111. [Google Scholar] [CrossRef] [PubMed]
  102. Mesas-Carrascosa, F.J.; Pérez-Porras, F.; Meroño, L.J.; Mena, F.C.; Agüera-Vega, F.; Carvajal-Ramírez, F.; Martínez-Carricondo, P.; García-Ferrer, A. Drift correction of lightweight microbolometer thermal sensors on-board unmanned aerial vehicles. Remote Sens. 2018, 10, 615. [Google Scholar] [CrossRef] [Green Version]
  103. Idso, S.B.; Jackson, R.D.; Pinter Jr, P.J.; Reginato, R.J.; Hatfield, J.L. Normalizing the stress-degree-day parameter for environmental variability. Agric Meteorol. 1981, 24, 45–55. [Google Scholar] [CrossRef]
  104. Leinonen, I.; Grant, O.M.; Tagliavia, C.P.P.; Chaves, M.M.; Jones, H.G. Estimating stomatal conductance with thermal imagery. Plant Cell Environ. 2006, 29, 1508–1518. [Google Scholar] [CrossRef]
  105. Kelcey, J.; Lucieer, A. Sensor Correction of a 6-Band Multispectral Imaging Sensor for UAV Remote Sensing. Remote Sens. 2012, 4, 1462–1493. [Google Scholar] [CrossRef] [Green Version]
  106. Wang, C.; Myint, S.W. A Simplified Empirical Line Method of Radiometric Calibration for Small Unmanned Aircraft Systems-Based Remote Sensing. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 8, 1876–1885. [Google Scholar] [CrossRef]
  107. Santesteban, L.G.; Di Gennaro, S.F.; Herrero-Langreo, A.; Miranda, C.; Royo, J.B.; Matese, A. High-resolution UAV-based thermal imaging to estimate the instantaneous and seasonal variability of plant water status within a vineyard. Agric. Water Manag. 2017, 183, 49–59. [Google Scholar] [CrossRef]
  108. Xiang, H.; Tian, L. Method for automatic georeferencing aerial remote sensing (RS) images from an unmanned aerial vehicle (UAV) platform. Biosyst. Eng. 2011, 108, 104–113. [Google Scholar] [CrossRef]
  109. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712. [Google Scholar] [CrossRef]
  110. Di Gennaro, S.F.; Rizza, F.; Badeck, F.W.; Berton, A.; Delbono, S.; Gioli, B.; Toscano, P.; Zaldei, A.; Matese, A. UAV-based high-throughput phenotyping to discriminate barley vigour with visible and near-infrared vegetation indices. Int. J. Remote Sens 2017, 39, 5330–5344. [Google Scholar] [CrossRef]
  111. Yang, G.; Liu, J.; Zhao, C.; Li, Z.; Huang, Y.; Yu, H.; Xu, B.; Yang, X.; Zhu, D.; Zhang, X.; et al. Unmanned aerial vehicle remote sensing for field-based crop phenotyping: Current status and perspectives. Front. Plant Sci. 2017, 8, 1111–1135. [Google Scholar] [CrossRef]
  112. Thenot, F.; Méthy, M.; Winkel, T. The Photochemical Reflectance Index (PRI) as a water-stress index. Int. J. Remote Sens. 2002, 23, 5135–5139. [Google Scholar] [CrossRef]
  113. Stanton, C.; Starek, M.J.; Elliott, N.; Brewer, M.; Maeda, M.M.; Chu, T. Unmanned aircraft system-derived crop height and normalized difference vegetation index metrics for sorghum yield and aphid stress assessment. J. Appl. Remote Sens. 2017, 11, 026035. [Google Scholar] [CrossRef] [Green Version]
  114. Hunt, E.R.; Daughtry, C.S.T.; Eitel, J.U.H.; Long, D.S. Remote Sensing Leaf Chlorophyll Content Using a Visible Band Index. Agron. J. 2011, 103, 1090–1099. [Google Scholar] [CrossRef] [Green Version]
  115. Berni, J.; Zarco-Tejada, P.J.; Suarez, L.; Fereres, E. Thermal and Narrowband Multispectral Remote Sensing for Vegetation Monitoring from an Unmanned Aerial Vehicle. IEEE Trans. Geosci. Remote Sens. 2009, 47, 722–738. [Google Scholar] [CrossRef] [Green Version]
  116. Gago, J.; Douthe, C.; Coopman, R.; Gallego, P.; Ribas-Carbo, M.; Flexas, J.; Escalona, J.; Medrano, H. UAVs challenge to assess water stress for sustainable agriculture. Agric. Water Manag. 2015, 153, 9–19. [Google Scholar] [CrossRef]
  117. Agisoft Metashape Home Page. Available online: https://www.agisoft.com/ (accessed on 9 March 2020).
  118. Pix4Dmapper: Professional Drone Mapping and Photogrammetry Software Home Page. Available online: https://www.pix4d.com/product/pix4dmapper-photogrammetry-software (accessed on 9 March 2020).
  119. SimActive High-End Mapping Software Home Page. Available online: https://www.simactive.com/ (accessed on 9 March 2020).
  120. About ArcGIS Mapping & Analytics Platform–Esri Home Page. Available online: https://www.esri.com/en-us/arcgis/about-arcgis/overview (accessed on 9 March 2020).
  121. Welcome to the QGIS project! Home Page. Available online: https://www.qgis.org/en/site/ (accessed on 9 March 2020).
  122. ENVI-The Leading Geospatial Image Analysis Software Home page. Available online: https://www.harrisgeospatial.com/Software-Technology/ENVI (accessed on 9 March 2020).
  123. ERDAS IMAGINE Hexagon Geospatial Home Page. Available online: https://www.hexagongeospatial.com/products/power-portfolio/erdas-imagine (accessed on 9 March 2020).
Figure 1. Workflow for high-throughput phenotyping using commercial unmanned aerial vehicle (UAV) platforms.
Table 1. Commercial unmanned aerial vehicle (UAV) platforms available in the market.
| Model | Price ($) | Price (€) | Flight Time (min) | Sensor (Spatial Resolution) | Application |
|---|---|---|---|---|---|
| DJI Mavic 2 Pro | 1599.00 | 1499.00 | 29 | RGB camera (5472 × 3648) | [12,13,14] |
| DJI Mavic 2 Zoom | 1349.00 | 1249.00 | 29 | RGB camera (4000 × 3000) | |
| DJI Mavic Air | 919.00 | 849.00 | 20 | RGB camera (4056 × 3040); RGB camera (4056 × 2282) | |
| DJI Mavic Pro Platinum | 1149.00 | 999.00 | 30 | RGB camera (4000 × 3000) | |
| Phantom 4 Pro V2.0 | 1599.00 | 1699.00 | 30 | RGB camera (5472 × 3648); RGB camera (4864 × 3648); RGB camera (5472 × 3078) | [10,12,15,16,17,18] |
| Inspire 2 | 3299.00 | 3399.00 | 23–27 | RGB camera (24 MP); RGB camera (20.8 MP) | [19,20,21,22] |
| Parrot ANAFI Work | 999.00 | 999.00 | 25 | RGB camera (5344 × 4016); RGB camera (4000 × 3000) | - |
| Parrot ANAFI Thermal | 1900.00 | 1900.00 | 26 | RGB camera (5344 × 4016); RGB camera (4608 × 3456); thermal camera (160 × 120) | |
| Parrot Bluegrass Fields | 4980.00 | 4510.59 | 25 | Multispectral sensor (1280 × 960) | - |
| Yuneec Mantis G | 699.99 | 699.00 | 33 | RGB camera (4160 × 2340); RGB camera (4160 × 2340) | [23] |
| Yuneec Mantis Q | 499.99 | 499.00 | 33 | RGB camera (4160 × 2340); RGB camera (4160 × 2340) | - |
| Yuneec Typhoon H | 899.99 | 799.00 | 25 | RGB camera (4:3/12.4 MP) | [21,24] |
| Yuneec Typhoon H520 | 1561.79–11,864.00 | 1382.93–10,505.00 | 25 | RGB camera (4:3/12 MP) | |
| Walkera Vitus | 739.00 | 654.76 | 28 | RGB camera (4000 × 3000) | |
| Walkera Vitus Starlight | 899.00 | 796.53 | 22 | RGB camera (1920 × 1080) | - |
| Walkera VOYAGER 5 | 17,999.00 | 15,947.37 | 20 | RGB camera (3840 × 2160) | - |
| HolyStone HS720 GPS Drone with 2K Camera | 299.99 | 279.99 | 26 | RGB camera (2048 × 1152) | - |
| HolyStone HS120D FPV Drone with GPS System | 159.99 | 139.99 | 16 | RGB camera (1920 × 1080) | - |
| HolyStone HS100 FPV Drone with GPS | 169.99 | 159.99 | 12–15 | RGB camera (1280 × 720) | - |

Both prices are as of 9 March 2020.
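One simplistic, hypothetical selection criterion for such a comparison is purchase price per minute of flight time. The Python sketch below is our own illustration (platform names, prices, and flight times copied from a few rows of Table 1); it is not a recommendation, only a demonstration of how the table can be turned into a ranking:

```python
# Hypothetical selection helper: rank a few platforms from Table 1 by
# purchase price (USD, 9 March 2020) per minute of nominal flight time.
platforms = {
    "DJI Mavic 2 Pro": (1599.00, 29),
    "Yuneec Mantis Q": (499.99, 33),
    "Parrot ANAFI Work": (999.00, 25),
    "HolyStone HS720": (299.99, 26),
}

# Sort by dollars per flight minute, cheapest first.
ranked = sorted(platforms.items(), key=lambda kv: kv[1][0] / kv[1][1])
for name, (price, minutes) in ranked:
    print(f"{name}: ${price / minutes:.2f} per flight minute")
```

A real selection would of course weigh sensor quality, payload, and software support alongside price, as discussed in the conclusion.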
Table 2. Software for hardware calibration of commercial unmanned aerial vehicle (UAV) platforms.
| Manufacturer | Android Application | iOS Application |
|---|---|---|
| DJI | DJI GO, DJI GO 4 | DJI GO, DJI GO 4 |
| Parrot | FreeFlight 6, FreeFlight Pro | FreeFlight 6, FreeFlight Pro |
| Yuneec | Yuneec Pilot, CGO3 | Yuneec Pilot, CGO |
Table 3. Software for flight mission planning of commercial unmanned aerial vehicle (UAV) platforms.
ApplicationProsConsManufacturer
DJI GS Pro
-
Free software that has a clean and intuitive interface.
-
Allows automated 3D Map Area systems.
-
Seamlessly works with DJI products.
-
Waypoints are limited in 99 maximums.
-
Only compatible with DJI drones and the iPad.
DJI [29]
DroneDeploy
-
User-friendly interface and option for drone mapping.
-
Provides numerous applications by allowing 3rd parties to interface with the collected aerial imagery and generate specific datasets.
-
Relatively fewer robust solutions are provided than other programs.
DroneDeploy [30]
LITCHI
-
Simple to use and missions can be created on a regular PC and android phone, unlike DJI GS Pro.
-
Lack of DJI warranty support.
VC Technology [31]
Pix4D Capture
-
Allows you to create flight plans for capturing image data with easy flight planning, automated pre-flight check, and real-time monitoring.
-
Produces georeferenced maps and models in Pix4D desktop or cloud software easily.
-
User can define the size of a mission to map areas of all sizes.
-
Automated data uploading is available.
-
Unable to plan flight missions that require multi-battery.
-
Unable to set accurate velocity and interval of capturing time in android.
Pix4D [32]
Propeller AeroPoints
-
Efficient data coordination by streamlined data collection and management system.
-
Highly durable in extreme environmental conditions, ground control point (GCP) is provided.
-
Interface is aimed at experienced users.
Propeller Aerobotics [33]
Maps Made Easy
-
Provided for the free package.
-
Easy to use, up to 7500 images can be processed.
-
Multi-battery flight issues.
-
Less straightened interface than other software.
Drones Made Easy [34]
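All of the planners above generate lawnmower-style survey grids from the same underlying geometry: the ground footprint of a single image and the desired front and side overlap determine the spacing between flight lines and between exposures. A minimal sketch of that calculation (the function name and all parameter values are illustrative, not drawn from any of the listed applications):

```python
def mission_spacing(footprint_w_m, footprint_h_m, side_overlap, front_overlap):
    """Return (flight-line spacing, photo interval distance) in metres
    for a lawnmower survey, given the ground footprint of one image
    and the desired fractional overlaps (e.g. 0.7 = 70 %)."""
    line_spacing = footprint_w_m * (1.0 - side_overlap)    # across-track
    photo_spacing = footprint_h_m * (1.0 - front_overlap)  # along-track
    return line_spacing, photo_spacing

# Example: a 40 m x 30 m image footprint with 70 % side / 80 % front overlap
line, photo = mission_spacing(40.0, 30.0, 0.70, 0.80)
# line is ~12 m between flight lines, photo is ~6 m between exposures
```

Higher overlaps shrink both spacings and therefore lengthen the mission, which is why the multi-battery limitations noted for Pix4D Capture and Maps Made Easy matter for large fields.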
Table 5. Multispectral sensors for commercial unmanned aerial vehicle (UAV) platforms.

| Model | Weight (g) | Spectral Bands (Center Wavelength) | Spatial Resolution | Frame Rate |
|---|---|---|---|---|
| MAIA WV | 420 | PURPLE (422.5 nm), BLUE (487.5 nm), GREEN (550 nm), ORANGE (602.5 nm), RED (660 nm), RED EDGE (725 nm), NIR1 (785 nm), NIR2 (887.5 nm), RGB camera | 1280 × 960 | 3 fps at 10 or 12 bits (6 fps at 8 bits) |
| MAIA S2 | 420 | VIOLET (443 nm), BLUE (490 nm), GREEN (560 nm), RED (665 nm), RED EDGE1 (705 nm), RED EDGE2 (740 nm), NIR1 (783 nm), NIR2 (842 nm), NIR3 (865 nm) | 1280 × 960 | 3 fps at 10 or 12 bits (6 fps at 8 bits) |
| MAIA M | 270 | Two bands selected from: VIOLET (422.5 nm), NVIOLET (443 nm), BLUE (487.5 nm), SBLUE (490 nm), GREEN (550 nm), NGREEN (560 nm), YELLOW (602.5 nm), RED (660 nm), NRED (665 nm), H RED EDGE (705 nm), RED EDGE (725 nm), L RED EDGE (740 nm), H NNIR (783 nm), H NIR (785 nm), WNIR (842 nm), L NNIR (865 nm), L NIR (887.5 nm); RGB camera | 1280 × 960 | 3 fps at 10 or 12 bits (6 fps at 8 bits) |
| Parrot Sequoia+ | 72 | GREEN (550 nm), RED (660 nm), RED EDGE (735 nm), NIR (790 nm), RGB camera | 1280 × 960 | 1 fps, 10 bits |
| MicaSense RedEdge-MX | 231.9 | BLUE (475 nm), GREEN (560 nm), RED (668 nm), RED EDGE (717 nm), NIR (840 nm), RGB camera | 1280 × 960 | 1 fps, 12 bits |
| MicaSense ALTUM | 357 | BLUE (475 nm), GREEN (560 nm), RED (668 nm), RED EDGE (717 nm), NIR (840 nm) | 2064 × 1544 | 1 fps, 12 bits |
| Sentera Double 4K Sensor | 80 | BLUE (446 nm), GREEN (548 nm), RED (650 nm), RED EDGE (720 nm), NIR (840 nm) | 1080 × 720 | 30 fps |
| Sentera AGX710 | 270 | BLUE (446 nm), GREEN (548 nm), RED (650 nm), RED EDGE (720 nm), NIR (840 nm) | 1080 × 720 | 30 fps |
| Sentera High Precision Single Sensor | 30 | For the Normalized Difference Vegetation Index (NDVI): RED (625 nm), NIR (850 nm); for the Normalized Difference Red Edge Index (NDREI): RED EDGE (720 nm), NIR (840 nm) | 1248 × 950 | 7 fps |
| Sentera Quad Sensor | 170 | RED (655 nm), RED EDGE (725 nm), NIR (800 nm), RGB camera | 1248 × 950 | 7 fps, 12 bits |

fps = frames per second.
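The pixel counts above translate into plant-level detail only once flight altitude is taken into account, through the ground sampling distance (GSD): the standard photogrammetric relation is GSD = altitude × pixel pitch / focal length. A sketch of that relation (the sensor width, lens, and altitude below are illustrative values, not taken from any listed sensor's datasheet):

```python
def gsd_cm_per_px(altitude_m, focal_length_mm, sensor_width_mm, image_width_px):
    """Ground sampling distance in cm/pixel for a nadir-pointing camera:
    GSD = altitude * pixel_pitch / focal_length."""
    pixel_pitch_mm = sensor_width_mm / image_width_px
    gsd_m = altitude_m * pixel_pitch_mm / focal_length_mm  # mm units cancel
    return gsd_m * 100.0

# Illustrative: a 4.8 mm-wide sensor, 1280 px across, 4 mm lens, flown at 50 m
print(round(gsd_cm_per_px(50.0, 4.0, 4.8, 1280), 2))  # -> 4.69 (cm/pixel)
```

Halving the altitude halves the GSD, which is the usual lever for resolving individual plots or plants with the relatively coarse (1280 × 960 class) multispectral imagers listed above.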
Table 7. Thermal sensors for commercial unmanned aerial vehicle (UAV) platforms.

| Model | Weight (g) | Spectral Range (µm) | Spatial Resolution | Operating Temperature Range (°C) |
|---|---|---|---|---|
| FLIR Vue Pro R | 92–113 | 7.5–13.5 | 336 × 256 | −20 ~ 50 |
| FLIR Vue Pro | 92–113 | 7.5–13.5 | 336 × 256 | −20 ~ 50 |
| FLIR Duo Pro R | 325 | 7.5–13.5 | 336 × 256 | −20 ~ 50 |
| DJI Zenmuse XT2 | 270 | 7.5–13.5 | 640 × 512 or 336 × 256 | −40 ~ 550 |
| Yuneec CGOET | 275 | 8–14 | 1920 × 1080 | −10 ~ 40 |
| Yuneec E10T | 370 | 8–14 | 320 × 256 or 640 × 512 | −10 ~ 40 |
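Thermal imagery from sensors such as these is most often reduced to the Crop Water Stress Index (CWSI, listed later in Table 8), conventionally defined as CWSI = (Tcanopy − Twet) / (Tdry − Twet), where Twet and Tdry are reference temperatures of a fully transpiring and a non-transpiring canopy. A minimal per-pixel sketch (the baseline temperatures below are illustrative):

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop Water Stress Index: 0 = unstressed (wet baseline),
    1 = fully stressed (dry baseline). All inputs in deg C."""
    if t_dry <= t_wet:
        raise ValueError("t_dry must exceed t_wet")
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Illustrative baselines: wet reference 24 degC, dry reference 36 degC
print(cwsi(27.0, 24.0, 36.0))  # -> 0.25
```

In practice the same formula is applied to every pixel of the radiometric frame, which is why radiometric models (e.g., the "R" variants above) are preferred for quantitative stress work.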
Table 8. Examples of options for obtaining physiological and morphological traits by UAVs and multispectral sensors.

| Traits | Recommended UAVs | Sensors | Image Processing Methods | Reference |
|---|---|---|---|---|
| Plant height | All | RGB | DSM − DTM, MGRVI | [67] |
| Vegetation coverage | All | RGB | ExG, ExR | [52,83,84,110] |
| | UAVs with enough payload for the multispectral sensor | Multispectral | NDVI, RVI, DVI, PVI, ARVI, ENDVI, RDVI | [85,90,91,111] |
| Biomass | | RGB | ExG, ExR, MGRVI | [85] |
| | | Multispectral | NDVI, ARVI, ENDVI | [92,94,111] |
| Plant stress | | Multispectral | PRI, NDVI, TCARI/OSAVI, ENDVI, NDREI | [75,83,89,93,94,112,113] |
| Chlorophyll content | | Multispectral | TCARI/OSAVI, NDREI, PVI, CLG, CVI | [91,114] |
| Water status | Parrot ANAFI Thermal, Yuneec H520 | Multispectral | PRI, NDVI, TCARI/OSAVI | [112,115] |
| | | Thermal | CWSI, IG, I3 | [116] |
| Canopy temperature | Parrot ANAFI Thermal, Yuneec H520 | Thermal | - | |
| Transpiration rate | Parrot ANAFI Thermal, Yuneec H520 | Thermal | - | |
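Most of the index-based methods in the table are simple per-pixel band arithmetic. A sketch of three of them using their standard definitions: NDVI = (NIR − RED)/(NIR + RED), MGRVI = (G² − R²)/(G² + R²), and ExG = 2g − r − b on band-sum-normalized RGB. The reflectance values below are illustrative; in practice the same functions run over whole image arrays.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - RED) / (NIR + RED)."""
    return (nir - red) / (nir + red)

def mgrvi(green, red):
    """Modified Green-Red Vegetation Index: (G^2 - R^2) / (G^2 + R^2)."""
    return (green**2 - red**2) / (green**2 + red**2)

def exg(r, g, b):
    """Excess Green index on band-sum-normalized RGB: 2g - r - b."""
    s = r + g + b
    return 2.0 * (g / s) - (r / s) - (b / s)

# Illustrative reflectances for a healthy-vegetation pixel
print(round(ndvi(0.60, 0.20), 3))   # -> 0.5
print(round(mgrvi(0.40, 0.20), 3))  # -> 0.6
print(round(exg(0.20, 0.40, 0.15), 3))
```

RGB-only indices such as ExG and MGRVI are what make the cheaper camera-only platforms in Table 1 usable for coverage and biomass estimates, while NDVI-family indices require the multispectral payloads of Table 5.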
Table 9. Photogrammetry and mapping software and geographic information system (GIS) software for unmanned aerial vehicle (UAV) platforms.

| Type | Software | Pros | Cons | Manufacturer |
|---|---|---|---|---|
| Photogrammetry and mapping software | Agisoft PhotoScan Pro (Metashape) | Wide range of 3D modeling tools, including thermal, NIR, RGB, and advanced multispectral images. Creates digital surface models, point clouds, and accurate measurements. Wide range of supported cameras. Exports georeferenced DSM (digital surface model), DTM (digital terrain model), and orthoimages. Provides a 4D modeling process. | More complicated interface than other software. One software package per computer. | Agisoft [117] |
| | Maps Made Easy | Free package available. Easy to use; up to 7500 images can be processed. | Multi-battery flight issues. Less streamlined interface than other software. | Drones Made Easy [34] |
| | Pix4D Mapper | Automated processes, such as world map image deployment, camera calibration, orthoimage generation, and DSM generation. Users can process image data on multiple platforms, both online and offline. | Topographic maps can be created only through a manual process. | Pix4D [118] |
| | SimActive Correlator3D | Accurate, survey-grade maps. Among the fastest processing speeds of any mapping software currently available. | PC-based software only. | SimActive [119] |
| GIS software | ArcGIS | Standard geospatial analysis software that is fairly easy to use. Robust spatial analysis and advanced statistical tools; effective for handling large amounts of vector data. Many tutorials available online and offline. Compatible with the open-source programming language Python. | High license price: $1500, $7000, or $12,000 per user. Interface is not user-friendly. | Esri [120] |
| | QGIS | Strong at consuming diverse data formats. Open source with no license, so no limit on which tools can be used. | Slow processing time. | QGIS Development Team [121] |
| | ENVI | Simple, user-friendly interface. | No tools for 3D imagery, geocoding, or spatial analysis. | Harris Geospatial Solutions [122] |
| | ERDAS IMAGINE | Wide array of geospatial analysis tools, such as map composition, image enhancement, image classification, raster geographic information system (GIS) modeling, and stereo analysis. Provides light detection and ranging (LiDAR) tools, DSM datasets, spectral image data tools, radar tools, and spatial model editing. | Unable to perform hierarchical object-based image analysis (OBIA). | Hexagon Geospatial [123] |
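The DSM and DTM products exported by this software feed directly into the plant-height method of Table 8: subtracting the terrain model from the surface model yields a canopy height model (CHM). A minimal sketch of that step, with illustrative elevation grids (real workflows operate on georeferenced rasters rather than small lists):

```python
def canopy_height_model(dsm, dtm):
    """Per-cell canopy height model (CHM = DSM - DTM), with both models
    given as equal-shaped row-major lists of elevations in metres."""
    return [[s - t for s, t in zip(srow, trow)]
            for srow, trow in zip(dsm, dtm)]

# Illustrative 2 x 2 grids (elevations in metres above sea level)
dsm = [[101.8, 102.1], [101.2, 101.9]]  # canopy surface from photogrammetry
dtm = [[100.0, 100.1], [100.2, 100.1]]  # bare-earth terrain
chm = canopy_height_model(dsm, dtm)     # per-cell plant heights (~1-2 m)
```

The DTM is usually taken from a bare-soil flight before canopy closure, so co-registration of the two models (e.g., via the GCPs discussed with Propeller AeroPoints) is what limits height accuracy.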

Jang, G.; Kim, J.; Yu, J.-K.; Kim, H.-J.; Kim, Y.; Kim, D.-W.; Kim, K.-H.; Lee, C.W.; Chung, Y.S. Review: Cost-Effective Unmanned Aerial Vehicle (UAV) Platform for Field Plant Breeding Application. Remote Sens. 2020, 12, 998. https://doi.org/10.3390/rs12060998
