Editorial

Editorial for the Special Issue “Estimation of Crop Phenotyping Traits using Unmanned Ground Vehicle and Unmanned Aerial Vehicle Imagery”

1
Institute of Crop Sciences, Chinese Academy of Agricultural Sciences/Key Laboratory of Crop Physiology and Ecology, Ministry of Agriculture, Beijing 100081, China
2
National Engineering Research Center for Information Technology in Agriculture (NERCITA), Beijing 100097, China
3
Institute of Geomatics, University of Natural Resources and Life Sciences (BOKU), Peter Jordan Straße 82, Vienna 1190, Austria
*
Authors to whom correspondence should be addressed.
Remote Sens. 2020, 12(6), 940; https://doi.org/10.3390/rs12060940
Submission received: 6 March 2020 / Accepted: 9 March 2020 / Published: 13 March 2020

Abstract

High-throughput crop phenotyping is harnessing the potential of genomic resources for the genetic improvement of crop production under changing climate conditions. As global food security is not yet assured, crop phenotyping has received increased attention during the past decade. This Special Issue (SI) collects 30 papers reporting research on the estimation of crop phenotyping traits using unmanned ground vehicle (UGV) and unmanned aerial vehicle (UAV) imagery; such platforms were not widely available until recently. The SI presents recent advances in the field, with 22 UAV-based papers and 12 UGV-based articles. It covers 16 papers based on RGB sensors, 11 papers on multi-spectral imagery, and a further 4 papers on hyperspectral and 3D data acquisition systems. A total of 13 plant phenotyping traits, including morphological, structural, and biochemical traits, are covered, and twenty different data processing and machine learning methods are presented. In this way, the SI provides a good overview of potential applications of the platforms and sensors for the timely provision of crop phenotyping traits in a cost-efficient and objective manner. With the fast development of sensor technology and image processing algorithms, we expect that the estimation of crop phenotyping traits in support of crop breeding scientists will gain even more attention in the future.

1. Introduction

Under changing climatic conditions, global food security is challenged. Crop production needs to be increased urgently while coping with limited resources and plant stresses. Crop breeding, the central pillar for yield increases and resource efficiency, is currently limited by breeding efficiency and phenotypic selection [1]. Phenotyping is defined as the application of methodologies and protocols to measure a specific trait, ranging from the cellular level to the whole plant or canopy level, related to plant structure, biochemistry, and function [2,3]. High-throughput crop phenotyping is receiving increased attention for its potential to harness genomic resources in the genetic improvement of crop production under changing climate conditions.
Traditional measurements of phenotyping traits are done manually, requiring considerable expert effort, time, and resources. Remote sensing techniques complement this field work and offer high-throughput estimation of phenotyping traits, in particular given the maturity of unmanned ground vehicle (UGV) and unmanned aerial vehicle (UAV) technology, as well as of micro-sensors and advanced sensors such as multi-/hyperspectral, thermal infrared, and LiDAR sensors. Many experts in remote sensing and plant physiology have recognized the value of UGV/UAV remote sensing for phenotyping, given its rich spectral, spatial, and temporal information [4,5]. UAV remote sensing not only greatly enhances the efficiency of data acquisition, it also eases data standardization, thereby reducing subjective personal assessments. Moreover, image processing and machine learning algorithms are making good progress, including advancements in physically based radiative transfer models, data preprocessing, system and platform testing, and modern machine learning algorithms. All these elements are critical for deriving the target traits from raw data [6].
This Special Issue aims to present the latest innovative research results in the fields of remote sensing technology, sensor technologies, and image processing algorithms, and it also provides a number of specific applications addressing the estimation of specific crop phenotyping traits based on UGV and UAV images.

2. Overview of Contributions

The contributions published in this SI clearly demonstrate the added value that phenotyping provides for various plants (Figure 1) that were monitored by multi-sourced sensors deployed on UGV and UAV platforms. In particular, we wish to highlight the high number of target traits that were addressed by machine learning and image segmentation algorithms (Table 1).

2.1. Platforms and Sensors

High-throughput phenotyping is currently mostly based on remote sensing from near-ground platforms, i.e., ground and low-altitude aerial platforms. Twelve papers in this special issue reported results from ground platforms, and twenty-two papers used UAV platforms.
For ground platforms, fixed scanning systems [15,22,30,35], handheld field measurements [11,14,16,32], mobile ground platforms (MGP) [14,20], and lifting-hoist-based elevated platforms [12,36] were reported for different crop types (Figure 2). These platforms are easy to use and low cost, but data acquisition remains semi-automatic.
The recent development and increasing acceptance of UAVs (drones) in terms of cost and reliability has made data collection much more efficient, with unprecedented spatial, spectral, and temporal detail, thereby supporting their application in mapping phenotyping traits [6]. UAV remote sensing improves the efficiency of data acquisition for crop phenotyping traits. It allows high-throughput measurements of canopy structure (e.g., crop height, leaf angle, leaf area) in more than 1000 plots within roughly one day using RGB stereo imagery, which is clearly far more efficient than traditional manual measurement. Khan et al. [20] also compared estimation accuracy between a mobile ground platform (MGP) and a UAV platform, showing that canopy height estimated from MGP imagery was more accurate than that from UAV imagery, while the opposite held for estimates of canopy vigor.
Besides the platforms, sensors play a very important role in advancing phenotyping. Lightweight sensors can be mounted on UAV platforms, improving data capture quality. In this special issue, applications based on RGB and multi-spectral cameras, hyperspectral sensors, thermal cameras, and light detection and ranging (LiDAR) sensors are shown (Figure 3). To date, RGB imagery for classification or segmentation and multi-spectral imagery for biochemical or physical traits are the most prominently exploited. Hyperspectral sensors and LiDAR mounted on UAV platforms merit further exploitation in the future.
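One reason multi-spectral cameras dominate for biochemical and physiological traits is that simple band combinations, such as the normalized difference vegetation index (NDVI), track canopy greenness and vigor. As a minimal illustration (synthetic reflectance values, not data from any of the contributed papers), NDVI can be computed per pixel from the red and near-infrared bands:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized difference vegetation index from NIR and red reflectance."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)

# Dense green canopy reflects strongly in NIR and absorbs red:
canopy = ndvi(np.array([0.50]), np.array([0.05]))
# Bare soil reflects the two bands much more evenly:
soil = ndvi(np.array([0.30]), np.array([0.25]))
print(canopy, soil)  # canopy NDVI is high, soil NDVI is near zero
```

The same array operation applies band-wise to whole orthomosaics, which is what makes plot-level trait extraction from UAV multi-spectral imagery fast.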

2.2. Phenotyping Traits

For this Special Issue, phenotyping traits are broadly classified into two types: morphological/structural traits and biochemical traits (Figure 4). Morphological and structural traits include canopy height, leaf area, and canopy coverage, as well as the volume, size, diameter, and width of crowns. These structural traits were determined from three-dimensional point clouds derived from RGB imagery or LiDAR scans. Tu et al. [31] and Patrick et al. [8] applied structure from motion (SfM) algorithms to accurately estimate crown height, extent, and plant projective cover in avocado trees and blueberry bushes, respectively. Wang et al. [30] compared three representative 3D data acquisition approaches, namely 3D laser scanning, multi-view stereo reconstruction, and 3D digitizing, with respect to estimates of leaf length, width, inclination angle, azimuth, area, and height in maize. The SI also reports work on seed emergence uniformity [7] and number of spikes [11] in wheat, as well as the number of flowers in oilseed rape [23] and of cotton bolls [28] under various ecological conditions. Concerning biochemical traits, senescence, beet cyst nematode infestation, late blight, water use efficiency, stem water potential, chlorophyll a, and fluorescence were measured by RGB cameras, thermal images, and multi-/hyperspectral images from handheld and UAV platforms. Measurements of fluxes, e.g., leaf gas exchange, were also reported.
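To make the point-cloud workflow concrete: a structural trait such as plot-level canopy height is commonly read off the cloud as the elevation of canopy points above a ground reference, often using a high percentile rather than the maximum to suppress spurious points. The sketch below uses synthetic data; the 95th-percentile choice is an illustrative assumption, not a protocol taken from the cited papers:

```python
import numpy as np

def plot_canopy_height(points_z, ground_z, percentile=95):
    """Estimate plot-level canopy height as a high percentile of point
    elevations above the ground reference. Using a percentile instead of
    the maximum reduces sensitivity to isolated outlier points."""
    heights = np.asarray(points_z, dtype=float) - ground_z
    heights = heights[heights > 0]  # discard ground/below-ground returns
    return float(np.percentile(heights, percentile))

rng = np.random.default_rng(0)
z = 100.0 + rng.uniform(0.0, 0.8, size=5000)  # synthetic canopy points, ~0.8 m crop
z = np.append(z, 103.0)                       # one spurious high point (e.g., a bird)
h = plot_canopy_height(z, ground_z=100.0)
print(h)  # close to the true canopy top despite the outlier
```

In practice the ground reference comes from a digital terrain model, so canopy height is the difference between the surface and terrain models per plot.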

2.3. Data Processing Methods

In the age of big data, data processing methods are at the core of improving the quality and efficiency of the information extracted from data capture systems. The application of various data processing methods is shown in this special issue (Figure 5). The methods are broadly classified into two types, each applied to one of the two above-mentioned types of phenotyping traits. For biochemical traits, machine learning and artificial intelligence methods, including principal component analysis (PCA), partial least squares regression (PLSR), random forest regression (RF), and artificial neural networks (ANN), were usually preferred and recommended [10,14,15,16,17,18,21,23,24,32]. These methods proved very efficient for the various target phenotyping traits. A physical model combined with an optimization algorithm was also considered in one remote sensing application: Thorp et al. [26] used fractional vegetation cover (FVC) to drive a daily ET-based soil water balance model to quantify seasonal crop water use.
In comparison to biochemical traits, morphological and structural traits were widely extracted using image segmentation methods. In the study by Zhou et al. [11], the maximum-entropy method was used for coarse segmentation to recognize wheat spikes. Yeom et al. [28] proposed an automatic open cotton boll detection algorithm based on UAV imagery for yield estimation. These and other image segmentation methods in this special issue were mainly used to extract the target morphological traits while reducing interference from background information. Other morphological traits, such as crown height, extent, volume, and diameter, were mainly derived from three-dimensional point clouds generated by structure from motion algorithms [8,19,30,31,33].
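The core idea of maximum-entropy (Kapur) thresholding is to pick the gray level that maximizes the summed entropies of the background and foreground histogram classes. A simplified single-threshold sketch on synthetic bimodal data follows; the improved multi-sensor algorithm of the cited study is considerably more elaborate:

```python
import numpy as np

def max_entropy_threshold(gray):
    """Kapur's maximum-entropy threshold: choose t maximizing the sum of
    the entropies of the background (< t) and foreground (>= t) classes."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, 255):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 <= 0 or w1 <= 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1   # class-conditional distributions
        h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
        h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# Synthetic bimodal "image": dark soil background, bright spike-like pixels.
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(60, 10, 9000), rng.normal(190, 12, 1000)])
img = np.clip(img, 0, 255)
t = max_entropy_threshold(img)
mask = img > t  # foreground (e.g., spike) mask
print("threshold:", t, "foreground pixels:", int(mask.sum()))
```

The threshold lands between the two intensity modes, separating the bright target pixels from the background without any supervised training.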

3. Conclusions

The 30 papers in this special issue highlight the topical interest in estimating crop phenotyping traits using UGV and UAV imagery. First, the Special Issue and this short editorial show the importance of high-throughput crop phenotyping for crop production. Second, the SI discusses the application of platforms and sensors for high-throughput measurement of crop phenotyping traits. Finally, it provides important hints on how to use data processing methods to estimate crop phenotyping traits for different crop types.
Despite the tremendous progress in the field of phenotyping, there is still ample opportunity for follow-up investigations into some key points. In particular, we recommend more research on the development and application of UGV platforms and on data fusion algorithms combining multi-source data. More research is also warranted on deep learning methods and on linking radiative transfer models with functional-structural models, which are not sufficiently covered in this special issue. With the fast development of sensor technology and image processing algorithms, the above key points will certainly receive more attention from the respective remote sensing, image processing, and crop breeding communities.

Author Contributions

The three authors contributed equally to all aspects of this editorial. All authors have read and agreed to the published version of the manuscript.

Acknowledgments

The Guest Editors would like to thank the authors who contributed to this Special Issue and the reviewers who dedicated their time and provided the authors with valuable and constructive recommendations. They would also like to thank the editorial team of Remote Sensing for their support.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Araus, J.L.; Kefauver, S.C. Breeding to adapt agriculture to climate change: Affordable phenotyping solutions. Curr. Opin. Plant Biol. 2018, 45, 237–247.
  2. Singh, A.; Ganapathysubramanian, B.; Singh, A.K. Machine Learning for High-Throughput Stress Phenotyping in Plants. Trends Plant Sci. 2015, 21, 110–124.
  3. Ghanem, M.E.; Marrou, H.; Sinclair, T.R. Physiological phenotyping of plants for crop improvement. Trends Plant Sci. 2015, 20, 139–144.
  4. Araus, J.L.; Cairns, J.E. Field high-throughput phenotyping: The new crop breeding frontier. Trends Plant Sci. 2014, 19, 52–61.
  5. Araus, J.L.; Kefauver, S.C.; Zaman-Allah, M.; Olsen, M.S.; Cairns, J.E. Translating High-Throughput Phenotyping into Genetic Gain. Trends Plant Sci. 2018, 23, 451–466.
  6. Maes, W.H.; Steppe, K. Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture. Trends Plant Sci. 2019, 24, 152–164.
  7. Liu, T.; Li, R.; Jin, X.; Ding, J.; Zhu, X.; Sun, C.; Guo, W. Evaluation of Seed Emergence Uniformity of Mechanically Sown Wheat with UAV RGB Imagery. Remote Sens. 2017, 9, 1241.
  8. Patrick, A.; Li, C. High Throughput Phenotyping of Blueberry Bush Morphological Traits Using Unmanned Aerial Systems. Remote Sens. 2017, 9, 1250.
  9. Yao, X.; Wang, N.; Liu, Y.; Cheng, T.; Tian, Y.; Chen, Q.; Zhu, Y. Estimation of Wheat LAI at Middle to High Levels Using Unmanned Aerial Vehicle Narrowband Multispectral Imagery. Remote Sens. 2017, 9, 1304.
  10. Yue, J.; Feng, H.; Yang, G.; Li, Z. A Comparison of Regression Techniques for Estimation of Above-Ground Winter Wheat Biomass Using Near-Surface Spectroscopy. Remote Sens. 2018, 10, 66.
  11. Zhou, C.; Liang, D.; Yang, X.; Xu, B.; Yang, G. Recognition of Wheat Spike from Field Based Phenotype Platform Using Multi-Sensor Fusion and Improved Maximum Entropy Segmentation Algorithms. Remote Sens. 2018, 10, 246.
  12. Brocks, S.; Bareth, G. Estimating Barley Biomass with Crop Surface Models from Oblique RGB Imagery. Remote Sens. 2018, 10, 268.
  13. Makanza, R.; Zaman-Allah, M.; Cairns, J.; Magorokosho, C.; Tarekegne, A.; Olsen, M.; Prasanna, B. High-Throughput Phenotyping of Canopy Cover and Senescence in Maize Field Trials Using Aerial Digital Canopy Imaging. Remote Sens. 2018, 10, 330.
  14. Gracia-Romero, A.; Vergara-Díaz, O.; Thierfelder, C.; Cairns, J.; Kefauver, S.; Araus, J. Phenotyping Conservation Agriculture Management Effects on Ground and Aerial Remote Sensing Assessments of Maize Hybrids Performance in Zimbabwe. Remote Sens. 2018, 10, 349.
  15. Ma, X.; Feng, J.; Guan, H.; Liu, G. Prediction of Chlorophyll Content in Different Light Areas of Apple Tree Canopies based on the Color Characteristics of 3D Reconstruction. Remote Sens. 2018, 10, 429.
  16. Joalland, S.; Screpanti, C.; Varella, H.; Reuther, M.; Schwind, M.; Lang, C.; Walter, A.; Liebisch, F. Aerial and Ground Based Sensing of Tolerance to Beet Cyst Nematode in Sugar Beet. Remote Sens. 2018, 10, 787.
  17. Moeckel, T.; Dayananda, S.; Nidamanuri, R.; Nautiyal, S.; Hanumaiah, N.; Buerkert, A.; Wachendorf, M. Estimation of Vegetable Crop Parameter by Multi-temporal UAV-Borne Images. Remote Sens. 2018, 10, 805.
  18. Hassan, M.; Yang, M.; Rasheed, A.; Jin, X.; Xia, X.; Xiao, Y.; He, Z. Time-Series Multispectral Indices from Unmanned Aerial Vehicle Imagery Reveal Senescence Rate in Bread Wheat. Remote Sens. 2018, 10, 809.
  19. Johansen, K.; Raharjo, T.; McCabe, M. Using Multi-Spectral UAV Imagery to Extract Tree Crop Structural Properties and Assess Pruning Effects. Remote Sens. 2018, 10, 854.
  20. Khan, Z.; Chopin, J.; Cai, J.; Eichi, V.; Haefele, S.; Miklavcic, S. Quantitative Estimation of Wheat Phenotyping Traits Using Ground and Aerial Imagery. Remote Sens. 2018, 10, 950.
  21. Yue, J.; Feng, H.; Jin, X.; Yuan, H.; Li, Z.; Zhou, C.; Yang, G.; Tian, Q. A Comparison of Crop Parameters Estimation Using Images from UAV-Mounted Snapshot Hyperspectral Sensor and High-Definition Digital Camera. Remote Sens. 2018, 10, 1138.
  22. Guan, H.; Liu, M.; Ma, X.; Yu, S. Three-Dimensional Reconstruction of Soybean Canopies Using Multisource Imaging for Phenotyping Analysis. Remote Sens. 2018, 10, 1206.
  23. Wan, L.; Li, Y.; Cen, H.; Zhu, J.; Yin, W.; Wu, W.; Zhu, H.; Sun, D.; Zhou, W.; He, Y. Combining UAV-Based Vegetation Indices and Image Classification to Estimate Flower Number in Oilseed Rape. Remote Sens. 2018, 10, 1484.
  24. Duarte-Carvajalino, J.; Alzate, D.; Ramirez, A.; Santa-Sepulveda, J.; Fajardo-Rojas, A.; Soto-Suárez, M. Evaluating Late Blight Severity in Potato Crops Using Unmanned Aerial Vehicles and Machine Learning Algorithms. Remote Sens. 2018, 10, 1513.
  25. Han, L.; Yang, G.; Feng, H.; Zhou, C.; Yang, H.; Xu, B.; Li, Z.; Yang, X. Quantitative Identification of Maize Lodging-Causing Feature Factors Using Unmanned Aerial Vehicle Images and a Nomogram Computation. Remote Sens. 2018, 10, 1528.
  26. Thorp, K.; Thompson, A.; Harders, S.; French, A.; Ward, R. High-Throughput Phenotyping of Crop Water Use Efficiency via Multispectral Drone Imagery and a Daily Soil Water Balance Model. Remote Sens. 2018, 10, 1682.
  27. Michez, A.; Bauwens, S.; Brostaux, Y.; Hiel, M.; Garré, S.; Lejeune, P.; Dumont, B. How Far Can Consumer-Grade UAV RGB Imagery Describe Crop Production? A 3D and Multitemporal Modeling Approach Applied to Zea mays. Remote Sens. 2018, 10, 1798.
  28. Yeom, J.; Jung, J.; Chang, A.; Maeda, M.; Landivar, J. Automated Open Cotton Boll Detection for Yield Estimation Using Unmanned Aircraft Vehicle (UAV) Data. Remote Sens. 2018, 10, 1895.
  29. Ziliani, M.; Parkes, S.; Hoteit, I.; McCabe, M. Intra-Season Crop Height Variability at Commercial Farm Scales Using a Fixed-Wing UAV. Remote Sens. 2018, 10, 2007.
  30. Wang, Y.; Wen, W.; Wu, S.; Wang, C.; Yu, Z.; Guo, X.; Zhao, C. Maize Plant Phenotyping: Comparing 3D Laser Scanning, Multi-View Stereo Reconstruction, and 3D Digitizing Estimates. Remote Sens. 2019, 11, 63.
  31. Tu, Y.; Johansen, K.; Phinn, S.; Robson, A. Measuring Canopy Structure and Condition Using Multi-Spectral UAS Imagery in a Horticultural Environment. Remote Sens. 2019, 11, 269.
  32. Lobos, G.; Escobar-Opazo, A.; Estrada, F.; Romero-Bravo, S.; Garriga, M.; del Pozo, A.; Poblete-Echeverría, C.; Gonzalez-Talice, J.; González-Martinez, L.; Caligari, P. Spectral Reflectance Modeling by Wavelength Selection: Studying the Scope for Blueberry Physiological Breeding under Contrasting Water Supply and Heat Conditions. Remote Sens. 2019, 11, 329.
  33. Wilke, N.; Siegmann, B.; Klingbeil, L.; Burkart, A.; Kraska, T.; Muller, O.; van Doorn, A.; Heinemann, S.; Rascher, U. Quantifying Lodging Percentage and Lodging Severity Using a UAV-Based Canopy Height Model Combined with an Objective Threshold Approach. Remote Sens. 2019, 11, 515.
  34. Feng, L.; Wu, W.; Wang, J.; Zhang, C.; Zhao, Y.; Zhu, S.; He, Y. Wind Field Distribution of Multi-rotor UAV and Its Influence on Spectral Information Acquisition of Rice Canopies. Remote Sens. 2019, 11, 602.
  35. Thompson, A.; Thorp, K.; Conley, M.; Elshikha, D.; French, A.; Andrade-Sanchez, P.; Pauli, D. Comparing Nadir and Multi-Angle View Sensor Technologies for Measuring in-Field Plant Height of Upland Cotton. Remote Sens. 2019, 11, 700.
  36. Ma, X.; Zhu, K.; Guan, H.; Feng, J.; Yu, S.; Liu, G. High-Throughput Phenotyping Analysis of Potted Soybean Plants Using Colorized Depth Images Based on A Proximal Platform. Remote Sens. 2019, 11, 1085.
Figure 1. Crop types included in this special issue.
Figure 2. Platforms included in this special issue.
Figure 3. Sensors included in this special issue.
Figure 4. Crop phenotyping traits included in this special issue.
Figure 5. Data processing methods included in this special issue.
Table 1. Estimation of crop phenotyping traits in this Special Issue. The table distinguishes between crop type, phenotyping platforms, sensors, and methods.
| No. | Crop | Traits | Platforms | Sensors | Methods | Reference |
|-----|------|--------|-----------|---------|---------|-----------|
| 1 | Wheat | Seed emergence uniformity | UAV | RGB | ALA | [7] |
| 2 | Blueberry | Height, extent, canopy area, volume; crown diameter and width | UAV | RGB | Motion algorithms | [8] |
| 3 | Wheat | LAI | UAV | MSI | OLS | [9] |
| 4 | Wheat | AGB | Ground | Hyperspectral | ANN, MLR, DT, BBRT, PLSR, RF, SVM, PCR | [10] |
| 5 | Wheat | Spikes | Field-based phenotype platform | RGB; MSI | MES | [11] |
| 6 | Barley | Fresh/dry biomass | Elevated position | RGB | CSM | [12] |
| 7 | Maize | CC; senescence | UAV | RGB | Senescence index | [13] |
| 8 | Maize | Yield | Ground; UAV | RGB; MSI | MLR | [14] |
| 9 | Apple tree | LCC; 3D reconstruction | Ground fixed | 3D laser scanner | ANN | [15] |
| 10 | Sugar beet | Beet cyst nematode; yield | Handheld and UAV | Hyperspectral; thermal images; HSI | PCR; DT | [16] |
| 11 | Eggplant, tomato, cabbage | Height; biomass | UAV | RGB | RF; OLS | [17] |
| 12 | Wheat | Senescence rate | UAV | MSI | Correlation | [18] |
| 13 | Tree | Crown perimeter; width; height; area; CC | UAV | MSI | Image segmentation | [19] |
| 14 | Wheat | Height; vigor | Ground; UAV | RGB | - | [20] |
| 15 | Wheat | Height; LAI; AGB | UAV | HSI; RGB | RF; PLSR | [21] |
| 16 | Soybean | Height; greenness index | Ground | Photonic mixer detector; RGB | DBSCAN; PCR; ICP | [22] |
| 17 | Oilseed rape | Flower number | UAV | MSI; RGB | RF; OSR | [23] |
| 18 | Potato | Late blight severity | UAV | MSI | MLP, SVR, RF, ANN | [24] |
| 19 | Maize | Lodging | UAV | MSI | NC | [25] |
| 20 | Cotton | WUE, FVC | UAV | MSI | ET model | [26] |
| 21 | Maize | AGB | UAV | RGB + point cloud | CSM; PLSR | [27] |
| 22 | Cotton | Cotton bolls; yield | UAV | RGB | Automatic open cotton boll detection algorithm | [28] |
| 23 | Maize | Height | UAV | RGB; LiDAR | CSM | [29] |
| 24 | Maize | Leaf length; width; inclination angle; azimuth; area; height | Ground | 3D laser scanning; 3D digitizing | - | [30] |
| 25 | Avocado tree | Crown height; extent; CC | UAV | MSI | CSM; OLS; RF | [31] |
| 26 | Blueberry | Stem water potential; Cab; fluorescence; leaf gas exchange | Ground | Hyperspectral | MLR; PLSR | [32] |
| 27 | Barley | Height; lodging percentage; severity | UAV | RGB | SfM | [33] |
| 28 | Rice | CC | UAV | MSI | OLS | [34] |
| 29 | Cotton | Height | Ground; UAV | Nadir/multi-angle view sensor | - | [35] |
| 30 | Soybean | Height; breadth; color | Ground | RGB-D | - | [36] |
Note: Traits: LAI, leaf area index; AGB, above ground biomass; CC, canopy cover; FVC, fractional vegetation cover; LCC, leaf chlorophyll content; WUE, water use efficiency; Cab, leaf chlorophyll a. Platforms: UAV, unmanned aerial vehicles. Sensors: RGB, red-green-blue imagery; MSI, multi-spectral imagery; HSI, hyperspectral imagery; RGB-D, RGB and depth imagery. Methods: ALA, area localization algorithm; ANN, artificial neural network; MLR, multivariable linear regression; DT, decision-tree regression; BBRT, boosted binary regression tree; PLSR, partial least squares regression; RF, random forest regression; SVM, support vector machine regression; PCR, principal component regression; MES, Maximum entropy segmentation; CSM, crop surface models; OLS, ordinary least squares; DBSCAN, density-based spatial clustering of applications with noise; ICP, iterative closest point; OSR, optimal subset regression; MLP, multilayer perceptron; NC, nomogram computation; SfM, structure from motion.

Share and Cite

Jin, X.; Li, Z.; Atzberger, C. Editorial for the Special Issue “Estimation of Crop Phenotyping Traits using Unmanned Ground Vehicle and Unmanned Aerial Vehicle Imagery”. Remote Sens. 2020, 12, 940. https://doi.org/10.3390/rs12060940
