Review

Field Phenotyping Monitoring Systems for High-Throughput: A Survey of Enabling Technologies, Equipment, and Research Challenges

by Huali Yuan 1,2,3,4, Minghan Song 1,2,3,4, Yiming Liu 1,2,3,4, Qi Xie 1,2,3,4, Weixing Cao 1,2,3,4, Yan Zhu 1,2,3,4 and Jun Ni 1,2,3,4,*
1 College of Agriculture, Nanjing Agricultural University, Nanjing 210095, China
2 National Engineering and Technology Center for Information Agriculture, Nanjing 210095, China
3 China Engineering Research Center of Smart Agriculture, Ministry of Education, Nanjing 210095, China
4 Collaborative Innovation Center for Modern Crop Production Co-Sponsored by Province and Ministry, Nanjing 210095, China
* Author to whom correspondence should be addressed.
Agronomy 2023, 13(11), 2832; https://doi.org/10.3390/agronomy13112832
Submission received: 22 October 2023 / Revised: 14 November 2023 / Accepted: 14 November 2023 / Published: 17 November 2023
(This article belongs to the Section Precision and Digital Agriculture)

Abstract:
High-throughput phenotype monitoring systems for field crops can not only accelerate the breeding process but also provide important data support for precision agricultural monitoring. Traditional phenotype monitoring methods for field crops, which rely on manual sampling and measurement, suffer from low efficiency, strong subjectivity, and limited trait coverage. To solve these problems, the rapid monitoring, acquisition, and analysis of phenotyping information of field crops have become a focus of current research. This review explores the systematic framing of phenotype monitoring systems for field crops. Focusing on four aspects, namely phenotyping sensors, mobile platforms, control systems, and phenotyping data preprocessing algorithms, it assesses how sensor technology, the structural design of mobile carriers, intelligent control technology, and data processing algorithms are applied in phenotype monitoring systems. The research status of multi-scale phenotype monitoring products is summarized, and the merits and demerits of various phenotype monitoring systems for field crops in practical application are discussed. Finally, development trends for phenotype monitoring systems for field crops are proposed in terms of sensor integration, platform optimization, standard unification, and algorithm improvement.

1. Introduction

Crop phenotypes refer to all (or some) of the discernible crop characteristics and traits determined or influenced by their genotypes and the environment. These traits include the shape, structure, growth, and pigment content of crop plants [1,2]. By acquiring crop phenotyping information, crops can be characterized from macroscopic to microscopic scales, which allows efficient understanding of the relationship between gene functions and environmental factors. The phenotyping information can also be used to guide germplasm screening in the early stage of breeding and to assess the field performance of various varieties during later popularization and planting. It is also an important basis for the precision management and control of crops [3,4].
Phenotype monitoring of field crops requires acquisition of multi-scale, multi-sequential, and multi-source phenotyping information in a non-invasive, high-throughput manner in the real growth environment of crops. In recent years, research on phenotyping traits of field crops has transformed from low-throughput and extensive monitoring in a single environment to high-throughput precision monitoring for group shapes in complex environments [5]. To meet the demand for high-throughput phenotype monitoring for field crops, researchers have made great efforts to develop diverse phenotype monitoring systems for field crops. In the transition from “Agriculture 1.0” to “Agriculture 4.0”, phenotype monitoring of field crops has gradually developed from artificial measurement using a ruler and a steelyard to multi-scale, high-precision, high-throughput, and intelligent phenotype monitoring modes, as shown in Figure 1.
Phenotype monitoring systems for field crops tend to integrate four parts: phenotyping sensors, mobile platforms, platform control systems, and data processing algorithms, thus achieving high throughput and automatic sampling of phenotyping data (Figure 2). Phenotyping sensors are core devices in phenotype monitoring systems. To adapt to phenotype information monitoring in different spatial domains, multi-scale platforms were employed to carry phenotyping sensors in phenotype monitoring systems. These platforms include Internet of Things (IoT)-based, track-type (gantry or suspension-type), vehicle-mounted, and drone-borne devices.
Based on the motion control systems of these platforms, multi-source phenotype information can be obtained in a short time, and phenotyping traits of crops can be resolved using phenotype data processing algorithms [13].
Internationally, many research institutions and commercial corporations have actively studied crop phenotypes and invested time and funds into the field to build phenotype monitoring systems, to good effect. Typical phenotyping sensors and phenotype monitoring systems, as well as their research and development (R&D) institutions, are displayed in Figure 3. The common physiological phenotyping sensors of crops include the single-leaf SPAD [14], Dualex [15], canopy-level ASDs (Analytical Spectral Devices) Field Spec Pro [16,17,18], CGMD-402 (Crop Growth Monitoring and Diagnosis 402) [19], and so on [20,21,22,23]. Phenotyping sensors of crop morphologies are mainly various types of image collectors. Based on the integration and application of hardware such as these phenotyping sensors, CropDesign in Belgium took the lead in developing a high-throughput phenotyping platform that can evaluate crop traits at large scale, namely the TraitMill system [24,25]. The Plant Accelerator developed by the Australian Plant Phenomics Facility, a leading international research organization on plant phenotypes, is one of the most complex and expensive facilities for plant phenomics studies [26,27]. LemnaTec in Germany developed the Scanalyzer 3D and Greenhouse Scanalyzer, which are high-throughput phenotyping platforms [28,29].

2. Phenotyping Sensors for Field Crops

2.1. Classification of Crop Phenotyping Traits

Yield, resistance, quality, and nutrition are the ultimate aims of modern agriculture, so crop phenotyping traits can be classified into four corresponding types: those relating to yield, resistance, quality, and nutrition. These phenotyping traits are strongly associated with the morphological and structural traits of crops (crop height, crown breadth, coverage, biomass, leaf length, leaf width, and fruit characteristics), as illustrated in Figure 4. These traits can be measured using advanced imaging and spectral technologies [30].
Yield, in essence, reflects biomass and is strongly associated with the harvested organs. The morphological parameters of important organs and key agronomic traits are all strongly associated with yield and have been extensively applied to yield monitoring across a wide range of crops [31].
Phenotyping traits relating to resistance are complex traits expressed under various environmental factors, including biotic stresses (diseases, insect pests, and weeds) and abiotic stresses (drought, salinity-alkalinity, and flooding) unfavorable to crop survival and growth. Analysis of resistance-related phenotyping traits calls for multi-dimensional phenotype information. By acquiring the spectral reflectance of crops across multiple spectral bands and developing specific image analysis algorithms, one can dynamically and quantitatively analyze the phenotyping traits of plants under stress [32].
Phenotyping traits relating to quality are mainly studied by focusing on the morphological and structural variation and physiological and biochemical indices of harvested organs. It is difficult to evaluate quality traits of field crops based on morphological and structural characteristics. Phenotypic traits related to quality are often analyzed through the integration of nutritional contents and morphological features of crop plants. This approach is commonly used to achieve non-destructive testing in agricultural applications [33].
Phenotyping traits relating to nutrition comprehensively reflect the soil nutrient supply, the nutrient demand of crops, and crops' nutrient absorption capacity. Crops lacking certain nutritional elements generally exhibit distinctive phenotyping traits in their appearance, color, and size. Crop phenotyping traits can thus be acquired to aid the diagnosis of the nutritional status of crops, which serves as a basis for agricultural management decisions such as topdressing.

2.2. Common Phenotyping Sensors for Crops

A wide variety of phenotyping sensors are available for monitoring field crops (Table 1). By usage, sensors can be roughly classified into four types: those for monitoring phenotyping traits relating to yield, quality, nutrition, and resistance [34]. By the area of the sensing field, sensors can also be divided into point sensors that detect information at specific points on crops and imaging sensors that provide the spatial distribution of the detected objects.
Point sensors acquire reflectivity in characteristic spectral bands mainly from optical radiation information. Phenotyping parameters of crops can be derived from the strong regularity between reflectivity in characteristic spectral bands and crop phenotype information. Commonly used spectral sensors of this type include the handheld chlorophyll sensor SPAD-502 (Konica Minolta, Tokyo, Japan) [14], the RapidSCAN CS-45 canopy monitor (Holland Scientific, Lincoln, NE, USA) [35], and the GreenSeeker spectrometer based on active light sources (Oklahoma State University and N-tech, Okmulgee, OK, USA) [21]. Spectral sensors show high sensitivity and low cost, but they impose strict lighting requirements and generally must take measurements within specific time frames on sunny days [36].
Spatially distributed imaging-type phenotyping sensors mainly acquire and store image information based on the photoelectric properties of semiconductor elements. Modern phenotyping imaging technology with high resolution can realize the visualization of multi-dimensional and multi-parameter data. Accurate, intuitive, and comprehensive crop phenotype data capture aids in more deeply understanding crop growth characteristics and adaptability to the environment.
Acquisition of visible images using a color digital camera is currently the most widely used imaging technology; it is low-cost and, by analyzing color images, can obtain information including the size, shape, color, and structure of crops. However, such methods require tedious post-processing, sunshine or shading may induce over- or underexposure, and interpretation of the data is complicated [37,38]. Apart from obtaining image information in a single band, multispectral and hyperspectral imaging techniques can also capture the spectral absorption curves of crops. By analyzing images and spectral information, real-time, in situ observation of phenotype information, including spectral vegetation indices such as the normalized difference vegetation index (NDVI) and ratio vegetation index (RVI), can be realized [39,40,41,42,43,44,45,46].
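As an illustration of the spectral indices just mentioned, the sketch below computes NDVI and RVI from near-infrared and red reflectance bands; the function name and sample reflectance values are illustrative, not taken from any cited system.

```python
import numpy as np

def vegetation_indices(nir, red):
    """Compute NDVI and RVI from near-infrared and red reflectance.

    nir, red: arrays of reflectance values in [0, 1], e.g. per-pixel
    bands of a multispectral image. Hypothetical helper for illustration.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    ndvi = (nir - red) / (nir + red)   # normalized difference vegetation index
    rvi = nir / red                    # ratio vegetation index
    return ndvi, rvi

# Dense green canopy reflects strongly in NIR and weakly in red
ndvi, rvi = vegetation_indices([0.45], [0.05])
```

Both indices rise with canopy vigor; NDVI is bounded in [-1, 1], which is why it is the more commonly reported of the two.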
Near-infrared and infrared cameras are digital imaging devices sensitive to electromagnetic waves with wavelengths in the range of 400 to 14,000 nm, offering stable and reliable performance. They are commonly used to monitor phenotyping traits such as grain quality [47] and others [48,49,50]. Thermal cameras detect and visualize the invisible infrared radiation of observed objects, which corresponds to their temperature, and are commonly used to monitor traits including the early thermal reaction [51,52] and lodging resistance [53] of crops under stress; however, such devices are affected by extraneous noise, and errors arise from mixed pixels [54,55,56]. Fluorescence sensors adopt an active measurement method but face difficulties in fluorescence excitation, so their in situ application is limited [57]. Depth-sensing cameras can output depth, amplitude, and intensity images and have been widely applied to crop phenotype monitoring to solve problems arising from leaf occlusion [58,59,60,61,62]. Lidar scanners, characterized by high precision and strong anti-jamming capability, acquire 3D point cloud data by scanning the crop canopy or plants and, by analyzing the point cloud, obtain parameters including canopy height [63] and others [64,65].
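As a minimal illustration of deriving canopy height from lidar point clouds, the sketch below uses a common percentile-based approach (a low percentile as ground level, a high percentile as canopy top); the function and parameter choices are assumptions for illustration, not any specific scanner's algorithm.

```python
import numpy as np

def canopy_height(points_z, ground_pct=1, top_pct=99):
    """Estimate canopy height from the z-coordinates of a point cloud.

    Taking percentiles instead of min/max makes the estimate robust to
    stray ground returns and noise points above the canopy.
    """
    z = np.asarray(points_z, dtype=float)
    ground = np.percentile(z, ground_pct)   # approximate ground level
    top = np.percentile(z, top_pct)         # approximate canopy top
    return top - ground

# Toy point cloud: heights uniformly spread from 0 m to 1 m
z = np.linspace(0.0, 1.0, 101)
h = canopy_height(z)
```

In practice the point cloud would first be cropped to a plot or a single plant before applying such a height estimate.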
Imaging-type phenotyping sensors, which show high efficiency and strong visual interpretability, can acquire large-scale image information in a short time. However, the large volume of acquired image data limits real-time transmission, and results generally require offline processing by professionals using dedicated software, introducing significant delays.

3. Mobile Phenotyping Platforms for Field Crops

The rapid development of aviation, automation, and electronic information technologies has provided many conveniences for the development of phenotype monitoring systems for field crops. To accurately and continuously collect phenotype information from close range to long distance at multiple scales, including single leaves or plant organs, single plants, small plots, and whole farms, different types of platforms have been developed. They are mainly classified into IoT-based, track-type, vehicle-mounted, and drone-borne types (Figure 5).

3.1. IoT-Based Platforms

IoT-based platforms generally refer to IoT systems that realize the interconnection and verification of multi-location data [97] and connect various information acquisition sensors, including temperature sensors, infrared and light sensors, and RGB or spectral cameras, via wireless communication methods such as Wi-Fi, ZigBee, and LoRa. IoT-based platforms use independent small field workstations to monitor the growth environment parameters of crops in plots and crop phenotype information, offering advantages including easy installation and flexible layout [98]. Ji Zhou developed CropQuant [95], a mature small IoT-based crop phenotype monitoring system. The system integrates multiple sensors on industrial single-board computers to form a scalable multi-point monitoring network for large-scale crops; it dynamically supports high-resolution, multi-point acquisition of crop microenvironment data and multiple key growth phenotypes. The platform has been applied to phenotyping the spike zone of wheat (Triticum aestivum) in the field [99]. Masayuki Hirafuji developed an IoT-based open field server (OpenFS), which integrates multiple low-cost sensors and uses the cloud service X (formerly Twitter) to achieve long-term monitoring of different field environments. The platform was deployed in an orangery in Japan and collected environmental and phenotype data [100]. Stations in IoT-based platforms can be dynamically allocated according to demand and can be networked flexibly and conveniently. However, a single IoT-based platform covers only a small area, and a whole plot cannot be monitored unless a network is formed. In addition, such platforms acquire information by sampling and cannot achieve full coverage of all individuals.

3.2. Track-Type Mobile Platforms

Track-type platforms scan and monitor crops via fixed tracks built in the field, along which sensor systems are driven by motors and cables. They can achieve all-weather measurement within the fixed area without physical contact with soil or crops, and they avoid the mechanical shaking typical of ground-based mobile platforms. Track-type platforms are therefore ideal field platforms for collecting high-resolution phenotype information.
The first commercialized high-throughput track-type phenotyping platform for field crops was developed by PhenoSpex in the Netherlands. With fixed tracks paved in the field, the platform drives gantry cranes along them using a drive motor for movement and scanning, achieving fully automatic, high-throughput measurement across a 16 m × 200 m range [70]. LemnaTec developed the Field Scanalyzer, a track-type high-throughput phenotyping platform for field crops whose main body frame is a gantry crane measuring 125 m × 15 m × 6 m; the platform can perform 24 h high-resolution automatic monitoring across a 10 m × 110 m range [10,101]. ETH Zurich created FIP (Field Phenotyping Platform), a cable-suspended track-type multi-sensor platform that can cover a rectangular field of one hectare, with four 24 m poles erected at the corners of the field. Pulleys at the top and winches at the bottom of the poles drive the cables, moving the integrated sensor device carried by the cables to scan and monitor the crops.
Track-type platforms can carry multiple types of sensors and move flexibly and stably above the monitoring area, which overcomes the difficulty ground-based mobile platforms have in crossing crop rows. In addition, these platforms are only slightly disturbed by external factors (terrain and vibration) and show high positioning accuracy and repeatability. However, customizing track-type platforms is expensive, and subsequent maintenance is difficult: professional teams are required for installation, debugging, operation, maintenance, and later analysis. Consequently, it is generally difficult to apply such platforms to large-scale, multi-location breeding and cultivation projects [102,103].

3.3. Vehicle-Mounted Mobile Platforms

Vehicle-mounted mobile platforms mainly comprise commercial agricultural tractors [91,92,93,104], independently developed trolleys, and mini-robot chassis or chassis with large ground clearance [85,88,89,105,106]. Sensors are arranged on such carriers according to the crop variety, cultivation agronomy, and growth stage. These platforms are also equipped with data storage and a global positioning system (GPS); they carry a large payload and can substantially improve phenotype monitoring efficiency [92,107]. Vehicle-mounted platforms based on agricultural machinery are relatively easy to implement, reducing labor intensity and improving working efficiency; however, the large volume of agricultural machinery gives these platforms poor field trafficability and commonly causes soil compaction, thus damaging the crops. Because of the limited chassis height, they are mainly applicable to the early growth stages of low-growing crops such as wheat and cotton (Gossypium hirsutum) [108]. Additionally, most vehicle-mounted platforms are powered by internal combustion engines, so the vehicle body and spray boom vibrate, which hinders accurate data acquisition and limits high-throughput phenotyping [109]. Independently developed, hand-pushed platforms reduce research and development costs and, thanks to their lightweight architecture, soil compaction; however, they still require manual operation, and their stop-measure-move mode and slow response cannot guarantee efficient collection of crop trait data.
To further reduce cost, enhance field trafficability, and improve automation and measurement accuracy, researchers have begun to use mini-robot chassis or independently designed mobile platforms with high ground clearance to carry phenotyping sensors and acquisition systems [110,111]. Mini robots, with their small size, light weight, mature technology, and easy refitting, have been widely applied to field phenotype monitoring, and phenotyping research has been conducted on crops with large row spacing, such as corn (Zea mays) and broomcorn [84]. Limited by their ground clearance and bearing capacity, mini robots generally run between crop rows with large spacing and struggle with cross-row monitoring in fields with small row spacing [77,80,83,112,113].
Mobile platforms with high ground clearance or adjustable chassis can solve the problems of the aforementioned carriers, improve field trafficability, and achieve cross-row scanning and monitoring; such platforms have therefore become a research hotspot in recent years [75,77,114]. Tabile et al. [74,115] developed a field agronomy information collection platform with a ground clearance of 1.8 m and a sleeve-type wheel track adjustment device, which allows manual adjustment of the wheel track according to plant morphology and row spacing. Likewise, various field phenotype monitoring platforms such as MARS X [116], Ted [77], and MYCE Vigne [114] adopt a gantry layout characterized by high ground clearance, simple structure, and strong field trafficability, which is a common carrier structure for phenotype monitoring. Due to the interaction of field environmental factors and high-density planting, the motion of vehicle-mounted platforms in the field still faces many limitations. Limited by body size and ground clearance, mini robots with low chassis lack universality, whereas mobile carriers with high ground clearance or adjustable chassis significantly improve field trafficability and universality. In addition, by using an open trusswork or integrating sensors at the front of the platform, these carriers avoid casting shadows on the monitored crops; however, vibration caused by soil heterogeneity still exerts certain adverse effects on phenotype monitoring [76,111,117,118,119,120].

3.4. Drone-Borne Mobile Platforms

Drone-borne phenotyping platforms carry diverse lightweight sensors on fixed-wing or multi-rotor drones and use technologies including remote communication to realize rapid, lossless acquisition of phenotype information [121,122,123]. Drone-borne mobile platforms overcome the above limitations relating to platform acquisition speed and the field environment, and they are also flexibly controlled, portable, and inexpensive. Hence, they have been widely applied to monitoring phenotype information over large field areas [124,125].
Li et al. used a small electric drone, Free Bird, as the carrier of a remote-sensing platform; it takes off by being thrown and lands by skidding on the ground. With a payload capacity of 0.4 kg, the drone carries a Ricoh GXR A12 non-mapping digital camera, which acquired image features of corn lodging during the grain-filling stage, and the corn lodging area was extracted using an image analysis method [126]. By mounting image and GPS sensors on a two-degree-of-freedom (DOF) gimbal at the nose of a fixed-wing drone, Andrea S. Laliberte collected images of field crops over a total area of 130 ha and developed segmentation and classification rule sets, realizing high-accuracy classification at the crop level [66]. Using a four-rotor drone, the team led by Zhu installed a single-axis gimbal carrying an RGB camera, obtained aerial images of a corn (Zea mays L.) population at the seedling stage in the field, and constructed a structural model of the canopy [127].
Because they cannot hover and fly at high speed, fixed-wing drones place onerous requirements on sensor sensitivity; for this reason, they are less commonly used in crop phenotype monitoring platforms. Thanks to advantages including portability, hovering capability, freedom from terrain limitations, and rapid acquisition of phenotype information over large ranges, rotary-wing drones have become the first choice for crop phenotype monitoring. Despite these advantages, rotary-wing drones still have some as-yet insurmountable defects, including low payload capacity, short endurance, and susceptibility to weather conditions, which remain the main obstacles to their wider application.

4. Phenotype Monitoring Control System for Field Crops

Motion control systems are core components for the motion and task execution of phenotyping platforms for field crops and are also key to achieving the consistency and validity of phenotype monitoring data. The motion control system of a field phenotyping platform is generally composed of three parts: the actuator driver, the controller, and the navigation and pose sensors. The controller receives input signals from the sensors, runs the motion control algorithm, and outputs commands that adjust the actuators so as to keep the platform's parameters at the required motion state. The control algorithm, implemented by the controller and its computer program, realizes accurate motion control strategies and is therefore key to controller design. The advantages and limitations of common control algorithms and controllers are listed in Table 2.

4.1. Motion Control Algorithms of Phenotyping Platforms for Field Crops

4.1.1. PID Control Algorithm

The proportional-integral-derivative (PID) control algorithm is a closed-loop algorithm commonly used in control systems. The PID algorithm shows favorable control characteristics, places few requirements on system models, and is easily implemented. It has been successfully applied to controlling the motion speed, navigation, and flight attitude of phenotyping platforms for field crops.
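The discrete PID law described above can be sketched as follows; the gains, time step, and toy wheel-speed plant are illustrative assumptions, not parameters from the cited platforms.

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt            # accumulate integral term
        derivative = 0.0 if self.prev_error is None else \
            (error - self.prev_error) / self.dt     # finite-difference derivative
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy example: drive a first-order wheel-speed model toward a 1.0 m/s setpoint
pid = PID(kp=2.0, ki=0.5, kd=0.05, dt=0.1)
speed = 0.0
for _ in range(300):
    u = pid.update(1.0, speed)
    speed += (u - speed) * 0.1                      # illustrative plant dynamics
```

In the loop, the integral term removes the steady-state offset that proportional action alone would leave, so the modeled speed settles at the setpoint.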
Kang [128] used the PID control algorithm to control the wheel speed of a crop phenotype information acquisition platform in order to keep its motion speed stationary and improve the accuracy of the collected data; the relay feedback method was also adopted to achieve online self-tuning of PID parameters, and tests verified that the self-tuned PID algorithm can precisely control the wheel speed of phenotyping platforms. Bakker et al. [129] developed a robot platform for sugar beet fields based on RTK-GPS, which achieves precise control over the wheel speed via its controller. Zhang et al. [130] designed a control system for agricultural four-wheel-independent-drive robots and applied the PID control algorithm to analyze and verify the effectiveness of the four-wheel-independent-steering control strategy; in steering over 0° to 360°, the maximum mean absolute error (MAE) of the rotation angle is 0.1°, indicating high steering-angle control accuracy. Based on a multi-rotor drone platform and a PID double closed-loop control strategy, Liao et al. [68] rapidly adjusted motor speed and guaranteed the stability and balance of the drone's pose; the drone also shows strong anti-jamming performance and meets the requirements of collecting field phenotype information at low altitude.
The PID control algorithm does not rely on a mathematical model, shows high robustness, has a small steady-state error, and performs well in linear systems. However, a phenotyping platform is a non-linear system with a large time delay, and its pose is easily affected by environmental factors including the platform's center-of-gravity position and road conditions.

4.1.2. Fuzzy Control Algorithm

Fuzzy control is a control technique based on fuzzy mathematical theory. Because it does not require an accurate mathematical model of the controlled object, the algorithm offers high controllability, adaptability, and rationality. Thus, fuzzy control has become an important branch of field mobile platform control. It is highly applicable to agricultural production processes whose models are difficult to obtain and whose dynamic characteristics are difficult to master or change greatly.
Ding et al. [131] obtained the status information of a field information acquisition platform, which served as the input to the motion controller, using low-precision Beidou positioning modules, electronic compasses, rotary encoders, and angle sensors. By constructing a motion controller for lateral correction and longitudinal constant-speed walking, both were achieved during the platform's travel, so that the stability of the platform met the demands of field information acquisition. Kanan et al. [132] designed an agricultural vehicle for field environment detection and used a fuzzy controller to vary the driving wheel speeds, enabling the vehicle to reach the expected steering angle and improving its motion efficiency. To allow field robots to travel between crop rows, Bengochea-Guevara et al. [133] devised two fuzzy controllers, one for steering control and the other for speed control; tests showed that the fuzzy controllers enable the robots to follow crop rows without rolling over the crops. At present, mobile platforms based on fuzzy controllers generally take the lateral deviation and heading deviation of the current location from the target path as controller inputs, with the wheel speed difference or expected steering angle as the output. Fuzzy control algorithms are generally built on expert experience and knowledge; they can rapidly compensate for systematic errors and retain high inherent stability, but the following error near the zero position is often too large to be corrected quickly.
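The fuzzy controller structure described above (lateral and heading deviation in, expected steering angle out) can be sketched in a minimal Sugeno-style form; the membership ranges, rule table, and output singletons below are illustrative assumptions, not values from the cited work.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_steering(lateral_dev, heading_dev):
    """Map lateral and heading deviation (m, rad) to a steering angle (deg)."""
    # Fuzzify each input into Negative / Zero / Positive memberships
    def fuzzify(x):
        return {"N": tri(x, -2.0, -1.0, 0.0),
                "Z": tri(x, -1.0, 0.0, 1.0),
                "P": tri(x, 0.0, 1.0, 2.0)}
    lat, head = fuzzify(lateral_dev), fuzzify(heading_dev)
    # Singleton outputs (degrees): steer against the combined deviation
    singleton = {"N": -15.0, "Z": 0.0, "P": 15.0}
    # Rule table: (lateral set, heading set) -> output set
    rules = {("N", "N"): "P", ("N", "Z"): "P", ("N", "P"): "Z",
             ("Z", "N"): "P", ("Z", "Z"): "Z", ("Z", "P"): "N",
             ("P", "N"): "Z", ("P", "Z"): "N", ("P", "P"): "N"}
    num = den = 0.0
    for (ls, hs), out in rules.items():
        w = min(lat[ls], head[hs])          # rule firing strength
        num += w * singleton[out]
        den += w
    return num / den if den else 0.0        # weighted-average defuzzification
```

On the path (zero deviations) the controller outputs zero steering; a pure positive lateral deviation fires the (P, Z) rule and steers fully back toward the path.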

4.1.3. Neural Network Control Algorithm

Neural network control, which uses a neural network to model non-linear objects in the control system, shows strong applicability and learning ability. Given the complexity of the agricultural environment, neural network technology can make reasonable and accurate decisions, exert control, and learn despite the uncertainty of the control system and the changing environment. Neural network control is one of the important technologies for the intelligent development of mobile platforms for field crops.
Jodas et al. [134] developed a navigation system that guides mobile robots along paths in plantations, using a neural network algorithm to search for the most appropriate path with an accuracy rate of 90%. Eski et al. [135] used a neural-network-model-based PID control algorithm to control unmanned agricultural vehicles, and the transient and steady-state responses of the control mechanism were evaluated. Chen et al. [136] established a 4-4-4-3 BP neural network using the distance from the target path, heading angle, steering angle, and variation in steering angle of an agricultural vehicle as inputs, with the distance from the target path, heading angle, and variation in steering angle at the next sampling point as outputs. The algorithm achieves high-accuracy straight-line driving in the field, with 95% of absolute deviations less than 50 mm. Neural network control does not need an accurate mathematical model, shows strong non-linear fitting ability, and is easily implemented on a computer. When controlling the navigation of a field mobile platform, the deviation is generally used as the input and the expected steering angle as the output to train the neural network; alternatively, the neural network learns and optimizes the proportional, integral, and derivative coefficients of a PID controller to improve its accuracy. The limitation is that training neural networks requires many samples, and the network output carries uncertainty.
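A toy version of such a deviation-to-steering-angle network can be sketched with plain backpropagation; here the training targets come from a synthetic linear steering law, not from data of the cited vehicles.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: inputs are (lateral deviation, heading deviation),
# targets come from an assumed linear "teacher" steering law.
X = rng.uniform(-1, 1, size=(500, 2))
y = (-0.8 * X[:, 0] - 0.5 * X[:, 1])[:, None]    # target steering angle (rad)

# One hidden layer of 8 tanh units, linear output
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.1
for _ in range(2000):                            # full-batch backpropagation
    h = np.tanh(X @ W1 + b1)                     # forward pass
    pred = h @ W2 + b2
    err = pred - y
    dW2 = h.T @ err / len(X); db2 = err.mean(0)  # output-layer gradients
    dh = (err @ W2.T) * (1 - h ** 2)             # backprop through tanh
    dW1 = X.T @ dh / len(X); db1 = dh.mean(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

The same structure scales to the multi-input, multi-output mappings used in the cited vehicle controllers; the practical caveats noted above (sample demand, output uncertainty) apply regardless of network size.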
Existing research on the control of mobile platforms in the field mainly focuses on the steering, navigation, and path planning of platforms. The PID control structure is simple and has been the most widely applied; however, the PID control algorithm is only applicable to linear systems, and there is a trade-off between its response time and overshoot. Although fuzzy control is applicable to non-linear systems, it exhibits low accuracy and needs some experience-based judgment. In recent years, many path-tracking control strategies have therefore combined PID control with a fuzzy control algorithm that tunes the PID gains. For course-following control, the domain of discourse of the fuzzy control system is fixed, and many control rules remain idle during operation, which leads to low accuracy; fuzzy control methods thus still have scope for improvement in course-following control. Neural networks show strong fault tolerance and adaptive learning characteristics. They can better analyze and integrate perceptual information in the unstructured field environment and are an important way of improving the navigation and path planning of platforms.
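The fuzzy-PID combination discussed above can be sketched schematically: a standard PID loop whose gains are rescaled by coarse, rule-based adjustments according to the error magnitude. This is a simplified stand-in for a full fuzzy inference system; the first-order heading plant, gain values, and rule thresholds below are illustrative assumptions rather than parameters from any cited system.

```python
class FuzzyPID:
    """PID controller with rule-based gain adjustment (a coarse stand-in
    for a full fuzzy inference system)."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def _gains(self, err):
        # Fuzzy-style rules: large error -> stronger proportional action,
        # weaker integral (limits windup); small error -> the reverse.
        e = abs(err)
        if e > 0.5:
            return self.kp * 1.5, self.ki * 0.5, self.kd
        if e > 0.1:
            return self.kp, self.ki, self.kd
        return self.kp * 0.8, self.ki * 1.5, self.kd

    def update(self, err):
        kp, ki, kd = self._gains(err)
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return kp * err + ki * self.integral + kd * deriv

# Illustrative first-order "heading" plant: x' = -x + u, Euler-integrated.
dt, x, target = 0.05, 0.0, 1.0
ctrl = FuzzyPID(kp=2.0, ki=0.5, kd=0.1, dt=dt)
for _ in range(400):
    u = ctrl.update(target - x)
    x += (-x + u) * dt
print(f"final heading error: {abs(target - x):.4f}")
```

The integral action drives the steady-state error toward zero, while the rule table plays the role that the fuzzy rule base plays in the hybrid controllers surveyed above.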

4.2. Motion Controllers of Phenotyping Platforms for Field Crops

Controllers are core components of control systems used to control and monitor the various devices and elements in the system. The performance of the controller directly influences the reliability of the control system, the data processing speed, and the timeliness of data acquisition. In the control system of a phenotyping platform for field crops, industrial personal computers (IPCs), programmable logic controllers (PLCs), or single-board computers generally serve as controllers to acquire data recorded by sensors and to generate and output control instructions.
Luo et al. [137] developed an intelligent mobile agricultural working platform and designed a navigation control system based on GPS and an electronic compass, using an IPC as the upper computer and a single-board computer as the lower computer; the system is a useful step in research on working platforms for precision agriculture. Taking a Raspberry Pi single-board computer as the core controller, Zhang compiled an autonomous navigation control program in Python and developed a human–computer interaction interface based on HTML5, which achieves ridge walking, autonomous navigation, and fast acquisition of agricultural information. Bak et al. [138] designed a robot platform for collecting field information in which a PC serves as the master controller, an RS232 serial port receives status information from the RTK-DGPS, directional gyroscope, and geomagnetic compass, and a CAN-bus motor controller drives the four-wheel steering of the robot, achieving centimeter-level path-following accuracy. The phenotyping platform for field crops developed by Lu [139] uses an outdoor PLC controller with fuzzy-PID four-wheel cooperative motion control and carries RGB and thermal infrared cameras to obtain phenotype information of cotton. Using the ARM9 embedded mini2440 master controller and a Linux operating system, Zhao et al. [140] designed a variable-structure method to prevent integral saturation in PID controllers; combined with an adaptive filtering algorithm, the method improves the stability and accuracy of the navigation system on agricultural robot platforms. Based on a multi-rotor drone platform, Liao devised a flight control system with an STM32F407 as the master controller, which can adjust the pose within 1 to 2 s in the field environment and shows strong anti-jamming performance, meeting the requirement for acquiring phenotype information of field crops from multi-rotor drones at low altitude. Sabanci et al. [141] developed a PLC-based power chassis control system for collecting field information, which processes the acquired image data on a host computer and carries out field operations through its actuating mechanism.
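Several of the platforms above pass positioning data to the master controller over a serial link as ASCII sentences. As a hedged illustration of the controller-side parsing step, the following sketch decodes a GGA sentence in the standard NMEA-0183 format that RTK-DGPS receivers commonly emit; the example sentence is made up for demonstration, not data from any cited system, and checksum verification is omitted.

```python
def parse_gga(sentence: str) -> dict:
    """Parse an NMEA-0183 $GPGGA sentence into decimal-degree lat/lon.

    GGA fields (after the talker ID): UTC time, latitude, N/S,
    longitude, E/W, fix quality, satellite count, HDOP, altitude, ...
    """
    body, _, _checksum = sentence.strip().lstrip("$").partition("*")
    fields = body.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")

    def dm_to_deg(dm: str, hemi: str) -> float:
        # NMEA encodes angles as ddmm.mmmm (lat) / dddmm.mmmm (lon).
        head, minutes = dm.split(".")
        deg, mins = int(head[:-2]), float(head[-2:] + "." + minutes)
        val = deg + mins / 60.0
        return -val if hemi in ("S", "W") else val

    return {
        "time_utc": fields[1],
        "lat": dm_to_deg(fields[2], fields[3]),
        "lon": dm_to_deg(fields[4], fields[5]),
        "fix_quality": int(fields[6]),   # 4/5 = RTK fixed/float solution
        "satellites": int(fields[7]),
    }

# Made-up example sentence (checksum not verified in this sketch).
fix = parse_gga("$GPGGA,060130.00,3202.1234,N,11850.5678,E,4,18,0.7,15.2,M,,M,,*5C")
print(fix["lat"], fix["lon"], fix["fix_quality"])
```

On a real platform, the sentence string would arrive over the RS232 port (for example, via a serial library) before being parsed by a routine of this kind.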
Phenotyping platforms for field crops run in complex environments and integrate diverse sensors, which places high requirements on the timeliness, reliability, and compatibility of the controllers. PLC controllers offer multiple advantages including simple programming, a low failure rate, robustness, and convenient use and maintenance, so they can serve long-term deployments in harsh field environments; they have become the preferred controller for track-type phenotyping platforms for field crops. Single-board computers are highly real-time, fast, and widely applicable and are mainly used on drone-borne and IoT-based phenotyping platforms. IPCs are stable, reliable, and highly compatible and suit vehicle-mounted platforms in complex environments. Controllers on phenotyping platforms for field crops have become an important tool for promoting automatic phenotype monitoring: they can not only reduce the labor intensity of agricultural production but also improve the efficiency of information acquisition.

5. Phenotype Data Processing Algorithms for Field Crops

5.1. Phenotype Data Processing Technologies

With the development of artificial intelligence (AI) algorithms, intelligent data processing methods including machine learning and deep learning have been applied to processing large volumes of crop phenotype image data, enabling fully automatic and accurate resolution of phenotype information. These algorithms have shown powerful advantages in image classification, identification, feature extraction, and high-throughput automatic resolution of phenotyping traits in research on crop phenotypes. The commonly used data processing technologies for crop phenotypes include machine vision, 3D reconstruction, machine learning, and deep learning. Table 3 lists the commonly used phenotype data analysis technologies and their corresponding applications.
Machine vision simulates the human visual system using theories including image processing, image identification, and image analysis. It is characterized by real-time performance, high positioning accuracy, and the ability to enhance intelligent systems. Machine vision is generally applied to four types of analysis, namely identification, classification, evaluation, and prediction, and it can acquire phenotype parameters including leaf length, leaf width, area, and perimeter. However, due to interference from factors including illumination differences and shading, the processing and analysis of phenotype images of some crops still face difficulty in feature design and limited capability in complex tasks; in particular, machine vision struggles with the overlapping and shading of adjacent leaves, spikes, and fruits.
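The kind of measurement a machine vision pipeline makes after segmentation can be sketched minimally in NumPy: given a binary leaf mask, estimate area, bounding-box length/width, and a simple boundary-pixel perimeter. The synthetic rectangular "leaf" and the pixel-to-millimeter scale below are illustrative assumptions; real pipelines would first segment the leaf from an RGB image and use a calibrated scale.

```python
import numpy as np

def leaf_metrics(mask: np.ndarray, mm_per_px: float = 1.0) -> dict:
    """Basic shape traits from a binary segmentation mask."""
    ys, xs = np.nonzero(mask)
    area_px = len(ys)
    length_px = ys.max() - ys.min() + 1          # bounding-box extent
    width_px = xs.max() - xs.min() + 1
    # Perimeter estimate: foreground pixels with at least one 4-neighbour
    # outside the mask (a coarse boundary count, not a true arc length).
    padded = np.pad(mask, 1)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    perimeter_px = int((mask & ~interior).sum())
    return {
        "area_mm2": area_px * mm_per_px ** 2,
        "length_mm": length_px * mm_per_px,
        "width_mm": width_px * mm_per_px,
        "perimeter_mm": perimeter_px * mm_per_px,
    }

# Synthetic 20x10-pixel rectangular "leaf" at an assumed 0.5 mm/pixel scale.
mask = np.zeros((30, 30), dtype=bool)
mask[5:25, 10:20] = True
m = leaf_metrics(mask, mm_per_px=0.5)
print(m)
```

The overlapping-leaf problem noted above shows up precisely here: when two leaves merge into one connected region in the mask, these per-leaf metrics become unreliable.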
Given the volume and complexity of phenotype image data, deep learning has been extensively applied to phenotyping research on diverse crops because of its powerful feature extraction and modeling capacity. Deep learning extracts high-level target features, thus significantly improving target identification and detection accuracy under the complex conditions of real environments. It has, for example, been used to estimate spike density in wheat populations in the field and to predict national- and county-level corn and soybean (Glycine max) yields [142].
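The feature-extraction step at the heart of the deep learning detectors discussed above can be illustrated with a single hand-coded convolution + ReLU + max-pool stage in NumPy. Real detectors stack many learned layers; the fixed vertical-edge kernel and the toy "spike" image here are purely illustrative assumptions.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid-mode 2-D cross-correlation (the 'conv' layer of a CNN)."""
    kh, kw = kernel.shape
    h, w = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            out[y, x] = np.sum(img[y:y+kh, x:x+kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, k=2):
    """Non-overlapping k x k max pooling."""
    h, w = x.shape[0] // k, x.shape[1] // k
    return x[:h*k, :w*k].reshape(h, k, w, k).max(axis=(1, 3))

# Toy 8x8 "image" with a bright vertical stripe (a stand-in for a spike).
img = np.zeros((8, 8))
img[:, 4] = 1.0
vertical_edge = np.array([[-1.0, 0.0, 1.0]] * 3)  # responds to vertical edges

fmap = max_pool(relu(conv2d(img, vertical_edge)))
print(fmap)
```

In a trained network, many such kernels are learned from data rather than fixed by hand, which is what gives deep models their robustness under the variable illumination and backgrounds of field imagery.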
Three-dimensional reconstruction is an important tool for describing the full morphological structure of crops and can be applied to a wide variety of crops. However, factors including differences in the features of reconstructed objects, the difficulty of data extraction, and the high price of 3D scanners restrict, to some extent, the development of 3D reconstruction technology.
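The core geometric step behind SfM-style 3D reconstruction is triangulation: recovering a 3D point from its projections in two calibrated views. Below is a minimal linear (DLT) triangulation sketch; the camera matrices and the test point are made up for illustration, and a full SfM pipeline would additionally estimate the camera poses themselves from image correspondences.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point from two 3x4 camera
    matrices and its pixel coordinates in each view."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)     # null vector of A is the homogeneous point
    X = Vt[-1]
    return X[:3] / X[3]             # homogeneous -> Euclidean

def project(P, X):
    """Pinhole projection of a 3-D point to pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Made-up calibrated cameras: a reference view and one shifted along x.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

X_true = np.array([0.2, -0.1, 2.0])    # a point 2 m in front of the cameras
X_hat = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
print(X_hat)
```

Repeating this over thousands of matched feature points yields the point clouds from which traits such as plant height [146,147] and canopy morphology [148] are measured.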
Table 3. Phenotyping technologies of crops and their application cases.
| Phenotyping Technologies | Phenotyping Methods | Phenotype Parameters | Crops |
| --- | --- | --- | --- |
| Machine vision | Convolutional neural network | Plant height, variety classification [143], and wheat spike identification [144,145] | Potato, wheat, and broomcorn |
| Deep learning and machine vision | Deep convolutional neural network (DCNN) | Number of stems, phenotypes of stem width, and yield trait | Broomcorn, sugarcane, cereal, corn, and lettuce |
| | Support vector machine (SVM) | Canopy coverage, vegetation index, and flowering phenotype detection | Cotton |
| | Artificial neural network (ANN) | Green area index (GAI) | Wheat |
| Three-dimensional reconstruction | Structure from motion (SFM) | Plant height [146,147] and crop morphology [148] | Corn and wheat |

5.2. Phenotype Data Processing and Management Software

In recent years, various high-throughput phenotyping analysis platforms have been equipped with powerful data processing and management software offering functions including data acquisition and storage, data analysis, and information mining. Such software integrates massive raw data to derive crop phenotype parameters, mine biologically meaningful information, deepen phenotype and genetic research, and support precise field management. The commonly used phenotype data processing and management software is listed in Table 4. This software can automatically or semi-automatically extract digital features of shape, size, color, and spectral characteristics from complex images of many crops including corn, barley (Hordeum vulgare), Arabidopsis thaliana, and wheat. It integrates the complete image analysis workflow from processing to descriptive statistics and can run on platforms including OS X, Windows, and Linux. At present, most commercial phenotypic data analysis software is tied to specific hardware platforms; in addition, the extracted phenotype data are relatively limited in scope, the installation and maintenance costs of the software are high, and it is difficult to operate. These problems hinder the development of phenotype data analysis tools towards universality, practicality, and standardization and restrict their wide application.

6. Pending Problems

Over the years, the emergence of novel phenotyping sensors, intelligent monitoring systems, and digital processing methods has provided ample carriers and technologies for the fast, accurate, and non-invasive monitoring of phenotypes relating to morphological and physical characteristics of whole plants or canopies. However, these sensors, monitoring systems, and methods also remain to be further improved from the following perspectives:
  • Lack of R&D and integration technologies of novel phenotyping sensors. Breakthroughs remain to be made in the R&D and field application of low-cost phenotyping sensors for monitoring traits relating to the resistance and nutrition of crops. Most imaging-type phenotyping sensors are not applicable to the dynamic phenotype monitoring of field crops and cannot overcome sensor shaking due to platform vibration, so the collected images are blurred and distorted. A single sensor can only acquire limited data, while the use of multiple sensors together faces technological problems pertaining to system standards and synchronous calibration. Moreover, technological problems relating to the integration of multi-source phenotype information at different scales in different growth stages also pose a challenge for phenotype research teams.
  • Urgent need to develop low-cost and highly applicable phenotyping platforms. Phenotyping platforms for field crops generally use specific commercial software for hardware control, data management, and trait analysis, whose purchase and maintenance costs are prohibitive; the platforms and sensor systems themselves can also cost tens of thousands of dollars. In addition, some phenotyping platforms for field crops are designed for specific crops and agronomic traits, which limits their use on other crops and on plots with different agronomic designs. Changes in plant height and size over the crop growth process further limit the use of platforms across all growth stages.
  • Incomplete development standards for phenotype monitoring systems. Definite development standards are unavailable for various modules including the sensor acquisition, communication transmission, and data analysis, so that software and hardware systems of many phenotype monitoring systems follow different development and application standards. This limits the secondary development and promotion of the technology.
  • Timeliness of data processing to be improved. The interactions between field crops and their environments are complex, and soils are heterogeneous, so external environmental factors can affect the stability and accuracy of phenotype monitoring systems for field crops in navigation, positioning, target detection, and data transmission. Limited by computer hardware and influenced by algorithms and software, the data processing and phenotyping trait extraction of monitoring systems are mainly performed offline, making timeliness and online control difficult to ensure.

7. Prospects

The limitations of phenotype monitoring systems for field crops are all inevitable problems influencing practical field application. The main development and solution directions of future research into crop phenotypes include:
  • Multi-sensor integration and multi-source data fusion. A ground-based automatic acquisition system (e.g., swarm robots) for phenotype information needs to be established, and a multi-dimensional phenotype information acquisition system combining ground-based and aerial platforms is suggested to be deployed. This can realize data acquisition with full spatial coverage and improve the data throughput of multi-scale monitoring systems. A multi-sensor integrated system needs to be developed to achieve high-integration and high-resolution phenotype collection with strong anti-jamming performance and to fully integrate traits recorded by these sensors, so as to realize parallel tests of multiple parameters. Multi-source phenotype data should be further mined, arranged, and visualized. Additionally, multi-source data fusion methods should be explored to acquire the correspondence between genetic characteristics and presentation of phenotyping traits of crops.
  • Optimizing platform mechanisms, improving data quality, and enhancing the field applicability of platforms. The mobile structures of platforms should be redesigned to enhance vibration resistance and stability and to improve the accuracy of data collected on complex terrain. Automatically adjustable devices or modular mechanism designs can make platforms adaptive to different planting systems, including plant height, row spacing, and field layout, so that they can operate flexibly in various environments and acquire phenotype information about different crops.
  • Building a unified, open, standardized technological system. Cooperation between developers of phenotyping platforms and sensors should be strengthened to form unified, open platform and sensor standards and to give more researchers opportunities for secondary development; this would also provide technological support for multi-sensor integration and intelligent data acquisition by platforms. For the acquired multi-source data, normalized and standardized processing procedures and data management systems should be established to support information processing technologies including data storage, sharing, analysis, and decision making.
  • Optimizing and upgrading data processing software. Processing software should be developed to meet the demand for efficient data analysis in the context of big data. The application of emerging technologies such as machine learning and AI to the sensing and control of phenotyping platforms should be explored to understand scenes and extract phenotyping traits more efficiently. Novel data processing algorithms should be combined to further improve the speed and accuracy of automatic information processing of monitoring systems in practical production environments with varying illumination and backgrounds, so as to achieve high-quality, online, real-time data processing.

8. Conclusions

High-throughput, automated, high-resolution crop phenotyping platforms and analysis technologies are key to accelerating crop improvement and breeding processes, increasing the yield, and enhancing the resistance to disease. However, overcoming the complexities of the field environment, rapidly obtaining complex traits pertaining to the crop yield, resistance, quality, and nutrition, and storing and analyzing multi-sequence and multi-source high-throughput phenotypic data in real time remain challenges in the development of current phenotypic techniques. To solve these problems, for phenotypic monitoring technologies, multi-sensor integrated systems should be developed, so as to achieve the goals of high integration and high resolution; for phenotyping platforms, the mechanisms should be optimized and the development standard should be unified; for the motion control system of platforms, high-accuracy and automated control systems need to be constructed for field crops; for data processing, real-time and efficient algorithms for parsing and managing phenotypic parameters should be developed. With the further development of relevant technologies in the future, high-throughput, low-cost plant phenotypic information collection technologies and platforms will develop from experimental research into production and application and form a relevant industry based on associated technologies. This may help to promote the creation of a new state of the art for genomics as applied to precision agriculture.

Author Contributions

Conceptualization, H.Y. and J.N.; data curation, H.Y. and Q.X.; formal analysis, H.Y. and M.S.; funding acquisition, J.N. and Y.Z.; investigation, H.Y., Y.L., Q.X. and M.S.; methodology, J.N. and H.Y.; project administration, J.N.; resources, W.C. and Y.Z.; software, H.Y. and M.S.; supervision, W.C. and Y.Z.; validation, Y.L.; visualization, H.Y.; writing—original draft, H.Y.; writing—review and editing, J.N. and H.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by the National Key Research and Development Program of China (grant number 2021YFD2000105), the Modern Agricultural Machinery Equipment and Technology Demonstration and Promotion of Jiangsu Province (grant number NJ2021-58), and the Primary Research and Development Plan of Jiangsu Province of China (grant number BE2019306, BE2021304).

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

We would like to thank all the researchers in the Intelligent Equipment Research Group of the National Engineering and Technology Center for Information Agriculture and all the foundations for this research.

Conflicts of Interest

We declare that we do not have any commercial or associative interests that represent conflicts of interest in connection with the work submitted.

References

  1. Morisse, M.; Wells, D.M.; Millet, E.J.; Lillemo, M.; Fahrner, S.; Cellini, F.; Lootens, P.; Muller, O.; Herrera, J.M.; Bentley, A.R.; et al. A European perspective on opportunities and demands for field-based crop phenotyping. Field Crops Res. 2022, 276, 108371. [Google Scholar] [CrossRef]
  2. Ying-Hong, P. Analysis of Concepts and Categories of Plant Phenome and Phenomics. Acta Agron. Sin. 2015, 41, 175–186. [Google Scholar] [CrossRef]
  3. Sheikh, M.; Iqra, F.; Ambreen, H.; Pravin, K.A.; Ikra, M.; Chung, Y.S. Integrating artificial intelligence and high-throughput phenotyping for crop improvement. J. Integr. Agric. 2023. [Google Scholar] [CrossRef]
  4. Yang, W.; Feng, H.; Zhang, X.; Zhang, J.; Doonan, J.H.; Batchelor, W.D.; Xiong, L.; Yan, J. Crop Phenomics and High-Throughput Phenotyping: Past Decades, Current Challenges, and Future Perspectives. Mol. Plant 2020, 13, 187–214. [Google Scholar] [CrossRef]
  5. Jin, X.; Yang, W.; Doonan, J.H.; Atzberger, C. Crop phenotyping studies with application to crop monitoring. Crop J. 2022, 10, 1221–1223. [Google Scholar] [CrossRef]
  6. Lijin to Carry Out Targeted Field Management and Agricultural Gas Services in Wheat Field Greening Period. Available online: http://sd.cma.gov.cn/gslb/dysqxj/xwzx/gzdt/202103/t20210308_2907884.html (accessed on 21 October 2023).
  7. It is Not Difficult to Wear AR Glasses to Tour the Fields Accurately and Quickly to Identify Pests and Diseases—Financial Headlines. Available online: https://t.cj.sina.com.cn/articles/view/7517400647/1c0126e4705904gq4m (accessed on 24 October 2023).
  8. Jimenez-Berni, J.A.; Deery, D.M.; Rozas-Larraondo, P.; Condon, A.T.G.; Rebetzke, G.J.; James, R.A.; Bovill, W.D.; Furbank, R.T.; Sirault, X.R.R. High Throughput Determination of Plant Height, Ground Cover, and Above-Ground Biomass in Wheat with LiDAR. Front. Plant Sci. 2018, 9, 237. [Google Scholar] [CrossRef]
  9. Barker, J.; Zhang, N.; Sharon, J.; Steeves, R.; Wang, X.; Wei, Y.; Poland, J. Development of a field-based high-throughput mobile phenotyping platform. Comput. Electron. Agric. 2016, 122, 74–85. [Google Scholar] [CrossRef]
  10. Virlet, N.; Sabermanesh, K.; Sadeghi-Tehran, P.; Hawkesford, M.J. Field Scanalyzer: An automated robotic field phenotyping platform for detailed crop monitoring. Funct. Plant Biol. 2017, 44, 143. [Google Scholar] [CrossRef]
  11. Field Flux Robot—Adigo AS. Available online: https://www.adigo.no/portfolio/field-flux-robot-2-2/?lang=en (accessed on 21 October 2023).
  12. Caturegli, L.; Corniglia, M.; Gaetani, M.; Grossi, N.; Magni, S.; Migliazzi, M.; Angelini, L.; Mazzoncini, M.; Silvestri, N.; Fontanelli, M.; et al. Unmanned Aerial Vehicle to Estimate Nitrogen Status of Turfgrasses. PLoS ONE 2016, 11, e158268. [Google Scholar] [CrossRef]
  13. Borges, C.S.; Chakraborty, S.; Weindorf, D.C.; Lopes, G.; Guilherme, L.R.G.; Curi, N.; Li, B.; Ribeiro, B.T. Pocket-sized sensor for controlled, quantitative and instantaneous color acquisition of plant leaves. J. Plant Physiol. 2022, 272, 153686. [Google Scholar] [CrossRef]
  14. Markwell, J.; Osterman, J.C.; Mitchell, J.L. Calibration of the Minolta SPAD-502 leaf chlorophyll meter. Photosynth. Res. 1995, 46, 467–472. [Google Scholar] [CrossRef]
  15. Li Zhenhai, W.J.H.P. Modelling of crop chlorophyll content based on Dualex. Trans. Chin. Soc. Agric. Eng. 2015, 31, 191–197. [Google Scholar]
  16. Danner, M.; Locherer, M.; Hank, T.; Richter, K. Spectral Sampling with the ASD FIELDSPEC 4; GFZ Data Services: Potsdam, Germany, 2015. [Google Scholar]
  17. Dallon, D. Comparison of the Analytical Spectral Devices FieldSpec Pro JR and the Apogee/StellarNet Model SPEC-PAR/NIR Spectroradiometers; Crop Physiology Laboratory: Logan, UT, USA, 2003. [Google Scholar]
  18. Kuester, M.; Thome, K.; Krause, K.; Canham, K.; Whittington, E. Comparison of Surface Reflectance Measurements from Three ASD FieldSpec FR Spectroradiometers and One ASD FieldSpec VNIR Spectroradiometer; IEEE: Piscataway, NJ, USA, 2001; pp. 72–74. [Google Scholar]
  19. Jia, H.; Yan, G.; Lijun, W.; Yan, Z.; Ben, Z.; Laigang, W. Monitor Model of Corn Leaf Area Index Based on CGMD-402. Trans. Chin. Soc. Agric. Mach. 2019, 50, 187–194. [Google Scholar]
  20. Chen, Q.; Zhang, Z.; Liu, P.; Wang, X.; Jiang, F. Monitoring of Growth Parameters of Sweet Corn Using CGMD302 Spectrometer. Agric. Sci. Technol. 2015, 16, 364. [Google Scholar]
  21. Jordan, B.S.; Branch, W.D.; Coffin, A.W.; Smith, C.M.; Culbreath, A.K. Comparison of Trimble GreenSeeker and Crop Circle (Model ACS-210) Reflectance Meters for Assessment of Severity of Late Leaf Spot. Peanut Sci. 2019, 46, 110–117. [Google Scholar] [CrossRef]
  22. Aranguren, M.; Castellón, A.; Aizpurua, A. Crop sensor based non-destructive estimation of nitrogen nutritional status, yield, and grain protein content in wheat. Agriculture 2020, 10, 148. [Google Scholar] [CrossRef]
  23. Wei, F.; Yonghua, W.; Yingxin, X.; Guozhang, K.; Yunji, Z.; Tiancai, G. Review of Study on Technique of Crop Nitrogen Diagnosis. Chin. Agric. Sci. Bull. 2008, 179–185. Available online: https://kns.cnki.net/kcms2/article/abstract?v=Pk5Eu7LuuI5lpLK-B3loP2_Cov4MdKdf4fwkP4Qmejj0TxnBv_ALCFRazqaCHgL2vD4e5Xq6AdT58g_Byp4YkZEJ-FF6xf4e5Cn-zBOofsCAUPOpU6u-pgNLoXKFRBW1&uniplatform=NZKPT&language=CHS (accessed on 21 October 2023).
  24. Reuzeau, C.; Pen, J.; Frankard, V.; Wolf, J.D.; Camp, W.V. TraitMill: A Discovery Engine for Identifying Yield-enhancement Genes in Cereals. Mol. Plant Breed. 2005, 3. Available online: https://kns.cnki.net/kcms2/article/abstract?v=Pk5Eu7LuuI54t2gI-sFfh-Qh3lWLI7G9Q2XgvoHDJ0e8E8zDdx6uhXkRHhiSGsG5fIf4LejZJ9uZMy_XEiVjhllHYWiKJhO9MviPmWY6D--dz-2TajovB_Ao84LKbGHR&uniplatform=NZKPT&language=CHS (accessed on 21 October 2023). [CrossRef]
  25. Reuzeau, C. TraitMill (TM): A high throughput functional genomics platform for the phenotypic analysis of cereals. Vitr. Cell. Dev. Biol. Anim. 2007, 43, S4. [Google Scholar]
  26. Furbank, R.T.; Tester, M. Phenomics—Technologies to relieve the phenotyping bottleneck. Trends Plant Sci. 2011, 16, 635–644. [Google Scholar] [CrossRef]
  27. Johansen, K.; Morton, M.J.L.; Malbeteau, Y.M.; Aragon, B.; Al-Mashharawi, S.K.; Ziliani, M.G.; Angel, Y.; Fiene, G.M.; Negrão, S.S.C.; Mousa, M.A.A.; et al. Unmanned Aerial Vehicle-Based Phenotyping Using Morphometric and Spectral Analysis Can Quantify Responses of Wild Tomato Plants to Salinity Stress. Front. Plant Sci. 2019, 10, 370. [Google Scholar] [CrossRef]
  28. Chen, D.; Neumann, K.; Friedel, S.; Kilian, B.; Chen, M.; Altmann, T.; Klukasa, C. Dissecting the Phenotypic Components of Crop Plant Growth and Drought Responses Based on High-Throughput Image Analysis. Plant Cell 2014, 26, 4636–4655. [Google Scholar] [CrossRef] [PubMed]
  29. Arvidsson, S.; Perez-Rodriguez, P.; Mueller-Roeber, B. A growth phenotyping pipeline for Arabidopsis thaliana integrating image analysis and rosette area modeling for robust quantification of genotype effects. New Phytol. 2011, 191, 895–907. [Google Scholar] [CrossRef] [PubMed]
  30. Brief Discussion on Plant Phenotypic Characters. Available online: https://mp.weixin.qq.com/s?__biz=MzU2NzI1NjkzNw==&mid=2247507575&idx=1&sn=1b2849ebee61fcc5d52b89abf8f03996&chksm=fc9d6c71cbeae567c3c25a9255d29db5881c01b646b19122b17d400040dd415215ee6115b69a&scene=27 (accessed on 21 October 2023).
  31. Studnicki, M.; Wijata, M.; Sobczyński, G.; Samborski, S.; Gozdowski, D.; Rozbicki, J. Effect of genotype, environment and crop management on yield and quality traits in spring wheat. J. Cereal Sci. 2016, 72, 30–37. [Google Scholar] [CrossRef]
  32. Ye, J.; Zhong, T.; Zhang, D.; Ma, C.; Wang, L.; Yao, L.; Zhang, Q.; Zhu, M.; Xu, M. The Auxin-Regulated Protein ZmAuxRP1 Coordinates the Balance between Root Growth and Stalk Rot Disease Resistance in Maize. Mol. Plant 2019, 12, 360–373. [Google Scholar] [CrossRef]
  33. Xiu-ying, H.; Yao-ping, L.; Yong-sheng, C.; Zhao-ming, C.; Yue-han, C. Reviews and prospects for the research of rice grain quality. Rice Res. Inst. Guangdong Acad. Agric. Sci. 2009, 1, 11–16. [Google Scholar] [CrossRef]
  34. Tripodi, P.; Massa, D.; Venezia, A.; Cardi, T. Sensing technologies for precision phenotyping in vegetable crops: Current status and future challenges. Agronomy 2018, 8, 57. [Google Scholar] [CrossRef]
  35. Paiao, G.; Fernández, F.; Spackman, J.; Kaiser, D.; Weisberg, S. Ground-based optical canopy sensing technologies for corn nitrogen management in the Upper Midwest. Agron. J. 2020, 112, 2998–3011. [Google Scholar] [CrossRef]
  36. Jin, X.; Zarco-Tejada, P.J.; Schmidhalter, U.; Reynolds, M.P.; Hawkesford, M.J.; Varshney, R.K.; Yang, T.; Nie, C.; Li, Z.; Ming, B.; et al. High-Throughput Estimation of Crop Traits: A Review of Ground and Aerial Phenotyping Platforms. IEEE Geosci. Rem. Sen. M. 2021, 9, 200–231. [Google Scholar] [CrossRef]
  37. Yuanqi, Z.; Dunliang, W.; Chen, C.; Rui, L.I.; Dongshuang, L.I.; Tao, L.; Chengming, S.; Xiaochun, Z.; Shengping, L.; Dawei, D. Prediction of wheat yield based on color index and texture feature index of unmanned aerial vehicle RGB image. J. Yangzhou Univ. (Agric. Life Sci. Ed.) 2021, 42, 110–116. [Google Scholar] [CrossRef]
  38. Bowman, B.C.; Chen, J.; Zhang, J.; Wheeler, J.; Wang, Y.; Zhao, W.; Nayak, S.; Heslot, N.; Bockelman, H.; Bonman, J.M. Evaluating Grain Yield in Spring Wheat with Canopy Spectral Reflectance. Crop Sci. 2015, 55, 1881–1890. [Google Scholar] [CrossRef]
  39. Yaxiao, N.; Liyuan, Z.; Wenting, H.; Guomin, S. Fractional Vegetation Cover Extraction Method of Winter Wheat Based on UAV Remote Sensing and Vegetation Index. Trans. Chin. Soc. Agric. Mach. 2018, 49, 212–221. [Google Scholar] [CrossRef]
  40. Qian, W.; Hong, S.; Minzan, L.; Yuanyuan, S.; Yane, Z. Research on precise segmentation and chlorophyll diagnosis of maize multispectral images. Spectrosc. Spect. Anal. 2015, 35, 178–183. [Google Scholar]
  41. Jun, S.; Xiaming, J.; Hanping, M.; Xiaohong, W.; Wenjing, Z.; Xiaodong, Z.; Hongyan, G. Detection of nitrogen content in lettuce leaves based on spectroscopy and texture using hyperspectral imaging technology. Trans. Chin. Soc. Agric. Eng. 2014, 30, 167–173. [Google Scholar] [CrossRef]
  42. Choi, S.K.; Lee, S.K.; Jung, S.H.; Choi, J.W.; Choi, D.Y.; Chun, S.J. Estimation of Fractional Vegetation Cover in Sand Dunes Using Multi-spectral Images from Fixed-wing UAV. J. Korean Soc. Surv. Geod. Photogramm. Cartogr. 2016, 34, 431–441. [Google Scholar] [CrossRef]
  43. Hong, S.; Tao, Z.; Ning, L.; Meng, C.; Minzan, L.; Qin, Z. Vertical distribution of chlorophyll in potato plants based on hyperspectral imaging. Trans. Chin. Soc. Agric. Eng. 2018, 34, 149–156. [Google Scholar] [CrossRef]
  44. Zhang, Y.; Xia, C.; Zhang, X.; Cheng, X.; Feng, G.; Wang, Y.; Gao, Q. Estimating the maize biomass by crop height and narrowband vegetation indices derived from UAV-based hyperspectral images. Ecol. Indic. 2021, 129, 107985. [Google Scholar] [CrossRef]
  45. Yang, W.; Nigon, T.; Hao, Z.; Dias Paiao, G.; Fernández, F.G.; Mulla, D.; Yang, C. Estimation of corn yield based on hyperspectral imagery and convolutional neural network. Comput. Electron. Agric. 2021, 184, 106092. [Google Scholar] [CrossRef]
  46. Feng, L.; Zhang, Z.; Ma, Y.; Du, Q.; Williams, P.; Drewry, J.; Luck, B. Alfalfa Yield Prediction Using UAV-Based Hyperspectral Imagery and Ensemble Learning. Remote Sens. 2020, 12, 2028. [Google Scholar] [CrossRef]
  47. Hang, Y.; Lifu, Z.; Qingxi, T. Identification of corn seed varieties using visible/near infrared imaging spectroscopy. Infrared Laser Eng. 2013, 42, 2437–2441. [Google Scholar]
  48. Jian, Z.; Jin, M.; BiQuan, Z.; Dongyan, Z.; Jing, X. Prediction of chlorophyll (SPAD) distribution in rice leaves by consumer near-infrared cameras. Spectrosc. Spect. Anal. 2018, 38, 737–744. [Google Scholar]
  49. Xie, L.; Ying, Y.; Ying, T. Quantification of chlorophyll content and classification of nontransgenic and transgenic tomato leaves using visible/near-infrared diffuse reflectance spectroscopy. J. Agric. Food Chem. 2007, 55, 4645–4650. [Google Scholar] [CrossRef] [PubMed]
  50. Cozzolino, D. The role of near-infrared sensors to measure water relationships in crops and plants. Appl. Spectrosc. Rev. 2017, 52, 837–849. [Google Scholar] [CrossRef]
  51. Han, M.; Zhang, H.; DeJonge, K.C.; Comas, L.H.; Trout, T.J. Estimating maize water stress by standard deviation of canopy temperature in thermal imagery. Agric. Water Manag. 2016, 177, 400–409. [Google Scholar] [CrossRef]
  52. Giménez-Gallego, J.; González-Teruel, J.D.; Soto-Valles, F.; Jiménez-Buendía, M.; Navarro-Hellín, H.; Torres-Sánchez, R. Intelligent thermal image-based sensor for affordable measurement of crop canopy temperature. Comput. Electron. Agric. 2021, 188, 106319. [Google Scholar] [CrossRef]
  53. Biswal, S.; Chatterjee, C.; Mailapalli, D.R. Damage Assessment Due to Wheat Lodging Using UAV-Based Multispectral and Thermal Imageries. J. Indian. Soc. Remote 2023, 51, 935–948. [Google Scholar] [CrossRef]
  54. Pradawet, C.; Khongdee, N.; Pansak, W.; Spreer, W.; Hilger, T.; Cadisch, G. Thermal imaging for assessment of maize water stress and yield prediction under drought conditions. J. Agron. Crop Sci. 2023, 209, 56–70. [Google Scholar] [CrossRef]
  55. Guo, J.; Tian, G.; Zhou, Y.; Wang, M.; Ling, N.; Shen, Q.; Guo, S. Evaluation of the grain yield and nitrogen nutrient status of wheat (Triticum aestivum L.) using thermal imaging. Field Crop Res. 2016, 196, 463–472. [Google Scholar] [CrossRef]
  56. Elsherbiny, O.; Zhou, L.; Feng, L.; Qiu, Z. Integration of Visible and Thermal Imagery with an Artificial Neural Network Approach for Robust Forecasting of Canopy Water Content in Rice. Remote Sens. 2021, 13, 1785. [Google Scholar] [CrossRef]
  57. Song, X.; Yang, G.; Yang, C.; Wang, J.; Cui, B. Spatial Variability Analysis of Within-Field Winter Wheat Nitrogen and Grain Quality Using Canopy Fluorescence Sensor Measurements. Remote Sens. 2017, 9, 237. [Google Scholar] [CrossRef]
  58. Wang, J.; Zhang, Y.; Gu, R. Research Status and Prospects on Plant Canopy Structure Measurement Using Visual Sensors Based on Three-Dimensional Reconstruction. Agriculture 2020, 10, 462. [Google Scholar] [CrossRef]
  59. Ma, X.; Zhu, K.; Guan, H.; Feng, J.; Yu, S.; Liu, G. High-Throughput Phenotyping Analysis of Potted Soybean Plants Using Colorized Depth Images Based on A Proximal Platform. Remote Sens. 2019, 11, 1085. [Google Scholar] [CrossRef]
  60. Xiang, L.; Bao, Y.; Tang, L.; Ortiz, D.; Salas-Fernandez, M.G. Automated morphological traits extraction for sorghum plants via 3D point cloud data analysis. Comput. Electron. Agric. 2019, 162, 951–961. [Google Scholar] [CrossRef]
  61. Gai, J.; Xiang, L.; Tang, L. Using a depth camera for crop row detection and mapping for under-canopy navigation of agricultural robotic vehicle. Comput. Electron. Agric. 2021, 188, 106301. [Google Scholar] [CrossRef]
  62. Andújar, D.; Fernández-Quintanilla, C.; Dorado, J. Matching the Best Viewing Angle in Depth Cameras for Biomass Estimation Based on Poplar Seedling Geometry. Sensors 2015, 15, 12999–13011. [Google Scholar] [CrossRef]
  63. Walter, J.D.; Edwards, J.; McDonald, G.; Kuchel, H. Estimating biomass and canopy height with LiDAR for field crop breeding. Front. Plant Sci. 2019, 10, 1145. [Google Scholar] [CrossRef]
  64. Liu, S.; Baret, F.; Abichou, M.; Boudon, F.; Thomas, S.; Zhao, K.; Fournier, C.; Andrieu, B.; Irfan, K.; Hemmerlé, M. Estimating wheat green area index from ground-based LiDAR measurement using a 3D canopy structure model. Agric. For. Meteorol. 2017, 247, 12–20. [Google Scholar] [CrossRef]
  65. Wu, L.; Zhu, X.; Lawes, R.; Dunkerley, D.; Zhang, H. Comparison of machine learning algorithms for classification of LiDAR points for characterization of canola canopy structure. Int. J. Remote Sens. 2019, 40, 5973–5991. [Google Scholar] [CrossRef]
  66. Laliberte, A.S.; Rango, A. Image Processing and Classification Procedures for Analysis of Sub-decimeter Imagery Acquired with an Unmanned Aircraft over Arid Rangelands. GIScience Remote Sens. 2011, 48, 4–23. [Google Scholar] [CrossRef]
67. Huichun, Z.; Hongping, Z.; Jiaqiang, Z.; Yufeng, G.; Yangxian, L. Research Progress and Prospect in Plant Phenotyping Platform and Image Analysis Technology. Trans. Chin. Soc. Agric. Mach. 2020, 51, 1–17. [Google Scholar]
  68. Yihua, L.; Tiemin, Z.; Yubin, L. Design and test of attitude stabilization control system of multi-rotor unmanned aerial vehicle applied in farmland information acquisition. Trans. Chin. Soc. Agric. Eng. 2017, 33, 88–98. [Google Scholar] [CrossRef]
  69. Schirrmann, M.; Giebel, A.; Gleiniger, F.; Pflanz, M.; Lentschke, J.; Dammer, K. Monitoring Agronomic Parameters of Winter Wheat Crops with Low-Cost UAV Imagery. Remote Sens. 2016, 8, 706. [Google Scholar] [CrossRef]
  70. Vadez, V.; Kholová, J.; Hummel, G.; Zhokhavets, U.; Gupta, S.K.; Hash, C.T. LeasyScan: A novel concept combining 3D imaging and lysimetry for high-throughput phenotyping of traits controlling plant water budget. J. Exp. Bot. 2015, 66, 5581–5593. [Google Scholar] [CrossRef]
  71. Kirchgessner, N.; Liebisch, F.; Yu, K.; Pfeifer, J.; Friedli, M.; Hund, A.; Walter, A. The ETH field phenotyping platform FIP: A cable-suspended multi-sensor system. Funct. Plant Biol. 2017, 44, 154. [Google Scholar] [CrossRef]
  72. Bai, G.; Ge, Y.; Scoby, D.; Leavitt, B.; Stoerger, V.; Kirchgessner, N.; Irmak, S.; Graef, G.; Schnable, J.; Awada, T. NU-Spidercam: A large-scale, cable-driven, integrated sensing and robotic system for advanced phenotyping, remote sensing, and agronomic research. Comput. Electron. Agric. 2019, 160, 71–81. [Google Scholar] [CrossRef]
  73. Cubero, S.; Marco-Noales, E.; Aleixos, N.; Barbé, S.; Blasco, J. RobHortic: A Field Robot to Detect Pests and Diseases in Horticultural Crops by Proximal Sensing. Agriculture 2020, 10, 276. [Google Scholar] [CrossRef]
  74. Godoy, E.P.; Tabile, R.A.; Pereira, R.R.; Tangerino, G.T.; Porto, A.J.; Inamasu, R.Y. Design and Implementation of an Electronic Architecture for an Agricultural Mobile Robot; SciELO: São Paulo, Brasil, 2010; Volume 14, pp. 1240–1247. [Google Scholar]
  75. Underwood, J.; Wendel, A.; Schofield, B.; McMurray, L.; Kimber, R. Efficient in-field plant phenomics for row-crops with an autonomous ground vehicle. J. Field Robot. 2017, 34, 1061–1083. [Google Scholar] [CrossRef]
  76. Werner, J.P. Flex-Ro: Design, Implementation, and Control of Subassemblies for an Agricultural Robotic Platform. Ph.D. Thesis, University of Nebraska, Lincoln, NE, USA, 2016. Available online: https://digitalcommons.unl.edu/biosysengdiss/60 (accessed on 21 October 2023).
  77. Oz, Ted, Dino. Available online: https://www.naio-technologies.com/oz/ (accessed on 21 October 2023).
  78. Baret, F.; Benoit, D.S.; Samuel, T.; Philippe, B.; Shouyang, L.; Comar, A. Phenomobile: A fully automatic robot for high-throughput field phenotyping of a large range of crops with active measurements. In IAMPS-Image Analysis Methods in the Plant Sciences; 2019; Available online: https://hal.inrae.fr/hal-03646863v1/file/IAMPS_Phenomobile.pdf (accessed on 21 October 2023).
  79. From the Greenhouse to the Fields—Robohub. Available online: https://dorhoutrd.com/ (accessed on 21 October 2023).
  80. Guzmán, R.; Ariño, J.; Navarro, R.; Lopes, C.M.; Graça, J.; Reyes, M.; Barriguinha, A.; Braga, R. Autonomous hybrid gps/reactive navigation of an unmanned ground vehicle for precision viticulture—VINBOT. In Proceedings of the Intervitis Interfructa Hortitechnica—Technology for Wine, Juice and Special Crops, Stuttgart, Germany, 27–30 November 2016; Available online: https://www.researchgate.net/publication/311264530_Autonomous_hybrid_gpsreactive_navigation_of_an_unmanned_ground_vehicle_for_precision_viticulture_-VINBOT (accessed on 21 October 2023).
81. Mueller-Sim, T.; Jenkins, M.; Abel, J.; Kantor, G. The Robotanist: A Ground-Based Agricultural Robot for High-Throughput Crop Phenotyping; IEEE: Piscataway, NJ, USA, 2017; pp. 3634–3639. [Google Scholar]
  82. Shafiekhani, A.; Kadam, S.; Fritschi, F.; DeSouza, G. Vinobot and Vinoculer: Two Robotic Platforms for High-Throughput Field Phenotyping. Sensors 2017, 17, 214. [Google Scholar] [CrossRef]
  83. Rowbot. Available online: https://www.rowbot.com/ (accessed on 21 October 2023).
  84. Young, S.N.; Kayacan, E.; Peschel, J.M. Design and field evaluation of a ground robot for high-throughput phenotyping of energy sorghum. Precis. Agric. 2019, 20, 697–722. [Google Scholar] [CrossRef]
  85. Pérez-Ruiz, M.; Prior, A.; Martinez-Guanter, J.; Apolo-Apolo, O.E.; Andrade-Sanchez, P.; Egea, G. Development and evaluation of a self-propelled electric platform for high-throughput field phenotyping in wheat breeding trials. Comput. Electron. Agric. 2020, 169, 105237. [Google Scholar] [CrossRef]
  86. Deery, D.; Jimenez-Berni, J.; Jones, H.; Sirault, X.; Furbank, R. Proximal Remote Sensing Buggies and Potential Applications for Field-Based Phenotyping. Agronomy 2014, 4, 349–379. [Google Scholar] [CrossRef]
  87. Bai, G.; Ge, Y.; Hussain, W.; Baenziger, P.S.; Graef, G. A multi-sensor system for high throughput field phenotyping in soybean and wheat breeding. Comput. Electron. Agric. 2016, 128, 181–192. [Google Scholar] [CrossRef]
  88. Kumar, D.; Kushwaha, S.; Delvento, C.; Liatukas, Ž.; Vivekanand, V.; Svensson, J.T.; Henriksson, T.; Brazauskas, G.; Chawade, A. Affordable Phenotyping of Winter Wheat under Field and Controlled Conditions for Drought Tolerance. Agronomy 2020, 10, 882. [Google Scholar] [CrossRef]
  89. Meacham-Hensold, K.; Fu, P.; Wu, J.; Serbin, S.; Montes, C.M.; Ainsworth, E.; Guan, K.; Dracup, E.; Pederson, T.; Driever, S.; et al. Plot-level rapid screening for photosynthetic parameters using proximal hyperspectral imaging. J. Exp. Bot. 2020, 71, 2312–2328. [Google Scholar] [CrossRef] [PubMed]
  90. Thompson, A.; Thorp, K.; Conley, M.; Elshikha, D.; French, A.; Andrade-Sanchez, P.; Pauli, D. Comparing Nadir and Multi-Angle View Sensor Technologies for Measuring in-Field Plant Height of Upland Cotton. Remote Sens. 2019, 11, 700. [Google Scholar] [CrossRef]
  91. Busemeyer, L.; Mentrup, D.; Möller, K.; Wunder, E.; Alheit, K.; Hahn, V.; Maurer, H.; Reif, J.; Würschum, T.; Müller, J.; et al. BreedVision—A Multi-Sensor Platform for Non-Destructive Field-Based Phenotyping in Plant Breeding. Sensors 2013, 13, 2830–2847. [Google Scholar] [CrossRef]
  92. Andrade-Sanchez, P.; Gore, M.A.; Heun, J.T.; Thorp, K.R.; Carmo-Silva, A.E.; French, A.N.; Salvucci, M.E.; White, J.W. Development and evaluation of a field-based high-throughput phenotyping platform. Funct. Plant Biol. 2014, 41, 68. [Google Scholar] [CrossRef]
  93. Kicherer, A.; Herzog, K.; Bendel, N.; Klück, H.; Backhaus, A.; Wieland, M.; Rose, J.; Klingbeil, L.; Läbe, T.; Hohl, C.; et al. Phenoliner: A New Field Phenotyping Platform for Grapevine Research. Sensors 2017, 17, 1625. [Google Scholar] [CrossRef]
  94. Jiang, Y.; Li, C.; Robertson, J.S.; Sun, S.; Xu, R.; Paterson, A.H. GPhenoVision: A Ground Mobile System with Multi-modal Imaging for Field-Based High Throughput Phenotyping of Cotton. Sci. Rep. 2018, 8, 1213. [Google Scholar] [CrossRef]
  95. Zhou, J.; Reynolds, D.; Websdale, D.; Cornu, T.L.; Gonzaleznavarro, O.; Lister, C.; Orford, S.; Laycock, S.; Finlayson, G.; Stitt, T. CropQuant: An automated and scalable field phenotyping platform for crop monitoring and trait measurements to facilitate breeding and digital agriculture. BioRxiv 2017, 161547. [Google Scholar] [CrossRef]
  96. Reynolds, D.; Ball, J.; Bauer, A.; Davey, R.; Griffiths, S.; Zhou, J. CropSight: A scalable and open-source information management system for distributed plant phenotyping and IoT-based crop management. Gigascience 2019, 8, giz009. [Google Scholar] [CrossRef]
  97. Villarrubia, G.; Paz, J.F.D.; Iglesia, D.H.D.L.; Bajo, J. Combining Multi-Agent Systems and Wireless Sensor Networks for Monitoring Crop Irrigation. Sensors 2017, 17, 1775. [Google Scholar] [CrossRef]
98. Millet, E.; Welcker, C.; Kruijer, W.; Negro, S.; Nicolas, S.; Praud, S.; Ranc, N.; Presterl, T.; Tuberosa, R.; Bedo, Z.; et al. Genome-wide analysis of yield in Europe: Allelic effects as functions of drought and heat scenarios. Plant Physiol. 2016, 172, 749–764. [Google Scholar] [CrossRef]
  99. Alkhudaydi, T.; Reynolds, D.; Griffiths, S.; Zhou, J.; de la Iglesia, B. An Exploration of Deep-Learning Based Phenotypic Analysis to Detect Spike Regions in Field Conditions for UK Bread Wheat. Plant Phenomics 2019, 2019, 1–17. [Google Scholar] [CrossRef] [PubMed]
100. Hirafuji, M.; Yoichi, H.; Kiura, T.; Matsumoto, K.; Fukatsu, T.; Tanaka, K.; Shibuya, Y.; Itoh, A.; Nesumi, H.; Hoshi, N. Creating High-Performance/Low-Cost Ambient Sensor Cloud System Using OpenFS (Open Field Server) for High-Throughput Phenotyping; IEEE: Piscataway, NJ, USA, 2011; pp. 2090–2092. [Google Scholar]
  101. Beauchêne, K.; Leroy, F.; Fournier, A.; Huet, C.; Bonnefoy, M.; Lorgeou, J.; de Solan, B.; Piquemal, B.; Thomas, S.; Cohan, J. Management and Characterization of Abiotic Stress via PhénoField®, a High-Throughput Field Phenotyping Platform. Front. Plant Sci. 2019, 10, 904. [Google Scholar] [CrossRef]
  102. White, J.W.; Bostelman, R.V. Large-area overhead manipulator for access of fields. In Proceedings of the 4th International Multi-Conference on Engineering and Technological Innovation (IMETI), Orlando, FL, USA, 19–22 July 2011. [Google Scholar]
103. Ji, Z.; Francois, T.; Tony, P.; John, D.; Daniel, R.; Neil, H.; Simon, G.; Tao, C.; Yan, Z.; Xiue, W.; et al. Plant phenomics: Developments, current status, and challenges. J. Nanjing Agric. Univ. 2018, 41, 580–588. [Google Scholar]
104. Higgs, N.; Leyeza, B.; Ubbens, J.; Kocur, J.; van der Kamp, W.; Cory, T.; Eynck, C.; Vail, S.; Eramian, M.; Stavness, I. ProTractor: A Lightweight Ground Imaging and Analysis System for Early-Season Field Phenotyping; IEEE: Piscataway, NJ, USA, 2019; pp. 2629–2638. [Google Scholar]
  105. Crain, J.L.; Wei, Y.; Barker, J.; Thompson, S.M.; Alderman, P.D.; Reynolds, M.; Zhang, N.; Poland, J. Development and Deployment of a Portable Field Phenotyping Platform. Crop Sci. 2016, 56, 965–975. [Google Scholar] [CrossRef]
  106. Thompson, A.L.; Thorp, K.R.; Conley, M.; Andrade-Sanchez, P.; Heun, J.T.; Dyer, J.M.; White, J.W. Deploying a Proximal Sensing Cart to Identify Drought-Adaptive Traits in Upland Cotton for High-Throughput Phenotyping. Front. Plant Sci. 2018, 9, 507. [Google Scholar] [CrossRef]
  107. Bao, Y.; Nakami, A.D.; Tang, L. Development of a Field Robotic Phenotyping System for Sorghum Biomass Yield Component Traits Characterization. In Proceedings of the Annual International Meeting of the American Society of Agricultural and Biological Engineers, Montreal, QC, Canada, 13–16 July 2014. [Google Scholar]
  108. Bao, Y.; Tang, L. Field-based Robotic Phenotyping for Sorghum Biomass Yield Component Traits Characterization Using Stereo Vision. IFAC-PapersOnLine 2016, 49, 265–270. [Google Scholar] [CrossRef]
109. Sudduth, K.A.; Kitchen, N.R.; Drummond, S.T. Comparison of Three Canopy Reflectance Sensors for Variable-Rate Nitrogen Application in Corn; IEEE: Piscataway, NJ, USA, 2010; pp. 18–21. [Google Scholar]
  110. Weiss, U.; Biber, P. Plant detection and mapping for agricultural robots using a 3D LIDAR sensor. Robot. Auton. Syst. 2011, 59, 265–273. [Google Scholar] [CrossRef]
  111. Murman, J.N. Flex-Ro: A Robotic High Throughput Field Phenotyping System. Ph.D. Thesis, University of Nebraska, Lincoln, NE, USA, 2019. [Google Scholar]
  112. Fan, Z.; Sun, N.; Qiu, Q.; Li, T.; Zhao, C. A High-Throughput Phenotyping Robot for Measuring Stalk Diameters of Maize Crops; IEEE: Piscataway, NJ, USA, 2021. [Google Scholar] [CrossRef]
  113. Tuel, T.L. A Robotic Proximal Sensing Platform for In-Field High-Throughput Crop Phenotyping. Ph.D. Thesis, Iowa State University, Ames, IA, USA, 2019. [Google Scholar]
  114. MYCE_Agriculture. Available online: http://www.wall-ye.com/ (accessed on 21 October 2023).
  115. Tabile, R.A.; Godoy, E.P.; Pereira, R.R.D.; Tangerino, G.T.; Porto, A.J.V.; Inamasu, R.Y. Design of the mechatronic architecture of an agricultural mobile robot. IFAC Proc. Vol. 2010, 43, 717–724. [Google Scholar] [CrossRef]
  116. Xu, R.; Li, C. A modular agricultural robotic system (MARS) for precision farming: Concept and implementation. J. Field Robot. 2022, 39, 387–409. [Google Scholar] [CrossRef]
117. Ruckelshausen, A.; Biber, P.; Dorna, M.; et al. BoniRob: An autonomous field robot platform for individual plant phenotyping. Precis. Agric. 2009, 841–847. [Google Scholar]
  118. Bangert, W.; Kielhorn, A.; Rahe, F.; Dreyer, A.; Trautz, D. Field-Robot-Based Agriculture: “RemoteFarming. 1” and “BoniRob-Apps”. VDI-Berichte 2013, 2193, 2-1. [Google Scholar]
119. Biber, P.; Weiss, U.; Dorna, M.; Albert, A. Navigation system of the autonomous agricultural robot “BoniRob”. In Proceedings of the Workshop on Agricultural Robotics: Enabling Safe, Efficient, and Affordable Robots for Food Production, Vilamoura, Portugal, 11 October 2012. [Google Scholar]
  120. Burud, I.; Lange, G.; Lillemo, M.; Bleken, E.; Grimstad, L.; Johan From, P. Exploring Robots and UAVs as Phenotyping Tools in Plant Breeding. IFAC-PapersOnLine 2017, 50, 11479–11484. [Google Scholar] [CrossRef]
  121. Freeman, P.K.; Freeland, R.S. Agricultural UAVs in the U.S.: Potential, policy, and hype. Remote Sens. Appl. Soc. Environ. 2015, 2, 35–43. [Google Scholar] [CrossRef]
  122. Jiangang, L.; Chunjiang, Z.; Guijun, Y.; Haiyang, Y.; Xiaoqing, Z.; Bo, X.; Qinglin, N. Review of field-based phenotyping by unmanned aerial vehicle remote sensing platform. Trans. Chin. Soc. Agric. Eng. 2016, 32, 98–106. [Google Scholar]
  123. Shafian, S.; Rajan, N.; Schnell, R.; Bagavathiannan, M.; Valasek, J.; Shi, Y.; Olsenholler, J. Unmanned aerial systems-based remote sensing for monitoring sorghum growth and development. PLoS ONE 2018, 13, e196605. [Google Scholar] [CrossRef]
  124. Araus, J.L.; Cairns, J.E. Field high-throughput phenotyping: The new crop breeding frontier. Trends Plant Sci. 2014, 19, 52–61. [Google Scholar] [CrossRef]
  125. Sugiura, R.; Noguchi, N.; Ishii, K. Remote-sensing technology for vegetation monitoring using an unmanned helicopter. Biosyst. Eng. 2005, 90, 369–379. [Google Scholar] [CrossRef]
  126. Zongnan, L.; Zhongxin, C.; Limin, W.; Jia, L.; Qingbo, Z. Area extraction of maize lodging based on remote sensing by small unmanned aerial vehicle. Trans. Chin. Soc. Agric. Eng. 2014, 30, 207–213. [Google Scholar] [CrossRef]
  127. Binglin, Z.; Min, L.; Fusang, L.; Aobo, J.; Xiu, M.; Yan, G. Modeling of Canopy Structure of Field-grown Maize Based on UAV Images. Trans. Chin. Soc. Agric. Mach. 2021, 52, 170–177. [Google Scholar]
  128. Kai, K. Design of self-propelled field phenotyping platform. Master’s Thesis, Hebei Agricultural University, Baoding, China, 2020. [Google Scholar]
  129. Bakker, T.; van Asselt, K.; Bontsema, J.; Müller, J.; van Straten, G. Autonomous navigation using a robot platform in a sugar beet field. Biosyst. Eng. 2011, 109, 357–368. [Google Scholar] [CrossRef]
  130. Jing, Z.; Du, C.; Shumao, W.; Xiaoan, H.; Dong, W. Design and experiment of four-wheel independent steering driving and control system for agricultural wheeled robot. Trans. Chin. Soc. Agric. Eng. 2015, 31, 63–70. [Google Scholar] [CrossRef]
  131. Youchun, D.; Peng, Z.; Yawen, Z.; Junqiang, Y.; Wenyu, Z.; Kai, Z. Design and experiment of motion controller for information collection platform in field with Beidou positioning. Trans. Chin. Soc. Agric. Eng. 2017, 33, 178–185. [Google Scholar] [CrossRef]
  132. Kannan, P.; Natarajan, S.K.; Dash, S.S. Design and Implementation of Fuzzy Logic Controller for Online Computer Controlled Steering System for Navigation of a Teleoperated Agricultural Vehicle. Math. Probl. Eng. 2013, 2013, 590861. [Google Scholar] [CrossRef]
  133. Bengochea-Guevara, J.M.; Conesa-Muñoz, J.; Andújar, D.; Ribeiro, A. Merge fuzzy visual servoing and GPS-based planning to obtain a proper navigation behavior for a small crop-inspection robot. Sensors 2016, 16, 276. [Google Scholar] [CrossRef]
  134. Jodas, D.S.; Marranghello, N.; Pereira, A.S.; Guido, R.C. Comparing Support Vector Machines and Artificial Neural Networks in the Recognition of Steering Angle for Driving of Mobile Robots Through Paths in Plantations. Procedia Comput. Sci. 2013, 18, 240–249. [Google Scholar] [CrossRef]
  135. Eski, O.; Kuş, Z.A. Control of unmanned agricultural vehicles using neural network-based control system. Neural Comput. Appl. 2019, 31, 583–595. [Google Scholar] [CrossRef]
  136. Jun, C.; Zhongxiang, Z.; Ryo, T.; Jun-ichi, T. Automatic On-tracking Control of Farm Vehicle Based on Neural Network. Trans. Chin. Soc. Agric. Mach. 2007, 38, 121, 131–133. [Google Scholar] [CrossRef]
  137. Xiwen, L.; Yinggang, Q. Development of agricultural intelligent mobile work platform model. Trans. Chin. Soc. Agric. Eng. 2005, 83–85. [Google Scholar]
  138. Bak, T.; Jakobsen, H. Agricultural Robotic Platform with Four Wheel Steering for Weed Detection. Biosyst. Eng. 2004, 87, 125–136. [Google Scholar] [CrossRef]
139. Shaozhi, L. Design and Experiment of Field Crop Phenotype Detection Platform. J. Huazhong Agric. Univ. 2021, 40, 209–218. [Google Scholar]
  140. Dean, Z.; Weikuan, J.; Yun, Z.; Yuyan, Z.; Wei, J.; Yun, L. Design of Agricultural Robot Autonomous Navigation Control Based on Improved Self-adaptive Filter. Trans. Chin. Soc. Agric. Mach. 2015, 46, 1–6. [Google Scholar] [CrossRef]
  141. Sabanci, K.; Aydin, C. Smart Robotic Weed Control System for Sugar Beet. J. Agric Sci. Tech. 2017, 19, 73–83. [Google Scholar]
  142. Yang, W.; Rui, Z.; Chenming, W.; Meng, W.; Xiujie, W.; Yongjin, L. A survey on deep-learning-based plant phenotype research in agriculture. Sci. Sin. Vitae 2019, 49, 698–716. (In Chinese) [Google Scholar] [CrossRef]
  143. Shengmei, H.; Zhonglai, L.; Zhonghu, H. Classification of Wheat Cultivar by Digital Image Analysis. Sci. Agric. Sin. 2005, 38, 1869–1875. [Google Scholar]
  144. Mengyang, F.; Qin, M.; Junming, L.; Qing, W.; Yue, W.; Xiongchun, D. Counting Method of Wheatear in Field Based on Machine Vision Technology. Trans. Chin. Soc. Agric. Mach. 2015, 46, 234–239. [Google Scholar]
  145. Wenchao, L.; Bin, L.; Dayu, P.; Yong, Z.; Chunhua, Y.; Cheng, W. Synchronous measurement of wheat ear length and spikelets number based on image processing. J. Chin. Agric. Mech. 2016, 37, 210–215. [Google Scholar] [CrossRef]
  146. Hongming, Z.; Ziwei, T.; Wenting, H.; Shanna, Z.; Shuyin, Z.; Chenyu, G. Extraction Method of Maize Height Based on UAV Remote Sensing. Trans. Chin. Soc. Agric. Mach. 2019, 50, 241–250. [Google Scholar] [CrossRef]
  147. Weiss, M.; Baret, F. Using 3D point clouds derived from UAV RGB imagery to describe vineyard 3D macro-structure. Remote Sens. 2017, 9, 111. [Google Scholar] [CrossRef]
  148. Zhikai, L.; Yaoxiao, N.; Yi, W.; Wenting, H. Estimation of Plant Height of Winter Wheat Based on UAV Visible Image. J. Triticeae Crops 2019, 39, 859–866. [Google Scholar]
  149. Klukas, C.; Pape, J.; Entzian, A. Analysis of high-throughput plant image data with the information system IAP. J. Integr. Bioinform. 2012, 9, 16–18. [Google Scholar] [CrossRef]
150. Fabre, J.; Dauzat, M.; Nègre, V.; Wuyts, N.; Tireau, A.; Gennari, E.; Neveu, P.; Tisné, S.; Massonnet, C.; Hummel, I. PHENOPSIS DB: An Information System for Arabidopsis thaliana phenotypic data in an environmental context. BMC Plant Biol. 2011, 11, 77. [Google Scholar]
  151. Tessmer, O.L.; Jiao, Y.; Cruz, J.A.; Kramer, D.M.; Chen, J. Functional approach to high-throughput plant growth analysis. BMC Syst. Biol. 2013, 7 (Suppl. S6), S17. [Google Scholar] [CrossRef] [PubMed]
  152. Weight, C.; Parnham, D.; Waites, R. Technical advance: LeafAnalyser: A computational method for rapid and large-scale analyses of leaf shape variation. Plant J. 2008, 53, 578–586. [Google Scholar] [CrossRef] [PubMed]
  153. Zhou, J.; Applegate, C.; Alonso, A.D.; Reynolds, D.; Orford, S.; Mackiewicz, M.; Griffiths, S.; Penfield, S.; Pullen, N. Leaf-GP: An open and automated software application for measuring growth phenotypes for arabidopsis and wheat. Plant Methods 2017, 13, 117. [Google Scholar] [CrossRef] [PubMed]
  154. Minervini, M.; Giuffrida, M.V.; Perata, P.; Tsaftaris, S.A. Phenotiki: An open software and hardware platform for affordable and easy image-based phenotyping of rosette-shaped plants. Plant J. 2017, 90, 204–216. [Google Scholar] [CrossRef]
  155. Islam ElManawy, A.; Sun, D.; Abdalla, A.; Zhu, Y.; Cen, H. HSI-PP: A flexible open-source software for hyperspectral imaging-based plant phenotyping. Comput. Electron. Agric. 2022, 200, 107248. [Google Scholar] [CrossRef]
Figure 1. Evolution of phenotype monitoring modes for field crops: 1. [6]; 2. [7]; 3. Phenomobile Lite [8]; 4. Bowman Mudmaster sprayer-based system [9]; 5. Field Scanalyzer [10]; 6. FieldFlux [11]; 7. multi-rotor drone phenotyping platform [12].
Figure 2. Composition of a phenotype monitoring system.
Figure 3. Phenotyping sensors and phenotype monitoring systems and their R&D institutions.
Figure 3. Phenotyping sensors and phenotype monitoring systems and their R&D institutions.
Figure 4. Classification of crop phenotyping traits and their typical characters.
Figure 4. Classification of crop phenotyping traits and their typical characters.
Figure 5. Phenotyping platforms for field crops: 1. fixed-wing drone phenotyping platform [66]; 2. fixed-wing drone phenotyping platform [67]; 3. multi-rotor drone phenotyping platform [68]; 4. multi-rotor drone phenotyping platform [68]; 5. multi-rotor drone phenotyping platform [12]; 6. multi-rotor drone phenotyping platform [69]; 7. FieldScan [70]; 8. Field Scanalyzer [10]; 9. Field Phenotyping Platform [71]; 10. NU-Spidercam [72]; 11. RobHortic [73]; 12. an agricultural mobile robot [74]; 13. Ladybird [75]; 14. Flex-Ro [76]; 15. Ted [77]; 16. FieldFlux [11]; 17. Phenomobile V2 [78]; 18. Prospero [79]; 19. Vinbot [80]; 20. Robotanist [81]; 21. Vinobot [82]; 22. OZ [77]; 23. RowBot [83]; 24. TERRA-MEPP [84]; 25. a self-propelled electric platform [85]; 26. buggies [86]; 27. a proximal sensing system [87]; 28. Phenomobile Lite [8]; 29. Phenocart [88]; 30. motorized pushcart [89]; 31. Avenger-tractor-based system [90]; 32. BreedVision [91]; 33. LeeAgra 3434 DL open rider sprayer-based system [92]; 34. Bowman Mudmaster sprayer-based system [9]; 35. Phenoliner [93]; 36. GPhenoVision [94]; 37. CropQuant [95]; 38. CropSight [96].
Table 1. Application of phenotyping sensors to crop phenotype monitoring.
Phenotyping sensors covered: RGB camera, imaging spectrometer, thermal camera, fluorescent imager, depth-sensing camera, LiDAR scanner, and spectral sensor.
Phenotyping traits relating to yield: plant density, canopy coverage, canopy height, cover fraction, grain number and size, biomass, and chlorophyll content.
Phenotyping traits relating to quality: fruit/inflorescence size, grain quality, and water content.
Phenotyping traits relating to resistance: canopy temperature, leaf rolling, leaf wilting, lodging, and GNDVI (green normalized difference vegetation index).
Phenotyping traits relating to nutrition: nitrogen content, LAI (leaf area index), and PNA (plant nitrogen accumulation).
All seven sensor types are commercialized. Representative models: RGB camera (Canon, Nikon, and Sony); imaging spectrometer (MS3100 Duncan Camera, SOC710E, and HyperSpec VNIR); thermal camera (FLIR T series); fluorescent imager (Multiplex 2, 3); depth-sensing camera (RealSense series, CamCube 3.0, SR4000, and Kinect 2.0); LiDAR scanner (LMS series, VLP-16, and HDL-32E); spectral sensor (GreenSeeker RT 100, 200; Crop Circle ACS 210, 430, 470; and N-Sensor).
Secondary development is supported by the RGB camera, thermal camera, and depth-sensing camera; it is not supported by the imaging spectrometer, fluorescent imager, LiDAR scanner, or spectral sensor.
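Among the traits listed in Table 1, GNDVI is a simple band ratio, GNDVI = (NIR - Green)/(NIR + Green), so it is straightforward to compute once a sensor's near-infrared and green bands are in hand. A minimal sketch follows (NumPy; the reflectance arrays are hypothetical, and extracting bands from a real spectral sensor is device-specific):

```python
import numpy as np

def gndvi(nir: np.ndarray, green: np.ndarray) -> np.ndarray:
    """Per-pixel green NDVI: (NIR - Green) / (NIR + Green)."""
    nir = np.asarray(nir, dtype=float)
    green = np.asarray(green, dtype=float)
    denom = nir + green
    out = np.zeros_like(denom)
    # Leave zero where both bands are zero (background / no-signal pixels).
    np.divide(nir - green, denom, out=out, where=denom > 0)
    return out

# Toy 2x2 reflectance patches (illustrative values only).
nir = np.array([[0.60, 0.50], [0.40, 0.00]])
green = np.array([[0.20, 0.25], [0.20, 0.00]])
print(gndvi(nir, green))
```

Related indices such as NDVI or NDRE follow the same pattern with different band pairs, which is why multispectral sensors like the GreenSeeker and Crop Circle families expose a small fixed set of bands.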
Table 2. Advantages and limitations of common control algorithms and controllers.
PID control algorithm. Advantages: easy to use, flexible, and convenient to tune. Limitations: low regulation precision.
Fuzzy control algorithm. Advantages: easy to realize, highly robust, and strongly fault-tolerant. Limitations: poor dynamic quality and lack of a systematic design method.
Neural network control algorithm. Advantages: handles nonlinearity, highly fault-tolerant, and readily extensible. Limitations: prone to overfitting.
Programmable logic controller (PLC). Advantages: high reliability, high protection class, and good stability. Limitations: high hardware cost and difficulties in programming and maintenance.
Single-board computer. Advantages: high integration, low cost, high flexibility, and good portability. Limitations: long response time and narrow application range.
Industrial personal computer (IPC). Advantages: high applicability, good expansibility, and powerful functions. Limitations: poor compatibility and high price.
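To make the PID row concrete, the sketch below shows a minimal discrete PID loop regulating the ground speed of a first-order plant; the gains, setpoint, and plant model are illustrative assumptions, not parameters of any platform surveyed here:

```python
class PID:
    """Discrete PID: u = Kp*e + Ki*sum(e*dt) + Kd*(e - e_prev)/dt."""

    def __init__(self, kp: float, ki: float, kd: float, setpoint: float, dt: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint, self.dt = setpoint, dt
        self.integral = 0.0
        self.prev_error = None

    def update(self, measurement: float) -> float:
        error = self.setpoint - measurement
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a toy first-order "vehicle speed" model toward a 1.0 m/s setpoint.
pid = PID(kp=2.0, ki=1.0, kd=0.1, setpoint=1.0, dt=0.1)
speed = 0.0
for _ in range(100):  # 10 s of simulated time
    u = pid.update(speed)
    speed += (u - speed) * 0.1  # simple plant: speed relaxes toward u
print(f"speed after 10 s: {speed:.3f}")
```

Tuning only three gains is what makes PID easy to deploy; its limited precision on strongly nonlinear plants is where the fuzzy and neural-network approaches in the table come in.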
Table 4. Crop phenotype data analysis and management software.
Table 4. Crop phenotype data analysis and management software.
SoftwareR&D Institutions (Year)Types of Analyzed DataObtained Phenotype InformationCharacteristics
| Software | Developer (Year) | Image Type | Monitored Traits | Features |
| --- | --- | --- | --- | --- |
| ImageJ version 1.8.0 | National Institutes of Health (2007) | Visible images | Leaf area, leaf perimeter, leaf length, leaf width, and plant height | Public image-processing software |
| IAP (Integrated Analysis Platform) [149] | Leibniz Institute of Plant Genetics and Crop Plant Research (2012) | Visible, fluorescence, near-infrared, and infrared images | Morphological and structural traits (plant height, leaf area, biomass, and leaf inclination), color traits, fluorescence intensity, and near-infrared reflectivity | Image data management and analysis platform |
| HTPheno [150] | Leibniz Institute of Plant Genetics and Crop Plant Research (2011) | Visible images | Width, height, and projected shoot area | ImageJ plug-in and open-source image data analysis software |
| HPGA (High-throughput Plant Growth Analysis) [151] | Michigan State University (2016) | Three digital images | Plant area and leaf shape | High-throughput phenotyping platform for growth modeling and functional analysis of plants |
| Leaf Analyzer [152] | University of York (2007) | 2D or 3D images | Leaf shape and size | Software for rapid, large-scale, automatic analysis of variation in leaf shape |
| LeasyScan [70] | ICRISAT Crop Physiology Laboratory (2015) | 3D point-cloud images | 3D leaf area, projected leaf area, leaf area index, leaf inclination, leaf angle, plant height, maximum plant height, optical penetration depth, and biomass | Commercial integrated analysis software based on the PlantEye multispectral 3D laser scanner |
| LemnaGrid [29] | LemnaTec, Germany | Visible images | Morphological and structural traits including leaf area and compactness | Commercial integrated analysis software based on the Scanalyzer 3D platform |
| Leaf-GP [153] | Earlham Institute, Norwich Research Park | Visible images | Number of leaves, morphological and structural traits (projected leaf area and perimeter), and color traits | Open-source, extensible, and easy to use; resolves images of Arabidopsis thaliana and wheat taken with low-cost imaging devices such as smartphones and digital cameras |
| Phenotiki [154] | IMT School for Advanced Studies | Visible images | Morphological and structural traits, color traits, number of leaves, and dynamic plant growth curves | Economical and easy to deploy |
| HSI-PP [155] | State Key Laboratory of Modern Optical Instrumentation, Zhejiang University | Hyperspectral images | Projected leaf area, leaf perimeter, plant diameter, leaf convex hull, stockiness, and compactness | Machine-learning and deep-learning models that preprocess hyperspectral images to make them more suitable for training classification and regression models |
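Several of the visible-image tools listed above (e.g., HTPheno and Leaf-GP) derive projected shoot or leaf area by first segmenting plant pixels from the background, frequently with a greenness index such as excess green (ExG = 2G − R − B). The sketch below illustrates that general idea only; the threshold value and toy image are illustrative assumptions and do not reproduce any specific tool's pipeline:

```python
# Estimate projected leaf area (in pixels) from an RGB image by
# thresholding the excess-green index: ExG = 2*G - R - B.
def excess_green(pixel):
    r, g, b = pixel
    return 2 * g - r - b

def projected_area(image, threshold=20):
    """Count pixels whose excess-green index exceeds the threshold."""
    return sum(
        1
        for row in image
        for pixel in row
        if excess_green(pixel) > threshold
    )

# Toy 2x3 image: two green "plant" pixels against a grey background.
toy = [
    [(120, 120, 120), (40, 180, 50), (130, 125, 128)],
    [(35, 170, 45), (118, 119, 121), (122, 121, 120)],
]
print(projected_area(toy))  # -> 2 plant pixels detected
```

In practice, the pixel count is converted to physical area using the camera's spatial resolution (e.g., mm² per pixel), and more robust segmentation (machine-learning classifiers, as in HSI-PP) replaces the fixed threshold.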