
Advanced Imaging for Plant Phenotyping

A special issue of Remote Sensing (ISSN 2072-4292). This special issue belongs to the section "Remote Sensing Image Processing".

Deadline for manuscript submissions: closed (31 December 2019) | Viewed by 66628

Special Issue Editors


Guest Editor
University of Bonn, Institute of Plant Sciences and Resource Conservation, Nussallee 9, 53115 Bonn, Germany
Interests: hyperspectral imaging; computer vision; camera calibration; machine learning; plant phenotyping

Guest Editor
University of Bonn, Institute of Geodesy and Geoinformation, Nussallee 17, 53115 Bonn, Germany
Interests: mobile multi-sensor systems; 3D Mapping; sensor fusion; precision agriculture; image- and laser-based plant phenotyping

Guest Editor
IFZ-Institute for Sugar Beet Research, Holtenser Landstraße 77, 37079 Göttingen, Germany
Interests: 3D and hyperspectral plant imaging; computer vision; plant phenotyping; machine learning

Special Issue Information

Dear Colleagues,

Plant phenotyping is an emerging topic involving the application of digital methods to the highly relevant task of optimizing the genetic potential, cultivation methods, and resource deployment in plant production. In transdisciplinary research, state-of-the-art sensors and data analysis concepts are combined to derive reliable plant-physiological parameters at an increasing throughput.

Plant phenotyping comprises technologies that derive parameters of the plant phenotype resulting from the interaction of its genotype with the environmental conditions. To cope with the natural variability in phenotypic expression, a large number of samples must be evaluated in most cases. Phenotypic parameters are expressed at various scales, from single leaves and whole plants up to crop stands, where plant–plant interaction also plays a role. In most cases, the reaction of the plant to a specific environmental stress (e.g., drought, plant diseases, or nutrient efficiency) is recorded and used to evaluate the performance of the genotype under these specific conditions.

We welcome papers from the global research community actively involved in research on imaging for plant phenotyping. As such, this Special Issue is open to anyone doing research in this field. The selection of papers for publication will depend on the quality and rigor of the research. Specific topics include, but are not limited to, advanced methods for imaging technologies, sensor setups, and data processing in plant phenotyping:

  • Panchromatic, multispectral, and hyperspectral approaches;
  • 3D imaging techniques adapted to plants;
  • High-throughput sensor platforms;
  • Robotics for phenotyping;
  • Field phenotyping;
  • Stress detection;
  • Disease detection;
  • Data analysis in plant phenotyping;
  • Multi-scale phenotyping;
  • Multi-sensor phenotyping.

Dr. Jan Behmann
Dr. Lasse Klingbeil
Dr. Stefan Paulus
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, proceed to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Remote Sensing is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2700 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • panchromatic, multispectral, and hyperspectral approaches 
  • 3D imaging techniques adapted to plants
  • high-throughput sensor platforms 
  • robotics for phenotyping 
  • field phenotyping 
  • stress detection 
  • disease detection 
  • data analysis in plant phenotyping
  • multi-scale phenotyping 
  • multi-sensor phenotyping

Published Papers (11 papers)


Research

23 pages, 6454 KiB  
Article
Automated Identification of Crop Tree Crowns from UAV Multispectral Imagery by Means of Morphological Image Analysis
by Ricardo Sarabia, Arturo Aquino, Juan Manuel Ponce, Gilberto López and José Manuel Andújar
Remote Sens. 2020, 12(5), 748; https://doi.org/10.3390/rs12050748 - 25 Feb 2020
Cited by 23 | Viewed by 4176
Abstract
Within the context of precision agriculture, goods insurance, public subsidies, fire damage assessment, etc., accurate knowledge about the plant population in crops represents valuable information. In this regard, the use of Unmanned Aerial Vehicles (UAVs) has proliferated as an alternative to traditional plant counting methods, which are laborious, time demanding and prone to human error. Hence, a methodology for the automated detection, geolocation and counting of crop trees in intensive cultivation orchards from high resolution multispectral images, acquired by UAV-based aerial imaging, is proposed. After image acquisition, the captures are processed by means of photogrammetry to yield a 3D point cloud-based representation of the study plot. To exploit the elevation information contained in it and eventually identify the plants, the cloud is deterministically interpolated, and subsequently transformed into a greyscale image. This image is processed, by using mathematical morphology techniques, in such a way that the absolute height of the trees with respect to their local surroundings is exploited to segment the tree pixel-regions, by global statistical thresholding binarization. This approach makes the segmentation process robust against surfaces with elevation variations of any magnitude, or to possible distracting artefacts with heights lower than expected. Finally, the segmented image is analysed by means of an ad-hoc moment representation-based algorithm to estimate the location of the trees. The methodology was tested in an intensive olive orchard of 17.5 ha, with a population of 3919 trees. Because of the plot’s plant density and tree spacing pattern, typical of intensive plantations, many occurrences of intra-row tree aggregations were observed, increasing the complexity of the scenario under study. 
Nevertheless, a precision of 99.92%, a sensitivity of 99.67% and an F-score of 99.75% were achieved, thus correctly identifying and geolocating 3906 plants. The generated 3D point cloud reported root-mean-square errors (RMSE) in the X, Y and Z directions of 0.73 m, 0.39 m and 1.20 m, respectively. These results support the viability and robustness of this methodology as a phenotyping solution for automated plant counting and geolocation in olive orchards.
(This article belongs to the Special Issue Advanced Imaging for Plant Phenotyping)
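The abstract's elevation-based segmentation idea (suppress the local terrain, then threshold what sticks out above it) can be sketched with a morphological white top-hat. This is an illustrative numpy/scipy toy on synthetic data, not the authors' pipeline; the window size and threshold rule are assumptions.

```python
import numpy as np
from scipy import ndimage

# Synthetic elevation image: sloping ground with two tree-like bumps.
yy, xx = np.mgrid[0:100, 0:100]
height = 0.02 * xx                      # gently sloping terrain
for cy, cx in [(30, 30), (70, 65)]:
    height += 3.0 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 30.0)

# The white top-hat removes the slowly varying ground while keeping
# local peaks, so one global threshold separates trees from terrain
# regardless of the terrain's slope.
tophat = ndimage.white_tophat(height, size=25)
mask = tophat > tophat.mean() + 2 * tophat.std()

labels, n_trees = ndimage.label(mask)
print(n_trees)  # two detected crowns
```

The same top-hat trick is what makes the segmentation robust to elevation variations of any magnitude, as the abstract notes: only heights relative to the local surroundings survive the filtering.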

17 pages, 9230 KiB  
Article
Image-Based Dynamic Quantification of Aboveground Structure of Sugar Beet in Field
by Shunfu Xiao, Honghong Chai, Ke Shao, Mengyuan Shen, Qing Wang, Ruili Wang, Yang Sui and Yuntao Ma
Remote Sens. 2020, 12(2), 269; https://doi.org/10.3390/rs12020269 - 14 Jan 2020
Cited by 28 | Viewed by 4073
Abstract
Sugar beet is one of the main crops for sugar production in the world. With the increasing demand for sugar, more desirable sugar beet genotypes need to be cultivated through plant breeding programs. Precise plant phenotyping in the field still remains a challenge. In this study, a structure-from-motion (SfM) approach was used to reconstruct a three-dimensional (3D) model of sugar beets from 20 genotypes at three growth stages in the field. An automatic data processing pipeline was developed to process the sugar beet point clouds, including preprocessing, coordinate correction, filtering and segmentation of individual plants. Phenotypic traits were also extracted automatically, including plant height, maximum canopy area, convex hull volume, total leaf area and individual leaf length. Total leaf area and convex hull volume were adopted to explore the relationship with biomass. The results showed high correlations between measured and estimated values, with R2 > 0.8. Statistical analyses between biomass and the extracted traits proved that both convex hull volume and total leaf area can predict biomass well. The proposed pipeline can estimate sugar beet traits precisely in the field and provide a basis for sugar beet breeding.
(This article belongs to the Special Issue Advanced Imaging for Plant Phenotyping)
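Of the traits listed above, convex hull volume and plant height are straightforward to compute once an individual plant's point cloud has been segmented. A minimal sketch on a synthetic cloud (not the authors' pipeline; the cube-shaped "plant" is invented for illustration), using scipy's ConvexHull:

```python
import numpy as np
from scipy.spatial import ConvexHull

# Synthetic single-plant point cloud: the eight corners of a unit cube
# plus random interior points, standing in for a segmented plant.
rng = np.random.default_rng(0)
corners = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)],
                   dtype=float)
cloud = np.vstack([corners, rng.random((200, 3))])  # interior points

hull = ConvexHull(cloud)                 # qhull-based convex hull
plant_height = cloud[:, 2].max() - cloud[:, 2].min()
convex_hull_volume = hull.volume         # 1.0 for the unit cube
```

Interior points do not change the hull, so only the outermost geometry drives the volume trait, which is one reason it correlates with canopy size and biomass.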

18 pages, 5697 KiB  
Article
Color Calibration of Proximal Sensing RGB Images of Oilseed Rape Canopy via Deep Learning Combined with K-Means Algorithm
by Alwaseela Abdalla, Haiyan Cen, Elfatih Abdel-Rahman, Liang Wan and Yong He
Remote Sens. 2019, 11(24), 3001; https://doi.org/10.3390/rs11243001 - 13 Dec 2019
Cited by 25 | Viewed by 5004
Abstract
Plant color is a key feature for estimating parameters of plants grown under different conditions using remote sensing images. In this case, the variation in plant color should be due only to the influence of the growing conditions and not to external confounding factors like the light source. Hence, the impact of the light source on plant color should be alleviated using color calibration algorithms. This study aims to develop an efficient, robust, and cutting-edge approach for automatic color calibration of three-band (red green blue: RGB) images. Specifically, we combined the k-means model and deep learning for accurate color calibration matrix (CCM) estimation. A dataset of 3150 RGB images of oilseed rape was collected by a proximal sensing technique under varying illumination conditions and used to train, validate, and test our proposed framework. Firstly, we manually derived CCMs by mapping the RGB color values of each patch of a color chart obtained in an image to the standard RGB (sRGB) color values of that chart. Secondly, we grouped the images into clusters according to the CCM assigned to each image using the unsupervised k-means algorithm. Thirdly, the images with the new cluster labels were used to train and validate a deep convolutional neural network (CNN) for automatic CCM estimation. Finally, the estimated CCM was applied to the input image to obtain an image with calibrated color. The performance of our model for estimating the CCM was evaluated using the Euclidean distance between the standard and the estimated color values of the test dataset. The experimental results showed that our deep learning framework can efficiently extract useful low-level features for discriminating images with inconsistent colors, achieving overall training and validation accuracies of 98.00% and 98.53%, respectively. Further, the final CCM provided an average Euclidean distance of 16.23 ΔE and outperformed previously reported methods. The proposed technique can be used for real-time plant phenotyping at multiple scales.
(This article belongs to the Special Issue Advanced Imaging for Plant Phenotyping)
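The manual CCM derivation in the first step amounts to a least-squares fit from measured chart-patch colors to their sRGB references. A minimal numpy sketch on synthetic patch data (the 3×3 matrix and patch values below are illustrative, not from the paper):

```python
import numpy as np

# A color calibration matrix (CCM) maps measured patch RGBs to the
# chart's reference sRGB values; here we fit it by least squares.
rng = np.random.default_rng(1)
true_ccm = np.array([[1.10, 0.05, 0.00],
                     [0.02, 0.90, 0.03],
                     [0.00, 0.04, 1.20]])            # invented distortion
reference = rng.random((24, 3))                      # 24-patch chart, sRGB
measured = reference @ np.linalg.inv(true_ccm).T     # "camera" readings

# Solve measured @ ccm ≈ reference for the 3x3 matrix.
ccm, *_ = np.linalg.lstsq(measured, reference, rcond=None)
calibrated = measured @ ccm
err = np.abs(calibrated - reference).max()           # ~0 on clean data
```

The paper's contribution is to skip this chart-dependent step at inference time: the CNN predicts which cluster (and hence which CCM) applies to an image with no chart in view.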

16 pages, 2446 KiB  
Article
Analysis of Cold-Developed vs. Cold-Acclimated Leaves Reveals Various Strategies of Cold Acclimation of Field Pea Cultivars
by Alexandra Husičková, Jan F. Humplík, Miroslav Hýbl, Lukáš Spíchal and Dušan Lazár
Remote Sens. 2019, 11(24), 2964; https://doi.org/10.3390/rs11242964 - 11 Dec 2019
Cited by 3 | Viewed by 2765
Abstract
Peas (Pisum sativum L.) are among the world's oldest domesticated crops, serving as a source of proteins, complex carbohydrates, vitamins and minerals. Autumn sowing allows a higher biomass production as well as the avoidance of the drought and heat stresses of late spring. However, the character of European continental winters limits plant growth and development through cold stress. This work sought parameters that reflect the cold tolerance of pea plants, with the aim of suggesting an afila-type pea cultivar resilient to European continental winters. For this purpose, we employed indoor remote sensing technology and compared the 22-day-long acclimation to 5 °C of four pea cultivars: Arkta, with normal leaves and the highest known cold resistance to European continental winters, and Enduro, Terno and CDC Le Roy, all of the afila type. Besides evaluation of shoot growth rate and quenching analysis of chlorophyll fluorescence (ChlF) by imaging methods, we measured the chlorophyll content and ChlF induction with a nonimaging fluorometer. Here we show that the acclimation to cold of Arkta exhibits a different pattern than the other cultivars. Arkta showed the fastest retardation of photosynthesis and shoot growth, which might be part of its winter survival strategy. Terno, on the other hand, showed sustained photosynthetic performance and growth, which might be an advantageous strategy for spring. Surprisingly, Enduro showed sustained photosynthesis in the stipules that were transferred to 5 °C and acclimated there (cold-acclimated). However, of all the cultivars, Enduro had the strongest inhibition of photosynthesis in new stipules that developed after the transition to cold (cold-developed). We conclude that the parameters of ChlF spatial imaging calculated as averages from whole plants are suboptimal for the characterization of various cold acclimation strategies. The most marked changes were obtained when the new cold-developed leaves were analyzed separately from the rest of the plant.
(This article belongs to the Special Issue Advanced Imaging for Plant Phenotyping)

22 pages, 3566 KiB  
Article
Combination of an Automated 3D Field Phenotyping Workflow and Predictive Modelling for High-Throughput and Non-Invasive Phenotyping of Grape Bunches
by Florian Rist, Doreen Gabriel, Jennifer Mack, Volker Steinhage, Reinhard Töpfer and Katja Herzog
Remote Sens. 2019, 11(24), 2953; https://doi.org/10.3390/rs11242953 - 10 Dec 2019
Cited by 15 | Viewed by 3460
Abstract
In grapevine breeding, loose grape bunch architecture is one of the most important selection traits, contributing to an increased resilience towards Botrytis bunch rot. Grape bunch architecture is mainly influenced by the berry number, berry size, the total berry volume, and bunch width and length. For an objective, precise, and high-throughput assessment of these architectural traits, the 3D imaging sensor Artec® Spider was applied to gather dense point clouds of the visible side of grape bunches directly in the field. Data acquisition in the field is much faster and non-destructive in comparison to lab applications but results in incomplete point clouds and, thus, mostly incomplete phenotypic values. Therefore, lab scans of whole bunches (360°) were used as ground truth. We observed strong correlations between field and lab data but also shifts in mean and max values, especially for the berry number and total berry volume. For this reason, the present study focused on the training and validation of different predictive regression models using 3D data from approximately 2000 different grape bunches in order to predict incomplete bunch traits from field data. Modeling concepts included simple linear regression and machine learning-based approaches. The support vector machine was the best and most robust regression model, predicting the phenotypic traits with an R2 of 0.70–0.91. As a breeding-oriented proof of concept, we additionally performed a Quantitative Trait Loci (QTL) analysis with both the field-modeled and lab data. All types of data resulted in joint QTL regions, indicating that this innovative, fast, and non-destructive phenotyping method is also applicable for molecular marker development and grapevine breeding research.
(This article belongs to the Special Issue Advanced Imaging for Plant Phenotyping)

22 pages, 6286 KiB  
Article
A Proposed Methodology to Analyze Plant Growth and Movement from Phenomics Data
by María Victoria Díaz-Galián, Fernando Perez-Sanz, Jose David Sanchez-Pagán, Julia Weiss, Marcos Egea-Cortines and Pedro J. Navarro
Remote Sens. 2019, 11(23), 2839; https://doi.org/10.3390/rs11232839 - 29 Nov 2019
Cited by 4 | Viewed by 6036
Abstract
Image analysis of developmental processes in plants reveals both growth and organ movement. This study proposes a methodology to study growth and movement. It includes the standard acquisition of internal and external reference points and coordinates, coordinate transformation, curve fitting and the corresponding statistical analysis. Several species with different growth habits were used, including Antirrhinum majus, A. linkianum, Petunia x hybrida and Fragaria x ananassa. Complex growth patterns, including gated growth, could be identified using a generalized additive model. Movement, and in some cases growth, could not be adjusted to curves due to drastic changes in position. The area under the curve was useful to identify the initial stage of growth of an organ and its growth rate. Organs displayed either continuous movements during the day with gated day/night periods of maxima, or sharp changes in position coinciding with day/night shifts. The movement was dependent on light in petunia and independent of light in F. ananassa. Petunia showed organ movement in both growing and fully-grown organs, while A. majus and F. ananassa showed both leaf and flower movement patterns linked to growth. The results indicate that different mathematical fits may help quantify growth rate, growth duration and gating. While organ movement may complicate image and data analysis, it may be a surrogate method to determine organ growth potential.
(This article belongs to the Special Issue Advanced Imaging for Plant Phenotyping)

20 pages, 6772 KiB  
Article
Detection of Fusarium Head Blight in Wheat Using a Deep Neural Network and Color Imaging
by Ruicheng Qiu, Ce Yang, Ali Moghimi, Man Zhang, Brian J. Steffenson and Cory D. Hirsch
Remote Sens. 2019, 11(22), 2658; https://doi.org/10.3390/rs11222658 - 13 Nov 2019
Cited by 76 | Viewed by 7724
Abstract
Fusarium head blight (FHB) is a devastating disease of wheat worldwide. In addition to reducing the yield of the crop, the causal pathogens also produce mycotoxins that can contaminate the grain. The development of resistant wheat varieties is one of the best ways to reduce the impact of FHB. To develop such varieties, breeders must expose germplasm lines to the pathogen in the field and assess the disease reaction. Phenotyping breeding materials for resistance to FHB is time-consuming, labor-intensive, and expensive when using conventional protocols. To develop a reliable and cost-effective high-throughput phenotyping system for assessing FHB in the field, we focused on developing a method for processing color images of wheat spikes to accurately detect diseased areas using deep learning and image processing techniques. Color images of wheat spikes at the milk stage were collected under shadow conditions and processed to construct datasets, which were used to retrain a deep convolutional neural network model using transfer learning. Testing results showed that the model detected spikes accurately, as the coefficient of determination between the number of spikes tallied by manual count and by the model was 0.80. The model was assessed, and the mean average precision for the testing dataset was 0.9201. On the basis of the results for spike detection, a new color feature was applied to obtain a gray image of each spike, and a modified region-growing algorithm was implemented to segment and detect the diseased areas of each spike. Results showed that the region-growing algorithm performed better than K-means and Otsu's method in segmenting diseased areas. We demonstrated that deep learning techniques enable accurate detection of FHB in wheat based on color image analysis, and the proposed method can effectively detect spikes and diseased areas, which improves the efficiency of FHB assessment in the field.
(This article belongs to the Special Issue Advanced Imaging for Plant Phenotyping)
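The region-growing step named in the abstract starts, in its basic form, from a seed pixel and absorbs connected neighbours whose grey value is close to the seed's. The paper modifies this scheme; the toy image, tolerance, and function below are invented for illustration only.

```python
from collections import deque

# Minimal seeded region growing: absorb 4-connected neighbours whose
# grey value is within `tol` of the seed value.
def region_grow(img, seed, tol):
    h, w = len(img), len(img[0])
    seed_val = img[seed[0]][seed[1]]
    region = {seed}
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and (nr, nc) not in region
                    and abs(img[nr][nc] - seed_val) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

# A 5x5 grey "spike" with a bright diseased patch in the top-left corner.
img = [[200, 210, 60, 50, 55],
       [205, 198, 58, 52, 50],
       [ 60,  55, 50, 51, 49],
       [ 58,  52, 50, 48, 47],
       [ 55,  50, 49, 48, 46]]
diseased = region_grow(img, (0, 0), tol=20)
print(len(diseased))  # 4 pixels in the bright patch
```

Unlike global thresholding (Otsu) or clustering (K-means), region growing enforces spatial connectivity, which is why it copes better with uneven illumination across a spike.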

17 pages, 3715 KiB  
Article
Estimation of the Maturity Date of Soybean Breeding Lines Using UAV-Based Multispectral Imagery
by Jing Zhou, Dennis Yungbluth, Chin Nee Vong, Andrew Scaboo and Jianfeng Zhou
Remote Sens. 2019, 11(18), 2075; https://doi.org/10.3390/rs11182075 - 04 Sep 2019
Cited by 50 | Viewed by 4867
Abstract
Physiological maturity date is a critical parameter for the selection of breeding lines in soybean breeding programs. The conventional method to estimate the maturity dates of breeding lines uses visual ratings of pod senescence by experts, which is subjective, labor-intensive and time-consuming. Unmanned aerial vehicle (UAV)-based phenotyping systems provide a high-throughput and powerful tool for capturing crop traits using remote sensing, image processing and machine learning technologies. The goal of this study was to investigate the potential of predicting maturity dates of soybean breeding lines using UAV-based multispectral imagery. Maturity dates of 326 soybean breeding lines were recorded using visual ratings from the beginning maturity stage (R7) to the full maturity stage (R8), and aerial multispectral images were taken during this period on 27 August, 14 September and 27 September 2018. One hundred and thirty features were extracted from the five-band multispectral images. The maturity dates of the soybean lines were predicted and evaluated using partial least squares regression (PLSR) models with 10-fold cross-validation. Twenty image features with importance to the estimation were selected and their rates of change between each pair of data collection days were calculated. The best prediction (R2 = 0.81, RMSE = 1.4 days) was made by the PLSR model with five components using image features taken on 14 September and their rates of change between 14 September and 27 September, leading to the conclusion that UAV-based multispectral imagery is promising and practical for estimating maturity dates of soybean breeding lines.
(This article belongs to the Special Issue Advanced Imaging for Plant Phenotyping)
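The evaluation scheme above (predict each line's maturity date under 10-fold cross-validation and score with R² and RMSE) can be illustrated with a plain least-squares model in numpy. The real study used PLSR on 130 multispectral features, so this is only a structural sketch on synthetic data; the feature matrix and weights are invented.

```python
import numpy as np

# Synthetic stand-in for the feature/response data: 200 "lines" with
# 5 features and a noisy linear "maturity date" response.
rng = np.random.default_rng(2)
X = rng.random((200, 5))
true_w = np.array([3.0, -2.0, 1.0, 0.5, 0.0])
y = X @ true_w + rng.normal(0, 0.1, 200)

# 10-fold cross-validation: fit on 9 folds, predict the held-out fold.
k = 10
folds = np.array_split(rng.permutation(len(y)), k)
preds = np.empty_like(y)
for i in range(k):
    test = folds[i]
    train = np.concatenate([folds[j] for j in range(k) if j != i])
    Xb = np.c_[X[train], np.ones(len(train))]      # add bias column
    w, *_ = np.linalg.lstsq(Xb, y[train], rcond=None)
    preds[test] = np.c_[X[test], np.ones(len(test))] @ w

# Cross-validated R² over all held-out predictions.
r2 = 1 - ((y - preds) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

Pooling the held-out predictions before computing R², as here, gives one cross-validated score per model, which is the quantity the paper's R2 = 0.81 corresponds to.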

11 pages, 13969 KiB  
Article
Extending Hyperspectral Imaging for Plant Phenotyping to the UV-Range
by Anna Brugger, Jan Behmann, Stefan Paulus, Hans-Georg Luigs, Matheus Thomas Kuska, Patrick Schramowski, Kristian Kersting, Ulrike Steiner and Anne-Katrin Mahlein
Remote Sens. 2019, 11(12), 1401; https://doi.org/10.3390/rs11121401 - 12 Jun 2019
Cited by 32 | Viewed by 8008
Abstract
Previous plant phenotyping studies have focused on the visible (VIS, 400–700 nm), near-infrared (NIR, 700–1000 nm) and short-wave infrared (SWIR, 1000–2500 nm) ranges. The ultraviolet range (UV, 200–380 nm) has not yet been used in plant phenotyping, even though a number of plant molecules, like flavones and phenols, feature absorption maxima in this range. In this study, an imaging UV line scanner covering 250–430 nm is introduced to investigate crop plants for plant phenotyping. Observing plants in the UV range can provide information about important changes in plant substances. To record reliable and reproducible time series results, measurement conditions were defined that exclude phototoxic effects of UV illumination in the plant tissue. The measurement quality of the UV camera was assessed by comparing it with a non-imaging UV spectrometer on six different plant-based substances. Given the findings of these preliminary studies, an experiment was defined and performed to monitor the stress response of barley leaves to salt stress. The aim was to visualize the effects of abiotic stress within the UV range to provide new insights into the stress response of plants. Our study demonstrates the first use of a hyperspectral sensor in the UV range for stress detection in plant phenotyping.
(This article belongs to the Special Issue Advanced Imaging for Plant Phenotyping)

26 pages, 12117 KiB  
Article
3D Morphological Processing for Wheat Spike Phenotypes Using Computed Tomography Images
by Biao Xiong, Bo Wang, Shengwu Xiong, Chengde Lin and Xiaohui Yuan
Remote Sens. 2019, 11(9), 1110; https://doi.org/10.3390/rs11091110 - 09 May 2019
Cited by 14 | Viewed by 4885
Abstract
Wheat is the main food crop worldwide today. In order to improve its yields, researchers are committed to understanding the relationships between wheat genotypes and phenotypes. Compared to the progressive technology of wheat gene section identification, wheat trait measurement is mostly done manually in a destructive, labor-intensive and time-consuming way. Such studies would therefore be greatly accelerated if wheat phenotypes could be discovered automatically in a nondestructive and fast manner. In this paper, we propose a novel pipeline based on 3D morphological processing to detect wheat spike grains and stem nodes from 3D X-ray micro computed tomography (CT) images. We also introduce a set of newly defined 3D phenotypes, including grain aspect ratio, porosity, grain-to-grain distance, and grain angle, which are very difficult to measure manually. The analysis of the associations among these traits would be very helpful for wheat breeding. Experimental results show that our method is able to count grains more accurately than normal human performance. By analyzing the relationships between traits and environmental conditions, we find that the grain-to-grain distance, aspect ratio and porosity are more likely affected by the genome than by the environment (only temperature and water conditions were tested). We also find that closely spaced grains inhibit grain volume growth and that an aspect ratio of 3.5 may be the best for higher yield in wheat breeding.
(This article belongs to the Special Issue Advanced Imaging for Plant Phenotyping)
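Two of the newly defined traits, grain aspect ratio and porosity, have simple voxel-level definitions once grains are segmented from the CT volume. A toy numpy sketch (the box-shaped voxel "grain" and its single internal pore are invented for illustration, and these definitions are plausible readings of the trait names, not the paper's exact formulas):

```python
import numpy as np

# Toy voxel grid: one "grain" as a solid box with a small internal
# cavity, standing in for a CT-segmented wheat grain.
grain = np.zeros((20, 10, 10), dtype=bool)
grain[2:18, 2:8, 2:8] = True           # solid grain voxels
grain[9:11, 4:6, 4:6] = False          # internal pore (2x2x2 voxels)

# Aspect ratio: longest over shortest bounding-box extent.
idx = np.argwhere(grain)
extents = idx.max(axis=0) - idx.min(axis=0) + 1   # (16, 6, 6)
aspect_ratio = extents.max() / extents.min()

# Porosity: fraction of the enclosed volume occupied by pores.
enclosed = np.prod(extents)            # here the bounding box = grain + pore
porosity = 1 - grain.sum() / enclosed
```

Computed this way, aspect ratio and porosity are dimensionless, so they can be compared across scans with different voxel resolutions.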

19 pages, 9879 KiB  
Article
UAV-Based High Throughput Phenotyping in Citrus Utilizing Multispectral Imaging and Artificial Intelligence
by Yiannis Ampatzidis and Victor Partel
Remote Sens. 2019, 11(4), 410; https://doi.org/10.3390/rs11040410 - 17 Feb 2019
Cited by 174 | Viewed by 14060
Abstract
Traditional plant breeding evaluation methods are time-consuming, labor-intensive, and costly. Accurate and rapid phenotypic trait data acquisition and analysis can improve genomic selection and accelerate cultivar development. In this work, a technique for data acquisition and image processing was developed utilizing small unmanned aerial vehicles (UAVs), multispectral imaging, and deep learning convolutional neural networks to evaluate phenotypic characteristics on citrus crops. This low-cost and automated high-throughput phenotyping technique utilizes artificial intelligence (AI) and machine learning (ML) to: (i) detect, count, and geolocate trees and tree gaps; (ii) categorize trees based on their canopy size; (iii) develop individual tree health indices; and (iv) evaluate citrus varieties and rootstocks. The proposed remote sensing technique was able to detect and count citrus trees in a grove of 4,931 trees, with precision and recall of 99.9% and 99.7%, respectively, estimate their canopy size with overall accuracy of 85.5%, and detect, count, and geolocate tree gaps with a precision and recall of 100% and 94.6%, respectively. This UAV-based technique provides a consistent, more direct, cost-effective, and rapid method to evaluate phenotypic characteristics of citrus varieties and rootstocks.
(This article belongs to the Special Issue Advanced Imaging for Plant Phenotyping)
