Article

Classification of 3D Point Clouds Using Color Vegetation Indices for Precision Viticulture and Digitizing Applications

by
Francisco-Javier Mesas-Carrascosa
1,*,
Ana I. de Castro
2,
Jorge Torres-Sánchez
2,
Paula Triviño-Tarradas
1,
Francisco M. Jiménez-Brenes
2,
Alfonso García-Ferrer
1 and
Francisca López-Granados
2
1
Department of Graphic Engineering and Geomatics, University of Cordoba, Campus de Rabanales, Crta. IV, km. 396, E-14071 Córdoba, Spain
2
Imaping Group, Department of Crop Protection, Institute for Sustainable Agriculture (IAS), Spanish National Research Council (CSIC), E-14004 Córdoba, Spain
*
Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(2), 317; https://doi.org/10.3390/rs12020317
Submission received: 17 November 2019 / Revised: 16 January 2020 / Accepted: 16 January 2020 / Published: 18 January 2020
(This article belongs to the Special Issue Remote Sensing in Viticulture)

Abstract

Remote sensing applied in the digital transformation of agriculture and, more particularly, in precision viticulture offers methods to map field spatial variability to support site-specific management strategies; these can be based on crop canopy characteristics such as the row height or vegetation cover fraction, requiring accurate three-dimensional (3D) information. To derive canopy information, a set of dense 3D point clouds was generated using photogrammetric techniques on images acquired by an RGB sensor onboard an unmanned aerial vehicle (UAV) in two testing vineyards on two different dates. In addition to the geometry, each point also stores information from the RGB color model, which was used to discriminate between vegetation and bare soil. To the best of our knowledge, the new methodology herein presented consisting of linking point clouds with their spectral information had not previously been applied to automatically estimate vine height. Therefore, the novelty of this work is based on the application of color vegetation indices in point clouds for the automatic detection and classification of points representing vegetation and the later ability to determine the height of vines using as a reference the heights of the points classified as soil. Results from on-ground measurements of the heights of individual grapevines were compared with the estimated heights from the UAV point cloud, showing high determination coefficients (R² > 0.87) and low root-mean-square error (0.070 m). This methodology offers new capabilities for the use of RGB sensors onboard UAV platforms as a tool for precision viticulture and digitizing applications.


1. Introduction

Precision agriculture involves the collection and use of large amounts of georeferenced data relating to crops and their attributes in production areas at a high spatial resolution [1]. Its purpose is the site-specific management of crop heterogeneity at both temporal and spatial scales [2] to optimize agricultural inputs. In addition, this high-quality information can be connected with the global objective of creating opportunities for digitizing agriculture by renewing processes and technologies to make the sector more insight-driven and efficient, while also improving yield and final product quality and reducing the environmental impact of agricultural activity. Factors like soil, water availability, pests (e.g., incidences related to weeds, fungi, insects, root-knot nematodes), presence of cover crops between grape rows, topography, and variable climatic conditions cause different responses in the crop, which are ultimately reflected in spatial fluctuations in yield and grape composition [3]. Precision viticulture (PV) and digitizing-related strategies, which fall within precision agriculture and the wider concept of digital transformation of agriculture, could contribute to solving and managing this spatial heterogeneity in a sustainable way. This is because their main objectives are the monitoring of vineyard variability and the design of site-specific management accordingly, to improve production efficiency with reduced inputs (e.g., labor, fuel, water, canopy management, or phytosanitary applications) [4].
The implementation of PV can be considered a process which begins with the observation of vineyard attributes, followed by the interpretation and evaluation of the collected data, the implementation of targeted management (e.g., irrigation, fertilizers, spraying, pruning or other canopy management, and even selective harvesting), and, finally, the evaluation of the implemented management [5]. The efficiency of PV, particularly regarding vineyard zone delineation for site-specific phytosanitary foliar applications, depends on many interacting factors related to crop canopy characteristics like height, vegetative stage, or the growing habits of the corresponding grape variety, which must be properly combined to adapt the chemical application to the foliar part of the crop [6]. In this context, georeferenced information on grapevine height at the field scale is one of the most important structural inputs used to map and monitor the vineyard and to provide accurate information for rational decision-making [7].
Phenological development stage is related to biophysical processes, and among all phenotypic characteristics, crop height is an adequate indicator of crop yield [8,9], evapotranspiration [10,11,12], health [13,14], and biomass [9,15]. Most of the work conducted to measure tree height or crown volume, among other characteristics, using geomatic techniques has been related to forest areas [16,17]. These geometric characteristics are also used, to a lesser degree, in agriculture as indicators to evaluate pruning, pest effects on crops, or fruit detection [18,19,20,21], probably because they are difficult to measure [22]. Collecting these data at the field scale is time-consuming and offers uncertain results because of the variability of tree crowns in orchards and the difficulty of fitting them to geometric models such as cones or ovoids. To date, the measurement and characterization of plant structures has been carried out using different remotely sensed alternatives like radar [23], hemispherical photography [24], digital photogrammetric techniques [25], light sensors [26], stereo images [27], ultrasonic sensors [28], and Light Detection and Ranging (LiDAR) sensors [29]. Despite the great variety of technologies used to characterize the 3D structure of plants, many of them have aspects that limit their use; only a small group of them are suitable for this purpose, with LiDAR and those based on stereoscopic images being the most relevant [30]. On the one hand, methodologies based on terrestrial laser scanners are very precise in measuring tree architecture [31,32]; however, they are inefficient over large spatial extents [33]. Similarly, methods based on stereoscopic images require a very high spatial resolution to properly model the 3D characteristics of woody crops [34]. In this context, images registered by sensors onboard satellites or piloted aircraft do not satisfy these technical requirements, with unmanned aerial vehicles (UAVs) instead being the most adequate platform. The advantages of UAV application in agriculture have been demonstrated in comparison to traditional platforms regarding their very high spatial and temporal resolution [35] and low cost [36,37], which make UAV technology an adequate tool to monitor crops at the field scale [35,38].
UAVs are capable of carrying sensors like LiDAR [39], RGB [40], thermal [41], multispectral [42], and hyperspectral [43] sensors. Although LiDAR UAV data, combined with data from Global Navigation Satellite System (GNSS) and inertial measurement unit (IMU) sensors, provide 3D point clouds to monitor plant structure, their use is limited because of the system weight and economic cost [44], requiring very strong platforms [45]. As an alternative, UAV images registered by passive sensors can be used for the 3D characterization of crops by producing digital surface models (DSMs). A DSM can be understood to be an image in which pixel values contain elevation information, or a set of 3D geometries. DSMs are obtained by Structure from Motion (SfM) algorithms, which can also produce very dense, colored 3D point clouds in which each point inherits the color of the original image pixels onto which it is projected as part of the processing pipeline.
In the case of agricultural applications, one of the most crucial steps is the segmentation of soil and vegetation. Using orthomosaics, this segmentation can be carried out by vegetation indices (VIs) using different spectral bands and their combinations. Of all the possible VIs, color vegetation indices (CVIs) computed from common red, green, and blue (RGB) sensors onboard UAV platforms are used to accentuate plant greenness [46]. Other methods are based on using DSMs. Many approaches have been developed to detect, delineate, and segment objects in either raster or vector data [47,48,49]. For raster DSMs, some strategies have used object-based image analysis (OBIA) [50,51], local maxima [52,53], or watershed segmentation [54,55], among others. Of these, the OBIA methods have successfully classified and identified single olive trees [56] and vertical trellis vineyards [57], although OBIA methods need an effective rule set that assigns the correct scale, shape, and compactness parameters to obtain meaningful objects [58]. For vector DSMs, authors have used an adaptive clustering algorithm [59] or a top-to-bottom region growing approach [60], and previous research projects have successfully used DSMs on both herbaceous [61,62] and woody crops [56,57] based on geometrical approaches. Methods using vector DSMs have therefore mainly focused on reducing the DSM to a digital elevation model (DEM) by removing elevated objects with a filtering algorithm. Different filter algorithms have been reported based on morphological filters [63], linear prediction [64], or spline approximation [65], all of them based on the geometric characteristics of the point clouds and none of them using the color information of the points in the process.
As per the above discussion, we report in this article a new method to classify 3D UAV photogrammetric point clouds using RGB information through CVIs, tested in two vineyards on two different dates. Our specific objectives included (1) selecting the most appropriate index, (2) classifying point clouds by the selected CVI, (3) determining the heights of the vines, and, finally, (4) assessing the quality of the results obtained by comparing the UAV estimated and on-ground height measurements.

2. Materials and Methods

2.1. Study Field and UAV Flights

The presented study was carried out in two different commercial vineyards (Vitis vinifera L.) located in the province of Lleida, Spain (Figure 1). The first vineyard, Field A, was an area of 4925 m² cultivated with the Merlot vine variety (central coordinates 41°38’41’’ N, 5°30’34’’ W, WGS84), while Field B was an area of 4415 m² cultivated with the Albariño vine variety (central coordinates 41°39’3.34’’ N, 5°30’22’’ W, WGS84). Vines were drip-irrigated and trellis-trained and had inter-row cover crops composed of grasses. The vineyard design was focused on wine production, with a distance between rows of 3 m, a vine spacing of 2 m, and a north–south row orientation.
A total of four UAV flights, two over each vineyard, were performed; the first was on 29 July 2015 and the second on 16 September 2015, depicting two different crop stages. All UAV flights were performed under similar weather conditions. In July, the grapevine canopies were fully developed, while in September, the grapes were machine-harvested. Flying at two crop stages made it possible to analyze different situations to test the validity of the proposed methodology. The UAV used was an MD4-1000 multi-rotor drone (Microdrones GmbH, Siegen, Germany). This UAV is a quadcopter with a maximum payload of 1.2 kg. It uses 4 × 250 W gearless brushless motors and reaches a cruising speed of 15 m/s. The UAV was equipped with an Olympus PEN E-PM1 (Olympus Corporation, Tokyo, Japan) RGB (R: Red; G: Green; B: Blue) camera with a focal length of 14 mm. The registered images have a dimension of 4032 × 3024 pixels and a pixel pitch of 4.3 μm. UAV flights were performed at 30 m above ground level, with a ground sample distance of 1 cm and a ground image footprint of 37 × 28 m. Images were registered in continuous mode at one-second intervals, resulting in 93% forward lap and 60% side lap. These high overlaps allowed an accurate 3D reconstruction of woody crops to be achieved, according to previous investigations [40]. Five ground control points (GCPs) were placed per vineyard, one in each corner and one in the center. Each GCP was measured with the stop-and-go technique using a Trimble GeoXH 2008 Series receiver (Trimble, Sunnyvale, CA, USA) to georeference the DSM and orthomosaic in the photogrammetric processing.
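As a point of reference, the reported ground sample distance and image footprint follow directly from the focal length, pixel pitch, and flying height; the short Python sketch below (with illustrative variable names) reproduces the approximately 1 cm and 37 × 28 m values quoted above.

```python
# A quick check of the reported flight geometry: with a 14 mm focal length,
# 4.3 micrometer pixel pitch, and 30 m flying height, the derived ground sample
# distance and image footprint match the values quoted in the text.
focal_length_m = 14e-3
pixel_pitch_m = 4.3e-6
flying_height_m = 30.0
image_size_px = (4032, 3024)

gsd_m = pixel_pitch_m * flying_height_m / focal_length_m       # ~0.0092 m per pixel
footprint_m = tuple(n * gsd_m for n in image_size_px)          # ~(37.2, 27.9) m
print(f"GSD: {gsd_m * 100:.2f} cm, footprint: {footprint_m[0]:.0f} x {footprint_m[1]:.0f} m")
```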
Figure 2 summarizes the workflow used to classify the points. RGB images were photogrammetrically processed to obtain a 3D RGB point cloud. From this RGB information, a CVI was calculated for each point, obtaining a 3D CVI point cloud. A sample was then extracted to calculate a threshold value separating vegetation and non-vegetation points. Finally, this threshold value was applied to the 3D CVI point cloud to classify the points.
To produce a very dense point cloud for each UAV flight, aerial triangulation was first performed to determine the external orientation (position and attitude) of each image in the photogrammetric block. Afterwards, point clouds were generated using SfM techniques. SfM works under the principles of stereoscopic photogrammetry, using well-defined geometrical features registered in multiple images from different points of view [66]. This methodology has been validated in previous research projects [36,67], and Agisoft PhotoScan Professional Edition software (Agisoft LLC, St. Petersburg, Russia) was used for the photogrammetric processing.

2.2. Point Cloud Classification

A crucial step of the proposed methodology is the accurate classification of the points into vegetation and non-vegetation classes. The non-vegetation class represents the bare soil as well as the trunks and branches of the vines. The classification task took advantage of the capacity of certain CVIs to enhance the discrimination of vegetation points. In this research, six CVIs were calculated based on our previous experience [46]: excess of blue (ExB), excess of green (ExG), excess of red (ExR), excess of green minus excess of red (ExGR), color index of vegetation extraction (CIVE), and normalized green–red difference index (NGRDI) (Table 1).
Taking into account that each single point has information in the RGB color space, prior to the calculation of the indices, a color space normalization was applied to each point following the normalization scheme described in [74]. As a result, normalized spectral components r, g, and b ranging in [0,1] were obtained according to Equation (1):
r = R / (R + G + B),  g = G / (R + G + B),  b = B / (R + G + B)
where R, G, and B are normalized RGB values ranging in [0,1] obtained according to Equation (2):
R = R / R_max,  G = G / G_max,  B = B / B_max
Here, R_max = G_max = B_max = 255 for a 24-bit radiometric resolution.
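As an illustration of Equations (1) and (2) and of the indices in Table 1, the following Python/numpy sketch computes the normalized components and the six CVIs for arrays of per-point RGB values. The authors implemented this step as a Matlab script over a LAS file; the array names and the small epsilon guard against division by zero are assumptions made here for clarity.

```python
import numpy as np

def normalized_rgb(red, green, blue, max_value=255.0):
    """Normalize per-point RGB values to r, g, b in [0, 1] (Equations (1)-(2))."""
    R, G, B = red / max_value, green / max_value, blue / max_value
    total = R + G + B + 1e-9               # epsilon avoids division by zero on black points
    return R / total, G / total, B / total

def color_vegetation_indices(r, g, b):
    """Return the six CVIs of Table 1 as per-point arrays."""
    exg = 2.0 * g - r - b
    exr = 1.4 * r - g
    return {
        "ExB": 1.4 * b - g,
        "ExG": exg,
        "ExR": exr,
        "ExGR": exg - exr,
        "CIVE": 0.4412 * r - 0.811 * g + 18.78745,
        "NGRDI": (g - r) / (g + r + 1e-9),
    }
```

A LAS reader such as laspy could supply the red, green, and blue arrays; note that LAS files commonly store colors as 16-bit values, in which case the normalization constant would be 65,535 rather than 255.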
Therefore, through a script developed in Matlab, the original RGB point cloud was converted into a grey-scale point cloud with the CVI value as an attribute, for each of the CVIs shown in Table 1. The script takes a LAS file of the point cloud as input and reads the RGB values of each point to calculate the CVI values. As output, it generates an ASCII file for each CVI, storing the coordinates of each point as well as the index value. The potential of each CVI to discriminate vegetation and non-vegetation was evaluated by applying the M-Statistic [75] (Equation (3)), where μ and σ are, respectively, the mean and standard deviation of each class. Normality of the distributions was evaluated by the Lilliefors test. The M-Statistic quantifies the degree of discrimination between the two classes by evaluating the separation of their histograms.
M = (μ_class1 − μ_class2) / (σ_class1 + σ_class2)
A value of M lower than 1 means that the histograms overlap significantly and therefore offer poor discrimination. On the other hand, a value of M higher than 1 means that the histograms are well separated, providing adequate discrimination. To calculate the mean and standard deviation for each class, a stratified systematic unaligned strategy was used as the sampling method to select points. For that, the UAV point clouds from every field and date were divided into regularly spaced regions of 10 × 10 m, and these regions were in turn divided into smaller sampling units of 0.1 × 0.1 m. For each region, units containing points belonging to a single class were selected manually, taking into account in-field differences. To do this, points were displayed in their RGB color in top, frontal, and side views. Of all the CVIs in Table 1, the one that showed the highest value in the M-Statistic test was used as the index to classify the point cloud.
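A minimal sketch of the M-Statistic of Equation (3), applied to the CVI values of the sampled vegetation and non-vegetation points, is shown below; taking the absolute difference of the means so that the class ordering does not matter is an assumption about the authors' implementation.

```python
import numpy as np

def m_statistic(cvi_vegetation, cvi_non_vegetation):
    """Histogram separation between classes: M > 1 indicates good discrimination."""
    mu_v, mu_n = np.mean(cvi_vegetation), np.mean(cvi_non_vegetation)
    sigma_v, sigma_n = np.std(cvi_vegetation), np.std(cvi_non_vegetation)
    return abs(mu_v - mu_n) / (sigma_v + sigma_n)
```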
Once the most appropriate CVI was selected, the next step was to determine a threshold to separate the two classes. To binarize the grey-scale point cloud, a threshold value which maximizes the variance between the vegetation and non-vegetation points was chosen using Otsu's method [76]. Otsu's method analyzes the histogram of CVI values; the bimodal distribution, with two normal distributions, one representing vegetation and the other representing non-vegetation, was verified through Sarle's bimodality coefficient (SBC). Otsu's method provides a threshold value by maximizing the between-class variance and minimizing the within-class variance of the values. Owing to the very high point density, threshold determination was performed on a sample of points from the original point cloud. This reduced point cloud was formed by reading one point out of every ten (10% of the total points). All points with a CVI value equal to or below the calculated threshold were assigned to the vegetation class, while all points with a CVI value greater than this threshold were assigned to the non-vegetation class.
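The thresholding step can be sketched as follows, using scikit-image's threshold_otsu in place of the authors' Matlab code; following the text, the threshold is computed on a 10% subsample and points at or below it are labeled vegetation.

```python
import numpy as np
from skimage.filters import threshold_otsu

def classify_points(cvi_values):
    """Return a boolean mask that is True for points classified as vegetation."""
    subsample = cvi_values[::10]             # one point out of every ten, as in the text
    threshold = threshold_otsu(subsample)    # maximizes the between-class variance
    return cvi_values <= threshold
```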
Matlab software (The MathWorks, Natick, MA, USA) was employed to perform the point cloud classification, and R software (R Development Core Team, 2012) was used to perform the data analysis.

2.3. Validation of Vineyard Height Estimation

Vineyard height validation using the point cloud classification was carried out using 40 on-ground validation points randomly distributed and georeferenced in each field for each UAV flight. A photograph of the vine branch together with a ruler was taken to measure the grapevine height at each of the 40 on-ground points (Figure 3). Then, the measured on-ground heights were compared with the heights estimated from the classified point clouds. For this purpose, for each measured position in every field and date, the nearest points belonging to the terrain and vegetation classes were located in the classified point cloud to calculate the height of every grapevine. The coefficient of determination (R²) derived from a linear regression model and the root-mean-square error (RMSE) of this adjustment were calculated using R Commander software [77], considering each UAV flight both independently and jointly, in order to analyze the quality of the adjustment.
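The height estimation and validation step can be sketched as follows; the use of a KD-tree to find the nearest soil and vegetation points around each on-ground checkpoint is an assumption (the text only states that the nearest points of each class were located), and R² is taken from the linear fit between estimated and measured heights.

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_heights(checkpoint_xy, soil_xyz, vegetation_xyz):
    """Estimated vine height = nearest vegetation elevation minus nearest soil elevation."""
    soil_idx = cKDTree(soil_xyz[:, :2]).query(checkpoint_xy)[1]
    veg_idx = cKDTree(vegetation_xyz[:, :2]).query(checkpoint_xy)[1]
    return vegetation_xyz[veg_idx, 2] - soil_xyz[soil_idx, 2]

def r2_and_rmse(measured, estimated):
    """Coefficient of determination of the linear fit and RMSE of the estimates."""
    slope, intercept = np.polyfit(estimated, measured, 1)
    residuals = measured - (slope * estimated + intercept)
    r2 = 1.0 - np.sum(residuals ** 2) / np.sum((measured - measured.mean()) ** 2)
    rmse = np.sqrt(np.mean((estimated - measured) ** 2))
    return r2, rmse
```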
In addition, one-way and two-way Analysis of Variance (ANOVA) tests were applied to evaluate significant differences in height errors. The Shapiro–Wilk test and Bartlett's test were used to assess the normality and homoscedasticity of variances, respectively, to meet the ANOVA assumptions.
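The normality, homoscedasticity, and one-way ANOVA checks can be reproduced with scipy, as in the sketch below (the authors used R and R Commander; the two-way ANOVA is not reproduced here, and grouping the errors by UAV flight is an assumption).

```python
from scipy import stats

def flightwise_tests(errors_by_flight):
    """errors_by_flight: one array of vine height errors per UAV flight."""
    pooled = [error for flight in errors_by_flight for error in flight]
    normality = stats.shapiro(pooled)                  # Shapiro-Wilk W and p-value
    homoscedasticity = stats.bartlett(*errors_by_flight)
    anova = stats.f_oneway(*errors_by_flight)          # one-way ANOVA across flights
    return normality, homoscedasticity, anova
```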

3. Results

Figure 4 shows a partial top view of each of the point clouds generated for each UAV flight over the two fields in July and September. From visual analysis, it can be observed that the vines had a larger cross-section in July (Figure 4a,c) than in September (Figure 4b,d) because of the different phenological stages, setting (young berries growing) and harvest, respectively. In addition, the green color of the vegetation was much more intense in July than in September. On the other hand, although both fields had cover crops between the crop rows, the cover crop appears to be much more widespread in Field A than in Field B. Of the four UAV flights, the one carried out over Field B in July (Figure 4c) offered the best conditions for a manual interpretation to separate the vegetation from the bare soil, also making it easier to differentiate between the cover crop and the vines.

3.1. Color Vegetation Index Selection

CVIs were calculated from the manually selected points belonging to the vegetation class, including vines and cover crops, and the bare soil class (Table 2 shows the sample size for each field, date, and class). Firstly, the Lilliefors test was employed to examine the normality of each CVI per class, showing a p-value higher than 0.05 in all cases. Table 3 and Figure 5 show the M-Statistic results for each field and UAV flight. All M-Statistic values obtained, irrespective of UAV flight and CVI, were greater than 1. Therefore, all of them are suitable for isolating points belonging to the vegetation class from those belonging to the bare soil class. The lowest M-Statistic value was 1.20, obtained in the flight over Field A in September with the ExB index (Figure 5, FA-2). In contrast, the highest value was 2.24, obtained in Field B in July with the NGRDI index (Figure 5, FB-1). With the exception of the UAV flight over Field B in July (Figure 5, FB-1), the UAV flights showed very similar behavior in terms of the range of the M-Statistic values. In the flight over Field B in July, the M-Statistic values were the highest; therefore, the separation between classes was better. This coincides with the previous visual analysis (Figure 4). From the analysis of the data in Table 3, the higher M-Statistic values for the UAV flight over Field B in July were due to the following: (a) the sum of the standard deviations of the two classes in the July flight over Field A was always greater; (b) the same behavior appeared in September in Field B; and, finally, (c) the differences between the class means in September in Field A were always smaller. This difference with respect to the September UAV flights may be due to the fact that the vegetation on this date had a less intense green color, its tonality approaching that of the bare soil. Therefore, the differences between the normal distributions of each class were smaller and, consequently, a lower M-Statistic value was obtained in September. On the other hand, the cover crop and vines in Field A in July (Figure 4a) presented a more accentuated green difference than those in Field B (Figure 4c). As a consequence, the deviation of the CVI in the vegetation class was greater in Field A, and therefore the M value was lower. Since the NGRDI index showed the highest values in the M-Statistic test on all flights, this index was selected for the rest of the results.

3.2. Point Cloud Classification

As an example, Figure 6 shows the outcome of each stage in the proposed CVI-based point cloud classification process. The CVI showed an SBC higher than 5/9, indicating a bimodal distribution with two differentiated peaks. Starting from the original point cloud (Figure 6a), the CVI was calculated in order to later determine the separation threshold between classes. This made it possible to obtain a first classification of the point cloud, differentiating between vegetation (Figure 6b) and preliminary non-vegetation (Figure 6c) points. The points classified as vegetation represented the canopy of every grapevine (Figure 6b). However, the points classified as non-vegetation (Figure 6c) represented bare soil, the trunk and branches of the vines, and some vegetation of both the vines and the cover crop. The points belonging to the trunk and branches of the vines, not presenting greenish tones, were correctly classified within the non-vegetation class together with the bare soil points. However, as can be seen in Figure 6c, there were still points that corresponded to vegetation but were classified as non-vegetation. This could be because these points presented a color that, although green, was not as green as that of the points classified as vegetation. Thus, it was necessary to reclassify the non-vegetation class to detect new points belonging to the vegetation class. The classification procedure applied to these points began with the calculation of a new threshold based on their CVI values. From this threshold it was possible to classify a new set of points as vegetation (Figure 6d), differentiating them from non-vegetation points (Figure 6e). These newly classified vegetation points (Figure 6d) represented both vine and cover crop points. On the other hand, the set of points classified as non-vegetation corresponded to bare soil, as well as the trunk and branches of the vines (Figure 6e). In addition, a reduced number of points were still classified as non-vegetation although they corresponded to the canopy of the vines. The reason for this may be that they were affected by shadows, presenting a color other than green. As a result, the points belonging to the vegetation class were extracted through a two-stage classification process (Figure 6f).
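A compact sketch of this two-stage classification, again using Otsu's threshold on the CVI values, is given below; it is an illustration of the procedure described above, not the authors' Matlab implementation.

```python
import numpy as np
from skimage.filters import threshold_otsu

def two_stage_vegetation_mask(cvi_values):
    """Stage 1: global Otsu threshold; stage 2: re-threshold the remaining points."""
    vegetation = cvi_values <= threshold_otsu(cvi_values[::10])      # stage 1
    remaining = np.where(~vegetation)[0]
    second_threshold = threshold_otsu(cvi_values[remaining])         # stage 2
    vegetation[remaining[cvi_values[remaining] <= second_threshold]] = True
    return vegetation
```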

3.3. Vine Height Quantification

Figure 7 shows the accuracy and graphical comparisons between the measured and UAV-estimated vine heights for each field and date. Independently of flight date and field, all fitted models reported a high correlation (R² higher than 0.871) and a low RMSE (lower than 0.076 m). In addition, most of the points were close to the 1:1 line, which indicates an adequate fit between the UAV-estimated vine height and the measured height. The best results were obtained in the UAV flight over Field B in July (Figure 7c), matching the visual analysis carried out on the four point clouds (Figure 4). The Shapiro–Wilk test (W = 0.956, p-value = 0.154) and Bartlett's test (p-value = 0.184) confirmed the normality and homoscedasticity of the vine height error distribution. A one-way ANOVA test was used to study the differences in vine height error among the UAV flights, with the conclusion that there were no significant differences (d.f. = 3/156, F = 1.965, p > 0.05). In addition, the results of the two-way ANOVA test (Table 4) showed that there was no statistically significant interaction between flight dates and fields.
Figure 8 shows the accuracy and graphical comparison between the measured and UAV-estimated vine heights, considering jointly all vine heights from the four UAV flights. The linear regression model estimated a slope of 0.93502 (p-value < 2 × 10⁻¹⁶) and an intercept of 0.052 m (p-value = 0.006), with a residual standard error of 0.063 m. The estimated vine heights showed a very high coefficient of determination (R² = 0.91) and a low RMSE (0.070 m). These results are in the same range as those obtained in other research works using other methodologies, such as OBIA in the case of vineyards [57] or olive groves [40]. The points were close to the 1:1 line, indicating an adequate fit between the UAV-estimated height and the on-ground measured height. Figure 9 shows a diagnosis of the linear model. The residuals were equally spread around the dotted horizontal line (Figure 9a), indicating that a linear model is appropriate, with no systematic pattern in the residuals. The normal Q–Q plot (Figure 9b) shows that the residuals were normally distributed, lying close to the straight line and not deviating severely. In addition, Shapiro–Wilk normality testing was applied to the residuals, with a result of W = 0.974 (p = 0.149). Given that the p-value was higher than 0.05, the residual distribution was not significantly different from a normal distribution. Figure 9c shows that the residuals were equally spread along the range of predictors and therefore have equal variance. In addition, Bartlett's test, analyzing the homogeneity of variances across the four UAV flights, gave a p-value of 0.268 and therefore showed no evidence of differences between UAV flights. Finally, extreme values were not influential in determining the linear regression model: Figure 9d shows that no residual exceeds the Cook's distance threshold and, therefore, the regression results would not be altered if any single measurement were excluded.

4. Discussion

The results shown are slightly better than those obtained in other works using UAV flights. Other authors [57,78,79] report R² values close to 0.8 and an RMSE of 0.17 m. All of these authors estimated vineyard height from raster vegetation height models instead of 3D point clouds, which may explain the slightly poorer height estimation. De Castro et al. (2018) [57] used UAV flights for height determination based on the use of a DSM and OBIA, obtaining good results, although rasterizing the information makes the results dependent on the pixel size used. On the other hand, considering measurements from on-ground sensors such as LiDAR, R² values similar to those obtained in this work are reached, but with a lower RMSE of around 3 cm [80]. Although the results obtained show a slightly higher RMSE than the on-ground LiDAR measurements, this does not affect decision-making in crop management [80]. In addition, using point clouds generated from UAV flights, other authors have obtained an RMSE in row height determination between 3 cm and 29 cm [81], with manual intervention being necessary in the classification process, which could make the process less time-efficient. To solve this problem, other authors have developed unsupervised methods without manual intervention [82], which require a series of parameters to be set in the classification process, meaning that the quality of the process depends directly on the values selected for these parameters. Based on the results presented herein, the use of a CVI on 3D point clouds obtained from UAV flights allows the correct classification of vegetation and the creation of a digital terrain model (DTM) based on the soil points. To our knowledge, CVIs had not previously been applied to automatically classify point clouds. The results obtained can later be used in the structural characterization of vineyards, although it will be necessary to validate this methodology in other woody crops. The use of CVIs as proposed in this research for classifying 3D point clouds at two grapevine growth stages allowed a fully automatic method to be used, without the need for human intervention or the selection of any parameter prior to classification. Therefore, the results depend only on the radiometric information stored in each of the points, without any geometric considerations such as slope or distance.
Once every grapevine is classified and its height quantified (Figure 10), this information could be used to identify smaller-sized areas for more specific in-season management (e.g., potential fungal diseases or water stress) or as a baseline to start an accurate digital transformation of the vineyards for further improvement of the entire management system, avoiding laborious and inconsistent manual on-ground measurements. In addition, the proposed methodology can be used for other PV purposes. Vine volume, obtained by multiplying the height by the area covered by the points classified as vegetation, or leaf area index, defined as the projected leaf area represented by the vegetation point cloud per unit of ground area, are potential applications to be analyzed and evaluated in future research.

5. Conclusions

A fully automatic and accurate method was developed for the classification of 3D point clouds based only on the color information of the points; this method was then tested in two commercial vineyards at two different growth stages. Using the RGB information of each point as input for the classification algorithm, the color index value was first calculated. The threshold of separation between classes was then automatically determined, allowing the points belonging to the vegetation class in the study area to be isolated. The process was carried out completely automatically, with no need to select any parameter or for previous training, eliminating any possible error inherent to manual intervention. In this way, the results are independent of the conditions and state of the crop.
The results were used to calculate the heights of the vines with satisfactory quality, as validated in two fields and at two development stages of the crop. Future work will, therefore, be dedicated to determining additional structural characteristics of the vines, such as volume or crown width, to help in crop monitoring and to support decision-making. In addition, future work should use CVIs to estimate variables like biomass by constructing accurate 3D vineyard maps at each phenological phase.

Author Contributions

F.-J.M.-C., A.I.d.C., J.T.-S., and F.L.-G. conceived and designed the experiments; J.T.-S., F.M.J.-B. and F.L.-G. performed the experiments; F.-J.M.-C., A.G.-F., J.T.-S., P.T.-T. and F.L.-G. analyzed the data; A.I.d.C. and F.L.-G. contributed equipment and analysis tools; F.-J.M.-C. and F.L.-G. wrote the paper; A.I.d.C., J.T.-S., and F.M.J.-B. collaborated in the discussion of the results and revised the manuscript. All authors have read and approved the manuscript.

Funding

This research was funded by the AGL2017-82335-C4-4R project (Spanish Ministry of Science, Innovation and Universities, AEI-EU FEDER funds).

Acknowledgments

The authors thank RAIMAT S.A. for allowing the fieldwork and UAV flights in its vineyards.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cook, S.E.; Bramley, R.G.V. Precision agriculture—opportunities, benefits and pitfalls of site-specific crop management in Australia. Aust. J. Exp. Agric. 1998, 38, 753–763. [Google Scholar] [CrossRef]
  2. Whelan, B.M.; McBratney, A.B. The “Null Hypothesis” of Precision Agriculture Management. Precis. Agric. 2000, 2, 265–279. [Google Scholar] [CrossRef]
  3. Bramley, R.G.V.; Hamilton, R.P. Understanding variability in winegrape production systems. 1. Within vineyard variation in yield over several vintages. Aust. J. Grape Wine Res. 2004, 10, 32–45. [Google Scholar] [CrossRef]
  4. Schieffer, J.; Dillon, C. The economic and environmental impacts of precision agriculture and interactions with agro-environmental policy. Precis. Agric. 2015, 16, 46–61. [Google Scholar] [CrossRef]
  5. Bramley, R.; Pearse, B.; Chamberlain, P. Being profitable precisely—A case study of precision viticulture from Margaret River. Aust. N. Zealand Grapegrow. Winemak. Available online: http://www.nwvineyards.net/docs/PVProfitabiltyPaper.pdf (accessed on 1 October 2019).
  6. Llorens, J.; Gil, E.; Llop, J.; Escolà, A. Variable rate dosing in precision viticulture: Use of electronic devices to improve application efficiency. Crop Prot. 2010, 29, 239–248. [Google Scholar] [CrossRef] [Green Version]
  7. Ballesteros, R.; Ortega, J.F.; Hernández, D.; Moreno, M.Á. Characterization of Vitis vinifera L. Canopy Using Unmanned Aerial Vehicle-Based Remote Sensing and Photogrammetry Techniques. Am. J. Enol. Vitic. 2015, 66, 120. [Google Scholar] [CrossRef]
  8. Boomsma, C.R.; Santini, J.B.; West, T.D.; Brewer, J.C.; McIntyre, L.M.; Vyn, T.J. Maize grain yield responses to plant height variability resulting from crop rotation and tillage system in a long-term experiment. Soil Tillage Res. 2010, 106, 227–240. [Google Scholar] [CrossRef]
  9. Tilly, N.; Hoffmeister, D.; Cao, Q.; Huang, S.; Lenz-Wiedemann, V.; Miao, Y.; Bareth, G. Multitemporal crop surface models: Accurate plant height measurement and biomass estimation with terrestrial laser scanning in paddy rice. J. Appl. Remote Sens. 2014, 8, 1–23. [Google Scholar] [CrossRef]
  10. Pereira, A.R.; Green, S.; Villa Nova, N.A. Penman–Monteith reference evapotranspiration adapted to estimate irrigated tree transpiration. Agric. Water Manag. 2006, 83, 153–161. [Google Scholar] [CrossRef]
  11. Cohen, S.; Fuchs, M.; Moreshet, S.; Cohen, Y. The distribution of leaf area, radiation, photosynthesis and transpiration in a Shamouti orange hedgerow orchard. Part II. Photosynthesis, transpiration, and the effect of row shape and direction. Agric. Meteorol. 1987, 40, 145–162. [Google Scholar] [CrossRef]
  12. Fuchs, M.; Cohen, Y.; Moreshet, S. Determining transpiration from meteorological data and crop characteristics for irrigation management. Irrig. Sci. 1987, 8, 91–99. [Google Scholar] [CrossRef]
  13. Stanton, C.; Starek, M.J.; Elliott, N.; Brewer, M.; Maeda, M.M.; Chu, T. Unmanned aircraft system-derived crop height and normalized difference vegetation index metrics for sorghum yield and aphid stress assessment. J. Appl. Remote Sens. 2017, 11, 1–20. [Google Scholar] [CrossRef] [Green Version]
  14. Sio-Se Mardeh, A.; Ahmadi, A.; Poustini, K.; Mohammadi, V. Evaluation of drought resistance indices under various environmental conditions. Field Crop. Res. 2006, 98, 222–229. [Google Scholar] [CrossRef]
  15. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  16. Dempewolf, J.; Nagol, J.; Hein, S.; Thiel, C.; Zimmermann, R. Measurement of within-season tree height growth in a mixed forest stand using UAV imagery. Forests 2017, 8, 231. [Google Scholar] [CrossRef] [Green Version]
  17. Birdal, A.C.; Avdan, U.; Türk, T. Estimating tree heights with images from an unmanned aerial vehicle. Geomat. Nat. Hazards Risk 2017, 8, 1144–1156. [Google Scholar] [CrossRef] [Green Version]
  18. Johansen, K.; Raharjo, T.; McCabe, M. Using multi-spectral UAV imagery to extract tree crop structural properties and assess pruning effects. Remote. Sens. 2018, 10, 854. [Google Scholar] [CrossRef] [Green Version]
  19. Del-Campo-Sanchez, A.; Ballesteros, R.; Hernandez-Lopez, D.; Ortega, J.F.; Moreno, M.A.; Agroforestry and Cartography Precision Research Group. Quantifying the effect of Jacobiasca lybica pest on vineyards with UAVs by combining geometric and computer vision techniques. PLoS ONE 2019, 14, e0215521. [Google Scholar] [CrossRef] [Green Version]
  20. Lin, G.; Tang, Y.; Zou, X.; Li, J.; Xiong, J. In-field citrus detection and localisation based on RGB-D image analysis. Biosyst. Eng. 2019, 186, 34–44. [Google Scholar] [CrossRef]
  21. Lin, G.; Tang, Y.; Zou, X.; Xiong, J.; Li, J. Guava detection and pose estimation using a low-cost RGB-D sensor in the field. Sensors 2019, 19, 428. [Google Scholar] [CrossRef] [Green Version]
  22. Lee, K.-H.; Ehsani, R. A Laser Scanner Based Measurement System for Quantification of Citrus Tree Geometric Characteristics. Appl. Eng. Agric. 2009, 25, 777–788. [Google Scholar] [CrossRef]
  23. Bongers, F. Methods to assess tropical rain forest canopy structure: An overview. In Tropical Forest Canopies: Ecology and Management: Proceedings of ESF Conference, Oxford University, 12–16 December 1998; Linsenmair, K.E., Davis, A.J., Fiala, B., Speight, M.R., Eds.; Springer: Dordrecht, The Netherlands, 2001; pp. 263–277. [Google Scholar] [CrossRef]
  24. Côté, J.-F.; Fournier, R.A.; Verstraete, M.M. Canopy Architectural Models in Support of Methods Using Hemispherical Photography. In Hemispherical Photography in Forest Science: Theory, Methods, Applications; Fournier, R.A., Hall, R.J., Eds.; Springer: Dordrecht, The Netherlands, 2017; pp. 253–286. [Google Scholar] [CrossRef]
  25. Phattaralerphong, J.; Sinoquet, H. A Method for 3D Reconstruction of Tree Canopy Volume from Photographs: Assessment from 3D Digitised Plants. Available online: https://www.researchgate.net/publication/281471747_A_method_for_3D_reconstruction_of_tree_canopy_volume_photographs_assessment_from_3D_digitised_plants (accessed on 1 October 2019).
  26. Giuliani, R.; Magnanini, E.; Fragassa, C.; Nerozzi, F. Ground monitoring the light–shadow windows of a tree canopy to yield canopy light interception and morphological traits. Plant Cell Environ. 2000, 23, 783–796. [Google Scholar] [CrossRef]
  27. Kise, M.; Zhang, Q. Development of a stereovision sensing system for 3D crop row structure mapping and tractor guidance. Biosyst. Eng. 2008, 101, 191–198. [Google Scholar] [CrossRef]
  28. Schumann, A.W.; Zaman, Q.U. Software development for real-time ultrasonic mapping of tree canopy size. Comput. Electron. Agric. 2005, 47, 25–40. [Google Scholar] [CrossRef]
  29. Llorens, J.; Gil, E.; Llop, J. Ultrasonic and LIDAR sensors for electronic canopy characterization in vineyards: Advances to improve pesticide application methods. Sensors 2011, 11, 2177–2194. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  30. Rosell, J.R.; Sanz, R. A review of methods and applications of the geometric characterization of tree crops in agricultural activities. Comput. Electron. Agric. 2012, 81, 124–141. [Google Scholar] [CrossRef] [Green Version]
  31. Fernández-Sarría, A.; Martínez, L.; Velázquez-Martí, B.; Sajdak, M.; Estornell, J.; Recio, J.A. Different methodologies for calculating crown volumes of Platanus hispanica trees using terrestrial laser scanner and a comparison with classical dendrometric measurements. Comput. Electron. Agric. 2013, 90, 176–185. [Google Scholar] [CrossRef]
  32. Llorens, J.; Gil, E.; Llop, J.; Queraltó, M. Georeferenced LiDAR 3D vine plantation map generation. Sensors 2011, 11, 6237–6256. [Google Scholar] [CrossRef]
  33. Andújar, D.; Moreno, H.; Bengochea-Guevara, J.M.; de Castro, A.; Ribeiro, A. Aerial imagery or on-ground detection? An economic analysis for vineyard crops. Comput. Electron. Agric. 2019, 157, 351–358. [Google Scholar] [CrossRef]
  34. Díaz-Varela, R.; de la Rosa, R.; León, L.; Zarco-Tejada, P. High-resolution airborne UAV imagery to assess olive tree crown parameters using 3D photo reconstruction: Application in breeding trials. Remote Sens. 2015, 7, 4213–4232. [Google Scholar] [CrossRef] [Green Version]
  35. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712. [Google Scholar] [CrossRef]
  36. Mesas-Carrascosa, F.; Rumbao, I.; Berrocal, J.; Porras, A. Positional quality assessment of orthophotos obtained from sensors onboard multi-rotor UAV platforms. Sensors 2014, 14, 22394–22407. [Google Scholar] [CrossRef] [PubMed]
  37. Mesas-Carrascosa, F.-J.; Notario García, M.; Meroño de Larriva, J.; García-Ferrer, A. An analysis of the influence of flight parameters in the generation of unmanned aerial vehicle (UAV) orthomosaicks to survey archaeological areas. Sensors 2016, 16, 1838. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  38. Sankaran, S.; Khot, L.R.; Espinoza, C.Z.; Jarolmasjed, S.; Sathuvalli, V.R.; Vandemark, G.J.; Miklas, P.N.; Carter, A.H.; Pumphrey, M.O.; Knowles, N.R.; et al. Low-altitude, high-resolution aerial imaging systems for row and field crop phenotyping: A review. Eur. J. Agron. 2015, 70, 112–123. [Google Scholar] [CrossRef]
  39. Guo, Q.; Su, Y.; Hu, T.; Zhao, X.; Wu, F.; Li, Y.; Liu, J.; Chen, L.; Xu, G.; Lin, G.; et al. An integrated UAV-borne lidar system for 3D habitat mapping in three forest ecosystems across China. Int. J. Remote Sens. 2017, 38, 2954–2972. [Google Scholar] [CrossRef]
  40. Torres-Sánchez, J.; López-Granados, F.; Serrano, N.; Arquero, O.; Peña, J.M. High-Throughput 3-D Monitoring of Agricultural-Tree Plantations with Unmanned Aerial Vehicle (UAV) Technology. PLoS ONE 2015, 10, e0130479. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  41. Mesas-Carrascosa, F.-J.; Pérez-Porras, F.; Meroño de Larriva, J.; Mena Frau, C.; Agüera-Vega, F.; Carvajal-Ramírez, F.; Martínez-Carricondo, P.; García-Ferrer, A. Drift correction of lightweight microbolometer thermal sensors on-board unmanned aerial vehicles. Remote Sens. 2018, 10, 615. [Google Scholar] [CrossRef] [Green Version]
  42. Candiago, S.; Remondino, F.; De Giglio, M.; Dubbini, M.; Gattelli, M. Evaluating multispectral images and vegetation indices for precision farming applications from UAV images. Remote Sens. 2015, 7, 4026–4047. [Google Scholar] [CrossRef] [Green Version]
  43. Adão, T.; Hruška, J.; Pádua, L.; Bessa, J.; Peres, E.; Morais, R.; Sousa, J. Hyperspectral imaging: A review on UAV-based sensors, data processing and applications for agriculture and forestry. Remote Sens. 2017, 9, 1110. [Google Scholar] [CrossRef] [Green Version]
  44. Jozkow, G.; Totha, C.; Grejner-Brzezinska, D. UAS Topographic Mapping with Velodyne Lidar Sensor. Available online: https://www.researchgate.net/profile/Grzegorz_Jozkow/publication/307536902_UAS_TOPOGRAPHIC_MAPPING_WITH_VELODYNE_LiDAR_SENSOR/links/57f7ddf608ae280dd0bcc8e8/UAS-TOPOGRAPHIC-MAPPING-WITH-VELODYNE-LiDAR-SENSOR.pdf (accessed on 1 October 2019).
  45. Nagai, M.; Chen, T.; Shibasaki, R.; Kumagai, H.; Ahmed, A. UAV-Borne 3-D Mapping System by Multisensor Integration. IEEE Trans. Geosci. Remote Sens. 2009, 47, 701–708. [Google Scholar] [CrossRef]
  46. Torres-Sánchez, J.; Peña, J.M.; de Castro, A.I.; López-Granados, F. Multi-temporal mapping of the vegetation fraction in early-season wheat fields using images from UAV. Comput. Electron. Agric. 2014, 103, 104–113. [Google Scholar] [CrossRef]
  47. Jakubowski, M.; Li, W.; Guo, Q.; Kelly, M. Delineating individual trees from LiDAR data: A comparison of vector-and raster-based segmentation approaches. Remote Sens. 2013, 5, 4163–4186. [Google Scholar] [CrossRef] [Green Version]
  48. Chen, M.; Tang, Y.; Zou, X.; Huang, K.; Li, L.; He, Y. High-accuracy multi-camera reconstruction enhanced by adaptive point cloud correction algorithm. Opt. Lasers Eng. 2019, 122, 170–183. [Google Scholar] [CrossRef]
  49. Tang, Y.; Li, L.; Wang, C.; Chen, M.; Feng, W.; Zou, X.; Huang, K. Real-time detection of surface deformation and strain in recycled aggregate concrete-filled steel tubular columns via four-ocular vision. Robot. Comput. Integr. Manuf. 2019, 59, 36–46. [Google Scholar] [CrossRef]
  50. Whiteside, T.G.; Boggs, G.S.; Maier, S.W. Comparing object-based and pixel-based classifications for mapping savannas. Int. J. Appl. Earth Obs. Geoinf. 2011, 13, 884–893. [Google Scholar] [CrossRef]
  51. Guo, Q.; Kelly, M.; Gong, P.; Liu, D. An Object-Based Classification Approach in Mapping Tree Mortality Using High Spatial Resolution Imagery. Gisci. Remote Sens. 2007, 44, 24–47. [Google Scholar] [CrossRef]
  52. Chang, A.; Eo, Y.; Kim, Y.; Kim, Y. Identification of individual tree crowns from LiDAR data using a circle fitting algorithm with local maxima and minima filtering. Remote Sens. Lett. 2013, 4, 29–37. [Google Scholar] [CrossRef]
  53. Hyyppa, J.; Kelle, O.; Lehikoinen, M.; Inkinen, M. A segmentation-based method to retrieve stem volume estimates from 3-D tree height models produced by laser scanners. IEEE Trans. Geosci. Remote Sens. 2001, 39, 969–975. [Google Scholar] [CrossRef]
  54. Kwak, D.-A.; Lee, W.-K.; Lee, J.-H.; Biging, G.S.; Gong, P. Detection of individual trees and estimation of tree height using LiDAR data. J. For. Res. 2007, 12, 425–434. [Google Scholar] [CrossRef]
  55. Jing, L.; Hu, B.; Li, J.; Noland, T. Automated Delineation of Individual Tree Crowns from Lidar Data by Multi-Scale Analysis and Segmentation. Photogramm. Eng. Remote Sens. 2012, 78, 1275–1284. [Google Scholar] [CrossRef]
  56. Jiménez-Brenes, F.M.; López-Granados, F.; de Castro, A.I.; Torres-Sánchez, J.; Serrano, N.; Peña, J.M. Quantifying pruning impacts on olive tree architecture and annual canopy growth by using UAV-based 3D modelling. Plant Methods 2017, 13, 55. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  57. De Castro, A.; Jiménez-Brenes, F.; Torres-Sánchez, J.; Peña, J.; Borra-Serrano, I.; López-Granados, F. 3-D characterization of vineyards using a novel UAV imagery-based OBIA procedure for precision viticulture applications. Remote Sens. 2018, 10, 584. [Google Scholar] [CrossRef] [Green Version]
  58. Suárez, J.C.; Ontiveros, C.; Smith, S.; Snape, S. Use of airborne LiDAR and aerial photography in the estimation of individual tree heights in forestry. Comput. Geosci. 2005, 31, 253–262. [Google Scholar] [CrossRef]
  59. Lee, H.; Slatton, K.C.; Roth, B.E.; Cropper, W.P. Adaptive clustering of airborne LiDAR data to segment individual tree crowns in managed pine forests. Int. J. Remote Sens. 2010, 31, 117–139. [Google Scholar] [CrossRef]
  60. Li, W.; Guo, Q.; Jakubowski, M.K.; Kelly, M. A New Method for Segmenting Individual Trees from the Lidar Point Cloud. Photogramm. Eng. Remote Sens. 2012, 78, 75–84. [Google Scholar] [CrossRef] [Green Version]
  61. De Castro, A.; Torres-Sánchez, J.; Peña, J.; Jiménez-Brenes, F.; Csillik, O.; López-Granados, F. An automatic random forest-OBIA algorithm for early weed mapping between and within crop rows using UAV imagery. Remote Sens. 2018, 10, 285. [Google Scholar] [CrossRef] [Green Version]
  62. Chang, A.; Jung, J.; Maeda, M.M.; Landivar, J. Crop height monitoring with digital imagery from Unmanned Aerial System (UAS). Comput. Electron. Agric. 2017, 141, 232–237. [Google Scholar] [CrossRef]
  63. Eckstein, W.; Muenkelt, O. Extracting Objects from Digital Terrain Models. Available online: https://www.spiedigitallibrary.org/conference-proceedings-of-spie/2572/0000/Extracting-objects-from-digital-terrain-models/10.1117/12.216942.short (accessed on 15 October 2019).
  64. Kraus, K.; Pfeifer, N. Determination of terrain models in wooded areas with airborne laser scanner data. ISPRS J. Photogramm. Remote Sens. 1998, 53, 193–203. [Google Scholar] [CrossRef]
  65. Axelsson, P. Processing of laser scanner data—Algorithms and applications. ISPRS J. Photogramm. Remote Sens. 1999, 54, 138–147. [Google Scholar] [CrossRef]
  66. Snavely, N.; Seitz, S.M.; Szeliski, R. Modeling the World from Internet Photo Collections. Int. J. Comput. Vis. 2008, 80, 189–210. [Google Scholar] [CrossRef] [Green Version]
  67. Mesas-Carrascosa, F.-J.; Torres-Sánchez, J.; Clavero-Rumbao, I.; García-Ferrer, A.; Peña, J.-M.; Borra-Serrano, I.; López-Granados, F. Assessing optimal flight parameters for generating accurate multispectral orthomosaicks by UAV to support site-specific crop management. Remote Sens. 2015, 7, 12793–12814. [Google Scholar] [CrossRef] [Green Version]
  68. Mao, W.; Wang, Y.; Wang, Y. Real-time detection of between-row weeds using machine vision. In Proceedings of the 2003 ASAE Annual Meeting, Las Vegas, NV, USA, 27–30 July 2003; p. 1. [Google Scholar]
  69. Woebbecke, D.; Meyer, G.; Von Bargen, K.; Mortensen, D. Shape features for identifying young weeds using image analysis. Trans. ASAE-Am. Soc. Agric. Eng. 1995, 38, 271–282. [Google Scholar] [CrossRef]
  70. Meyer, G.E.; Hindman, T.W.; Laksmi, K. Machine Vision Detection Parameters for Plant Species Identification. In Proceedings of the Precision Agriculture and Biological Quality, Boston, MA, USA, 14 January 1999; Volume 3543, pp. 327–335. [Google Scholar]
  71. Camargo Neto, J. A Combined Statistical-Soft Computing APPROACH for Classification and Mapping Weed Species in Minimum-TILLAGE Systems. 2004. Available online: https://search.proquest.com/openview/c9d042c0b775871973b4494b3233002c/1?cbl=18750&diss=y&pq-origsite=gscholar (accessed on 1 October 2019).
  72. Kataoka, T.; Kaneko, T.; Okamoto, H.; Hata, S. Crop growth estimation system using machine vision. In Proceedings of the 2003 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM 2003), Kobe, Japan, 20–24 July 2003; Volume1072, pp. b1079–b1083. [Google Scholar]
  73. Woebbecke, D.M.; Meyer, G.E.; Bargen, K.V.; Mortensen, D.A. Plant Species Identification, Size, and Enumeration Using Machine Vision Techniques on Near-Binary Images. In Proceedings of the Applications in Optical Science and Engineering, Boston, MA, USA, 12 May 1993. [Google Scholar]
  74. Gée, C.; Bossu, J.; Jones, G.; Truchetet, F. Crop/weed discrimination in perspective agronomic images. Comput. Electron. Agric. 2008, 60, 49–59. [Google Scholar] [CrossRef]
  75. Kaufman, Y.J.; Remer, L.A. Detection of forests using mid-IR reflectance: An application for aerosol studies. IEEE Trans. Geosci. Remote Sens. 1994, 32, 672–683. [Google Scholar] [CrossRef]
  76. Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef] [Green Version]
  77. Fox, J. The R commander: A basic-statistics graphical user interface to R. J. Stat. Softw. 2005, 14, 1–42. [Google Scholar] [CrossRef] [Green Version]
  78. Pádua, L.; Marques, P.; Hruška, J.; Adão, T.; Peres, E.; Morais, R.; Sousa, J. Multi-Temporal Vineyard Monitoring through UAV-Based RGB Imagery. Remote Sens. 2018, 10, 1907. [Google Scholar] [CrossRef] [Green Version]
  79. Caruso, G.; Tozzini, L.; Rallo, G.; Primicerio, J.; Moriondo, M.; Palai, G.; Gucci, R.J.V. Estimating biophysical and geometrical parameters of grapevine canopies (‘Sangiovese’) by an unmanned aerial vehicle (UAV) and VIS-NIR cameras. Vitis 2017, 56, 63–70. [Google Scholar]
  80. Madec, S.; Baret, F.; de Solan, B.; Thomas, S.; Dutartre, D.; Jezequel, S.; Hemmerlé, M.; Colombeau, G.; Comar, A. High-Throughput Phenotyping of Plant Height: Comparing Unmanned Aerial Vehicles and Ground LiDAR Estimates. Front. Plant Sci. 2017, 8. [Google Scholar] [CrossRef] [Green Version]
  81. Weiss, M.; Baret, F. Using 3D point clouds derived from UAV RGB imagery to describe vineyard 3D macro-structure. Remote Sens. 2017, 9, 111. [Google Scholar] [CrossRef] [Green Version]
  82. Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Gay, P. Unsupervised detection of vineyards by 3D point-cloud UAV photogrammetry for precision agriculture. Comput. Electron. Agric. 2018, 155, 84–95. [Google Scholar] [CrossRef]
Figure 1. Study area: (a) general context and (b) locations of parcels.
Figure 2. Flowchart used for the classification of points.
Figure 3. An example of vine height measurement.
Figure 4. A partial top view of the point clouds generated from unmanned aerial vehicle (UAV) flights over fields A and B in the months of July and September: (a) Field A July, (b) Field A September, (c) Field B July and (d) Field B September.
Figure 5. M-Statistic values for each UAV flight per Field (A and B) and month (1: July and 2: September).
Figure 6. An example of the results obtained from the point cloud classification process. Point clouds: (a) 3D RGB, (b) vegetation points, (c) preliminary non-vegetation points, (d) vegetation points, (e) non-vegetation points, (f) final vegetation points.
Figure 7. UAV vine height versus measured vine height by field and date and 95% confidence interval. The red line is the fitted linear function and the black line represents the 1:1 line. The root-mean-square error (RMSE) and the coefficient of determination (R2) derived from the linear regression fit are included (p < 0.0001). Field A: (a) July and (b) September, Field B: (c) July and (d) September.
Figure 8. A graphical comparison of UAV estimated and on-ground measured vine heights for all UAV flights corresponding to Fields A and B and both dates (1: July and 2: September). The red line is the fitted linear function and the black line represents the 1:1 line. The RMSE and R2 values (p < 0.0001) obtained from adjustment are included.
Figure 9. Residual analysis of the linear model of UAV estimated and on-ground measured vine heights: (a) residuals vs. fitted values, (b) normality of residuals, (c) scale–location, and (d) leverage vs. residuals.
Figure 10. Example of a vineyard height-map based on the proposed methodology.
Table 1. Color vegetation indices.
Color Index | Equation 1,2
Excess of Blue | ExB = 1.4·b − g
Excess of Green | ExG = 2·g − r − b
Excess of Red | ExR = 1.4·r − g
Excess of Green minus Excess Red | ExGR = ExG − ExR
Color Index of Vegetation Extraction | CIVE = 0.4412·r − 0.811·g + 18.78745
Normalized Green–Red Difference Index | NGRDI = (g − r) / (g + r)
1 r, g, and b are the normalized red, green, and blue spectral bands. 2 Equations are cited from [68,69,70,71,72,73].
Table 2. Sample size of points per plot, UAV flight date and class (vegetation and non-vegetation).
Field | Date | Vegetation [Number of Points] | Non-Vegetation [Number of Points]
A | July | 28,623 | 28,419
A | September | 26,561 | 27,134
B | July | 29,359 | 29,670
B | September | 31,894 | 31,050
Table 3. Mean and standard deviation (SD) of each color vegetation index (CVI) per field and UAV flight date for the vegetation and non-vegetation classes and result of the M-statistic test.
Field–UAV Flight Date | CVI | Vegetation Mean | Vegetation SD | Non-Vegetation Mean | Non-Vegetation SD | M-Statistic
A–July | EXG | 0.32473 | 0.15767 | 0.0086 | 0.02055 | 1.77385
A–July | EXR | 0.04268 | 0.07627 | 0.21251 | 0.03267 | 1.55906
A–July | EXB | −0.14404 | 0.11568 | 0.04441 | 0.02727 | 1.31834
A–July | EXGR | 0.28206 | 0.22842 | −0.20391 | 0.04747 | 1.76143
A–July | CIVE | 18.66369 | 0.0635 | 18.7923 | 0.00873 | 1.78067
A–July | NGRDI | 0.11897 | 0.07445 | −0.07587 | 0.0315 | 1.83896
A–September | EXG | −0.003 | 0.02099 | 0.21128 | 0.09185 | 1.89905
A–September | EXR | 0.21739 | 0.03187 | 0.09187 | 0.04643 | 1.60307
A–September | EXB | 0.05267 | 0.02171 | −0.06466 | 0.0754 | 1.20829
A–September | EXGR | −0.2204 | 0.04935 | 0.11941 | 0.13054 | 1.88897
A–September | CIVE | 18.79697 | 0.00906 | 18.70938 | 0.03758 | 1.87791
A–September | NGRDI | −0.08266 | 0.03015 | 0.06504 | 0.04723 | 1.90891
B–July | EXG | 0.00393 | 0.01493 | 0.27952 | 0.12134 | 2.02242
B–July | EXR | 0.22984 | 0.02356 | 0.08768 | 0.05069 | 1.91471
B–July | EXB | 0.03238 | 0.02103 | −0.1378 | 0.10041 | 1.40132
B–July | EXGR | −0.22591 | 0.03361 | 0.19184 | 0.16538 | 2.09936
B–July | CIVE | 18.7948 | 0.00628 | 18.68292 | 0.04843 | 2.04517
B–July | NGRDI | −0.09258 | 0.02186 | 0.07311 | 0.05187 | 2.24719
B–September | EXG | 0.0111 | 0.01794 | 0.21911 | 0.09513 | 1.8397
B–September | EXR | 0.20106 | 0.01717 | 0.12787 | 0.03742 | 1.34068
B–September | EXB | 0.05303 | 0.01948 | −0.10953 | 0.08987 | 1.4867
B–September | EXGR | −0.18996 | 0.03012 | 0.09123 | 0.12171 | 1.85206
B–September | CIVE | 18.78438 | 0.00726 | 18.7079 | 0.03759 | 1.7051
B–September | NGRDI | −0.06546 | 0.01514 | 0.03075 | 0.03644 | 1.86511
Table 4. F and p-values of the two-way ANOVA for vine height error.
Factor | F | p
Field | 3.735 | 0.1075
Flight date | 0.157 | 0.6929
Field:Flight date | 2.784 | 0.1972
