Article

Comparative Analysis of TLS and UAV Sensors for Estimation of Grapevine Geometric Parameters

by Leilson Ferreira 1,2, Joaquim J. Sousa 3,4, José M. Lourenço 5, Emanuel Peres 2,3,6, Raul Morais 2,3,6 and Luís Pádua 2,3,6,*
1 Department of Agronomy, School of Agrarian and Veterinary Sciences, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
2 Centre for the Research and Technology of Agro-Environmental and Biological Sciences, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
3 Engineering Department, School of Science and Technology, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
4 Centre for Robotics in Industry and Intelligent Systems (CRIIS), Institute for Systems and Computer Engineering, Technology and Science (INESC-TEC), 4200-465 Porto, Portugal
5 Geology Department and Geosciences Center (CGeo), University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
6 Institute for Innovation, Capacity Building and Sustainability of Agri-Food Production, University of Trás-os-Montes e Alto Douro, 5000-801 Vila Real, Portugal
* Author to whom correspondence should be addressed.
Sensors 2024, 24(16), 5183; https://doi.org/10.3390/s24165183
Submission received: 2 July 2024 / Revised: 7 August 2024 / Accepted: 9 August 2024 / Published: 11 August 2024
(This article belongs to the Section Smart Agriculture)

Abstract:
Understanding geometric and biophysical characteristics is essential for determining grapevine vigor and improving input management and automation in viticulture. This study compares point cloud data obtained from a Terrestrial Laser Scanner (TLS) and various UAV sensors including multispectral, panchromatic, Thermal Infrared (TIR), RGB, and LiDAR data, to estimate geometric parameters of grapevines. Descriptive statistics, linear correlations, significance using the F-test of overall significance, and box plots were used for analysis. The results indicate that 3D point clouds from these sensors can accurately estimate maximum grapevine height, projected area, and volume, though with varying degrees of accuracy. The TLS data showed the highest correlation with grapevine height (r = 0.95, p < 0.001; R2 = 0.90; RMSE = 0.027 m), while point cloud data from panchromatic, RGB, and multispectral sensors also performed well, closely matching TLS and measured values (r > 0.83, p < 0.001; R2 > 0.70; RMSE < 0.084 m). In contrast, TIR point cloud data performed poorly in estimating grapevine height (r = 0.76, p < 0.001; R2 = 0.58; RMSE = 0.147 m) and projected area (r = 0.82, p < 0.001; R2 = 0.66; RMSE = 0.165 m). The greater variability observed in projected area and volume from UAV sensors is related to the low point density associated with spatial resolution. These findings are valuable for both researchers and winegrowers, as they support the optimization of TLS and UAV sensors for precision viticulture, providing a basis for further research and helping farmers select appropriate technologies for crop monitoring.

1. Introduction

Precision agriculture (PA) is an agricultural management strategy that relies on data observation, acquisition, and processing to understand and respond to temporal and spatial variability. The goal is to enhance the sustainability of agricultural production by optimizing resource use, reducing risks, and minimizing environmental impacts [1]. This approach is crucial for the efficient management of agricultural systems, including viticulture [2]. Viticulture involves the cultivation of grapes for wine production, a complex process that relies on a detailed understanding of the vineyard environment [3]. Traditionally, this understanding has been obtained through labor-intensive and often imprecise methods such as direct observations and soil sampling [4]. However, with the advancement of PA, various technologies for precision viticulture (PV) have become increasingly important [5,6,7].
One of the initial steps in adopting PV involves measuring the geometric and biophysical parameters of grapevines (Vitis vinifera L.) [6]. The data collected are then interpreted and evaluated from a viticulture perspective, aiding winegrowers in the decision-making process to enhance crop production and quality [8]. The precise characterization of these parameters is essential for efficient and sustainable management, optimizing resource use, and increasing productivity [9]. Studies have indicated that the volume and external canopy structure are important factors in understanding plant growth and biological characteristics, as these can indicate vigor [10]. The size of grapevine canopies is closely correlated with the amount of intercepted sunlight, the assimilated carbon [11,12,13], and biomass, which is important for planning pesticide application and defoliation strategies [14]. Additionally, canopy volume is directly linked to water evaporation [15], making these parameters essential for assessing crop management, plant health, water use efficiency, nutritional needs, and optimizing yield [10,15]. Obtaining reliable and timely data on these geometric parameters is, therefore, of utmost importance.
Traditional methods for acquiring geometrical data on plants, such as manual measurements and visual inspections, are time-consuming and labor-intensive [16]. However, technological advancements have led to the development of sensors capable of rapid three-dimensional (3D) data capture, enabling accurate measurements of plant structures. These include Terrestrial Laser Scanners (TLS) [7,17,18,19,20], as well as Light Detection And Ranging (LiDAR) and optical sensors mounted on Unmanned Aerial Vehicles (UAVs) [21,22,23,24,25]. Data collected using these technologies can generate point clouds via Structure from Motion (SfM) techniques or directly through LiDAR, representing the field in a 3D space (X, Y, Z) [26,27,28,29]. TLS instruments allow for the non-destructive and precise digitization of physical scenes into 3D point clouds and have been extensively studied in precision farming applications [30]. TLS systems use LiDAR sensors that emit light pulses, which reflect off plants, allowing distances to be calculated from the time taken for the emitted pulses to return to the sensor [31]. While TLS methods are widely used in civil and industrial engineering, such as in scanning buildings and archaeological sites [26,32,33], they also offer significant potential in agriculture for mapping, monitoring crop growth, and optimizing management practices [34,35].
Several studies have explored the use of stationary and mobile TLS point clouds in PA and PV. These applications include geometric analysis [36,37,38], biomass estimation [17], and fruit detection [39]. In PV, most studies have focused on using 3D point clouds from mobile TLS to extract structural parameters and biophysical variables [7,34,40,41,42,43,44,45,46], with fewer studies using stationary TLS [47,48].
Pagliai et al. [34] conducted a comparative study using UAVs, mobile TLS, and a mobile application [49], to assess grapevine canopy parameters. Three data acquisition campaigns were conducted on geo-referenced grapevines, using the Leaf Area Index (LAI) as the reference value. The study found that all tools accurately distinguished zones with varying canopy sizes. The correlation between height data was highest with the mobile application, followed by TLS and UAV canopy volume measurements. The correlation between LAI and volume was moderately strong for all tools, with the highest correlation observed for mobile TLS and the lowest for UAV data. Torres-Sánchez et al. [31] used UAV imagery and mobile TLS to estimate geometric parameters, including maximum height, projected area, and volume (2.5D and convex hull) in vineyards and peach and pear orchards. A high correlation was observed between the sensors, with geometric parameters estimated by mobile TLS generally higher than those of the UAV imagery. The smallest difference between the two sensors was found in maximum height estimates. Despite these satisfactory results, these studies have only compared sensor measurements without validation against actual field measurements. Llorens et al. [50] compared mobile TLS, ultrasonic sensors, and manual canopy measurements and found that grapevine canopy width and volume measurements based on TLS data had lower coefficients of determination (R2) than those obtained with the ultrasonic and manual approaches. Rinaldi et al. [48] developed a protocol to characterize grapevine canopy geometry and determine BBCH stages using manually measured data and a stationary TLS and found a significant correlation between TLS data and LAI at each growth stage. Moreover, they found a statistically significant relationship between row volume, leaf wall area, and the grapevine’s growth stage, with high correlations for height and width between manual measurements and TLS data.
Sensors coupled to UAVs have also been used to extract geometric parameters of grapevines and other biophysical variables [8,51,52,53]. These sensors, employing computer vision and photogrammetric techniques, generate point clouds [54]. Several studies have shown the potential of 3D canopy reconstruction for assessing vineyard spatial variability [55,56] and temporal vegetative decline, detecting grapevine trunks and vineyards [57,58,59], monitoring canopy management operations [60], estimating yield [61], pruning wood weight [62], measuring leaf area index [63,64], and optimizing treatments with pesticides and fertilizers [65]. Cantürk et al. [66] demonstrated an approach using point cloud data to detect trunk positions and extract the total height, width, and volume of the grapevine canopies. The combination of UAV imagery with 3D point cloud data is widely used to assess canopy height, allowing for a feasible analysis of biophysical parameter extraction in agricultural contexts [16,36].
According to Torres-Sánchez et al. [31], TLS and UAV systems differ significantly in terms of sensor types used and captured perspectives. Most UAV-based studies in PV use passive sensors, providing an aerial view, whereas TLS systems use active sensors, offering a ground perspective. Previous studies comparing these systems found divergences in the extracted geometric parameters of plants [34,67,68], highlighting the effectiveness of TLS and UAV data for both qualitative and quantitative analysis of plant structures. However, there is a gap in the literature concerning the comparison of static TLS systems and various UAV sensors for accurately estimating grapevine geometric parameters, particularly with field measurement validation.
This article addresses this gap by comparing a static TLS scanner and multiple UAV sensors, including RGB, LiDAR, Thermal Infrared (TIR), multispectral, and panchromatic data, to evaluate their capabilities in estimating geometric parameters such as the height, canopy area, and volume of grapevines. The study validates maximum height estimates with field measurements and canopy projected area measured in a Geographical Information System (GIS). Additionally, it discusses the strengths and limitations of each evaluated sensor in estimating these metrics, providing valuable insights for both researchers and winegrowers.

2. Materials and Methods

2.1. Study Area

This study was conducted in a 0.30-hectare experimental vineyard located on the campus of the University of Trás-os-Montes e Alto Douro, Vila Real, Portugal (41°17′13.2″ N 7°44′08.7″ W WGS84, altitude: 462 m). The vineyard (Figure 1), planted with Malvasia Fina variety, is trained in a double Guyot system. The grapevines are spaced approximately 1.20 m apart within each row, with 1.80 m between the 22 rows, which are oriented NE–SW. This rainfed vineyard is managed with foliar fertilization and phytosanitary treatments throughout the growing season. The inter-row areas contain spontaneous vegetation, which is managed mechanically at least twice per season.

2.2. Data Acquisition

Both UAV surveys and TLS scans were conducted on 10 October 2023, after the harvesting period (BBCH 91) [69]. Fifteen targets, each measuring 1 × 1 m, were placed throughout the vineyard plot to assist with UAV data alignment and the georeferencing of the TLS scans. The coordinates of these targets were surveyed using a real-time kinematic (RTK) Global Navigation Satellite System (GNSS) receiver (Trimble R2, Trimble Inc., Westminster, CO, USA) in the EPSG:3763 (ETRS89/Portugal TM06) coordinate system, as shown in Figure 2c. Seven of these targets were designated as ground control points (GCPs) for UAV data alignment [70]. They were positioned in three groups: three placed in the northern part of the vineyard, three in the southern part, and one near the central area (Figure 1a). The remaining eight targets, along with one of the GCPs, were used for georeferencing and merging the TLS scans (Figure 1b). After completing data acquisition from all sensors, the heights of 20 grapevines within the area surveyed by both UAV and TLS (Figure 1b) were measured (Figure 2e). At the same time, the coordinates of each grapevine were acquired using an RTK tablet (LT700 RTK, Shanghai Huace Navigation Technology Ltd., Shanghai, China).

2.2.1. UAV Data Acquisition

UAV data were obtained using the Matrice 300 RTK (DJI, Shenzhen, China), shown in Figure 2b. This UAV supports multiple payload sensors. In this study, the Zenmuse H20T was used to acquire both TIR (640 × 512 pixels, 12,000 nm) and RGB (4056 × 3040 pixels) imagery, the Zenmuse L1 was used to acquire LiDAR point cloud data with a maximum of three returns and RGB imagery (5472 × 3648 pixels), and the Micasense ALTUM-PT (AgEagle Aerial Systems Inc., Wichita, KS, USA) was used to acquire multispectral imagery (2064 × 1544 pixels) in the blue (475 nm ± 32 nm), green (560 nm ± 27 nm), red (668 nm ± 14 nm), red-edge (717 nm ± 14 nm), and near-infrared (NIR, 842 nm ± 57 nm) bands, as well as panchromatic (4112 × 3008 pixels, 634 nm ± 463 nm) and TIR (320 × 256 pixels, 11,000 nm ± 6000 nm) imagery. The Zenmuse sensors are integrated into a three-axis gimbal for stabilization. The UAV RTK connection was set up to a D-RTK 2 high-precision GNSS mobile station (DJI, Shenzhen, China) mounted on a tripod (Figure 2d). A Downwelling Light Sensor (DLS) 2 was positioned atop the UAV for radiometric corrections of the multispectral sensor. Although the primary advantage of TIR imagery is its ability to capture thermal information, TIR data from the Zenmuse H20T were included in this study to explore the feasibility and effectiveness of TIR sensors in measuring geometric parameters in the context of precision viticulture.
Flight routes were planned using the DJI Pilot 2 application installed on the Matrice 300 smart controller to survey the area and surroundings of the studied vineyard (approximately 0.5 ha). Each flight was set to maintain a flight height of 50 m above the terrain, with a longitudinal overlap of 90% and a lateral overlap of 80%. The flight speeds varied for each sensor: 1.1 m s−1 for the H20T, 1.9 m s−1 for the L1, and 3.3 m s−1 for the ALTUM-PT, taking approximately 12, 15, and 6 min, respectively. The planned spatial resolutions were as follows: 0.0216 m for the ALTUM-PT, 0.0444 m for the TIR imagery, 0.0172 m for the RGB imagery of the H20T sensor, 0.0136 m for the L1 LiDAR data, and 0.0158 m for the L1 RGB imagery. Data acquisition with the ALTUM-PT began at 13:20, followed by the Zenmuse L1 at 13:40 and the Zenmuse H20T at 14:00. Before acquiring the multispectral UAV data, images of a calibration reflectance panel were captured for subsequent radiometric corrections.

2.2.2. TLS Data Acquisition

The TLS used in this study was the BLK360 G1 (Leica Geosystems AG, St. Gallen, Switzerland), shown in Figure 2a. This device can produce point clouds with associated RGB information for each point. It uses a high-speed time-of-flight measurement system enhanced by Waveform Digitizing (WFD) technology, offering a field of view of 360° in the horizontal plane and 300° in the vertical plane. The scanner captures up to 360,000 points per second at a wavelength of 830 nm and provides a 3D point accuracy ranging from 6 mm at 10 m to 8 mm at 20 m, with an acquisition range from 0.6 m up to 60 m. The RGB information is captured by a 15 MP 3-camera system, capable of generating spherical images up to 150 MP. It can be configured and operated through the Leica Cyclone FIELD 360 (Leica Geosystems AG, St. Gallen, Switzerland) software application or through a quick-release button. The BLK360 G1 is lightweight (1 kg) and can be easily deployed in field settings using a tripod.
Data acquisition for this study commenced at 11:00, using the highest resolution configuration available for BLK360 G1. Each scan took approximately nine minutes per location, including installation and stabilization time, resulting in a total scanning duration of about one hour and thirty minutes. Scanning locations were spaced seven meters apart within the same grapevine row, with targets placed 3.5 m away, covering 200 m2 across four rows.

2.3. Data Processing

2.3.1. UAV Data Processing

The UAV imagery was processed using Pix4Dmapper Pro (Pix4D SA, Lausanne, Switzerland) version 4.5.6, using SfM techniques to generate dense point clouds and subsequent raster products. Data from each sensor were processed in separate projects including RGB imagery from the Zenmuse L1, RGB imagery from the Zenmuse H20T, the five multispectral bands of the ALTUM-PT, the panchromatic band of the ALTUM-PT, and the TIR imagery from H20T.
Initially, each project was subjected to processing to generate a sparse point cloud, using the full image size to extract keypoints. For image pair matching, the aerial grid or corridor configuration was selected. In all projects, except for the multispectral imagery obtained from the ALTUM-PT (which used 10,000 keypoints), the target number of keypoints was set to automatic. The internal and external camera parameters were calibrated using the standard calibration method for projects using RGB imagery, while the alternative calibration method was used for the other projects.
After completing this stage, GCPs were marked on the imagery (distribution shown in Figure 1a). Figure 3 provides an overview of a target used as a GCP for the various sensors. Following the identification of GCPs, the internal and external camera parameters were reoptimized to incorporate the GCP data. A dense point cloud with high point density was subsequently generated. Densification was carried out using half of the image size to compute additional 3D points (with thermal infrared using the original image size), as well as other image quarters and eight image scales. Each 3D point had to be correctly re-projected in at least three images to be considered for the dense point cloud. The resulting point clouds were exported as LAS files.
Additionally, orthorectified raster products were created, including orthophoto mosaics from the RGB imagery, Land Surface Temperature (LST) mosaics from the TIR imagery, reflectance mosaics from the multispectral and panchromatic bands, and the Normalized Difference Vegetation Index (NDVI) [71].
For the UAV LiDAR data from the Zenmuse L1, processing was conducted using DJI Terra software (version 3.9.4). The raw data were imported and processed with a high-density setting (considering 100% of the points), assuming a flat ground surface, an iteration angle of 3°, and a distance of 0.3 m. The processed point cloud was then exported as a LAS file.
During photogrammetric processing, the UAV-based RGB, multispectral, panchromatic, and TIR imagery were aligned using GCPs. The Root Mean Square Error (RMSE) for each sensor in the surveyed vineyard plot is presented in Table 1. Overall, the planimetric errors (XY) were below 0.02 m. The multispectral imagery showed the lowest error at 0.005 m, followed by the panchromatic (0.006 m) and RGB imagery from both the H20T and L1 (0.008 m). The TIR imagery had the highest planimetric error at 0.016 m. The altimetric errors (Z) ranged from 0.017 m to 0.022 m, except for the TIR imagery, which had an error of 0.072 m. The spatial resolution of the resulting products varied, with the panchromatic data achieving the highest resolution (0.012 m), while the TIR data had the lowest (0.053 m). Data from the other sensors had a spatial resolution below 0.03 m, with RGB data from the L1 at approximately 0.016 m, RGB data from the H20T at around 0.02 m, and multispectral data at 0.026 m.
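The alignment errors above follow the standard RMSE definitions over the GCP residuals. A minimal sketch of how the planimetric (XY) and altimetric (Z) RMSE could be computed; all coordinates below are hypothetical, for illustration only:

```python
import numpy as np

def gcp_rmse(measured: np.ndarray, estimated: np.ndarray) -> tuple[float, float]:
    """Planimetric (XY) and altimetric (Z) RMSE from N x 3 coordinate arrays (m)."""
    residuals = estimated - measured
    rmse_xy = float(np.sqrt(np.mean(np.sum(residuals[:, :2] ** 2, axis=1))))
    rmse_z = float(np.sqrt(np.mean(residuals[:, 2] ** 2)))
    return rmse_xy, rmse_z

# Hypothetical surveyed vs. photogrammetrically estimated GCP coordinates (m)
measured = np.array([[0.0, 0.0, 100.0], [10.0, 0.0, 100.5], [10.0, 10.0, 101.0]])
estimated = measured + np.array([[0.004, 0.003, 0.02],
                                 [-0.003, 0.004, -0.02],
                                 [0.005, 0.000, 0.02]])
rmse_xy, rmse_z = gcp_rmse(measured, estimated)
```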

2.3.2. TLS Data Processing

TLS data were processed using Leica Cyclone REGISTER 360 PLUS (BLK Edition) version 2024.0.1, producing point clouds with RGB information. During data processing, the TLS scans were merged into a unified coordinate system for the entire point cloud. After importing the data, the “Review and Optimization operations” option was performed. This step enabled optimization, modeling, editing, error detection in scanning, and removal of unwanted points, as well as georeferencing the study area. Nine georeferenced targets (Figure 1b) were used for this purpose. The position of each scan was defined in the scanner’s coordinate systems. To align different scanning positions, it was necessary to establish the exact position and orientation of the scanner coordinate systems based on the target coordinates. Specifically, the position and orientation of the nine scanning locations were determined with reference to nine control points, thus georeferencing the TLS dataset to a fixed coordinate system. The registration accuracy of the TLS project was reported as a bundle error of 0.012 m. After processing, the point cloud was exported in LAS file format.

2.3.3. Grapevine Geometrical Parameters Extraction from Point Clouds

The point clouds obtained from each sensor and imagery type of UAV-based data, including multispectral and panchromatic from Micasense ALTUM-PT, RGB and TIR from Zenmuse H20T, and LiDAR and RGB from Zenmuse L1, along with the TLS point cloud, were processed to extract several grapevine-related geometric parameters.
Each point cloud underwent a ground normalization procedure to remove terrain elevation variation, converting altitude values into heights above ground. The Simple Morphological Filter (SMRF) algorithm [72] was used to segment the points belonging to the ground. This involved: (1) creating a surface map with the minimum altitude from the point cloud data; (2) segmenting the ground elements by separating them from other points; and (3) segmenting the original point cloud data. Altitude was transformed into height using the points identified as ground by the SMRF algorithm. An interpolation was then conducted to estimate the ground elevation for each point in the point cloud. Each point cloud was normalized by subtracting the interpolated ground altitude from the altitude of each point, creating normalized point clouds to ensure consistent comparisons across grapevine plants and different point clouds.
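The normalization steps above can be sketched as follows. This is a simplified grid-minimum ground model standing in for the full SMRF segmentation (the per-cell minimum altitude plays the role of the segmented ground points), shown for illustration only:

```python
import numpy as np
from scipy.interpolate import griddata

def normalize_heights(points: np.ndarray, cell: float = 1.0) -> np.ndarray:
    """Subtract an interpolated ground surface from each point's altitude.

    Simplified stand-in for the SMRF workflow: the lowest point per XY
    grid cell is treated as ground, interpolated at every point, and
    subtracted to yield heights above ground."""
    ix = np.floor(points[:, 0] / cell).astype(int)
    iy = np.floor(points[:, 1] / cell).astype(int)
    ground = {}  # lowest point per XY grid cell
    for i, key in enumerate(zip(ix, iy)):
        if key not in ground or points[i, 2] < ground[key][2]:
            ground[key] = points[i]
    g = np.array(list(ground.values()))
    # Interpolate the ground altitude at every point, then subtract it
    z_ground = griddata(g[:, :2], g[:, 2], points[:, :2], method="nearest")
    out = points.copy()
    out[:, 2] -= z_ground
    return out
```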
Once normalized, the monitored grapevines were extracted from the point clouds. This process was automated using the “lasclip” function from the LAStools library (rapidlasso GmbH, Gilching, Germany) within QGIS by providing a shapefile containing polygons for each grapevine. These polygons were designed to cover a length of 1 m with a buffer of 0.8 m, encompassing the grapevine plant, its surroundings (including soil and other vegetation), and the area where the height measurements were taken in the field. This procedure allowed the points associated with each grapevine area to be separated into individual LAS files.
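LAStools' lasclip performs the clipping itself; the equivalent operation can be sketched in Python (the polygon coordinates below are hypothetical):

```python
import numpy as np
from matplotlib.path import Path

def clip_points(points: np.ndarray, polygon: list) -> np.ndarray:
    """Keep the points whose XY position falls inside the polygon,
    mirroring the per-grapevine lasclip/shapefile step."""
    mask = Path(polygon).contains_points(points[:, :2])
    return points[mask]

# Hypothetical 1 m x 1.6 m rectangle: 1 m of row length plus a 0.8 m
# buffer on each side of the row axis, as described in the text
poly = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.6), (0.0, 1.6)]
```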
With each plant separated, it was possible to assess grapevine geometric parameters extracted from the point clouds generated by each sensor. In this study, the evaluated parameters included the maximum grapevine height (Hmax), which considers the highest point of the canopy above ground level, expressed in meters; the heights at 90th and 95th percentiles (H90 and H95, respectively) from points above the lower trellis wire, also expressed in meters; the grapevine canopy volume, in cubic meters; and grapevine projected area, in square meters.
In the individual grapevine point clouds, points below 0.30 m, which corresponds to the approximate height of the lower trellis wire in the studied vineyard, were excluded to eliminate noise points corresponding to undergrowth vegetation, which could distort the extracted metrics.
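With the trellis-wire filter applied, the height metrics reduce to simple maximum and percentile computations over each grapevine's normalized heights; a minimal sketch:

```python
import numpy as np

TRELLIS_WIRE = 0.30  # approximate height of the lower trellis wire (m)

def height_metrics(z: np.ndarray) -> dict:
    """Hmax from the highest canopy point; H90/H95 as height percentiles
    of the points above the lower trellis wire."""
    canopy = z[z > TRELLIS_WIRE]
    return {
        "Hmax": float(z.max()),
        "H90": float(np.percentile(canopy, 90)),
        "H95": float(np.percentile(canopy, 95)),
    }
```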
For grapevine volume estimation, a triangle mesh of the normalized point cloud was created using the alphaShape algorithm [73] for all points with Z > 0.30 m. An alpha value (α) of 0.5 was used for the UAV data, as described in Di Gennaro and Matese [74], while an α value of 0.05 was used for the TLS data, following Liu et al. [75]. The same procedure was applied for grapevine projected area estimation, using the same α values. However, for area estimation, only the X and Y dimensions of the points with Z > 0.30 m were considered to create a polygon encompassing the entire grapevine canopy.
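A hedged sketch of the area and volume computation: a convex hull (via SciPy) is used here as a simplified stand-in for the alphaShape algorithm of the study, since it corresponds to the α → ∞ limit and will overestimate area and volume for concave canopies:

```python
import numpy as np
from scipy.spatial import ConvexHull

def canopy_area_volume(points: np.ndarray, z_min: float = 0.30):
    """Projected area (XY) and volume of canopy points above z_min.

    Convex hulls stand in for the alpha-shapes used in the study; at
    small alpha an alpha-shape hugs concavities more tightly."""
    canopy = points[points[:, 2] > z_min]
    xy = np.unique(canopy[:, :2], axis=0)
    area = ConvexHull(xy).volume      # for 2D input, .volume is the area
    volume = ConvexHull(canopy).volume
    return float(area), float(volume)
```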

2.4. Data Analysis

The point clouds produced from each sensor’s data went through a qualitative assessment. This process involved visually inspecting the point clouds to identify similarities and differences across datasets and to derive some metrics related to their point density. Additionally, the distribution of the estimated grapevine geometric parameters within the point cloud data was examined, including considerations of the orthorectified products generated by each sensor.
A statistical analysis of the data extracted for each grapevine was conducted using R software [76] version 2023.12.0 and IBM SPSS Statistics version 28.0.1.0 (IBM Corp., Armonk, NY, USA). Key descriptive statistics, such as the mean, minimum, maximum, Standard Deviation (SD), and Coefficient of Variation (CV), were calculated for each measurement tool. To assess the reliability and validity of the extracted geometrical variables, linear correlations among all measured parameters were analyzed. The evaluation of the results was supported by statistical parameters, including RMSE, the coefficient of determination (R2), and statistical significance determined through the F-test, using the R package “lmodel2” [77] version 1.7-3. The findings were visually represented using box plots and correlation matrices built with the “ggplot2” [78] R package version 3.4.4. Paired correlations of means were examined using Pearson’s test (r), following Mukaka’s analysis [79]. All reported intervals reflect a 95% confidence interval.
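For a simple linear fit, R2 equals the square of Pearson's r, and the F-test p-value coincides with the correlation's significance. A minimal sketch of the agreement statistics (note that the study itself used model II regression via lmodel2, which this does not reproduce):

```python
import numpy as np
from scipy.stats import pearsonr

def agreement_stats(reference: np.ndarray, estimated: np.ndarray) -> dict:
    """Pearson r with its significance, R2 of the simple linear fit
    (equal to r squared), and RMSE against the reference values."""
    r, p = pearsonr(reference, estimated)
    rmse = float(np.sqrt(np.mean((estimated - reference) ** 2)))
    return {"r": float(r), "p": float(p), "R2": float(r) ** 2, "RMSE": rmse}
```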
The normality of the data was assessed using the Kolmogorov–Smirnov and Shapiro–Wilk tests, while the assumption of homogeneity of variance was evaluated using Levene’s test. A one-way analysis of variance (ANOVA) was planned to determine whether there were significant differences in the geometric parameters of the grapevines between the different measurement methods; as the assumptions of normality and homogeneity of variance were not satisfied (p < 0.05), robust alternatives were applied instead. To improve the reliability of the findings and account for deviations from normality in the sample distribution, bootstrapping procedures (1000 resamples; 95% BCa confidence intervals) were employed, yielding 95% confidence intervals for the differences between the means [80]. Given the heterogeneity of variance, Welch’s correction was applied, and post-hoc analysis was performed using the Games–Howell technique [81].
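The bootstrap step can be sketched as follows; a percentile bootstrap is shown here for simplicity, whereas the study used the bias-corrected and accelerated (BCa) variant:

```python
import numpy as np

def bootstrap_mean_diff_ci(a, b, n_boot=1000, conf=0.95, seed=0):
    """Percentile-bootstrap CI for the difference in means of two
    independent samples (resampling each with replacement)."""
    rng = np.random.default_rng(seed)
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        diffs[i] = rng.choice(a, a.size).mean() - rng.choice(b, b.size).mean()
    lo, hi = np.percentile(diffs, [(1 - conf) / 2 * 100, (1 + conf) / 2 * 100])
    return float(lo), float(hi)
```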

3. Results

3.1. Data Characterization

The point cloud data from the different sensors presented distinct point density distributions. For the entire vineyard plot, the highest point density among the UAV-based point clouds was observed in the LiDAR data, with 3123 points per m3. Of the more than 17.5 million points registered for the vineyard plot, 98.97% were first return points, 1.02% were second return points, and 0.01% were third return points. This was followed by the RGB imagery from the H20T (2678 points per m3), the panchromatic imagery of the ALTUM-PT (2241 points per m3), and the RGB imagery from the L1 (2089 points per m3). The multispectral and TIR imagery had lower point densities, at 535 and 466 points per m3, respectively. In contrast, the TLS point cloud achieved an average of 199,718 points per m3. Figure 4 presents a lateral perspective of part of one row of each generated point cloud, illustrating the distribution of points along the Z-axis.
Regarding the average number of points within the grapevine canopies, the photogrammetric processing of the panchromatic data produced an average of 2451 points, L1 RGB data had 2220 points, and the L1 LiDAR data showed 1557 points. The RGB imagery from H20T produced an average of 1253 points, the multispectral imagery had 626 points, and the TIR imagery had an average of 328 points within the grapevine canopy. The point density from the TLS sensor was approximately one thousand times higher than that of the UAV sensors, with an average of 751,780 points. These differences are illustrated in Figure 5, which displays the point distribution and density of the grapevine canopy along part of a row.
In addition to generating point clouds, UAV sensor imagery can also create raster products. Examples of these products are presented in Figure 6, including orthophoto mosaics derived from RGB imagery and reflectance from multispectral bands (Figure 6a). The orthophoto mosaic is useful for visual inspection of the vineyard, while the multispectral data provides valuable information for detailed analysis. The spectral bands offer insights into different elements within the vineyard, particularly parameters related to grapevines. The multispectral sensor used in this study captures five distinct bands, enabling the characterization of elements such as bare soil, grapevines, and other vegetation (Figure 6b). These spectral bands are also used to compute vegetation indices, such as NDVI, and provide temperature-related information from TIR imagery (Figure 6c).
A statistical analysis related to the dispersion of the height variables of the evaluated grapevines using different measurement tools is summarized in Table 2. Figure 7 presents the distribution of different metrics of the analyzed grapevines across all sensors and ground truth measurements.
For Hmax (Figure 7a), considerable variation among data from the evaluated sensors is evident. The SD was relatively consistent across sensors, ranging from 0.16 m for the L1 RGB data to 0.32 m for the H20T TIR data. CVs showed greater variability, ranging from 10.3% to 24.3% among these sensors. The TLS data closely matched the field data, with a mean of 1.59 m (+0.02 m compared to field data), low SD (0.18 m, +0.01 m compared to field data), and consistent distribution (CV = 11.7%). The panchromatic, L1 RGB, multispectral, and H20T RGB point cloud data also had mean values close to field values, with differences below 0.10 m: 0.02 m for panchromatic, 0.04 m for L1 RGB, and 0.09 m for both multispectral and H20T RGB. Low SD and CV values were observed, except for the multispectral data (SD = 0.24 m and CV = 16.25%). The H20T TIR sensor had the lowest average Hmax value (1.30 m) and the highest SD (0.32 m), with a range from 0.36 m to 1.65 m (CV = 24.32%), showing the greatest data dispersion around the mean.
For H90 and H95 (Figure 7a), the data derived from panchromatic imagery presented the highest mean values for both variables (1.38 m for H95 and 1.33 m for H90), with an SD of 0.19 m and CVs of 13.89% and 13.61%, respectively. This was followed by the L1 RGB data (H95 = 1.37 m, H90 = 1.32 m). Similar values were observed for the TLS data (1.34 m and 1.27 m for H95 and H90, respectively). These sensors also showed lower SD values, indicating less dispersion compared to the other sensors. In contrast, the TIR imagery and L1 LiDAR data showed the lowest mean values for both percentiles (H90 = 1.10 m for both sensors, H95 = 1.15 m for the LiDAR data, and H95 = 1.16 m for the TIR data). The TIR data had greater dispersion, with H90 values ranging from 0.35 m to 1.38 m, an SD of 0.27 m, and a CV of 24.89%. For H95, the range was from 0.36 m to 1.49 m, with an SD of 0.28 m and a CV of 24.42%.
Regarding the grapevine projected area (Figure 7b and Table 3), the TLS point cloud data closely matched the measured area, with the highest mean across sensors (0.51 m2, +0.05 m2 compared to the measured area) and a low standard deviation (0.14 m2, +0.01 m2 compared to the measured area), ranging from 0.20 m2 to 0.67 m2, with a more consistent data distribution than the other sensors (CV = 28.07%). The panchromatic, multispectral, and L1 RGB point clouds also showed mean values close to the measured values, with differences of less than 0.05 m2. The H20T RGB and H20T TIR data had the lowest mean projected areas (0.27 and 0.28 m2, respectively) and the highest variation (CV = 54.04% and 57.83%), indicating the greatest data dispersion around the mean.
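The projected area itself can be estimated by flattening each vine's points onto the horizontal plane and measuring the footprint. The sketch below uses a 2D convex hull as one plausible footprint estimator; this is an illustrative assumption, not necessarily the exact area algorithm of this study, and occupancy grids or alpha shapes give tighter outlines for concave canopies:

```python
import numpy as np
from scipy.spatial import ConvexHull

def projected_area(points_xyz):
    """Projected canopy area (m^2) as the convex hull of the XY footprint.

    A convex hull overestimates concave canopies; grid-occupancy or
    alpha-shape footprints are tighter alternatives.
    """
    xy = np.asarray(points_xyz, dtype=float)[:, :2]  # drop the z coordinate
    hull = ConvexHull(xy)
    return hull.volume  # for 2D input, .volume is the enclosed area
```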
For the grapevine volume (Figure 7c and Table 3), variation among the sensors was observed. The mean grapevine volume ranged from 0.136 m3 for the TIR data to 0.251 m3 for the panchromatic data. The SD across these sensors ranged from 0.085 m3 to 0.135 m3, with CVs ranging from 41.66% for the TLS data to 67.45% for the multispectral data. Among the UAV-based sensors, the panchromatic point cloud registered the highest mean values (0.251 m3; SD = 0.135 m3; CV = 53.80%), followed by the L1 RGB and LiDAR data, which showed mean values of 0.238 m3 and 0.232 m3, respectively, with corresponding SDs of 0.123 m3 and 0.124 m3 and CVs of 51.46% and 53.28%. These values were close to the TLS point cloud results (0.232 m3; SD = 0.097 m3; CV = 41.66%). The RGB data from the H20T showed a mean volume of 0.203 m3, while the multispectral point cloud had 0.184 m3. The H20T TIR data had the lowest mean volume, at 0.136 m3, with a CV of 62.73%.
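Canopy volume is commonly approximated by voxelizing the point cloud and summing the volumes of occupied voxels. The sketch below illustrates that common approach under stated assumptions (the 0.05 m voxel edge is illustrative, and this is not necessarily the volume algorithm used in this study):

```python
import numpy as np

def voxel_volume(points_xyz, voxel=0.05):
    """Approximate canopy volume (m^3) as the summed volume of occupied voxels.

    `voxel` is the cubic voxel edge length in metres (0.05 m is illustrative);
    coarser voxels inflate the volume, while finer voxels require denser
    point clouds to avoid gaps.
    """
    pts = np.asarray(points_xyz, dtype=float)
    idx = np.floor(pts / voxel).astype(np.int64)  # voxel index of each point
    occupied = np.unique(idx, axis=0).shape[0]    # number of distinct voxels
    return occupied * voxel ** 3
```

The voxel size directly trades off between the low point density of some UAV sensors and the fine structure of grapevine canopies, which is one reason volume estimates vary more across sensors than height estimates.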
Normality tests (Kolmogorov–Smirnov and Shapiro–Wilk) indicated that the variables did not follow a normal distribution (p < 0.05). Additionally, Levene's test revealed a lack of homogeneity of variance between the variables (p < 0.05). Due to the violation of these assumptions, a one-way ANOVA could not be applied. Instead, the Games–Howell post-hoc test, which does not require homogeneity of variances, was conducted and interpreted using bootstrapping procedures. This analysis revealed significant differences between the sensors for the variables examined (p < 0.05). The results are presented in Table 2 and Table 3. For Hmax, the TLS, multispectral, panchromatic, H20T RGB, and L1 RGB data showed statistically similar averages. However, the L1 LiDAR data demonstrated significant similarity to the multispectral, H20T RGB, and TIR point cloud data. For H90, the TLS, multispectral, panchromatic, H20T RGB, and L1 RGB point cloud data had statistically similar averages, whereas the H20T TIR data showed significant similarity with the multispectral, H20T RGB, and L1 LiDAR data. For H95, the TLS, multispectral, panchromatic, H20T RGB, and L1 RGB data showed statistically similar averages, while the H20T TIR and L1 LiDAR data only showed significant similarity to each other. Regarding the grapevine projected area, the measured area and the TLS, multispectral, panchromatic, and L1 RGB data exhibited statistically similar averages, while the L1 LiDAR sensor demonstrated significant similarity to the measured area and the multispectral, panchromatic, and L1 RGB data. The H20T RGB and H20T TIR data demonstrated statistically similar averages only to each other. Finally, for grapevine canopy volume, the TLS, multispectral, panchromatic, H20T RGB, L1 RGB, and L1 LiDAR data demonstrated statistically similar averages, except for the H20T TIR data, which exhibited significant similarity only to the multispectral and H20T RGB data.
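The assumption-checking chain described above (normality, then homogeneity of variance, then the choice between a parametric ANOVA and a Games–Howell post hoc) can be scripted. A sketch with SciPy is shown below; it only selects the comparison strategy, since the Games–Howell comparisons themselves are available in third-party packages (e.g., pingouin's pairwise_gameshowell, assumed here) and are not reimplemented:

```python
from scipy import stats

def choose_posthoc(groups, alpha=0.05):
    """Pick a comparison strategy for several sensor groups.

    Returns "anova" when every group passes Shapiro-Wilk normality and
    Levene's test finds homogeneous variances; otherwise "games-howell",
    mirroring the decision used for Tables 2 and 3.
    """
    normal = all(stats.shapiro(g)[1] >= alpha for g in groups)  # p-values per group
    homogeneous = stats.levene(*groups)[1] >= alpha             # Levene p-value
    return "anova" if normal and homogeneous else "games-howell"
```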

3.2. Grapevine Geometric Parameters

Correlations between the geometric parameters extracted from the point cloud data of each sensor were established to verify the accuracy of the point clouds in representing the structure of the grapevine canopy. Table 4 compares the results obtained by the different sensors, including performance metrics such as the correlation coefficient (r), coefficient of determination (R2), and RMSE for the geometric parameters of grapevines.
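The three performance metrics in Table 4 can be computed from paired sensor estimates and reference measurements. For a simple linear fit, R2 equals the square of Pearson's r, so the comparison reduces to a few lines (a generic sketch, not the authors' statistical software):

```python
import numpy as np

def agreement_metrics(estimated, measured):
    """Pearson r, coefficient of determination (R^2 = r^2 for a simple
    linear fit), and RMSE between sensor estimates and reference values."""
    x = np.asarray(estimated, dtype=float)
    y = np.asarray(measured, dtype=float)
    r = np.corrcoef(x, y)[0, 1]              # Pearson correlation
    rmse = np.sqrt(np.mean((x - y) ** 2))    # root mean square error
    return r, r ** 2, rmse
```

Note that a sensor can be highly correlated with the reference (high r) while still being biased (high RMSE), which is why the table reports both.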
The correlation matrix reported in Figure 8a shows that the correlation between the field-measured height values and Hmax is strong to very strong and significant (r > 0.7; p < 0.001) for all sensors. Using the parameters extracted from the TLS point cloud as a reference in place of field values for H95 (Figure 8b) and H90 (Figure 8c), the results indicate that all point clouds were able to correctly represent these variables, showing moderate to very strong correlations across all UAV sensors (r > 0.5; p < 0.01).
For grapevine Hmax, the TLS data exhibited a very strong, significant correlation (r = 0.95) and the highest R2 (0.90), indicating a strong linear relationship between the data captured by the sensor and the actual grapevine height. The RMSE was also low (0.027 m). Among the UAV-based sensors, the panchromatic data showed significant results relative to ground-truth data (r = 0.91), the highest R2 (0.83), and an RMSE of 0.025 m. This was followed by the H20T RGB data (r = 0.91, R2 = 0.83), with an RMSE of 0.081 m; the L1 RGB (r = 0.89, R2 = 0.79) and LiDAR (r = 0.87, R2 = 0.76) data, with RMSE values of 0.038 m and 0.129 m, respectively; and the multispectral data (r = 0.83, R2 = 0.70), with an RMSE of 0.084 m. The lowest correlation values were achieved by the TIR data (r = 0.76, R2 = 0.58), with the highest RMSE (0.147 m). Among UAV sensors, the highest correlation was between the panchromatic and L1 RGB data (r = 0.98), while the lowest was between the TIR and multispectral data (r = 0.66).
When correlating the UAV-based data with the parameters extracted from the TLS point cloud for H95 (Figure 8b) and H90 (Figure 8c), the panchromatic point cloud data showed very strong, significant correlations for both percentiles, with r = 0.91 and 0.90 (p < 0.001) for H95 and H90, respectively. Similarly, the L1 RGB data demonstrated strong correlations for both percentiles, with r = 0.93 and 0.89 (p < 0.001), respectively. The LiDAR data from the L1 sensor also showed a good correlation, with r = 0.85 (p < 0.001) for both percentiles. The lowest correlations were found for the H20T TIR data in both percentiles. Among the UAV sensors, the highest correlations were between the panchromatic and multispectral data (r = 0.95 for H95 and r = 0.96 for H90; p < 0.001). The lowest correlation for H90 was between the TIR data and both the L1 RGB and multispectral data (r = 0.76, p < 0.01). For H95, the lowest correlation was between the TIR and multispectral data (r = 0.71, p < 0.01).
In the analysis of the projected grapevine area (Table 4), the TLS data showed a strong, significant correlation (r = 0.86, p < 0.001; R2 = 0.74). Among UAV-based sensors, the L1 RGB data demonstrated a very strong, significant correlation (r = 0.95, p < 0.001) and the best R2 (0.89) with the measured area, despite having a slightly higher RMSE (0.048 m2) compared to the TLS and panchromatic data (RMSE = 0.042 m2). The panchromatic data followed closely (r = 0.87, p < 0.001), along with the H20T RGB, H20T TIR, and L1 LiDAR data (r = 0.82, p < 0.001). The lowest correlation was observed for the multispectral data (r = 0.79, p < 0.001). Among UAV sensors (Figure 9a), the highest correlation was between the multispectral and panchromatic data (r = 0.96, p < 0.001), while the lowest correlation (r = 0.73, p < 0.001) was observed among the multispectral, LiDAR, and TIR data. For grapevine canopy volume (Figure 9b), there were disparities among the sensors. The panchromatic point cloud data showed the highest correlation coefficient with the TLS data (r = 0.68, p < 0.01), with the TIR data reaching a similar coefficient (r = 0.68, p < 0.01), while the multispectral data displayed the lowest correlation (r = 0.45, p < 0.05). Among UAV sensors, the highest correlations were found for the H20T RGB data with the panchromatic and L1 RGB data (r = 0.95, p < 0.001). The lowest observed correlation was between the L1 LiDAR and multispectral data (r = 0.61, p < 0.01).

4. Discussion

This study aimed to evaluate and compare the effectiveness of various sensors and data acquisition methods in characterizing the geometric parameters of grapevines. The results indicate significant differences in point density and accuracy among the sensors, which directly impact the precision of the geometric measurements. Detecting the variability within vineyards is crucial for PV, as it enables the automation of operations, optimization of chemical inputs, and mitigation of environmental impacts [34]. This assessment is essential for differentiating operations such as pruning, harvesting, fertilization, and crop protection [18]. As the geometric characteristics are directly related to plant vigor [41,82,83], point clouds from multispectral, panchromatic, RGB, TIR, and LiDAR UAV data, as well as TLS point clouds, were compared to evaluate the height variables (Hmax, H90, H95), the grapevine projected area, and the grapevine volume. Measurements of maximum height (Hmax) and projected area were taken as a reference to assess the ability of the point clouds derived from each sensor and data type to characterize grapevine geometric parameters [16,34,36].
As expected from previous studies [34,67], the TLS point cloud had a higher point density than the UAV data. This higher density can be attributed to the static nature of the TLS sensor, which enables 360-degree scans at each collection point, resulting in superior resolution and detail. This is reflected in the average number of grapevine points (Figure 5). The UAV sensors showed varying point densities, with LiDAR data generally providing a higher density than imagery subjected to photogrammetric processing, whose density depends on the image resolution [84,85].
The Hmax values obtained from the point clouds, when compared to the field measurements (Table 2 and Table 4), were found to be reliable for automatically calculating grapevine Hmax (r > 0.7; p < 0.001). The TLS point cloud data provided the highest average and the lowest standard deviation (Table 2) and the highest correlations (Table 4). Although the multispectral and TIR point clouds showed lower correlations with the field measurements, multispectral data can still be indicative of vigor and grapevine height [21,86]. The RGB, panchromatic, and LiDAR point clouds also demonstrated good correlations with Hmax, with the panchromatic data showing the lowest RMSE, improving on previously reported results for height estimation from UAV data [52]. The visual analysis of Figure 4 supports these findings: the TLS point cloud shows a more detailed vertical representation compared to the UAV-based point clouds, which appear smoother, despite the relatively accurate representation of the top part provided by the panchromatic and RGB point clouds [87,88,89]. Similar trends have been reported in other studies comparing UAV and TLS data in vineyards. Pagliai et al. [34] reported a mean of 1.04 m and a CV of 13% using a mobile TLS and a mean of 1.07 m and a CV of 12% with data from UAV RGB imagery. In Torres-Sánchez et al. [31], the mean maximum heights of grapevines were 2.22 ± 0.25 m and 2.04 ± 0.32 m for mobile TLS and UAV RGB data, respectively. In Escolà et al. [67], the mean heights were 2.19 m and 1.92 m (Hmax and H90, respectively) for mobile TLS data and 2.09 m and 1.99 m for UAV RGB data. The findings of this study showed superior performance when compared to previous studies that compared real measurements with mobile TLS estimates [50,90] (Table 5). These discrepancies may be attributed to the continuous structure of grapevines, which complicates manual ground-truth measurements. However, the results are consistent with those reported by Rinaldi et al. [48].
Given the high accuracy of the TLS estimates and their greater similarity to actual values in the Hmax estimates, the other geometric height parameters (H95 and H90) were compared with the TLS-based dataset (Figure 8b,c), which can serve as a ground-truth reference, as done in other studies [91]. This provides greater credibility to the use of TLS as a reference for geometric parameters. The point clouds obtained from the UAV data demonstrated strong and significant correlations with the H95 and H90 values extracted from the TLS point clouds (r > 0.8; p < 0.001), except for the H20T TIR data, which had a slightly lower correlation. The panchromatic and L1 RGB point clouds exhibited the strongest correlations, indicating that, despite variations in performance, all UAV sensors generally provide reliable height data that are consistently correlated with TLS measurements, in line with findings from comparable studies [34,67].
Regarding the grapevine projected area (Figure 9a), the TLS data exhibited the highest mean value (Table 3), with a non-significant difference of 10.9% compared to the measured area. The area estimated from the panchromatic point cloud closely aligns with the field data. Moreover, all sensors, except the H20T RGB and H20T TIR, were statistically equivalent to the measured average value (no significant difference, p > 0.05). The results, consistent with the correlations, indicate that both TLS and UAV sensors can estimate the projected grapevine area accurately, with the L1 RGB sensor showing the highest correlation (r = 0.95, p < 0.001) with the measured values. Other studies have reported an R2 = 0.78 (p < 0.001) for this parameter [31].
The grapevine volume data (Figure 9b and Table 3) estimated from the sensors showed some disparities. The TLS point cloud had the lowest variance, followed by the L1 LiDAR. In contrast, the TIR data provided less satisfactory results in terms of precision and consistency. These findings are consistent with those of Torres-Sánchez et al. [31], who reported values of 0.24 ± 0.06 m3 and 0.19 ± 0.06 m3 from a mobile TLS and a UAV, respectively. However, the volume calculated by Torres-Sánchez et al. [31] was based on the canopy height model (CHM), assuming that all space between the ground and the canopy top was occupied by the canopy. Meanwhile, Pagliai et al. [34] found a mean volume of 0.40 m3 with a variation of 15% using a mobile TLS, and 0.59 m3 with a UAV, showing 22% variation. The estimated volumes in this study were lower than those reported by Escolà et al. [67], who calculated volume as the sum of the cross-sectional areas multiplied by the section length. This highlights the need to explore and compare different approaches for grapevine volume estimation [75], complementing these analyses with field-measured volume validation [52]. When comparing grapevine volume estimates from TLS and UAV sensors, strong and significant correlations were observed (r > 0.7, p < 0.05) for most sensors, except for the multispectral data, which still showed a significant correlation (p < 0.01). The panchromatic data exhibited the highest correlation (Figure 9b). Among the UAV sensors, more favorable correlations were observed, although the L1 LiDAR data demonstrated a moderate correlation (r < 0.7) with the multispectral and panchromatic data. Interestingly, a close similarity was found between the UAV TIR and LiDAR data for volume estimation (r = 0.83, p < 0.001), similar to Buunk et al. [91]. The correlations observed in other studies on grapevine volume estimation are summarized in Table 5. The findings contrast with those of Chakraborty et al. [90], who reported a strong correlation between volume estimates from a mobile TLS and canopy surface area from UAV multispectral imagery. Petrović et al. [44] demonstrated a strong correlation between UAV RGB data and mobile TLS volume measurements. Tumbo et al. [92] found that laser sensors provided more accurate canopy volume estimates than ultrasonic sensors due to their higher resolution (R2 > 0.85, RMSE < 2.15 m3).
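For reference, the CHM-based volume used by Torres-Sánchez et al. [31] treats every raster cell as a solid column from the ground to the canopy top, so the volume is simply the sum of cell heights multiplied by the cell area. A sketch under that assumption is shown below (the cell size and grass threshold are illustrative values, not settings from either study):

```python
import numpy as np

def chm_volume(chm, cell_size=0.05, min_height=0.0):
    """Canopy volume (m^3) from a canopy height model (CHM) raster.

    Each CHM cell contributes (cell area) x (canopy height), i.e. the full
    column between the ground and the canopy top is counted as canopy.
    Cells at or below `min_height` (e.g. grass) are excluded; `cell_size`
    is the raster resolution in metres.
    """
    h = np.asarray(chm, dtype=float)
    h = np.where(h > min_height, h, 0.0)  # mask non-canopy cells
    return float(h.sum()) * cell_size ** 2
```

Because the column below the canopy top is counted as solid, this approach tends to yield larger volumes than voxel- or cross-section-based methods for open, sparse canopies.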
The underestimation of heights by the TIR and L1 LiDAR sensors, the variations in point distribution (noticeable in Figure 5), and the greater variability in UAV sensor results for projected area and volume estimation (Figure 9) can be attributed to the growth behavior of grapevines. Grapevines develop thin, long branches intertwined with individual leaves in the upper canopy. These thin parts are challenging to detect in aerial images and may not be fully reconstructed using UAV data. This is clear in the generated point clouds, where grass is less apparent in the UAV data compared to the TLS point cloud (Figure 4), and details of thin branches are lacking (Figure 5). Additionally, long branches often hang beside the vineyard rows, significantly contributing to the projected area and volume [31,68,93]. This phenomenon was also observed by Torres-Sánchez et al. [31], who reported greater discrepancies in area estimates in vineyards compared to pear and peach orchards, which lack lateral branches. To address canopy height underestimation in the UAV methodology, increasing image spatial resolution may be beneficial [31]. Andújar et al. [40] observed that mobile TLS data resulted in higher canopy volume values than UAV data in vineyard analysis. However, Pagliai et al. [34] found the highest volume estimates from the UAV workflow in a comparison with mobile TLS and ground-based photogrammetry. This discrepancy was attributed to noise in the point cloud and difficulty in accurately isolating grapevines. Moreover, area and volume estimates derived from the TLS can be less precise than height estimates [44,48]. This may be due to overlapping errors from scanning both sides of the plants, leading to error accumulation. The high point density and presence of grass can also lead to overestimated area and volume values due to potential noise. However, in this study, steps were taken to mitigate these errors during data acquisition and processing, before the extraction of the geometric parameters.
In this study, the UAV imagery was captured at a nadir angle, unlike other studies that used oblique angles [67], leading to lower estimates. This conclusion is supported by the highest correlation found between Hmax in the point cloud derived from TLS and field measurements, as the TLS scanning was oriented laterally between rows. This is in line with previous research indicating that the combination of nadir and oblique angles results in more accurate 3D analysis [91,94,95]. Another factor complicating the use of UAV-based imagery to determine the ground surface below the canopy is the presence of tall grass (Figure 4), which results in a noisier ground surface.
The results reported in this study are valuable for PV applications, providing insights into the optimal use of stationary TLS or UAV sensors for geometric parameter estimation, which is crucial for vineyard monitoring and decision support systems. Future research should prioritize data collection across different phenological stages and vineyard settings [60], including steep-slope vineyards and various grape varieties with different training systems. It should also incorporate additional parameters such as canopy length and width, trunk diameter, as well as NDVI and LAI as biophysical parameters in the analyses. Moreover, the effects of UAV-based imagery parameters, such as spatial resolution, flight height, imagery overlap, and camera angle, on grapevine parameter estimates, should be assessed across different periods and grapevine varieties. This will facilitate the optimization of UAV imagery acquisition for geometric parameter extraction [66,96,97,98,99].
Highlighting the advantages and limitations of TLS and UAV systems is crucial for understanding their applications in different scenarios. UAV-based systems offer large-scale measurement capabilities due to their ability to conduct measurements at relatively low flight heights, resulting in a wide field of view [34]. In this study, UAV data acquisition enabled the rapid surveying of the entire vineyard plot, whereas TLS data acquisition took 1 h and 30 min for a smaller portion of the plot (Figure 1b), potentially limiting the use of TLS for larger areas [41]. Therefore, UAV data acquisition allows for the rapid mapping of large areas, while stationary TLS is limited to smaller areas. TLS requires sufficient coverage and overlap to ensure accuracy, minimize data gaps, and improve the overall quality of the 3D model generated from the data [100]. In contrast, mobile TLS systems can be mounted on agricultural vehicles, enabling automatic data collection during field operations [31,41,42,83].
However, studies comparing the performance of static TLS and mobile TLS for detecting trees and estimating dendrometric parameters in forest areas [101,102] have demonstrated that, despite the lower accuracy of mobile TLS, the results from both systems are comparable and do not show significant differences in estimates. This suggests that the parameter extraction methodology used in this study is applicable to vineyard data acquired using mobile TLS systems, thereby optimizing the efficiency of this sensor technology. Nevertheless, the use of TLS, primarily mobile TLS, can be challenging in sloped vineyards and uneven terrains [103]. On the other hand, while UAVs do not face these challenges, they require trained personnel and must comply with specific legal regulations [34]. A limitation of UAV data acquisition is the potential information loss, such as difficulties in reconstructing thin lateral branches [31]. Among the evaluated UAV sensors, the ALTUM-PT showed high accuracy in estimating geometrical parameters, particularly when using the panchromatic band, which was effective for measuring height and grapevine projected area. This is significant, as the sensor captures multispectral data (Figure 6), crucial for diverse vineyard-related tasks, along with thermal infrared imagery. In contrast, the point cloud generated from TIR imagery demonstrated a lower performance in evaluating the assessed parameters due to its lower spatial resolution. To mitigate such errors, data fusion techniques can be employed, either during the photogrammetric processing by integrating different imagery types (e.g., RGB and TIR) or by projecting data from lower-resolution sensors into high-resolution point clouds [104,105]. Moreover, UAV-based RGB point cloud data can be integrated with lower-resolution raster data, such as multispectral and TIR imagery, to isolate grapevine information and assist in mapping vineyard variability [106]. 
Additionally, data fusion techniques involving ground and aerial data, along with SfM outcomes and TLS point clouds, can be explored to achieve precise and accurate results. This is particularly valuable in complex topographic environments, such as agricultural terraces [107].

5. Conclusions

This study compared point cloud data calculated from TLS and UAV LiDAR and generated from different UAV sensors subjected to photogrammetric processing to evaluate geometric parameters such as height, projected area, and volume in grapevines. The goal was to establish a benchmark for precise vineyard management by assessing the available tools for grapevine geometry analysis. Detecting variability within vineyards is essential for PV, enabling the automation of operations and optimization of inputs while reducing environmental impacts. The results confirm the feasibility of using TLS and UAV-derived point clouds for automatically calculating grapevine geometric parameters. TLS point clouds provided the most accurate measurements, closely matching field data and serving as a reliable reference. Among UAV sensors, the panchromatic imagery from ALTUM-PT was the most suitable data for height estimates, followed by RGB imagery from Zenmuse L1 and H20T. For the projected area, RGB images from Zenmuse L1 were the most suitable, followed by panchromatic images from ALTUM-PT and Zenmuse L1 LiDAR data. In volume estimation, the panchromatic imagery from ALTUM-PT showed a good performance, followed by RGB images from Zenmuse L1 and L1 LiDAR data. While UAV sensors show effectiveness, especially when correlated with TLS data, variations among sensors underscore the importance of careful selection based on specific application needs and environmental conditions.
TLS-based point cloud data offers greater precision and reliability due to its proximity to the grapevines and detailed characterization of both lateral views of the plants. However, it is less efficient in terms of data acquisition and processing time. The choice of the best system ultimately depends on a range of factors such as environmental conditions, equipment availability, and survey objectives. TLS is preferable for detailed information, while UAVs are more suitable for covering extensive areas.
This study also highlights the importance of understanding the limitations and factors influencing sensor data accuracy in vineyard assessments. Differences in volume estimates among sensors highlight challenges in UAV-based measurements. Factors such as flight angle and interference from ground vegetation can affect the accuracy of geometric parameter estimates. Future research should focus on data collection across various phenological stages, different vineyard settings (e.g., steep slopes), and grape varieties with varying training systems. Incorporating additional biophysical parameters, and minimizing biases will enhance TLS and UAV data acquisition and processing techniques in PV. Furthermore, future studies should assess the impact of optimizing UAV flight parameters on the estimation of geometric parameters.

Author Contributions

Conceptualization, L.F. and L.P.; methodology, L.F. and L.P.; software, L.F. and L.P.; validation, L.F. and L.P.; formal analysis, L.F.; investigation, L.F., J.M.L. and L.P.; resources, J.M.L., J.J.S., E.P. and R.M.; data curation, L.F. and L.P.; writing—original draft preparation, L.F. and L.P.; writing—review and editing, J.M.L., J.J.S., E.P., R.M. and L.P.; visualization, L.F. and L.P.; supervision, J.J.S. and L.P.; project administration, E.P. and R.M.; funding acquisition, J.J.S., E.P. and R.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Vine&Wine Portugal Project, co-financed by the Recovery and Resilience Plan (RRP) and the European NextGeneration EU Funds, within the scope of the Mobilizing Agendas for Reindustrialization, under Ref. C644866286-00000011.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data will be made available upon reasonable request.

Acknowledgments

The authors would like to acknowledge the Portuguese Foundation for Science and Technology (FCT) for financial support through national funds to the projects UIDB/04033/2020 (https://doi.org/10.54499/UIDB/04033/2020) and LA/P/0126/2020 (https://doi.org/10.54499/LA/P/0126/2020), and to the UIDB/00073/2020 (https://doi.org/10.54499/UIDB/00073/2020) and UIDP/00073/2020 projects of the R&D unit Geosciences Center (CGeo).

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Gebbers, R.; Adamchuk, V.I. Precision Agriculture and Food Security. Science 2010, 327, 828–831. [Google Scholar] [CrossRef] [PubMed]
  2. Sassu, A.; Gambella, F.; Ghiani, L.; Mercenaro, L.; Caria, M.; Pazzona, A.L. Advances in Unmanned Aerial System Remote Sensing for Precision Viticulture. Sensors 2021, 21, 956. [Google Scholar] [CrossRef] [PubMed]
  3. Santos, J.A.; Fraga, H.; Malheiro, A.C.; Moutinho-Pereira, J.; Dinis, L.-T.; Correia, C.; Moriondo, M.; Leolini, L.; Dibari, C.; Costafreda-Aumedes, S.; et al. A Review of the Potential Climate Change Impacts and Adaptation Options for European Viticulture. Appl. Sci. 2020, 10, 3092. [Google Scholar] [CrossRef]
  4. Moreno, H.; Andújar, D. Proximal Sensing for Geometric Characterization of Vines: A Review of the Latest Advances. Comput. Electron. Agric. 2023, 210, 107901. [Google Scholar] [CrossRef]
  5. Morais, R.; Fernandes, M.A.; Matos, S.G.; Serôdio, C.; Ferreira, P.J.S.G.; Reis, M.J.C.S. A ZigBee Multi-Powered Wireless Acquisition Device for Remote Sensing Applications in Precision Viticulture. Comput. Electron. Agric. 2008, 62, 94–106. [Google Scholar] [CrossRef]
  6. Matese, A.; Gennaro, S.F.D. Technology in Precision Viticulture: A State of the Art Review. IJWR 2015, 7, 69–81. [Google Scholar] [CrossRef]
  7. Rosell Polo, J.R.; Sanz, R.; Llorens, J.; Arnó, J.; Escolà, A.; Ribes-Dasi, M.; Masip, J.; Camp, F.; Gràcia, F.; Solanelles, F.; et al. A Tractor-Mounted Scanning LIDAR for the Non-Destructive Measurement of Vegetative Volume and Surface Area of Tree-Row Plantations: A Comparison with Conventional Destructive Measurements. Biosyst. Eng. 2009, 102, 128–134. [Google Scholar] [CrossRef]
  8. Matese, A.; Di Gennaro, S.F.; Orlandi, G.; Gatti, M.; Poni, S. Assessing Grapevine Biophysical Parameters from Unmanned Aerial Vehicles Hyperspectral Imagery. Front. Plant Sci. 2022, 13, 898722. [Google Scholar] [CrossRef]
  9. Cataldo, E.; Fucile, M.; Mattii, G.B. A Review: Soil Management, Sustainable Strategies and Approaches to Improve the Quality of Modern Viticulture. Agronomy 2021, 11, 2359. [Google Scholar] [CrossRef]
  10. Zhou, L.; Xue, X.; Zhou, L.; Zhang, L.; Ding, S.; Chang, C.; Zhang, X.; Chen, C. Research Situation and Progress Analysis on Orchard Variable Rate Spraying Technology. Trans. Chin. Soc. Agric. Eng. 2017, 33, 80–92. [Google Scholar]
  11. Sommer, K.J.; Islam, M.T.; Clingeleffer, P.R. Light and Temperature Effects on Shoot Fruitfulness in Vitis vinifera L. Cv. Sultana: Influence of Trellis Type and Grafting. Aust. J. Grape Wine Res. 2000, 6, 99–108. [Google Scholar] [CrossRef]
  12. Petrie, P.R.; Trought, M.C.T.; Howell, G.S.; Buchan, G.D.; Palmer, J.W. Whole-Canopy Gas Exchange and Light Interception of Vertically Trained Vitis vinifera L. under Direct and Diffuse Light. Am. J. Enol. Vitic. 2009, 60, 173–182. [Google Scholar] [CrossRef]
  13. Haselgrove, L.; Botting, D.; van Heeswijck, R.; Høj, P.B.; Dry, P.R.; Ford, C.; Land, P.G.I. Canopy Microclimate and Berry Composition: The Effect of Bunch Exposure on the Phenolic Composition of Vitis vinifera L Cv. Shiraz Grape Berries. Aust. J. Grape Wine Res. 2000, 6, 141–149. [Google Scholar] [CrossRef]
  14. Ehlert, D.; Horn, H.-J.; Adamek, R. Measuring Crop Biomass Density by Laser Triangulation. Comput. Electron. Agric. 2008, 61, 117–125. [Google Scholar] [CrossRef]
  15. Ding Weimin, Z.S. Measurement Methods of Fruit Tree Canopy Volume Based on Machine Vision. Nongye Jixie Xuebao/Trans. Chin. Soc. Agric. Mach. 2016, 47, 1–10. [Google Scholar]
  16. Qi, Y.; Dong, X.; Chen, P.; Lee, K.-H.; Lan, Y.; Lu, X.; Jia, R.; Deng, J.; Zhang, Y. Canopy Volume Extraction of Citrus Reticulate Blanco Cv. Shatangju Trees Using UAV Image-Based Point Cloud Deep Learning. Remote Sens. 2021, 13, 3437. [Google Scholar] [CrossRef]
  17. Fernández-Sarría, A.; López-Cortés, I.; Estornell, J.; Velázquez-Martí, B.; Salazar, D. Estimating Residual Biomass of Olive Tree Crops Using Terrestrial Laser Scanning. Int. J. Appl. Earth Obs. Geoinf. 2019, 75, 163–170. [Google Scholar] [CrossRef]
  18. Verma, N.K.; Lamb, D.W.; Reid, N.; Wilson, B. Comparison of Canopy Volume Measurements of Scattered Eucalypt Farm Trees Derived from High Spatial Resolution Imagery and LiDAR. Remote Sens. 2016, 8, 388. [Google Scholar] [CrossRef]
  19. Chiappini, S.; Giorgi, V.; Neri, D.; Galli, A.; Marcheggiani, E.; Savina Malinverni, E.; Pierdicca, R.; Balestra, M. Innovation in Olive-Growing by Proximal Sensing LiDAR for Tree Volume Estimation. In Proceedings of the 2022 IEEE Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Perugia, Italy, 3–5 November 2022; pp. 213–217. [Google Scholar]
  20. Hawryło, P.; Socha, J.; Wężyk, P.; Ochał, W.; Krawczyk, W.; Miszczyszyn, J.; Tymińska-Czabańska, L. How to Adequately Determine the Top Height of Forest Stands Based on Airborne Laser Scanning Point Clouds? For. Ecol. Manag. 2024, 551, 121528. [Google Scholar] [CrossRef]
  21. Matese, A.; Di Gennaro, S.F.; Berton, A. Assessment of a Canopy Height Model (CHM) in a Vineyard Using UAV-Based Multispectral Imaging. Int. J. Remote Sens. 2017, 38, 2150–2160. [Google Scholar] [CrossRef]
  22. Wang, Y.; Wang, J.; Chang, S.; Sun, L.; An, L.; Chen, Y.; Xu, J. Classification of Street Tree Species Using UAV Tilt Photogrammetry. Remote Sens. 2021, 13, 216. [Google Scholar] [CrossRef]
  23. Balestra, M.; Tonelli, E.; Vitali, A.; Urbinati, C.; Frontoni, E.; Pierdicca, R. Geomatic Data Fusion for 3D Tree Modeling: The Case Study of Monumental Chestnut Trees. Remote Sens. 2023, 15, 2197. [Google Scholar] [CrossRef]
  24. Wężyk, P.; Hawryło, P.; Szostak, M.; Zięba-Kulawik, K.; Winczek, M.; Siedlarczyk, E.; Kurzawiński, A.; Rydzyk, J.; Kmiecik, J.; Gilewski, W.; et al. Using LiDAR Point Clouds in Determination of the Scots Pine Stands Spatial Structure Meaning in the Conservation of Lichen Communities in “Bory Tucholskie” National Park. Arch. Photogramm. Cartogr. Remote Sens. 2019, 31, 85–103. [Google Scholar] [CrossRef]
  25. Li, X.; Wang, L.; Guan, H.; Chen, K.; Zang, Y.; Yu, Y. Urban Tree Species Classification Using UAV-Based Multispectral Images and LiDAR Point Clouds. J. Geovis Spat. Anal. 2023, 8, 5. [Google Scholar] [CrossRef]
26. Nistor-Lopatenco, L.; Tiganu, E.; Vlasenco, A.; Iacovlev, A.; Grama, V. Creation of the Point Cloud and the 3D Model for the Above-Ground Infrastructure in the City of Chisinau by Modern Geodetic Methods. Bull. Transilv. Univ. Brasov Ser. I Eng. Sci. 2022, 15, 9–18. [Google Scholar] [CrossRef]
  27. Bieda, A.; Balawejder, M.; Warchoł, A.; Bydłosz, J.; Kolodiy, P.; Pukanská, K. Use of 3D Technology in Underground Tourism: Example of Rzeszow (Poland) and Lviv (Ukraine). Acta Montan. Slovaca 2021, 26, 205–221. [Google Scholar] [CrossRef]
  28. Ahmed, R.; Mahmud, K.H.; Tuya, J.H. A GIS-Based Mathematical Approach for Generating 3D Terrain Model from High-Resolution UAV Imageries. J. Geovis Spat. Anal. 2021, 5, 24. [Google Scholar] [CrossRef]
  29. da Silva, D.Q.; Aguiar, A.S.; dos Santos, F.N.; Sousa, A.J.; Rabino, D.; Biddoccu, M.; Bagagiolo, G.; Delmastro, M. Measuring Canopy Geometric Structure Using Optical Sensors Mounted on Terrestrial Vehicles: A Case Study in Vineyards. Agriculture 2021, 11, 208. [Google Scholar] [CrossRef]
  30. Njoroge, B.M.; Fei, T.K.; Thiruchelvam, V. A Research Review of Precision Farming Techniques and Technology. J. Appl. Technol. Innov. 2018, 2, 22–30. [Google Scholar]
31. Torres-Sánchez, J.; Escolà, A.; de Castro, A.I.; López-Granados, F.; Rosell-Polo, J.R.; Sebé, F.; Jiménez-Brenes, F.M.; Sanz, R.; Gregorio, E.; Peña, J.M. Mobile Terrestrial Laser Scanner vs. UAV Photogrammetry to Estimate Woody Crop Canopy Parameters—Part 2: Comparison for Different Crops and Training Systems. Comput. Electron. Agric. 2023, 212, 108083. [Google Scholar] [CrossRef]
  32. Dassot, M.; Constant, T.; Fournier, M. The Use of Terrestrial LiDAR Technology in Forest Science: Application Fields, Benefits and Challenges. Ann. For. Sci. 2011, 68, 959–974. [Google Scholar] [CrossRef]
  33. Jung, J.; Kim, T.; Min, H.; Kim, S.; Jung, Y.-H. Intricacies of Opening Geometry Detection in Terrestrial Laser Scanning: An Analysis Using Point Cloud Data from BLK360. Remote Sens. 2024, 16, 759. [Google Scholar] [CrossRef]
  34. Pagliai, A.; Ammoniaci, M.; Sarri, D.; Lisci, R.; Perria, R.; Vieri, M.; D’Arcangelo, M.E.M.; Storchi, P.; Kartsiotis, S.-P. Comparison of Aerial and Ground 3D Point Clouds for Canopy Size Assessment in Precision Viticulture. Remote Sens. 2022, 14, 1145. [Google Scholar] [CrossRef]
  35. Jiang, Y.; Li, C.; Takeda, F.; Kramer, E.A.; Ashrafi, H.; Hunter, J. 3D Point Cloud Data to Quantitatively Characterize Size and Shape of Shrub Crops. Hortic. Res. 2019, 6, 43. [Google Scholar] [CrossRef]
  36. Colaço, A.F.; Trevisan, R.G.; Molin, J.P.; Rosell-Polo, J.R.; Escolà, A. A Method to Obtain Orange Crop Geometry Information Using a Mobile Terrestrial Laser Scanner and 3D Modeling. Remote Sens. 2017, 9, 763. [Google Scholar] [CrossRef]
  37. Moorthy, I.; Miller, J.R.; Berni, J.A.J.; Zarco-Tejada, P.; Hu, B.; Chen, J. Field Characterization of Olive (Olea europaea L.) Tree Crown Architecture Using Terrestrial Laser Scanning Data. Agric. For. Meteorol. 2011, 151, 204–214. [Google Scholar] [CrossRef]
  38. Murray, J.; Fennell, J.T.; Blackburn, G.A.; Whyatt, J.D.; Li, B. The Novel Use of Proximal Photogrammetry and Terrestrial LiDAR to Quantify the Structural Complexity of Orchard Trees. Precis. Agric. 2020, 21, 473–483. [Google Scholar] [CrossRef]
  39. Tsoulias, N.; Paraforos, D.S.; Fountas, S.; Zude-Sasse, M. Calculating the Water Deficit Spatially Using LiDAR Laser Scanner in an Apple Orchard. In Precision Agriculture ’19; Wageningen Academic Publishers: Wageningen, The Netherlands, 2019; pp. 115–121. ISBN 978-90-8686-337-2. [Google Scholar]
  40. Andújar, D.; Moreno, H.; Bengochea-Guevara, J.M.; de Castro, A.; Ribeiro, A. Aerial Imagery or On-Ground Detection? An Economic Analysis for Vineyard Crops. Comput. Electron. Agric. 2019, 157, 351–358. [Google Scholar] [CrossRef]
  41. del-Campo-Sanchez, A.; Moreno, M.; Ballesteros, R.; Hernandez-Lopez, D. Geometric Characterization of Vines from 3D Point Clouds Obtained with Laser Scanner Systems. Remote Sens. 2019, 11, 2365. [Google Scholar] [CrossRef]
  42. Lowe, T.; Moghadam, P.; Edwards, E.; Williams, J. Canopy Density Estimation in Perennial Horticulture Crops Using 3D Spinning Lidar SLAM. J. Field Robot. 2021, 38, 598–618. [Google Scholar] [CrossRef]
  43. Moreno, H.; Valero, C.; Bengochea-Guevara, J.M.; Ribeiro, Á.; Garrido-Izard, M.; Andújar, D. On-Ground Vineyard Reconstruction Using a LiDAR-Based Automated System. Sensors 2020, 20, 1102. [Google Scholar] [CrossRef] [PubMed]
  44. Petrović, I.; Sečnik, M.; Hočevar, M.; Berk, P. Vine Canopy Reconstruction and Assessment with Terrestrial Lidar and Aerial Imaging. Remote Sens. 2022, 14, 5894. [Google Scholar] [CrossRef]
  45. Sanz, R.; Llorens, J.; Escolà, A.; Arnó, J.; Planas, S.; Román, C.; Rosell-Polo, J.R. LIDAR and Non-LIDAR-Based Canopy Parameters to Estimate the Leaf Area in Fruit Trees and Vineyard. Agric. For. Meteorol. 2018, 260–261, 229–239. [Google Scholar] [CrossRef]
  46. Tagarakis, A.C.; Koundouras, S.; Fountas, S.; Gemtos, T. Evaluation of the Use of LIDAR Laser Scanner to Map Pruning Wood in Vineyards and Its Potential for Management Zones Delineation. Precis. Agric. 2018, 19, 334–347. [Google Scholar] [CrossRef]
  47. Keightley, K.E.; Bawden, G.W. 3D Volumetric Modeling of Grapevine Biomass Using Tripod LiDAR. Comput. Electron. Agric. 2010, 74, 305–312. [Google Scholar] [CrossRef]
  48. Rinaldi, M.F.; Llorens Calveras, J.; Gil Moya, E. Electronic Characterization of the Phenological Stages of Grapevine Using a LIDAR Sensor. In Precision Agriculture’13; Wageningen Academic Publishers: Wageningen, The Netherlands, 2013. [Google Scholar]
49. De Bei, R.; Fuentes, S.; Gilliham, M.; Tyerman, S.; Edwards, E.; Bianchini, N.; Smith, J.; Collins, C. VitiCanopy: A Free Computer App to Estimate Canopy Vigor and Porosity for Grapevine. Sensors 2016, 16, 585. [Google Scholar] [CrossRef] [PubMed]
  50. Llorens, J.; Gil, E.; Llop, J.; Escolà, A. Ultrasonic and LIDAR Sensors for Electronic Canopy Characterization in Vineyards: Advances to Improve Pesticide Application Methods. Sensors 2011, 11, 2177–2194. [Google Scholar] [CrossRef]
51. Roy, P.S.; Behera, M.D.; Srivastav, S.K. Satellite Remote Sensing: Sensors, Applications and Techniques. Proc. Natl. Acad. Sci. India Sect. A Phys. Sci. 2017, 87, 465–472. [Google Scholar] [CrossRef]
  52. Caruso, G.; Tozzini, L.; Rallo, G.; Primicerio, J.; Moriondo, M.; Palai, G.; Gucci, R. Estimating Biophysical and Geometrical Parameters of Grapevine Canopies (‘Sangiovese’) by an Unmanned Aerial Vehicle (UAV) and VIS-NIR Cameras. VITIS—J. Grapevine Res. 2017, 56, 63–70. [Google Scholar] [CrossRef]
53. Weiss, M.; Baret, F. Using 3D Point Clouds Derived from UAV RGB Imagery to Describe Vineyard 3D Macro-Structure. Remote Sens. 2017, 9, 111. [Google Scholar] [CrossRef]
  54. Nex, F.; Remondino, F. UAV for 3D Mapping Applications: A Review. Appl. Geomat. 2014, 6, 1–15. [Google Scholar] [CrossRef]
  55. Ferro, M.V.; Catania, P.; Miccichè, D.; Pisciotta, A.; Vallone, M.; Orlando, S. Assessment of Vineyard Vigour and Yield Spatio-Temporal Variability Based on UAV High Resolution Multispectral Images. Biosyst. Eng. 2023, 231, 36–56. [Google Scholar] [CrossRef]
  56. Ouyang, J.; Bei, R.D.; Collins, C. Assessment of Canopy Size Using UAV-Based Point Cloud Analysis to Detect the Severity and Spatial Distribution of Canopy Decline. OENO One 2021, 55, 253–266. [Google Scholar] [CrossRef]
  57. Jurado, J.M.; Pádua, L.; Feito, F.R.; Sousa, J.J. Automatic Grapevine Trunk Detection on UAV-Based Point Cloud. Remote Sens. 2020, 12, 3043. [Google Scholar] [CrossRef]
  58. Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Gay, P. Unsupervised Detection of Vineyards by 3D Point-Cloud UAV Photogrammetry for Precision Agriculture. Comput. Electron. Agric. 2018, 155, 84–95. [Google Scholar] [CrossRef]
  59. Comba, L.; Zaman, S.; Biglia, A.; Ricauda Aimonino, D.; Dabbene, F.; Gay, P. Semantic Interpretation and Complexity Reduction of 3D Point Clouds of Vineyards. Biosyst. Eng. 2020, 197, 216–230. [Google Scholar] [CrossRef]
  60. López-Granados, F.; Torres-Sánchez, J.; Jiménez-Brenes, F.M.; Oneka, O.; Marín, D.; Loidi, M.; de Castro, A.I.; Santesteban, L.G. Monitoring Vineyard Canopy Management Operations Using UAV-Acquired Photogrammetric Point Clouds. Remote Sens. 2020, 12, 2331. [Google Scholar] [CrossRef]
  61. Torres-Sánchez, J.; Mesas-Carrascosa, F.J.; Santesteban, L.-G.; Jiménez-Brenes, F.M.; Oneka, O.; Villa-Llop, A.; Loidi, M.; López-Granados, F. Grape Cluster Detection Using UAV Photogrammetric Point Clouds as a Low-Cost Tool for Yield Forecasting in Vineyards. Sensors 2021, 21, 3083. [Google Scholar] [CrossRef]
  62. García-Fernández, M.; Sanz-Ablanedo, E.; Pereira-Obaya, D.; Rodríguez-Pérez, J.R. Vineyard Pruning Weight Prediction Using 3D Point Clouds Generated from UAV Imagery and Structure from Motion Photogrammetry. Agronomy 2021, 11, 2489. [Google Scholar] [CrossRef]
  63. Mathews, A.J.; Jensen, J.L.R. Visualizing and Quantifying Vineyard Canopy LAI Using an Unmanned Aerial Vehicle (UAV) Collected High Density Structure from Motion Point Cloud. Remote Sens. 2013, 5, 2164–2183. [Google Scholar] [CrossRef]
  64. Comba, L.; Biglia, A.; Ricauda Aimonino, D.; Tortia, C.; Mania, E.; Guidoni, S.; Gay, P. Leaf Area Index Evaluation in Vineyards Using 3D Point Clouds from UAV Imagery. Precis. Agric. 2020, 21, 881–896. [Google Scholar] [CrossRef]
  65. Campos, J.; Llop, J.; Gallart, M.; García-Ruiz, F.; Gras, A.; Salcedo, R.; Gil, E. Development of Canopy Vigour Maps Using UAV for Site-Specific Management during Vineyard Spraying Process. Precis. Agric. 2019, 20, 1136–1156. [Google Scholar] [CrossRef]
  66. Cantürk, M.; Zabawa, L.; Pavlic, D.; Dreier, A.; Klingbeil, L.; Kuhlmann, H. UAV-Based Individual Plant Detection and Geometric Parameter Extraction in Vineyards. Front. Plant Sci. 2023, 14, 1244384. [Google Scholar] [CrossRef] [PubMed]
  67. Escolà, A.; Peña, J.M.; López-Granados, F.; Rosell-Polo, J.R.; de Castro, A.I.; Gregorio, E.; Jiménez-Brenes, F.M.; Sanz, R.; Sebé, F.; Llorens, J.; et al. Mobile Terrestrial Laser Scanner vs. UAV Photogrammetry to Estimate Woody Crop Canopy Parameters—Part 1: Methodology and Comparison in Vineyards. Comput. Electron. Agric. 2023, 212, 108109. [Google Scholar] [CrossRef]
  68. Hobart, M.; Pflanz, M.; Weltzien, C.; Schirrmann, M. Growth Height Determination of Tree Walls for Precise Monitoring in Apple Fruit Production Using UAV Photogrammetry. Remote Sens. 2020, 12, 1656. [Google Scholar] [CrossRef]
69. Lorenz, D.H.; Eichhorn, K.W.; Bleiholder, H.; Klose, R.; Meier, U.; Weber, E. Growth Stages of the Grapevine: Phenological Growth Stages of the Grapevine (Vitis vinifera L. ssp. Vinifera)—Codes and Descriptions According to the Extended BBCH Scale. Aust. J. Grape Wine Res. 1995, 1, 100–103. [Google Scholar] [CrossRef]
  70. Sohl, M.A.; Mahmood, S.A. Low-Cost UAV in Photogrammetric Engineering and Remote Sensing: Georeferencing, DEM Accuracy, and Geospatial Analysis. J. Geovis Spat. Anal. 2024, 8, 14. [Google Scholar] [CrossRef]
71. Rouse, J.W., Jr.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring Vegetation Systems in the Great Plains with ERTS. In Proceedings of the Third ERTS-1 Symposium; NASA: Washington, DC, USA, 1974; Volume 1, pp. 309–317. [Google Scholar]
  72. Pingel, T.J.; Clarke, K.C.; McBride, W.A. An Improved Simple Morphological Filter for the Terrain Classification of Airborne LIDAR Data. ISPRS J. Photogramm. Remote Sens. 2013, 77, 21–30. [Google Scholar] [CrossRef]
  73. Edelsbrunner, H.; Mücke, E.P. Three-Dimensional Alpha Shapes. ACM Trans. Graph. 1994, 13, 43–72. [Google Scholar] [CrossRef]
  74. Di Gennaro, S.F.; Matese, A. Evaluation of Novel Precision Viticulture Tool for Canopy Biomass Estimation and Missing Plant Detection Based on 2.5D and 3D Approaches Using RGB Images Acquired by UAV Platform. Plant Methods 2020, 16, 91. [Google Scholar] [CrossRef]
  75. Liu, X.; Wang, Y.; Kang, F.; Yue, Y.; Zheng, Y. Canopy Parameter Estimation of Citrus Grandis Var. Longanyou Based on LiDAR 3D Point Clouds. Remote Sens. 2021, 13, 1859. [Google Scholar] [CrossRef]
  76. R Core Team. R: A Language and Environment for Statistical Computing; R Core Team: Vienna, Austria, 2013. [Google Scholar]
  77. Legendre, P. Lmodel2: Model II Regression 2018. Available online: https://cran.r-project.org/web/packages/lmodel2/index.html (accessed on 2 August 2024).
78. Wickham, H.; Chang, W.; Henry, L.; Pedersen, T.L.; Takahashi, K.; Wilke, C.; Woo, K.; Yutani, H.; Dunnington, D.; van den Brand, T.; et al. Ggplot2: Create Elegant Data Visualisations Using the Grammar of Graphics; R Package, 2024. [Google Scholar]
  79. Mukaka, M.M. A Guide to Appropriate Use of Correlation Coefficient in Medical Research. Malawi Med. J. 2012, 24, 69–71. [Google Scholar]
  80. Haukoos, J.S.; Lewis, R.J. Advanced Statistics: Bootstrapping Confidence Intervals for Statistics with “Difficult” Distributions. Acad. Emerg. Med. 2005, 12, 360–365. [Google Scholar] [CrossRef]
  81. Field, A. Discovering Statistics Using IBM SPSS Statistics; SAGE: Newcastle upon Tyne, UK, 2013; ISBN 978-1-4462-7458-3. [Google Scholar]
  82. Mazzetto, F.; Calcante, A.; Mena, A.; Vercesi, A. Integration of Optical and Analogue Sensors for Monitoring Canopy Health and Vigour in Precision Viticulture. Precis. Agric. 2010, 11, 636–649. [Google Scholar] [CrossRef]
  83. Cabrera-Pérez, C.; Llorens, J.; Escolà, A.; Royo-Esnal, A.; Recasens, J. Organic Mulches as an Alternative for Under-Vine Weed Management in Mediterranean Irrigated Vineyards: Impact on Agronomic Performance. Eur. J. Agron. 2023, 145, 126798. [Google Scholar] [CrossRef]
  84. Matese, A.; Di Gennaro, S.F. Practical Applications of a Multisensor UAV Platform Based on Multispectral, Thermal and RGB High Resolution Images in Precision Viticulture. Agriculture 2018, 8, 116. [Google Scholar] [CrossRef]
  85. Pádua, L.; Adão, T.; Sousa, A.; Peres, E.; Sousa, J.J. Individual Grapevine Analysis in a Multi-Temporal Context Using UAV-Based Multi-Sensor Imagery. Remote Sens. 2020, 12, 139. [Google Scholar] [CrossRef]
  86. Pádua, L.; Marques, P.; Adão, T.; Guimarães, N.; Sousa, A.; Peres, E.; Sousa, J.J. Vineyard Variability Analysis through UAV-Based Vigour Maps to Assess Climate Change Impacts. Agronomy 2019, 9, 581. [Google Scholar] [CrossRef]
  87. Cao, L.; Liu, H.; Fu, X.; Zhang, Z.; Shen, X.; Ruan, H. Comparison of UAV LiDAR and Digital Aerial Photogrammetry Point Clouds for Estimating Forest Structural Attributes in Subtropical Planted Forests. Forests 2019, 10, 145. [Google Scholar] [CrossRef]
  88. Guimarães, N.; Pádua, L.; Marques, P.; Silva, N.; Peres, E.; Sousa, J.J. Forestry Remote Sensing from Unmanned Aerial Vehicles: A Review Focusing on the Data, Processing and Potentialities. Remote Sens. 2020, 12, 1046. [Google Scholar] [CrossRef]
  89. Jurado, J.M.; López, A.; Pádua, L.; Sousa, J.J. Remote Sensing Image Fusion on 3D Scenarios: A Review of Applications for Agriculture and Forestry. Int. J. Appl. Earth Obs. Geoinf. 2022, 112, 102856. [Google Scholar] [CrossRef]
  90. Chakraborty, M.; Khot, L.R.; Sankaran, S.; Jacoby, P.W. Evaluation of Mobile 3D Light Detection and Ranging Based Canopy Mapping System for Tree Fruit Crops. Comput. Electron. Agric. 2019, 158, 284–293. [Google Scholar] [CrossRef]
  91. Buunk, T.; Vélez, S.; Ariza-Sentís, M.; Valente, J. Comparing Nadir and Oblique Thermal Imagery in UAV-Based 3D Crop Water Stress Index Applications for Precision Viticulture with LiDAR Validation. Sensors 2023, 23, 8625. [Google Scholar] [CrossRef]
  92. Tumbo, S.D.; Salyani, M.; Whitney, J.D.; Wheaton, T.A.; Miller, W.M. Investigation of Laser and Ultrasonic Ranging Sensors for Measurements of Citrus Canopy Volume. Appl. Eng. Agric. 2002, 18, 367. [Google Scholar] [CrossRef]
  93. Johansen, K.; Raharjo, T.; McCabe, M.F. Using Multi-Spectral UAV Imagery to Extract Tree Crop Structural Properties and Assess Pruning Effects. Remote Sens. 2018, 10, 854. [Google Scholar] [CrossRef]
  94. Li, M.; Shamshiri, R.R.; Schirrmann, M.; Weltzien, C. Impact of Camera Viewing Angle for Estimating Leaf Parameters of Wheat Plants from 3D Point Clouds. Agriculture 2021, 11, 563. [Google Scholar] [CrossRef]
  95. Che, Y.; Wang, Q.; Xie, Z.; Zhou, L.; Li, S.; Hui, F.; Wang, X.; Li, B.; Ma, Y. Estimation of Maize Plant Height and Leaf Area Index Dynamics Using an Unmanned Aerial Vehicle with Oblique and Nadir Photography. Ann. Bot. 2020, 126, 765–773. [Google Scholar] [CrossRef]
  96. Mesas-Carrascosa, F.-J.; Torres-Sánchez, J.; Clavero-Rumbao, I.; García-Ferrer, A.; Peña, J.-M.; Borra-Serrano, I.; López-Granados, F. Assessing Optimal Flight Parameters for Generating Accurate Multispectral Orthomosaicks by UAV to Support Site-Specific Crop Management. Remote Sens. 2015, 7, 12793–12814. [Google Scholar] [CrossRef]
  97. Ottoy, S.; Tziolas, N.; Van Meerbeek, K.; Aravidis, I.; Tilkin, S.; Sismanis, M.; Stavrakoudis, D.; Gitas, I.Z.; Zalidis, G.; De Vocht, A. Effects of Flight and Smoothing Parameters on the Detection of Taxus and Olive Trees with UAV-Borne Imagery. Drones 2022, 6, 197. [Google Scholar] [CrossRef]
  98. Zhu, W.; Rezaei, E.E.; Nouri, H.; Sun, Z.; Li, J.; Yu, D.; Siebert, S. UAV Flight Height Impacts on Wheat Biomass Estimation via Machine and Deep Learning. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2023, 16, 7471–7485. [Google Scholar] [CrossRef]
  99. Catania, P.; Ferro, M.V.; Roma, E.; Orlando, S.; Vallone, M. Evaluation of Different Flight Courses with UAV in Vineyard. In Proceedings of the AIIA 2022: Biosystems Engineering Towards the Green Deal, Palermo, Italy, 19–22 September 2022; Ferro, V., Giordano, G., Orlando, S., Vallone, M., Cascone, G., Porto, S.M.C., Eds.; Springer International Publishing: Cham, Switzerland, 2023; pp. 457–467. [Google Scholar]
  100. Leica BLK360 Imaging Laser Scanner. Available online: https://leica-geosystems.com/en-US/products/laser-scanners/scanners/blk360 (accessed on 24 February 2024).
  101. Bauwens, S.; Bartholomeus, H.; Calders, K.; Lejeune, P. Forest Inventory with Terrestrial LiDAR: A Comparison of Static and Hand-Held Mobile Laser Scanning. Forests 2016, 7, 127. [Google Scholar] [CrossRef]
  102. Cabo, C.; Del Pozo, S.; Rodríguez-Gonzálvez, P.; Ordóñez, C.; González-Aguilera, D. Comparing Terrestrial Laser Scanning (TLS) and Wearable Laser Scanning (WLS) for Individual Tree Modeling at Plot Level. Remote Sens. 2018, 10, 540. [Google Scholar] [CrossRef]
103. Tarolli, P.; Sofia, G.; Calligaro, S.; Prosdocimi, M.; Preti, F.; Fontana, G.D. Vineyards in Terraced Landscapes: New Opportunities from Lidar Data. Land Degrad. Dev. 2015, 26, 92–102. [Google Scholar] [CrossRef]
  104. Jurado, J.M.; Ortega, L.; Cubillas, J.J.; Feito, F.R. Multispectral Mapping on 3D Models and Multi-Temporal Monitoring for Individual Characterization of Olive Trees. Remote Sens. 2020, 12, 1106. [Google Scholar] [CrossRef]
  105. López, A.; Jurado, J.M.; Ogayar, C.J.; Feito, F.R. An Optimized Approach for Generating Dense Thermal Point Clouds from UAV-Imagery. ISPRS J. Photogramm. Remote Sens. 2021, 182, 78–95. [Google Scholar] [CrossRef]
  106. Comba, L.; Biglia, A.; Aimonino, D.R.; Barge, P.; Tortia, C.; Gay, P. 2D and 3D Data Fusion for Crop Monitoring in Precision Agriculture. In Proceedings of the 2019 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Portici, Italy, 24–26 October 2019; pp. 62–67. [Google Scholar]
  107. Cucchiaro, S.; Fallu, D.J.; Zhang, H.; Walsh, K.; Van Oost, K.; Brown, A.G.; Tarolli, P. Multiplatform-SfM and TLS Data Fusion for Monitoring Agricultural Terraces in Complex Topographic and Landcover Conditions. Remote Sens. 2020, 12, 1946. [Google Scholar] [CrossRef]
Figure 1. Overview of the vineyard plot surveyed by the unmanned aerial vehicle (a) and the area scanned by the terrestrial laser scanner, along with the studied grapevines (b).
Figure 2. Equipment used and data acquisition tasks: (a) BLK360 G1; (b) Matrice 300 RTK; (c) acquisition of coordinates from deployed targets; (d) D-RTK 2 high precision GNSS mobile station; and (e) measurement of grapevine height.
Figure 3. View of ground control points in the different imagery data types acquired by the sensors on the unmanned aerial vehicle. NIR: near-infrared; PAN: panchromatic; TIR: thermal infrared.
Figure 4. Comparative view of the point clouds generated from the different sensors in part of a grapevine row in the study area. TLS: terrestrial laser scanner; MSP: multispectral; PAN: panchromatic; TIR: thermal infrared.
Figure 5. Top perspective of the point distribution along the grapevine canopy from point clouds generated by different sensors in part of a grapevine row. TLS: terrestrial laser scanner; MSP: multispectral; PAN: panchromatic; TIR: thermal infrared.
Figure 6. Spectral and thermal behavior of acquired data in different vineyard elements (grapevine, bare soil, other vegetation): (a) visual representation of the raster products, (b) reflectance in the five spectral bands, and (c) mean values of normalized difference vegetation index (NDVI) and land surface temperature (LST).
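The NDVI values in Figure 6 follow the standard Rouse et al. [71] formulation, NDVI = (NIR − Red) / (NIR + Red), computed per pixel from the multispectral reflectance bands. As an illustration only (the study's rasters were produced in photogrammetric software; the reflectance values below are invented for the three surface classes in the figure), a minimal Python sketch:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index per pixel.

    Inputs are reflectance arrays of the same shape; pixels where
    NIR + Red == 0 are returned as NaN to avoid division by zero.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    out = np.full_like(denom, np.nan)
    np.divide(nir - red, denom, out=out, where=denom != 0)
    return out

# Invented reflectance values for grapevine, bare soil, other vegetation
nir = np.array([0.45, 0.30, 0.40])
red = np.array([0.05, 0.25, 0.08])
print(ndvi(nir, red))  # vegetation pixels approach 1, bare soil stays near 0
```

Dense vegetation drives the index toward 1 because NIR reflectance far exceeds red reflectance, which is the contrast Figure 6b,c exploits to separate grapevine canopy from soil.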
Figure 7. Box plot distribution of geometrical parameters of the analyzed grapevines obtained using different measurement methods: (a) height metrics, (b) projected area, and (c) canopy volume.
Figure 8. Correlation matrix between height variables obtained from different sensors for (a) maximum height and field-measured height; (b) height at the 95th percentile; and (c) height at the 90th percentile. The diagonal line is intentionally omitted.
Figure 9. Correlation matrix for grapevine projected area (a) and canopy volume (b) from the different sensors. The diagonal line is intentionally omitted.
Table 1. Planimetric (XY), altimetric (Z), and overall (XYZ) Root Mean Square Error (RMSE) of the imagery alignment for each UAV sensor during photogrammetric processing.

Sensor | Imagery Type | RMSE XY (m) | RMSE Z (m) | RMSE XYZ (m) | Spatial Res. (m)
ALTUM-PT | Multispectral | 0.005 | 0.021 | 0.013 | 0.0262
ALTUM-PT | Panchromatic | 0.006 | 0.017 | 0.011 | 0.0124
Zenmuse L1 | RGB | 0.008 | 0.018 | 0.012 | 0.0158
Zenmuse H20T | RGB | 0.008 | 0.022 | 0.014 | 0.0197
Zenmuse H20T | Thermal infrared | 0.016 | 0.072 | 0.044 | 0.0527
Table 2. Statistics of the height variables of the grapevines for the different measurement tools. Mean values with the same letters do not present significant differences (p < 0.05). SD: standard deviation; CV: coefficient of variation; MSP: multispectral; PAN: panchromatic; TIR: thermal infrared.

Sensor | Data Type | Min. | Max. | Mean | SD | CV (%)
Maximum height (m)
— | Measured | 1.24 | 1.81 | 1.57 a | 0.17 | 10.84
BLK360 | TLS | 1.26 | 1.91 | 1.59 a | 0.18 | 11.67
ALTUM-PT | MSP | 0.79 | 1.95 | 1.48 a,b | 0.24 | 16.25
ALTUM-PT | PAN | 1.22 | 1.83 | 1.55 a | 0.18 | 11.35
Zenmuse H20T | RGB | 1.12 | 1.80 | 1.48 a,b | 0.19 | 13.11
Zenmuse H20T | TIR | 0.36 | 1.65 | 1.30 c | 0.32 | 24.32
Zenmuse L1 | RGB | 1.28 | 1.84 | 1.53 a | 0.16 | 10.34
Zenmuse L1 | LiDAR | 0.58 | 1.73 | 1.37 b,c | 0.29 | 21.05
90th percentile of height (m)
BLK360 | TLS | 0.87 | 1.48 | 1.27 a | 0.18 | 14.32
ALTUM-PT | MSP | 0.65 | 1.50 | 1.25 a,b | 0.20 | 15.74
ALTUM-PT | PAN | 0.78 | 1.59 | 1.33 a | 0.19 | 13.89
Zenmuse H20T | RGB | 0.88 | 1.56 | 1.25 a,b | 0.21 | 16.64
Zenmuse H20T | TIR | 0.35 | 1.38 | 1.10 b,c | 0.27 | 24.89
Zenmuse L1 | RGB | 0.98 | 1.59 | 1.32 a | 0.16 | 12.31
Zenmuse L1 | LiDAR | 0.51 | 1.39 | 1.10 c | 0.23 | 20.63
95th percentile of height (m)
BLK360 | TLS | 0.94 | 1.61 | 1.34 a | 0.19 | 13.80
ALTUM-PT | MSP | 0.71 | 1.60 | 1.31 a | 0.20 | 15.52
ALTUM-PT | PAN | 0.81 | 1.64 | 1.38 a | 0.19 | 13.61
Zenmuse H20T | RGB | 0.97 | 1.62 | 1.31 a | 0.20 | 15.41
Zenmuse H20T | TIR | 0.36 | 1.49 | 1.16 b | 0.28 | 24.42
Zenmuse L1 | RGB | 1.05 | 1.65 | 1.37 a | 0.16 | 11.40
Zenmuse L1 | LiDAR | 0.56 | 1.42 | 1.15 b | 0.22 | 19.26
Table 3. Statistics of the projected area and canopy volume of the grapevines for the different measurement tools. Mean values with the same letters do not present significant differences (p < 0.05). SD: standard deviation; CV: coefficient of variation; MSP: multispectral; PAN: panchromatic; TIR: thermal infrared.

Sensor | Data Type | Min. | Max. | Mean | SD | CV (%)
Projected area (m2)
— | Measured | 0.12 | 0.73 | 0.46 a,b | 0.15 | 33.46
BLK360 | TLS | 0.20 | 0.67 | 0.51 a | 0.14 | 28.07
ALTUM-PT | MSP | 0.02 | 1.02 | 0.44 a,b | 0.23 | 52.57
ALTUM-PT | PAN | 0.10 | 0.82 | 0.47 a,b | 0.19 | 39.99
Zenmuse H20T | RGB | 0.01 | 0.53 | 0.27 c | 0.15 | 54.04
Zenmuse H20T | TIR | 0.01 | 0.51 | 0.28 c | 0.16 | 57.83
Zenmuse L1 | RGB | 0.07 | 0.69 | 0.41 a,b | 0.18 | 42.93
Zenmuse L1 | LiDAR | 0.01 | 0.65 | 0.39 b | 0.18 | 46.82
Canopy volume (m3)
BLK360 | TLS | 0.055 | 0.372 | 0.232 a | 0.097 | 41.66
ALTUM-PT | MSP | 0.004 | 0.573 | 0.184 a,b | 0.124 | 67.45
ALTUM-PT | PAN | 0.016 | 0.611 | 0.251 a | 0.135 | 53.80
Zenmuse H20T | RGB | 0.003 | 0.521 | 0.203 a,b | 0.125 | 61.34
Zenmuse H20T | TIR | 0.001 | 0.286 | 0.136 b | 0.085 | 62.73
Zenmuse L1 | RGB | 0.013 | 0.473 | 0.238 a | 0.123 | 51.46
Zenmuse L1 | LiDAR | 0.002 | 0.445 | 0.232 a | 0.124 | 53.28
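The projected areas and canopy volumes in Table 3 were extracted from the segmented 3D point clouds using the alpha-shape approach of Edelsbrunner and Mücke [73]. As a simpler, self-contained stand-in (explicitly not the authors' method), a voxel-grid approximation counts occupied XY cells for projected area and occupied 3D voxels for volume; the `cell` size and the synthetic box-shaped canopy below are illustrative assumptions:

```python
import numpy as np

def voxel_area_volume(points, cell: float = 0.05):
    """Voxel-grid approximation of canopy footprint and volume from an
    (N, 3) point cloud: projected area = occupied XY cells * cell^2,
    volume = occupied 3D voxels * cell^3 (cell size in metres)."""
    idx = np.floor(np.asarray(points, dtype=float) / cell).astype(int)
    n_xy_cells = len(np.unique(idx[:, :2], axis=0))
    n_voxels = len(np.unique(idx, axis=0))
    return n_xy_cells * cell**2, n_voxels * cell**3

# Sanity check on a synthetic 1.0 m x 0.5 m x 1.2 m box-shaped "canopy"
rng = np.random.default_rng(1)
pts = rng.uniform([0.0, 0.0, 0.0], [1.0, 0.5, 1.2], size=(20000, 3))
area, volume = voxel_area_volume(pts)
print(area, volume)  # close to 0.5 m^2; slightly under 0.6 m^3
```

Unlike a convex hull, voxel counting does not bridge concave gaps in the canopy, but it is sensitive to point density — sparse clouds leave voxels empty, which is consistent with the lower volumes observed here for the low-density TIR data.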
Table 4. Statistical parameters (correlation coefficient (r), coefficient of determination (R2), root mean square error (RMSE)) of the maximum height and area of the measured grapevines with metrics derived from each sensor. TLS: terrestrial laser scanner; TIR: thermal infrared.

Sensor | Data Type | Height r | Height R2 | Height RMSE (m) | Area r | Area R2 | Area RMSE (m2)
BLK360 | TLS | 0.95 | 0.90 | 0.027 | 0.86 | 0.74 | 0.042
ALTUM-PT | Multispectral | 0.83 | 0.70 | 0.084 | 0.79 | 0.63 | 0.072
ALTUM-PT | Panchromatic | 0.91 | 0.83 | 0.025 | 0.87 | 0.76 | 0.042
Zenmuse H20T | RGB | 0.91 | 0.83 | 0.081 | 0.82 | 0.66 | 0.165
Zenmuse H20T | TIR | 0.76 | 0.58 | 0.147 | 0.82 | 0.67 | 0.143
Zenmuse L1 | RGB | 0.89 | 0.79 | 0.038 | 0.95 | 0.89 | 0.048
Zenmuse L1 | LiDAR | 0.87 | 0.76 | 0.129 | 0.82 | 0.67 | 0.068
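Table 4 relates each sensor's estimates to the field-measured reference through Pearson's r, the coefficient of determination R2 (which equals r² for a simple linear fit), and RMSE. The study ran its statistics in R [76,77]; purely to illustrate the metrics (with invented height values, and RMSE taken here as the root mean square of the paired differences, which may differ from the paper's exact definition), a dependency-free Python sketch:

```python
import math

def agreement(x, y):
    """Pearson r, R^2 (= r^2 for simple linear regression), and RMSE of
    paired reference values x vs sensor-derived estimates y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    r = sxy / math.sqrt(sxx * syy)
    rmse = math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)) / n)
    return r, r * r, rmse

# Invented grapevine heights (m): reference vs a hypothetical sensor
measured = [1.24, 1.45, 1.57, 1.66, 1.81]
estimated = [1.26, 1.43, 1.60, 1.63, 1.85]
r, r2, rmse = agreement(measured, estimated)
print(round(r, 3), round(r2, 3), round(rmse, 3))
```

Reading the table through these metrics: a high r with a large RMSE (e.g., the L1 LiDAR height) means the sensor tracks relative differences between vines well but carries a systematic offset from the measured values.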
Table 5. Correlation results from other published studies addressing grapevine geometric parameters from point cloud data. TLS: terrestrial laser scanner; MTLS: mobile terrestrial laser scanner; UAV: unmanned aerial vehicle; H90: 90th percentile of height; LAI: leaf area index; r: correlation coefficient; R2: coefficient of determination; RMSE: root mean square error.

Study | Parameter | Reference Measurement | Results
Escolà et al. [67] | Height | MTLS | r: 0.79; R2: 0.62
Escolà et al. [67] | H90 | MTLS | r: 0.83; R2: 0.69
Llorens et al. [50] | Height | Field measurements | R2: 0.23
Llorens et al. [50] | Volume | Ultrasonic sensor | R2: 0.56
Llorens et al. [50] | LAI | — | R2: 0.22
Chakraborty et al. [90] | Height | Field measurements | r: 0.59
Chakraborty et al. [90] | Volume | UAV canopy surface area | r: 0.82 (convex hull); r: 0.75 (voxel grid)
Rinaldi et al. [48] | Height | Field measurements | R2: 0.98
Torres-Sánchez et al. [31] | Height | MTLS | R2: 0.76
Torres-Sánchez et al. [31] | Area | MTLS | R2: 0.78
Torres-Sánchez et al. [31] | Volume | MTLS | R2: 0.73 (convex hull); R2: 0.85 (2.5D volume)
Pagliai et al. [34] | Height | MTLS | R2: 0.80; RMSE: 0.124 m
Pagliai et al. [34] | Volume | MTLS | R2: 0.78; RMSE: 0.057 m3
Buunk et al. [91] | Volume | UAV LiDAR | R2: 0.70
Petrović et al. [44] | Volume | MTLS and UAV | R2: 0.92
Caruso et al. [52] | Height | Field measurements | R2: 0.61
Caruso et al. [52] | Volume | Field measurements | R2: 0.75
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Ferreira, L.; Sousa, J.J.; Lourenço, J.M.; Peres, E.; Morais, R.; Pádua, L. Comparative Analysis of TLS and UAV Sensors for Estimation of Grapevine Geometric Parameters. Sensors 2024, 24, 5183. https://doi.org/10.3390/s24165183
