Article

Unoccupied-Aerial-Systems-Based Biophysical Analysis of Montmorency Cherry Orchards: A Comparative Study

by Grayson R. Morgan * and Lane Stevenson
Department of Geography, Brigham Young University, Provo, UT 84602, USA
*
Author to whom correspondence should be addressed.
Drones 2024, 8(9), 494; https://doi.org/10.3390/drones8090494
Submission received: 31 August 2024 / Revised: 10 September 2024 / Accepted: 17 September 2024 / Published: 18 September 2024
(This article belongs to the Special Issue Recent Advances in Crop Protection Using UAV and UGV)

Abstract
With the global population on the rise and arable land diminishing, the need for sustainable and precision agriculture has become increasingly important. This study explores the application of unoccupied aerial systems (UAS) in precision agriculture, specifically focusing on Montmorency cherry orchards in Payson, Utah. Despite the widespread use of UAS for various crops, there is a notable gap in research concerning cherry orchards, which present unique challenges due to their physical structure. UAS data were gathered using an RTK-enabled DJI Mavic 3M, equipped with both RGB and multispectral cameras, to capture high-resolution imagery. This research investigates two primary applications of UAS in cherry orchards: tree height mapping and crop health assessment. We also evaluate the accuracy of tree height measurements derived from three UAS data processing software packages: Pix4D, Drone2Map, and DroneDeploy. Our results indicated that DroneDeploy provided the closest relationship to ground truth data with an R2 of 0.61 and an RMSE of 31.83 cm, while Pix4D showed the lowest accuracy. Furthermore, we examined the efficacy of RGB-based vegetation indices in predicting leaf area index (LAI), a key indicator of crop health, in the absence of more expensive multispectral sensors. Twelve RGB-based indices were tested for their correlation with LAI, with the IKAW index showing the strongest correlation (R = 0.36). However, the overall explanatory power of these indices was limited, with an R2 of 0.135 in the best-fitting model. Despite the promising results for tree height estimation, the correlation between RGB-based indices and LAI was underwhelming, suggesting the need for further research.

1. Introduction

With the world’s population continuing to grow and access to arable land declining [1,2], the importance of sustainable and precision agriculture continues to rise [3]. As measured from 2008, researchers predicted that increases in population, income, and food consumption would drive agriculture and food demand up by 70% by the year 2050 [2]. Farmers must use more water to irrigate their land [1,4,5,6,7], spend more money on energy, electricity, and fuel, and distribute substantial portions of their crops towards industrial uses [2]. If current agricultural practices continue without improvement, we should expect to see a higher percentage of yield failure, a lack of watershed recharge, an increase in pollution due to over-fertilization, and an increase in malnourishment across under-developed and developing regions [2].
Beginning with the first civilian experiments in 1979, a fascination with unoccupied aerial systems (UAS) has led to numerous uses across disciplines and applications, especially in agriculture [8,9,10,11,12]. Most recently, UAS have found a foothold in the world of precision agriculture. Since the 1980s, the overarching goal of precision agriculture (and sustainable agriculture as a whole) has been to measure and balance physical, biological, socioeconomic, and cultural determinants in a way that effectively uses beneficial inputs to maximize product output (yield) and reduce the introduction of harmful outputs like pollution [3,13]. The practice of precision agriculture started with fertilizer management across varying soil types, eventually leading to the use of autonomous farm vehicles and immediate, site-specific management practices [9]. UAS have quickly become an essential component of precision agriculture because they provide access to views that were previously inaccessible [14,15,16]. The ability of UAS to capture data at a diversity of spatial and temporal resolutions, and with simple or complex sensors, intrigues researchers and practitioners alike [17,18]. UAS offer a scale at which to view an area of interest that is unique within remote sensing, distinct from other aerial photography and satellite data collection. Flexible data-gathering techniques allow farmers to obtain imagery within a few short hours and at the times most convenient or important to the stakeholders of a project. UAS can also carry a significant array of sensors, from LiDAR pucks to hyperspectral sensors to off-the-shelf RGB cameras, each with a unique application in different environments. Compared to other remote sensing data options, such as satellite data, UAS provide timelier, higher-resolution data for maintaining farms and orchards.
The advantages of UAS use in agricultural systems are widely accepted, but further investigation into individual crops is important because each crop has different biophysical characteristics that can impact effective data collection practices [19,20]. UAS use in precision agriculture is well documented for most crops, including maize, soybeans, wheat, and rice [21,22,23,24]. Most fruit crops are similarly well documented, with the exception of sweet and tart cherry orchards. In a comprehensive survey on UAS use for remote sensing in orchards, Zhang et al. (2021) identified olive and citrus orchards as the most studied across Spain, the US, and China [25]. Of the 84 studies Zhang et al. investigated, none examined sweet or tart cherries. There has been extensive research on apple, hazelnut, and almond orchards across China, the Mediterranean, and the US, but not on cherry orchards [11,26,27,28]. Cherry orchards have a unique physical structure that presents challenges to height mapping that other orchards, such as apple orchards, do not experience. The woody, misshapen branches are difficult to map accurately and to reconstruct as 3D models during photogrammetric processing, and thus require specific study. Furthermore, despite generally growing in a rounded shape, the branch structure varies greatly from tree to tree and can be difficult to see through for a robust 3D model.
Two specific applications of UAS are meaningful to successful crop management in cherry orchards. First, structure from motion (SfM)-derived point clouds and digital terrain or digital surface models (DTM or DSM) are a significant output of the photogrammetric processing used to generate drone mapping projects [29,30]. There is significant value in ascertaining true tree height throughout an orchard, as tree height is strongly related to the yield a crop produces; yet there is no consensus on the accuracy of UAS for obtaining cherry orchard tree heights. Other studies have thoroughly examined other orchard crops, but cherry orchards have been overlooked [31,32,33]. Furthermore, several UAS data processing software packages claim to offer survey-grade, highly accurate 3D data from UAS images. This study investigates the accuracy of SfM products for cherry orchard tree height mapping from three UAS data processing software packages to assess the viability of orchard tree height mapping and the value of each processing tool.
Second, vegetation indices can be created from UAS-derived orthomosaics to assess crop health. Leaf area index (LAI), a metric describing the ratio of leaf area in the trees to ground area that correlates with quality fruit production, has shown a strong positive correlation with several vegetation indices that incorporate near-infrared wavelengths of light [34,35,36,37,38]. However, not all farmers can obtain UAS with multispectral sensors due to cost, and many rely on off-the-shelf RGB cameras to provide information on their crops. In response, RGB-based vegetation indices have been devised and tested to assess various biophysical characteristics of vegetation such as biomass and chlorophyll content [39,40,41]. LAI has been shown to have a strong relationship with RGB-based indices as well as with near-infrared (NIR)-based indices, such as the normalized difference vegetation index (NDVI). This study aims to identify opportunities for effectively operating UAS in Montmorency cherry orchards in rural Payson, Utah, by addressing the following research questions:
  • How accurately can an RTK-enabled UAS evaluate Montmorency cherry tree height in a sloped cherry orchard?
  • Which UAS imagery processing software out of DroneDeploy, Drone2Map, and Pix4D provides the most accurate product, and are the results statistically different?
  • How well can RGB-based indices predict Montmorency cherry tree LAI?

2. Materials and Methods

2.1. Study Area

Located near the border of Utah and Juab counties, Payson and nearby Santaquin, Utah, are known for their proximity to large fruit orchards (Figure 1). Payson sits in a humid continental climate zone in north-central Utah; the climate is fairly dry, with warm summers (upwards of 90 degrees Fahrenheit on average) and snowy, cold winters (as low as 35 degrees Fahrenheit on average). A city of 21,101 people according to the 2020 census, Payson is known as a rural, family-friendly area with extensive fruit orchards.
Many of the orchards in this region, including the one within the area of interest, are Montmorency cherry orchards. Montmorency cherry trees are the most popular cherry trees in America for pies and preserves. The Montmorency is a cold-hardy hybrid that ripens early in the season. The trees produce bright red, medium-large fruit with firm yellow flesh and a rich, tart, tangy flavor. The trees bloom in late spring with clusters of white flowers, and the fruit ripens in late June. The trees are self-fertile, but having multiple trees aids in cherry production.

2.2. Field Data Collection

Montmorency cherry tree height and leaf area index (LAI) data were gathered on 26 September 2023 before remote sensing data collection. Tree height data were collected using a tape measure and a ladder. An AccuPAR LP-80 ceptometer (METER Group, Inc., Pullman, WA, USA) was used to obtain LAI data. The sensor’s computer used solar information gathered at four directions beneath each tree canopy to compute LAI. Data collection began at 9:35 a.m. local time (3:35 p.m. UTC) and concluded at 11:15 a.m. local time. The temperature during this period ranged from 14 to 20.5 degrees Celsius, and the humidity decreased over the two-hour window from 53% to 35%. Forty-three (43) trees were selected using a stratified random sampling design so that trees were assessed fairly evenly throughout the orchard grove of interest (Figure 2). Of the 43 trees selected, only 41 were used for analysis because of data corruption.

2.3. UAS Data Collection

UAS data were collected on 26 September 2023, beginning at 1:16 p.m. local time. Flights were planned in the DJI mission planner installed on the DJI smart remote and flown with an RTK-enabled DJI Mavic 3M using a crossed pattern for effective 3D mapping [42]. The DJI Mavic 3M possesses a 20 MP RGB visible-wavelength camera with a 4/3 CMOS sensor. It also has a 5 MP multispectral camera with a 1/2.8-inch CMOS sensor that gathers data in the green (560 ± 16 nm), red (650 ± 16 nm), red edge (730 ± 16 nm), and NIR (860 ± 26 nm) wavelengths. The Mavic 3M weighs 951 g, with a maximum speed of 15 m/s. Maximum flight time is estimated at 43 min per battery in windless conditions. The drone came equipped with an onboard global navigation satellite system (GNSS) in addition to an RTK module on the top of the UAS (Figure 3B).
Flights were purposefully conducted around solar noon, which was 1:00 p.m. local time (7:00 p.m. UTC), to limit shadows between tree canopies that could confuse the creation of the 3D modeling products. Approximately 1.75 h were needed to collect data covering the 3.6-hectare orchard region of interest. Flight parameters were set to 80% side and 80% front overlap to ensure adequate coverage for successful 3D model creation. The 45 m flight altitude allowed us to generate a 2.5 cm pixel size in processing. The RTK system of the drone was connected to the UNAVCO NTRIP service for high-level GPS accuracy of 1.1 cm for the duration of the flights. This resulted in highly accurate GNSS positioning being tagged onto the collected RGB and multispectral images, with 0.02 m horizontal and 0.03 m vertical accuracy after connecting with 32 satellites, similar to the accuracies found in [43]. Ground control points (GCPs) were not collected, in order to assess the accuracy of the RTK GNSS onboard the drone.
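For readers who want to relate flight altitude to achievable pixel size, the nadir ground sample distance (GSD) follows from the standard pinhole relationship: GSD = (sensor width × altitude)/(focal length × image width in pixels). The short Python sketch below illustrates the calculation; the sensor width, focal length, and image width are assumed nominal values for a 20 MP 4/3 CMOS camera, not confirmed Mavic 3M specifications, so substitute the exact figures from the camera's spec sheet.

```python
# Sketch: approximate ground sample distance (GSD) for a nadir flight.
# The camera figures below are assumed nominal values (4/3" sensor
# ~17.3 mm wide, ~5280 px image width, ~12.3 mm focal length).

def ground_sample_distance(sensor_width_mm: float,
                           focal_length_mm: float,
                           altitude_m: float,
                           image_width_px: int) -> float:
    """Return the per-pixel ground footprint in metres."""
    return (sensor_width_mm / 1000.0) * altitude_m / (
        (focal_length_mm / 1000.0) * image_width_px)

gsd = ground_sample_distance(17.3, 12.3, 45.0, 5280)
print(f"GSD at 45 m: {gsd * 100:.1f} cm/px")  # roughly 1.2 cm/px
```

Under these assumed values, a 45 m flight yields a native GSD of just over 1 cm, which is consistent with the 2.5 cm processing resolution chosen here being a deliberate coarsening for cross-software comparability.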

2.4. Data Processing

Drone images were processed using three data-processing software packages to test the claims of each to provide highly accurate photogrammetric products from drone imagery using what are frequently referred to as robust structure from motion (SfM) processing algorithms. The three software packages assessed within this manuscript were selected for assessment based on their frequent use, as revealed in conversations at research conferences and other meetings. Many remote sensing faculty and practitioners within the drone space recommended these three software packages: Pix4D 4.0.9, DroneDeploy 2024, and Drone2Map 2024.1.0.

2.4.1. Pix4D

Pix4DMapper is a commonly used photogrammetric software package with a suite of other drone-related applications as appendages. Pix4DMapper is the desktop version, offering flexibility with processing parameters. A self-declared market leader in photogrammetric software technology, Pix4D is meant to be used with nearly any collected imagery, including from a UAS [44]. Pix4D boasts survey-grade results and the opportunity to completely control the workflow and outputs of any photogrammetric products. The Pix4D definition of survey-grade results includes “sub-centimetre accuracy thanks to photogrammetric analysis”, including 1–2 pixels GSD horizontally and 1–3 pixels GSD vertically [44]. Pix4D makes particular mention of its connection to research since its founding in 2011.
In processing data in Pix4DMapper version 4.0.9, 744 images were input into a general-workflow 3D mapping project. Parameters were generally left at the default settings, but pixel size was set to 2.5 cm so that maps from each processing software package would be comparable, though the imagery could have supported a pixel size of just over 1 cm. Processing, from the initial steps through SfM point cloud creation and orthomosaic generation, took 6 h and 24 min. Data products were placed in the project folder on the machine where the Pix4D desktop application was installed.

2.4.2. Drone2Map

Drone2Map is the final of the three drone data processing suites investigated. Drone2Map is an Environmental Systems Research Institute, Inc. (ESRI) product that works seamlessly with the ArcGIS Pro suite of GIS tools and carries the same intuitive graphical user interface design [45]. Like Pix4D, Drone2Map is a desktop application with 2D and 3D photogrammetric processing capabilities. Drone2Map does not require an internet connection, meaning data can be gathered and processed in the field immediately after collection without returning to the office. It also provides several parameters that can be adjusted to increase processing accuracy and/or efficiency. An advantage of Drone2Map is the opportunity to process large datasets on a desktop and then share the results quickly with a team via the ArcGIS Online tools [45]. The smooth integration of drone analysis with significant GIS tools makes Drone2Map attractive to many drone pilots.
Remote sensing data were added to the Drone2Map version 2024.1.0 project, and project parameters were adjusted to match the Pix4D project, which was completed first. Most parameters remained at the default settings, but pixel size was adjusted to match Pix4D (2.5 cm). Drone2Map provided the same series of 2D and 3D products as DroneDeploy and Pix4D. The combined processing time for all elements of data processing was 13 h and 36 min.

2.4.3. DroneDeploy

DroneDeploy is a cloud-based photogrammetric data-processing software. Just like Pix4D, it has been designed to accommodate imagery from several sources but works well with UAS imagery. DroneDeploy operates differently from both Pix4D and Drone2Map in that DroneDeploy requires the uploading of images and processing is completed in the cloud [46]. DroneDeploy does not require software installation at all but also limits the user’s ability to adjust processing parameters. DroneDeploy makes no declarative statements regarding processing accuracy or “survey-grade” results but makes the case for ease of use and efficiency. While not completely a “black box” approach, DroneDeploy is excellent for drone operators and practitioners who do not desire to manipulate the software parameters and wish to allow DroneDeploy to determine the best options [46].
To process the same orchard data in DroneDeploy, the 744 RGB images took 1 h and 15 min to upload as a new project. Once uploaded, processing began immediately, and a total of 8 h and 15 min passed before an email arrived stating that processing was complete. Once processing is complete, the user can access a myriad of data products from DroneDeploy, including point clouds, DEMs, DSMs, orthomosaics, and vegetation index maps. Data were processed at the default parameters, including the default highest spatial resolution possible, which was 1.27 cm. These products, however, had to be downloaded from the DroneDeploy website as zipped folders.

2.4.4. Comparison

The accuracy of DroneDeploy, Pix4D, and Drone2Map was assessed using the root mean squared error (RMSE) statistic, comparing the tree heights gathered in the field to the heights estimated by the 3D point clouds generated by each of the software packages. RMSE is calculated using Equation (1) below:
$\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} (x_i - \hat{x}_i)^2}$,
where $N$ is the total number of observations, $x_i$ is the observed tree height, and $\hat{x}_i$ is the tree height estimated by each software package. After the observations were compared to the in situ data, the three datasets were compared using ANOVA and independent t-tests in IBM SPSS Statistics (version 29.0.1.1, IBM, Chicago, IL, USA) to determine whether there were significant differences between the three datasets.
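As a concrete illustration of this comparison pipeline, the following Python sketch computes Equation (1) along with R-squared, then runs the same family of tests with SciPy in place of SPSS. The tree height arrays are synthetic placeholders generated for the example, not the study's data.

```python
# Sketch of the accuracy comparison: RMSE (Equation (1)), R-squared,
# one-way ANOVA, and pairwise two-tailed t-tests, using SciPy in place
# of SPSS. All height values below are synthetic placeholders.
import numpy as np
from scipy import stats

def rmse(observed: np.ndarray, estimated: np.ndarray) -> float:
    """Root mean squared error, Equation (1)."""
    return float(np.sqrt(np.mean((observed - estimated) ** 2)))

rng = np.random.default_rng(0)
field = rng.uniform(2.5, 4.5, size=41)      # stand-in for 41 measured heights (m)
modeled = {                                  # stand-ins for each software's heights
    "DroneDeploy": field + rng.normal(0, 0.32, 41),
    "Drone2Map":   field + rng.normal(0, 0.33, 41),
    "Pix4D":       field + rng.normal(0, 0.34, 41),
}

for name, est in modeled.items():
    fit = stats.linregress(field, est)
    print(f"{name}: RMSE = {rmse(field, est):.3f} m, R2 = {fit.rvalue ** 2:.2f}")

# One-way ANOVA across the three modeled-height datasets...
f_stat, p_anova = stats.f_oneway(*modeled.values())
# ...followed by pairwise two-tailed independent t-tests.
for a, b in [("DroneDeploy", "Drone2Map"), ("Drone2Map", "Pix4D"), ("DroneDeploy", "Pix4D")]:
    t_stat, p = stats.ttest_ind(modeled[a], modeled[b])
    print(f"{a} vs {b}: p = {p:.3f}")
```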

2.5. Biophysical Characteristics Estimation

2.5.1. Canopy Height Modeling

Canopy height modeling was conducted in ArcGIS Pro 3.1 for each of the three datasets generated by the three UAS data processing software packages described in Section 2.4. Canopy height models (CHM) are created by subtracting the DTM from the DSM, as seen in Equation (2) below:
CHM = DSM − DTM.
Point cloud datasets in .las format were first obtained from each data processing software package. In ArcGIS Pro, each point cloud was classified into ground and non-ground points using the Classify LAS Ground tool if it had not already been classified by the SfM processing software (Figure 4). After classification, the ground points were extracted and used to create the 5 cm DTM representing the bare earth surface. The highest LAS points within each 5 cm pixel of a newly created raster were used to generate the DSM, which represents the tallest surface in each pixel. Each of these was created using the default parameters of the LAS Dataset to Tiled Raster tool. The DSM raster was snapped to match the DTM raster exactly during processing. Subtracting the two rasters produced a CHM representing the height of the tallest object in each pixel. Tree heights were extracted using digitized polygons representing the canopies of the trees of interest. Using the Zonal Statistics as Table tool in ArcGIS Pro, the maximum height found within each tree canopy polygon was assigned as the estimated tree height.
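Outside of ArcGIS Pro, the same raster arithmetic and zonal extraction can be reproduced with open-source Python tools. The sketch below assumes the DSM and DTM have already been exported as aligned 5 cm GeoTIFFs and the digitized canopies as a polygon file; all file names are hypothetical placeholders.

```python
# Sketch of Equation (2) and the per-canopy height extraction with
# rasterio and rasterstats; file names are hypothetical placeholders.
import numpy as np
import rasterio
from rasterstats import zonal_stats

with rasterio.open("dsm_5cm.tif") as dsm_src, rasterio.open("dtm_5cm.tif") as dtm_src:
    dsm = dsm_src.read(1).astype("float32")
    dtm = dtm_src.read(1).astype("float32")
    profile = dsm_src.profile            # assumes both rasters are snapped/aligned

chm = dsm - dtm                          # Equation (2): CHM = DSM - DTM
chm = np.where(chm < 0, 0, chm)          # clamp small below-ground artifacts

profile.update(dtype="float32", count=1)
with rasterio.open("chm_5cm.tif", "w", **profile) as dst:
    dst.write(chm, 1)

# Maximum CHM value inside each digitized canopy polygon, mirroring the
# Zonal Statistics as Table step in ArcGIS Pro.
heights = zonal_stats("tree_canopies.shp", "chm_5cm.tif", stats=["max"])
tree_heights = [h["max"] for h in heights]
```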
Unlike LiDAR point clouds that can penetrate tree canopy to provide multiple returns, the SfM-derived point clouds from drone imagery can only create a 3D point cloud from what can be seen. This is why images captured at oblique angles can provide a more robust 3D model. However, there were still some areas underneath the tree canopy where the DTM required more interpolation between known points. This can cause uncertainty that may impact true tree canopy height.

2.5.2. Modeling LAI

LAI has been shown to have a relationship with vegetation indices and certain wavelengths of light when recorded using remote sensing instruments. One goal of this study was to investigate the relationship between LAI of Montmorency cherry trees and RGB-based vegetation indices. To accomplish this, RGB-based index maps were created from the orthomosaics generated from the data processing in Pix4D 4.0.9. These were calculated using the ArcGIS Pro Raster Calculator tool that then created a new layer for each RGB-based index. A total of 12 RGB indices were tested that have a strong reputation for their relationships with vegetation biophysical characteristics. The RGB-based indices, their equations, and source material are found in detail in Table 1. RGB-based indices are growing in popularity as UAS become more accessible, especially with affordable off-the-shelf RGB cameras attached [47,48,49,50,51,52,53].
Once each new raster was created, the values for each index within the tree canopy polygons were extracted using the zonal statistics as table tool in ArcGIS Pro. The maximum statistic was then investigated with the LAI data captured in the field to determine significant correlations between each index and LAI. After correlation analysis, stepwise linear regression was run in IBM SPSS to determine the best model for estimating LAI using an RGB-based vegetation index.
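For reference, the index calculation itself is simple band arithmetic, and the ArcGIS Pro Raster Calculator step can be mirrored in Python. The sketch below computes a handful of the Table 1 indices from the RGB bands of an orthomosaic; the file name is hypothetical, and the bands are assumed to be ordered red, green, blue.

```python
# Sketch of a few Table 1 RGB-based indices computed with NumPy from an
# orthomosaic; the file name is a hypothetical placeholder.
import numpy as np
import rasterio

with rasterio.open("orchard_orthomosaic.tif") as src:
    r, g, b = (src.read(i).astype("float32") for i in (1, 2, 3))

eps = 1e-6  # guard against division by zero over empty pixels
indices = {
    "ExG":   2 * g - r - b,
    "IKAW":  (r - b) / (r + b + eps),
    "MGRVI": (g**2 - r**2) / (g**2 + r**2 + eps),
    "VARI":  (g - r) / (g + r - b + eps),
    "GRRI":  g / (r + eps),              # Green-Red Ratio Index
}
# Each array can then be written out as its own raster layer and sampled
# with zonal statistics, as in the canopy height workflow above.
```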

3. Results

3.1. Modeled Tree Heights

Tree height data were estimated from canopy height models derived from the datasets created by each of the three software packages, as shown in Figure 5. The DTMs showed a gradual decline in elevation to the north, and the CHM clearly represents each individual tree crown in the orchard. All three CHMs were symbolized the same way and extracted to present the estimated tree heights.
Tree height estimations from each of the three datasets, along with the true height values, are represented in Figure 6. Across the 41 tree height values used, there were frequent over- and under-estimations. No clear patterns were discerned: in some instances the estimated values were within a few centimeters of the true values, while others showed more substantial overestimations of height. To evaluate the correlation between the true, measured heights of the orchard trees and the modeled tree heights, scatterplots of each of the three modeled heights were created, and the R-squared metric was used to determine the best fit. The DroneDeploy-derived data and modeled heights provided the closest relationship to the ground truth data, with an R2 of 0.61, while Pix4D and Drone2Map were not far behind, with R2 values of approximately 0.55. Each of the R-squared values indicates a medium positive correlation between the modeled heights and the true heights.
The closest dataset to the true values, however, was the DroneDeploy dataset. The RMSE for each dataset is displayed in Table 2. All three datasets resulted in RMSE values of over 30 cm, ranging from 31.83 cm to 34.03 cm.
The 2.2 cm spread between the three datasets’ RMSE values is small but deserves further investigation. An ANOVA test produced a statistically significant result, requiring rejection of the null hypothesis that all three datasets have equal means. After running three distinct two-tailed t-tests, we determined that while there was no significant difference (p > 0.05) between the means of the DroneDeploy and Drone2Map datasets or the Drone2Map and Pix4D modeled heights, there was a significant difference between the means of the Pix4D modeled heights and the DroneDeploy modeled heights (Table 3). Therefore, we conclude that DroneDeploy and Drone2Map produced better results than Pix4D for the purposes of this study.

3.2. LAI Modeling with RGB-Based Vegetation Indices

When comparing LAI data to RGB-based vegetation indices, the results were underwhelming. Figure 7 is a correlation matrix that represents the correlations between LAI and each RGB-based index. The IKAW index had the strongest correlation, with LAI at R = 0.36. The next closest indices were the RB Ratio and MGRVI index, with correlations at just over R = 0.20.
Despite the generally underwhelming relationships between LAI and the RGB-based indices, the LAI data were used with all 12 indices in a stepwise linear regression model in IBM SPSS to determine the best-fit model for predicting LAI with the available RGB-based indices. Due to multicollinearity and the small impact of each index other than IKAW on LAI values, the stepwise model removed all variables except the IKAW index and produced an R-squared of 0.135. The IKAW variable in the equation was statistically significant at p < 0.05. However, an R-squared of 0.135 does not carry enough explanatory power to be meaningful for prediction.
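To make the procedure concrete, the sketch below reproduces the correlation screen and a simple forward-stepwise ordinary least squares selection with pandas and statsmodels in place of SPSS. The data frame is filled with synthetic stand-in values, one row per tree, not the study's measurements.

```python
# Sketch: correlation of LAI with RGB indices and a forward-stepwise OLS
# selection, using synthetic stand-in data (one row per tree).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
df = pd.DataFrame(rng.normal(size=(41, 3)), columns=["IKAW", "MGRVI", "VARI"])
df["LAI"] = 0.4 * df["IKAW"] + rng.normal(0, 1, size=41)   # weak synthetic signal

print(df.corr()["LAI"].drop("LAI"))          # Pearson correlation screen

def forward_stepwise(data, target, candidates, enter_p=0.05):
    """Add the most significant remaining predictor until none pass enter_p."""
    selected = []
    while True:
        pvals = {}
        for c in candidates:
            if c in selected:
                continue
            X = sm.add_constant(data[selected + [c]])
            pvals[c] = sm.OLS(data[target], X).fit().pvalues[c]
        if not pvals:
            return selected
        best = min(pvals, key=pvals.get)
        if pvals[best] >= enter_p:
            return selected
        selected.append(best)

chosen = forward_stepwise(df, "LAI", ["IKAW", "MGRVI", "VARI"])
final = sm.OLS(df["LAI"], sm.add_constant(df[chosen])).fit()
print(chosen, f"R2 = {final.rsquared:.3f}")
```

With multicollinear predictors, a forward-stepwise routine of this kind will typically retain only the strongest index, mirroring how the SPSS model kept IKAW alone.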

4. Discussion

This study investigated the use of UAS for measuring Montmorency cherry tree heights and relating RGB-based vegetation indices to each tree’s leaf area index. This is the first time UAS have been used to investigate cherry orchard tree heights and LAI, though success has been found in other types of orchards and in forests [25,28,54,55,56]. One purpose of this study was to assess the use of an onboard RTK receiver for 3D mapping with UAS. RTK receivers are often used to collect GCPs on the ground for georeferencing during data processing but have found their way onto the UAS itself to provide highly accurate data collection; such UAS are often marketed as not requiring GCPs. Recently, RTK receivers on UAS have become more common as their costs have fallen. In most previous studies, however, an RTK system was not used on the drone itself for highly accurate image geotagging during data collection. For example, in an apple orchard, Hobart et al. found an error of up to 23 cm in height evaluations of extracted heights from photogrammetric point clouds when compared with ground LiDAR, which they considered more accurate [31]. In the Hobart et al. study, UAS data were georeferenced during data processing using ground control points collected with an RTK GNSS system in the field. When apple tree heights were assessed with LiDAR captured from a UAS, Hadas et al. found an RMSE of 9 cm, with the largest difference being 24 cm [33]. Hadas et al. used the GNSS receiver on the drone and LiDAR, which has historically provided more accurate representations in 3D products [57,58,59].
Tu et al. investigated tree heights of avocado trees in Australia and reported an RMSE of 50 cm using GCPs gathered with RTK GNSS [32]. The Tu et al. study, in comparison with those that used LiDAR, shows the gap so far in achievable horizontal and vertical accuracy with SfM-derived point clouds. Avocado trees have a structure similar to that of the Montmorency cherry tree, suggesting that accuracy levels might be similar. In another study, Gallardo-Salazar and Pompa-García found that UAS effectively measured tree heights, with an RMSE of 36 cm [60]. The authors did not report the use of GCPs or a UAS onboard RTK GNSS but still achieved relatively high accuracy. Almond trees fared similarly: the RMSE achieved using UAS-imagery-derived point clouds was 39 cm, using 5 RTK GCPs during processing for high-accuracy georeferencing [61]. The RMSE achieved by each SfM processing option in this study for Montmorency cherry trees, from 31 cm to 34 cm, is comparable to the results of these other studies, suggesting cherry trees can be approached with methods similar to those used for other orchard trees.
DroneDeploy, Drone2Map, and Pix4D UAS data processing packages were likewise assessed for their ability to process UAS images and produce accurate, reliable point clouds and 3D data products. These three UAS data processing options vary in cost, consumer audience, and reputation (Table 4).
These are not the first examples of UAS SfM processing software to be compared, however. Kaimaris et al. (2017) compared Imagine UAV from Erdas Imagine and Agisoft Metashape/PhotoScan [62]. The authors found that the two software packages generated products of similar accuracy without much difference. In another investigation, open-source UAS processing software compared favorably to Pix4DMapper [63]. In a comparison of Pix4DMapper, Agisoft Metashape, 3D Survey, and the SURE processing environments, Pix4DMapper was named the “best” of the four software packages according to several metrics, including user-friendliness, parameter control, and time/memory optimization [64]. Overall, however, the authors found that Agisoft Metashape, Pix4DMapper, and 3D Survey produced very similar point clouds and 3D product accuracy. In addition, Fraser and Congalton (2018) found that Agisoft and Pix4D performed well and similarly, with a slight advantage noted for Agisoft Metashape in planimetric mapping [65]. Our study did not assess Agisoft Metashape, but rather Pix4D, the cloud-based DroneDeploy, and the ESRI-owned Drone2Map. We found the accuracy of the SfM results to be quite similar between the three software packages, mirroring other studies of a variety of software packages. In investigating further, however, we discovered a statistically significant difference between the lowest RMSE values, from DroneDeploy, and the highest, from Pix4D. Although the difference was not large, it was significant, indicating better results from DroneDeploy. We suggest that this is due to the processing environment: in DroneDeploy, parameters are not adjustable before processing begins, and it is presumed that maximum parameters are used in a cloud environment. The other two software packages were run on a Dell Inspiron 5680 desktop with a 6-core Intel i7 CPU, 16 GB of RAM, and an Nvidia GTX 1060 3 GB GPU. These specifications are similar to what practitioners may have access to but are limited in comparison to the processing power available to the cloud-based software package, and this may have impacted the accuracies.
Finally, we also assessed LAI using RGB-based vegetation indices. LAI is an important indicator for assessing plant health and growth [34,35,36,37,38] and has historically shown a significant relationship with NIR-based vegetation indices [37,38]. As off-the-shelf UAS RGB cameras have become accessible and popular for precision agriculture, their use for assessing LAI and other metrics has increased, and a growing literature suggests a positive relationship between LAI and RGB-based color indices and textures. Li et al. (2024) found that winter-wheat LAI displayed a significant positive correlation with color indices and a significant negative correlation with texture features [66]. In fact, the winter-wheat LAI was examined in relation to several indices we also tested, including Excess Green and the IKAW index; VARI, GRVI, and MGRVI, three indices also investigated in this study, were selected for model creation. Raj et al. established a positive correlation between maize crop LAI and top-of-canopy RGB image factors in India [67]. Triana-Martinez et al., by contrast, found that rice LAI did not exhibit a positive correlation with RGB-based indices, though it did with NIR-based indices [68]. On the other hand, Yamaguchi et al. (2020) found a strong correlation between rice LAI and color indices [37]. Our study of Montmorency cherry orchards revealed similarly weak relationships, contrary to the positive results for winter wheat and maize crops. Some of the difficulty could be attributed to the vegetative structure of the crops: wheat and maize have similar upright stalks, while cherry trees and rice plants are more complex. Furthermore, temperature and humidity affect leaf inclination, which changed between the LAI data collection and the collection of the RGB vegetation index data; the warmer, drier environment may have weakened the relationship. Despite the disappointing results, future work should continue to investigate the potential of RGB-based vegetation indices and their relationship with LAI, to determine whether a better correlation can be established by adjusting parameters or by using reflectance targets to provide percentage reflectance rather than relying upon raw UAS digital numbers for analysis. Other indices, including NIR-based indices, should be assessed as well.
Future work is also required to expand our understanding of SfM-derived point cloud accuracy. First, we have established tree height mapping results using a UAS-mounted RTK receiver that are similar to those achieved elsewhere using RTK-captured GCPs during processing; further experiments should assess both options in the same environment, as this has yet to be tested. UAS altitude during data collection, as well as lighting conditions and other flight parameters, can and should be assessed as well. Comparing SfM-derived point clouds with UAS-mounted LiDAR would also reveal how SfM compares with the historically more accurate LiDAR data in the same environment, which has not been extensively conducted in a cherry orchard environment.

5. Conclusions

This study has highlighted the significant potential of unoccupied aerial systems (UAS) in enhancing precision agriculture, specifically within Montmorency cherry orchards. By addressing the unique challenges posed by the physical structure of cherry trees, we have demonstrated the viability of UAS for accurate height mapping and crop health assessment. Key findings include the following:
- The use of RTK-enabled UAS has shown promising results in evaluating Montmorency cherry tree height in sloped orchards. Our findings indicate that UAS can provide relatively accurate height measurements (RMSE of roughly 32–34 cm), which are crucial for optimizing yield and managing orchard health.
- Among the three UAS imagery processing software packages tested—DroneDeploy, Drone2Map, and Pix4D—each demonstrated similar degrees of accuracy. Our analysis nevertheless revealed statistically significant differences in performance, with the cloud-based DroneDeploy emerging as the most reliable for height mapping in cherry orchards.
- The study explored the effectiveness of RGB-based vegetation indices in predicting the leaf area index (LAI) for Montmorency cherry trees. While NIR-based indices like NDVI have shown strong correlations with LAI, our research revealed that RGB-based indices did not have a significant correlation with LAI.
The integration of UAS in precision agriculture offers a scalable and efficient approach to managing cherry orchards, potentially leading to improved yield and sustainability. However, further research is needed to refine these techniques and explore their applicability to other crops and environments. Future studies should focus on the following:
- Enhancing the accuracy of RGB-based vegetation indices and their relationship with LAI.
- Comparing onboard RTK receivers with GCP-based georeferencing after data collection in an orchard environment.
- Developing standardized protocols for UAS data collection and processing to produce ideal tree canopy height data.
In conclusion, this study underscores the transformative impact of UAS technology in precision agriculture, paving the way for more-sustainable and productive farming practices. By leveraging advanced remote sensing tools, farmers can make informed decisions that enhance crop management and contribute to global food security.

Author Contributions

Conceptualization, G.R.M.; methodology, G.R.M.; software, G.R.M.; validation, G.R.M.; formal analysis, G.R.M. and L.S.; investigation, G.R.M.; resources, G.R.M. and L.S.; data curation, G.R.M.; writing—original draft preparation, G.R.M. and L.S.; writing—review and editing, G.R.M. and L.S.; visualization, G.R.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Acknowledgments

The authors would like to express gratitude to McMullin Orchards for their kindness in allowing field data and UAS data collection to occur in September 2023. They would also like to acknowledge the generous internal support at BYU.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402. [Google Scholar] [CrossRef]
  2. Wik, M.; Pingali, P.; Brocai, S. Global Agricultural Performance: Past Trends and Future Prospects; World Bank: Washington, DC, USA, 2008. [Google Scholar]
  3. Altieri, M.A. Agroecology: The Science of Sustainable Agriculture, 2nd ed.; Westview Press, Inc.: Boulder, CO, USA, 1995. [Google Scholar]
  4. Berger, K.; Machwitz, M.; Kycko, M.; Kefauver, S.C.; Van Wittenberghe, S.; Gerhards, M.; Verrelst, J.; Atzberger, C.; van der Tol, C.; Damm, A.; et al. Multi-sensor spectral synergies for crop stress detection and monitoring in the optical domain: A review. Remote Sens. Environ. 2022, 280, 113198. [Google Scholar] [CrossRef] [PubMed]
  5. Maes, W.H.; Steppe, K. Perspectives for remote sensing with unmanned aerial vehicles in precision agriculture. Trends Plant Sci. 2019, 24, 152–164. [Google Scholar] [CrossRef] [PubMed]
  6. Omia, E.; Bae, H.; Park, E.; Kim, M.S.; Baek, I.; Kabenge, I.; Cho, B. Remote sensing in field crop monitoring: A comprehensive review of sensor systems, data analyses and recent advances. Remote Sens. 2023, 15, 354. [Google Scholar] [CrossRef]
  7. Sishodia, R.P.; Shukla, S.; Graham, W.D.; Wani, S.P.; Jones, J.W.; Heaney, J. Current and future groundwater withdrawals: Effects, management, and energy policy options for a semi-arid Indian watershed. Adv. Water Resour. 2017, 110, 459. [Google Scholar] [CrossRef]
  8. Przybilla, H.J.; Wester-Ebbinghaus, W. Aerial photos by means of radio-controlled aircraft. Bildmess. Luftbildwes. 1979, 47, 137–142. [Google Scholar]
  9. Gebbers, R.; Adamchuk, V.I. Precision agriculture and food security. Science 2010, 327, 828–831. [Google Scholar] [CrossRef]
  10. Mulla, D.J. Twenty-five years of remote sensing in precision agriculture: Key advances and remaining knowledge gaps. Biosyst. Eng. 2012, 114, 358. [Google Scholar] [CrossRef]
  11. Sun, G.; Wang, X.; Ding, Y.; Lu, W.; Sun, Y. Remote measurement of apple orchard canopy information using unmanned aerial vehicle photogrammetry. Agronomy 2019, 9, 774. [Google Scholar] [CrossRef]
  12. Shahi, T.B.; Xu, C.; Neupane, A.; Guo, W. Recent advances in crop disease detection using UAV and deep learning techniques. Remote Sens. 2023, 15, 2450. [Google Scholar] [CrossRef]
  13. Velten, S.; Leventon, J.; Jager, N.; Newig, J. What is sustainable agriculture? A systematic review. Sustainability 2015, 7, 7833–7865. [Google Scholar] [CrossRef]
  14. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A review on UAV-based applications for Precision Agriculture. Information 2019, 10, 349. [Google Scholar] [CrossRef]
  15. Velusamy, P.; Rajendran, S.; Mahendran, R.K.; Naseer, S.; Shafiq, M.; Choi, J.-G. Unmanned Aerial Vehicles (UAV) in precision agriculture: Applications and challenges. Energies 2021, 15, 217. [Google Scholar] [CrossRef]
  16. Delavarpour, N.; Koparan, C.; Nowatzki, J.; Bajwa, S.; Sun, X. A technical study on UAV characteristics for precision agriculture applications and associated practical challenges. Remote Sens. 2021, 13, 1204. [Google Scholar] [CrossRef]
  17. Singh, K.K.; Frazier, A.E. A meta-analysis and review of Unmanned Aircraft System (UAS) imagery for terrestrial applications. Int. J. Remote Sens. 2018, 39, 5078–5098. [Google Scholar] [CrossRef]
  18. Simpson, J.E.; Holman, F.; Nieto, H.; Voelksch, I.; Mauder, M.; Klatt, J.; Fiener, P.; Kaplan, J.O. High spatial and temporal resolution energy flux mapping of different land covers using an off-the-shelf unmanned aerial system. Remote Sens. 2021, 13, 1286. [Google Scholar] [CrossRef]
  19. Avneri, A.; Aharon, S.; Brook, A.; Atsmon, G.; Smirnov, E.; Sadeh, R.; Abbo, S.; Peleg, Z.; Herrmann, I.; Bonfil, D.J.; et al. UAS-based imaging for prediction of chickpea crop biophysical parameters and yield. Comput. Electron. Agric. 2023, 205, 107581. [Google Scholar] [CrossRef]
  20. Lu, B.; He, Y.; Liu, H.H. Mapping vegetation biophysical and biochemical properties using unmanned aerial vehicles-acquired imagery. Int. J. Remote Sens. 2017, 39, 5265–5287. [Google Scholar] [CrossRef]
  21. dos Santos, R.A.; Filgueiras, R.; Mantovani, E.C.; Fernandes-Filho, E.I.; Almeida, T.S.; Venancio, L.P.; da Silva, A.C. Surface reflectance calculation and predictive models of biophysical parameters of maize crop from RG-nir sensor on board a UAV. Precis. Agric. 2021, 22, 1535–1558. [Google Scholar] [CrossRef]
  22. Maimaitijiang, M.; Ghulam, A.; Sidike, P.; Hartling, S.; Maimaitiyiming, M.; Peterson, K.; Shavers, E.; Fishman, J.; Peterson, J.; Kadam, S.; et al. Unmanned Aerial System (uas)-based phenotyping of soybean using multi-sensor data fusion and Extreme Learning Machine. ISPRS J. Photogramm. Remote Sens. 2017, 134, 43–58. [Google Scholar] [CrossRef]
  23. Hassani, K.; Gholizadeh, H.; Taghvaeian, S.; Natalie, V.; Carpenter, J.; Jacob, J. Application of UAS-based remote sensing in estimating winter wheat phenotypic traits and yield during the growing season. PFG–J. Photogramm. Remote Sens. Geoinf. Sci. 2023, 91, 77–90. [Google Scholar] [CrossRef]
  24. Pazhanivelan, S.; Kumaraperumal, R.; Shanmugapriya, P.; Sudarmanian, N.S.; Sivamurugan, A.P.; Satheesh, S. Quantification of biophysical parameters and economic yield in cotton and rice using drone technology. Agriculture 2023, 13, 1668. [Google Scholar] [CrossRef]
  25. Zhang, C.; Valente, J.; Kooistra, L.; Guo, L.; Wang, W. Orchard management with small unmanned aerial vehicles: A survey of sensing and analysis approaches. Precis. Agric. 2021, 22, 2007–2052. [Google Scholar] [CrossRef]
  26. Guimarães, N.; Sousa, J.J.; Pádua, L.; Bento, A.; Couto, P. Remote sensing applications in almond orchards: A comprehensive systematic review of current insights, research gaps, and future prospects. Appl. Sci. 2024, 14, 1749. [Google Scholar] [CrossRef]
  27. Liu, Z.; Guo, P.; Liu, H.; Fan, P.; Zeng, P.; Liu, X.; Feng, C.; Wang, W.; Yang, F. Gradient boosting estimation of the leaf area index of apple orchards in UAV remote sensing. Remote Sens. 2021, 13, 3263. [Google Scholar] [CrossRef]
  28. Vinci, A.; Brigante, R.; Traini, C.; Farinelli, D. Geometrical characterization of hazelnut trees in an intensive orchard by an unmanned aerial vehicle (UAV) for precision agriculture applications. Remote Sens. 2023, 15, 541. [Google Scholar] [CrossRef]
  29. Kovanič, Ľ.; Blistan, P.; Urban, R.; Štroner, M.; Blišťanová, M.; Bartoš, K.; Pukanská, K. Analysis of the suitability of high-resolution DEM obtained using ALS and UAS (SFM) for the identification of changes and monitoring the development of selected Geohazards in the alpine environment—A case study in high tatras, Slovakia. Remote Sens. 2020, 12, 3901. [Google Scholar] [CrossRef]
  30. Sturdivant, E.; Lentz, E.; Thieler, E.R.; Farris, A.; Weber, K.; Remsen, D.; Miner, S.; Henderson, R. UAS-SFM for coastal research: Geomorphic feature extraction and land cover classification from high-resolution elevation and optical imagery. Remote Sens. 2017, 9, 1020. [Google Scholar] [CrossRef]
  31. Hobart, M.; Pflanz, M.; Weltzien, C.; Schirrmann, M. Growth height determination of tree walls for precise monitoring in Apple Fruit production using UAV photogrammetry. Remote Sens. 2020, 12, 1656. [Google Scholar] [CrossRef]
  32. Tu, Y.-H.; Johansen, K.; Phinn, S.; Robson, A. Measuring canopy structure and condition using multi-spectral UAS imagery in a horticultural environment. Remote Sens. 2019, 11, 269. [Google Scholar] [CrossRef]
  33. Hadas, E.; Jozkow, G.; Walicka, A.; Borkowski, A. Apple Orchard Inventory with a Lidar equipped unmanned aerial system. Int. J. Appl. Earth Obs. Geoinf. 2019, 82, 101911. [Google Scholar] [CrossRef]
  34. Jonckheere, I.; Fleck, S.; Nackaerts, K.; Muys, B.; Coppin, P.; Weiss, M.; Baret, F. Review of methods for in situ leaf area index determination: Part I. Theories, sensors and hemispherical photography. Agric. For. Meteorol. 2004, 121, 19–35. [Google Scholar] [CrossRef]
  35. Garrigues, S.; Lacaze, R.; Baret, F.J.T.M.; Morisette, J.T.; Weiss, M.; Nickeson, J.E.; Fernandes, R.; Plummer, S.; Shabanov, N.V.; Myneni, R.B.; et al. Validation and intercomparison of global Leaf Area Index products derived from remote sensing data. J. Geophys. Res. Biogeosci. 2008, 113, G02028. [Google Scholar] [CrossRef]
  36. Gower, S.T.; Kucharik, C.J.; Norman, J.M. Direct and indirect estimation of leaf area index, fAPAR, and net primary production of terrestrial ecosystems. Remote Sens. Environ. 1999, 70, 29–51. [Google Scholar] [CrossRef]
  37. Yamaguchi, T.; Tanaka, Y.; Imachi, Y.; Yamashita, M.; Katsura, K. Feasibility of combining deep learning and RGB images obtained by unmanned aerial vehicle for leaf area index estimation in Rice. Remote Sens. 2020, 13, 84. [Google Scholar] [CrossRef]
  38. Bajocco, S.; Ginaldi, F.; Savian, F.; Morelli, D.; Scaglione, M.; Fanchini, D.; Raparelli, E.; Bregaglio, S.U. On the use of NDVI to estimate Lai in field crops: Implementing a conversion equation library. Remote Sens. 2022, 14, 3554. [Google Scholar] [CrossRef]
  39. Morgan, G.R.; Wang, C.; Morris, J.T. RGB indices and canopy height modelling for mapping tidal marsh biomass from a small unmanned aerial system. Remote Sens. 2021, 13, 3406. [Google Scholar] [CrossRef]
  40. Gracia-Romero, A.; Kefauver, S.C.; Vergara-Díaz, O.; Zaman-Allah, M.A.; Prasanna, B.M.; Cairns, J.E.; Araus, J.L. Comparative performance of ground vs. aerially assessed RGB and multispectral indices for early-growth evaluation of maize performance under phosphorus fertilization. Front. Plant Sci. 2017, 8, 2004. [Google Scholar] [CrossRef]
  41. Lussem, U.; Bolten, A.; Gnyp, M.L.; Jasper, J.; Bareth, G. Evaluation of RGB-based vegetation indices from UAV imagery to estimate forage yield in grassland. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, XLII–3, 1215–1219. [Google Scholar] [CrossRef]
  42. Mavic 3 Multispectral Edition-See More, Work Smarter–DJI Agricultural Drones. Available online: https://ag.dji.com/mavic-3-m (accessed on 31 August 2024).
  43. Ekaso, D.; Nex, F.; Kerle, N. Accuracy assessment of real-time kinematics (RTK) measurements on unmanned aerial vehicles (UAV) for direct geo-referencing. Geo-Spat. Inf. Sci. 2020, 23, 165–181. [Google Scholar] [CrossRef]
  44. PIX4Dmapper: Professional Photogrammetry Software for Drone Mapping. Available online: https://www.pix4d.com/product/pix4dmapper-photogrammetry-software/ (accessed on 19 August 2024).
  45. GIS Drone Mapping: 2D & 3D Photogrammetry: Arcgis Drone2Map. Available online: https://www.esri.com/en-us/arcgis/products/arcgis-drone2map/overview?srsltid=AfmBOopDU0gPkXGwXiF1wQAM3CmAFwKI4Bmj3mHGGF_RWPudJyYyp5e7#2d-photogrammetry (accessed on 31 August 2024).
  46. Reality Capture: Drone Mapping Software: Photo Documentation. Available online: https://www.dronedeploy.com/ (accessed on 31 August 2024).
  47. Jing, R.; Gong, Z.; Zhao, W.; Pu, R.; Deng, L. Above-bottom biomass retrieval of aquatic plants with regression models and SfM data acquired by a UAV platform—A case study in Wild Duck Lake Wetland, Beijing, China. ISPRS J. Photogramm. Remote Sens. 2017, 134, 122–134. [Google Scholar] [CrossRef]
  48. Yue, J.; Yang, G.; Li, C.; Li, Z.; Wang, Y.; Feng, H.; Xu, B. Estimation of Winter Wheat Above-Ground Biomass Using Unmanned Aerial Vehicle-Based Snapshot Hyperspectral Sensor and Crop Height Improved Models. Remote Sens. 2017, 9, 708. [Google Scholar] [CrossRef]
  49. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Maimaitiyiming, M.; Hartling, S.; Peterson, K.T.; Maw, M.J.W.; Shakoor, N.; Mockler, T.; Fritschi, F.B. Vegetation index weighted canopy volume model (CVMVI) for soybean biomass estimation from unmanned aerial system-based RGB imagery. ISPRS J. Photogramm. Remote Sens. 2019, 151, 27–41. [Google Scholar] [CrossRef]
  50. Cen, H.; Wan, L.; Zhu, J.; Li, Y.; Li, X.; Zhu, Y.; Weng, H.; Wu, W.; Yin, W.; Xu, C.; et al. Dynamic monitoring of biomass of rice under different nitrogen treatments using a lightweight UAV with dual image-frame snapshot cameras. Plant Methods 2019, 15, 32. [Google Scholar] [CrossRef] [PubMed]
  51. Possoch, M.; Bieker, S.; Hoffmeister, D.; Bolten, A.; Schellberg, J.; Bareth, G. Multi-temporal crop surface models combined with the RGB vegetation index from UAV-based images for forage monitoring in grassland. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B1, 991–998. [Google Scholar] [CrossRef]
  52. Michez, A.; Bauwens, S.; Brostaux, Y.; Hiel, M.-P.; Garré, S.; Lejeune, P.; Dumont, B. How far can consumer-grade UAV RGB imagery describe crop production? A 3D and multitemporal modeling approach applied to Zea Mays. Remote Sens. 2018, 10, 1798. [Google Scholar] [CrossRef]
  53. Sunoj, S.; Yeh, B.; Marcaida, M., III; Longchamps, L.; van Aardt, J.; Ketterings, Q.M. Maize grain and silage yield prediction of commercial fields using high-resolution UAS imagery. Biosyst. Eng. 2023, 235, 137–149. [Google Scholar] [CrossRef]
  54. Mo, J.; Lan, Y.; Yang, D.; Wen, F.; Qiu, H.; Chen, X.; Deng, X. Deep learning-based instance segmentation method of litchi canopy from UAV-acquired images. Remote Sens. 2021, 13, 3919. [Google Scholar] [CrossRef]
  55. Qin, H.; Zhou, W.; Yao, Y.; Wang, W. Individual tree segmentation and tree species classification in subtropical broadleaf forests using UAV-based LiDAR, hyperspectral, and ultrahigh-resolution RGB data. Remote Sens. Environ. 2022, 280, 113143. [Google Scholar] [CrossRef]
  56. Zarco-Tejada, P.J.; Diaz-Varela, R.; Angileri, V.; Loudjani, P. Tree height quantification using very high-resolution imagery acquired from an unmanned aerial vehicle (UAV) and automatic 3D photo-reconstruction methods. Eur. J. Agron. 2014, 55, 89–99. [Google Scholar] [CrossRef]
  57. Morgan, G.; Hodgson, M.E.; Wang, C. Using SUAS-derived Point Cloud to supplement lidar returns for improved canopy height model on earthen dams. Pap. Appl. Geogr. 2020, 6, 436–448. [Google Scholar] [CrossRef]
  58. Wallace, L.; Lucieer, A.; Malenovský, Z.; Turner, D.; Vopěnka, P. Assessment of forest structure using two UAV techniques: A comparison of airborne laser scanning and structure from motion (SFM) point clouds. Forests 2016, 7, 62. [Google Scholar] [CrossRef]
  59. Ganz, S.; Käber, Y.; Adler, P. Measuring tree height with remote sensing—A comparison of photogrammetric and LIDAR data with different field measurements. Forests 2019, 10, 694. [Google Scholar] [CrossRef]
  60. Gallardo-Salazar, J.L.; Pompa-García, M. Detecting individual tree attributes and multispectral indices using unmanned aerial vehicles: Applications in a Pine Clonal Orchard. Remote Sens. 2020, 12, 4144. [Google Scholar] [CrossRef]
  61. Torres-Sánchez, J.; de Castro, A.I.; Peña, J.M.; Jiménez-Brenes, F.M.; Arquero, O.; Lovera, M.; López-Granados, F. Mapping the 3D structure of almond trees using UAV acquired photogrammetric point clouds and object-based image analysis. Biosyst. Eng. 2018, 176, 172–184. [Google Scholar] [CrossRef]
  62. Kaimaris, D.; Patias, P.; Sifnaiou, M. UAV and the comparison of image processing software. Int. J. Intell. Unmanned Syst. 2017, 5, 18–27. [Google Scholar] [CrossRef]
  63. Kim, T.H.; Lee, Y.C. Comparison of Open Source based Algorithms and Filtering Methods for UAS Image Processing. J. Cadastre Land InformatiX 2020, 50, 155–168. [Google Scholar] [CrossRef]
  64. Alidoost, F.; Arefi, H. Comparison of UAS-based photogrammetry software for 3D point cloud generation: A survey over a historical site. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, IV-4/W4, 55–61. [Google Scholar] [CrossRef]
  65. Fraser, B.T.; Congalton, R.G. Issues in unmanned aerial systems (UAS) data collection of Complex Forest Environments. Remote Sens. 2018, 10, 908. [Google Scholar] [CrossRef]
  66. Li, H.; Yan, X.; Su, P.; Su, Y.; Li, J.; Xu, Z.; Gao, C.; Zhao, Y.; Feng, M.; Shafiq, F.; et al. Estimation of winter wheat LAI based on color indices and texture features of RGB images taken by UAV. J. Sci. Food Agric. 2024. [Google Scholar] [CrossRef]
  67. Raj, R.; Walker, J.P.; Pingale, R.; Nandan, R.; Naik, B.; Jagarlapudi, A. Leaf area index estimation using top-of-canopy airborne RGB images. Int. J. Appl. Earth Obs. Geoinf. 2021, 96, 102282. [Google Scholar] [CrossRef]
  68. Triana Martinez, J.; De Swaef, T.; Borra-Serrano, I.; Lootens, P.; Barrero, O.; Fernandez-Gallego, J.A. Comparative leaf area index estimation using multispectral and RGB images from a UAV platform. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VIII, Orlando, FL, USA, 1–2 May 2023. [Google Scholar]
Figure 1. Study area of Payson, Utah, USA, with closeup of the cherry orchard of interest (orange).
Figure 2. Surveyed trees (red) in area of interest surrounded by other orchards.
Figure 3. Example of a Montmorency cherry tree (A) and image of the DJI Mavic 3M multispectral drone used for flights (B).
Figure 4. General process for creating the CHM.
Figure 5. CHM map (left) and DTM map (right).
Figure 6. Scatterplots of modeled heights and in situ measured heights.
Figure 7. Correlation matrix of LAI with 12 RGB-based vegetation indices. Red cells indicate negative correlation, lighter colored or white cells reflect little to no correlation, and blue cells reflect a positive correlation.
Table 1. RGB-based vegetation indices used for LAI modeling.
RGB-Based Index | Equation | Sources
ExG | 2 × G − R − B | [47]
GCC or Green Ratio | G/(B + G + R) | [48]
IKAW | (R − B)/(R + B) | [49]
MGRVI | (G² − R²)/(G² + R²) | [50]
MVARI | (G − B)/(G + R − B) | [50]
RGBVI | (G² − B × R)/(G² + B × R) | [51]
TGI | G − (0.39 × R) − (0.61 × B) | [52]
VARI | (G − R)/(G + R − B) | [50]
VDVI or GLA | (2 × G − R − B)/(2 × G + R + B) | [50]
Red–Blue Ratio Index (simple ratio) | R/B | [53]
Green–Blue Ratio Index (simple ratio) | G/B | [53]
Green–Red Ratio Index (simple ratio) | G/R | [53]
Table 2. RMSE values for all three datasets.
 | DroneDeploy | Drone2Map | Pix4D
RMSE value | 31.83 cm | 32.66 cm | 34.03 cm
Table 3. Independent samples t-test results.
 | DroneDeploy–Drone2Map | Drone2Map–Pix4D | DroneDeploy–Pix4D
Two-sided p-value | 0.410 | 0.101 | 0.013 1
1 Significant difference at p < 0.05.
Table 4. UAS data processing methods.
 | DroneDeploy | Drone2Map | Pix4D
Cost (individual/standard pricing) | USD 329/mo | USD 145.83/mo | USD 291.67/mo
Audience | Construction/Engineering/Energy/Agriculture | GIS/Geography/Mapping/Engineers | Precision Ag/Surveying/Mapping/Engineering
Computational Environment | Cloud-based | Desktop | Desktop
Ability to Adjust Parameters | Low | High | High