Proceeding Paper

Estimation of Water Potential in Corn Plants Using Machine Learning Techniques with UAV Imagery and Evaluating the Effect of Flying Height †

by Audberto Reyes-Rosas 1,*, Francisco M. Lara-Viveros 1, Lizeth Chávez-Cerón 1 and Sasirot Khamkure 2

1 Department of Bioscience and Agrotechnology, Research Center of Applied Chemistry, Saltillo 25294, Mexico
2 Department of Irrigation and Drainage, CONAHCYT-Autonomous Agrarian Antonio Narro University, Saltillo 25315, Mexico
* Author to whom correspondence should be addressed.
Presented at the 4th International Electronic Conference on Applied Sciences, 27 October–10 November 2023; Available online: https://asec2023.sciforum.net/.
Eng. Proc. 2023, 56(1), 157; https://doi.org/10.3390/ASEC2023-15882
Published: 7 November 2023
(This article belongs to the Proceedings of The 4th International Electronic Conference on Applied Sciences)

Abstract

The use of unmanned aerial vehicles (UAVs) in precision agriculture has proven to be a useful tool for crop monitoring. The use of this technology in irrigation water management represents a significant improvement opportunity compared to the tools commonly used. This study aimed to estimate the water content in corn plants using images captured by a drone, evaluating the effect that the flying height has on the accuracy of the estimation of this indicator. For this purpose, water potential (WP) was measured in corn plant leaves; this variable allows us to infer the presence of water stress and indicates the need for irrigation. Aerial images of the crop were captured under three treatments based on irrigation levels (40%, 70%, and 100% of the water required to compensate for evapotranspiration) to induce gradients of moisture content in the plants. Seven drone flights were carried out on different dates at heights of 30, 50, and 70 m. The water potential of the leaves was correlated with radiometrically calibrated multispectral images (R, G, B, red-edge, and near-infrared). Three models were developed: a multiple linear regression (LM), a neural network (NN), and a random forest (RF). The LM and NN models showed similar error metrics, while the RF model showed the best results, with an average root mean square error (RMSE) of 1.26 and a coefficient of determination (R2) of 0.9 on the training dataset. The flying height, which affected the resolution of the images, was not significant in the estimation of WP within this height range.

1. Introduction

The use of unmanned aerial vehicles (UAVs) in precision agriculture has proven to be a very useful tool for remote crop monitoring, allowing non-invasive and faster information acquisition compared with conventional methods. The use of this technology in irrigation water management represents a significant improvement opportunity, since it can enable the diagnosis of plant water status at a spatial level, allowing the detection of heterogeneity [1] and the quantification of water shortage.
The determination of the water potential (WP) of plants allows the farmer to establish the timing and amount of irrigation. Plant water potential is an eco-physiological variable commonly used to estimate the water status of crops; however, for its values to reliably represent the water status of plants, some conditions must be met. One of the most important is that the WP of the samples (whole leaves, or portions of them) must be measured immediately after cutting to avoid water losses from their tissues; this becomes complicated when the crop area is extensive and may demand a large amount of time and labor.
The use of UAVs as an alternative that overcomes the disadvantages mentioned above requires the construction of one or more models that relate WP data previously measured in the field to information extracted from the aerial images captured by a drone. Machine learning-based models have shown promise, as they can consider several aspects that influence the determination of the water status of plants [2]. The flight height of the drone is not standardized, and this parameter affects the quality of the captured images: the greater the height, the lower the level of detail of the plants. Conversely, at a lower flight height, the drone requires more flight time and consumes more battery.
Several studies have been conducted to monitor the water status of crops, using mainly multispectral, hyperspectral, and thermal cameras, and applied to a variety of crops [3,4,5]. These studies generally involve the acquisition of various vegetation indices to be correlated with some physiological indicator related to the water status of the plant [6]. To do this, studies focus on the acquisition of these vegetation indices through the crop canopy [7].
The goal of this study is to identify the model that best estimates WP in corn plants, as well as to establish whether or not the flying height of the drone has an effect on the estimation of this parameter.

2. Materials and Methods

2.1. Establishment of the Experiment

Three treatments were established based on the level of irrigation applied to compensate for the evapotranspiration rate, applying 40% (T3), 70% (T2), and 100% (T1) of the water required by the crop. In other words, the 100% treatment received the total amount required, while the 40% treatment had a 60% deficit in applied water. Each treatment (Figure 1) had an area of 400 m2 (20 m × 20 m) cultivated with corn of a commercial variety called DK-2069. The crop was established on 15 May 2022, with the first flight taking place on 13 June 2022. The field is located in Ramos Arizpe, Coahuila, Mexico (25°39′08.1″ N, 101°06′49.2″ W). All macro- and micronutrient requirements were supplied to the three treatments according to [8] through the irrigation water during the spring–summer cycle.

2.2. Measurement of Water Potential in Plants

The water potential (WP) was measured in mature leaves from the upper canopy of corn plants using a Scholander pressure chamber (Plant Water Status Console, Model 3005F01) at approximately 12:00 pm on the same day as the drone flight. The plants were selected according to the established sampling points, as shown in Figure 1. To determine the WP, each leaf was cut 30 cm from the apex and placed inside the pressure chamber, where the appearance of the sap meniscus at the main vein was observed as the pressure increased. These measurements were made weekly from June to August 2022.
The water potential reading is direct (in pressure units) and was subsequently correlated with the pixel intensities of each of the channels (R, G, B, RE, and NIR).

2.3. Aerial Image Acquisition

During each WP measurement day, three flights were performed over the experiment surface, one at each evaluated height, using a DJI P4 Multispectral drone equipped with a GPS, a solar radiation sensor, and a flight stabilization system. It also carries a 12.4-megapixel multispectral camera with an image size of 4000 × 3000 pixels.
Images were captured under clear sky conditions around noon, with low wind speeds.

2.4. Image Processing

The aerial images were processed using the R v4.2 language and included a radiometric calibration. The images were segmented to remove the soil using the NDVI vegetation index, which highlights the vegetation against the rest of the elements in the orthomosaic image. A threshold value was calculated using the Otsu method to perform this differentiation and thus the segmentation. The EBImage library [9] was used within the R environment to process the orthomosaic images.
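As an illustration of this step, the following is a minimal sketch of the NDVI/Otsu segmentation in R with EBImage; the file names are hypothetical, and it is assumed that the calibrated red and near-infrared bands are available as single-band TIFF exports of the orthomosaic.

```r
# Minimal sketch of NDVI-based soil segmentation with an Otsu threshold.
# Band file names are hypothetical single-band exports of the calibrated orthomosaic.
library(EBImage)

red <- readImage("ortho_red.tif")
nir <- readImage("ortho_nir.tif")

# NDVI = (NIR - R) / (NIR + R); a small constant avoids division by zero
ndvi <- (nir - red) / (nir + red + 1e-6)

# Rescale NDVI from [-1, 1] to [0, 1] so it matches the default range of otsu()
ndvi01 <- (ndvi + 1) / 2

# Otsu threshold separating vegetation from soil and background
thr  <- otsu(ndvi01)
mask <- ndvi01 > thr

# Apply the vegetation mask to each band before extracting pixel statistics
red_veg <- red * mask
nir_veg <- nir * mask
```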

2.5. Modeling of WP Based on Multispectral Images

Three models were built from the data obtained using the R v4.2 language, considering WP as the dependent variable. The average pixel intensity within a 1 m radius around each sampling point (Figure 1) was calculated for each of the R, G, B, RE, and NIR channels of each segmented orthomosaic image. These values, along with the number of weeks after planting, were the input data for each model. The three regression models implemented were a multiple linear regression (LM), an artificial neural network (NN), and a random forest (RF). The metrics used to evaluate the models were the coefficient of determination (R2) and the root mean square error (RMSE).
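A minimal sketch of this feature-extraction step is shown below; the object names and the use of a square pixel window to approximate the 1 m buffer are assumptions made for illustration, since the implementation details are not specified in the text.

```r
# Sketch of per-point feature extraction (object names are hypothetical).
# bands:  named list of matrices (R, G, B, RE, NIR) from the segmented orthomosaic,
#         with soil pixels set to NA
# points: data frame with the pixel coordinates (row, col) of each sampling point
# gsd:    ground sampling distance in m/pixel (depends on the flying height)
extract_features <- function(bands, points, gsd) {
  radius <- ceiling(1 / gsd)  # 1 m buffer expressed in pixels
  feats <- lapply(seq_len(nrow(points)), function(i) {
    rows <- max(1, points$row[i] - radius):min(nrow(bands[[1]]), points$row[i] + radius)
    cols <- max(1, points$col[i] - radius):min(ncol(bands[[1]]), points$col[i] + radius)
    # Mean intensity of each channel in the window around the sampling point
    sapply(bands, function(b) mean(b[rows, cols], na.rm = TRUE))
  })
  as.data.frame(do.call(rbind, feats))
}
```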
A grid search was used to select the parameters for the NN and RF models. The best parameters found for the NN were a hidden layer of seven units (size) and a weight decay (decay) of 0.1. For the RF, a number of variables randomly sampled as candidates at each split (mtry) of 7 and a number of trees to grow (ntree) of 500 were selected. The data were divided into training and testing sets using 80% and 20%, respectively, of the total data collected in the nine flights performed.
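The packages used to fit and tune these models are not stated in the text; the sketch below shows how an equivalent workflow could be set up with the caret package. The data frame wp_data (columns wp, R, G, B, RE, NIR, and week) and the tuning grids are illustrative assumptions; the selected values reported above were size = 7 and decay = 0.1 for the NN, and mtry = 7 and ntree = 500 for the RF.

```r
# Sketch of model fitting and grid search with caret (an assumed package choice).
library(caret)
set.seed(42)

# wp_data is a hypothetical data frame: wp (dependent variable), plus R, G, B, RE,
# NIR, and week (weeks after planting) as predictors.
idx   <- createDataPartition(wp_data$wp, p = 0.8, list = FALSE)
train <- wp_data[idx, ]
test  <- wp_data[-idx, ]

ctrl <- trainControl(method = "cv", number = 5)

# Multiple linear regression
lm_fit <- train(wp ~ ., data = train, method = "lm", trControl = ctrl)

# Single-hidden-layer neural network; grid search over size and decay
nn_fit <- train(wp ~ ., data = train, method = "nnet", trControl = ctrl,
                tuneGrid = expand.grid(size = 1:10, decay = c(0, 0.01, 0.1)),
                linout = TRUE, trace = FALSE)

# Random forest; grid search over mtry, with 500 trees per forest
rf_fit <- train(wp ~ ., data = train, method = "rf", trControl = ctrl,
                tuneGrid = expand.grid(mtry = 2:6), ntree = 500)

# RMSE and R2 of the random forest on the testing set
postResample(predict(rf_fit, test), test$wp)
```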

3. Results

The results of the metrics obtained for the constructed models are shown in Figure 2. The RMSE and R2 metrics were better with the training dataset than with the testing dataset (lower RMSE and higher R2). Pooling the RMSE and R2 values across the three flight heights (30, 50, and 70 m), a least significant difference (LSD) test was performed, which indicated a significant difference between the RF and LM models for both metrics. According to this test, the NN model fell into an intermediate group, without a significant difference from either the RF or the LM model (Table 1).
The average values obtained for each model are shown in Table 1. In this table, it is possible to see that the RF model had the highest R2 value and the lowest RMSE value for both the training and testing data sets. The least accurate model was the LM model, as confirmed by the LSD test.
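The text does not specify how the LSD test was implemented; one common option in R is the agricolae package, sketched below under the assumption that the per-model, per-height metrics are collected in a data frame (here called metrics).

```r
# Sketch of the LSD comparison of models (agricolae is an assumed implementation).
library(agricolae)

# metrics: hypothetical data frame with one row per model and flight height,
# containing the columns model, height, rmse, and r2.
fit_rmse <- aov(rmse ~ model, data = metrics)
LSD.test(fit_rmse, "model")$groups  # letter groupings (a, ab, b) as in Table 1

fit_r2 <- aov(r2 ~ model, data = metrics)
LSD.test(fit_r2, "model")$groups
```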

4. Discussion

The results showed a better generalization of the water potential of the plants with the RF model. In all cases, the R2 values were lower and the RMSE values higher when the models were applied to the testing data than to the training data, which could indicate some overfitting, observed especially with the NN and RF models. The LM model had the lowest level of fit, indicating that the relationship between the dependent variable WP and the predictor variables (the pixel intensities of the R, G, B, RE, and NIR channels) is possibly not linear.
As observed in Figure 2, the flight height does not seem to influence the level of fit of any of the models, since there is no obvious trend in the metrics when comparing their values within each of the three models. This lack of trend is apparent both in the results obtained with the training dataset and in those obtained with the testing dataset. This indicates that a drone can fly at 30 to 70 m without affecting the estimation of WP in plants. At a higher flying height, a larger area can be covered in a shorter time, thus allowing for lower battery consumption. This contrasts with the findings of [10], in which an effect of flying height and time of day was found; those authors, however, were estimating a different parameter (canopy temperature) and used a different radiometric calibration method than the one used in the present study.
A study by Anders et al. [11] found that flights at altitudes of 126 to 235 m did not significantly affect the characterization of vegetation structure. In the present study, flights were conducted at lower altitudes (30 to 70 m) to capture the finer details of individual leaves. The R2 values obtained in the present study are consistent with the general trend found in other studies, which typically report R2 values of no more than 0.7 when correlating images with water potential [12]. The present study also addressed the temporal factor by conducting multiple measurements throughout the crop cycle; this factor is often a limitation in similar studies because collecting data with the pressure chamber is labor-intensive.
Future work could focus on measuring WP in leaves taken from near the base or halfway up the plant, which are older leaves and may be more representative than the upper leaves considered in the present study.

Author Contributions

Conceptualization, methodology and writing, A.R.-R. and F.M.L.-V.; data curation, L.C.-C.; supervision, S.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Gago, J.; Douthe, C.; Coopman, R.E.; Gallego, P.P.; Ribas-Carbo, M.; Flexas, J.; Escalona, J.; Medrano, H. UAVs Challenge to Assess Water Stress for Sustainable Agriculture. Agric. Water Manag. 2015, 153, 9–19.
2. Tosin, R.; Martins, R.; Pôças, I.; Cunha, M. Canopy VIS-NIR Spectroscopy and Self-Learning Artificial Intelligence for a Generalised Model of Predawn Leaf Water Potential in Vitis Vinifera. Biosyst. Eng. 2022, 219, 235–258.
3. Acharya, B.S.; Bhandari, M.; Bandini, F.; Pizarro, A.; Perks, M.; Joshi, D.R.; Wang, S.; Dogwiler, T.; Ray, R.L.; Kharel, G.; et al. Unmanned Aerial Vehicles in Hydrology and Water Management: Applications, Challenges, and Perspectives. Water Resour. Res. 2021, 57, e2021WR029925.
4. Adam, E.; Mutanga, O.; Rugege, D. Multispectral and Hyperspectral Remote Sensing for Identification and Mapping of Wetland Vegetation: A Review. Wetl. Ecol. Manag. 2010, 18, 281–296.
5. Zhao, T.; Stark, B.; Chen, Y.Q.; Ray, A.; Doll, D. More Reliable Crop Water Stress Quantification Using Small Unmanned Aerial Systems (SUAS). IFAC-PapersOnLine 2016, 49, 409–414.
6. Sishodia, R.P.; Ray, R.L.; Singh, S.K. Applications of Remote Sensing in Precision Agriculture: A Review. Remote Sens. 2020, 12, 3136.
7. Jay, S.; Baret, F.; Dutartre, D.; Malatesta, G.; Héno, S.; Comar, A.; Weiss, M.; Maupas, F. Exploiting the Centimeter Resolution of UAV Multispectral Imagery to Improve Remote-Sensing Estimates of Canopy Structure and Biochemistry in Sugar Beet Crops. Remote Sens. Environ. 2019, 231, 110898.
8. Alcántar-González, G.; Trejo-Téllez, L.I.; Gómez-Merino, F. Crops Nutrition (Nutrición de Cultivos); Colegio de Posgraduados: Montecillo, Mexico, 2016; ISBN 978-607-715-324-5.
9. Pau, G.; Fuchs, F.; Sklyar, O.; Boutros, M.; Huber, W. EBImage-an R Package for Image Processing with Applications to Cellular Phenotypes. Bioinformatics 2010, 26, 979–981.
10. Awais, M.; Li, W.; Cheema, M.J.M.; Hussain, S.; Shaheen, A.; Aslam, B.; Liu, C.; Ali, A. Assessment of Optimal Flying Height and Timing Using High-Resolution Unmanned Aerial Vehicle Images in Precision Agriculture. Int. J. Environ. Sci. Technol. 2022, 19, 2703–2720.
11. Anders, N.; Smith, M.; Suomalainen, J.; Cammeraat, E.; Valente, J.; Keesstra, S. Impact of Flight Altitude and Cover Orientation on Digital Surface Model (DSM) Accuracy for Flood Damage Assessment in Murcia (Spain) Using a Fixed-Wing UAV. Earth Sci. Inform. 2020, 13, 391–404.
12. Lakso, A.N.; Santiago, M.; Stroock, A.D. Monitoring Stem Water Potential with an Embedded Microtensiometer to Inform Irrigation Scheduling in Fruit Crops. Horticulturae 2022, 8, 1207.
Figure 1. Configuration of the three established treatments (T1, T2, and T3), showing within each of them the location (circle) in which the water potential of the corn plants was measured.
Figure 2. (a) Determination coefficient (R2) values obtained for each drone flight height (30, 50, and 70 m) with testing and training data sets. (b) RMSE values obtained for each drone flight height (30, 50, and 70 m) with testing and training data sets.
Table 1. Average values obtained for the RMSE and R2 metrics, by model type, pooled across the three flying heights (30, 50, and 70 m).
Model    Training Data        Testing Data
         RMSE      R2         RMSE      R2
LM       2.8 a     0.53 a     2.92 a    0.44 a
NN       1.94 ab   0.76 ab    2.61 ab   0.57 ab
RF       1.26 b    0.9 b      2.0 b     0.73 b
Different letters indicate statistically significant difference between groups according to the least significant difference (LSD) test.
