
Estimation of Maize Water Requirements Based on the Low-Cost Image Acquisition Methods and the Meteorological Parameters

1 Key Laboratory of Smart Agricultural Technology (Yangtze River Delta), Ministry of Agriculture and Rural Affairs, Nanjing 210044, China
2 Intelligent Equipment Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
3 Jiangsu Academy of Agricultural Sciences, Nanjing 210014, China
4 Key Laboratory for Quality Testing of Software and Hardware Products on Agricultural Information, Ministry of Agriculture and Rural Affairs, Beijing 100097, China
* Authors to whom correspondence should be addressed.
Agronomy 2024, 14(10), 2325; https://doi.org/10.3390/agronomy14102325
Submission received: 22 July 2024 / Revised: 29 August 2024 / Accepted: 7 October 2024 / Published: 10 October 2024
(This article belongs to the Section Precision and Digital Agriculture)

Abstract

This study aims to improve the calculation of maize water demand by estimating crop evapotranspiration (ETc) from mobile phone photographs and meteorological parameters. For the crop coefficient (Kc), we use mobile phone camera images to drive a real-time Kc monitoring model based on changes in plant canopy coverage (PGC). PGC is calculated by a PGC classification network together with a Convolutional Block Attention Module (CBAM)-U2Net segmentation network. For the reference crop evapotranspiration (ETo), we constructed simplified, meteorological-data-driven estimation models based on SVR, LSTM, Optuna-LSTM, and GWO-SVM and evaluated their performance. The results demonstrate that our method achieves a PGC classification accuracy of 98.9% and a segmentation accuracy of 95.68% for the CBAM-U2net-based segmentation network. The Kc calculation model exhibits a root mean square error (RMSE) of 0.053. For ETo estimation, the Optuna-LSTM model with four input variables performs best, with a coefficient of determination (R2) of 0.953. The final R2 between the estimated and true ETc values is 0.918, with an RMSE of 0.014. The method can therefore effectively estimate the water demand of maize.

1. Introduction

The United Nations Food and Agriculture Organization (FAO) reported that, owing to factors such as population growth, climate change, natural disasters, and insect pests, approximately 691–783 million people worldwide faced hunger in 2022, more than 122 million more than in 2019, and food security problems are intensifying [1]. As the crop with the highest production in the world, maize is important for food security. However, maize production faces severe water resource constraints: more than 733 million people, approximately 10% of the global population, live in countries with severe (above 70%) or extreme (above 100%) water stress [2]. Scheduling irrigation according to the maize water demand process can effectively reduce irrigation water consumption; it is therefore crucial to monitor the ETc of maize, and the calculation of ETc depends on Kc and ETo.
The Kc-ETo method given in FAO56 is the current mainstream method for estimating crop water demand: crop evapotranspiration (ETc) is obtained by multiplying the reference crop evapotranspiration (ETo) by the crop coefficient (Kc). This method has been widely applied to rice [3], wheat [4], maize [5], and other crops. For the calculation of Kc, FAO provides recommended values for 84 crops. Crop coefficients are influenced by various factors, including plant development, climate, and soil moisture, and require adjustment to reflect local conditions [6]. Researchers have investigated the correlation between Kc and growth-related factors of crops, such as the leaf area index and plant height, and developed regression models for Kc calculation [7]. One approach uses an Unmanned Aerial Vehicle (UAV) to acquire images and build a crop height model, from which crop heights are extracted to estimate Kc [8]. Others have used satellite data from the Operational Land Imager and the Moderate Resolution Imaging Spectroradiometer to evaluate the actual Kc [9]. However, high-altitude remote sensing suffers from low resolution and insufficient timeliness and accuracy for monitoring ground crops. UAVs can address the real-time issue, but they have a high technical barrier to entry. With the widespread use of smartphones, mobile photography offers a cost-effective alternative for monitoring Kc values in real time.
Regarding the calculation of ETo, the most widely used approach is the Penman–Monteith (PM) formula, but it requires eight parameters, such as soil heat flux and net solar radiation (Rad), so the calculation process is relatively complicated. To simplify the calculation of ETo, researchers often use machine learning. Four machine learning models, namely radial basis function (RBF), M5Tree, locally weighted learning (LWL), and gradient boosted tree (GBT), were evaluated for predicting daily ETo [10]; comparing the results with three empirical models, the authors found the gradient-boosted tree to be the most accurate, so it can serve as a simplified ETo calculation model in Jiangsu Province. Others employed support vector machines, artificial neural networks, and empirical methods to predict evapotranspiration where meteorological data are incomplete, concluding that support vector machines estimate evapotranspiration more accurately than artificial neural networks and empirical approaches [11]. The random forest (RF) method has also been used to predict ETo in Brazilian forests from three meteorological parameters, with good prediction results [12]. Many methods worldwide now use machine learning to estimate ETo, and improving the estimation accuracy of these algorithms is an emerging research direction.
Therefore, we propose a method for estimating maize water demand that combines mobile phone photography with default meteorological data, aiming at low-cost, high-precision, real-time monitoring of crop water demand. The primary objectives of this study are threefold: (1) to establish a real-time Kc monitoring model based on plant canopy coverage (PGC), considering crop growth characteristics; (2) to establish a default machine learning model for estimating ETo, considering its changing characteristics; and (3) to combine the derived Kc and ETo values in the traditional Kc method to calculate ETc and validate its feasibility with field test data. For ETo estimation we employ four machine learning algorithms: Support Vector Regression (SVR), Long Short-Term Memory networks (LSTM), Optuna-optimized LSTM, and Grey Wolf Optimizer-Support Vector Machine (GWO-SVM).

2. Materials and Methods

2.1. Research Area and Data Acquisition

The test site is located at the National Precision Agriculture Demonstration Base in Changping District, Beijing, at 116.46° E, 40.18° N. The study area falls within the warm temperate semi-humid, semi-arid monsoon climate zone, with an average annual temperature of 11.8 °C, an average annual precipitation of 584 mm, and an altitude of 50.1 m. Please refer to Figure 1 for the specific location. The maize varieties used were 368 Nong keyu, 336 Nong kenuo, and 909 Jing ketian, planted on 10 June 2022 and harvested on 8 September 2022.

2.2. Experimental Design

2.2.1. Mobile Phone Image Data Acquisition

The RGB images were taken with a HUAWEI Nova 5 smartphone (Nova 5, Huawei, Shenzhen, Guangdong, China) at a resolution of 3840 × 2160 pixels. From top to bottom, the rear module comprises a 16-megapixel ultra-wide-angle lens, the 48-megapixel ultra-clear main camera used for shooting, a 2-megapixel depth-of-field lens, and, next to the main camera, a 2-megapixel macro lens. Figure 2 displays photographs collected between 14 June and 3 August 2022. Overhead shots of the 24 test areas were taken manually, with the camera lens positioned 80 cm above the crops and the optical axis perpendicular to the ground, at three times each day: 9:00, 12:00, and 16:00. In total, 3684 photos documenting maize at various growth stages were obtained. We dedicated 12 experimental plots to the cultivation of the three maize varieties, ‘368 Nong Keyu’, ‘336 Nong Kenuo’, and ‘909 Jing Ketian’; each variety was allocated 4 plots, each with an effective planting area of 0.75 square meters.

2.2.2. Environmental Data Acquisition

(1)
Meteorological data
An automatic weather station (WS1802, National Engineering Research Center for Intelligent Equipment in Agriculture, Beijing, China) was used to collect 9 meteorological variables: air temperature (AT), air humidity (AH), rainfall, maximum wind speed (MWS), minimum wind speed (MinWS), average wind speed (AWS), wind direction (WD), Rad, and atmospheric pressure (AP). The station sampled once every hour, stood 2.5 m tall, and was placed 3 m from the test area. Sensor specifications were as follows: air temperature (measuring range 30~70 °C, accuracy ±0.5 °C, stability ±1 °C/year); air humidity (range 0~100%, accuracy ±3%, stability ±1% RH/year); wind direction (range 0~360°, accuracy ±7°); wind speed (range 1~67 m/s, accuracy ±0.3 m/s); rainfall (resolution 0.2 mm, accuracy ±4%); Rad (range 0~2000 W/m², resolution 1 W/m², accuracy 5%); and atmospheric pressure (range 300~1100 hPa, accuracy ±1.5 hPa at 25 °C).
(2)
Actual measurement value
Firstly, we measured the ETc value using a lysimeter plot (WTS-200, National Engineering Research Center for Intelligent Equipment in Agriculture, Beijing, China), as shown in Figure 3; it includes a weighing lysimeter data acquisition system, whose parameters are listed in Table 1. We then obtained the ETo value from the meteorological parameters, and finally the Kc value from Formula (9).
The actual ETc value is obtained from a large weighing lysimeter module comprising 24 small lysimeters installed in an all-steel basement. Each lysimeter is 1 m long, 0.75 m wide, and 2 m deep, giving a practical usable area of 18 m². The lysimeter plot contains undisturbed field soil and is equipped with a weighing data acquisition system that converts pressure signals into electrical signals and records soil mass changes every 5 min. Its sensitivity to moisture gain or loss is 0.05 mm, and records are collected automatically under computer control. Since there was precipitation during the observation period, rainfall is subtracted from the lysimeter record to obtain the true value of ETc. The true value of ETo is calculated with the PM formula [13], as follows:
$$ET_0 = \frac{0.408\,\Delta\,(R_n - G) + \gamma\,\dfrac{900}{T + 273}\,U_2\,(e_a - e_d)}{\Delta + \gamma\,(1 + 0.34\,U_2)} \qquad (1)$$

$$\Delta = \frac{4098\,e_a}{(T + 237.3)^2} \qquad (2)$$

$$R_n = R_{ns} - R_{nl} \qquad (3)$$

$$G = 0.38\,(T_d - T_{d-1}) \qquad (4)$$

$$\gamma = 0.00163\,P/\lambda \qquad (5)$$

$$U_2 = \frac{4.87\,U_h}{\ln(67.8\,h - 5.42)} \qquad (6)$$

$$e_a = 0.611\,\exp\!\left(\frac{17.27\,T}{T + 237.3}\right) \qquad (7)$$

$$e_d = \frac{e_d(T_{\min}) + e_d(T_{\max})}{2} = \frac{1}{2}\,e_a(T_{\min})\,\frac{RH_{\max}}{100} + \frac{1}{2}\,e_a(T_{\max})\,\frac{RH_{\min}}{100} \qquad (8)$$
where $ET_0$ is the reference crop evapotranspiration (mm/d); $T$ is the average temperature (°C); $\Delta$ is the slope of the saturation vapour pressure–temperature curve at $T$ (kPa·°C⁻¹); $R_n$ is the net solar radiation (W/m²); $R_{ns}$ and $R_{nl}$ are the net shortwave and net longwave radiation (MJ/m²·d); $G$ is the soil heat flux (MJ/m²·d), which for daily $ET_0$ estimates is computed from $T_d$ and $T_{d-1}$, the average temperatures on days $d$ and $d-1$; $\gamma$ is the psychrometric constant (kPa·°C⁻¹); $P$ is the atmospheric pressure; $\lambda$ is the latent heat of vaporisation; $U_2$ is the wind speed at a height of 2 m (m/s); $U_h$ is the wind speed measured at height $h$; $e_a$ is the saturation vapour pressure (kPa); $e_d$ is the actual vapour pressure (kPa); $T_{\min}$ and $T_{\max}$ are the daily minimum and maximum temperatures; and $RH_{\max}$ and $RH_{\min}$ are the daily maximum and minimum relative humidity.
The true value of Kc is obtained through Formula (9):

$$K_c = ET_c / ET_o \qquad (9)$$
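To make the workflow above concrete, here is a minimal Python sketch of Equations (1)–(9). It is an illustration, not the authors' code; the latent heat constant λ = 2.45 MJ/kg and the unit conventions noted in the docstring follow the FAO-56 defaults and are assumptions here.

```python
import math

LAMBDA = 2.45  # latent heat of vaporisation, MJ/kg (FAO-56 default; assumed)

def e_sat(t):
    """Saturation vapour pressure (kPa) at temperature t (deg C), Eq. (7)."""
    return 0.611 * math.exp(17.27 * t / (t + 237.3))

def penman_monteith_eto(t_mean, t_min, t_max, rh_min, rh_max,
                        u_h, h, r_n, g, p_kpa):
    """Daily reference evapotranspiration ETo (mm/day), Eqs. (1)-(8).

    Temperatures in deg C, RH in %, wind u_h (m/s) measured at height h (m),
    net radiation r_n and soil heat flux g in MJ m-2 d-1, pressure in kPa.
    """
    e_a = e_sat(t_mean)                                   # Eq. (7)
    delta = 4098.0 * e_a / (t_mean + 237.3) ** 2          # Eq. (2)
    gamma = 0.00163 * p_kpa / LAMBDA                      # Eq. (5)
    u2 = 4.87 * u_h / math.log(67.8 * h - 5.42)           # Eq. (6)
    e_d = 0.5 * (e_sat(t_min) * rh_max / 100.0
                 + e_sat(t_max) * rh_min / 100.0)         # Eq. (8)
    num = (0.408 * delta * (r_n - g)
           + gamma * 900.0 / (t_mean + 273.0) * u2 * (e_a - e_d))
    return num / (delta + gamma * (1.0 + 0.34 * u2))      # Eq. (1)

def crop_coefficient(etc, eto):
    """Kc from measured ETc and computed ETo, Eq. (9)."""
    return etc / eto
```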

2.3. Machine Learning Techniques

This study uses the single-Kc method as the theoretical knowledge model for maize water demand estimation and optimizes it with machine learning and machine vision algorithms. The model framework is shown in Figure 4. Specifically: (1) Kc is monitored in real time from visible light images captured by mobile phone; the PGC-Classification model judges the PGC changes, CBAM-U2Net then calculates PGC, and a real-time model linking Kc and PGC is established; (2) ETo is estimated by default calculation driven by meteorological data, with four machine learning estimation models established and the optimal one selected by analysis; and (3) finally, the Kc value calculated by vision from visible light and the ETo calculated by machine learning from meteorological data are combined in the Kc method to estimate ETc.

2.3.1. Dataset Construction

The mobile phone image collection process is affected by environmental and human factors, which inevitably produce problems such as angle errors and excessive light and shadow areas. The visible light images of maize therefore need to be cleaned, mainly to remove images with angle deviations or strong light and shadow effects. Of the 3684 photographs obtained, 62 were removed during preprocessing, leaving 3622 images. The dataset’s status is shown in Table 2.
Labelme is an image annotation tool developed by the Computer Science and Artificial Intelligence Laboratory (CSAIL) at MIT that lets users create customized annotation tasks or perform image annotation, and EISeg (Efficient Interactive Segmentation) is an efficient, intelligent interactive segmentation and labeling tool built on PaddlePaddle. We used Labelme (version 3.6.1) and EISeg (version 1.0.0) to annotate the maize canopy in the 3622 images, selecting points densely along the maize canopy outline. Weeds that are easily confused with maize in the images were not treated specially and were uniformly labeled as background pixels. This yielded a training set of 2536 images and a test set of 1086 images, including annotation files and image files. The image resolution is 4608 × 3456 with a bit depth of 24.
The environmental dataset consists of the 9 meteorological parameters, such as temperature, humidity, and Rad. From 14 June 2022 to 3 August 2022, meteorological data were collected every hour, yielding 10,956 records. After removing 18 abnormal records, the environmental dataset contains a total of 10,938 records.

2.3.2. Construction of the Machine Vision Algorithm for Mobile-Phone-Image-Driven Kc

Repeatedly segmenting images once maize coverage has reached 100% wastes computation. Therefore, PGC changes are first judged by the classification network: if coverage is below 1, the image is passed to the segmentation network; if the canopy is full, segmentation is skipped (a minimal sketch of this gating logic is given below). The PGC-Classification network is optimized with tools such as deformable convolution and the CBAM attention mechanism, and the maize canopy PGC data are fed into the Kc real-time calculation model to obtain accurate, real-time Kc values.
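As a minimal sketch of that gating logic, the snippet below gates the expensive segmentation step behind the cheap classifier; `classifier` and `segmenter` are placeholders standing in for the trained PGC-Classification and CBAM-U2net models.

```python
import numpy as np

def estimate_pgc(image, classifier, segmenter):
    """Return PGC as a fraction in [0, 1].

    `classifier(image)` -> bool (canopy fully covers the plot?);
    `segmenter(image)` -> binary canopy mask (H x W). Both are
    placeholders for the trained models described in the text.
    """
    if classifier(image):          # canopy already full
        return 1.0                 # PGC = 100%, skip segmentation
    mask = segmenter(image)        # binary canopy mask
    return float(np.mean(mask))    # fraction of canopy pixels
```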
(1)
PGC-Classification
Our aim in introducing the CBAM (Convolutional Block Attention Module) attention module into PGC-Classification is to let the convolutional neural network adaptively reweight the channel and spatial dimensions of the maize canopy feature map [14], thereby improving the performance and generalization ability of the network. CBAM consists of two sub-modules: a channel attention module and a spatial attention module. The channel attention module captures the relationships between channels through global average pooling and max pooling and then generates channel weights with a multi-layer perceptron. The spatial attention module captures the relationships between spatial locations through average pooling and max pooling along the channel dimension and then generates spatial weights with a convolutional layer. Unlike the segmentation network, where an attention mechanism is added at each stage, here we add CBAM only to the shallow feature extraction process. The shallow features of weeds and maize leaves are easily confused, so feature extraction must focus on shallow details (color, texture); with CBAM, the model can extract key detail features that are easily overlooked, achieving accurate classification. For deep feature extraction no CBAM is needed, for two reasons: (1) as the network deepens, the extracted deep contour features are already representative, so the influence of attention diminishes; and (2) introducing CBAM at every stage of the classification model would increase the parameter count, and hence the amount of calculation, slowing the model.
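For reference, here is a minimal PyTorch sketch of a standard CBAM block as described above [14]; the reduction ratio of 16 and the 7 × 7 spatial kernel are common defaults and assumptions, not values reported in this paper.

```python
import torch
import torch.nn as nn

class CBAM(nn.Module):
    """Convolutional Block Attention Module: channel then spatial attention."""
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        # Channel attention: shared MLP over global avg- and max-pooled maps
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )
        # Spatial attention: conv over channel-wise avg and max maps
        self.spatial = nn.Conv2d(2, 1, kernel_size,
                                 padding=kernel_size // 2, bias=False)

    def forward(self, x):
        # Channel attention weights from avg- and max-pooled descriptors
        avg = self.mlp(torch.mean(x, dim=(2, 3), keepdim=True))
        mx = self.mlp(torch.amax(x, dim=(2, 3), keepdim=True))
        x = x * torch.sigmoid(avg + mx)
        # Spatial attention weights from channel-wise avg and max maps
        s = torch.cat([x.mean(dim=1, keepdim=True),
                       x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial(s))
```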
We integrated the trained model into the back end: crop images uploaded through the front-end page are transferred to the back-end model, which determines whether the canopy is full. Figure 5 shows our PGC-Classification framework diagram.
(2)
CBAM-U2net
Since most of the distractors in the background of maize canopy images are unstructured and vary in morphological contour, the segmentation task is difficult. The convolution process of a neural network relies on fixed-structure convolution kernels or dilated convolutions to extract maize canopy features [15,16], but both struggle to adapt to the complex contour shape of the maize canopy. We therefore introduced deformable convolution to replace fixed and dilated convolution [17,18]. This increases the flexibility of the convolution operation, allowing it to adapt to different image content, including objects with occlusion, deformation, or different perspectives. The formula for deformable convolution is
$$y(H_0) = \sum_{H_n \in R} \omega(H_n)\cdot x(H_0 + H_n + \Delta H_n)$$
In the formula, $R$ is the convolution sampling grid, $H_n$ ranges over its positions, $H_0$ is the specified location in the feature map, and $\Delta H_n$ is the learned offset. By adjusting the sampling positions on the maize feature map through the offsets, accuracy can be increased while the number of parameters is kept small.
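The sketch below shows how a fixed convolution can be swapped for a deformable one using torchvision's `DeformConv2d`, where a plain convolution predicts the per-location offsets $\Delta H_n$; the block structure and channel sizes are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d

class DeformableBlock(nn.Module):
    """Deformable convolution: a plain conv predicts the sampling offsets
    (the delta H_n in the formula above), which shift the kernel grid."""
    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        # 2 offsets (x, y) per kernel sampling position
        self.offset = nn.Conv2d(in_ch, 2 * k * k, k, padding=k // 2)
        self.deform = DeformConv2d(in_ch, out_ch, k, padding=k // 2)

    def forward(self, x):
        return self.deform(x, self.offset(x))

# e.g. replacing a fixed 3x3 convolution on a 64-channel feature map:
block = DeformableBlock(64, 64)
y = block(torch.randn(1, 64, 128, 128))
```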
U2net, proposed in 2020 [19], is composed mainly of nested U-net networks [20]. It fuses feature information at 6 different scales, which helps distinguish easily confused weeds and enables extraction of maize PGC at different growth stages. It consists of an encoder on the left and a decoder on the right. Our purpose was to segment maize leaves from the background in the image. First, we replaced ordinary convolution with deformable convolution, simulating the characteristics of maize leaves in different states by learning local receptive fields and adding offsets to the convolution grid, so that the convolution process follows the shape and outline of maize leaves. Secondly, because of the complexity of the field environment, we incorporated the CBAM attention mechanism into U2-net; it helps the model identify and locate objects in images more accurately, especially under occlusion, deformation, or other complex conditions. To minimize the number of computational parameters, we chose not to integrate CBAM within the nested U-net structure; instead, we appended CBAM after the different-scale outputs of the network, resulting in the CBAM-U2net configuration. The model framework diagram is shown in Figure 6, and the model structure in Table 3.
In our CBAM-U2net model, the input is pictures of maize at different growth stages and the output is the segmented maize leaves and background. A lack of interpretability in the training process would inhibit further network optimization, so the training process must be analyzed visually in order to understand it. We use wandb, a lightweight visualization tool for deep learning, to visualize the training process, and we determine the correlation between crop PGC and the crop coefficient to establish the Kc real-time monitoring model.

2.3.3. Construction of ETo Default Parameter Model Based on Machine Learning

Using machine learning to estimate ETo can effectively reduce the amount of calculation and simplify the method. We selected four representative machine learning algorithms: support vector regression (SVR) [21], long short-term memory (LSTM) [22,23], GWO_SVM [24], and Optuna-LSTM [25]. The model framework is shown in Figure 7. We also studied the linear relationships between AT, AH, rainfall, MWS, MinWS, AWS, WD, Rad, AP, and ETo, and retained the four meteorological parameters with the highest correlation. The input items were divided into four groups according to their correlation, and each group was used as input to the different machine learning algorithms. For the first three algorithms, a Bayesian algorithm tunes the hyperparameters. We chose these four algorithms because they are computationally lightweight and suit our dataset.
(1)
LSTM algorithm
The LSTM algorithm is a derivative of recurrent neural networks that can effectively learn long-term dependencies in sequential data. Since the experimental data we collected are continuous, an LSTM network can effectively exploit the time-dimension information in the data.
(2)
Optuna-LSTM algorithm
The Optuna-LSTM algorithm is an optimization of LSTM. We chose this network because Optuna compensates for LSTM’s inability to tune its own hyperparameters, combining the automatic hyperparameter optimization framework Optuna with the strengths of LSTM (see the sketch at the end of this list).
(3)
SVR algorithm
Support vector regression (SVR) is a powerful algorithm for analyzing the relationship between meteorological data and ETo. SVR uses kernel functions to transform the input space into a higher-dimensional feature space where the dependence between the meteorological parameters and ETo can be captured linearly. We use hyperparameters such as shrinking, tolerance, and cost, and optimize them through a grid search.
(4)
GWO_SVM algorithm
The GWO_SVM algorithm, also called the grey wolf optimizer support vector machine, adds the grey wolf optimizer to a support vector machine to solve complex meteorological-parameter regression problems.
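As referenced in item (2) above, the following is a minimal sketch of Optuna-driven LSTM hyperparameter tuning; the search ranges, epoch budget, and model shape are illustrative assumptions, not the study’s settings.

```python
import optuna
import torch
import torch.nn as nn

class EToLSTM(nn.Module):
    """LSTM regressor: a window of hourly weather records -> one ETo value."""
    def __init__(self, n_features, hidden, layers):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, layers, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):              # x: (batch, time, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])   # regress from the last time step

def objective(trial, train_x, train_y, val_x, val_y):
    # Search ranges below are assumptions for illustration only.
    hidden = trial.suggest_int("hidden_size", 16, 128)
    layers = trial.suggest_int("num_layers", 1, 3)
    lr = trial.suggest_float("lr", 1e-4, 1e-2, log=True)

    model = EToLSTM(train_x.shape[-1], hidden, layers)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(50):                # fixed epoch budget (assumed)
        opt.zero_grad()
        loss = loss_fn(model(train_x), train_y)
        loss.backward()
        opt.step()
    with torch.no_grad():              # validation MSE is the trial score
        return loss_fn(model(val_x), val_y).item()

# study = optuna.create_study(direction="minimize")
# study.optimize(lambda t: objective(t, Xtr, ytr, Xva, yva), n_trials=100)
```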

2.3.4. Performance Evaluation Measures

To evaluate the reliability of the above models, we used R2, mean absolute error (MAE), RMSE, mean square error (MSE), and precision.
$$R^2 = 1 - \frac{\sum_i (\hat{y}_i - y_i)^2}{\sum_i (\bar{y} - y_i)^2}$$

$$RMSE = \sqrt{\frac{1}{m}\sum_{i=1}^{m}(y_i - \hat{y}_i)^2}$$

$$MSE = \frac{1}{m}\sum_{i=1}^{m}(y_i - \hat{y}_i)^2$$

$$MAE = \frac{1}{m}\sum_{i=1}^{m}\left|y_i - \hat{y}_i\right|$$

$$Precision = \frac{TP}{TP + FP}$$
where $\hat{y}_i$ is the predicted value, $y_i$ is the true value, and $\bar{y}$ is the mean of the true values. MAE reflects the actual size of the prediction errors. MSE is the expected squared difference between the model value and the observed value, and RMSE is its arithmetic square root; the smaller the value, the better. R2 is an indicator commonly used in regression analysis to evaluate goodness of fit. Precision is an evaluation measure for image segmentation algorithms, where TP is the number of correctly identified maize canopy pixels and FP is the number of falsely detected maize canopy pixels.
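These metrics are straightforward to compute; a small NumPy sketch follows (the function names are ours, for illustration).

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """R2, MSE, RMSE, and MAE exactly as defined above."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    err = y_true - y_pred
    mse = np.mean(err ** 2)
    return {
        "R2": 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2),
        "MSE": mse,
        "RMSE": np.sqrt(mse),
        "MAE": np.mean(np.abs(err)),
    }

def precision(tp, fp):
    """Pixel-level precision for the segmentation masks."""
    return tp / (tp + fp)
```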

3. Results

3.1. Evaluation and Analysis of the Kc Estimation Model

(1)
Evaluation of growth trend classification algorithm
Our dataset contains images of the maize canopy under different lighting conditions and at different growth stages, which increases the accuracy of the model. As shown in Figure 8, with transfer learning the efficientnet classification network approaches 100% accuracy by epoch 3, whereas without transfer learning it reaches only 97.3% at epoch 8. In terms of loss rate, transfer learning achieves a loss rate of 0% by epoch 2, whereas without transfer learning the loss rate is still 10.3% at epoch 9. By comparison, the Vit-tiny classification network with transfer learning reaches 100% accuracy at epoch 5 and 99.7% at epoch 8 without it; in terms of loss rate, its initial loss with transfer learning is 29.3% lower than without.
As can be seen from Figure 9, before transfer learning the efficientnet classification network scores 0.5 for the full-cover judgment and 0.83 for the under-full judgment; after transfer learning, both reach 1.0. For Vit-tiny, the full-cover judgment reaches 1.0 and the under-full judgment 0.83 before transfer learning, and both reach 1.0 afterwards. In summary, we choose efficientnet as the base network of PGC-Classification.
(2)
Comparative analysis of image segmentation algorithms
To verify the effects of different segmentation models, we compared the improved model with U2net, Deeplabv3 [26], and a fully convolutional network (FCN). In Figure 10, the first row is the input image, and the following rows are the segmentation prediction renderings of the improved U2net, U2net, Deeplabv3, and FCN, respectively. As the figure shows, FCN can segment the maize leaf outline but also treats some weeds and pipelines as foreground. Deeplabv3 segments the maize outline better but mistakenly segments the pipelines. U2net segments the maize images well, but some details are lost, and mis-segmentation occurs at leaf tips and where leaves intersect weeds. The CBAM-U2net network handles these problems better.
Under the same configuration environment, we used the different segmentation algorithms, namely the improved U2net, U2net, Deeplabv3, and FCN, to segment and predict maize images. The improved U2net network has clear advantages in mean intersection over union (MIoU), precision, and computing speed. As shown in Table 4, its MIoU is 93.76%, 1.95% higher than the unimproved U2net; its precision is 2.07% higher, its floating-point operations are 5 × 10⁹ higher, and its training time is 0.74 h longer. Although the training time and parameter count rise slightly, the prediction accuracy improves markedly, so it predicts better than the previous model. Compared with the Deeplabv3 and FCN segmentation networks, its MIoU is 5.08% and 6.08% higher, respectively, and its precision 6.01% and 9.29% higher. Its segmentation frame rate is slightly slower due to the larger parameter count, and its training time is also slightly longer than both. Overall, although some time is sacrificed, it outperforms the unimproved U2net and the other two segmentation networks.
(3)
Kc estimation performance analysis
By using the improved deep learning segmentation algorithm to separate maize from background, weed interference was effectively reduced, and maize PGC results from 16 June 2022 to 3 August 2022 were obtained, as shown in Figure 11. For 336 Nong kenuo, the PGC at the emergence stage (16 June 2022), 4th leaf (23 June 2022), 5th leaf (1 July 2022), and 6th leaf (7 July 2022) was 0.94%, 2.28%, 5.74%, and 15.8%, respectively. The PGC of 368 Nong keyu in the four stages was 1.02%, 2.6%, 7.45%, and 17.6%, and that of 909 Jing ketian was 0.88%, 2.45%, 6.62%, and 12.9%. After 25 July, the PGC of every maize variety was 100%.
A quadratic function is used to fit the true values of PGC and Kc; the fitted model is given in Equation (10). The correlation coefficient of the coverage–Kc fit reaches a maximum of 0.901. As shown in Table 5, the RMSE between the estimated and true values is smallest in the 5th to 6th leaf period, at 0.021, and largest in the 6th leaf to VT period, at 0.069. The minimum MAE and MSE are also recorded in the 6th leaf to VT period, at 0.001 and 0.032, respectively. Overall, the average RMSE, MSE, and MAE between the estimated and true Kc values are 0.053, 0.089, and 0.008, respectively. These errors are low, proving that our Kc estimation model is reliable.
$$K_c = 0.696 + 2.184\,PGC - 2.003\,PGC^2 \qquad (10)$$
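For illustration, the fitted model and the final ETc combination can be expressed directly in code; this sketch simply transcribes Equations (9) and (10).

```python
def kc_from_pgc(pgc):
    """Fitted quadratic Kc-PGC model, Eq. (10); pgc is a fraction in [0, 1]."""
    return 0.696 + 2.184 * pgc - 2.003 * pgc ** 2

def estimate_etc(pgc, eto):
    """ETc = Kc x ETo, combining the vision and meteorological branches."""
    return kc_from_pgc(pgc) * eto
```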

3.2. ETo Model Evaluation and Analysis

We conducted a correlation analysis between the nine meteorological factors and ETo, as shown in Figure 12, and obtained the importance ranking: Rad (0.903) > AT (0.716) > AH (0.533) > MWS (0.225) > AWS (0.107) > WD (0.07) > MinWS (0.05) > Rainfall (0.03) > AP (0.012). Four input combinations were designed by degree of importance: the first (M1) includes Rad, AT, AH, MWS, and AWS; the second (M2) includes Rad, AT, AH, and MWS; the third (M3) includes Rad, AT, and AH; and the fourth (M4) includes Rad and AT. Using SVR, LSTM, GWO_SVM, and Optuna-LSTM as models, the four combinations were fed into each machine learning model to verify the default calculation of ETo; a sketch of this combination-building step follows below.
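Here is a short sketch of that ranking-and-grouping step, assuming a pandas DataFrame holding the nine parameters and a PM-derived ETo column; the column names and the function itself are assumptions for illustration.

```python
import pandas as pd

def build_input_combos(df: pd.DataFrame, target: str = "ETo") -> dict:
    """Rank features by |Pearson r| with ETo and build the nested
    M1-M4 input combinations (top 5, 4, 3, and 2 features)."""
    corr = df.corr(numeric_only=True)[target].drop(target).abs()
    ranked = corr.sort_values(ascending=False).index.tolist()
    return {f"M{i}": ranked[:n] for i, n in enumerate([5, 4, 3, 2], start=1)}
```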
As can be seen from Figure 13, across the different input variable conditions, the R2 between the Optuna-LSTM estimates and the true ETo always exceeds 0.93, the RMSE is below 0.06, and the MSE and MAE are below 0.004 and 0.035, respectively; in a comprehensive evaluation, it outperforms the other three algorithms. Among the input combinations, M2 gives the four machine learning algorithms an average R2 of 0.931, higher than the other three combinations, and an average RMSE of 0.051, lower than the other three; its MAE and MSE differ little from the other combinations, so M2 is the optimal input combination. With the M2 input, the R2 of the Optuna-LSTM model reaches 0.953. With inputs M1, M3, and M4, the best-performing algorithm is likewise Optuna-LSTM, with R2 exceeding 0.93.
As shown in Table 6, as the input variables decrease, the errors of each algorithm increase, although the changes in MAE and MSE are not obvious. The MAE of the M1-Optuna-LSTM and M2-Optuna-LSTM groups is 0.033 in both cases and their RMSEs are close (0.051 and 0.049), but the MSE of the M2 group is 0.001 lower than that of the M1 group, meaning the M2-Optuna-LSTM estimate is more accurate; the M2 group also requires one fewer input than M1. Overall, M2-Optuna-LSTM is the optimal algorithm model.

3.3. Estimation and Verification of ETc

We multiplied the ETo output of the optimal M2-Optuna-LSTM model by the estimated Kc value to obtain the estimated ETc value and compared it to the true lysimeter ETc. As shown in Figure 14, we analyzed the correlation and error between the estimated and true values and found that the R2 reached 0.918. As shown in Table 7, the RMSE in every growth stage is below 0.02; the 5th leaf–6th leaf stage is lowest at only 0.009, the 4th leaf–5th leaf stage is highest at 0.019, and the overall RMSE is 0.014. MAE and MSE decline after the 5th leaf–6th leaf stage, reaching their lowest in the 6th leaf–VT stage at 0.003 and 0.055, respectively, proving that our ETc estimation model performs well across growth stages. Based on this research, a mobile application could be developed that integrates image data and meteorological data in real time to provide irrigation recommendations, using machine learning algorithms to adjust irrigation plans based on real-time data and optimize the use of water resources.

4. Discussion

Maize water demand changes with meteorological conditions, planting methods, and other factors. To address the low accuracy and poor real-time performance of current maize water demand estimation, we used the Kc calculation as theoretical guidance together with machine learning and machine vision, analyzing Kc and ETo separately and thereby establishing a real-time ETc estimation model. During the experiment, we found that Kc was positively correlated with PGC, consistent with the findings of Fernández-Pacheco et al. [27]. In PGC extraction, the CBAM-U2net and U2net segmentation results were better than the other two algorithms, with accuracy reaching 95.68% and 93.61%, respectively, perhaps because the saliency segmentation network captures global contextual image features [28]. Interfering items are effectively eliminated, and accuracy rises by 2.07% after the attention mechanism is introduced, possibly because attention focuses on the fine texture differences between crops and weeds, effectively reducing the false detection rate. To further improve the PGC recognition rate (effectively removing weed interference), we will focus on adding attention mechanisms to deep learning algorithms in the future to ensure the accuracy of Kc. Because Rad is positively correlated with air temperature, air temperature is of great significance for estimating ETo [29]; in our correlation analysis between meteorological parameters and ETo, the correlation coefficients of Rad (0.903) and AT (0.716) were relatively high, verifying this conclusion. Across the different meteorological parameter combinations, the MAE and MSE of the four algorithm models differ little, but the R2 and RMSE differ considerably, showing that default-input model fits can differ; these error differences may be due to the algorithms exploiting the dataset's features to different degrees. Lykhovyd et al. proposed that existing data can be treated as a time series, extracting the salient features of historical data to estimate future values with higher accuracy [30]. Meteorological data form a continuous, uninterrupted time series, so the LSTM family of algorithms achieves higher accuracy: the R2 of Optuna-LSTM and LSTM exceeds 0.93 and the MSE of both is below 0.01, better than the GWO_SVM and SVR algorithms, indicating that feeding our meteorological time series into LSTM-type algorithms calculates ETo more accurately, because these algorithms make better use of historical meteorological data. The basic LSTM model improves by 0.018 compared with the SVR algorithm and by 0.019 compared with GWO_SVM, and the LSTM with Optuna improves by a further 0.017. Therefore, to improve model accuracy on continuous time series datasets, we should pay more attention to the LSTM family of algorithms and use them as base algorithms for improvement.
In the end, the error in estimating water demand with the real-time Kc monitoring method and the default machine learning method was only 0.014, proving that our method is effective. Because the inputs to our water demand estimation model are only simple overhead pictures and four common meteorological parameters, conditions met by taking pictures with a mobile phone and using its built-in weather component, mobile apps can be developed from our method in the future, and real-time irrigation water demand can be obtained through simple online operations. At present, we calculate crop water requirements on a small scale based on mobile phone cameras; in the next stage, we will use fixed spherical cameras to estimate crop water requirements on a larger scale.

5. Conclusions

In this study, we propose a method for estimating maize water requirements using low-cost image acquisition. Our deep learning CBAM-U2net sub-model achieves a coverage recognition accuracy of 95.68%, and the RMSE of the Kc real-time estimation model, which takes PGC as input, is a mere 0.053. Secondly, we identified the four most correlated meteorological parameters as Rad (0.903), AT (0.716), AH (0.533), and MWS (0.225), with AWS (0.107) next. Using combinations of these meteorological parameters in order of importance, we verified the effect on the default calculation of ETo: the M2-Optuna-LSTM algorithm, which takes four meteorological parameters as input, performed best, with an R2 of 0.953 and an RMSE of only 0.049. Finally, by integrating the real-time estimated Kc values with the ETo values output by the M2-Optuna-LSTM model, we established an ETc calculation model that estimates ETc with an R2 of 0.918 against the true value and an RMSE of just 0.014. This proves that our maize water demand estimation model, based on low-cost mobile phone photography and meteorological parameters, can accurately estimate ETc.

Author Contributions

Conceptualization, W.Z. and F.S.; software, S.Z.; validation, T.L.; formal analysis, J.L.; data curation, J.T.; writing—original draft preparation, J.Z.; writing—review and editing, W.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National R&D program (2022YFD200160302), and the Project of Beijing Academy of Agricultural Sciences (Grant No. YXQN202304), Key Laboratory of Smart Agricultural Technology (Yangtze River Delta), Ministry of Agriculture and Rural Affairs, China (KSAT-YRD2023007).

Data Availability Statement

Data are contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. UNICEF. The State of Food Security and Nutrition in the World (SOFI) Report—2023; United Nations: New York, NY, USA, 2023. [Google Scholar]
  2. UNESCO World Water Assessment Programme. The United Nations World Water Development Report 2023: Partnerships and Cooperation for Water; United Nations: New York, NY, USA, 2023. [Google Scholar]
  3. Qing, Z. A Brief Discussion on High-yielding maize Cultivation Techniques. Agric. Technol. Dev. 2023, 2, 182–184. [Google Scholar]
  4. Kumari, A.; Upadhyaya, A.; Jeet, P.; Al-Ansari, N.; Rajput, J.; Sundaram, P.K.; Saurabh, K.; Prakash, V.; Singh, A.K.; Raman, R.K.; et al. Estimation of Actual Evapotranspiration and Crop Coefficient of Transplanted Puddled Rice Using a Modified Non-Weighing Paddy Lysimeter. Agronomy 2022, 12, 2850. [Google Scholar] [CrossRef]
  5. Qiu, R.; Li, L.; Liu, C.; Wang, Z.; Zhang, B.; Liu, Z. Evapotranspiration estimation using a modified crop coefficient model in a rotated rice-winter wheat system. Agric. Water Manag. 2022, 264, 107501. [Google Scholar] [CrossRef]
  6. Sharma, V.; Irmak, S. Soil-water dynamics, evapotranspiration, and crop coefficients of cover-crop mixtures in seed maize cover-crop rotation fields. II: Grass-reference and alfalfa-reference single (normal) and basal crop coefficients. J. Irrig. Drain. Eng. 2017, 143, 04017033. [Google Scholar] [CrossRef]
  7. Pereira, L.S.; Paredes, P.; Melton, F.; Johnson, L.; Wang, T.; López-Urrea, R.; Cancela, J.J.; Allen, R.G. Prediction of crop coefficients from a fraction of ground cover and height. Background and validation using ground and remote sensing data. Agric. Water Manag. 2020, 241, 106197. [Google Scholar] [CrossRef]
  8. Malachy, N.; Zadak, I.; Rozenstein, O. Comparing methods to extract crop height and estimate crop coefficient from UAV imagery using structure from motion. Remote Sens. 2022, 14, 810. [Google Scholar] [CrossRef]
  9. Goffin, B.D.; Thakur, R.; Carlos, S.D.C.; Srsic, D.; Williams, C.; Ross, K.; Neira-Román, F.; Cortés-Monroy, C.C.; Lakshmi, V. Leveraging remotely-sensed vegetation indices to evaluate crop coefficients and actual irrigation requirements in the water-stressed Maipo River Basin of Central Chile. Sustain. Horiz. 2022, 4, 100039. [Google Scholar] [CrossRef]
  10. Rajput, J.; Singh, M.; Lal, K.; Khanna, M.; Sarangi, A.; Mukherjee, J.; Singh, S. Data-driven reference evapotranspiration (ET0) estimation: A comparative study of regression and machine learning techniques. Environ. Dev. Sustain. 2023, 26, 12679–12706. [Google Scholar] [CrossRef]
  11. Wen, X.; Si, J.; He, Z.; Wu, J.; Shao, H.; Yu, H. Support-vector-machine-based models for modeling daily reference evapotranspiration with limited climatic data in extreme arid regions. Water Resour. Manag. 2015, 29, 3195–3209. [Google Scholar] [CrossRef]
  12. Baratto, P.F.B.; Cecílio, R.A.; de Sousa Teixeira, D.B.; Zanetti, S.S.; Xavier, A.C. Random forest for spatialization of daily evapotranspiration (ETo) in watersheds in the Atlantic Forest. Environ. Monit. Assess. 2022, 194, 449. [Google Scholar] [CrossRef]
  13. Allen, R.G.; Pereira, L.S.; Raes, D.; Smith, M. Crop Evapotranspiration-Guidelines for Computing Crop Water Requirements; FAO Irrigation and Drainage Paper 56; Food and Agriculture Organization of the United Nations: Rome, Italy, 1998. [Google Scholar]
  14. Raza, S.; Das, B.; Chaudhry, R.; Goyal, V.; Lodha, R.; Sood, S.; Gautam, H.; Kapil, A. Efficiency of Real-Time PCR in the Diagnosis of Community-Acquired Bacterial Meningitis in Children. J. Microbiol. Infect. Dis. 2022, 12, 47–53. [Google Scholar] [CrossRef]
  15. Zhao, C.; Zhu, W.; Feng, S. Superpixel guided deformable convolution network for hyperspectral image classification. IEEE Trans. Image Process. 2022, 31, 3838–3851. [Google Scholar] [CrossRef]
  16. Niu, Z.; Zhong, G.; Yu, H. A review on the attention mechanism of deep learning. Neurocomputing 2021, 452, 48–62. [Google Scholar] [CrossRef]
  17. Zhuang, C.; Lu, Z.; Wang, Y.; Xiao, J.; Wang, Y. ACDNet: Adaptively combined dilated convolution for monocular panorama depth estimation. In Proceedings of the AAAI Conference on Artificial Intelligence, Vancouver, BC, Canada, 29 January 2022; Volume 36, pp. 3653–3661. [Google Scholar]
  18. Schmidt, C.; Athar, A.; Mahadevan, S.; Leibe, B. D2conv3d: Dynamic dilated convolutions for object segmentation in videos. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Waikoloa, HI, USA, 8 January 2022; pp. 1200–1209. [Google Scholar]
  19. Qin, X.; Zhang, Z.; Huang, C.; Dehghan, M.; Zaiane, O.R.; Jagersand, M. U2-Net: Going deeper with nested U-structure for salient object detection. Pattern Recognit. 2020, 106, 107404. [Google Scholar] [CrossRef]
  20. Sanjar, K.; Bekhzod, O.; Kim, J.; Kim, J.; Paul, A.; Kim, J. Improved U-Net: Fully convolutional network model for skin-lesion segmentation. Appl. Sci. 2020, 10, 3658. [Google Scholar] [CrossRef]
  21. El Bilali, A.; Abdeslam, T.; Ayoub, N.; Lamane, H.; Ezzaouini, M.A.; Elbeltagi, A. An interpretable machine learning approach based on DNN, SVR, Extra Tree, and XGBoost models for predicting daily pan evaporation. J. Environ. Manag. 2023, 327, 116890. [Google Scholar] [CrossRef]
  22. Ashawa, M.; Douglas, O.; Osamor, J.; Jackie, R. Improving cloud efficiency through optimized resource allocation technique for load balancing using LSTM machine learning algorithm. J. Cloud Comput. 2022, 11, 1–17. [Google Scholar] [CrossRef]
  23. Jiang, Y.; Dai, P.; Fang, P.; Zhong, R.Y.; Zhao, X.; Cao, X. A2-LSTM for predictive maintenance of industrial equipment based on machine learning. Comput. Ind. Eng. 2022, 172, 108560. [Google Scholar] [CrossRef]
  24. Yan, X.; Lin, Z.; Lin, Z.; Vucetic, B. A Novel Exploitative and Explorative GWO-SVM Algorithm for Smart Emotion Recognition. IEEE Internet Things J. 2023, 10, 9999–10011. [Google Scholar] [CrossRef]
  25. Hanifi, S.; Lotfian, S.; Zare-Behtash, H.; Cammarano, A. Offshore wind power forecasting—A new hyperparameter optimisation algorithm for deep learning models. Energies 2022, 15, 6919. [Google Scholar] [CrossRef]
  26. Yurtkulu, S.C.; Şahin, Y.H.; Unal, G. Semantic segmentation with extended DeepLabv3 architecture. In Proceedings of the 2019 27th Signal Processing and Communications Applications Conference (SIU), IEEE, Sivas, Turkey, 24–26 April 2019; pp. 1–4. [Google Scholar]
  27. Fernández-Pacheco, D.G.; Escarabajal-Henarejos, D.; Ruiz-Canales, A.; Conesa, J.; Molina-Martínez, J.M. A digital image-processing-based method for determining the crop coefficient of lettuce crops in the southeast of Spain. Biosyst. Eng. 2014, 117, 23–34. [Google Scholar] [CrossRef]
  28. Wu, W.; Liu, G.; Liang, K.; Zhou, H. Inner Cascaded U²-Net: An Improvement to Plain Cascaded U-Net. CMES-Comput. Model. Eng. Sci. 2023, 134, 1323–1335. [Google Scholar] [CrossRef]
  29. Villa, M.; Dardenne, G.; Nasan, M.; Letissier, H.; Hamitouche, C.; Stindel, E. FCN-based approach for the automatic segmentation of bone surfaces in ultrasound images. Int. J. Comput. Assist. Radiol. Surg. 2018, 13, 1707–1716. [Google Scholar] [CrossRef]
  30. Lykhovyd, P. Comparing Reference Evapotranspiration Calculated in ETo Calculator (Ukraine) Mobile App with the Estimated by Standard FAO-Based Approach. Agri. Eng. 2022, 4, 747–757. [Google Scholar] [CrossRef]
Figure 1. Test site selection.
Figure 2. Overhead shots of maize varieties at different growth stages.
Figure 3. Weighing lysimeter data acquisition system.
Figure 4. Model framework diagram.
Figure 5. PGC-Classification framework diagram.
Figure 6. CBAM-U2net framework diagram.
Figure 7. ETo estimation model diagram.
Figure 8. Accuracy rate and loss rate before and after transfer learning.
Figure 9. Classification network heat map.
Figure 10. Comparison of the effects of different segmentation networks: (a) input; (b) CBAM-U2net; (c) U2net; (d) Deeplabv3; (e) FCN.
Figure 11. PGC changes in three maize varieties at different stages: (a) after emergence stage; (b) after 4th leaf; (c) after 5th leaf; (d) after 6th leaf.
Figure 12. Correlation analysis between meteorological parameters and ETo true values: (a) air temperature; (b) air humidity; (c) rainfall; (d) maximum wind speed; (e) minimum wind speed; (f) average wind speed; (g) wind direction; (h) Rad; (i) atmospheric pressure.
Figure 13. ETo estimation under different meteorological parameter inputs.
Figure 14. Comparison between real and estimated values of ETc.
Table 1. Weighing equipment parameters.

| Parameter | Value |
|---|---|
| Measurement parameter | Weight |
| Measuring range | 10 kg/20 kg/50 kg |
| Accuracy | ±0.02% R.O. |
| Maximum countertop | 50 cm × 40 cm |
| Waterproof grade | IP68 |
| Safe load limit | 150% |
| Output signal | 4–20 mA |
| Working current | ≤40 mA |
Table 2. Dataset status.

| Dataset | Emergence–4th Leaf | 4th–5th Leaf | 5th–6th Leaf | 6th Leaf–VT | VT– |
|---|---|---|---|---|---|
| Image dataset | 719 | 501 | 503 | 1580 | 319 |
| Environmental dataset | 2160 | 1510 | 1508 | 4703 | 1057 |
Table 3. Model structure (encode and decode stages of U2net (ours); I, M, and O are the input, middle, and output channel counts of each RSU block).

| Stage | Encode_1 | Encode_2 | Encode_3 | Encode_4 | Encode_5 | Encode_6 | Decode_5 | Decode_4 | Decode_3 | Decode_2 | Decode_1 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Block | RSU-7 | RSU-6 | RSU-5 | RSU-4 | RSU-4F | RSU-4F | RSU-7 | RSU-7 | RSU-7 | RSU-7 | RSU-7 |
| I | 3 | 64 | 128 | 256 | 512 | 512 | 1024 | 1024 | 512 | 256 | 128 |
| M | 32 | 32 | 32 | 32 | 32 | 32 | 32 | 32 | 32 | 32 | 16 |
| O | 64 | 64 | 64 | 64 | 64 | 64 | 64 | 64 | 64 | 64 | 64 |
Table 4. Performance comparison of different segmentation networks.

| Model | MIoU (%) | Precision (%) | FLOPs (×10⁹) | FPS | Training Time (h) |
|---|---|---|---|---|---|
| U2net (ours) | 93.76 | 95.68 | 118 | 14.1 | 9.86 |
| U2net | 91.81 | 93.61 | 113 | 15.7 | 9.12 |
| Deeplabv3 | 88.68 | 89.67 | 83 | 16.3 | 8.86 |
| FCN | 87.68 | 86.39 | 79 | 16.9 | 7.63 |
Table 5. Error evaluation and analysis of the Kc model in different growth periods.

| Growth Period | MSE | MAE | RMSE |
|---|---|---|---|
| Emergence–4th leaf | 0.11 | 0.012 | 0.033 |
| 4th leaf–5th leaf | 0.114 | 0.013 | 0.035 |
| 5th leaf–6th leaf | 0.089 | 0.008 | 0.021 |
| 6th leaf–VT | 0.032 | 0.001 | 0.069 |
| Overall | 0.089 | 0.008 | 0.053 |
Table 6. Error evaluation of the four machine learning models under the different input combinations.

| Class | Model | MSE | MAE | RMSE |
|---|---|---|---|---|
| M1 | LSTM | 0.003 | 0.041 | 0.057 |
| M1 | SVR | 0.002 | 0.036 | 0.048 |
| M1 | Optuna-LSTM | 0.003 | 0.033 | 0.051 |
| M1 | GWO_SVM | 0.002 | 0.036 | 0.048 |
| M2 | LSTM | 0.003 | 0.041 | 0.057 |
| M2 | SVR | 0.002 | 0.036 | 0.048 |
| M2 | Optuna-LSTM | 0.002 | 0.033 | 0.049 |
| M2 | GWO_SVM | 0.002 | 0.036 | 0.048 |
| M3 | LSTM | 0.004 | 0.043 | 0.063 |
| M3 | SVR | 0.002 | 0.035 | 0.049 |
| M3 | Optuna-LSTM | 0.003 | 0.035 | 0.057 |
| M4 | LSTM | 0.004 | 0.038 | 0.061 |
| M4 | SVR | 0.003 | 0.035 | 0.05 |
| M4 | Optuna-LSTM | 0.003 | 0.033 | 0.056 |
| M4 | GWO_SVM | 0.003 | 0.035 | 0.048 |
Table 7. Error evaluation and analysis of the ETc estimation model in different growth periods.

| Growth Period | MSE | MAE |
|---|---|---|
| Emergence–4th leaf | 0.078 | 0.006 |
| 4th leaf–5th leaf | 0.084 | 0.007 |
| 5th leaf–6th leaf | 0.063 | 0.004 |
| 6th leaf–VT | 0.055 | 0.003 |
| Overall | 0.045 | 0.002 |