Article

Assessment of Leaf Area and Biomass through AI-Enabled Deployment

1 Skolkovo Institute of Science and Technology, 121205 Moscow, Russia
2 Michurinsk State Agrarian University, 393760 Michurinsk, Russia
3 Sberbank of Russia, 117312 Moscow, Russia
* Author to whom correspondence should be addressed.
Eng 2023, 4(3), 2055-2074; https://doi.org/10.3390/eng4030116
Submission received: 23 June 2023 / Revised: 18 July 2023 / Accepted: 19 July 2023 / Published: 25 July 2023
(This article belongs to the Special Issue Artificial Intelligence and Data Science for Engineering Improvements)

Abstract

Leaf area and biomass are important morphological parameters for in situ plant monitoring, since leaves are vital for perceiving and capturing environmental light and reflect overall plant development. The traditional approach to measuring leaf area and biomass is destructive: it requires manual labor and may damage the plants. In this work, we report an AI-based approach for assessing and predicting leaf area and plant biomass. The proposed approach estimates and predicts overall plant biomass at an early stage of growth in a non-destructive way. To this end, we equipped an industrial cucumber-growing greenhouse with commercial off-the-shelf environmental sensors and video cameras. The sensor data are used to monitor the environmental conditions in the greenhouse, while the top-down images are used to train Fully Convolutional Neural Networks (FCNN). The FCNN performs the segmentation task for leaf area calculation, reaching 82% accuracy. Applying the trained FCNNs to sequences of camera images allowed the reconstruction of per-plant leaf area and its growth dynamics. We then established the dependency between average leaf area and biomass using direct biomass measurements. This, in turn, allowed the reconstruction and prediction of biomass growth dynamics in the greenhouse from image data, with a 10% average relative error over a 12-day prediction horizon. The actual deployment showed the high potential of the proposed data-driven approach for assessing and predicting plant growth dynamics. Moreover, it narrows the gap toward fully closed autonomous greenhouses that safeguard harvests and plant biological safety.

1. Introduction

Recent economic forecasts indicate that the world population will reach about 8.5 billion people by 2030. Population growth imposes strict requirements on food availability: there should be enough food in terms of quantity, and food supplies are expected to double by that time. Apart from the significant growth of the Earth's population, there are two further problems: (i) urbanization and (ii) the delivery of inexpensive fresh vegetables and fruits to remote areas. Therefore, food production in greenhouses and urban farming is becoming a promising tool to tackle these problems.
Precision agriculture and plant growth in controlled artificial conditions have become state-of-the-art research and engineering areas addressing the increasing demand for food in the near future [1]. The development of autonomous greenhouse systems has gained global momentum, partly driven by the disruptions of the COVID-19 pandemic. For the above-mentioned reasons, Wireless Sensor Networks (WSN), an enabling technology for the Internet of Things (IoT) [2], can contribute to this process and support more efficient, automated and sustainable crop production.
Successful deployments of WSNs in agriculture-related areas [3,4] and the application of machine learning algorithms [5,6] on embedded and mobile devices [7] promise a bright future for intelligent deployments.
In fact, successfully growing vegetables and operating an industrial greenhouse requires an experienced and highly skilled grower, capable of establishing the typical greenhouse routine: setting optimal artificial light, watering, feeding, etc. All these factors are assessed by the experienced grower when taking decisions that guarantee optimal production. However, the grower's decisions are based on prior experience rather than on science. Adding intelligent sensing to support these decisions can positively impact agriculture: it helps identify the ideal time to harvest, improve crop yields, reduce operational costs and ensure better resource management [8].
The first intelligent sensors joined in WSNs performed environmental monitoring tasks, including active volcano monitoring [9], zebra migration monitoring [10] and building monitoring [11]. However, these WSNs were not always truly intelligent and did not always possess actuation capabilities: a WSN deployed for wildfire detection was rapidly destroyed by the fire, which was extinguished only at its final stage [12]. Hence, there was a clear need to equip WSNs with actuation capabilities and integrate them into existing monitoring infrastructure.
In terms of agriculture, a sensor-actuator WSN was deployed in a greenhouse for automatic climate control [3].
Significant progress in Artificial Intelligence (AI) and machine learning, together with the opportunity to squeeze models onto embedded systems [7], has helped WSNs evolve for precision agriculture. A recent WSN deployment for tomato monitoring engages reinforcement learning for optimal artificial light control in a greenhouse [4]. Still, there is a lack of agricultural WSN deployments and collected datasets, which serve as a driving force for the further development of AI-enabled precision agriculture.
In this work, we propose a data-driven enhancement for plant growth dynamics modelling and prediction in an artificial growth system. We report on a WSN deployment consisting of off-the-shelf components: low-power sensors, video cameras and a server in a 720 m² industrial greenhouse for cucumber growing. The goal of this deployment is the assessment and prediction of the cucumbers' growth dynamics using non-invasive real-time intelligent systems, which is essential for selecting optimal growth regimes in an artificial environment. To put this goal into practice, we collected a dataset of top-down images of cucumber leaves while using the sensor data for environmental data collection and the detection of anomalies, e.g., sudden and unpredictable significant changes in temperature and humidity. We used Fully Convolutional Neural Networks (FCNN) for the segmentation task underlying leaf area calculation, and manual biomass measurements to find the correlation between leaf area and biomass.
The novelty of this paper is threefold. First, to the best of our knowledge, we are the first to demonstrate the correlation between cucumber leaf area and biomass using computer vision in industrial-scale experiments, and to use this dependency for predicting biomass from 2D imagery data. This approach can significantly contribute to in situ growing optimization, since conditions like shading or sudden temperature change [13] may have severe consequences for growth and yield. Second, as a result of this work and its validation on a real deployment, a state-of-the-art dataset will be available to the research community. Third, we apply FCNNs, which have been successfully used in a number of industrial applications, to the segmentation task. This procedure is integrated into the greenhouse infrastructure and can be scaled to other greenhouses for plant growing.
The paper is organized as follows: we introduce the reader to relevant work in the area in Section 2. In Section 3 we discuss the methodology used in this research. Afterwards, we detail the deployment in terms of hardware, software, plants and greenhouse facility in Section 4. Next, data analysis is presented in Section 5, where the methods and results are provided. In Section 6 we discuss the limitations of our work and specific findings. Finally, we provide concluding remarks and highlight future work in Section 7.

2. Related Work

Originally, the evaluation of plant growth dynamics in controlled artificial environments was a top-priority problem in the development of life support systems for space and related ground applications [14]. Although it originated in space technology, the development of closed, controlled artificial systems for precision agriculture is highly relevant today. It is expected to secure food provision for the increasing Earth population and for people living in remote areas or harsh environments [15].
In this section, we summarise successful, though still limited, attempts at agricultural WSN deployments. We also report on relevant research in terms of leaf area calculation and modelling.

2.1. Deployments

Among the early attempts to empower farmers with intelligent sensing and WSNs was the Internet of Tomatoes (yet another 'IoT' acronym) project. The project was led by Analog Devices, and the idea behind it was to build a complete sensor-to-cloud solution helping tomato farmers in New England make smarter decisions while growing vegetables [16].
Another WSN deployment was aimed at controlling the climate conditions in a greenhouse [3]. The authors reported on the deployment of tiny sensing devices at different heights for measuring temperature and relative humidity. The actuators start operating to keep the predefined settings of the greenhouse when threshold values are violated. This idea is realized through tiny wireless sensors that can be deployed anywhere in the greenhouse and perform monitoring tasks. They operate in a low duty cycle mode and periodically send the data to the user/cloud for analysis or storage. A similar solution is implemented in the WSN deployment for strawberry monitoring in a greenhouse [17]. The authors designed a WSN architecture where, in contrast to [16], the data analysis is carried out at the edge of the network. The goal of this deployment is to optimize the irrigation system and guarantee proper moisture conditions for strawberry growth.
However, most WSN-based solutions lack 'cognitive' abilities due to the limited computational and power resources of embedded systems. The lion's share of these deployments operate as rule-based systems rather than attempting to adapt to the environment or find optimal growth conditions.
An example of an intelligent WSN deployment for tomato growth in a greenhouse is demonstrated in [4]. It is a fairly large 5000 m² deployment where a WSN collects environmental data and sends them to a server over a wireless channel. The entire system helps adjust the artificial light through reinforcement learning, which results in reduced operational costs.
At present, there is an ongoing discussion about the best place to realize AI-based intelligent capabilities for agricultural monitoring systems, and WSNs in particular: the cloud [18] or the edge [19]. Both options have their pros and cons, but deployments and collected datasets are in high demand in the research community. In the proposed solution, we run the FCNN on a local server.

2.2. Leaves Modelling

As noted earlier, AI has penetrated monitoring applications in agriculture. For example, machine learning techniques have been applied for modelling and estimating grasslands in Ireland [20]. The authors proposed and designed three models, including an Adaptive Neuro-Fuzzy Inference System (ANFIS), an Artificial Neural Network (ANN) and Multiple Linear Regression (MLR), for this estimation task by processing image data.
In fact, the 2D approach is typically used when the plant possesses large leaves and a reasonably simple structure [21]. Unfortunately, this approach often relies on sophisticated software for image analysis and suffers from other disadvantages, e.g., leaf overlap and concavity. Another solution for plant digitization is based on laser scanning, which has been applied to forestry and the statistical analysis of canopies [22]. Its application in real scenarios is limited to extracting single-plant attributes because the tasks are computationally intensive.
Another set of approaches is based on 3D imaging, which captures the plant shape in three dimensions for study. It may seem tempting to process 3D images of plant growth, as they carry more information about the plant structure than 2D images, but systems for acquiring precise 3D imaging data are typically much more expensive than 2D imaging systems. Laser scanning has likewise been applied to forestry and the statistical analysis of canopies [22,23], although, as noted above, its computational cost limits it to extracting single-plant attributes. A 3D scanning system for taking quick and accurate images is proposed in [24]. The study involves two tilting cameras and methods for camera calibration and background removal. A similar approach, where the authors use a robotic arm equipped with a 3D imaging system for 3D plant growth measurement, is proposed in [25]. The drawback of this research is that it requires much time for processing and data recording. A semiautomatic 3D imaging system for plant modeling is reported in [26]. The bottom line of this research is to combine reconstructed 3D points and images to guarantee more effective segmentation of the data into individual leaves. Although the 3D imaging approach is becoming popular, image acquisition for 3D reconstruction is typically carried out manually [27]. In one of the most recent works, a 3D phenotyping platform for laboratory experiments was successfully developed and a 3D dataset coupled with environmental information was obtained [28]. Different approaches to 3D plant reconstruction are described in [25,29,30]. There is also research on the evaluation of biomass from 2D imagery, especially in the remote sensing area [31,32].
However, these studies evaluate broad areas from a bird's-eye view and do not assess individual plants or individual leaf area and their contribution to the overall plant biomass. In the proposed deployment, by contrast, we rely on 2D data collection with the application of an FCNN for image analysis and detailed biomass prediction. In the following sections, we detail our contribution and share our experience.

3. Methodology

The main idea of our research is to develop an intelligent computer-vision-based system providing robust and accurate prediction of plant biomass growth dynamics. As sensors, we use digital cameras generating images for further analysis and environmental sensors for monitoring specific anomalies. The methodology of the conducted research is the following:
  • To deploy and tune a set of sensors and digital cameras. The sensors and cameras are installed in the industrial greenhouse and collect data describing the plant growth dynamics (images) and environmental parameters.
  • To set up a one-month experiment on cucumber growth and collect relevant data from sensors and cameras, as well as carry out the biomass measurements.
  • To create semantic segmentation masks with a group of annotators, producing fine-grained labels for the training of the FCNN.
  • To train and evaluate the FCNN for the segmentation task and leaf area calculation, and to calculate per-plant leaf area using sequences of the obtained images. For our study we chose U-Net as the semantic segmentation model, for the following reasons: it performs well on highly imbalanced datasets [33], it is lightweight compared to the most recent multiclass architectures, and it is easy to deploy on the low-power embedded system used in this research.
  • To derive correspondence between the leaf area and biomass.
  • To estimate the parameters of leaf area growth model and perform predictions.
  • To reconstruct and predict the biomass based on the assessed and predicted values of leaf area.

4. Deployment

The deployment shown in Figure 1 was located in a dedicated area (720 m²) for the initial plant growing stage in an industrial greenhouse facility. In this section we detail specific information about the deployment, placing special emphasis on the plants we monitor, the hardware and the software.

4.1. Plants

The cultivation of cucumber seedlings was carried out according to low-volume hydroponic technology based on growing plants in a rock wool substrate. Each seed was sown in a 10 × 10 × 6.5 cm rock wool block (Grodan Delta). All rock wool blocks were preliminarily soaked in a specially prepared nutrient solution (see Table 1). Vermiculite was sprinkled on top of the seeds to avoid additional evaporation. In total, 496 plants were sown in rock wool blocks and evenly distributed on the floating table for the experiment.
The rock wool blocks were placed on one table, 8 m long and 1.82 m wide, according to the experimental scheme (see Figure 2). The rock wool blocks were saturated with a nutrient solution by sinking them into it completely. The parameters of the fertiliser solution used for initial block saturation were: Electrical Conductivity (EC) of 1.50 mS/cm and pH in the range of 5.3–5.5. Further watering of the plants was carried out by the partial flooding method. The necessary watering time was determined by the weight of the rock wool block: a fully saturated 10 × 10 × 6.5 cm rock wool cube weighs 650–660 g, and when the weight dropped to 350–370 g, watering was carried out. As seedlings grow, they need more mineral nutrition, so the EC of the feeding solution was slightly increased during the experiment. However, once seedlings have formed four true leaves and a good root system, according to the cultivation technology they should be transplanted onto rock wool slabs, which hold 16 L of nutrient solution, i.e., 4 L per plant (4 plants per slab). In our experiment, the seedlings continued to grow in a 0.65 L cube, so further watering based on the common technology was impossible: the last few watering procedures were made with distilled water (see Figure 3). The biomass measurements taken during the experiment are shown in Figure 4. Each point contains from 11 to 82 measurements of the biomass. In total, 480 measurements were performed.
Conductivity measurements were performed on a METTLER TOLEDO conductivity meter and expressed in mS/cm. Measurements of pH were carried out on a Sartorius PB-11 instrument. The changes of EC and pH in the rock wool blocks during the experiment are shown in Table 2. Solution samples were drawn with a syringe from the rock wool blocks: two samples from the middle of each of the 3 zones. According to Table 2, the samples were taken as follows: from Zone I, samples 1 and 4; from Zone II, samples 2 and 5; and from Zone III, samples 3 and 6. These measurements ensure equal growing conditions for all plants, reducing deviations and making the obtained dataset relevant and statistically sound.
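The weight-based watering rule above can be sketched in a few lines. This is an illustrative stand-in, not the deployment's control code; the threshold constants are taken from the reported 650–660 g saturated weight and the 350–370 g trigger range:

```python
# Hypothetical sketch of the weight-based watering rule: a fully saturated
# 10 x 10 x 6.5 cm rock wool block weighs ~650-660 g, and watering is
# triggered once the weight drops to ~350-370 g. Names and exact thresholds
# are illustrative, not from the deployment code.

SATURATED_G = 655   # midpoint of the reported 650-660 g saturated weight
TRIGGER_G = 370     # upper bound of the reported 350-370 g trigger range

def needs_watering(block_weight_g: float) -> bool:
    """Return True when the rock wool block is dry enough to water."""
    return block_weight_g <= TRIGGER_G

def water_deficit_g(block_weight_g: float) -> float:
    """Solution lost (in grams) since full saturation."""
    return max(0.0, SATURATED_G - block_weight_g)
```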

4.2. Hardware

The WSN deployment consists of (i) a data collection server, (ii) WaspMote sensor nodes and (iii) Xiaomi XiaoFang 1080p cameras, organized as a WSN communicating at 2.4 GHz. The sensor nodes are based on a low-power ATmega Microcontroller Unit (MCU) and a wireless transmitter with the transmission power set to −0.77 dBm. This value was chosen based on an empirical analysis of the Received Signal Strength Indicator (RSSI), since the Link Quality Indicator (LQI) and Packet Delivery Rate (PDR) metrics are inaccessible in the industrial versions of WaspMotes. The power source is a battery pack containing three parallel 3.7 V Li-ion polymer cells with a total capacity of 6.6 Ah, which secures long-term operation without recharging. The sensor nodes include several types of sensors: temperature, PAR, humidity and CO₂. Several sensor nodes are placed evenly on each tray as shown in Figure 2. The cameras are accessed from the server using the Real Time Streaming Protocol (RTSP).

4.3. Software and Data Storage

A schematic view of our data collection system is shown in Figure 5. It consists of three main components: (i) a Flask-based HTTP server implemented in the Python programming language; (ii) the distributed task queue Celery with an in-memory Redis database as a message broker with persistence enabled; and (iii) a general-purpose schema-less Database Management System (DBMS), MongoDB, which stores unstructured data along with arbitrary binary objects using GridFS. The HTTP API server collects sensor data using a push strategy, with the sensor nodes sending measurements every 30 min. The MongoDB database stores the sensor measurements and camera images. It is hosted on a DBaaS service, which was sometimes inaccessible due to intermittent internet connectivity at the deployment site. It is therefore crucial to persistently store the received data locally, using the Celery queue with Redis as broker and task storage, and to forward the data to MongoDB once the internet connection is restored. Additionally, Celery periodically (every 30 min) receives image data from the cameras using a poll strategy. All software components were built using the Docker containerization system and are thus easy to deploy.
We used the data from the sensors to monitor the environmental conditions and prevent undesirable and abnormal plant growth. As mentioned earlier, for the development of the growth model and the validation of the methodology for biomass assessment and prediction, only leaf area measurements obtained from images were used. This was sufficient for accurate predictions of biomass under normal environmental growth conditions. There is also the opportunity to include actuators and sensor data in the plant growth model for fine-tuning the modeling process.
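The store-and-forward pattern described above (persist locally first, flush to the remote database when connectivity returns) can be illustrated with a minimal sketch. In the actual deployment this role is played by Celery with Redis persistence and a MongoDB DBaaS; the classes below are simplified, hypothetical stand-ins:

```python
# Minimal sketch of the store-and-forward pattern: every measurement is
# enqueued locally first and flushed to the remote store only when it is
# reachable. "StoreAndForward" stands in for the Celery/Redis queue and
# "FakeRemote" for the MongoDB (DBaaS) client; both are illustrative.
from collections import deque

class StoreAndForward:
    def __init__(self, remote):
        self.local = deque()   # stands in for the persistent Redis-backed queue
        self.remote = remote   # stands in for the MongoDB client

    def ingest(self, measurement):
        """Always persist locally first, then try to flush."""
        self.local.append(measurement)
        self.flush()

    def flush(self):
        while self.local and self.remote.is_online():
            item = self.local.popleft()
            if not self.remote.insert(item):
                self.local.appendleft(item)   # re-queue on failure
                break

class FakeRemote:
    """Toy remote store used to simulate intermittent connectivity."""
    def __init__(self):
        self.online = False
        self.stored = []
    def is_online(self):
        return self.online
    def insert(self, item):
        self.stored.append(item)
        return True
```

With the remote offline, measurements accumulate locally; once it comes back online, the next ingest drains the backlog, so no data are lost during connectivity gaps.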
Examples of the obtained temperature and humidity measurements corresponding to the experimental results described next are shown in Figure 6 and Figure 7. They confirm that these environmental parameters stayed within permissible ranges during the experiment. According to the research reported in [13,34], the optimal temperature conditions are 18.3–32.2 °C. The best humidity range to ensure the maximum growth rate at the initial stage is 50–70%; however, the permissible humidity range with little effect on the total yield is 35–90% [35].
We performed the Dickey–Fuller test to verify the stationarity of the growth conditions. The test yielded p-values below 0.05 for both key environmental parameters, temperature and humidity, which means that the time series are stationary.
To maintain sustainable plant growth it is important to monitor not only the absolute values of the environmental conditions but also their rate of change (first derivatives) [36,37]. Rapid changes of the environmental parameters, even within the optimal boundaries, may affect the growth dynamics and lead to abnormal plant development. Tracking these changes is only possible with distributed sensors that provide measurements at high time resolution. We performed this analysis using the obtained measurements; the results are demonstrated in Figure 8, Figure 9 and Figure 10. They show that the environmental parameters changed smoothly, resulting in normal plant growth dynamics.
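The first-derivative check can be sketched with finite differences over timestamped sensor readings. The rate limit below is illustrative, not the threshold used in the deployment:

```python
# Hedged sketch of first-derivative monitoring: estimate the rate of change
# of an environmental parameter by finite differences over timestamped
# samples, and flag intervals whose rate exceeds a limit, even when the
# absolute values remain inside the optimal bounds.

def rates(samples):
    """samples: list of (time_in_minutes, value). Returns (t, dv/dt) per interval."""
    return [
        (t1, (v1 - v0) / (t1 - t0))
        for (t0, v0), (t1, v1) in zip(samples, samples[1:])
    ]

def rapid_changes(samples, max_rate_per_min):
    """Timestamps of intervals where |dv/dt| exceeds the allowed rate."""
    return [t for t, r in rates(samples) if abs(r) > max_rate_per_min]
```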

4.4. Image Data Collection and Annotation

According to Figure 2, four digital cameras with a resolution of 1920 × 1080 were mounted 2 m above the floating table. These cameras took two images sequentially every 30 min for 31 days. A total of 2494 raw images were taken by each camera, i.e., 9976 top-down images in total. After the data cleaning procedure and selecting only the images covering the active growth stage, 4 sequences of 975 images, representing 25 days of observation per camera, were kept. These data were used for further investigation and assessment of growth dynamics. All images were flattened using calibration images to avoid distortions. In total, 248 images were annotated. The selection of 62 images out of the 975 from each of the 4 cameras for annotation was performed as follows: from each day of observation, the 3 images taken at 9:00, 15:00 and 21:00 were kept. The annotation procedure consisted of creating segmentation masks and bounding boxes for each plant in the image. Overall, 45,389 instances across the 248 images were obtained after the annotation procedure.
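The annotation subsampling rule (keep the images captured at 9:00, 15:00 and 21:00 from each day) can be expressed as a simple filter. This is an illustrative sketch that assumes each image record carries a datetime timestamp:

```python
# Illustrative sketch of the annotation subsampling rule: from each day of
# observation, keep only the images captured at 9:00, 15:00 and 21:00.
# Timestamps are assumed to be datetime objects attached to image records.
from datetime import datetime

ANNOTATION_HOURS = {9, 15, 21}

def select_for_annotation(timestamps):
    """Keep timestamps falling exactly on the chosen annotation hours."""
    return [
        ts for ts in timestamps
        if ts.hour in ANNOTATION_HOURS and ts.minute == 0
    ]
```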

4.5. Inference on a Low-Power Embedded System with the AI Capabilities

In this section we describe an experiment realizing the second option for data analysis, i.e., performing the analysis on board the sensing device instead of sending the data to a server.
Computer vision algorithms such as the FCNNs described earlier are powerful methods for real-time semantic segmentation of images. However, these neural networks are notoriously resource-hungry and are not well suited to low-power embedded devices, so running inference on board single-board computers and mobile devices is a challenging task. If successful, a distributed network of low-power AI-capable systems could tremendously improve the capabilities of greenhouses: it could predict the growth dynamics of each plant individually and make decisions on specific chemical inputs to the soil, thereby maximizing the output of cultivation. Such an autonomous system could be powered by external batteries and continue data collection even during a blackout.
However, the key advantage of such a system is the ability to perform data-intensive computing on board and send only the post-processed data to the server. This significantly reduces the requirements on the capacity of the data transmission system: a semantic mask occupying several KB can be sent instead of a raw image of several MB. For large-scale data collection and processing, this would significantly lighten the load on the data transmission and WSN infrastructure in general.
In the current research, we report on the development of such a system for the semantic segmentation of cucumbers. A single-board computer, the Nvidia Jetson Nano, is the critical component of the proposed system. It has an on-board mobile GPU with 128 cores and can easily handle even 4K video streams. However, what matters for cucumber growth dynamics in the greenhouse is autonomy. Therefore, the key parameter for the proposed research is the power consumption per processed frame, along with the ability to run data-intensive FCNNs.
We tested the FCNN architecture on the embedded device and measured the processing time per image and the power consumption. The mean processing time is 3.5 s per image. The device runs at a constant voltage of 5 V, while the current varies: the power consumption during computation is 6.5 W (1.3 A) and 5 W (1.0 A) in the idle mode. A power bank could easily power this system. To calculate the operation time of the proposed system, we rely on the formula T = EU/P, where E is the capacity of the battery, U is the supply voltage, P is the power consumption and T is the time of operation. For example, a 10,000 mAh power bank can supply up to 7.69 h of continuous operation, performing neural network inference all the time. In our scenario, however, the system captures new data and performs predictions every 30 min, staying in the idle mode the rest of the time; it could therefore operate for up to 10 h. This makes the autonomous system an excellent option for distributed sensor systems operating in areas with restricted power or communication capabilities. It is also a way to preserve the data even in case of blackouts or other infrastructure incidents.
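The runtime estimate can be reproduced directly from the reported figures (10 Ah capacity, 5 V supply, 6.5 W busy / 5.0 W idle draw):

```python
# Worked example of the runtime estimate T = E * U / P, where E is the
# battery capacity (Ah), U the supply voltage (V) and P the draw (W).
# Values are the ones reported for the Jetson Nano setup in the text.

def runtime_hours(capacity_ah: float, voltage_v: float, power_w: float) -> float:
    return capacity_ah * voltage_v / power_w

busy = runtime_hours(10.0, 5.0, 6.5)   # continuous inference at 6.5 W -> ~7.69 h
idle = runtime_hours(10.0, 5.0, 5.0)   # idle draw of 5.0 W -> 10.0 h
```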

5. Data Analysis

The set of FCNNs, i.e., U-Net [33], FCN8s and FCN16s [38], was trained within the PyTorch framework and validated using the labeled dataset. U-Net consists of a contracting path that captures context and an expanding path that enables precise localization. It is effective for segmentation on small datasets with extensive data augmentation, and also for border segmentation, which is important for plant segmentation. The contracting path consists of 3 × 3 unpadded convolutions, each followed by a ReLU, and 2 × 2 max-pooling with stride 2. The expansive path contains upsampling of the feature map followed by 3 × 3 convolutions. After upsampling, the resulting feature map is concatenated with the corresponding feature map from the contracting path, followed by two 3 × 3 convolutions, each followed by a ReLU. Finally, a 1 × 1 convolution is applied to each of the 64 components, after which a pixelwise softmax over the whole feature map is calculated.
The FCNN's convolutional layers, which include pooling and ReLU activations, are followed by deconvolutional layers (or backwards convolutions) that upsample the intermediate tensors to match the width and height of the original input image.
Out of the 62 annotated images from each camera, 50 were used for training and 12 for validation. The remaining 913 images from each camera were kept as test data. To assess the quality of the trained model, the average IoU between the predicted masks and the ground truth masks on the validation data was evaluated using the following formula:
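The spatial-size arithmetic of the contracting path described above can be checked with a short sketch: each unpadded 3 × 3 convolution shrinks the width/height by 2, and each 2 × 2 max-pool with stride 2 halves them. The depth and input size below are illustrative, not the exact configuration trained in this work:

```python
# Small sketch of the feature-map size arithmetic in a U-Net contracting
# path: each unpadded 3x3 convolution reduces the spatial size by 2, and
# each 2x2 max-pool with stride 2 halves it. Depth and input size are
# illustrative, not the exact configuration used in this paper.

def contracting_sizes(size: int, levels: int, convs_per_level: int = 2):
    """Spatial size after each (convs -> pool) level of the contracting path."""
    sizes = []
    for _ in range(levels):
        size -= 2 * convs_per_level   # two unpadded 3x3 convs shrink by 2 each
        size //= 2                    # 2x2 max-pool with stride 2 halves
        sizes.append(size)
    return sizes
```

For the classic 572 × 572 U-Net input, four levels give sizes 284, 140, 68 and 32, matching the well-known architecture diagram.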
IoU_avg = Σ_{(w_g, w_p)} IoU(w_g, w_p) / #{w_p},
where
IoU = Area of Overlap / Area of Union,
and { w g , w p } are all the possible pairs of ground truth and predicted masks, while # { w p } is a number of predicted masks. Average IoU on the validation set using FCN8 semantic segmentation neural network achieved value of 81%. The training parameters were selected as follows: batch size = 2, learning rate = 0.008, class weight = 0.5. Images were also resized to 1280 × 720. Principal Component Analysis (PCA) aided FCNN was also trained and evaluated for the reference with the proposed modifications of the FCN8 and FCN16 (modified FCN8). This neural network also has a contracting convolutional part and an expanding upsampling part. However, it relies on transferring the learned weights of new classification networks to fine-tuning segmentation networks. In our case, the pre-trained VGG-16 layers were used for training [39]. It achieves 82% IoU in the proposed task after 100 epochs of training. Even though all the abovementioned FCNNs are not the most advanced in the area, e.g., Deeplab [40], there could be an overshoot for the proposed task of leaf area segmentation. Mostly, because the IoU of these networks is sufficient and we have no reason to speed up them. Captures are made every 30 min, and we can make every prediction with a low framerate.
Figure 11a,b presents the training and validation losses and IoU, respectively, for FCN8s and modified FCN8s over 100 epochs. An early stopping criterion was used to retrieve the best model during training. Examples of predicted masks on the validation dataset, representing different stages of growth, are shown in Figure 12a,b. Examples of predicted masks on the test dataset are shown in Figure 13a,b. As can be seen from these figures, the predicted masks are accurate and correspond closely to the actual plants.
Using the sequence of 975 selected images from each camera, the per-plant leaf area was calculated. Since there are many plants in each image, the table can move horizontally, and plants were removed for direct biomass measurements, a different number of plants appears in different images. The total segmented area must therefore be divided by the actual number of plants to obtain the average area per plant. An example of the calculated average per-plant leaf area for the image sequence from one of the cameras is shown in Figure 14. It should be noted that there was a total power interruption for several days on the 18th day of active plant growth. Figure 14 shows the first 760 data points, representing the continuous growth. The accuracy of the proposed FCNN used for segmentation is additionally supported by the fact that it captured the diurnal fluctuations of the projected leaf area (see Figure 14), which are caused by biological factors, specifically the relative motion of leaves. After the power interruption, the system switched on automatically and continued collecting images and sensor data (the remaining 215 images). This experience demonstrated the high relevance of autonomous embedded systems for greenhouses, which can overcome such failures. Nevertheless, the collected images and biomass measurements were sufficient to find the dependency between leaf area and biomass. Figure 15 shows the approximated dependency between leaf area and biomass using Equation (3). To construct this dependency, we used data points representing direct measurements of biomass and the corresponding FCNN-calculated leaf area during the first 18 days (the first 10 points of the biomass measurements from Figure 4).
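The per-plant area computation reduces to counting segmented pixels, converting to physical units, and dividing by the plant count. A minimal sketch (the cm²-per-pixel calibration factor below is a hypothetical placeholder, not the deployment's actual calibration):

```python
import numpy as np

# Hypothetical calibration: physical area covered by one image pixel.
# In practice this is derived from camera height, optics, and resolution.
CM2_PER_PIXEL = 0.01

def per_plant_leaf_area(mask: np.ndarray, n_plants: int,
                        cm2_per_pixel: float = CM2_PER_PIXEL) -> float:
    """Average projected leaf area per plant (cm^2) from a binary
    segmentation mask produced by the FCNN."""
    total_area_cm2 = mask.astype(bool).sum() * cm2_per_pixel
    return total_area_cm2 / n_plants
```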
$$m = \alpha S^{\gamma}, \quad (3)$$
The derived dependency for cucumbers is as follows (Equation (4)):
$$m = 0.00755\, S^{1.57}. \quad (4)$$
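The coefficients α and γ in Equation (3) can be estimated, for instance, by linear regression in log-log space, since log m = log α + γ log S. A sketch on synthetic data (the data points are generated from the coefficients reported in Equation (4) purely as a self-check; this is not the authors' fitting code):

```python
import numpy as np

def fit_power_law(S: np.ndarray, m: np.ndarray):
    """Fit m = alpha * S**gamma by linear least squares in log-log space."""
    gamma, log_alpha = np.polyfit(np.log(S), np.log(m), 1)
    return np.exp(log_alpha), gamma

# synthetic check using the coefficients from Equation (4)
S = np.array([10.0, 50.0, 100.0, 300.0, 700.0])   # leaf area, cm^2
m = 0.00755 * S**1.57                             # biomass, g
alpha, gamma = fit_power_law(S, m)
```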
Using the obtained dependency, it is possible to assess and predict biomass from the predicted leaf area. The Verhulst model, commonly used to describe the growth of biological systems, was applied to predict the leaf area (see Equation (5)):
$$\frac{dS}{dt} = \mu S \left( 1 - \frac{S}{S_{\max}} \right), \quad (5)$$
where $\mu$ is the growth rate (in units of 1/timestep), $S$ is the measured (calculated) leaf area, and $S_{\max}$ is the maximum leaf area in cm$^2$. Integration of Equation (5) gives Equation (6):
$$S(t) = \frac{S_{\max}\, S_0\, e^{\mu t}}{S_{\max} + S_0 \left( e^{\mu t} - 1 \right)}, \quad (6)$$
where $S_0$ is the initial leaf area. The Verhulst model is widely applied to assess the dynamics of living systems; for example, it was applied to spatio-temporal population control in the management of aquatic plants [41,42]. It should be noted that the Verhulst model was used to assess the dynamics of the projected leaf area, not the leaf areas themselves. Plants growing broadwise can affect the values of the calculated projected leaf area; however, the projected leaf area is bounded for the investigated plant type, because a plant cannot grow infinitely broadwise, and this saturation was also observed experimentally. Also, the estimation and modeling of the biomass were carried out at the initial stage of growth; thus, the biomass accumulated at the initial stages and estimated from the first 3–4 projected leaf area values has its limitations.
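The analytic solution in Equation (6) translates directly into code (a sketch; the sample parameter values in the check below are those reported later in the text):

```python
import numpy as np

def verhulst(t, S0, Smax, mu):
    """Equation (6): analytic solution of the logistic (Verhulst)
    growth ODE dS/dt = mu * S * (1 - S / Smax)."""
    return Smax * S0 * np.exp(mu * t) / (Smax + S0 * (np.exp(mu * t) - 1.0))
```

At t = 0 the function returns S0, and for large t it saturates at Smax, matching the bounded broadwise growth discussed above.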
The non-linear least squares method was used to estimate the parameters of the growth model (Equation (6)) based on the leaf area calculations obtained by the FCNN for the first 18 days. The resulting estimates are $\mu = 0.23$ per 30-min timestep, $S_{\max} = 700$ cm$^2$, and $S_0 = 5.07$ cm$^2$. The relative error of the model's approximation of the data is 5.5%. Using these coefficients, it is possible to predict (extrapolate) the leaf area growth curve. The fit to the experimental data and the 12-day-ahead prediction of the leaf area are shown in Figure 16.
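Such a non-linear least squares fit can be reproduced with SciPy's `curve_fit` (a sketch on noiseless synthetic data generated from the parameter values reported in the text; in the actual study the fit was performed on the FCNN-derived leaf area series):

```python
import numpy as np
from scipy.optimize import curve_fit

def verhulst(t, S0, Smax, mu):
    """Equation (6): analytic solution of the Verhulst growth model."""
    return Smax * S0 * np.exp(mu * t) / (Smax + S0 * (np.exp(mu * t) - 1.0))

# synthetic leaf area curve generated with the reported parameters
t = np.linspace(0.0, 60.0, 200)            # timesteps (30-min units)
S = verhulst(t, 5.07, 700.0, 0.23)

# recover the parameters by non-linear least squares from an initial guess
(S0_hat, Smax_hat, mu_hat), _ = curve_fit(verhulst, t, S, p0=(1.0, 500.0, 0.1))
```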
Using these fitted and extrapolated leaf area values and the derived dependency between leaf area and biomass, we calculated the predicted biomass for one month, including the extrapolation interval (the last 12 days). The result is shown in Figure 17, together with the biomass measurements used to construct the dependency. Figure 17 also shows the biomass measurements taken during the last 12 days, which were not included in the construction of the dependency; these last seven data points were used to validate the prediction accuracy. The average relative error of the biomass prediction reached 10.7%.
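Chaining Equations (4) and (6), the biomass forecast and its validation metric can be expressed as follows (a sketch; the error definition is the standard mean relative error, which we assume matches the one used in the study):

```python
import numpy as np

def predict_biomass(S, alpha: float = 0.00755, gamma: float = 1.57):
    """Equation (4): biomass (g) from projected leaf area (cm^2)."""
    return alpha * np.asarray(S)**gamma

def average_relative_error(predicted, measured) -> float:
    """Mean relative error between predicted and measured biomass."""
    predicted = np.asarray(predicted, dtype=float)
    measured = np.asarray(measured, dtype=float)
    return float(np.mean(np.abs(predicted - measured) / measured))
```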

6. Discussion

The cucumber (Cucumis sativus L.) is one of the most widely produced greenhouse crops worldwide, accounting for up to 60% of production. Over the last few years, several dynamic simulation models have been proposed to predict cucumber growth and yield [43,44]. Such process-based models take a huge number of heterogeneous environmental parameters as input, rely on complicated mathematical models of crop growth, and need to be tuned and adapted for each cultivar and growing system. Moreover, some of these parameters can be measured precisely only by hand, which is time-consuming and ineffective. This means that the efficiency of simulations can be low when farmers cannot routinely monitor all the parameters. Leaf area and leaf biomass are important morphological parameters for in situ monitoring, because the leaf is vital for perceiving and capturing light. Meanwhile, the traditional approach to leaf area and biomass measurement is destructive and may harm the plants. Incorporating IoT, computer vision, and ANN systems can help solve these issues, because they enable real-time monitoring of the plant's phenological changes in a non-invasive way and produce highly satisfactory forecasts of biomass (or other target parameters). In this study, an AI-based approach to assessing and predicting leaf area and plant biomass was proposed and validated. Our approach can precisely estimate and predict overall plant biomass at the early stage of growth in a non-destructive way. There is no need to carry out open-field vs. greenhouse experiments to assess the impact of specific parameters [13], as this can be performed via the assessment of the leaf biomass. A discussion of 3D and laser-based approaches is provided in Section 2.2: Leaves Modelling.
Another important outcome of our research is that the proposed methodology can be used for fundamental research aimed at finding plant characteristics and dependencies and at assessing the plant's response to changes in environmental parameters with high time resolution. This in turn opens wide possibilities for investigating hidden dynamics that were previously impossible to observe with standard techniques. Moreover, we established the optimal conditions for cucumber growing based on the expert knowledge of our agronomists, so this experimental design could also be used as a baseline for future studies.
Meanwhile, some limitations of this study should be highlighted. Cucumber plants usually grow vertically in industrial greenhouses, and the newest top leaves can overlap the lower leaves in the images, so the plant biomass may be underestimated. This limits the ability of computer vision systems (implemented for top-down monitoring) to capture the plant's biomass over the long term. Technically, these issues can be resolved by deploying additional cameras or applying mathematical algorithms to automate leaf counting, but this was beyond the scope of this investigation.

7. Conclusions

In this article, we have reported on an approach for leaf area and biomass assessment validated on a real deployment. To this end, we presented and tested an industrial deployment of an AI-based sensing system for robust and accurate prediction of plant growth dynamics. To collect a dataset comprising image data, environmental conditions, and biomass measurements, we conducted a one-month experiment on cucumber growth in a greenhouse. Specifically, we obtained a dataset containing sequences of 9976 top-down images from 4 cameras, 480 direct measurements of biomass over a 17-day period, and environmental data from sensors. First, we labeled the image dataset and trained FCNNs to perform automatic segmentation of cucumbers, achieving 82% IoU. Second, the trained FCNNs were applied to the image sequences, reconstructing the average per-plant leaf area and its growth dynamics. Then, we established a correspondence between leaf area and biomass using the direct biomass measurements. Finally, this allowed us to predict the biomass dynamics from the leaf area predictions within 10% average relative error. Overall, we propose and evaluate a highly effective and reliable data-driven pipeline for plant growth dynamics assessment and prediction using common sensors such as 2D digital cameras.

Author Contributions

Conceptualization, D.S. and M.P.; methodology, D.S.; software, A.M.; validation, A.N., V.V. and S.N.; formal analysis, A.N.; investigation, D.S. and A.M.; resources, V.V.; data curation, D.S.; writing—original draft preparation, D.S. and A.M.; writing—review and editing, A.S. and S.N.; visualization, A.N.; supervision, A.S.; project administration, M.F.; funding acquisition, G.O. All authors have read and agreed to the published version of the manuscript.

Funding

The reported study was funded by Russian Foundation for Basic Research (RFBR), project number 19-29-09085 MK.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data is available upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Ruan, J.; Jiang, H.; Zhu, C.; Hu, X.; Shi, Y.; Liu, T.; Rao, W.; Chan, F.T.S. Agriculture IoT: Emerging Trends, Cooperation Networks, and Outlook. IEEE Wirel. Commun. 2019, 26, 56–63.
2. Miorandi, D.; Sicari, S.; De Pellegrini, F.; Chlamtac, I. Internet of things: Vision, applications and research challenges. Ad Hoc Networks 2012, 10, 1497–1516.
3. Pahuja, R.; Verma, H.; Uddin, M. A wireless sensor network for greenhouse climate control. IEEE Pervasive Comput. 2013, 12, 49–58.
4. Somov, A.; Shadrin, D.; Fastovets, I.; Nikitin, A.; Matveev, S.; Seledets, I.; Hrinchuk, O. Pervasive Agriculture: IoT-Enabled Greenhouse for Plant Growth Control. IEEE Pervasive Comput. 2018, 17, 65–75.
5. Bhattacharya, S.; Lane, N.D. Sparsification and separation of deep learning layers for constrained resource inference on wearables. In Proceedings of the 14th ACM Conference on Embedded Network Sensor Systems CD-ROM, Stanford, CA, USA, 14–16 November 2016; pp. 176–189.
6. Huang, G.; Liu, Z.; Van Der Maaten, L.; Weinberger, K.Q. Densely connected convolutional networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 4700–4708.
7. Lane, N.D.; Bhattacharya, S.; Mathur, A.; Georgiev, P.; Forlivesi, C.; Kawsar, F. Squeezing deep learning into mobile and embedded devices. IEEE Pervasive Comput. 2017, 16, 82–88.
8. Chlingaryan, A.; Sukkarieh, S.; Whelan, B. Machine learning approaches for crop yield prediction and nitrogen status estimation in precision agriculture: A review. Comput. Electron. Agric. 2018, 151, 61–69.
9. Werner-Allen, G.; Lorincz, K.; Ruiz, M.; Marcillo, O.; Johnson, J.; Lees, J.; Welsh, M. Deploying a wireless sensor network on an active volcano. IEEE Internet Comput. 2006, 10, 18–25.
10. Zhang, P.; Sadler, C.M.; Lyon, S.A.; Martonosi, M. Hardware Design Experiences in ZebraNet. In Proceedings of the 2nd International Conference on Embedded Networked Sensor Systems, New York, NY, USA, 3–5 November 2004.
11. Ceriotti, M.; Mottola, L.; Picco, G.P.; Murphy, A.L.; Guna, S.; Corra, M.; Pozzi, M.; Zonta, D.; Zanon, P. Monitoring heritage buildings with wireless sensor networks: The Torre Aquila deployment. In Proceedings of the 2009 International Conference on Information Processing in Sensor Networks, San Francisco, CA, USA, 13–16 April 2009; pp. 277–288.
12. Doolin, D.M.; Sitar, N. Wireless sensors for wildfire monitoring. In Proceedings of the SPIE Smart Structures and Materials + Nondestructive Evaluation and Health Monitoring, San Diego, CA, USA, 6–10 March 2005.
13. Angmo, P.; Phuntsog, N.; Namgail, D.; Chaurasia, O.; Stobdan, T. Effect of shading and high temperature amplitude in greenhouse on growth, photosynthesis, yield and phenolic contents of tomato (Lycopersicum esculentum Mill.). Physiol. Mol. Biol. Plants 2021, 27, 1539–1546.
14. De Pascale, S.; Arena, C.; Aronne, G.; De Micco, V.; Pannico, A.; Paradiso, R.; Rouphael, Y. Biology and crop production in Space environments: Challenges and opportunities. Life Sci. Space Res. 2021, 29, 30–37.
15. Nesteruk, S.; Shadrin, D.; Pukalchik, M.; Somov, A.; Zeidler, C.; Zabel, P.; Schubert, D. Image Compression and Plants Classification Using Machine Learning in Controlled-Environment Agriculture: Antarctic Station Use Case. IEEE Sens. J. 2021, 21, 17564–17572.
16. Analog Devices. Internet of Tomatoes Project. Available online: https://www.analog.com/en/landing-pages/001/iot-internet-of-tomatoes.html# (accessed on 28 August 2021).
17. Angelopoulos, C.M.; Filios, G.; Nikoletseas, S.; Raptis, T.P. Keeping data at the edge of smart irrigation networks: A case study in strawberry greenhouses. Comput. Networks 2020, 167, 107039.
18. Roopaei, M.; Rad, P.; Choo, K.R. Cloud of Things in Smart Agriculture: Intelligent Irrigation Monitoring by Thermal Imaging. IEEE Cloud Comput. 2017, 4, 10–15.
19. Shadrin, D.; Menshchikov, A.; Ermilov, D.; Somov, A. Designing Future Precision Agriculture: Detection of Seeds Germination Using Artificial Intelligence on a Low-Power Embedded System. IEEE Sens. J. 2019, 19, 11573–11582.
20. Ali, I.; Cawkwell, F.; Dwyer, E.; Green, S. Modeling managed grassland biomass estimation by using multitemporal remote sensing data—A machine learning approach. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2016, 10, 3254–3264.
21. Rajendran, K.; Tester, M.; Roy, S.J. Quantifying the three main components of salinity tolerance in cereals. Plant Cell Environ. 2009, 32, 237–249.
22. Yang, X.; Strahler, A.H.; Schaaf, C.B.; Jupp, D.L.; Yao, T.; Zhao, F.; Wang, Z.; Culvenor, D.S.; Newnham, G.J.; Lovell, J.L.; et al. Three-dimensional forest reconstruction and structural parameter retrievals using a terrestrial full-waveform lidar instrument (Echidna®). Remote Sens. Environ. 2013, 135, 36–51.
23. Paulus, S.; Behmann, J.; Mahlein, A.K.; Plümer, L.; Kuhlmann, H. Low-cost 3D systems: Suitable tools for plant phenotyping. Sensors 2014, 14, 3001–3018.
24. Nguyen, C.V.; Fripp, J.; Lovell, D.R.; Furbank, R.; Kuffner, P.; Daily, H.; Sirault, X. 3D scanning system for automatic high-resolution plant phenotyping. In Proceedings of the 2016 International Conference on Digital Image Computing: Techniques and Applications (DICTA), Gold Coast, Australia, 30 November–3 December 2016; pp. 1–8.
25. Chaudhury, A.; Ward, C.; Talasaz, A.; Ivanov, A.G.; Huner, N.P.; Grodzinski, B.; Patel, R.V.; Barron, J.L. Computer vision based autonomous robotic system for 3D plant growth measurement. In Proceedings of the 2015 12th Conference on Computer and Robot Vision, Washington, DC, USA, 3–5 June 2015; pp. 290–296.
26. Quan, L.; Tan, P.; Zeng, G.; Yuan, L.; Wang, J.; Kang, S.B. Image-based plant modeling. In Proceedings of the ACM SIGGRAPH 2006 Papers, Boston, MA, USA, 30 July–3 August 2006; pp. 599–604.
27. Paulus, S.; Dupuis, J.; Mahlein, A.K.; Kuhlmann, H. Surface feature based classification of plant organs from 3D laserscanned point clouds for plant phenotyping. BMC Bioinform. 2013, 14, 238.
28. Uchiyama, H.; Sakurai, S.; Mishima, M.; Arita, D.; Okayasu, T.; Shimada, A.; Taniguchi, R.I. An easy-to-setup 3D phenotyping platform for KOMATSUNA dataset. In Proceedings of the Computer Vision Workshop (ICCVW), 2017 IEEE International Conference, Venice, Italy, 22–29 October 2017; pp. 2038–2045.
29. Gibbs, J.A.; Pound, M.; French, A.P.; Wells, D.M.; Murchie, E.; Pridmore, T. Approaches to three-dimensional reconstruction of plant shoot topology and geometry. Funct. Plant Biol. 2017, 44, 62–75.
30. Pound, M.P.; French, A.P.; Murchie, E.H.; Pridmore, T.P. Automated recovery of 3D models of plant shoots from multiple colour images. Plant Physiol. 2014, 114.
31. Huang, W.; Ratkowsky, D.A.; Hui, C.; Wang, P.; Su, J.; Shi, P. Leaf fresh weight versus dry weight: Which is better for describing the scaling relationship between leaf biomass and leaf area for broad-leaved plants? Forests 2019, 10, 256.
32. Jin, X.; Li, Z.; Feng, H.; Ren, Z.; Li, S. Deep neural network algorithm for estimating maize biomass based on simulated Sentinel 2A vegetation indices and leaf area index. Crop. J. 2020, 8, 87–97.
33. Ronneberger, O.; Fischer, P.; Brox, T. U-net: Convolutional networks for biomedical image segmentation. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Munich, Germany, 5–9 October 2015; pp. 234–241.
34. Cherie, E. The Complete Guide to Growing Tomatoes: A Complete Step-by-Step Guide Including Heirloom Tomatoes (Back-to-Basics Gardening); Atlantic Publishing Group Inc.: Ocala, FL, USA, 2010.
35. Shamshiri, R.R.; Jones, J.W.; Thorp, K.R.; Ahmad, D.; Che Man, H.; Taheri, S. Review of optimum temperature, humidity, and vapour pressure deficit for microclimate evaluation and control in greenhouse cultivation of tomato: A review. Int. Agrophysics 2018, 32, 287–302.
36. De Gelder, A.; Dieleman, J.; Bot, G.; Marcelis, L. An overview of climate and crop yield in closed greenhouses. J. Hortic. Sci. Biotechnol. 2012, 87, 193–202.
37. Rezvani, S.M.E.; Abyaneh, H.Z.; Shamshiri, R.R.; Balasundram, S.K.; Dworak, V.; Goodarzi, M.; Sultan, M.; Mahns, B. IoT-Based Sensor Data Fusion for Determining Optimality Degrees of Microclimate Parameters in Commercial Greenhouse Production of Tomato. Sensors 2020, 20, 6474.
38. Long, J.; Shelhamer, E.; Darrell, T. Fully convolutional networks for semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 3431–3440.
39. Tai, L.; Ye, H.; Ye, Q.; Liu, M. PCA-aided fully convolutional networks for semantic segmentation of multi-channel fMRI. In Proceedings of the 2017 18th International Conference on Advanced Robotics (ICAR), Hong Kong, China, 10–12 July 2017; pp. 124–130.
40. Chen, L.C.; Papandreou, G.; Kokkinos, I.; Murphy, K.; Yuille, A.L. Deeplab: Semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected crfs. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 40, 834–848.
41. Frighetto, D.F.; Souza, G.M.; Molter, A. Spatio-temporal population control applied to management of aquatic plants. Ecol. Model. 2019, 398, 77–84.
42. Costa, R.; Zanotelli, C.; Hoffmann, D.; Belli Filho, P.; Perdomo, C.; Rafikov, M. Optimization of the treatment of piggery wastes in water hyacinth ponds. Water Sci. Technol. 2003, 48, 283–289.
43. Sun, Y.; Hu, K.; Zhang, K.; Jiang, L.; Xu, Y. Simulation of nitrogen fate for greenhouse cucumber grown under different water and fertilizer management using the EU-Rotate_N model. Agric. Water Manag. 2012, 112, 21–32.
44. Ramírez-Pérez, L.J.; Morales-Díaz, A.B.; Benavides-Mendoza, A.; De-Alba-Romenus, K.; González-Morales, S.; Juarez-Maldonado, A. Dynamic modeling of cucumber crop growth and uptake of N, P and K under greenhouse conditions. Sci. Hortic. 2018, 234, 250–260.
Figure 1. Deployment in a greenhouse.
Figure 2. The deployment schematic and biomass measurement schedule.
Figure 3. Watering schedule.
Figure 4. Dynamics of biomass change.
Figure 5. Sensor data collection and storage.
Figure 6. Temperature monitoring.
Figure 7. Humidity monitoring.
Figure 8. Distribution of the first derivatives for temperature.
Figure 9. Distribution of the first derivatives for humidity.
Figure 10. Distribution of the first derivatives for CO2.
Figure 11. (a) Loss dynamics and (b) IoU during training and validation for FCN8s and modified FCN8s.
Figure 12. Predicted masks on the validation dataset for different stages of growth, shown in (a,b).
Figure 13. Predicted masks on the test dataset, with examples shown in (a,b).
Figure 14. Dynamics of the specific leaf area calculated using the FCNN.
Figure 15. Dependency between averaged biomass and specific leaf area.
Figure 16. Fit to the experimental data and 12-day-ahead prediction of the leaf area based on the Verhulst model and the FCNN leaf area calculations for the first 18 days of the experiment.
Figure 17. Assessment of the biomass based on the correlations and predictions of the leaf area vs. experimental measurements.
Table 1. Composition of the nutrient solution for initial saturation of cubes by fertilizer and further watering of plants.

| Ion    | NH₄⁺ | K⁺   | Ca²⁺ | Mg²⁺ | NO₃⁻  | H₂PO₄⁻ |
|--------|------|------|------|------|-------|--------|
| mmol/L | 1.25 | 6.75 | 4.5  | 3.0  | 16.75 | 1.25   |

| Ion    | Fe   | Mn   | Zn  | B    | Cu   | Mo  | SO₄²⁻ |
|--------|------|------|-----|------|------|-----|-------|
| mmol/L | 20.0 | 10.0 | 5.0 | 30.0 | 0.75 | 0.5 | 2.5   |
Table 2. Measurements of EC and pH dynamics during the experiment. Columns 1–6 denote the six measurement points; EC is given in mS/cm.

| Date  | EC(1) | pH(1) | EC(2) | pH(2) | EC(3) | pH(3) | EC(4) | pH(4) | EC(5) | pH(5) | EC(6) | pH(6) |
|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|-------|
| 18.05 | 1.62  | 8.12  | 1.59  | 8.03  | 1.50  | 7.97  | 1.58  | 8.11  | 1.55  | 8.0   | 1.53  | 7.89  |
| 19.05 | 1.75  | 8.13  | 1.70  | 8.15  | 1.69  | 8.12  | 1.69  | 8.20  | 1.66  | 8.20  | 1.68  | 8.04  |
| 21.05 | 1.78  | 6.64  | 1.74  | 6.58  | 1.72  | 6.83  | 1.69  | 6.75  | 1.70  | 6.83  | 1.62  | 6.68  |
| 22.05 | 1.82  | 7.76  | 1.81  | 7.68  | 1.72  | 7.54  | 1.68  | 7.65  | 1.74  | 7.75  | 1.66  | 7.73  |
| 25.05 | 1.86  | 7.11  | 1.82  | 6.98  | 1.79  | 6.97  | 1.80  | 7.02  | 1.82  | 7.06  | 1.82  | 7.01  |
| 26.05 | 2.01  | 7.55  | 1.90  | 7.88  | 1.92  | 7.76  | 2.02  | 7.60  | 1.97  | 7.81  | 1.83  | 7.71  |
| 28.05 | 2.24  | 8.08  | 2.00  | 8.07  | 2.02  | 7.61  | 2.17  | 7.52  | 2.24  | 7.82  | 2.01  | 7.79  |
| 30.05 | 2.25  | 8.02  | 2.13  | 8.09  | 2.05  | 7.82  | 2.20  | 8.01  | 2.18  | 7.95  | 2.12  | 7.87  |
| 1.06  | 2.43  | 7.77  | 2.08  | 7.62  | 2.08  | 5.56  | 2.12  | 7.28  | 2.00  | 7.41  | 1.85  | 7.40  |
| 3.06  | 2.74  | 8.17  | 2.36  | 8.15  | 1.96  | 8.43  | -     | -     | -     | -     | -     | -     |
| 4.06  | 3.18  | 7.73  | 2.32  | 8.17  | 2.13  | 8.26  | -     | -     | -     | -     | -     | -     |

