Article

Lettuce Production in Intelligent Greenhouses—3D Imaging and Computer Vision for Plant Spacing Decisions

Business Unit Greenhouse Horticulture, Wageningen University & Research (WUR), 6708 PB Wageningen, The Netherlands
*
Author to whom correspondence should be addressed.
Sensors 2023, 23(6), 2929; https://doi.org/10.3390/s23062929
Submission received: 30 January 2023 / Revised: 20 February 2023 / Accepted: 27 February 2023 / Published: 8 March 2023
(This article belongs to the Section Smart Agriculture)

Abstract
Recent studies indicate that food demand will increase by 35–56% over the period 2010–2050 due to population increase, economic development, and urbanization. Greenhouse systems allow for the sustainable intensification of food production with demonstrated high crop production per cultivation area. Breakthroughs in resource-efficient fresh food production merging horticultural and AI expertise take place with the international competition “Autonomous Greenhouse Challenge”. This paper describes and analyzes the results of the third edition of this competition. The competition’s goal is the realization of the highest net profit in fully autonomous lettuce production. Two cultivation cycles were conducted in six high-tech greenhouse compartments with operational greenhouse decision-making realized at a distance and individually by algorithms of international participating teams. Algorithms were developed based on time series sensor data of the greenhouse climate and crop images. High crop yield and quality, short growing cycles, and low use of resources such as energy for heating, electricity for artificial light, and CO2 were decisive in realizing the competition’s goal. The results highlight the importance of plant spacing and the moment of harvest decisions in promoting high crop growth rates while optimizing greenhouse occupation and resource use. In this paper, images taken with depth cameras (RealSense) for each greenhouse were used by computer vision algorithms (DeepLabv3+ implemented in detectron2 v0.6) in deciding optimum plant spacing and the moment of harvest. The resulting plant height and coverage could be accurately estimated with an R2 of 0.976 and an mIoU of 98.2%, respectively. These two traits were used to develop a light loss and harvest indicator to support remote decision-making. The light loss indicator could be used as a decision tool for timely spacing.
Several traits were combined for the harvest indicator, ultimately resulting in a fresh weight estimation with a mean absolute error of 22 g. The proposed non-invasively estimated indicators presented in this article are promising traits to be used toward full automation of a dynamic commercial lettuce growing environment. Computer vision algorithms act as a catalyst in remote and non-invasive sensing of crop parameters, decisive for automated, objective, standardized, and data-driven decision making. However, spectral indices describing lettuce growth and larger datasets than those currently accessible are crucial to address the gap between academic and industrial production systems encountered in this work.

1. Introduction

Recent studies strongly indicate that food demand will increase by 35–56% over the period of 2010–2050 as a result of population increase, economic development, and urbanization, among other drivers [1]. The expected increase in food demand places pressure on natural resources and may lead to negative environmental impacts as well as biodiversity losses [2]. Among the possible solutions are the transformation of food production into a green industrial process and the promotion of policies for plant-based and high-nutrient diets [3].
Greenhouse systems allow sustainable intensification of food production with demonstrated high crop production per cultivation area [4]. While vegetable production is increasing in area and volume, the number of farms declines, resulting in more vegetable area and volume per farm and per grower [5]. At the same time, the availability of labor is an industry-wide challenge, as is the lack of experienced managers and growers in crop production. Greenhouses are highly dynamic production systems operating through an integrated set of activities performed by growers [6]. Growers need to consider various performance indicators such as yield, quality, timing, and sustainability standards, and meet volatile market demands and prices under uncertain conditions subject, for example, to the weather [7].
Modern horticultural production is highly dependent on up-to-date information on farm operations. Production processes are already highly automated and controlled [8]. Information systems driven by the rapid developments in cloud computing, the Internet of Things, Big Data, machine learning, augmented reality, and robotics are changing the horticulture horizon toward precision horticulture [9,10,11,12]. Digital technologies, computational power, and high-fidelity sensors act as catalysts in the transition toward advanced and autonomous production systems. Non-invasive, near real-time data and information with high spatial and temporal resolution create opportunities for advisory or automated decision software and the design of advanced models, known as digital twins [13]. Monitoring and interpretation of the system’s dynamics at both coarse and granular levels allow for location-specific operations to ascertain desired conditions that meet crop demands.
Digital twins are equivalent to real-life objects, mimicking their behavior and states over their lifetime in virtual space [13]. Greenhouse digital twins can be seen as coupled dynamic climate and crop models representing the actual physical, biological, and integrated technical systems as virtual representations of reality [14]. Digital twins can be used to simulate the effects of different growing conditions and crop management strategies, give insights into their effect on performance indicators, and support decision-making [15]. There have been several achievements in the implementation of mechanistic crop and climate models in horticultural research to facilitate decision making in greenhouse operations [16,17,18]. Broadly validated dynamic models of the greenhouse climate and crop include, e.g., KASPRO [19] and INTKAM [20], which have been used for several research activities. A benchmark experiment in optimizing net profit using AI for the remote control of cucumber cultivation in 2018 [21], and a follow-up on optimizing the net profit of tomatoes as a function of yield and quality a year later [22], showed the potential of AI in controlling and outperforming human decisions by experienced growers. Automated greenhouse control was thus demonstrated to be possible; therefore, our next focus was on the autonomy, robustness, and scalability of such control systems [23]. The goal of the third edition of the Autonomous Greenhouse Challenge was the fully autonomous control of lettuce cultivation.
Commercial greenhouse production of lettuce (Lactuca sativa L.) is already highly automated. Lettuce is grown in controlled greenhouse environments including hydroponic, aquaponic, and vertical growing systems. The systems minimize labor requirements by using conveyor belts and lifts throughout the growing processes from seedling to harvesting [24]. Lettuce hydroponic systems include Nutrient Film Techniques (NFT), Deep Flow Techniques (DFT), as well as Ebb and Flow systems. NFTs are the most widespread method of recirculating nutrient solution systems [25] and employ a shallow stream of water with dissolved nutrients flowing over the roots of plants in water-tight gullies, here referred to as gutters. The nutrient solution is initially stored in a reservoir, pumped out into the gutters at an angle, and drained to a tank for filtering before recycling to the reservoir for reuse. Gutters are automatically filled with the growing media and lettuce heads and transported on conveyor belts to the main greenhouse area. When lettuce heads are fully grown, they are moved toward the harvesting area. At the harvesting area, cutting machines remove the plants from the gutters and transfer the lettuce heads for packaging while the gutters are washed, and the process starts again. During the growing period in the greenhouse, the distance between the gutters and between crops on the gutters largely determines the required amount of greenhouse space and, therefore, resource use. From the perspective of greenhouse automation, it is important to note that the automated optimization of lettuce plant spacing is not yet implemented in practice.
Optimal cultivation temperatures for lettuce are relatively low and range from 15.5 °C to 28 °C during the daytime and from 3 °C to 12 °C at night [26]. The optimal pH of the nutrient solution ranges from 5.8 to 6.5, and its optimal electrical conductivity (EC) is 1.5 mS/cm [27]. A wide variety of crop types can be distinguished among the existing lettuce cultivars, with crisp head and butterhead commonly grown in the United States and Western Europe, respectively, whereas Romaine and loose-leaf types are mainly cultivated in Mediterranean areas [28]. The crop is susceptible to physiological problems including outer leaf tip burn, inner tip burn, and discoloration of ribs [29]. Growth of lettuce, as with any crop, is related to incident radiation and CO2 concentration, and, due to the crop’s relatively high surface-area-to-volume ratio, transpiration rates are high [29]. A fully autonomous decision on optimum climate setpoints can contribute to better crop growth and lower resource use.
Since plant spacing is an important criterion for good vegetative growth on a per-m2 basis, it is a major aspect of yield maximization. Densely planted lettuces can compromise morphological characteristics such as head size, leaf expansion and color, and compactness [30,31,32]. Wider spacing ensures higher light availability per head and that nutritional requirements are satisfied; however, this comes at the expense of less efficient utilization of the growing area and the resources used. Optimum plant spacing is a management decision in hydroponic lettuce cultivation that can potentially be determined using 3D camera images and other sensor data, together with artificial intelligence algorithms, to fully automate the operational process.
Modern camera systems and innovative artificial intelligence (AI) technologies such as computer vision provide objective, non-invasive, and continuous data for precision horticulture applications [33]. Advances in machine learning for image processing have resulted in a wide range of research and applications for crop monitoring [34]. Applications of computer vision can be found in pest, disease, or weed detection [35,36,37], fruit and flower detection, counting and fruit ripeness [38,39], crop stress detection [40], yield estimation, and the moment of harvesting [41,42]. Moving cameras or flying drones with mounted cameras scan plants from various viewpoints, addressing matters of occlusion and creating 3D representations of the crop [43]. High-resolution imaging in combination with deep learning techniques is expected to have great potential for precision farming and remote control operations for autonomous greenhouses [44].
Traditional computer vision techniques struggle with the challenging greenhouse environment because of varying environmental conditions. Light conditions are continuously changing, and occlusion makes it difficult to identify individual plants or plant organs [45]. The development of hand-crafted algorithms was often time-consuming and not reliable enough. However, recent developments in the field of deep learning have made it easier to use vision systems in greenhouses. High classification accuracies of up to 99.7% [46] on large plant datasets such as the “Oxford-Flowers102” dataset [47] show the power of deep learning for plant phenotyping. Already in 2017, the first paper appeared on the quality assessment of lettuce using artificial neural networks [48]. Lettuce was binarily classified as “good” or “reject”. Although the algorithm was simple, with only two layers, it was one of the first publications that showed the possibility of using neural networks for lettuce classification. The ability of networks to learn plant features from single lettuce images can be assessed with the recently published lettuce dataset [49]. At the moment, three papers have been published that estimate fresh weight from these images with high accuracy, with a Root Mean Squared Error (RMSE) of down to 25.3 g [50,51,52].
The above-mentioned examples focus only on single lettuce images. With the development of instance segmentation algorithms, it is possible to determine the growth rate of lettuce over time by extracting the leaf area of single lettuce plants, as seen in [53]; however, those experiments were in a semi-commercial setting without overlapping lettuces. A more commercial example can be found in [54], in which aerial images were collected and the number of lettuces was determined, including a size estimation into three different categories. One of the conclusions was that, despite the fact that many individual lettuces can be detected, there is still a gap between object detection and trait measurements [54]. In greenhouses, the environmental conditions are much better for high-quality imaging, reducing the gap between AI and trait measurement. Other researchers developed a high-throughput system for individual plant phenotyping of lettuce [55]. Each lettuce head was placed in an individual pot; by detecting the pot and applying semantic segmentation, many plant traits were calculated, including projected area and perimeter. The area and size are two of the most interesting growth indicators. However, when the leaves became larger than the pot size, prediction accuracy decreased; as a result, the growth curves were only accurate for the first weeks. It can be noticed that most experiments were carried out in semi-commercial conditions. When the leaves were overlapping, either the experiment was stopped or the affected parameters were excluded. Next to that, in each experiment, the interpretability of the results was difficult. There is still a mismatch between object detection, determining plant traits, and, more importantly, what a grower should do with the provided information. If AI can extract the growth rate, how should a grower use this information to improve the cultivation?
Therefore, more advanced methods are needed that can extract information in greenhouses and conform to commercial practice while maintaining interpretability.
This paper describes the results of the third Autonomous Greenhouse Challenge, an experiment in which the autonomous control of lettuce production was realized in six different greenhouse compartments, each controlled by AI algorithms developed by participating teams. During the experiment, the goal was to decide upon climate and crop management strategies to optimize the net profit of lettuce production, considering yield and product prices, resource use, and costs including greenhouse occupation. The experiment provided valuable public datasets which can be used for future AI training purposes, and which can be found under the Data Availability Statement. In this paper, we give an overall analysis of the results obtained by the teams. Next to that, we focus on the research question of how computer vision and deep learning algorithms can be used for automated operational decisions in lettuce greenhouse production, as currently plant spacing and harvesting are determined on fixed schedules after transplanting. Furthermore, we examine how better utilization of the occupied growing area, efficient resource use that meets crop growing demands, and timely planning of harvest events can be supported by non-invasively estimated indicators such as the proposed light loss and harvest indicators. Other studies answer similar questions on crop trait detection with computer vision in highly controlled and steady environmental conditions. This research takes steps closer to commercial practice by processing smaller datasets of canopy images under varying environmental conditions.

2. Materials and Methods

This paper describes the different steps of the realized research methodology; an overview is given in Figure 1. In the preparation phase, teams developed their own AI algorithms based on provided annotated single lettuce images and climate data time series from a climate and crop simulator [49]. After this preparation phase, two lettuce-growing experiments were conducted in greenhouses at Wageningen University & Research (Section 2.1 Greenhouse compartments and equipment, Section 2.2 Crop, and Section 2.3 Greenhouse climate and crop control). During the first greenhouse experiment, teams could gain experience in controlling the lettuce growth based on real-time data from the greenhouse (Section 2.4 Data communication, Section 2.5 Remote sensing and data collection). New annotated images of the full crop in the camera’s field of view and climate time series data were collected [56]. Teams could refine their algorithms before the second greenhouse experiment. Another set of annotated images, full crop canopy images, and climate time series were collected [56]. After the experiments, an analysis of climate, crop, and resource use was made and is given in this paper (Section 3.1, Section 3.2 and Section 3.3). An additional analysis of plant spacing decisions was made (Section 3.4) based on different image processing methods (Section 2.6 Image processing for plant spacing decisions). The results are discussed and concluded in Section 4 and Section 5.

2.1. Greenhouse Compartments and Equipment

Each greenhouse compartment at the research facility of Wageningen University & Research in Bleiswijk, The Netherlands, had a size of 96 m2. The compartments were equipped with standard actuators also available in commercial high-tech greenhouses, as shown in Figure 1: pipe-rail heating on the floor with a peak capacity of 120 W/m2, controlled by heating temperature setpoints; continuous roof ventilation (ventilation area of 0.3 m2 of opening per m2 of greenhouse area, equipped with anti-thrips netting); two types of inside moveable screens (LUXOUS 1547 D FR energy screen and OBSCURA 9950 FR W light-blocking screen, Ludvig Svensson); white LED artificial lights of dimmable intensity, controlled in a continuous range between 27 and 270 µmol/m2/s with an efficiency of 2.4 μmol/J (VYPR 2p, Fluence by Osram); a fogging system (maximum capacity of 330 g/m2/h); and a CO2 supply (maximum capacity 15 g/m2/h). Plants were grown in soil-pressed pots on NFT hydroponic gutters (Hortiplan, Belgium) placed on an inclination. A recirculating water system supplied water and nutrients via pressure-compensated narrow tubes injecting water into the gutters.
The experiment of the third Autonomous Greenhouse Challenge was conducted in the first half of 2022 in six different high-tech Venlo-type greenhouse compartments of Wageningen University & Research in Bleiswijk, The Netherlands. The basic greenhouse construction and equipment with actuators, as well as the standard sensors and control of the greenhouse compartments, were identical to the elements which can be found in commercial greenhouses (see Section 2.2 Crop and Section 2.3 Greenhouse climate and crop control). However, the greenhouse compartment size was much smaller than in commercial practice. Five teams (CVA, DigitalCucumbers, Koala, MondayLettuce, VeggieMight) and a reference controlled the six compartments.

2.2. Crop

Two cultivation cycles of lettuce cv. “Lugano” (Rijk Zwaan, The Netherlands) were conducted in 6 equal greenhouse growing compartments. Lettuces were grown in a hydroponic NFT system. Seeds were propagated to seedlings 8 weeks before the transplanting date. The young plants were grown in cubes of compacted peat. On the days of transplanting (2 February and 3 May 2022, respectively), the seedlings were placed in the greenhouse compartments in small holes of slightly tilted gutters to which water with nutrients was supplied at a certain frequency.
The lettuces were grown in 3.2 m plastic gutters, all having 30 plant holes, with an 11 cm heart-to-heart distance. The gutters were 10 cm wide, so the maximum plant density was 92 (rounded) plants per m2 in the initial stage. Lettuces were grown in two rows of such gutters as depicted in Figure 1. Plant appearance, pests, and diseases were monitored weekly by experts without interfering with any operational control decisions in the compartments. Irrigation and nutrient recipes were determined by the experienced greenhouse staff of the Bleiswijk Research Center.
Leafy vegetables are sellable to retail at a particular weight and shape. The lettuce heads in the area of evaluation (Figure 2) were classified at the moment of harvest into three categories. Class A were sellable lettuces with a minimum average weight of 250 g, Class B were lettuces with a weight between 220 and 250 g, and Class C were non-sellable lettuce heads that were underweight and/or showed visible malformations. Malformations referred to quality aspects related to the shape of the plant and defects of the leaves (e.g., leaf discoloration, leaf rotting, and diseases).

2.3. Greenhouse Climate and Crop Control

Strategic and operational climate control was carried out by the participating teams of the third Autonomous Greenhouse Challenge. Strategic decisions included, e.g., the use (installation) of screens or artificial lighting and the starting density of the crop; operational decisions included, e.g., the timing and number of screen or lighting hours and crop spacing decisions. The AI algorithms of the teams determined the control setpoints of the heating temperature, CO2 concentration, humidity deficit, lighting intensity, operation of the blackout and energy screens, as well as leeward and windward ventilation. The mechanistic climate and lettuce crop models of WUR (KASPRO and INTKAM, respectively) could be used by the teams as a training environment for their algorithms before the start of each cultivation cycle.
Resource use was calculated based on measured data: heating energy (MJ/m2) with a price of 0.0375 EUR/kWh; electricity (kWh/m2) with a price of 0.125 EUR/kWh for the on-peak hours (07:00–23:00) and 0.075 EUR/kWh for the off-peak hours; and CO2 use (kg CO2/m2) with a price of 0.12 EUR/kg.
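The variable-cost bookkeeping above can be sketched as follows. The prices come from the text; the function name, the example meter readings, and the MJ-to-kWh conversion (1 kWh = 3.6 MJ) applied to the heating meter are our own illustrative assumptions, not the competition's official accounting code.

```python
# Prices as stated in the text.
HEAT_PRICE = 0.0375     # EUR/kWh for heating energy
ELEC_PEAK = 0.125       # EUR/kWh, on-peak hours 07:00-23:00
ELEC_OFFPEAK = 0.075    # EUR/kWh, off-peak hours
CO2_PRICE = 0.12        # EUR/kg CO2

def resource_costs(heat_mj, elec_peak_kwh, elec_offpeak_kwh, co2_kg):
    """Variable resource costs per m2 of greenhouse area.

    Heating is metered in MJ/m2; we assume the EUR/kWh price is applied
    after converting with 1 kWh = 3.6 MJ.
    """
    heat_cost = (heat_mj / 3.6) * HEAT_PRICE
    elec_cost = elec_peak_kwh * ELEC_PEAK + elec_offpeak_kwh * ELEC_OFFPEAK
    co2_cost = co2_kg * CO2_PRICE
    return heat_cost + elec_cost + co2_cost
```

For example, 36 MJ/m2 of heating, 10 and 4 kWh/m2 of peak and off-peak electricity, and 2 kg/m2 of CO2 would cost about 2.17 EUR/m2 under these assumptions.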
As in commercial practice, the spacing system allows for several plant densities; densities could be reduced from the starting density of 92 heads per m2 via a density of 60, 45, 30, 23, and 18, to the lowest density of 15 lettuce heads per m2. The teams’ algorithms had to automatically make the spacing decisions.
The following prices per lettuce head were given: Class A = 0.50 EUR/head, Class B = 0.40 EUR/head, and Class C = 0.00 EUR/head. In commercial practice, harvested lettuce heads are sold per head, but the economics of a greenhouse are ultimately expressed as resource use and production per average m2 of growing area. Therefore, the price of the lettuce was multiplied by the average number of heads per m2 of the growing area. The formula to calculate the average lettuce crop density (heads/m2) is the following:
$$\mathrm{AverageCropDensity} = \frac{D}{\sum_{d=1}^{D} \frac{1}{\mathrm{density}_d}} \quad (1)$$
where $D$ is the total number of days from transplanting until harvest and $\mathrm{density}_d$ is the plant density on day $d$.
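Equation (1) is a harmonic mean: it averages the growing area occupied per head over the cultivation days, so days spent at low density weigh more heavily. A minimal sketch (function and variable names are our own):

```python
def average_crop_density(daily_densities):
    """Harmonic mean of daily plant densities (heads/m2), per Equation (1):
    D divided by the sum of 1/density over all D cultivation days."""
    D = len(daily_densities)
    return D / sum(1.0 / d for d in daily_densities)
```

For instance, a crop kept one day at 60 heads/m2 and one day at 30 heads/m2 averages 40 heads/m2, not the arithmetic mean of 45, reflecting that the low-density day occupies twice the area per head.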
Teams had to maximize net profit, calculated as income minus costs. Income was determined by multiplying the yield with the price per class. The total costs consisted of fixed and variable components associated with the greenhouse operation. On top of that, teams were ‘charged’ EUR 1 for every manual intervention on their autonomous algorithm. This penalty was meant to strongly discourage such interventions, ensuring that the algorithms would work as autonomously as possible. Fixed costs accounted for the plant material, maintenance, and depreciation costs of the greenhouse equipment. The variable costs accounted for the resource use (electricity for artificial lighting, energy for heating, and CO2 injection).
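The net-profit objective can be sketched as below. The class prices are from the text; the idea of expressing the harvest as class fractions of the average heads/m2 from Equation (1), and charging the EUR 1 intervention penalty in the same per-m2 ledger, are simplifying assumptions of this sketch rather than the official scoring code.

```python
PRICE = {"A": 0.50, "B": 0.40, "C": 0.00}  # EUR per lettuce head

def net_profit(class_fractions, avg_density, fixed_costs, variable_costs,
               n_interventions):
    """Net profit per m2 of growing area: income minus costs.

    class_fractions: share of harvested heads in each quality class.
    avg_density: average heads/m2 over the cycle (Equation (1)).
    Each manual intervention costs EUR 1 (assumed here to be booked
    directly into the same per-m2 total for simplicity).
    """
    income = avg_density * sum(PRICE[c] * f for c, f in class_fractions.items())
    return income - fixed_costs - variable_costs - n_interventions
```

With 80% Class A, 10% Class B, and 10% Class C at an average density of 60 heads/m2, income is 26.4 EUR/m2; subtracting 10 EUR fixed costs, 3 EUR variable costs, and 2 interventions leaves 11.4 EUR/m2.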

2.4. Data Communication

Data communication between the underlying systems was vital to ensure a stable, uninterrupted integration and operation. In this experiment, an Azure file share was made available to ensure enough storage capacity for collected datasets. Azure Virtual Machines—NCasT4_v3-series (VMs) were used for high-performance computing and deploying AI workloads, such as real-time inferencing of user requests. The infrastructure supported the communication between the greenhouse climate computer, control systems, sensing devices, the state of actuators, and the measured indoor and outdoor climate (Figure 3) (Appendix A Table A1). Numerical time-series data of the realized controls, climate, and additional sensors can be found under the Data Availability Statement.

2.5. Remote Sensing and Data Collection

In each greenhouse compartment, standard sensors were made available, comparable to earlier experiments described in [21,22]. These consist of an outside weather station, obtained weather forecast, and indoor climate parameters (temperature, relative air humidity, PAR light, CO2) along with the status of all actuators (heating, fogging, lighting, screening, CO2) in 5 min intervals. The output of the standard sensors was continuously available as input for the teams’ algorithms.
In commercial production, lettuce traits are seldom collected during the growing cycle and crop performance is evaluated by growers’ visual inspections only. In this experiment, RealSense D415 [57] cameras were hung 1 m above the growing crop in the area of evaluation. The camera uses stereo vision and stores depth, RGB, and IR images. All camera parameters, both intrinsic and extrinsic, are provided with the published dataset under the Data Availability Statement. These parameters could be used to convert the images to point clouds. Images were taken every 15 min in each compartment during the cultivation cycles.
Periodic destructive harvests of six plants per compartment were taken on the day of planting and subsequently on a weekly basis. Destructive measurements of plant height, diameter, fresh weight, and dry weight, as well as scores for leaf deformation due to outer leaf tip burn, were carried out. The individual lettuce plants were taken from the right and left side of each compartment as shown in Figure 2b. Next to that, images of the individual plants were taken every 15 min; an example can be found in Figure 4.

2.6. Image Processing for Plant Spacing Decisions

One of the main research questions is how images taken in a greenhouse under conditions that conform to commercial practice, together with computer vision, can be used to determine the optimal spacing strategy. To do this, the images of the lettuce crop in the greenhouse need to be related to a relevant crop variable, such as crop growth rate. From the time series of images taken inside the greenhouses (Figure 4), the coverage can be calculated over time. The coverage can be defined as the area covered with green leaves relative to the total ground surface area. However, coverage might not be a good indicator of plant growth rate, as the growth rate may decline once the leaves touch those of neighboring plants. It can be assumed that at very high coverage, growth is hampered. So, coverage might not be a suitable parameter for spacing decisions. Crop volume, on the contrary, might describe plant growth rate even if the coverage is close to 100%. Crop volume can be estimated from coverage and height and can be used to determine crop growth rates over time. The volume over time is a relevant crop parameter; however, it might not directly assess whether spacing was carried out correctly. Another option is to calculate light interception (or light loss). The different methods of “coverage” (Section 2.6.2), “volume over time” (Section 2.6.3), and “light loss over time” (Section 2.6.4) are explained in the following sections, after describing how crop segmentation (Section 2.6.1) from greenhouse images is implemented.
For optimizing spacing decisions, the challenge lies in realizing a fast plant growth rate on one hand and limited use of space on the other hand. Early spacing facilitates fast growth rates thus increasing yield over time, late spacing facilitates less occupation of space thus decreasing resource use. An optimum spacing decision is therefore necessary.

2.6.1. Crop Segmentation

First, the lettuce crop needs to be segmented from the background. Ideally, the growth of each lettuce head over time would be identified using instance segmentation. However, this is only possible in the first 2 weeks; after that, the lettuce heads start to touch each other, and leaves overlap. As visible in Figure 5, it is then almost impossible to identify which leaves belong to which lettuce head. Therefore, semantic segmentation is used in this study, specifically DeepLabv3+ as implemented in detectron2 (v0.6) [58]. For training purposes, 23 images were annotated, in which each pixel was labeled as either background or lettuce; an example is shown in Figure 5. All training settings were kept the same as in the original detectron2 implementation, except that the number of iterations was set to 2500 and the image size to 1024 × 1024. For validation purposes, 12 additional images were annotated. The evaluation was carried out on the validation dataset using the mean Intersection over Union (mIoU) metric (Equation (2)), where M denotes the mask of the respective class, for ground truth (GT) and prediction.
$$\mathrm{mIoU} = \frac{1}{2} \left( \frac{|M_{GT,\mathrm{lettuce}} \cap M_{\mathrm{lettuce}}|}{|M_{GT,\mathrm{lettuce}} \cup M_{\mathrm{lettuce}}|} + \frac{|M_{GT,\mathrm{background}} \cap M_{\mathrm{background}}|}{|M_{GT,\mathrm{background}} \cup M_{\mathrm{background}}|} \right) \quad (2)$$
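For binary lettuce/background masks, Equation (2) reduces to averaging the per-class intersection-over-union. A minimal NumPy sketch (the actual evaluation was run inside detectron2; this standalone version is ours):

```python
import numpy as np

def mean_iou(pred, gt):
    """Mean IoU over the two classes (lettuce and background), Equation (2).

    pred and gt are arrays of identical shape where nonzero/True = lettuce.
    An empty union is scored as a perfect 1.0 for that class.
    """
    pred = np.asarray(pred).astype(bool)
    gt = np.asarray(gt).astype(bool)
    ious = []
    for cls_pred, cls_gt in ((pred, gt), (~pred, ~gt)):  # lettuce, background
        inter = np.logical_and(cls_pred, cls_gt).sum()
        union = np.logical_or(cls_pred, cls_gt).sum()
        ious.append(inter / union if union else 1.0)
    return float(np.mean(ious))
```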

2.6.2. Coverage

The coverage was calculated by segmenting the images with the trained DeepLabv3+ model. In these segmented images the number of pixels classified as lettuce was divided by the total number of pixels, see Equation (3).
$$\mathrm{Coverage}\ [\%] = \frac{\#\ \mathrm{lettuce\ pixels}}{\#\ \mathrm{total\ pixels}} \times 100 \quad (3)$$
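Given a binary segmentation mask, Equation (3) is a one-line pixel ratio; a sketch with our own function name:

```python
import numpy as np

def coverage(mask):
    """Coverage in percent (Equation (3)): the share of pixels classified
    as lettuce (nonzero/True) relative to all pixels in the image."""
    mask = np.asarray(mask).astype(bool)
    return 100.0 * mask.sum() / mask.size
```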

2.6.3. Volume over Time

To determine the lettuce head volume over time from images taken in the greenhouse under conditions that conform to practice, a “ground plane” is needed. This ground plane was determined by fitting a plane with RANSAC [59] through the non-lettuce pixels on the day of planting. RANSAC can compensate for slight skewness in camera mounting. This method assumes that the camera position does not change after planting. The height is subsequently calculated by determining the point-to-plane distance for each pixel classified as lettuce (Figure 6) (Equation (4)). The volume was then calculated by multiplying the height by the pixel size in mm and dividing by the density at each moment in time (Equation (5)). By dividing by the density, the volume per plant was calculated, which was needed to correct for different plant densities. The growth was then determined based on the volume increase.
$$\mathrm{Height}_i\ [\mathrm{cm}] = \mathrm{dist\_plane}_i - \mathrm{dist}_i \quad (4)$$
$$\mathrm{Volume\ over\ time}\ [\mathrm{cm}^3] = \frac{\sum_{i=0}^{N} \mathrm{Height}_i \cdot \mathrm{pixelsize}^2}{\mathrm{density} \cdot 1000} \quad (5)$$
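The plane-fitting and height steps can be sketched with a minimal NumPy RANSAC. This is a toy stand-in, not the implementation of [59]: the real pipeline works on RealSense point clouds, and the function names, iteration count, and inlier threshold here are illustrative assumptions.

```python
import numpy as np

def fit_ground_plane(points, n_iter=200, thresh=0.005, rng=None):
    """Minimal RANSAC plane fit through background (non-lettuce) 3D points,
    as used to establish the ground plane on the day of planting.
    Returns (normal, d) with normal . p + d = 0 for points p on the plane."""
    rng = np.random.default_rng(0) if rng is None else rng
    best_inliers, best = 0, None
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:          # degenerate (collinear) sample, skip
            continue
        normal /= norm
        d = -normal @ p0
        inliers = (np.abs(points @ normal + d) < thresh).sum()
        if inliers > best_inliers:
            best_inliers, best = inliers, (normal, d)
    return best

def plant_heights(lettuce_points, plane):
    """Point-to-plane distance (Equation (4)) for points classified as lettuce."""
    normal, d = plane
    return np.abs(lettuce_points @ normal + d)

def volume_over_time(heights, pixel_size, density):
    """Per-plant crop volume (Equation (5)) from per-pixel heights."""
    return (heights * pixel_size ** 2).sum() / (density * 1000.0)
```

On a synthetic flat floor, the fitted plane recovers the height of a point 0.12 m above it, which is the quantity summed per lettuce pixel in Equation (5).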

2.6.4. Light Loss over Time

For optimal use of the greenhouse area, it is important to evaluate the spacing decision. In this research, the light loss was calculated over time. Put simply, we determined whether the light loss after spacing was larger than the light loss before spacing due to overlapping leaves. The light loss after spacing was calculated as 100 minus the current coverage (Section 2.6.2, Equation (6)). The light loss before spacing was calculated as the difference between the current coverage and the theoretical coverage, where the theoretical coverage is the previous coverage divided by the previous density and multiplied by the new density (Equation (7)). This difference indicates how much light was lost due to overlapping leaves. The overall light loss was then calculated as the difference between the two (Equation (8)). If, for example, the light loss was negative, too many leaves were overlapping, i.e., the light loss before spacing was larger than the light loss after spacing. If, on the other hand, the light loss was positive, the spacing decision was made too early, because more light was lost at the new spacing density than before.
$$\mathrm{LightLoss}_{current} = 100 - \mathrm{coverage}_t \quad (6)$$
$$\mathrm{LightLoss}_{before\ spacing} = \mathrm{coverage}_t - \frac{\mathrm{coverage}_{t-1}}{\mathrm{density}_{t-1}} \cdot \mathrm{density}_t \quad (7)$$
$$\mathrm{LightLoss} = \mathrm{LightLoss}_{current} - \mathrm{LightLoss}_{before\ spacing} \quad (8)$$
where t denotes the time when spacing occurred and t − 1 the time just before spacing.
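A direct transcription of Equations (6)–(8), evaluated at a single spacing event (an illustrative sketch; the variable names are ours):

```python
def light_loss(cov_before: float, cov_after: float,
               dens_before: float, dens_after: float) -> float:
    """Light-loss indicator (Equations (6)-(8)) at one spacing event.

    Coverages are in %, densities in plants/m2. A negative result means
    spacing was too late (overlap loss dominated); a positive result
    means spacing was too early.
    """
    loss_current = 100.0 - cov_after                       # Eq. (6)
    theoretical = cov_before / dens_before * dens_after    # projected coverage
    loss_before = cov_after - theoretical                  # Eq. (7)
    return loss_current - loss_before                      # Eq. (8)
```

For example, spacing from 30 to 20 plants/m2 at 99% coverage both before and after the event gives a strongly negative light loss, flagging heavy leaf overlap before spacing.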

3. Results

Two experiments were carried out consecutively. The first experiment offered the teams the possibility to test their algorithms by growing a real crop in a real greenhouse, bridging the gap between simulation and reality. The second experiment was the eventual challenge that determined the winner of this third Autonomous Greenhouse Challenge. In both experiments, the algorithms optimized income against costs to achieve a maximum net profit. A challenge was that the outside conditions during the second experiment (early summer) were very different from those in the first experiment (late winter). Both datasets have been made publicly available and can be used for further development of intelligent control of lettuce production systems. We show both, but focus on the second dataset, which determined the winner of the competition.
The results of realized climate, resource use, crop yield, and applied plant spacing are given for the two cultivation cycles in the six greenhouse compartments.

3.1. Climate and Resource Use Analysis

The climate control strategy in the greenhouse largely determines the use of resources. Figure 7 illustrates the average daily greenhouse air temperature in the different compartments during the late winter and early summer experiments. The realized daily average temperature ranged between 18 °C and 22.5 °C in the first experiment (winter), whereas in the second experiment (early summer) the minimum and maximum diurnal temperatures were on average 1 °C higher. Teams decreased their energy consumption for heating by more than 80%, up to 97%, except for teams DigitalCucumbers and VeggieMight (Appendix A Figure A1). For DigitalCucumbers, the operation of the heating pipes is reflected in the high diurnal temperature realized in their compartment, augmented by their low ventilation rates, which maintained the highest CO2 concentration (Figure 8) despite the low CO2 dosage rates. VeggieMight realized lower temperatures due to higher ventilation rates, despite the higher energy used for heating. Appendix A Figure A1 shows the energy consumption for heating over time in both experiments, an important part of the resource use.
The daily light integral (DLI, mol m−2 plant−1) in the greenhouse compartments is the sum of the outside sunlight, influenced by each team's screen usage (Appendix A Figure A2), topped up with the artificial light for each team (Appendix A Figure A4). Especially the longer day length and higher intensities of solar radiation resulted in a higher cumulative DLI during the second experiment. The realized indoor daily PAR for each team, as the sum of solar radiation and artificial light during the two cultivation experiments, is illustrated in Figure 9. In the second experiment, team VeggieMight realized the highest cumulative DLI (Figure 10) and the highest total light interception per head of lettuce, despite zero hours of artificial illumination (Appendix A Figure A4), as a result of the intelligent operation of their blackout screen (Appendix A Figure A3) and their low plant density (Table 1). Comparable cumulative DLIs for all teams were observed in the first experiment. However, in the first experiment, the light demand of lettuce was mainly covered with artificial light, as 50% to 88% of the measured light originated from LEDs (Appendix A Figure A4). During the second experiment, comparable light levels were again required to reach the target weight, illustrated by the colored circles in both figures. Due to the higher solar radiation, less artificial light was needed in this experiment. Appendix A Figure A5 shows the electricity consumption over time in both experiments for each team, an important part of the resource use.
All climate strategies applied by the teams resulted in differences in resource use which are summarized in Table 2.

3.2. Crop Yield Analysis

Figure 11 shows the initial plant density of 92 heads/m2 chosen by all teams and the different densities realized over time. Throughout the cultivation periods, lettuce heads were spaced according to the decisions of each team's algorithm. The purpose of spacing was to balance fast crop growth against greenhouse space utilization, and thus against resource use and the labor costs of spacing. As the average plant density impacted costs due to resource use and labor for spacing events, teams MondayLettuce and DigitalCucumbers attempted to exploit the advantage of higher densities and fewer spacing interventions, respectively. The average plant densities of the two cultivation cycles for the six compartments are shown in Table 1.
During the first experiment, the teams' algorithms seemed to have computed the harvest time quite accurately at the target weight of 250 g, as can be seen in Figure 12. The crops of DigitalCucumbers and VeggieMight grew poorly; they were still far from the target weight at the moment the first cultivation was terminated. In the second experiment, the algorithms of all participants harvested too late, so the harvest weight was higher than the target weight. Only the reference compartment was harvested on time. Appendix A Table A3 summarizes the lettuce weight at harvest and the number of cultivation days per compartment. It also shows the dates at which the target weight would have been achieved in the different compartments, obtained by linearly interpolating the weekly fresh weight measurements.
Total lettuce crop yields (Figure 12 and Appendix A Table A3) and the quality assessment (Appendix A Figure A6) resulted in a computed income from this cultivation cycle. In Section 3.3 this income is compared with the costs associated with the second cultivation cycle to obtain the computed net profit (Table 2).

3.3. Net Profit

The combination of climate strategies, resource use, crop yield, and quality realized by the teams resulted in different net profits. Details are shown in Table 2.

3.4. Plant Spacing Analysis

The net profit relies on crop yield, quality, resource use, and greenhouse occupation. The realized plant growth rates, plant densities, and final harvest based on timely estimation of plant weights were shown to be crucial for the net profit. Therefore, this paper shows how different detailed computer vision analyses can support timely spacing and harvest decisions.

3.4.1. Coverage

The computer-vision-based data analysis of plant growth mainly relies on the segmentation of the images of lettuce taken at a defined area over time. An example of such an image is shown in Figure 13 (left); the result of segmentation with the DeepLabv3+ algorithm, as described in the Materials and Methods section, is shown in Figure 13 (right). The algorithm reached a mIoU of 98.2% on the validation dataset, with 100% indicating a perfect segmentation. Even though the validation dataset was relatively small, the segmentation procedure can be considered sufficiently robust, since only a small fraction of the processed images from the datasets was segmented incorrectly. An example is given in Figure 13. Although white edges are visible (right), showing where pixels were falsely assigned as 'lettuce', these edges are relatively small. The algorithm also appeared robust in dealing with different light conditions in the greenhouses, as shown in the correctly processed bottom row of Figure 13.
Figure 14 shows the percentage of cultivation area covered by lettuce for each image over time for each compartment. In the first experiment, less efficient space occupation and coverage were observed due to the explorative decision-making of the teams. With the more strategic decisions during the second experiment, teams targeted more efficient space occupation, and a high coverage percentage was realized in a shorter time. Most teams maintained a coverage above 90%; only teams VeggieMight and Reference seemed to have spaced too early if only coverage is considered for the decision-making. However, as explained in the Materials and Methods section, coverage alone might not be a good parameter to decide on spacing, since growth might already be hampered by spacing too late.

3.4.2. Crop Volume over Time

Next to coverage, crop width, height, and volume are suitable crop traits associated with growth. To explore the potential of these traits, height and volume were determined over time in this research. Since volume is strongly correlated with height, a comparison between calculated and manually measured crop height is shown in Figure 15. The figure shows a strong correlation between the height calculated from the RealSense camera images and the manually measured height (ground truth), with a high R2 and a slope close to 1. Therefore, the daily height measurements from the RealSense camera were assumed to be correct and were used to calculate the volume.
In Figure 16, the height calculated from images taken by the RealSense camera over time is shown for all compartments. Reference and VeggieMight have a lower predicted plant height due to their early spacing decisions. DigitalCucumbers has the highest plant height; elongation occurred when plants were touching each other due to the high density. An interesting phenomenon is that after the last spacing decision (ca. after 5 June) not only the height (Figure 16) but also the volume (Figure 17) reaches a plateau. This means that the daily fresh weight increase (Figure 12) of the last weeks is not visible in this method of image analysis.
In Figure 17 the daily estimated plant volume from imaging is given. The circles indicate on which day the target harvest weight of 250 g was reached. Large differences between the compartments were observed. In the second experiment, the difference in volume between CVA and MondayLettuce is remarkable. At the optimal harvest day, CVA has a volume of 6705 cm3/head and MondayLettuce 4844 cm3/head. A part of this large difference is caused by the plant density, which at the end was 15 and 22.5 plants/m2 for CVA and MondayLettuce, respectively. Because of the high plant density of MondayLettuce, more leaves were overlapping (coverage was 98.5%, compared to 87.2% for CVA). Since overlapping leaves do not contribute to volume in the image analysis, this explains the lower volume of MondayLettuce with respect to CVA (Figure 17). This phenomenon is also summarized in Table 3, in which the volume is sorted from high to low. Although the weight of the lettuce at the optimal harvest day in each compartment is approximately 250 g, there are differences in volume per head for similar crop densities. The Reference, for example, had a much higher volume than Koala. Figure 16, Figure 17, and Table 3 thus show different trait values for similar weights. From this, it can be concluded that the volume calculation alone is not a conclusive trait for estimating the weight of a head of lettuce.

3.4.3. Harvest Indicator over Time

As presented in the previous section, teams could have harvested earlier given the target harvest weight of 250 g per head. From Figure 17 and Table 3, there were differences in volume at similar harvest moments, indicating that volume might not be an ideal indicator for determining the ideal moment of harvest. The correlation coefficients of all traits calculated from the image analysis can be found in Appendix A Table A3. From this table, the area per plant multiplied by the maximum height has the highest correlation coefficient, higher than that of volume.
In Figure 18 (left), the fresh weight is visualized as a function of the area per plant multiplied by the maximum height. From this figure, it can be concluded that there is still some noise: the Mean Absolute Error (MAE) is 22.98 g/head and the RMSE is 31.2 g/head. According to the second-order equation, the harvest weight is reached at 7840 cm3.
Figure 18 (right) illustrates the area per plant multiplied by the maximum height as the most representative harvest indicator. The colored circles indicate the moments of harvest that satisfy the fresh weight criterion of 250 g, whereas the grey horizontal line depicts the harvest indicator threshold of 7840 cm3. In Table 4 the exact dates of the fresh weight criterion and the harvest indicator are given for the different teams.
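Illustratively, a second-order fit of fresh weight against the indicator (area per plant times maximum height) and the inversion to a harvest threshold can be sketched with NumPy. The function and the synthetic data in the test are ours; the 7840 cm3 threshold in the paper comes from the real dataset, not from this sketch:

```python
import numpy as np

def fit_harvest_indicator(indicator_cm3, fresh_weight_g, target_g=250.0):
    """Fit fresh weight as a second-order polynomial of the harvest
    indicator and solve for the indicator value at which the target
    weight is reached. Returns (polynomial coefficients, threshold).
    """
    coeffs = np.polyfit(indicator_cm3, fresh_weight_g, deg=2)
    a, b, c = coeffs
    # Solve a*x^2 + b*x + (c - target) = 0, keep the smallest positive root.
    roots = np.roots([a, b, c - target_g])
    real = roots[np.isreal(roots)].real
    return coeffs, float(real[real > 0].min())
```

In practice the fit would be done on the weekly destructive fresh-weight samples paired with the image-derived indicator values.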

3.4.4. Light Loss over Time

In Figure 19, the result of the light loss calculation over time in the second experiment is shown. This light loss factor is calculated by comparing the coverage just before and just after a spacing instance; the result therefore yields a number of points rather than a time series. The hypothesis is that spacing is optimal when the light loss calculation gives a result close to zero at each spacing action. A light loss close to zero means that the lettuce heads had just started touching each other by the time the spacing was performed. This allows for minimal greenhouse space occupation, which saves resources, whereas quality losses are prevented. Figure 19 shows that especially VeggieMight and the Reference spaced too early, whereas Koala always spaced very late. DigitalCucumbers had two good spacing moments at the end of the experiment but was too late for two others; for both the first and second spacing decisions, they were the latest team, which resulted in large light losses and irreversible damage to the crop (Appendix A Figure A6). CVA seemed to have the best spacing strategy, since most of their spacing decisions were made with light loss points close to zero. However, even this team once had a large light loss, below −10. Keep in mind that the light loss can only be calculated after spacing. It should therefore be treated as an observable parameter to train decision-making algorithms that base the decision on (a combination of) the covered fraction and the average head volume.
Combining Figure 19 with Figure 14, we learn that 98% seems to be a reasonable coverage threshold for autonomous spacing decisions.

4. Discussion

In the experiment of this study, the strategic and operational scheme of lettuce crop cultivation was determined by AI algorithms developed by the teams participating in the challenge. These AI algorithms were based on greenhouse climate and crop sensor information. The final optimization target was net profit: on the one hand, a high crop growth rate and high plant quality for a high income, and on the other hand, low resource use for low costs. Since greenhouse occupation is essential, optimal plant spacing decisions are important.
Commercial lettuce growing is a continuous process of daily planting young plantlets and harvesting the full-grown lettuce heads. The target weight is realized over a reasonable time window (6–8 weeks), depending on the cultivation strategy. Economics were expressed per m2 of the production area; therefore, the resource use and selling prices were multiplied by the average number of lettuce heads per m2 (Equation (1)).
Teams had two cultivation cycles. The first cycle was used by the teams to test and explore their algorithms; the second cycle determined the winner. As the teams must have applied their latest skills and knowledge in this second growing cycle, the discussion focuses on the early summer results.
For an efficient greenhouse occupation, and to leverage the effect of the average density of lettuce heads on the final profit, some teams maintained high densities (Table 1). At high densities, neighboring plants competed for light (Figure 10). In both experimental cycles, 11–15 mol PAR/head was needed to realize the target weight of 250 g per head. However, MondayLettuce used only 9 mol/head in the second experiment. This team maintained a low cumulative DLI, which, combined with the highest density among all teams in the second cultivation, yielded the lowest total light interception per plant. DigitalCucumbers also realized a high density. The high plant density resulted in intertwined root systems that made the first spacing difficult, and it seems to be linked to the outer tip burn (Appendix A Figure A6) and the malformed and elongated plants (Figure 16).
Team VeggieMight realized the highest cumulative indoor PAR, even without applying any supplemental lighting (Appendix A Figure A4). This was a result of zero deployment hours of the blackout screen and very limited deployment of the energy screen. The choice not to use any lighting or blackout screens saved the fixed costs associated with the equipment and the associated running costs for electricity. However, this team also suffered from the occurrence of outer tip burn and malformations on the plants, even though they had the lowest average plant density. The high fraction of class C products resulted in a low income. Similar to VeggieMight, team Koala did not use supplemental lighting. This team was also restrictive with CO2 dosing in the first weeks of the cropping cycle and maintained a high coverage, ranging from 93% to 98.9%. The fact that the team's algorithm managed to reduce costs, maintain a high average head density, and achieve a high fraction of class A products made team Koala the winner.
The final harvest was too late for all teams (Figure 12). Timely harvest would have resulted in lower resource use and higher average plant density. The effect of earlier harvest on net profit cannot be quantified, unfortunately, since the quality of the lettuce heads at earlier moments in time cannot be predicted from the collected data.
Contrary to commercial greenhouse operations with continuous planting, spacing, and harvesting, the two growing cycles of this study concerned single batches. The choice for single batches was required to fit the format of the Autonomous Greenhouse Challenge aimed at allowing teams to develop and show the potential of autonomous algorithms growing a crop based on data analyses and vision. As a result, the computed profits, although valid according to the rules of the Challenge, cannot completely be compared with commercial practice. Dedicated trials would be needed to reflect deeper on the lettuce growth responses in continuous commercial cycles. However, such trials were outside the scope of this research. Nevertheless, results show that greenhouse occupation is essential and that optimum plant spacing decisions are important.
In fully autonomous cultivation, such decisions should be made based on continuous sensor information. In this study, camera images obtained by RealSense cameras in the greenhouse were used to obtain information on crop growth. DeepLabv3+ was used to separate the lettuce from the background. The model was trained with only a minimal amount of data. However, considering the output images in Section 3.4 and the high mIoU (98.2%), it can be concluded that the segmentation proved suitable as a basis for crop spacing decisions.
The RealSense cameras also provided data on the development of height and volume over time. We expected that these two traits could be used to describe growth, as the development of volume over time has been related to biomass [60,61]. As the height and width information proved to be very accurate (a mIoU of 98.2% for the covering fraction and an R2 of 0.976 for the height estimation), the lettuce head volume could be reasonably estimated. However, the computed volume proved unsuitable for predicting the crop weight. First, this can be explained by the fact that overlapping leaves do not contribute to coverage or volume. Secondly, in Figure 16 the height over time flattens during the last 2 weeks and, related to that, in Figure 17 the volume flattens during the last days. At the same time, destructive measurements show that the fresh weight grows especially in these last days. As neither the coverage nor the height and volume reflected this fresh weight growth, it can be concluded that in the final stage, growth takes place from the central point of the head, resulting in more compact lettuce heads.
The product of the area per lettuce head and the maximum height resulted in the highest correlation coefficient with fresh weight (Appendix A Table A3). Three papers using the dataset of [50,51,52] reported RMSEs up to 25.3 g. As indicated, we obtained a lower accuracy; however, the datasets are not fully comparable. Our dataset was made within the greenhouse, with many plants and, consequently, overlapping leaves, whereas the previous dataset and other research on lettuce growth contained data of single plants only [53,55]. In our research, the predicted fresh weight was able to determine the moment of harvest non-destructively for the majority of the teams. The suggested harvest indicator dates closely match the harvest dates that satisfy the target weight criterion deduced from the intermediate destructive harvests. For MondayLettuce, no results were derived, as the team's high final density resulted in a notably lower volume (Table 3), hampered by the high leaf occlusion.
The light loss indicator proved to be a good and automatically computable parameter to judge spacing decisions just after the spacing was performed. This hindsight factor is therefore welcome as an indicator to learn a suitable covering factor to use as a threshold for making a spacing step. In Section 3.4.4, based on the light loss indicator, a covering factor of 98% appeared to be a suitable moment for spacing. The results of teams that spaced at even higher covering factors correlated with more severe issues with outer leaf tip burn and malformations; such high factors are therefore not advisable. Spacing at lower thresholds might have given a better quality, but it would certainly also lead to higher costs per m2 due to lower average plant densities. Further experience with spacing at a lower threshold might show that the possibly higher quality outweighs the additional costs. The Reference and VeggieMight, for example, had a light loss indicator that was mostly greater than 10. For these teams, a later spacing strategy would likely not have had negative consequences.
In the future, other harvest indicators can be explored by deploying spectral indices to describe lettuce growth and to address the existing shortcomings (overlapping leaves, increased compactness). Spectral indices can be successfully linked to the leaf area index in greenhouses [62]. Kizil et al. [63] estimated the yield of lettuce plants using spectral indices. Although their solution only worked for single plants, it might be an opportunity to explore further for the purpose of spacing decisions and fresh weight estimation. It is also good to point out that in the literature, growth from non-destructive measurements is mostly determined under 'ceteris paribus' conditions, meaning that the environment does not change. In commercial practice and in the given dataset, the environment is continually changing due to different climate, light, and spacing strategies. The latter necessitates the use of larger datasets than those currently accessible. The acquisition of such datasets, combined with the given dataset, has the potential to bridge the divide between academic research and industrial production systems in the future.

5. Conclusions

  • In the experiment described here, teams were able to autonomously control greenhouse lettuce crop production by AI algorithms.
  • Autonomous AI algorithms were developed based on greenhouse climate sensor information in time and on crop images maximizing the net profit of lettuce cultivation.
  • Realized crop growth and densities due to timely spacing decisions, and a realized final target harvest due to timely estimation of crop weight, were shown to have a large impact on net profit.
  • Images from 3D cameras and intelligent computer vision algorithms are helpful to make timely decisions on plant spacing and final harvest decisions.
  • Images of the lettuce crop canopy in the greenhouse have to be related to relevant crop parameters to predict crop growth. From the images taken inside the greenhouses over time, coverage, crop volume, maximum height, and light loss can be calculated to determine the optimum spacing moment. If the light loss is close to zero, an optimum spacing moment has been reached; in our experiments, this was at a coverage of 98%. The product of the area per plant and the maximum height of the plant is a promising indicator for the moment of harvest given a target weight. Deviations from destructive indicators are strongly linked to the crop's architecture, such as the impact of leaf occlusion.
  • We have shown that computer vision and deep learning algorithms can be used for automated plant spacing decisions toward the autonomous control of greenhouses. The provided open-source dataset contributes to another step in the development of autonomous greenhouses.
  • The reality gap between optimum research and commercial production conditions is a crucial aspect to be considered in computer vision applications. Larger datasets need to be acquired to bridge the gap.
  • Early pest and disease detection, real-time inclusion of volatile market prices, and robotics for crop handling are among the next steps toward higher levels of automation in horticulture (not part of this research).

Author Contributions

Conceptualization, A.S.P. and S.H.; methodology, S.H., F.d.Z., A.E., B.v.M. and A.S.P.; software, B.v.M., G.J., T.v.D. and F.d.Z.; validation, F.d.Z., B.v.M. and A.S.P.; formal analysis, B.v.M. and A.S.P.; investigation, M.B., A.E., F.d.Z., G.J., T.v.D., B.v.M. and A.S.P.; resources, S.H.; data curation, F.d.Z., B.v.M. and A.S.P.; writing—original draft preparation, A.S.P., B.v.M. and S.H.; writing—review and editing, F.d.Z., A.E., M.B., G.J. and T.v.D.; visualization, A.S.P. and B.v.M.; supervision, S.H.; project administration, S.H.; funding acquisition, S.H. All authors have read and agreed to the published version of the manuscript.

Funding

The organization of this international challenge was funded by Tencent, with additional sponsorship in the form of materials provided by Fluence, Gebr. Geers B.V., Sigrow, LetsGrow.com, Ridder, Hortiplan, Glastuinbouw Nederland, Kas als Energiebron, and Gemeente Lansingerland.

Data Availability Statement

The complete dataset of the 3rd Autonomous Greenhouse Challenge: Time-series data on realized climate with annotated crop lettuce-images is published online at https://doi.org/10.4121/21960932.v1, (accessed on 2 January 2023).

Acknowledgments

We would like to thank our sponsors and our international jury members. We also would like to thank the colleagues of our greenhouse facility who carried out part of the daily crop supervision. Finally, we would like to thank all teams and individual team members participating in this challenge.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Appendix A

Table A1. Data collected throughout the cultivation cycles for all greenhouse compartments on outdoor and indoor greenhouse climate, weather forecast, requested and realized operational controls, weekly destructive plant measurements on which images were annotated, and final harvest plant data. Data are open access [49,56].
Category | Parameter | Unit | Interval | Description
Measurement | Outdoor temperature | °C | 5 min | Meteo
 | Outdoor relative humidity | % | 5 min | Meteo
 | Global radiation | W/m2 | 5 min | Meteo
 | Wind speed | m/s | 5 min | Meteo
 | Wind direction | - | 5 min | Meteo
 | Rain | [1 rain–0 dry] | 5 min | Meteo
 | Heat emission (pyrgeometer) | W/m2 | 5 min | Meteo
 | Absolute humidity content | | 5 min | Meteo
 | Temperature greenhouse | °C | 5 min | Indoor climate
 | Relative humidity greenhouse | % | 5 min | Indoor climate
 | CO2 concentration greenhouse | ppm | 5 min | Indoor climate
 | Humidity deficit | g/m3 | 5 min | Indoor climate
 | Leeward vent position | % [0–100] | 5 min | Indoor climate
 | Windward vent position | % [0–100] | 5 min | Indoor climate
 | Temperature rail pipe | °C | 5 min | Indoor climate
 | Assimilation lighting (LED) | % [0–100] | 5 min | Indoor climate
 | Energy screen position | % [0–100] | 5 min | Indoor climate
 | Blackout screen position | % [0–100] | 5 min | Indoor climate
 | Cumulative minutes of CO2 dosing | minutes | 5 min | Indoor climate
 | Heating temperature | °C | 5 min | Indoor climate
Forecast | Outdoor temperature | °C | 5 min | Meteo
 | Outdoor relative humidity | % | 5 min | Meteo
 | Global radiation | W/m2 | 5 min | Meteo
 | Wind speed | m/s | 5 min | Meteo
 | Degree of cloudiness | [1–8] | 5 min | Meteo
Control | Ventilation temperature | °C | 5 min | Indoor climate
 | Lee side min vent position | % [0–100] | 5 min | Indoor climate
 | Net pipe minimum | °C | 5 min | Indoor climate
 | Energy screen | % [0–100] | 5 min | Indoor climate
 | Blackout screen | % [0–100] | 5 min | Indoor climate
 | CO2 | ppm | 5 min | Indoor climate
 | Humidity deficit | g/m3 | 5 min | Indoor climate
Crop | A class harvest | g | At harvest | >250 g
 | B class harvest | g | At harvest | 220–250 g
 | C class harvest | g | At harvest | <220 g or visible malformations
 | Plant density | #/m2 | Team dependent | 92 plants/m2 at transplanting
 | Day of harvest | Days after transplanting | Once | Team dependent
 | Height | cm | Weekly/At harvest | Weekly sampled plants and at harvest day, which was team dependent
 | Diameter | cm | Weekly/At harvest | Weekly sampled plants and at harvest day, which was team dependent
 | Fresh Weight | g | Weekly/At harvest | Weekly sampled plants and at harvest day, which was team dependent
 | Dry Weight | g | Weekly/At harvest | Weekly sampled plants and at harvest day, which was team dependent
 | Leaf deformation | [1–3] | Weekly/At harvest | Weekly sampled plants and at harvest day, which was team dependent; scoring protocol 1–3, applies to a head of lettuce
 | RGB, depth images | - | End of each cultivation | Annotated single crop and canopy images
Table A2. Actuators and default sensors installed in all the greenhouse compartments during the growing cycles and description of the installed equipment.
Equipment | Description
Rail pipe | Max capacity 129 W/m2
Energy screen | LUXOUS 1547 D FR, Ludvig Svensson
Blackout screen | OBSCURA 9950 FR W, Ludvig Svensson
LED lights | Dimming 27–270 µmol/m2/s with efficiency 2.4 µmol/J, VYPR 2p, Fluence by Osram
Fogging | 330 g/m2/h
CO2 supply | Max capacity 15 g/m2/h
Hydroponic gutters (NFT) | Length 3.2 m, 30 plant holes, 11 cm heart-to-heart distance, 10 cm wide, Hortiplan
Measuring box | Indoor temperature, relative humidity and CO2 sensor in ventilated measuring box placed in the middle of the compartment above the growing crop
PAR sensor | PAR sensor placed above canopy and below LED lights
RGB, depth camera | Depth Camera D415 (Intel RealSense)
Figure A1. Heating energy consumption (MJ/m2) of all compartments during the first and second cultivation.
Figure A2. Energy screen usage in all compartments during the first and second cultivation.
Figure A3. Blackout screen usage in all compartments during the first and second cultivation.
Figure A4. Artificial lighting usage in all compartments during the first and second cultivation.
Figure A5. Electricity consumption (kWh/m2) in all compartments during the first and second cultivation.
Table A3. Realized harvest dates and fresh weight at the moment of harvest, along with the dates at which the weight target was realized, obtained by linearly interpolating the weekly destructive measurement data.
| Compartment | Realized Harvest Date [dd/mm] | Number of Cultivation Days [Days] | Average FW at Realized Harvest [g/Head] | Harvest Date Satisfying the FW Criterion [dd/mm] | Average FW Satisfying the FW Criterion [g/Head] |
| --- | --- | --- | --- | --- | --- |
| Reference | 9 June | 38 | 271.18 | 8 June | 258.10 |
| Koala | 17 June | 46 | 402.81 | 9 June | 260.50 |
| CVA | 13 June | 42 | 342.06 | 7 June | 265.35 |
| MondayLettuce | 14 June | 43 | 294.96 | 9 June | 254.02 |
| DigitalCucumbers | 15 June | 44 | 390.85 | 7 June | 260.61 |
| VeggieMight | 13 June | 43 | 389.80 | 4 June | 251.91 |
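The "harvest date satisfying the FW criterion" above follows from linear interpolation of the weekly destructive fresh-weight measurements. A minimal sketch of that step, assuming monotonically increasing weekly weights; the sample numbers below are illustrative, not the measured competition data:

```python
import numpy as np

def day_target_reached(days, fresh_weights, target=250.0):
    """Linearly interpolate weekly destructive measurements (g/head) to
    estimate the day on which the target fresh weight is reached.
    `days` are days after transplanting; weights must be increasing."""
    days = np.asarray(days, dtype=float)
    fresh_weights = np.asarray(fresh_weights, dtype=float)
    if target > fresh_weights[-1]:
        raise ValueError("target weight not reached within the measured period")
    # np.interp maps the target weight back to a day because the
    # weight series grows monotonically with time
    return float(np.interp(target, fresh_weights, days))

# Illustrative weekly samples: 250 g is crossed between day 35 and day 42
d = day_target_reached([28, 35, 42], [180.0, 240.0, 310.0], target=250.0)
```

With these sample values the target is reached one day after the second sampling, i.e. day 36.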
Table A4. Correlation coefficient between measured fresh weight and the corresponding parameters derived from RGB and depth images.
| Parameter | Correlation Coefficient |
| --- | --- |
| Coverage percentage | 0.5392 |
| Average height [cm] | 0.6953 |
| Median height [cm] | 0.6946 |
| Max height [cm] | 0.7606 |
| Volume [cm3] | 0.6785 |
| Head density | −0.7912 |
| Volume per plant [cm3/head] | 0.8975 |
| Area per plant [cm2] | 0.8987 |
| mm per pixel | −0.6784 |
| Area per plant divided by volume per plant | −0.4801 |
| Volume per plant divided by area per plant | 0.6741 |
| Area per plant multiplied by volume per plant | 0.9214 |
| Area per plant divided by mm per pixel | 0.9048 |
| Area per plant divided by the maximum height | 0.8126 |
| Area per plant divided by median height | 0.8400 |
| Area per plant divided by average height | 0.8360 |
| Area per plant multiplied by the maximum height | 0.9340 |
| Area per plant multiplied by the median height | 0.9048 |
| Area per plant multiplied by average height | 0.9065 |
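Each entry in Table A4 is a Pearson correlation between one image-derived trait and the measured fresh weight. A minimal sketch of that computation with `numpy`; the trait and weight values below are hypothetical, chosen only to show a strongly correlated pair:

```python
import numpy as np

def trait_correlation(trait, fresh_weight):
    """Pearson correlation coefficient between an image-derived trait
    (e.g. area per plant multiplied by max height) and the measured
    fresh weight, as reported per row of Table A4."""
    return float(np.corrcoef(trait, fresh_weight)[0, 1])

# Hypothetical values: a trait that grows almost linearly with fresh
# weight yields a correlation coefficient close to 1
fw = np.array([150.0, 200.0, 250.0, 300.0])      # g/head
trait = np.array([4000.0, 5400.0, 6500.0, 7900.0])  # e.g. cm3
r = trait_correlation(trait, fw)
```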
Figure A6. Classification of lettuces at the different harvest moments as Class A, Class B, and Class C lettuces for the first (left) and second (right) cultivation. The classification was carried out using a standardized protocol that graded the harvested products on whether the target fresh weight was met, the presence and severity of outer-leaf burn, and the presence of diseases and/or malformations that resulted in non-sellable products.

References

1. van Dijk, M.; Morley, T.; Rau, M.L.; Saghai, Y. A meta-analysis of projected global food demand and population at risk of hunger for the period 2010–2050. Nat. Food 2021, 2, 494–501.
2. Foley, J.A.; Ramankutty, N.; Brauman, K.A.; Cassidy, E.S.; Gerber, J.S.; Johnston, M.; Mueller, N.D.; O’Connell, C.; Ray, D.K.; West, P.C.; et al. Solutions for a cultivated planet. Nature 2011, 478, 337–342.
3. Aznar-Sánchez, J.A.; Velasco-Muñoz, J.F.; López-Felices, B.; Román-Sánchez, I.M. An Analysis of Global Research Trends on Greenhouse Technology: Towards a Sustainable Agriculture. Int. J. Environ. Res. Public Health 2020, 17, 664.
4. Stanghellini, C. Horticultural production in greenhouses: Efficient use of water. Acta Hortic. 2014, 1034, 25–32.
5. Vegetables; Yield and Cultivated Area per Kind of Vegetable. 2021. Available online: https://www.cbs.nl/en-gb/figures/detail/37738ENG (accessed on 5 September 2021).
6. Verdouw, C.; Robbemond, R.; Kruize, J.W. Integration of Production Control and Enterprise Management Systems in Horticulture. In Proceedings of the 7th International Conference on Information and Communication Technologies in Agriculture, Food and Environment (HAICTA 2015), Kavala, Greece, 17–20 September 2015; pp. 124–135.
7. Payne, H.J.; Hemming, S.; van Rens, B.A.; van Henten, E.J.; van Mourik, S. Quantifying the role of weather forecast error on the uncertainty of greenhouse energy prediction and power market trading. Biosyst. Eng. 2022, 224, 1–5.
8. Verdouw, C.; Bondt, N.; Schmeitz, H.; Zwinkels, H. Towards a Smarter Greenport: Public-Private Partnership to Boost Digital Standardisation and Innovation in the Dutch Horticulture. Int. J. Food Syst. Dyn. 2014, 5, 44–52.
9. Tzounis, A.; Katsoulas, N.; Bartzanas, T.; Kittas, C. Internet of Things in agriculture, recent advances and future challenges. Biosyst. Eng. 2017, 164, 31–48.
10. Wolfert, S.; Ge, L.; Verdouw, C.; Bogaardt, M.J. Big data in smart farming—A review. Agric. Syst. 2017, 153, 69–80.
11. Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90.
12. Zhai, Z.; Martínez, J.F.; Beltran, V.; Martínez, N.L. Decision support systems for agriculture 4.0: Survey and challenges. Comput. Electron. Agric. 2020, 170, 105256.
13. Verdouw, C.; Tekinerdogan, B.; Beulens, A.; Wolfert, S. Digital twins in smart farming. Agric. Syst. 2021, 189, 103046.
14. Marshall-Colon, A.; Long, S.P.; Allen, D.K.; Allen, G.; Beard, D.A.; Benes, B.; Von Caemmerer, S.; Christensen, A.; Cox, D.J.; Hart, J.C.; et al. Crops In Silico: Generating Virtual Crops Using an Integrative and Multi-scale Modeling Platform. Front. Plant Sci. 2017, 8, 786.
15. Tzachor, A.; Richards, C.E.; Jeen, S. Transforming agrifood production systems and supply chains with digital twins. Npj Sci. Food 2022, 6, 1–5.
16. Buwalda, F.; Van Henten, E.J.; De Gelder, A.; Bontsema, J.; Hemming, J. Toward an optimal control strategy for sweet pepper cultivation-1. A dynamic crop model. Acta Hortic. 2006, 718, 367–374.
17. Tchamitchian, M.; Henry-Montbroussous, B.; Jeannequin, B.; Lagier, J. Serriste: Climate set-point determination for greenhouse tomatoes. Acta Hortic. 1998, 456, 321–328.
18. Kolokotsa, D.; Saridakis, G.; Dalamagkidis, K.; Dolianitis, S.; Kaliakatsos, I. Development of an intelligent indoor environment and energy management system for greenhouses. Energy Convers. Manag. 2010, 51, 155–168.
19. de Zwart, H.F. Analyzing Energy-Saving Options in Greenhouse Cultivation Using a Simulation Model. Ph.D. Thesis, Wageningen University, Wageningen, The Netherlands, 1996.
20. Marcelis, L.; Elings, A.; De Visser, P.; Heuvelink, E. Simulating growth and development of tomato crop. Acta Hortic. 2009, 821, 101–110.
21. Hemming, S.; de Zwart, F.; Elings, A.; Righini, I.; Petropoulou, A. Remote control of greenhouse vegetable production with artificial intelligence—Greenhouse climate, irrigation, and crop production. Sensors 2019, 19, 1807.
22. Hemming, S.; Zwart, F.D.; Elings, A.; Petropoulou, A.; Righini, I. Cherry tomato production in intelligent greenhouses—Sensors and AI for control of climate, irrigation, crop yield, and quality. Sensors 2020, 20, 6430.
23. Hamon, R.; Junklewitz, H.; Sanchez, I. Robustness and Explainability of Artificial Intelligence; EUR 30040 EN; Publications Office of the European Union: Luxembourg, 2020; ISBN 978-92-76-14660-5.
24. Ciaglia Anne Bennett. High Tech Growing Systems Help Improve Efficiencies and Meet Consumer Demand. 2017. Available online: https://gpnmag.com/article/automation-high-tech-growing-systems-help-improve-efficiencies-meet-consumer-demand/ (accessed on 10 October 2022).
25. Savvas, D.; Passam, H. Hydroponic Production of Vegetables and Ornamentals; Embryo Publications: Athens, Greece, 2002; p. 463.
26. Jones, J.B., Jr. Hydroponics: A Practical Guide for the Soilless Grower; CRC Press: Boca Raton, FL, USA, 2016.
27. Resh, H.M. Hydroponic Food Production: A Definitive Guidebook for the Advanced Home Gardener and the Commercial Hydroponic Grower, 7th ed.; CRC Press: Boca Raton, FL, USA, 2016.
28. van Treuren, R.; van Eekelen, H.D.; Wehrens, R.; de Vos, R.C. Metabolite variation in the lettuce gene pool: Towards healthier crop varieties and food. Metabolomics 2018, 14, 1–4.
29. Masarirambi, M.T.; Nxumalo, K.A.; Musi, P.J.; Rugube, L.M. Common physiological disorders of lettuce (Lactuca sativa) found in Swaziland: A review. Am.-Eurasian J. Agric. Environ. Sci. 2018, 18, 50–56.
30. Martinetti, L.; Ferrante, A.; Penati, M. Influenza della concimazione sulla produzione quanti-qualitativa di ortaggi baby leaf per la quarta gamma in coltivazione biologica e convenzionale [Influence of fertilization on the quantitative and qualitative production of baby-leaf vegetables for the fresh-cut sector under organic and conventional cultivation]. La Riv. Di Sci. Dell’alimentazione 2009, 38, 23–33.
31. Scuderi, D.; Giuffrida, F.; Noto, G. Effects of salinity and plant density on quality of lettuce grown in floating system for fresh-cut. Acta Hortic. 2009, 843, 219–226.
32. Mengistu, F.G.; Tabor, G.; Dagne, Z.; Atinafu, G.; Tewolde, F.T. Effect of planting density on yield and yield components of lettuce (Lactuca sativa L.) at two agro-ecologies of Ethiopia. Afr. J. Agric. Res. 2021, 17, 549–556.
33. Ojo, M.O.; Zahid, A. Deep Learning in Controlled Environment Agriculture: A Review of Recent Advancements, Challenges and Prospects. Sensors 2022, 22, 7965.
34. Javaid, M.; Haleem, A.; Khan, I.H.; Suman, R. Understanding the potential applications of Artificial Intelligence in Agriculture Sector. Adv. Agrochem 2022.
35. Mishra, P.; Polder, G.; Vilfan, N. Close Range Spectral Imaging for Disease Detection in Plants Using Autonomous Platforms: A Review on Recent Studies. Curr. Robot. Rep. 2020, 1, 43–48.
36. Nieuwenhuizen, A.T.; Kool, J.; Suh, H.K.; Hemming, J. Automated spider mite damage detection on tomato leaves in greenhouses. Acta Hortic. 2020, 1268, 165–172.
37. Suh, H.K.; Ijsselmuiden, J.; Hofstee, J.W.; van Henten, E.J. Transfer learning for the classification of sugar beet and volunteer potato under field conditions. Biosyst. Eng. 2018, 174, 50–65.
38. Rahnemoonfar, M.; Sheppard, C. Deep Count: Fruit Counting Based on Deep Simulated Learning. Sensors 2017, 17, 905.
39. Fonteijn, H.; Afonso, M.; Lensink, D.; Mooij, M.; Faber, N.; Vroegop, A.; Polder, G.; Wehrens, R. Automatic Phenotyping of Tomatoes in Production Greenhouses Using Robotics and Computer Vision: From Theory to Practice. Agronomy 2021, 11, 1599.
40. Nishina, H. Development of Speaking Plant Approach Technique for Intelligent Greenhouse. Agric. Agric. Sci. Procedia 2015, 3, 9–13.
41. Bac, C.W. Improving Obstacle Awareness for Robotic Harvesting of Sweet-Pepper. Ph.D. Thesis, Wageningen University, Wageningen, The Netherlands, 2015.
42. Barth, R. Vision Principles for Harvest Robotics: Sowing Artificial Intelligence in Agriculture. Ph.D. Thesis, Wageningen University, Wageningen, The Netherlands, 2018.
43. Paulus, S. Measuring crops in 3D: Using geometry for plant phenotyping. Plant Methods 2019, 15, 1–13.
44. Tian, H.; Wang, T.; Liu, Y.; Qiao, X.; Li, Y. Computer vision technology in agricultural automation—A review. Inf. Process. Agric. 2019, 7, 1–19.
45. Tian, Z.; Ma, W.; Yang, Q.; Duan, F. Application status and challenges of machine vision in plant factory—A review. Inf. Process. Agric. 2021, 9, 195–211.
46. Wu, H.; Xiao, B.; Codella, N.; Liu, M.; Dai, X.; Yuan, L.; Zhang, L. CvT: Introducing convolutions to vision transformers. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Montreal, BC, Canada, 11–17 October 2021; pp. 22–31.
47. Nilsback, M.-E.; Zisserman, A. Automated Flower Classification over a Large Number of Classes. In Proceedings of the 2008 Sixth Indian Conference on Computer Vision, Graphics & Image Processing, Bhubaneswar, India, 16–19 December 2008; pp. 722–729.
48. Valenzuela, I.C.; Puno, J.C.V.; Bandala, A.A.; Baldovino, R.G.; de Luna, R.G.; De Ocampo, A.L.; Cuello, J.; Dadios, E.P. Quality assessment of lettuce using artificial neural network. In Proceedings of the 2017 IEEE 9th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment and Management (HNICEM), Manila, Philippines, 1–3 December 2017; pp. 1–5.
49. Hemming, S.; de Zwart, F.; Elings, A.; Bijlaard, M.; van Marrewijk, B.; Petropoulou, A. 3rd Autonomous Greenhouse Challenge: Online Challenge Lettuce Images; Dataset: 4TU.ResearchData. 2021. Available online: https://doi.org/10.4121/15023088.v1 (accessed on 2 January 2023).
50. Lin, Z.; Fu, R.; Ren, G.; Zhong, R.; Ying, Y.; Lin, T. Automatic monitoring of lettuce fresh weight by multi-modal fusion based deep learning. Front. Plant Sci. 2022, 13.
51. Gang, M.-S.; Kim, H.-J.; Kim, D.-W. Estimation of Greenhouse Lettuce Growth Indices Based on a Two-Stage CNN Using RGB-D Images. Sensors 2022, 22, 5499.
52. Zhang, Y.; Li, M.; Li, G.; Li, J.; Zheng, L.; Zhang, M.; Wang, M. Multi-phenotypic parameters extraction and biomass estimation for lettuce based on point clouds. Measurement 2022, 204, 112094.
53. Lu, J.-Y.; Chang, C.-L.; Kuo, Y.-F. Monitoring Growth Rate of Lettuce Using Deep Convolutional Neural Networks. In Proceedings of the 2019 ASABE Annual International Meeting, Boston, MA, USA, 7–10 July 2019.
54. Bauer, A.; Bostrom, A.G.; Ball, J.; Applegate, C.; Cheng, T.; Laycock, S.; Rojas, S.M.; Kirwan, J.; Zhou, J. Combining computer vision and deep learning to enable ultra-scale aerial phenotyping and precision agriculture: A case study of lettuce production. Hortic. Res. 2019, 6, 70.
55. Du, J.; Lu, X.; Fan, J.; Qin, Y.; Yang, X.; Guo, X. Image-Based High-Throughput Detection and Phenotype Evaluation Method for Multiple Lettuce Varieties. Front. Plant Sci. 2020, 11, 563386.
56. Petropoulou, A.; van Marrewijk, B.; Hemming, S.; de Zwart, F.; Elings, A.; Bijlaard, M. 3rd Autonomous Greenhouse Challenge-Real Challenge Data Climate and Images. Dataset: 4TU.ResearchData. 2023. Available online: https://data.4tu.nl/articles/dataset/3rd_Autonomous_Greenhouse_Challenge_Online_Challenge_Lettuce_Images/15023088 (accessed on 2 January 2023).
57. Intel® Depth Camera D415—Intel® RealSense™ Depth and Tracking Cameras. Available online: https://www.intelrealsense.com/depth-camera-d415/ (accessed on 14 October 2022).
58. Chen, L.C.; Zhu, Y.; Papandreou, G.; Schroff, F.; Adam, H. Encoder-decoder with atrous separable convolution for semantic image segmentation. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 801–818.
59. Fischler, M.A.; Bolles, R.C. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Commun. ACM 1981, 24, 381–395.
60. Gray, D.; Steckel, J.R. Hearting and mature head characteristics of lettuce (Lactuca sativa L.) as affected by shading at different periods during growth. J. Hortic. Sci. 1981, 56, 199–206.
61. Aikman, D.P.; Benjamin, R. A model for plant and crop growth, allowing for competition for light by the use of potential and restricted crown zone areas. Ann. Bot. 1994, 73, 185–194.
62. Sarlikioti, V.; Meinen, E.; Marcelis, L. Crop reflectance as a tool for the online monitoring of LAI and PAR interception in two different greenhouse crops. Biosyst. Eng. 2011, 108, 114–120.
63. Kizil, Ü.; Genc, L.; Inalpulat, M.; Şapolyo, D.; Mirik, M. Lettuce (Lactuca sativa L.) yield prediction under water stress using artificial neural network (ANN) model and vegetation indices. Žemdirbystė=Agric. 2012, 99, 409–418.
Figure 1. Research methodology of the growing experiments and analysis of results. Data from the annotated single lettuce images is found in [49], whereas the annotated full crop data are found in [56].
Figure 2. (a) Cross- and (b) top-view sections of one greenhouse experimental compartment with 96 m2 ground floor. (a) Compartment with crop and actuators: rail pipe, irrigation system, NFT gutters, CO2 supply, LED artificial light, and two screens. (b) Arrangement of lettuce gutters. Green boxes represent the harvest area for data analysis.
Figure 3. Data communication structure. Data flows from indoor and outdoor climate and additional sensors to the virtual machines and the online database. Decisions of the teams' algorithms are written from the virtual machines to the online database, from where another communication protocol writes the controls to the greenhouse climate computer before their implementation in the actual greenhouse compartments. Greenhouse staff receives spacing and harvest decisions from the online database.
Figure 4. Two example images of RealSense D415 from the day of planting (left) and the day of harvest (right). On the day of harvest, plants were sampled from the field of view of the camera and were destructively measured for height, diameter, fresh weight, dry weight, and quality.
Figure 5. Example of an original (left) and annotated (right) image. The annotated image has two classes: the lettuce class in blue and the background class in pink.
Figure 6. Side view of the point cloud, with the ground plane fitted using RANSAC [59] shown in green to determine the height. At the bottom, the plants at the start date; above, the plants on the day of harvest.
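The ground-plane fit behind Figure 6 can be sketched with a small self-written RANSAC loop; the paper's exact implementation and parameters are not given, so `n_iter`, `threshold`, and the synthetic point cloud below are illustrative assumptions:

```python
import numpy as np

def fit_ground_plane_ransac(points, n_iter=200, threshold=0.005, seed=0):
    """Minimal RANSAC plane fit (in the spirit of Fischler & Bolles) on an
    Nx3 point cloud: fit a plane through 3 random points, count the points
    within `threshold` of it, and keep the plane with the most inliers."""
    rng = np.random.default_rng(seed)
    best_plane, best_count = None, -1
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), size=3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:                 # degenerate (collinear) sample
            continue
        normal /= norm
        count = int((np.abs((points - p0) @ normal) < threshold).sum())
        if count > best_count:
            best_count, best_plane = count, (normal, p0)
    return best_plane

def crop_height(points, plane):
    """Crop height as the largest point-to-plane distance in the cloud."""
    normal, p0 = plane
    return float(np.abs((points - p0) @ normal).max())

# Synthetic cloud: a flat ground grid at z = 0 m plus a 0.15 m tall "plant"
gx, gy = np.meshgrid(np.linspace(0, 1, 10), np.linspace(0, 1, 10))
ground = np.column_stack([gx.ravel(), gy.ravel(), np.zeros(100)])
plant = np.array([[0.50, 0.50, 0.15], [0.50, 0.55, 0.10]])
cloud = np.vstack([ground, plant])
height = crop_height(cloud, fit_ground_plane_ransac(cloud))
```

Because the ground points dominate the cloud, RANSAC recovers the z = 0 plane and the height evaluates to the tallest plant point, 0.15 m.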
Figure 7. Average daily temperature (°C) of all compartments during the first and second cultivation.
Figure 8. CO2 concentration (ppm) for the different compartments during the first and second cultivation.
Figure 9. Indoor photosynthetic active radiation (PAR) for the different compartments during the first and second cultivation.
Figure 10. Cumulative light intercepted by each lettuce head per compartment in the first and second cultivation. Light interception per lettuce head was calculated by multiplying the daily light integral (DLI) with the green coverage per m2 growing area and dividing by the head density on each particular day. The circles (o) mark the days at which the lettuce heads reached the target fresh weight of 250 g, obtained by linearly interpolating the data of the weekly destructive measurements on randomly selected lettuce heads.
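The per-head light interception rule stated in the Figure 10 caption (DLI times coverage, divided by head density, accumulated over days) can be written out directly. A minimal sketch; the three-day series below is illustrative, not the measured data:

```python
import numpy as np

def cumulative_light_per_head(dli, coverage, density):
    """Cumulative intercepted light per lettuce head.
    Daily interception = DLI [mol/m2/day] * green coverage [fraction 0-1]
    / head density [heads/m2]; the daily values are summed over the
    cultivation period, as in Figure 10."""
    dli = np.asarray(dli, dtype=float)
    coverage = np.asarray(coverage, dtype=float)
    density = np.asarray(density, dtype=float)
    return float(np.sum(dli * coverage / density))

# Illustrative 3-day series: a spacing action halves the density on day 2,
# which doubles the light available per head from that day onward
light = cumulative_light_per_head(dli=[10.0, 12.0, 14.0],
                                  coverage=[0.5, 0.6, 0.7],
                                  density=[90.0, 45.0, 45.0])
```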
Figure 11. Lettuce density (heads/m2) and harvest dates (o) in the different compartments during the first and second cultivation period.
Figure 12. Development of fresh weight [g/plant] in the different compartments for the first and second cultivation. The curves were obtained by linear interpolation of the weekly randomly sampled and (destructively) weighed heads of lettuce. The end of each line represents the chosen date of harvest and the fresh weight at harvest. The circles (o) represent the days on which the lettuce heads reached the target fresh weight of 250 g.
Figure 13. Example of a real (left) and segmented (right) image using DeepLabv3+. From the segmentation, the coverage [%] was calculated.
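Once a segmentation network such as DeepLabv3+ has produced a lettuce/background mask (Figure 13), the coverage reduces to a pixel count. A minimal sketch with a toy binary mask (the real masks come from the trained model, not from hand-written arrays):

```python
import numpy as np

def coverage_percentage(mask):
    """Greenhouse coverage [%] from a binary segmentation mask in which
    1 = lettuce pixel and 0 = background, i.e. the fraction of lettuce
    pixels in the camera's field of view."""
    mask = np.asarray(mask)
    return 100.0 * float(mask.sum()) / mask.size

# 4x4 toy mask with 6 lettuce pixels out of 16 -> 37.5 % coverage
toy_mask = np.array([[1, 1, 0, 0],
                     [1, 1, 0, 0],
                     [0, 0, 1, 1],
                     [0, 0, 0, 0]])
cov = coverage_percentage(toy_mask)
```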
Figure 14. Coverage [%] of the greenhouse with lettuce heads, as calculated with the segmentation algorithm. The frequently visible stepwise drops in coverage are the result of spacing actions.
Figure 15. Correlation of the manually measured ground-truth height of lettuces, obtained during destructive measurements, with the height predicted from RealSense camera images.
Figure 16. Daily calculated lettuce height from RealSense camera images in all compartments from the start of the cultivation.
Figure 17. Daily estimated lettuce volume in all compartments during the cultivation period.
Figure 18. Relation of calculated area per plant multiplied by maximum height [cm3] and measured fresh weight [g/head] (left). Area per plant multiplied by maximum height [cm3/head] as a harvest indicator in realizing the target weight of 250 g/head in all compartments (dots) (right).
Figure 19. Light loss calculation for all compartments for the spacing instances of the second experiment. Negative values indicate a (too) late spacing and large positive values a too-early spacing.
Table 1. The average density of lettuce heads for the two cultivations as calculated using Equation (1).
| Experiment Planting Date | Reference | Koala | CVA | MondayLettuce | DigitalCucumbers | VeggieMight |
| --- | --- | --- | --- | --- | --- | --- |
| 3 February | 32.7 | 34.5 | 31.9 | 41.4 | 37.7 | 32.9 |
| 3 May | 29.0 | 30.4 | 29.9 | 36.7 | 31.7 | 28.7 |
Table 2. Net profit of different teams in the second experiment consisting of crop income minus costs (fixed costs, heating costs, electricity costs, CO2 costs, and intervention costs).
| | CVA | VeggieMight | DigitalCucumbers | Koala | MondayLettuce | Reference |
| --- | --- | --- | --- | --- | --- | --- |
| Total income [€/m2] | 12.16 | 10.38 | 15.84 | 14.16 | 11.83 | 12.12 |
| Fixed costs [€/m2] | 7.85 | 6.41 | 8.50 | 7.06 | 9.64 | 6.59 |
| Heating costs [€/m2] | 0.01 | 0.29 | 0.16 | 0.04 | 0.03 | 0.02 |
| Electricity costs [€/m2] | 0.23 | 0.00 | 0.46 | 0.00 | 0.45 | 0.34 |
| CO2 costs [€/m2] | 0.60 | 0.53 | 0.34 | 0.11 | 0.18 | 0.53 |
| Total operational costs [€/m2] | 8.69 | 7.24 | 9.45 | 7.24 | 10.30 | 7.48 |
| Intervention costs [€/m2] | 2.00 | 1.00 | 3.00 | 1.00 | 2.00 | - |
| Net profit [€/m2] | 1.47 | 2.14 | 3.39 | 5.93 | −0.47 | 4.64 |
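The bottom line of Table 2 is simple arithmetic: income minus the operational costs (fixed, heating, electricity, CO2) minus intervention costs. A sketch, checked against CVA's column of the table:

```python
def net_profit(income, fixed, heating, electricity, co2, interventions):
    """Net profit [EUR/m2] as defined for Table 2: crop income minus
    total operational costs (fixed + heating + electricity + CO2)
    minus intervention costs."""
    operational = fixed + heating + electricity + co2
    return income - operational - interventions

# CVA's column of Table 2: 12.16 - 8.69 - 2.00 = 1.47 EUR/m2
profit = net_profit(income=12.16, fixed=7.85, heating=0.01,
                    electricity=0.23, co2=0.60, interventions=2.00)
```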
Table 3. Overview of plant traits at optimal harvest date (when target weight of 250 g per lettuce head is reached). Teams are sorted from high to low volume.
| Compartment | Optimal Harvest Date | Density [Heads/m2] | Coverage [%] | Max Height [cm] | Volume [cm3/Plant] |
| --- | --- | --- | --- | --- | --- |
| CVA | 7 June 2022 | 15 | 87.2 | 15.7 | 6705 |
| Reference | 8 June 2022 | 18 | 96.8 | 15.3 | 6379 |
| VeggieMight | 4 June 2022 | 18 | 90.9 | 16.5 | 6090 |
| Koala | 9 June 2022 | 18 | 98.2 | 14.4 | 5581 |
| DigitalCucumbers | 7 June 2022 | 18 | 86.7 | 16.8 | 5356 |
| MondayLettuce | 9 June 2022 | 22.5 | 98.5 | 16.4 | 4844 |
Table 4. Overview of the area per plant multiplied by max height criterion at the optimal harvest date (when the harvest indicator of 7840 cm3 per lettuce head is reached).
| Compartment | Realized Harvest Date [dd/mm] | Harvest Date Satisfying the FW Criterion [dd/mm] | Harvest Date Satisfying the Area per Plant × Max Height Criterion [dd/mm] | Satisfying the Area per Plant × Max Height Criterion [cm3] |
| --- | --- | --- | --- | --- |
| Reference | 9 June | 8 June | 5 June | 79,144 |
| Koala | 17 June | 9 June | 3 June | 78,819 |
| CVA | 13 June | 7 June | 3 June | 80,717 |
| MondayLettuce | 14 June | 9 June | - | - |
| DigitalCucumbers | 15 June | 7 June | 7 June | 83,610 |
| VeggieMight | 13 June | 4 June | 3 June | 82,410 |
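The harvest indicator of Table 4 is the product of the image-derived area per plant and the maximum height, compared daily against a threshold. A minimal sketch; the daily series and the threshold value below are illustrative assumptions, not the paper's calibrated numbers:

```python
import numpy as np

def first_day_indicator_reached(days, area_per_plant, max_height, threshold):
    """Harvest indicator as in Table 4: area per plant [cm2] multiplied by
    maximum height [cm]. Returns the first day on which the indicator
    reaches `threshold` [cm3], or None if it never does (cf. the
    MondayLettuce row, where the criterion was not met)."""
    indicator = np.asarray(area_per_plant, float) * np.asarray(max_height, float)
    hit = np.nonzero(indicator >= threshold)[0]
    return days[hit[0]] if hit.size else None

# Hypothetical daily series with an illustrative threshold of 78,000 cm3:
# indicator = [63,000, 73,500, 83,200], so the criterion is met on day 3
days = ["3 June", "4 June", "5 June"]
harvest_day = first_day_indicator_reached(days,
                                          area_per_plant=[4500.0, 4900.0, 5200.0],
                                          max_height=[14.0, 15.0, 16.0],
                                          threshold=78_000.0)
```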
Petropoulou, A.S.; van Marrewijk, B.; de Zwart, F.; Elings, A.; Bijlaard, M.; van Daalen, T.; Jansen, G.; Hemming, S. Lettuce Production in Intelligent Greenhouses—3D Imaging and Computer Vision for Plant Spacing Decisions. Sensors 2023, 23, 2929. https://doi.org/10.3390/s23062929