Communication

On the Use of Circadian Cycles to Monitor Individual Young Plants

by Mathis Cordier, Cindy Torres, Pejman Rasti and David Rousseau
1 Laboratoire Angevin de Recherche en Ingénierie des Systèmes (LARIS), UMR INRAe-IRHS, Université d’Angers, 62 Avenue Notre Dame du Lac, 49000 Angers, France
2 R&D SeedTech, Vilmorin-Mikado, Rue du Manoir, 49250 La Ménitré, France
3 Centre d’Études et de Recherche pour l’Aide à la Décision (CERADE), ESAIP, 18 Rue du 8 Mai 1945, 49124 Saint-Barthélemy-d’Anjou, France
* Author to whom correspondence should be addressed.
Remote Sens. 2023, 15(11), 2704; https://doi.org/10.3390/rs15112704
Submission received: 8 April 2023 / Revised: 11 May 2023 / Accepted: 17 May 2023 / Published: 23 May 2023
(This article belongs to the Section Remote Sensing Image Processing)

Abstract
Occlusion is a very common problem in computer vision. The presence of objects that overlap from the camera's viewpoint negatively impacts object recognition, object counting and shape estimation. This problem is especially acute in plant imaging because plants are highly self-similar objects which produce many self-occlusions. A possible way to disentangle apparent occlusions is to acquire the same scene from different points of view when the object is motionless. Such an acquisition is not necessary if the objects move themselves under the camera and thus offer different points of view for free. This is the case in plant imagery, since plants have their own natural movements, including the so-called circadian rhythms. We propose to use these movements to solve some self-occlusion problems with a set of simple yet innovative sampling algorithms to monitor the growth of individualized young plants. The proposed sampling methods make it possible to monitor the growth of the individual plants until their overlap becomes definitive. The gain is significant: the average maximum duration of observation increases from 3 days to more than 10 days by comparison with a sampling method that stops when the first overlap occurs.

1. Introduction

Monitoring young plant growth is of particular interest for the selection of more robust or resistant varieties. Tests are regularly carried out during the first days of the plant’s life to compare these varieties. During these experiments, a large number of plants are studied in parallel, often in restricted spaces. Moreover, a large number of dispersed replicates are required in the experimental designs [1], further increasing the need for space. The growth of these plants can be analyzed for variety comparison as long as they appear physically separated from the sensor's point of view [2]. The space restriction of high-throughput experiments in industrial settings implies a very high density of seedlings under the cameras. Such constraints produce a large number of occlusions during growth. Rare at first, these occlusions become more and more frequent until the plants form a single canopy. Most current tools and methods for monitoring plant growth are limited by these occlusions [3,4,5,6]. The combination of occlusions and plant self-similarity makes it challenging to use standard tracking methods [7,8,9]. A few tools have been proposed to extend the monitoring despite the overlap: via watershed as in [10], via extended feature spaces (color, texture, depth) as in [11,12] or via advanced machine learning object detection tools [13]. Here, we investigate a distinct approach to disentangle overlapped plants.
A possibility to separate occluded objects in computer vision is to acquire multiple views of the scene from different angles. This is a common approach in plant imaging [14,15,16]. A disadvantage of this approach is that it may reduce the throughput. Indeed, it may not be compatible with the synchronized imaging of a set of plants if the camera has to move around each plant. To preserve the throughput, the plant itself would need to move. This can be achieved by conveying the plants, but it is a costly solution which can also cause stress or risk contamination in the case of plant–pathogen interactions. In this article, we propose to access synchronized multiple views of touching plants via the sole use of the natural movements of the plants to disentangle occluded objects.
Plants have an internal timekeeper known as the circadian clock, which anticipates environmental cues such as light and temperature and regulates photoperiodic rhythmicity for the proper growth and fitness of the plants [17,18,19]. This circadian clock causes oscillations of the leaves synchronized with the photoperiod. Therefore, in controlled conditions, when plants are imaged at intervals much shorter than the photoperiod, it is possible to acquire them from different perspectives. Circadian movements have already been analyzed via time-lapse videos in the literature, since their amplitude can be related to stress [20,21,22]. Here, alternatively, we propose to exploit the presence of these circadian movements with simple sampling approaches applied to such sequences of images in order to extend the duration of the individual monitoring of young plants, as illustrated in Figure 1.
The sampling methods we present are based on simple sampling criteria alone, without any spatial image processing. Existing image processing methods reach higher growth stages than the ones we propose [3,4,5,6,10,11,14,15]. These related works, however, operate at much lower throughput. In most of them, the plant under study is isolated and analyzed individually with the additional help of depth sensors [3,11,14,15]. In other related works, several plants are monitored, but the model is not compatible with possible overlaps between plants [6], or the plants are directly isolated in bounding boxes [4]. In [5], the tips of the leaves must be visible from the sensor, which is not always compatible with occlusions. As with most related work, the method presented in [10] addresses the overlap problem with image processing techniques. This approach differs from ours, which is based on the selection of non-overlapping moments.

2. Materials

2.1. Acquisition System

We use a grid of sensors to acquire our spatio-temporal image sequences of the growth of young plants. This high-throughput system is capable of acquiring RGB data at a defined sample rate and is similar to the one presented in [2,23]. It is composed of Intel RealSense D435 cameras [24], each one connected to a Raspberry Pi 4 nano-computer [25], which captures images of young plants from a top view at a height of about fifty centimeters. Synchronization between cameras is obtained by combining scheduled tasks with the synchronization of the nano-computers through a local network. Note that the sensors used also allow the acquisition of depth images while maintaining a low cost. The added value of depth information has long been demonstrated for plant monitoring [11,26]. Coupling RGB and depth for sampling is an interesting perspective addressed in Section 5. However, here, we focus on the sole use of RGB images.
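As an illustration of the acquisition loop, the following sketch shows how each nano-computer could capture a synchronized top-view frame every 15 min, assuming the pyrealsense2 Python bindings are available; the output path and file naming are hypothetical and not the authors' exact production code.

```python
# Minimal per-camera capture loop (sketch; assumes pyrealsense2 and imageio).
import datetime
import time

import imageio.v3 as iio
import numpy as np
import pyrealsense2 as rs

PERIOD_S = 15 * 60  # one frame every 15 min, as in Section 2.2

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 1280, 720, rs.format.rgb8, 30)
pipeline.start(config)

while True:
    # Sleep until the next 15 min boundary: since the Raspberry Pis share a
    # clock through the local network, all cameras fire together.
    time.sleep(PERIOD_S - time.time() % PERIOD_S)
    frame = pipeline.wait_for_frames().get_color_frame()
    img = np.asanyarray(frame.get_data())
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M")
    iio.imwrite(f"/data/frames/{stamp}.png", img)  # hypothetical layout
```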

2.2. Datasets

Datasets produced by our acquisition system are composed of RGB data at a resolution of 1280 × 720 pixels. The acquisition period was set to 15 min, i.e., short enough to capture the daily movements of plants corresponding to circadian cycles. Different species were observed under different environmental conditions:
  • Cabbage in a growth chamber with a day–night cycle of 14–10 h, with 6 cameras acquiring for 18 days. This corresponds to the acquisition of 56 frames per day.
  • Pepper in greenhouses with natural day–night cycles, with 3 cameras acquiring for 40 days. A time lapse is available on https://uabox.univ-angers.fr/index.php/s/IpgZwhS47F1UMHB (accessed on 10 May 2023), showing the influence of circadian cycles on plant movement.
The input data of the proposed sampling methods are sequences of binary segmentation masks corresponding to the plant segmentation obtained from the raw RGB data. The binarization was either produced manually, i.e., it corresponds to a ground truth binary mask, or predicted with a Random Forest [27] via the pixel classification algorithm of the Ilastik Version 1.4.0 software [28]. As a disclaimer, we stress here that the method used to obtain the binarization is not important as long as the overall segmentation is correct. It could be replaced by any corresponding variant without impact on the sampling methods proposed in the rest of the article. From the experiments, 4 datasets of binary masks were extracted:
  • Cabbage SGT #1 corresponds to the plant segmentation ground truth of the 1008 images of the temporal sequence for the set of 13 monitored cabbage plants given in Figure 2.
  • Cabbage Seg #1 corresponds to the plant segmentation prediction on the same images via the Ilastik software after annotation on Cabbage SGT #1.
  • Cabbage Seg #2 corresponds to the plant segmentation prediction by the same Ilastik model on a set of 15 cabbage plants distinct from Cabbage SGT #1.
  • Pepper Seg is similar to Cabbage Seg #2 but applied to another plant species: 6 pepper plants.
These 4 datasets constitute binary masks of various quality and are representative of real-world conditions. A standard approach in computer vision would be to try to improve the segmentation of each frame individually via spatial image processing. Instead, we assume that some of the frames in the time sequence are correct, and we propose simple temporal sampling methods to automatically select the best frames.
Segmentation ground truth is available for the Cabbage #1 dataset but not for the Cabbage #2 and Pepper datasets. For those, we extracted a binary ground truth of the overlapped or non-overlapped status of each plant for each frame. These manual annotations are essential to quantify the results on those datasets. Cabbage SGT #1 and Cabbage Seg #1 are extracted from the same images. The binary masks of Cabbage Seg #1 are naturally of lower quality than those of Cabbage SGT #1, with typically over- or under-fragmented plants due to segmentation artefacts. The inclusion of Cabbage Seg #1 enables analyzing the response of the proposed sampling methods to such segmentation artefacts on the same objects as Cabbage SGT #1. Cabbage Seg #2 allows an analysis of the transferability of the methods to distinct objects and lighting conditions. As a final generalization, Pepper Seg enables analyzing the behavior of the methods under a different species and growing conditions.

3. Methods

In this section, we present the set of sampling algorithms developed to monitor the individual growth of young plants in time sequences of binary images thanks to circadian movements. We define a frame as a binary image from the temporal sequence and an object as a connected component within a frame. We propose two types of methods: frame sampling and object sampling. The frame sampling methods process each frame as a whole to select the best frames in the temporal sequence. The object sampling methods process each object in the frame separately to produce, in the end, a reconstructed temporal sequence for each plant.

3.1. Frame Sampling

Let $\eta(d,t)$ be the number of detected objects, i.e., connected components, in the frame at day $d \in D = \{1, \ldots, n_d\} \subset \mathbb{N}^*$ and time $t \in T = \{1, \ldots, n_t\} \subset \mathbb{N}^*$. We hypothesize here that a decrease in the number of detected objects indicates the presence of overlaps between plants.
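For concreteness, $\eta$ can be computed directly from a binary mask as its number of connected components; a minimal sketch, assuming SciPy and 8-connectivity, is given below.

```python
# eta(d, t): number of detected objects in the binary frame at day d, time t.
import numpy as np
from scipy import ndimage

def eta(mask: np.ndarray) -> int:
    """Count the 8-connected components of a binary mask."""
    _, n_objects = ndimage.label(mask, structure=np.ones((3, 3), dtype=int))
    return n_objects
```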

3.1.1. Baseline

We iterate over successive frames until an overlap is detected. The last frame obtained before this overlap is then considered as the last one available without plant overlapping, as represented in Figure 3. This basic approach corresponds to a trivial baseline that we propose to improve via more advanced selection criteria for the frames or for the objects.
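A minimal sketch of this baseline, reusing the eta() function above (the variable names are illustrative):

```python
def baseline_sampling(masks):
    """Keep frames until the object count first drops (first overlap)."""
    n_plants = eta(masks[0])  # count observed right after emergence
    selected = []
    for mask in masks:        # masks: time-ordered list of binary frames
        if eta(mask) < n_plants:  # two merged plants lower the count
            break
        selected.append(mask)
    return selected
```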

3.1.2. Periodic Frame Sampling

When monitoring dynamic processes, a standard approach is to sample periodically. Sampling in synchrony with a periodic disturbance suppresses it, playing the role that a notch filter plays in electrical engineering. Ambient conditions change significantly within a day; by synchronizing the sampling period to the photoperiod, we obtain time series with the closest possible conditions between time points. As the circadian rhythm acts on the inclination of the leaves, we expect that there exists an optimal time of the day at which each plant can be separated for the longest possible duration.
We therefore tested all possible sampling times of the day and selected the one which enables the maximum duration while keeping the same number of objects in the frames for as long as possible. As illustrated in Figure 3, this method corresponds to a down-sampling of fixed periodicity equal to the photoperiod and synchronized to the optimal time of the day $t^* \in T$.
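A sketch of this search over all times of day, under the same assumptions as the previous snippets:

```python
def periodic_sampling(masks, frames_per_day, n_plants):
    """Pick the time of day t* whose one-frame-per-day series keeps all
    plants separated for the longest run of days."""
    best_t, best_frames = 0, []
    for t in range(frames_per_day):
        daily = masks[t::frames_per_day]  # one frame per day, at time t
        run = []
        for mask in daily:
            if eta(mask) < n_plants:
                break                      # overlap: the run stops here
            run.append(mask)
        if len(run) > len(best_frames):
            best_t, best_frames = t, run
    return best_t, best_frames
```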

3.1.3. Non-Periodic Frame Sampling

The first overlap of objects is not necessarily definitive: frames with no overlap may still occur later in the time sequence, and not necessarily at periodic time points. Therefore, as illustrated in Figure 3, an alternative to the two previous sampling methods consists of selecting all the frames in which no overlap is detected.
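This criterion reduces to a one-line filter over the whole sequence:

```python
def non_periodic_sampling(masks, n_plants):
    """Keep every frame in which all plants are still resolved."""
    return [m for m in masks if eta(m) == n_plants]
```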

3.2. Object Sampling

For frame sampling methods, a single overlap is enough to discard a frame even if the other objects are not overlapped. To overcome the constraint of studying all objects simultaneously at the level of the frame, we now propose individualizing the monitoring of each plant following the two steps of the pipeline in Figure 4. First, the connected components in each frame are tracked. Overlaps are then detected in the sequence of each tracked object.

3.2.1. Plant Tracking

Objects are initialized at the emergence of the plants from the soil. The location of the emergence point is then used to define a research area, selected as a disk centered on the emergence point and with a radius empirically chosen as the radius of the pots, as illustrated in Figure 4. Objects are then tracked within this research area as described in Figure 5. If several objects are found in the research area, the one whose center of mass is closest to the previous center of mass of the tracked object is selected.
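The tracking step can be sketched as follows, assuming SciPy; coordinates are in (row, col) order and the radius is expressed in pixels:

```python
import numpy as np
from scipy import ndimage

def track_object(mask, emergence, prev_center, pot_radius):
    """Return the mask of the component tracked inside the research area,
    i.e., the disk of radius pot_radius around the emergence point."""
    labels, n = ndimage.label(mask, structure=np.ones((3, 3), dtype=int))
    best_label, best_dist = None, np.inf
    for lab in range(1, n + 1):
        center = ndimage.center_of_mass(labels == lab)
        if np.linalg.norm(np.subtract(center, emergence)) > pot_radius:
            continue  # component outside the research area
        dist = np.linalg.norm(np.subtract(center, prev_center))
        if dist < best_dist:  # keep the closest to the previous center
            best_label, best_dist = lab, dist
    return labels == best_label if best_label is not None else None
```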

3.2.2. Overlapping Detection

The individual objects might still include some overlapping of plants at some time points. As for the frame sampling methods, we define temporal strategies to select non-overlapped plants in the sequence of each object.
We assume that when plants overlap, the evolution of the objects is abrupt. The evolution of an object $O_i$ is extracted as a sequence of binary masks, noted $M(O_i,(d,t))$. For illustration in this article, we have characterized the size and shape of objects by their surface $S(O_i,(d,t)) = |M(O_i,(d,t))|$ and their diameter $D(O_i,(d,t)) = \max_{p_0, p_1 \in M(O_i,(d,t))} \|p_0 - p_1\|$, where $|\cdot|$ measures the cardinality, $\max$ the maximum, and $\|\cdot\|$ the Euclidean norm. To analyze object evolution, we then defined the vector $\delta$ of temporal features of interest as

$$\delta(O_i, \varepsilon) = \left( S(O_i, t_0+\varepsilon) - S(O_i, t_0),\; D(O_i, t_0+\varepsilon) - D(O_i, t_0),\; \frac{S(O_i, t_0+\varepsilon) - S(O_i, t_0)}{S(O_i, t_0)},\; \frac{D(O_i, t_0+\varepsilon) - D(O_i, t_0)}{D(O_i, t_0)} \right) \tag{1}$$
with $\varepsilon$ representing the time interval used. Here again, as a disclaimer, we advocate that many different feature vectors $\delta$ could be chosen and that the specific choice of parameters could be further optimized without reducing the general interest of the proposed sampling methods.
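A sketch of the features of Equation (1) follows; to keep the pairwise maximum tractable, the diameter is computed on the convex hull of the mask pixels, on which it is attained (assumes SciPy and a non-degenerate object at $t_0$):

```python
import numpy as np
from scipy.spatial import ConvexHull
from scipy.spatial.distance import pdist

def surface(mask):
    return int(mask.sum())           # S: number of pixels of the object

def diameter(mask):
    pts = np.argwhere(mask).astype(float)
    if len(pts) < 3:
        return 0.0
    hull = pts[ConvexHull(pts).vertices]
    return float(pdist(hull).max())  # max Euclidean pairwise distance

def delta(mask_t0, mask_t1):
    """Feature vector of Equation (1) between t0 and t0 + epsilon."""
    dS = surface(mask_t1) - surface(mask_t0)
    dD = diameter(mask_t1) - diameter(mask_t0)
    return (dS, dD, dS / surface(mask_t0), dD / diameter(mask_t0))
```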
To detect whether an object is composed of a single plant or an overlapping of several plants, we set thresholds on the absolute and relative growth of the features defined in Equation (1). Absolute thresholding is first used to detect whether the physical quantities are sufficiently high to qualify as an abrupt evolution. Then, relative thresholding is used to analyze this evolution in relation to the size of the object.
As for the frame sampling strategies defined in Figure 3, we detect abrupt evolution with or without a periodic constraint. A first approach, non-periodic object sampling (NPOS), detects abrupt growth between successive objects. A second approach, periodic object sampling (POS), detects abrupt growth between objects separated by one photoperiod, i.e., $\varepsilon = 24$ h.
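A sketch of the resulting overlap test is given below; the threshold values are placeholders, not the empirically tuned ones of Section 3.4:

```python
ABS_THRESH = (500.0, 40.0)  # hypothetical thresholds on dS, dD (pixels)
REL_THRESH = (0.5, 0.3)     # hypothetical thresholds on dS/S, dD/D

def is_overlap(delta_feats):
    """Flag an abrupt evolution: absolute AND relative growth both high.
    NPOS feeds this test with successive frames (epsilon = one step),
    POS with frames one photoperiod apart (epsilon = 24 h)."""
    dS, dD, rS, rD = delta_feats
    absolute = dS > ABS_THRESH[0] or dD > ABS_THRESH[1]
    relative = rS > REL_THRESH[0] or rD > REL_THRESH[1]
    return absolute and relative
```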
As shown in Figure 3, the sequences returned by the NPOS and POS sampling methods are distinct, since they operate at different time scales for the exclusion of abrupt changes. It is thus possible to further apply an optimal periodic sampling on top of the non-periodic object sampling. We tested this combination of non-periodic sampling followed by periodic sampling at the level of each object, which we call (NP+P)OS.

3.3. Metrics

To compare the frame sampling and object sampling methods, one can analyze the maximum duration of plant monitoring. An obvious metric would be the gain of time by comparison with the baseline method. However, due to possible errors in overlap detection, some methods can produce a final output exceeding the temporal ground truth (TGT). To take this issue into account, we consider the signed Mean Absolute Error (sMAE) as the metric
$$\mathrm{sMAE}(\hat{y}, y) = \mathrm{sign}\!\left(\sum_{k=1}^{n} (\hat{y}_k - y_k)\right) \times \frac{1}{n} \sum_{k=1}^{n} |\hat{y}_k - y_k| = \mathrm{sign}\!\left(\sum_{k=1}^{n} (\hat{y}_k - y_k)\right) \times \mathrm{MAE}(\hat{y}, y) \tag{2}$$
where $y_k$ is the temporal ground truth for the maximum duration of monitoring and $\hat{y}_k$ is the maximum duration of monitoring for the considered method. We look for the minimal absolute value of the sMAE, preferably with a negative sign so as to avoid the monitoring of overlapped plants.
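The metric translates directly into code:

```python
import numpy as np

def smae(y_pred, y_true):
    """Signed Mean Absolute Error of Equation (2), in days."""
    diff = np.asarray(y_pred, float) - np.asarray(y_true, float)
    return np.sign(diff.sum()) * np.abs(diff).mean()
```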
One can also analyze the sources of the errors committed by the different sampling methods. These are given by the rate of wrong object selection for the tracking part and the rate of wrong frame classification for the overlap detection part. An overall analysis of the quality of the daily output sequence is given by its rate of wrong frames. It is also meaningful to look at the number of empty frames in the output sequence, i.e., the number of days without any frame in which the object is detected as non-overlapped.

3.4. Parameter Tuning

Some parameters are used for the overlap detection part of the object sampling methods. They correspond to the thresholds on the absolute and relative features defined in Equation (1). In order to maximize the duration of monitoring and optimize the metric defined in Equation (2), we need to tune these parameters. For a first proof of interest of the sampling methods proposed here, the thresholds were set empirically on a subset of three objects from the Cabbage SGT #1 dataset. All remaining objects of Cabbage SGT #1, Cabbage Seg #1 and Cabbage Seg #2 are used to validate the methods.
The growth rate of plants directly depends on the species studied. This has an impact on the optimal values of the thresholds. To ensure the transferability of the methods to a change of species, we also use two objects from the Pepper Seg dataset to fine-tune the corresponding parameters.

4. Results

We are now ready to compare the different sampling methods proposed to monitor individual plant growth. The comparison on the criterion of the maximum duration of the output sequence is provided in Table 1 for the Cabbage datasets.
An overall result expressed by the maximum monitoring duration is the superiority of the object sampling methods over the frame sampling ones. The best results are obtained by (NP+P)OS with a gain of 7.23 days on the segmentation ground truth compared to the baseline method. This represents a considerable advance in the monitoring of plants: we go from 3 days of monitoring to about 10 days, i.e., a possible analysis time of the sequence 2.41 times longer than for the basic method. This method optimizes the sMAE defined in Equation (2), with an average of 0.23 days below the temporal ground truth on the segmentation ground truth. On these examples, none of the objects exceeds the temporal ground truth, which gives an insight into the ability of the method not to extend the final sequence with wrongly selected frames. This observation does not hold for the NPOS and POS methods, which, in addition to presenting larger deviations from the temporal ground truth, tend to exceed the maximum possible real duration, as reflected by the positive sign of their sMAE.
Let us focus on the errors produced by the object sampling methods in Table 2. For each dataset, the (NP+P)OS method brings fewer errors into the output sequence. The combination of the two time scales, successive frames and frames one photoperiod apart, outperforms the use of each one separately. Applied to the segmentation ground truth, no output sequence contains a misdetected frame. Even with segmentation errors, the (NP+P)OS method ensures a low number of wrongly sampled frames.
In Table 2, the transition from Cabbage SGT #1 to Cabbage Seg #1, i.e., the addition of segmentation errors, demonstrates a strong robustness of the methods, which still perform well overall on datasets with segmentation errors. Transferability to another species is also shown by the stability of the results when switching to the Pepper Seg dataset.
It is important to note that the POS method shows fewer frame classification errors on segmentation predictions than the NPOS and (NP+P)OS methods. On the other hand, due to the structure of the algorithm, a false detection of overlap is directly propagated throughout the daily series.
In Figure 6, we present an example of frame selection with the object sampling methods and the gain of these methods compared to the baseline method. This example is relevant for the comparison of the methods because there is a long period of overlap on the studied plant; the potential gain is therefore significant. The POS and (NP+P)OS methods return correct sequences and bring a gain of 9 days. The periodic sampling of (NP+P)OS fixes the errors of the NPOS method. This example shows the improvement brought by the combination of periodic and non-periodic sampling.

5. Discussion

The methods presented provide a significant extension of the duration of plant monitoring. However, some errors remain. We propose to analyze them for the best method, i.e., the (NP+P)OS method, as provided in Table 3. For the (NP+P)OS method applied to the segmentation ground truth, the main source of errors corresponds to the simultaneous occurrence of overlaps. When several overlaps appear within the time interval between two frames, the abrupt change is well detected and the object is correctly identified as overlapped; however, we do not know how many plants are actually overlapped. Consequently, when the end of one overlap is detected, we can conclude that the object is non-overlapped while it still is, as shown in Figure 7. To avoid this kind of error, we could increase the acquisition frequency. In addition, one could estimate the surface of leaves at each time frame over the population and detect such simultaneous overlaps.
The second source of error corresponds to a wrong overlap detection due to objects reaching the limits of the frame. The objects are then cut, which results in an artificial abrupt growth. For each plant monitored, we should therefore restrict the application of the sampling methods to a spatio-temporal domain in which the objects are certain not to reach the limits of the frame.
The binarization of the predicted images was produced by a standard Random Forest algorithm available in open source software [28]. However, any alternative solution providing a correct segmentation could be used. Depending on the segmentation accuracy, different gains can be obtained. The impact of the quality of the segmentation on the performance of our sampling algorithms is an interesting perspective for further work.
The closest work, carried out in [10], proposes a different approach: the acquisition frequency is much lower, and the few overlaps are processed by image processing methods, in particular the watershed [29]. The claimed gain for the method in [10] is 28%, which is equivalent to 2 days on the dataset considered. The (NP+P)OS method appears better on each of our studied datasets, with gains presented in Table 2 in the range of 10 days. Interestingly, our method does not perform spatial image processing but rather focuses on the temporal selection of the frames or the objects in the images. These two approaches are complementary and compatible with each other. Their coupling could be studied in order to further extend the duration of monitoring of overlapped plants. In addition, extra sensors such as the depth sensors used in [3,11,14,15,23] could be used to enhance the contrast between overlapping plants.
Our sampling methods require an oversampling of the plants with respect to the time scale of their circadian movements. This oversampling comes with a cost corresponding to an increase in data flow and data storage. In our hardware setup, images were first acquired on nano-computers [25] and then transferred to a main storage server. So far, for this article, all images were transferred to the main server, and the sampling methods were simulated on data at rest. However, this hardware setup, with two processing units, offers opportunities for the deployment of our sampling methods during the acquisition process. The segmentation of the images could first be deployed on the nano-computers together with the sampling methods proposed in this article. Frames or objects detected as overlapped could be erased directly, and only the sampled frames or objects would then be transferred to the main server. The overlap detection of our sampling methods requires a buffer which varies from one method to another: only the last frame is needed for non-periodic sampling, whereas a full sliding day needs to be buffered for periodic sampling. Embedding our sampling methods in nano-computers, together with advanced watershed approaches and enhanced contrast from depth cameras, constitutes a practical perspective currently under development in our group.
The proposed sampling methods were presented in controlled conditions. Segmentation is facilitated by an overall even and stable illumination. This also ensures that the observed movements correspond to the natural movements of the plants. However, the proposed sampling methods could be extended to other growth conditions. For instance, they could be applied in outdoor conditions, benefiting from other sources of movement (due to wind). In such conditions, more elaborate segmentation algorithms would be needed, but the sampling methods would remain valuable.

6. Conclusions

In this article, we proposed a simple yet innovative approach to address the issue of plant overlapping when monitoring the growth of ensembles of plants observed from a top view. Instead of processing images to separate touching objects, we demonstrated that it is possible to use the natural movements of the plants provided by the circadian cycles. Such movements are available for free when operating in controlled conditions. Oversampling the acquisition provides situations where plants overlapping at a certain time are not overlapping anymore at another time of the day. The best sampling strategies demonstrated the capability of extending the observation time from 3 days to about 10 days by comparison with a sampling method that stops when the first overlap occurs. Further improvement could be achieved by coupling this approach with standard image processing approaches that separate touching objects or via additional sensors providing higher contrast, such as depth sensors.

Author Contributions

Conceptualization, D.R., C.T., P.R. and M.C.; methodology, D.R., C.T., P.R. and M.C.; software, M.C.; validation, D.R., C.T., P.R. and M.C.; formal analysis, D.R., C.T. and M.C.; investigation, D.R., C.T., P.R. and M.C.; resources, D.R., C.T., P.R. and M.C.; data curation, D.R., C.T., P.R. and M.C.; writing—original draft preparation, M.C.; writing—review and editing, D.R. and C.T.; visualization, D.R., C.T. and M.C.; supervision, D.R., C.T. and P.R.; project administration, D.R. and C.T.; funding acquisition, D.R. and C.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by ANRT (Association Nationale de la Recherche et de la Technologie), under grant agreement 2020-1738, Vilmorin-Mikado and University of Angers.

Data Availability Statement

The data presented in this study are available on request at [email protected]. The data are not publicly available due to privacy reasons.

Acknowledgments

The authors thank the collaborators from the ImHorPhen, LARIS, University of Angers, and VisArt, R&D Vilmorin-Mikado and Limagrain teams.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Sinclair, J.B.; Dhingra, O.D. Basic Plant Pathology Methods; CRC Press: Boca Raton, FL, USA, 2017.
  2. Samiei, S.; Rasti, P.; Ly Vu, J.; Buitink, J.; Rousseau, D. Deep learning-based detection of seedling development. Plant Methods 2020, 16, 103.
  3. Apelt, F.; Breuer, D.; Nikoloski, Z.; Stitt, M.; Kragler, F. Phytotyping4D: A light-field imaging system for non-invasive and accurate monitoring of spatio-temporal plant growth. Plant J. 2015, 82, 693–706.
  4. Rehman, T.U.; Zhang, L.; Wang, L.; Ma, D.; Maki, H.; Sánchez-Gallego, J.A.; Mickelbart, M.V.; Jin, J. Automated leaf movement tracking in time-lapse imaging for plant phenotyping. Comput. Electron. Agric. 2020, 175, 105623.
  5. Gall, G.E.C.; Pereira, T.D.; Jordan, A.; Meroz, Y. Fast estimation of plant growth dynamics using deep neural networks. Plant Methods 2022, 18, 21.
  6. De Vylder, J.; Vandenbussche, F.; Hu, Y.; Philips, W.; Van Der Straeten, D. Rosette Tracker: An Open Source Image Analysis Tool for Automatic Quantification of Genotype Effects. Plant Physiol. 2012, 160, 1149–1159.
  7. Bewley, A.; Ge, Z.; Ott, L.; Ramos, F.; Upcroft, B. Simple online and realtime tracking. In Proceedings of the 2016 IEEE International Conference on Image Processing (ICIP), Phoenix, AZ, USA, 25–28 September 2016; pp. 3464–3468.
  8. Wojke, N.; Bewley, A.; Paulus, D. Simple Online and Realtime Tracking with a Deep Association Metric. In Proceedings of the 2017 IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017.
  9. Nasseri, M.H.; Moradi, H.; Hosseini, R.; Babaee, M. Simple online and real-time tracking with occlusion handling. arXiv 2021, arXiv:2103.04147.
  10. Ortega, J.; Castillo, S.; Gehan, M.; Fahlgren, N. Segmentation of Overlapping Plants in Multi-Plant Image Time Series; Authorea: Hoboken, NJ, USA, 2021.
  11. Chéné, Y.; Rousseau, D.; Lucidarme, P.; Bertheloot, J.; Caffier, V.; Morel, P.; Belin, E.; Chapeau-Blondeau, F. On the use of depth camera for 3D phenotyping of entire plants. Comput. Electron. Agric. 2012, 82, 122–127.
  12. Mohammed Amean, Z.; Low, T.; Hancock, N. Automatic leaf segmentation and overlapping leaf separation using stereo vision. Array 2021, 12, 100099.
  13. Li, Y.; Mao, H.; Girshick, R.; He, K. Exploring Plain Vision Transformer Backbones for Object Detection. In Proceedings of the Computer Vision–ECCV 2022: 17th European Conference, Tel Aviv, Israel, 23–27 October 2022; Proceedings, Part IX. Springer: Cham, Switzerland, 2022.
  14. Nguyen, C.V.; Fripp, J.; Lovell, D.R.; Furbank, R.; Kuffner, P.; Daily, H.; Sirault, X. 3D Scanning System for Automatic High-Resolution Plant Phenotyping. In Proceedings of the 2016 International Conference on Digital Image Computing: Techniques and Applications (DICTA), Gold Coast, QLD, Australia, 30 November–2 December 2016; pp. 1–8.
  15. Wu, J.; Xue, X.; Zhang, S.; Qin, W.; Chen, C.; Sun, T. Plant 3D reconstruction based on LiDAR and multi-view sequence images. Int. J. Precis. Agric. Aviat. 2018, 1, 37–43.
  16. Gosta, M.; Grgic, M. Accomplishments and challenges of computer stereo vision. In Proceedings of the ELMAR-2010, Zadar, Croatia, 15–17 September 2010; pp. 57–64.
  17. Sweeney, B.M. Rhythmic Phenomena in Plants; Academic Press: Cambridge, MA, USA, 2013.
  18. McClung, C.R. Plant Circadian Rhythms. Plant Cell 2006, 18, 792–803.
  19. Satter, R.L.; Galston, A.W. Mechanisms of Control of Leaf Movements. Annu. Rev. Plant Physiol. 1981, 32, 83–110.
  20. Greenham, K.; Lou, P.; Remsen, S.E.; Farid, H.; McClung, C.R. TRiP: Tracking Rhythms in Plants, an automated leaf movement analysis program for circadian period estimation. Plant Methods 2015, 11, 33.
  21. Yin, X.; Liu, X.; Chen, J.; Kramer, D.M. Joint multi-leaf segmentation, alignment, and tracking for fluorescence plant videos. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 40, 1411–1423.
  22. Geldhof, B.; Pattyn, J.; Eyland, D.; Carpentier, S.; Van de Poel, B. A digital sensor to measure real-time leaf movements and detect abiotic stress in plants. Plant Physiol. 2021, 187, 1131–1148.
  23. Garbouge, H.; Rasti, P.; Rousseau, D. Enhancing the Tracking of Seedling Growth Using RGB-Depth Fusion and Deep Learning. Sensors 2021, 21, 8425.
  24. Intel RealSense Documentation. Available online: https://dev.intelrealsense.com/docs (accessed on 10 May 2023).
  25. Raspberry Pi Documentation. Available online: https://www.raspberrypi.com/documentation/ (accessed on 10 May 2023).
  26. Li, Z.; Guo, R.; Li, M.; Chen, Y.; Li, G. A review of computer vision technologies for plant phenotyping. Comput. Electron. Agric. 2020, 176, 105672.
  27. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
  28. Berg, S.; Kutra, D.; Kroeger, T.; Straehle, C.N.; Kausler, B.X.; Haubold, C.; Schiegg, M.; Ales, J.; Beier, T.; Rudy, M.; et al. ilastik: Interactive machine learning for (bio)image analysis. Nat. Methods 2019, 16, 1226–1232.
  29. Beucher, S. Watersheds of functions and picture segmentation. In Proceedings of the ICASSP’82, IEEE International Conference on Acoustics, Speech, and Signal Processing, Paris, France, 3–5 May 1982; Volume 7, pp. 1928–1931.
Figure 1. Graphical abstract: example of daily sequence of an individualized plant obtained by the proposed methods.
Figure 2. Example of frames extracted from the temporal sequence of the cabbage experiment. RGB data are in the first row, plant segmentation ground truth in the second and plant segmentation prediction in the third. Each column corresponds to a different time.
Figure 3. Illustration of frame sampling methods. In the grid, the white frames correspond to frames without overlapping and gray frames correspond to frames with overlapping. Blue frames correspond to the frames selected by the sampling method.
Figure 4. Pipeline for object sampling methods.
Figure 5. Example of tracking for the object in blue. The other connected components of the scene are shown in yellow. The emergence point is represented in blue and the circle is the research area. ➀ and ➃ are trivial cases where only one object is found in the research area. In ➁ and ➂, the selected object is the closest to the previous center of mass of the object. The red plus sign is the center of mass of the tracked object in the previous frame.
Figure 6. Comparison on an example of results obtained on Cabbage SGT #1 with one frame per hour for each object sampling method: Periodic Object Sampling, Non-Periodic Object Sampling and (Non-Periodic + Periodic) Object Sampling from top to bottom. In the left column, the frames selected in the daily output sequence are surrounded in blue. In the right column, the last frame obtained with the baseline method is surrounded in white and the last frame obtained with the considered method in yellow. The baseline stops early because an overlap occurs elsewhere in the frame (cf. the first overlap in Figure 2). For all frames, blue pixels correspond to the tracked object when not overlapped, red pixels to the tracked object when overlapped and yellow pixels to the other connected components of the scene.
Figure 7. Frame misclassification due to simultaneous change. The object of interest is represented in blue and the other connected components in yellow. In ➁, two overlaps appear simultaneously during the monitoring of the plant shown in blue in ➀. In ➂, the loss of one overlapped plant is well detected, but the object is misclassified as non-overlapped because of the previous simultaneous overlap.
Table 1. Maximum duration of monitoring (in days) for each method on Cabbage segmentation ground truth and segmentation prediction. Each P_i represents one plant of interest. For the frame sampling methods, we only represent the temporal ground truth corresponding to the real maximum duration applying the method; these frame-level durations are shared by all plants (Cabbage SGT #1: Baseline 3, PFS 5, NPFS 5; Cabbage Seg #1: Baseline 0). Orange cells correspond to objects used to fine-tune the parameters of the object sampling methods.

        Cabbage SGT #1 (Object Sampling)    Cabbage Seg #1 (Object Sampling)
        POS   NPOS   (NP+P)OS   TGT         POS   NPOS   (NP+P)OS   TGT
P1      12    5      5          5           12    3      3          4
P2      15    14     12         12          14    13     12         12
P3      7     7      7          7           7     7      7          7
P4      11    11     11         11          10    10     10         10
P5      11    11     6          8           11    11     7          8
P6      12    16     12         12          11    11     9          9
P7      13    7      7          7           13    7      7          7
P8      14    14     14         14          16    11     11         14
P9      17    12     12         12          17    12     12         12
P10     16    16     16         16          16    8      8          16
P11     11    6      6          7           11    3      3          7
P12     16    12     12         12          15    12     12         12
P13     17    13     13         13          17    13     13         13

sMAE (days), Cabbage SGT #1: Baseline −7.46; PFS −5.46; NPFS −5.46; POS +2.77; NPOS +0.77; (NP+P)OS −0.23. Cabbage Seg #1: Baseline −10.07; POS +3; NPOS +1.69; (NP+P)OS −1.31.
GAIN (days), Cabbage SGT #1: PFS +2; NPFS +2; POS +10.23; NPOS +8.23; (NP+P)OS +7.23; TGT +7.46. Cabbage Seg #1: POS +13.07; NPOS +11.76; (NP+P)OS +8.76; TGT +10.07.
Table 2. Overview of the errors for the object sampling methods. In bold, we highlight the best results (n/a: not reported).

                            Tracking        Classification      Output (Daily Sequence)
                            Wrong Object    Wrong Overlapping   Wrong       Empty       sMAE from    Gain from
                            Selection (%)   Detection (%)       Frames (%)  Frames (%)  TGT (days)   Baseline (days)
Cabbage SGT #1   POS        0.03            4.81                16.98       16.98       +2.77        +10.23
                 NPOS       2.35            8.67                3.05        n/a         +0.77        +8.23
                 (NP+P)OS   0               3.23                n/a         n/a         −0.23        +7.23
Cabbage Seg #1   POS        0.03            4.94                20.81       16.11       +3           +13.07
                 NPOS       8.4             9.62                0.99        n/a         +1.69        +11.76
                 (NP+P)OS   0.88            2.15                n/a         n/a         −1.31        +8.76
Cabbage Seg #2   POS        0               3.47                15.3        15.88       +2           +11.6
                 NPOS       10.92           24.57               20          n/a         +3.87        +12.67
                 (NP+P)OS   1.67            7.5                 n/a         n/a         −1.27        +8.6
Pepper Seg       POS        0.2             9.59                12.1        17.83       +2.83        +9.33
                 NPOS       12.07           18.11               11.02       n/a         +4.67        +9.17
                 (NP+P)OS   3.77            12.26               n/a         n/a         −4.83        +5.67
Table 3. Source of errors for the (NP+P)OS method applied on Cabbage SGT #1 as percentage of all errors.

                    Segmentation    Simultaneous Change    Frame Limiting
Cabbage SGT #1      -               58.33%                 42.67%
Cabbage Seg #1      30.77%          40.38%                 29.54%

