Sensing and Automation Technologies for Ornamental Nursery Crop Production: Current Status and Future Prospects
Abstract
1. Introduction
1.1. Scope of the Study
1.2. Paper Organization
2. Sensing and Automation Technologies for Ornamental Crops
2.1. Smart Irrigation
Crop | Nursery Types | Soil Sensor Types | Water Saving | References |
---|---|---|---|---|
Ornamentals | Container | Capacitance-based (WSNs) | 20% to 25% | Chappell et al. [32] |
Hydrangea | Container | Capacitance-based (WSNs) | Not Reported | Coates et al. [36] |
Red Maple and Cherokee Princess | Container and Greenhouse | Matric potential and capacitance sensors (WSNs) | Not Reported | Lea-Cox et al. [31] |
Hydrangea | Container | Electrical conductivity (WSNs) | As much as 83% | Kim et al. [35] |
Woody Ornamental Plants: Oakleaf Hydrangea, Japanese Andromeda, Catawba Rosebay and Mountain Laurel | Container and Greenhouse | Capacitance-based (WSNs) | 50% | Wheeler et al. [34] |
Dogwood and Red Maple | Pot-in-pot | Capacitance-based (WSNs) | 34% to 63% | Belayneh et al. [37] |
Dogwood and Red Maple | Pot-in-pot | Capacitance-based (WSNs) | 62.9% | Lea-Cox and Belayneh [38] |
Ornamental plants | Indoor pots | Capacitance-based (IoT) | Not Reported | Banda-Chávez et al. [39] |
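The systems in the table above share a common control loop: a substrate moisture sensor (typically capacitance-based) is polled over a wireless or IoT link, and irrigation runs only while the measured volumetric water content sits below a grower-defined set point. The Python sketch below illustrates that threshold (hysteresis) logic only; the set points, polling interval, and the sensor/valve functions are hypothetical placeholders rather than the interfaces of any cited system.

```python
import time

# Hypothetical set points for a container substrate (volumetric water content, %).
# Real thresholds are crop- and substrate-specific (see, e.g., Chappell et al. [32]).
VWC_LOW = 30.0     # start irrigation below this value
VWC_HIGH = 38.0    # stop irrigation once this value is reached
POLL_SECONDS = 300

def read_vwc(node_id: int) -> float:
    """Placeholder for a wireless sensor-network query returning
    volumetric water content (%) from a capacitance probe."""
    raise NotImplementedError("replace with the radio/IoT call for your hardware")

def set_valve(zone: int, open_: bool) -> None:
    """Placeholder for actuating a solenoid valve on an irrigation zone."""
    raise NotImplementedError("replace with the relay/controller call for your hardware")

def irrigation_loop(node_id: int, zone: int) -> None:
    """Hysteresis controller: open the valve below VWC_LOW,
    close it once VWC_HIGH is reached, then wait for the next poll."""
    valve_open = False
    while True:
        vwc = read_vwc(node_id)
        if not valve_open and vwc < VWC_LOW:
            set_valve(zone, True)
            valve_open = True
        elif valve_open and vwc >= VWC_HIGH:
            set_valve(zone, False)
            valve_open = False
        time.sleep(POLL_SECONDS)
```

Using two set points rather than a single threshold avoids rapid valve cycling around the trigger value, which is why most sensor-based scheduling systems operate with an on/off band of this kind.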
2.2. Plant Stress Detection
Crop | Stress Type | Imaging Type | Processing Method | Accuracies | References |
---|---|---|---|---|---|
Rose | Powdery mildew | RGB (a video camera: Everio) | Images were converted to HSV and then segmented to extract the diseased region | Up to 93.2% of the diseased region correctly matched | Velázquez-López et al. [42] |
Rose | Fifteen different rose diseases | Color images downloaded from the Google search engine and ChromeDriver | A hybrid deep learning model (CNNs with SVM) | 90.26% accuracy, 90.59% precision, 92.44% recall, and 91.50% F1-score | Nuanmeesri [46] |
Rose | Powdery mildew and gray mold | RGB (Canon 550D Kiss X4); Thermal camera (ITI-P400) | Registration of the visible and thermal images, followed by segmentation of the diseased area | Not reported | Minaei et al. [45] |
Tulip | Tulip breaking virus | RGB (Nikon D70 with a NIKON 18–70 mm zoom lens); Spectral camera (Specim, 430–900 nm range with 4.5 nm resolution) | Spatial information was extracted after segmentation, and Fisher’s linear discriminant analysis (LDA) was then used for detection | Best detection errors of 9%, 18%, and 29% were achieved for the Barcelona, Monte Carlo, and Yokohama tulip varieties, respectively, using the spectral camera | Polder et al. [47] |
Tulip | Tulip breaking virus | RGB (Prosilica GC2450 and GC2450); RGB-NIR multispectral (JAI AD120GE); Multispectral (six-band filter wheel, 500–750 nm) | Plants segmented by thresholding the excess-green image ((2G − R − B) > 0), then LDA for TBV classification | 92% of TBV-diseased plants were correctly classified using the RGB-NIR multispectral system | Polder et al. [43] |
Cyclamen | Botrytis | Hyperspectral imaging (400–1000 nm) | Selected most discriminating wavelengths and then applied LDA | 90% of pixels were classified correctly | Polder et al. [48] |
Pinus radiata seedlings | Pitch canker disease (F. circinatum infection) | Hyperspectral imaging (600–2500 nm) | Wavebands were selected using the Boruta algorithm, and then Random forests were used for discriminating infected seedlings | 0.82 and 0.84 KHAT values for healthy-infected and infected damaged discrimination, respectively | Poona and Ismail [44] |
Lemon myrtle | Myrtle rust | Hyperspectral imaging (350–2500 nm) | Four wavebands were chosen, and RF was applied for discrimination | 90% of overall accuracy | Heim et al. [49] |
Pyrethrum | Ray blight disease | Multispectral radiometer | Reflectance was measured, and data were analyzed using regression analysis | Not reported | Pethybridge et al. [50] |
Rose | Powdery mildew and gray mold | Infrared thermal camera (ITI-P400) | Image registration and then segmentation were performed to extract features, and finally, neuro-fuzzy classifiers were used for classification | 92.3% and 92.59% estimation rates were achieved for powdery mildew and gray mold, respectively | Jafari et al. [51] |
Rose | Botrytis cinerea infection | Infrared thermal camera (ITI-P400) | Analyzed extracted thermal features with radial-basis neural networks | 96.4% correct estimation rate | Jafari et al. [52] |
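Several of the studies summarized above use the same two-stage pipeline: plant pixels are first segmented from the background with a simple color index, and the segmented material is then classified with a linear discriminant. The sketch below follows that pattern using the excess-green threshold (2G − R − B > 0) listed for Polder et al. [43]; the per-plant features and training data are illustrative placeholders, not the published implementation.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def excess_green_mask(rgb: np.ndarray) -> np.ndarray:
    """Segment plant pixels with the excess-green index: 2G - R - B > 0.
    `rgb` is an (H, W, 3) array in R, G, B channel order."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    return (2.0 * g - r - b) > 0.0

def mean_color_features(rgb: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Toy per-plant feature vector: mean R, G, B over the segmented pixels."""
    pixels = rgb[mask]
    return pixels.mean(axis=0) if pixels.size else np.zeros(3)

# Illustrative training step: X holds one feature vector per plant image,
# y holds labels (0 = healthy, 1 = symptomatic). Replace with real data.
X = np.array([[60, 120, 55], [70, 110, 60], [90, 95, 70], [95, 90, 75]], dtype=float)
y = np.array([0, 0, 1, 1])
lda = LinearDiscriminantAnalysis().fit(X, y)

# To classify a new image:
#   mask = excess_green_mask(img)
#   label = lda.predict([mean_color_features(img, mask)])
```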
2.3. Smart Spraying
Crops | Nursery Types | Sprayer and Sensor Type | Performance | References |
---|---|---|---|---|
Multiple ornamental tree species | Field nursery | Two sprayers: a vertical boom with an ultrasonic sensor and an air-assisted sprayer with a laser sensor | Chemical usage was reduced by 70%, 66%, and 52% at different growth stages of the target trees; uniform spray deposits were achieved at all tested travel speeds | Zhu et al. [62] |
Multiple ornamental tree species | Field nursery liners | Spray boom with ultrasonic sensor | The mean spray deposit was 0.72–0.90 μL/cm²; the mean spray coverage was 12–14.7% | Jeon and Zhu [63] |
Multiple ornamental tree species | Field nursery liners | Spray boom with ultrasonic sensor | Spray volume was reduced by 86.4%; lower spray deposit and droplet density | Jeon et al. [64] |
Tsuga canadensis; Thuja occidentalis | Container-grown | Laser scanner air-assisted sprayer | Spray coverage did not differ significantly | Chen et al. [57] |
Ornamental nursery and grapevine | Field nursery | Laser scanner air-assisted sprayer | Chemical usage reduced by 50% at a travel speed of 3.2 to 8.0 km/h | Liu et al. [65] |
Japanese maple | Field nursery | Laser-guided air-assisted sprayer | Spray savings of 12 to 43% | Shen et al. [66] |
Prairifire crabapple; Honey locust | Field nursery; pot-in-pot | Laser-guided air-assisted sprayer | Chemical savings of 36% and 30% in the Prairifire crabapple and Honey locust nurseries, respectively | Zhu et al. [59] |
Multiple ornamental tree species | Field nursery | Laser-guided air-assisted sprayer | Chemical savings of 56% and 52% for two nurseries | Chen et al. [67] |
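The sensor-guided sprayers compared above operate on one principle: a ranging sensor (ultrasonic or laser) profiles the canopy as the sprayer travels, and each nozzle's output is modulated so that the delivered spray volume tracks the detected canopy, with nozzles shut off in gaps between trees. The following sketch shows that proportional-control idea; the application coefficient, flow limit, and nozzle interface are assumed values for illustration and do not come from the cited systems.

```python
from dataclasses import dataclass

@dataclass
class NozzleSection:
    nozzle_id: int
    height_band_m: float        # vertical band covered by this nozzle

# Assumed application coefficient: liters of spray per cubic meter of detected canopy.
# Real laser-guided sprayers derive this from calibration trials (e.g., Shen et al. [66]).
LITERS_PER_M3 = 0.08
MAX_FLOW_LPM = 1.5              # hypothetical per-nozzle flow limit

def section_flow_lpm(canopy_depth_m: float, section: NozzleSection,
                     travel_speed_m_per_min: float) -> float:
    """Flow rate for one nozzle so that output is proportional to the canopy
    volume passing that nozzle per minute. Returns 0 when no canopy is
    detected, i.e., the nozzle is switched off between trees."""
    if canopy_depth_m <= 0.0:
        return 0.0
    canopy_volume_per_min = canopy_depth_m * section.height_band_m * travel_speed_m_per_min
    return min(LITERS_PER_M3 * canopy_volume_per_min, MAX_FLOW_LPM)

# Example: a nozzle covering a 0.5 m band sees 1.2 m of canopy depth
# while the sprayer travels at 60 m/min (3.6 km/h).
nozzle = NozzleSection(nozzle_id=3, height_band_m=0.5)
print(section_flow_lpm(1.2, nozzle, 60.0))   # proportional flow, capped at MAX_FLOW_LPM
```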
2.4. Plant Biometrics and Identification
Crops | Sensor Type | Model | Performance | References |
---|---|---|---|---|
Seven different plant cultivars–container | RGB camera | Color co-occurrence matrices (intensity, saturation, and hue) | Overall classification accuracy of 91% | Shearer and Holmes [74] |
Perennial peanut and Fire chief arborvitae–container | RGB camera | Vegetation index thresholding and the support vector machine (SVM) | Accuracy of more than 94% | She et al. [75] |
Fire Chief arborvitae–container | UAS-based RGB camera | Custom counting algorithm | Counting error on gravel and black fabric of 8% and 2%, respectively | Leiva et al. [76] |
Spruce, Mongolian scotch pine, Manchurian ash–field | RGB-Depth camera | YOLOv4 with GhostNet | Accuracy of more than 92% in both counting and height measurement | Yuan et al. [77] |
Eleven different tree nurseries–field | UAS-based Multispectral camera | Grey Level Co-occurrence Matrix for texture images; Maximum Likelihood algorithm and Principal Component Analysis | Accuracy of up to 87%, depending on the component reduction applied to the multispectral imagery | Gini et al. [78] |
Six different species–container | LiDAR sensor | Logistic regression functions, support vector machines (SVM) | Accuracy greater than 98% | Weiss et al. [79] |
Almond tree nursery | LiDAR and light curtain sensors | Custom segmentation and thresholding algorithm | Tree detection accuracy of 95.7% (LiDAR) and 99.48% (light curtain sensors); dead/alive tree detection accuracy of 93.75% (LiDAR) and 94.16% (light curtain sensors) | Garrido et al. [73] |
Flower-Field | RGB camera | ResNet18, ResNet50, and DenseNet121 | Accuracy of 91.88%, 97.34%, and 99.82%, respectively | Zhang et al. [71] |
Flower-Field | RGB camera | DenseNet121 | Accuracy of 98.6% after 50 training epochs | Alipour et al. [80] |
Flower-Field | RGB camera | CNN, VGG16, MobileNetV2, and ResNet50 | Test accuracy of 91%, 89.35%, 92.12%, and 71.75%, respectively | Narvekar and Rao [83] |
Flower-Field | RGB camera | Custom and Inception v3 | Accuracy of 83% and 99%, respectively | Dharwadkar et al. [81] |
Flower-Field | RGB camera | Naive Bayes (NB), Generalized Linear Model (GLM), Multilayer Perceptron (MP), Decision Tree (DT), Random Forest (RF), Gradient Boosted Trees (GBT), and Support Vector Machine (SVM) | RF is the best-performing model, with an accuracy of 78.5%. | Malik et al. [82] |
Flower-Field | RGB camera | Viola-Jones object detection and normalized cross-correlation algorithm | Classification accuracy of more than 99% with <0.5 s processing time | Soleimanipour and Chegini [84] |
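Most of the flower- and plant-identification results above rely on transfer learning: a convolutional network pretrained on ImageNet (e.g., ResNet50 or DenseNet121) has its classification head replaced and is fine-tuned on the target species images. A minimal PyTorch sketch of that recipe follows; the dataset directory, class count, and training schedule are placeholders rather than any cited study's configuration.

```python
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_CLASSES = 10              # placeholder: number of flower species/cultivars
DATA_DIR = "flowers/train"    # placeholder: ImageFolder-style directory of labeled images

# Standard ImageNet preprocessing for a pretrained backbone.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

dataset = datasets.ImageFolder(DATA_DIR, transform=preprocess)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

# Replace the classifier head of a pretrained ResNet50 and fine-tune end-to-end.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):                        # small epoch count for illustration only
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```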
2.5. Other Significant Works
3. Future Prospects/Directions
3.1. Advanced Camera Sensor Applications
3.1.1. ToF, LiDAR, and 3D Sensors Applications
3.1.2. Spectral Sensor Applications
3.2. Enhanced Deep Network Applications
3.3. Edge-AI Applications
3.4. Radio Frequency Identification Tagging Applications
3.5. Integrated Robotics Applications
4. Discussion and Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- USDA. U.S. Horticulture in 2014 (Publication ACH12-33); United States Department of Agriculture: Beltsville, MD, USA. Available online: https://www.agcensus.usda.gov/Publications/2012/Online_Resources/Highlights/Horticulture/Census_of_Horticulture_Highlights.pdf (accessed on 21 November 2022).
- Lea-Cox, J.D.; Zhao, C.; Ross, D.S.; Bilderback, T.E.; Harris, J.R.; Day, S.D.; Hong, C.; Yeager, T.H.; Beeson, R.C.; Bauerle, W.L.; et al. A Nursery and Greenhouse Online Knowledge Center: Learning Opportunities for Sustainable Practice. HortTechnology 2010, 20, 509–517. [Google Scholar] [CrossRef]
- Majsztrik, J.C.; Fernandez, R.T.; Fisher, P.R.; Hitchcock, D.R.; Lea-Cox, J.; Owen, J.S.; Oki, L.R.; White, S.A. Water Use and Treatment in Container-Grown Specialty Crop Production: A Review. Water. Air. Soil Pollut. 2017, 228, 151. [Google Scholar] [CrossRef] [PubMed]
- Majsztrik, J.; Lichtenberg, E.; Saavoss, M. Ornamental Grower Perceptions of Wireless Irrigation Sensor Networks: Results from a National Survey. HortTechnology 2013, 23, 775–782. [Google Scholar] [CrossRef]
- Wheeler, W.D.; Thomas, P.; van Iersel, M.; Chappell, M. Implementation of Sensor-Based Automated Irrigation in Commercial Floriculture Production: A Case Study. HortTechnology 2018, 28, 719–727. [Google Scholar] [CrossRef]
- Rihn, A.L.; Velandia, M.; Warner, L.A.; Fulcher, A.; Schexnayder, S.; LeBude, A. Factors Correlated with the Propensity to Use Automation and Mechanization by the US Nursery Industry. Agribusiness 2022, 39, 110–130. [Google Scholar] [CrossRef]
- USDA ERS. Farm Labor. Available online: https://www.ers.usda.gov/topics/farm-economy/farm-labor/ (accessed on 20 November 2022).
- McClellan, M. Don’t Wait, Automate. Available online: https://www.nurserymag.com/article/five-tips-automation/ (accessed on 20 November 2022).
- Silwal, A.; Davidson, J.R.; Karkee, M.; Mo, C.; Zhang, Q.; Lewis, K. Design, Integration, and Field Evaluation of a Robotic Apple Harvester. J. Field Robot. 2017, 34, 1140–1159. [Google Scholar] [CrossRef]
- Liu, B.; Ding, Z.; Tian, L.; He, D.; Li, S.; Wang, H. Grape Leaf Disease Identification Using Improved Deep Convolutional Neural Networks. Front. Plant Sci. 2020, 11, 1082. [Google Scholar] [CrossRef]
- Xiong, Y.; Peng, C.; Grimstad, L.; From, P.J.; Isler, V. Development and Field Evaluation of a Strawberry Harvesting Robot with a Cable-Driven Gripper. Comput. Electron. Agric. 2019, 157, 392–402. [Google Scholar] [CrossRef]
- Ye, H.; Huang, W.; Huang, S.; Cui, B.; Dong, Y.; Guo, A.; Ren, Y.; Jin, Y. Recognition of Banana Fusarium Wilt Based on UAV Remote Sensing. Remote Sens. 2020, 12, 938. [Google Scholar] [CrossRef]
- Gajjar, R.; Gajjar, N.; Thakor, V.J.; Patel, N.P.; Ruparelia, S. Real-Time Detection and Identification of Plant Leaf Diseases Using Convolutional Neural Networks on an Embedded Platform. Vis. Comput. 2022, 38, 2923–2938. [Google Scholar] [CrossRef]
- Lehnert, C.; English, A.; Mccool, C.; Tow, A.W.; Perez, T. Autonomous Sweet Pepper Harvesting for Protected Cropping Systems. IEEE Robot. Autom. Lett. 2017, 2, 872–879. [Google Scholar] [CrossRef]
- Birrell, S.; Hughes, J.; Cai, J.Y.; Iida, F. A Field-Tested Robotic Harvesting System for Iceberg Lettuce. J. Field Robot. 2020, 37, 225–245. [Google Scholar] [CrossRef] [PubMed]
- Yasukawa, S.; Li, B.; Sonoda, T.; Ishii, K. Development of a Tomato Harvesting Robot. Proc. Int. Conf. Artif. Life Robot. 2017, 22, 408–411. [Google Scholar] [CrossRef]
- Amatya, S.; Karkee, M.; Gongal, A.; Zhang, Q.; Whiting, M.D. Detection of Cherry Tree Branches with Full Foliage in Planar Architecture for Automated Sweet-Cherry Harvesting. Biosyst. Eng. 2016, 146, 3–15. [Google Scholar] [CrossRef]
- Mahmud, M.S.; Zahid, A.; He, L.; Martin, P. Opportunities and Possibilities of Developing an Advanced Precision Spraying System for Tree Fruits. Sensors 2021, 21, 3262. [Google Scholar] [CrossRef] [PubMed]
- Lu, J.; Hu, J.; Zhao, G.; Mei, F.; Zhang, C. An In-Field Automatic Wheat Disease Diagnosis System. Comput. Electron. Agric. 2017, 142, 369–379. [Google Scholar] [CrossRef]
- Jiang, P.; Chen, Y.; Liu, B.; He, D.; Liang, C. Real-Time Detection of Apple Leaf Diseases Using Deep Learning Approach Based on Improved Convolutional Neural Networks. IEEE Access 2019, 7, 59069–59080. [Google Scholar] [CrossRef]
- Abdulridha, J.; Batuman, O.; Ampatzidis, Y. UAV-Based Remote Sensing Technique to Detect Citrus Canker Disease Utilizing Hyperspectral Imaging and Machine Learning. Remote Sens. 2019, 11, 1373. [Google Scholar] [CrossRef]
- Torres-Sánchez, J.; Peña, J.M.; de Castro, A.I.; López-Granados, F. Multi-Temporal Mapping of the Vegetation Fraction in Early-Season Wheat Fields Using Images from UAV. Comput. Electron. Agric. 2014, 103, 104–113. [Google Scholar] [CrossRef]
- Pearse, G.D.; Tan, A.Y.S.; Watt, M.S.; Franz, M.O.; Dash, J.P. Detecting and Mapping Tree Seedlings in UAV Imagery Using Convolutional Neural Networks and Field-Verified Data. ISPRS J. Photogramm. Remote Sens. 2020, 168, 156–169. [Google Scholar] [CrossRef]
- Zhang, C.; Atkinson, P.M.; George, C.; Wen, Z.; Diazgranados, M.; Gerard, F. Identifying and Mapping Individual Plants in a Highly Diverse High-Elevation Ecosystem Using UAV Imagery and Deep Learning. ISPRS J. Photogramm. Remote Sens. 2020, 169, 280–291. [Google Scholar] [CrossRef]
- Feng, A.; Zhou, J.; Vories, E.D.; Sudduth, K.A.; Zhang, M. Yield Estimation in Cotton Using UAV-Based Multi-Sensor Imagery. Biosyst. Eng. 2020, 193, 101–114. [Google Scholar] [CrossRef]
- Maja, J.M.J.; Robbins, J. Controlling Irrigation in a Container Nursery Using IoT. AIMS Agric. Food 2018, 3, 205–215. [Google Scholar] [CrossRef]
- You, A.; Parayil, N.; Krishna, J.G.; Bhattarai, U.; Sapkota, R.; Ahmed, D.; Whiting, M.; Karkee, M.; Grimm, C.M.; Davidson, J.R. An Autonomous Robot for Pruning Modern, Planar Fruit Trees. arXiv 2022, arXiv:220607201. [Google Scholar]
- Liu, B.; Tan, C.; Li, S.; He, J.; Wang, H. A Data Augmentation Method Based on Generative Adversarial Networks for Grape Leaf Disease Identification. IEEE Access 2020, 8, 102188–102198. [Google Scholar] [CrossRef]
- Lea-Cox, J.D.; Bauerle, W.L.; van Iersel, M.W.; Kantor, G.F.; Bauerle, T.L.; Lichtenberg, E.; King, D.M.; Crawford, L. Advancing Wireless Sensor Networks for Irrigation Management of Ornamental Crops: An Overview. HortTechnology 2013, 23, 717–724. [Google Scholar] [CrossRef]
- Cornejo, C.; Haman, D.Z.; Yeager, T.H. Evaluation of Soil Moisture Sensors, and Their Use to Control Irrigation Systems for Containers in the Nursery Industry; ASAE Paper No. 054056; ASAE: St. Joseph, MI, USA, 2005. [Google Scholar] [CrossRef]
- Lea-Cox, J.D.; Ristvey, A.G.; Kantor, G.F. Using Wireless Sensor Technology to Schedule Irrigations and Minimize Water Use in Nursery and Greenhouse Production Systems ©. Comb. Proc. Int. Plant Propagators Soc. 2008, 58, 512–518. [Google Scholar]
- Chappell, M.; Dove, S.K.; van Iersel, M.W.; Thomas, P.A.; Ruter, J. Implementation of Wireless Sensor Networks for Irrigation Control in Three Container Nurseries. HortTechnology 2013, 23, 747–753. [Google Scholar] [CrossRef]
- van Iersel, M.W.; Chappell, M.; Lea-Cox, J.D. Sensors for Improved Efficiency of Irrigation in Greenhouse and Nursery Production. HortTechnology 2013, 23, 735–746. [Google Scholar] [CrossRef]
- Wheeler, W.D.; Chappell, M.; van Iersel, M.; Thomas, P. Implementation of Soil Moisture Sensor Based Automated Irrigation in Woody Ornamental Production. J. Environ. Hortic. 2020, 38, 1–7. [Google Scholar] [CrossRef]
- Kim, J.; Chappell, M.; Van Iersel, M.W.; Lea-Cox, J.D. Wireless Sensors Networks for Optimization of Irrigation, Production, and Profit in Ornamental Production. Acta Hortic. 2014, 1037, 643–649. [Google Scholar]
- Coates, R.W.; Delwiche, M.J.; Broad, A.; Holler, M.; Evans, R.; Oki, L.; Dodge, L. Wireless Sensor Network for Precision Irrigation Control in Horticultural Crops; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2012; Volume 3. [Google Scholar]
- Belayneh, B.E.; Lea-Cox, J.D.; Lichtenberg, E. Costs and Benefits of Implementing Sensor-Controlled Irrigation in a Commercial Pot-in-Pot Container Nursery. HortTechnology 2013, 23, 760–769. [Google Scholar] [CrossRef] [Green Version]
- Lea-Cox, J.D.; Belayneh, B.E. Implementation of Sensor-Controlled Decision Irrigation Scheduling in Pot-in-Pot Nursery Production. Acta Hortic. 2013, 1034, 93–100. [Google Scholar] [CrossRef]
- Banda-Chávez, J.M.; Serrano-Rubio, J.P.; Manjarrez-Carrillo, A.O.; Rodriguez-Vidal, L.M.; Herrera-Guzman, R. Intelligent Wireless Sensor Network for Ornamental Plant Care. In Proceedings of the IECON 2018—44th Annual Conference of the IEEE Industrial Electronics Society, Washington, DC, USA, 21–23 October 2018; Volume 1. [Google Scholar]
- Beeson, R., Jr.; Brooks, J. Evaluation of a Model Based on Reference Crop Evapotranspiration (ETo) for Precision Irrigation Using Overhead Sprinklers during Nursery Production of Ligustrum Japonica. Proc. V Int. Symp. Irrig. Hortic. Crops 2006, 792, 85–90. [Google Scholar]
- Zubler, A.V.; Yoon, J.Y. Proximal Methods for Plant Stress Detection Using Optical Sensors and Machine Learning. Biosensors 2020, 10, 193. [Google Scholar] [CrossRef]
- Velázquez-López, N.; Sasaki, Y.; Nakano, K.; Mejía-Muñoz, J.M.; Kriuchkova, E.R. Detection of Powdery Mildew Disease on Rose Using Image Processing with Open CV. Rev. Chapingo Ser. Hortic. 2011, 17, 151–160. [Google Scholar] [CrossRef]
- Polder, G.; van der Heijden, G.W.A.M.; van Doorn, J.; Baltissen, T.A.H.M.C. Automatic detection of tulip breaking virus (TBV) in tulip fields using machine vision. Biosyst. Eng. 2014, 117, 35–42. [Google Scholar] [CrossRef]
- Poona, N.K.; Ismail, R. Using Boruta-Selected Spectroscopic Wavebands for the Asymptomatic Detection of Fusarium Circinatum Stress. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 3764–3772. [Google Scholar] [CrossRef]
- Minaei, S.; Jafari, M.; Safaie, N. Design and Development of a Rose Plant Disease-Detection and Site-Specific Spraying System Based on a Combination of Infrared and Visible Images. J. Agric. Sci. Technol. 2018, 20, 23–36. [Google Scholar]
- Nuanmeesri, S. A Hybrid Deep Learning and Optimized Machine Learning Approach for Rose Leaf Disease Classification. Eng. Technol. Appl. Sci. Res. 2021, 11, 7678–7683. [Google Scholar] [CrossRef]
- Polder, G.; van der Heijden, G.W.A.M.; van Doorn, J.; Clevers, J.G.P.W.; van der Schoor, R.; Baltissen, A.H.M.C. Detection of the Tulip Breaking Virus (TBV) in Tulips Using Optical Sensors. Precis. Agric. 2010, 11, 397–412. [Google Scholar] [CrossRef]
- Polder, G.; Pekkeriet, E.; Snikkers, M. A Spectral Imaging System for Detection of Botrytis in Greenhouses. In Proceedings of the EFITA-WCCA-CIGR Conference “Sustainable Agriculture through ICT Innovation”, Turin, Italy, 24–27 June 2013. [Google Scholar]
- Heim, R.H.J.; Wright, I.J.; Allen, A.P.; Geedicke, I.; Oldeland, J. Developing a Spectral Disease Index for Myrtle Rust (Austropuccinia psidii). Plant Pathol. 2019, 68, 738–745. [Google Scholar] [CrossRef]
- Pethybridge, S.J.; Hay, F.; Esker, P.; Groom, T.; Wilson, C.; Nutter, F.W. Visual and Radiometric Assessments for Yield Losses Caused by Ray Blight in Pyrethrum. Crop Sci. 2008, 48, 343–352. [Google Scholar] [CrossRef]
- Jafari, M.; Minaei, S.; Safaie, N. Detection of Pre-Symptomatic Rose Powdery-Mildew and Gray-Mold Diseases Based on Thermal Vision. Infrared Phys. Technol. 2017, 85, 170–183. [Google Scholar] [CrossRef]
- Jafari, M.; Minaei, S.; Safaie, N.; Torkamani-Azar, F.; Sadeghi, M. Classification Using Radial-Basis Neural Networks Based on Thermographic Assessment of Botrytis Cinerea Infected Cut Rose Flowers Treated with Methyl Jasmonate. J. Crop Prot. 2016, 5, 591–602. [Google Scholar] [CrossRef]
- Buitrago, M.F.; Groen, T.A.; Hecker, C.A.; Skidmore, A.K. Changes in Thermal Infrared Spectra of Plants Caused by Temperature and Water Stress. ISPRS J. Photogramm. Remote. Sens. 2016, 111, 22–31. [Google Scholar] [CrossRef]
- de Castro, A.; Maja, J.M.; Owen, J.; Robbins, J.; Peña, J. Experimental Approach to Detect Water Stress in Ornamental Plants Using SUAS-Imagery. In Proceedings of the Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping III, Orlando, FL, USA, 16–17 April 2018; Volume 10664, pp. 178–188. [Google Scholar]
- Braman, S.; Chappell, M.; Chong, J.; Fulcher, A.; Gauthier, N.; Klingeman, W.; Knox, G.; LeBude, A.; Neal, J.; White, S.; et al. Pest Management Strategic Plan for Container and Field-Produced Nursery Crops: Revision 2015. In Proceedings of the Southern Nursery Integrated Pest Management Working Group (SNIPM), Mills River, NC, USA, 30–31 July 2009; Volume 236. [Google Scholar]
- Mizell, R.F.; Short, D.E. Integrated Pest Management in the Commercial Ornamental Nursery. 2015; Volume 8. Available online: https://site.caes.uga.edu/sehp/files/2020/03/UF-IPM-in-the-Commercial-Ornamental-Nursery.pdf (accessed on 20 November 2022).
- Chen, Y.; Zhu, H.; Ozkan, H.E. Development of a Variable-Rate Sprayer with Laser Scanning Sensor to Synchronize Spray Outputs to Tree Structures. Trans. ASABE 2012, 55, 773–781. [Google Scholar] [CrossRef]
- Hudson, W.G.; Garber, M.P.; Oetting, R.D.; Mizell, R.F.; Chase, A.R.; Bondari, K. Pest Management in the United States Greenhouse and Nursery Industry: V. Insect and Mite Control. HortTechnology 1996, 6, 216–221. [Google Scholar] [CrossRef]
- Zhu, H.; Rosetta, R.; Reding, M.E.; Zondag, R.H.; Ranger, C.M.; Canas, L.; Fulcher, A.; Derksen, R.C.; Ozkan, H.E.; Krause, C.R. Validation of a Laser-Guided Variable-Rate Sprayer for Managing Insects in Ornamental Nurseries. Trans. ASABE 2017, 60, 337–345. [Google Scholar] [CrossRef]
- Fox, R.D.; Derksen, R.C.; Zhu, H.; Brazee, R.D.; Svensson, S.A. A History of Air-Blast Sprayer Development and Future Prospects. Trans. ASABE 2008, 51, 405–410. [Google Scholar] [CrossRef]
- Chen, L.; Zhu, H.; Horst, L.; Wallhead, M.; Reding, M.; Fulcher, A. Management of Pest Insects and Plant Diseases in Fruit and Nursery Production with Laser-Guided Variable-Rate Sprayers. HortScience 2021, 56, 94–100. [Google Scholar] [CrossRef]
- Zhu, H.; Jeon, H.Y.; Gu, J.; Derksen, R.C.; Krause, C.R.; Ozkan, H.E.; Chen, Y.; Reding, M.E.; Ranger, C.M.; Cañas, L.; et al. Development of Two Intelligent Spray Systems for Ornamental Nurseries©. In Proceedings of the International Plant Propagators’ Society, Miami, FL, USA, 1 August 2010; Volume 60, p. 322. [Google Scholar]
- Jeon, H.; Zhu, H. Development of a Variable-Rate Sprayer for Nursery Liner Applications. Trans. ASABE 2012, 55, 303–312. [Google Scholar] [CrossRef]
- Jeon, H.Y.; Zhu, H.; Derksen, R.C.; Ozkan, H.E.; Krause, C.R.; Fox, R.D. Performance Evaluation of a Newly Developed Variable-Rate Sprayer for Nursery Liner Applications. Trans. ASABE 2011, 54, 773–781. [Google Scholar]
- Liu, H.; Zhu, H.; Shen, Y.; Chen, Y. Embedded Computer-Controlled Laser Sensor-Guided Air-Assisted Precision Sprayer Development. In Proceedings of the ASABE Annual International Meeting, New Orleans, LA, USA, 26–29 July 2015. [Google Scholar]
- Shen, Y.; Zhu, H.; Liu, H.; Chen, Y.; Ozkan, E. Development of a Laser-Guided, Embedded-Computercontrolled, Air-Assisted Precision Sprayer. Trans. ASABE 2017, 60, 1827–1838. [Google Scholar] [CrossRef]
- Chen, L.; Wallhead, M.; Zhu, H.; Fulcher, A. Control of Insects and Diseases with Intelligent Variable-Rate Sprayers in Ornamental Nurseries. J. Environ. Hortic. 2019, 37, 90–100. [Google Scholar] [CrossRef]
- Fessler, L.; Fulcher, A.; Schneider, L.; Wright, W.C.; Zhu, H. Reducing the Nursery Pesticide Footprint with Laser-Guided, Variable-Rate Spray Application Technology. HortScience 2021, 56, 1572–1584. [Google Scholar] [CrossRef]
- Wei, J.; Salyani, M. Development of a Laser Scanner for Measuring Tree Canopy Characteristics: Phase 1. Prototype Development. Trans. Am. Soc. Agric. Eng. 2004, 47, 2101–2107. [Google Scholar] [CrossRef]
- Campbell, J.; Sarkhosh, A.; Habibi, F.; Ismail, A.; Gajjar, P.; Zhongbo, R.; Tsolova, V.; El-sharkawy, I. Biometrics Assessment of Cluster- and Berry-related Traits of Muscadine Grape Population. Plants 2021, 10, 1067. [Google Scholar] [CrossRef]
- Zhang, R.; Tian, Y.; Zhang, J.; Dai, S.; Hou, X.; Wang, J.; Guo, Q. Metric Learning for Image-Based Flower Cultivars Identification. Plant Methods 2021, 17, 1–14. [Google Scholar] [CrossRef] [PubMed]
- Maltoni, D.; Maio, D.; Jain, A.K.; Prabhakar, S. Handbook of Fingerprint Recognition; Springer Science and Business Media: New York, NY, USA, 2009. [Google Scholar] [CrossRef]
- Garrido, M.; Perez-Ruiz, M.; Valero, C.; Gliever, C.J.; Hanson, B.D.; Slaughter, D.C. Active Optical Sensors for Tree Stem Detection and Classification in Nurseries. Sens. Switz. 2014, 14, 10783–10803. [Google Scholar] [CrossRef] [PubMed]
- Shearer, S.A.; Holmes, R.G. Plant identification using color co-occurrence matrices. Trans. ASAE 1990, 33, 1237–1244. [Google Scholar] [CrossRef]
- She, Y.; Ehsani, R.; Robbins, J.; Leiva, J.N.; Owen, J. Applications of High-Resolution Imaging for Open Field Container Nursery Counting. Remote Sens. 2018, 10, 2018. [Google Scholar] [CrossRef]
- Leiva, J.N.; Robbins, J.; Saraswat, D.; She, Y.; Ehsani, R. Evaluating Remotely Sensed Plant Count Accuracy with Differing Unmanned Aircraft System Altitudes, Physical Canopy Separations, and Ground Covers. J. Appl. Remote Sens. 2017, 11, 036003. [Google Scholar] [CrossRef]
- Yuan, X.; Li, D.; Sun, P.; Wang, G.; Ma, Y. Real-Time Counting and Height Measurement of Nursery Seedlings Based on Ghostnet–YoloV4 Network and Binocular Vision Technology. Forests 2022, 13, 1459. [Google Scholar] [CrossRef]
- Gini, R.; Sona, G.; Ronchetti, G.; Passoni, D.; Pinto, L. Improving Tree Species Classification Using UAS Multispectral Images and Texture Measures. ISPRS Int. J. Geo-Inf. 2018, 7, 315. [Google Scholar] [CrossRef]
- Weiss, U.; Biber, P.; Laible, S.; Bohlmann, K.; Zell, A. Plant Species Classification Using a 3D LIDAR Sensor and Machine Learning. In Proceedings of the 2010 Ninth International Conference on Machine Learning and Applications, Washington, DC, USA, 12–14 December 2010; pp. 339–345. [Google Scholar]
- Alipour, N.; Tarkhaneh, O.; Awrangjeb, M.; Tian, H. Flower Image Classification Using Deep Convolutional Neural Network. In Proceedings of the 2021 7th International Conference on Web Research (ICWR), Tehran, Iran, 19–20 May 2021; pp. 1–4. [Google Scholar]
- Dharwadkar, S.; Bhat, G.; Subba Reddy, N.V.; Aithal, P.K. Floriculture Classification Using Simple Neural Network and Deep Learning. In Proceedings of the 2017 2nd IEEE International Conference on Recent Trends in Electronics, Information & Communication Technology (RTEICT), Bangalore, India, 19–20 May 2017; pp. 619–622. [Google Scholar]
- Malik, M.; Aslam, W.; Nasr, E.A.; Aslam, Z.; Kadry, S. A Performance Comparison of Classification Algorithms for Rose Plants. Comput. Intell. Neurosci. 2022, 2022, 1842547. [Google Scholar] [CrossRef] [PubMed]
- Narvekar, C.; Rao, M. Flower Classification Using CNN and Transfer Learning in CNN-Agriculture Perspective. In Proceedings of the 2020 3rd International Conference on Intelligent Sustainable Systems (ICISS), Thoothukudi, India, 3–5 December 2020; pp. 660–664. [Google Scholar]
- Soleimanipour, A.; Chegini, G.R. A Vision-Based Hybrid Approach for Identification of Anthurium Flower Cultivars. Comput. Electron. Agric. 2020, 174, 105460. [Google Scholar] [CrossRef]
- Gunjal, S.; Waskar, D.; Dod, V.; Bhujbal, B.; Ambad, S.N.; Rajput, H.; Hendre, P.; Thoke, N.; Bhaskar, M. Horticulture Nursery Management. 2012. Available online: https://k8449r.weebly.com/uploads/3/0/7/3/30731055/horticulture_plant_nursery1-signed.pdf (accessed on 20 November 2022).
- Li, M.; Ma, L.; Zong, W.; Luo, C.; Huang, M.; Song, Y. Design and Experimental Evaluation of a Form Trimming Machine for Horticultural Plants. Appl. Sci. Switz. 2021, 11, 2230. [Google Scholar] [CrossRef]
- Zhang, M.; Guo, W.; Wang, L.; Li, D.; Hu, B.; Wu, Q. Modeling and Optimization of Watering Robot Optimal Path for Ornamental Plant Care. Comput. Ind. Eng. 2021, 157, 107263. [Google Scholar] [CrossRef]
- Sharma, S.; Borse, R. Automatic Agriculture Spraying Robot with Smart Decision Making. Adv. Intell. Syst. Comput. 2016, 530, 743–758. [Google Scholar] [CrossRef]
- Prabha, P.; Vishnu, R.S.; Mohan, H.T.; Rajendran, A.; Bhavani, R.R. A Cable Driven Parallel Robot for Nursery Farming Assistance. In Proceedings of the 2021 IEEE 9th Region 10 Humanitarian Technology Conference (R10-HTC), Bangalore, India, 30 September–2 October 2021; pp. 1–6. [Google Scholar]
- Kim, W.S.; Lee, D.H.; Kim, Y.J.; Kim, T.; Lee, W.S.; Choi, C.H. Stereo-Vision-Based Crop Height Estimation for Agricultural Robots. Comput. Electron. Agric. 2021, 181, 105937. [Google Scholar] [CrossRef]
- Wang, X.; Singh, D.; Marla, S.; Morris, G.; Poland, J. Field-Based High-Throughput Phenotyping of Plant Height in Sorghum Using Different Sensing Technologies. Plant Methods 2018, 14, 1–16. [Google Scholar] [CrossRef]
- Andújar, D.; Ribeiro, A.; Fernández-Quintanilla, C.; Dorado, J. Using Depth Cameras to Extract Structural Parameters to Assess the Growth State and Yield of Cauliflower Crops. Comput. Electron. Agric. 2016, 122, 67–73. [Google Scholar] [CrossRef]
- Polder, G.; Hofstee, J.W. Phenotyping Large Tomato Plants in the Greenhouse Using a 3D Light-Field Camera. In Proceedings of the 2014 ASABE Annual International Meeting, Montreal, QC, Canada, 13–16 July 2014; American Society of Agricultural and Biological Engineers: St. Joseph, MI, USA, 2014; p. 1. [Google Scholar]
- Kerkech, M.; Hafiane, A.; Canals, R.; Ros, F. Vine Disease Detection by Deep Learning Method Combined with 3d Depth Information. In Proceedings of the International Conference on Image and Signal Processing, 9th International Conference, ICISP 2020, Marrakesh, Morocco, 4–6 June 2020; Springer: Berlin/Heidelberg, Germany, 2020; pp. 82–90. [Google Scholar]
- Gai, J.; Xiang, L.; Tang, L. Using a Depth Camera for Crop Row Detection and Mapping for Under-Canopy Navigation of Agricultural Robotic Vehicle. Comput. Electron. Agric. 2021, 188, 106301. [Google Scholar] [CrossRef]
- Gongal, A.; Karkee, M.; Amatya, S. Apple Fruit Size Estimation Using a 3D Machine Vision System. Inf. Process. Agric. 2018, 5, 498–503. [Google Scholar] [CrossRef]
- Vázquez-Arellano, M.; Paraforos, D.S.; Reiser, D.; Garrido-Izard, M.; Griepentrog, H.W. Determination of Stem Position and Height of Reconstructed Maize Plants Using a Time-of-Flight Camera. Comput. Electron. Agric. 2018, 154, 276–288. [Google Scholar] [CrossRef]
- Hämmerle, M.; Höfle, B. Direct Derivation of Maize Plant and Crop Height from Low-Cost Time-of-Flight Camera Measurements. Plant Methods 2016, 12, 50. [Google Scholar] [CrossRef]
- Vázquez-Arellano, M.; Reiser, D.; Paraforos, D.S.; Garrido-Izard, M.; Burce, M.E.C.; Griepentrog, H.W. 3-D Reconstruction of Maize Plants Using a Time-of-Flight Camera. Comput. Electron. Agric. 2018, 145, 235–247. [Google Scholar] [CrossRef]
- Li, J.; Tang, L. Developing a Low-Cost 3D Plant Morphological Traits Characterization System. Comput. Electron. Agric. 2017, 143, 1–13. [Google Scholar] [CrossRef]
- Pamornnak, B.; Limsiroratana, S.; Khaorapapong, T.; Chongcheawchamnan, M.; Ruckelshausen, A. An Automatic and Rapid System for Grading Palm Bunch Using a Kinect Camera. Comput. Electron. Agric. 2017, 143, 227–237. [Google Scholar] [CrossRef]
- Cao, Q.; Yang, G.; Duan, D.; Chen, L.; Wang, F.; Xu, B.; Zhao, C.; Niu, F. Combining Multispectral and Hyperspectral Data to Estimate Nitrogen Status of Tea Plants (Camellia sinensis (L.) O. Kuntze) under Field Conditions. Comput. Electron. Agric. 2022, 198, 107084. [Google Scholar] [CrossRef]
- Chandel, A.K.; Khot, L.R.; Yu, L.X. Alfalfa (Medicago sativa L.) Crop Vigor and Yield Characterization Using High-Resolution Aerial Multispectral and Thermal Infrared Imaging Technique. Comput. Electron. Agric. 2021, 182, 105999. [Google Scholar] [CrossRef]
- Abbas, A.; Jain, S.; Gour, M.; Vankudothu, S. Tomato Plant Disease Detection Using Transfer Learning with C-GAN Synthetic Images. Comput. Electron. Agric. 2021, 187, 106279. [Google Scholar] [CrossRef]
- Xiao, D.; Zeng, R.; Liu, Y.; Huang, Y.; Liu, J.; Feng, J.; Zhang, X. Citrus Greening Disease Recognition Algorithm Based on Classification Network Using TRL-GAN. Comput. Electron. Agric. 2022, 200, 107206. [Google Scholar] [CrossRef]
- Zhang, L.; Nie, Q.; Ji, H.; Wang, Y.; Wei, Y.; An, D. Hyperspectral Imaging Combined with Generative Adversarial Network (GAN)-Based Data Augmentation to Identify Haploid Maize Kernels. J. Food Compos. Anal. 2022, 106, 104346. [Google Scholar] [CrossRef]
- Mazzia, V.; Khaliq, A.; Salvetti, F.; Chiaberge, M. Real-Time Apple Detection System Using Embedded Systems With Hardware Accelerators: An Edge AI Application. IEEE Access 2020, 8, 9102–9114. [Google Scholar] [CrossRef]
- Zhang, Y.; Yu, J.; Chen, Y.; Yang, W.; Zhang, W.; He, Y. Real-Time Strawberry Detection Using Deep Neural Networks on Embedded System (Rtsd-Net): An Edge AI Application. Comput. Electron. Agric. 2022, 192, 106586. [Google Scholar] [CrossRef]
- Codeluppi, G.; Davoli, L.; Ferrari, G. Forecasting Air Temperature on Edge Devices with Embedded AI. Sensors 2021, 21, 3973. [Google Scholar] [CrossRef]
- Coppola, M.; Noaille, L.; Pierlot, C.; de Oliveira, R.O.; Gaveau, N.; Rondeau, M.; Mohimont, L.; Steffenel, L.A.; Sindaco, S.; Salmon, T. Innovative Vineyards Environmental Monitoring System Using Deep Edge AI. Artif. Intell. Digit. Ind.–Appl. 2022, 261–278. [Google Scholar] [CrossRef]
- Aghi, D.; Cerrato, S.; Mazzia, V.; Chiaberge, M. Deep Semantic Segmentation at the Edge for Autonomous Navigation in Vineyard Rows. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021; pp. 3421–3428. [Google Scholar]
- Deng, F.; Zuo, P.; Wen, K.; Wu, X. Novel Soil Environment Monitoring System Based on RFID Sensor and LoRa. Comput. Electron. Agric. 2020, 169, 105169. [Google Scholar] [CrossRef]
- Luvisi, A.; Panattoni, A.; Materazzi, A. RFID Temperature Sensors for Monitoring Soil Solarization with Biodegradable Films. Comput. Electron. Agric. 2016, 123, 135–141. [Google Scholar] [CrossRef]
- Vellidis, G.; Tucker, M.; Perry, C.; Kvien, C.; Bednarz, C. A Real-Time Wireless Smart Sensor Array for Scheduling Irrigation. Comput. Electron. Agric. 2008, 61, 44–50. [Google Scholar] [CrossRef]
- Dey, S.; Bhattacharyya, R.; Karmakar, N.; Sarma, S. A Folded Monopole Shaped Novel Soil Moisture and Salinity Sensor for Precision Agriculture Based Chipless RFID Applications. In Proceedings of the 2019 IEEE MTT-S International Microwave and RF Conference (IMARC), Mumbai, India, 13–15 December 2019. [Google Scholar] [CrossRef]
- Wang, J.; Chang, L.; Aggarwal, S.; Abari, O.; Keshav, S. Soil Moisture Sensing with Commodity RFID Systems. In Proceedings of the MobiSys’20: The 18th Annual International Conference on Mobile Systems, Applications, and Services, Toronto, ON, Canada, 15–19 June 2020; Volume 13. [Google Scholar] [CrossRef]
- Aroca, R.V.; Hernandes, A.C.; Magalhães, D.V.; Becker, M.; Vaz, C.M.P.; Calbo, A.G. Calibration of Passive UHF RFID Tags Using Neural Networks to Measure Soil Moisture. J. Sens. 2018, 2018, 3436503. [Google Scholar] [CrossRef]
- Hasan, A.; Bhattacharyya, R.; Sarma, S. Towards Pervasive Soil Moisture Sensing Using RFID Tag Antenna-Based Sensors. In Proceedings of the 2015 IEEE International Conference on RFID Technology and Applications (RFID-TA), Tokyo, Japan, 16–18 September 2015; pp. 165–170. [Google Scholar]
- Yong, W.; Shuaishuai, L.; Li, L.; Minzan, L.; Ming, L.; Arvanitis, K.; Georgieva, C.; Sigrimis, N. Smart Sensors from Ground to Cloud and Web Intelligence. IFAC-Pap. 2018, 51, 31–38. [Google Scholar] [CrossRef]
- Barge, P.; Gay, P.; Piccarolo, P.; Tortia, C. RFID Tracking of Potted Plants from Nursery to Distribution. In Proceedings of the International Conference Ragusa SHWA2010, Ragusa, Italy, 16–18 September 2010. [Google Scholar]
- Sugahara, K. Traceability System for Agricultural Products Based on RFID and Mobile Technology. IFIP Adv. Inf. Commun. Technol. 2009, 295, 2293–2301. [Google Scholar] [CrossRef]
- Voulodimos, A.S.; Patrikakis, C.Z.; Sideridis, A.B.; Ntafis, V.A.; Xylouri, E.M. A Complete Farm Management System Based on Animal Identification Using RFID Technology. Comput. Electron. Agric. 2010, 70, 380–388. [Google Scholar] [CrossRef]
- Weiss, U.; Biber, P. Plant Detection and Mapping for Agricultural Robots Using a 3D LIDAR Sensor. Robot. Auton. Syst. 2011, 59, 265–273. [Google Scholar] [CrossRef]
- Ge, Y.; Xiong, Y.; From, P.J. Symmetry-Based 3D Shape Completion for Fruit Localisation for Harvesting Robots. Biosyst. Eng. 2020, 197, 188–202. [Google Scholar] [CrossRef]
- Skoczeń, M.; Ochman, M.; Spyra, K.; Nikodem, M.; Krata, D.; Panek, M.; Pawłowski, A. Obstacle Detection System for Agricultural Mobile Robot Application Using RGB-D Cameras. Sensors 2021, 21, 5292. [Google Scholar] [CrossRef]
- Ji, W.; Gao, X.; Xu, B.; Chen, G.Y.; Zhao, D. Target Recognition Method of Green Pepper Harvesting Robot Based on Manifold Ranking. Comput. Electron. Agric. 2020, 177, 105663. [Google Scholar] [CrossRef]
- Gai, R.; Chen, N.; Yuan, H. A Detection Algorithm for Cherry Fruits Based on the Improved YOLO-v4 Model. Neural Comput. Appl. 2021, 1–12. [Google Scholar] [CrossRef]
- Jia, W.; Tian, Y.; Luo, R.; Zhang, Z.; Lian, J.; Zheng, Y. Detection and Segmentation of Overlapped Fruits Based on Optimized Mask R-CNN Application in Apple Harvesting Robot. Comput. Electron. Agric. 2020, 172, 105380. [Google Scholar] [CrossRef]
Crops | Nursery Types | Specifications | Performance | References |
---|---|---|---|---|
Multiple species of nursery plants | Container grown | Genetic algorithm for optimized path planning | Reduced water consumption; the optimal path for watering | Zhang et al. [87] |
Five different plant species | Container grown | Integrated knife and rotary base for trimming | Overall performance was more than 93%; time: 8.89 s | Li et al. [86] |
Unspecified | Field grown | Algorithm: Support Vector Machine (SVM) | High accuracy for disease identification and growth monitoring | Sharma and Borse [88] |
Unspecified | Field grown | Cable-driven manipulator; pre-trained VGG16 for vision system | Weed detection accuracy of 96.29%; accurate trajectory planning in simulation | Prabha et al. [89] |
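The watering-robot entry above optimizes the visiting order of containers with a genetic algorithm so that travel distance, and with it water and energy use, is minimized. The sketch below is a bare-bones genetic algorithm for that route-ordering problem; the container coordinates, population size, and operators are illustrative choices, not the parameters reported by Zhang et al. [87].

```python
import math
import random

# Hypothetical container positions (x, y) in meters on a nursery bed.
containers = [(random.uniform(0, 50), random.uniform(0, 20)) for _ in range(30)]

def route_length(order):
    """Total Euclidean travel distance for visiting containers in the given order."""
    return sum(math.dist(containers[order[i]], containers[order[i + 1]])
               for i in range(len(order) - 1))

def crossover(a, b):
    """Order crossover: copy a slice of parent a, fill the rest in parent b's order."""
    i, j = sorted(random.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:j] = a[i:j]
    fill = [c for c in b if c not in child]
    k = 0
    for idx in range(len(child)):
        if child[idx] is None:
            child[idx] = fill[k]
            k += 1
    return child

def mutate(order, rate=0.05):
    """Swap mutation: exchange two positions with a small probability."""
    order = order[:]
    for i in range(len(order)):
        if random.random() < rate:
            j = random.randrange(len(order))
            order[i], order[j] = order[j], order[i]
    return order

# Generational GA with elitism and selection from the better half of the population.
population = [random.sample(range(len(containers)), len(containers)) for _ in range(60)]
for generation in range(200):
    population.sort(key=route_length)
    next_gen = population[:10]                      # keep the 10 best routes unchanged
    while len(next_gen) < len(population):
        p1, p2 = random.sample(population[:30], 2)
        next_gen.append(mutate(crossover(p1, p2)))
    population = next_gen

best = min(population, key=route_length)
print("best route length (m):", round(route_length(best), 1))
```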
Sensor Types | Advantages | Disadvantages |
---|---|---|
Image sensors (RGB camera, multispectral, hyperspectral, etc.) | | |
Range sensors (LiDAR, ultrasonic, etc.) | | |
Infrared sensors (temperature sensors) | | |
Volumetric sensors (soil moisture sensors) | | |