Human–Robot Interaction in Agriculture: A Systematic Review
Abstract
1. Introduction
1.1. Background
1.2. The General Context of Human–Robot Interaction in Agriculture
1.2.1. Human–Robot Interaction Definition
1.2.2. Main Design Concepts
1.2.3. Communication Frameworks
1.2.4. Safety and Human Factors
1.2.5. Human–Robot Interaction Evaluation and Metrics
1.2.6. Aim and Structure of the Paper
2. Materials and Methods
2.1. Critical Steps in Performing the Systematic Review
- (1) Formulation of a primary research question: “What is the state of the art and what are future perspectives in HRI in agriculture?”
- (2) Development of a research protocol: The methodology followed for screening the relevant literature and for data extraction and analysis was set out in a written document, which was accepted by all authors prior to the start of the literature search to minimize bias.
- (3) Literature search: The methodology for selecting the relevant studies is described in Section 2.2, along with the electronic databases used, the inclusion criteria, and the review stages based on the PRISMA guidelines [44].
- (4) Data extraction: Specific items regarding references (including journal, title, and authors), objective, method, crop type, interaction modes, automation levels, and key outcomes were gathered in an online shared spreadsheet.
- (5) Quality appraisal of the selected studies: Although quality remains a challenging concept to define, the present study used the tool developed by Hoy et al. [47] (described in Section 2.3), which comprises specific internal and external validity criteria.
- (6) Data analysis and results: This procedure began with a simple descriptive assessment of each study, presented in tabular form, followed by a statistical analysis.
- (7) Interpretation of results: Conclusions were drawn from the available scientific evidence, and areas for future research were identified.
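As a loose illustration, the screening and extraction steps above can be sketched in a few lines of Python; the `Record` fields, the inclusion flags, and the sample entries are all hypothetical and stand in for the authors' actual protocol and data.

```python
from dataclasses import dataclass

@dataclass
class Record:
    """One candidate study pulled from an electronic database (illustrative fields)."""
    title: str
    journal: str
    year: int
    mentions_hri: bool        # assumption: keyword-based inclusion flag
    is_agricultural: bool     # assumption: domain-relevance flag set during screening

def screen(records):
    """PRISMA-style screening: keep only records meeting all inclusion criteria."""
    return [r for r in records if r.mentions_hri and r.is_agricultural]

def extract(record):
    """Data extraction (step 4): gather items into a row for the shared spreadsheet."""
    return {"reference": f"{record.title} ({record.journal}, {record.year})",
            "year": record.year}

candidates = [
    Record("HRI sprayer study", "Appl. Ergon.", 2017, True, True),
    Record("Industrial cobot survey", "Sensors", 2021, True, False),
]
included = screen(candidates)          # only the agricultural HRI study survives
rows = [extract(r) for r in included]  # one spreadsheet row per included study
```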
2.2. Literature Search
2.3. Methodological Quality Assessment
- High (++), indicating low risk of bias;
- Acceptable (+), indicating moderate risk of bias;
- Low (−), indicating high risk of bias.
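A minimal sketch of how such a three-level rating could be computed from the per-item answers; the numeric cut-offs are illustrative assumptions, since this excerpt does not state how the item answers of the Hoy et al. tool map to the overall rating.

```python
def overall_quality(item_ratings):
    """Summarize the per-item validity answers into an overall rating.

    item_ratings: list of 'Y' (criterion met), 'N' (not met), 'C' (cannot say).
    The cut-offs below are illustrative assumptions, not taken from the paper.
    """
    unmet = sum(1 for a in item_ratings if a == 'N')
    if unmet <= 2:
        return '++'  # high quality: low risk of bias
    if unmet <= 4:
        return '+'   # acceptable: moderate risk of bias
    return '-'       # low quality: high risk of bias

# e.g. a study with a single unmet criterion out of ten scores '++'
rating = overall_quality(['Y', 'N'] + ['Y'] * 8)
```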
2.4. Classification of Modes of Human and Robot Working Together
- Isolation mode, where HRI is never permitted and physical barriers are normally used;
- Coexistence mode, which is similar to isolation but without barriers;
- Synchronization mode, where robot and human perform different tasks in a synchronized manner and work in different working areas;
- Cooperation mode, where robot and human again perform different tasks, but work in the same working area;
- Collaboration mode, where robot and human perform the same task in the same working area.
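The five modes can be sketched as a small decision function over the criteria used in the definitions above (barriers, shared working area, shared task, synchronization); the precedence of the checks is an illustrative assumption.

```python
def interaction_mode(barriers: bool, shared_area: bool, same_task: bool,
                     synchronized: bool = False) -> str:
    """Classify a human-robot working arrangement into one of the five modes.

    The boolean criteria paraphrase the definitions of Section 2.4; the order
    of the checks is an illustrative assumption, not taken from the paper.
    """
    if barriers:
        return "isolation"        # HRI never permitted, barriers in place
    if shared_area and same_task:
        return "collaboration"    # same task, same working area
    if shared_area:
        return "cooperation"      # different tasks, same working area
    if synchronized:
        return "synchronization"  # different tasks, different working areas
    return "coexistence"          # like isolation, but without barriers
```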
2.5. Assessment of the Level of Automation during Decision and Action Stage
3. Results
3.1. Preliminary Data Visualization Analysis
3.1.1. Time Distribution
3.1.2. Distribution of the Contributing International Journals, Conferences, and Disciplines
3.2. Methodological Quality of the Reviewed Studies
3.3. Brief Review of the Relevant Literature
- Humans alone detect and mark the targets, and HRI is never permitted. This corresponds to both level 1 on Sheridan’s scale and the isolation mode;
- Robots recommend the targets, and humans approve and mark them. In particular, the targets are automatically identified by a detection algorithm; humans then confirm the algorithm’s true detections, ignore the false ones, and mark any missed targets. This interaction corresponds to levels 3–4 on Sheridan’s scale, as mentioned in these studies. In addition, following the analysis described in Section 2.4, it is classified as collaboration, since robot and human focus on the same task;
- The targets are automatically detected by the corresponding machine learning algorithm, with the human’s role being to cancel false detections and, as at the previous level, mark any missed targets. This type of synergy is equivalent to levels 5–7 on the Sheridan scale and, again, is classified as collaboration;
- Purely autonomous marking of targets takes place, with no human intervention permitted. As with the first type of synergy, no HRI exists; this represents the highest level of automation on the Sheridan scale, namely 10.
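The four synergy types above amount to a lookup from who marks the targets to the corresponding Sheridan levels and Section 2.4 classification. The dictionary below merely restates the list; the key names are hypothetical labels, not terminology from the reviewed studies.

```python
# The four target-marking synergy types: Sheridan automation levels and the
# Section 2.4 interaction classification. Key names are illustrative labels.
SYNERGY_TYPES = {
    "human_only":       {"sheridan": (1, 1),   "mode": "isolation"},
    "robot_recommends": {"sheridan": (3, 4),   "mode": "collaboration"},
    "human_vetoes":     {"sheridan": (5, 7),   "mode": "collaboration"},
    "autonomous":       {"sheridan": (10, 10), "mode": "isolation (no HRI)"},
}

def describe(synergy: str) -> str:
    """Render one synergy type as in the text, e.g. 'levels 3-4, collaboration'."""
    info = SYNERGY_TYPES[synergy]
    lo, hi = info["sheridan"]
    levels = str(lo) if lo == hi else f"{lo}-{hi}"
    return f"Sheridan level(s) {levels}, {info['mode']}"
```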
4. Discussion and Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Guenat, S.; Purnell, P.; Davies, Z.G.; Nawrath, M.; Stringer, L.C.; Babu, G.R.; Balasubramanian, M.; Ballantyne, E.E.F.; Bylappa, B.K.; Chen, B.; et al. Meeting sustainable development goals via robotics and autonomous systems. Nat. Commun. 2022, 13, 3559. [Google Scholar] [CrossRef] [PubMed]
- IFR Press Room Robots Help Reaching UN Goals of Sustainable Development, International Federation of Robotics Reports. Available online: https://ifr.org/ifr-press-releases/news/robots-help-reaching-un-sdgs (accessed on 1 June 2023).
- Pearson, S.; Camacho-Villa, T.C.; Valluru, R.; Gaju, O.; Rai, M.C.; Gould, I.; Brewer, S.; Sklar, E. Robotics and Autonomous Systems for Net Zero Agriculture. Curr. Robot. Rep. 2022, 3, 57–64. [Google Scholar] [CrossRef]
- Lampridi, M.; Sørensen, C.; Bochtis, D. Agricultural Sustainability: A Review of Concepts and Methods. Sustainability 2019, 11, 5120. [Google Scholar] [CrossRef] [Green Version]
- Benos, L.; Makaritis, N.; Kolorizos, V. From Precision Agriculture to Agriculture 4.0: Integrating ICT in Farming—Information and Communication Technologies for Agriculture—Theme III: Decision; Bochtis, D.D., Sørensen, C.G., Fountas, S., Moysiadis, V., Pardalos, P.M., Eds.; Springer International Publishing: Cham, Switzerland, 2022; pp. 79–93. ISBN 978-3-030-84152-2. [Google Scholar]
- Toriyama, K. Development of precision agriculture and ICT application thereof to manage spatial variability of crop growth. Soil Sci. Plant Nutr. 2020, 66, 811–819. [Google Scholar] [CrossRef]
- Lampridi, M.G.; Kateris, D.; Vasileiadis, G.; Marinoudi, V.; Pearson, S.; Sørensen, C.G.; Balafoutis, A.; Bochtis, D. A Case-Based Economic Assessment of Robotics Employment in Precision Arable Farming. Agronomy 2019, 9, 175. [Google Scholar] [CrossRef] [Green Version]
- Marinoudi, V.; Sørensen, C.G.; Pearson, S.; Bochtis, D. Robotics and labour in agriculture. A context consideration. Biosyst. Eng. 2019, 184, 111–121. [Google Scholar] [CrossRef]
- Terazono, E. Farm Robots Given COVID-19 Boost. Available online: https://www.ft.com/content/0b394693-137b-40a4-992b-0b742202e4e1 (accessed on 22 September 2022).
- Bochtis, D.; Benos, L.; Lampridi, M.; Marinoudi, V.; Pearson, S.; Sørensen, C.G. Agricultural workforce crisis in light of the COVID-19 pandemic. Sustainability 2020, 12, 8212. [Google Scholar] [CrossRef]
- Moysiadis, V.; Tsolakis, N.; Katikaridis, D.; Sørensen, C.G.; Pearson, S.; Bochtis, D. Mobile Robotics in Agricultural Operations: A Narrative Review on Planning Aspects. Appl. Sci. 2020, 10, 3453. [Google Scholar] [CrossRef]
- Benos, L.; Sørensen, C.G.; Bochtis, D. Field Deployment of Robotic Systems for Agriculture in Light of Key Safety, Labor, Ethics and Legislation Issues. Curr. Robot. Rep. 2022, 3, 49–56. [Google Scholar] [CrossRef]
- Oliveira, L.F.P.; Moreira, A.P.; Silva, M.F. Advances in Agriculture Robotics: A State-of-the-Art Review and Challenges Ahead. Robotics 2021, 10, 52. [Google Scholar] [CrossRef]
- Droukas, L.; Doulgeri, Z.; Tsakiridis, N.L.; Triantafyllou, D.; Kleitsiotis, I.; Mariolis, I.; Giakoumis, D.; Tzovaras, D.; Kateris, D.; Bochtis, D. A Survey of Robotic Harvesting Systems and Enabling Technologies. arXiv 2022, arXiv:2207.10457. [Google Scholar] [CrossRef] [PubMed]
- Van Wynsberghe, A.; Ley, M.; Roeser, S. Ethical Aspects of Human–Robot Collaboration in Industrial Work Settings BT—The 21st Century Industrial Robot: When Tools Become Collaborators; Aldinhas Ferreira, M.I., Fletcher, S.R., Eds.; Springer International Publishing: Cham, Switzerland, 2022; pp. 255–266. ISBN 978-3-030-78513-0. [Google Scholar]
- Bechar, A.; Vigneault, C. Agricultural robots for field operations: Concepts and components. Biosyst. Eng. 2016, 149, 94–111. [Google Scholar] [CrossRef]
- Marinoudi, V.; Lampridi, M.; Kateris, D.; Pearson, S.; Sørensen, C.G.; Bochtis, D. The Future of Agricultural Jobs in View of Robotization. Sustainability 2021, 13, 12109. [Google Scholar] [CrossRef]
- Kruse, T.; Pandey, A.K.; Alami, R.; Kirsch, A. Human-aware robot navigation: A survey. Rob. Auton. Syst. 2013, 61, 1726–1743. [Google Scholar] [CrossRef] [Green Version]
- Tsarouchi, P.; Makris, S.; Chryssolouris, G. Human–robot interaction review and challenges on task planning and programming. Int. J. Comput. Integr. Manuf. 2016, 29, 916–931. [Google Scholar] [CrossRef]
- Adamides, G.; Katsanos, C.; Parmet, Y.; Christou, G.; Xenos, M.; Hadzilacos, T.; Edan, Y. HRI usability evaluation of interaction modes for a teleoperated agricultural robotic sprayer. Appl. Ergon. 2017, 62, 237–246. [Google Scholar] [CrossRef] [PubMed]
- Benos, L.; Bechar, A.; Bochtis, D. Safety and ergonomics in human-robot interactive agricultural operations. Biosyst. Eng. 2020, 200, 55–72. [Google Scholar] [CrossRef]
- Hopko, S.; Wang, J.; Mehta, R. Human Factors Considerations and Metrics in Shared Space Human-Robot Collaboration: A Systematic Review. Front. Robot. AI 2022, 9, 799522. [Google Scholar] [CrossRef]
- Vasconez, J.P.; Kantor, G.A.; Auat Cheein, F.A. Human–robot interaction in agriculture: A survey and current challenges. Biosyst. Eng. 2019, 179, 35–48. [Google Scholar] [CrossRef]
- Moysiadis, V.; Katikaridis, D.; Benos, L.; Busato, P.; Anagnostis, A.; Kateris, D.; Pearson, S.; Bochtis, D. An Integrated Real-Time Hand Gesture Recognition Framework for Human–Robot Interaction in Agriculture. Appl. Sci. 2022, 12, 8160. [Google Scholar] [CrossRef]
- Lu, D.; Yu, Y.; Liu, H. Gesture recognition using data glove: An extreme learning machine method. In Proceedings of the 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO), Qingdao, China, 3–7 December 2016; pp. 1349–1354. [Google Scholar]
- Jaramillo-Yánez, A.; Benalcázar, M.E.; Mena-Maldonado, E. Real-Time Hand Gesture Recognition Using Surface Electromyography and Machine Learning: A Systematic Literature Review. Sensors 2020, 20, 2467. [Google Scholar] [CrossRef]
- Ceolini, E.; Frenkel, C.; Shrestha, S.B.; Taverni, G.; Khacef, L.; Payvand, M.; Donati, E. Hand-Gesture Recognition Based on EMG and Event-Based Camera Sensor Fusion: A Benchmark in Neuromorphic Computing. Front. Neurosci. 2020, 14, 637. [Google Scholar] [CrossRef] [PubMed]
- Tran, D.-S.; Ho, N.-H.; Yang, H.-J.; Baek, E.-T.; Kim, S.-H.; Lee, G. Real-Time Hand Gesture Spotting and Recognition Using RGB-D Camera and 3D Convolutional Neural Network. Appl. Sci. 2020, 10, 722. [Google Scholar] [CrossRef] [Green Version]
- Oudah, M.; Al-Naji, A.; Chahl, J. Hand Gesture Recognition Based on Computer Vision: A Review of Techniques. J. Imaging 2020, 6, 73. [Google Scholar] [CrossRef]
- Su, H.; Ovur, S.E.; Zhou, X.; Qi, W.; Ferrigno, G.; De Momi, E. Depth vision guided hand gesture recognition using electromyographic signals. Adv. Robot. 2020, 34, 985–997. [Google Scholar] [CrossRef]
- Vasconez, J.P.; Guevara, L.; Cheein, F.A. Social robot navigation based on HRI non-verbal communication: A case study on avocado harvesting. In Proceedings of the ACM Symposium on Applied Computing, Limassol, Cyprus, 8–12 April 2019; Association for Computing Machinery: New York, NY, USA, 2019; Volume Part F147772, pp. 957–960. [Google Scholar]
- Jin, B.; Cruz, L.; Gonçalves, N. Face Depth Prediction by the Scene Depth. In Proceedings of the 2021 IEEE/ACIS 19th International Conference on Computer and Information Science (ICIS), Shanghai, China, 23–25 June 2021; pp. 42–48. [Google Scholar]
- Benos, L.; Bochtis, D.D. An Analysis of Safety and Health Issues in Agriculture Towards Work Automation BT—Information and Communication Technologies for Agriculture—Theme IV: Actions; Bochtis, D.D., Pearson, S., Lampridi, M., Marinoudi, V., Pardalos, P.M., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 95–117. ISBN 978-3-030-84156-0. [Google Scholar]
- Benos, L.; Tsaopoulos, D.; Bochtis, D. A Review on Ergonomics in Agriculture. Part I: Manual Operations. Appl. Sci. 2020, 10, 1905. [Google Scholar] [CrossRef] [Green Version]
- Akalin, N.; Kristoffersson, A.; Loutfi, A. Do you feel safe with your robot? Factors influencing perceived safety in human-robot interaction based on subjective and objective measures. Int. J. Hum. Comput. Stud. 2022, 158, 102744. [Google Scholar] [CrossRef]
- Rubagotti, M.; Tusseyeva, I.; Baltabayeva, S.; Summers, D.; Sandygulova, A. Perceived safety in physical human–robot interaction—A survey. Rob. Auton. Syst. 2022, 151, 104047. [Google Scholar] [CrossRef]
- Hoffman, G. Evaluating Fluency in Human–Robot Collaboration. IEEE Trans. Hum. Mach. Syst. 2019, 49, 209–218. [Google Scholar] [CrossRef]
- Castro, A.; Silva, F.; Santos, V. Trends of Human-Robot Collaboration in Industry Contexts: Handover, Learning, and Metrics. Sensors 2021, 21, 4113. [Google Scholar] [CrossRef]
- Mizanoor Rahman, S.M. Performance Metrics for Human-Robot Collaboration: An Automotive Manufacturing Case. In Proceedings of the 2021 IEEE International Workshop on Metrology for Automotive (MetroAutomotive), Bologna, Italy, 1–2 July 2021; pp. 260–265. [Google Scholar]
- Murphy, R.R.; Schreckenghost, D. Survey of metrics for human-robot interaction. In Proceedings of the 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Tokyo, Japan, 3–6 March 2013; pp. 197–198. [Google Scholar]
- Steinfeld, A.; Fong, T.; Kaber, D.; Lewis, M.; Scholtz, J.; Schultz, A.; Goodrich, M. Common Metrics for Human-Robot Interaction. In Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction; Association for Computing Machinery, New York, NY, USA, 2–3 March 2006; pp. 33–40. [Google Scholar]
- Pina, P.; Cummings, M.; Crandall, J.; Della Penna, M. Identifying generalizable metric classes to evaluate human-robot teams. In Proceedings of the 3rd Annual Conference on Human-Robot Interaction, Amsterdam, The Netherlands, 12–15 March 2008; pp. 13–20. [Google Scholar]
- Lasota, P.A.; Shah, J.A. Analyzing the Effects of Human-Aware Motion Planning on Close-Proximity Human–Robot Collaboration. Hum. Factors 2015, 57, 21–33. [Google Scholar] [CrossRef] [Green Version]
- Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
- Wright, R.W.; Brand, R.A.; Dunn, W.; Spindler, K.P. How to Write a Systematic Review. Clin. Orthop. Relat. Res. 2007, 455, 23–29. [Google Scholar] [CrossRef] [Green Version]
- Lee, C.-L.; Strong, R.; Dooley, K.E. Analyzing Precision Agriculture Adoption across the Globe: A Systematic Review of Scholarship from 1999–2020. Sustainability 2021, 13, 10295. [Google Scholar] [CrossRef]
- Hoy, D.; Brooks, P.; Woolf, A.; Blyth, F.; March, L.; Bain, C.; Baker, P.; Smith, E.; Buchbinder, R. Assessing risk of bias in prevalence studies: Modification of an existing tool and evidence of interrater agreement. J. Clin. Epidemiol. 2012, 65, 934–939. [Google Scholar] [CrossRef]
- Xie, Y.; Szeto, G.; Dai, J. Prevalence and risk factors associated with musculoskeletal complaints among users of mobile handheld devices: A systematic review. Appl. Ergon. 2017, 59, 132–142. [Google Scholar] [CrossRef]
- Benos, L.; Tsaopoulos, D.; Bochtis, D. A Review on Ergonomics in Agriculture. Part II: Mechanized Operations. Appl. Sci. 2020, 10, 3484. [Google Scholar] [CrossRef]
- Matheson, E.; Minto, R.; Zampieri, E.G.G.; Faccio, M.; Rosati, G. Human–Robot Collaboration in Manufacturing Applications: A Review. Robotics 2019, 8, 100. [Google Scholar] [CrossRef] [Green Version]
- Parasuraman, R.; Sheridan, T.B.; Wickens, C.D. A model for types and levels of human interaction with automation. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 2000, 30, 286–297. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Bechar, A.; Edan, Y. Human-robot collaboration for improved target recognition of agricultural robots. Ind. Rob. 2003, 30, 432–436. [Google Scholar] [CrossRef]
- Oren, Y.; Bechar, A.; Edan, Y. Performance analysis of a human-Robot collaborative target recognition system. Robotica 2012, 30, 813–826. [Google Scholar] [CrossRef]
- Vásconez, J.P.; Auat Cheein, F.A. Workload and production assessment in the avocado harvesting process using human-robot collaborative strategies. Biosyst. Eng. 2022, 223, 56–77. [Google Scholar] [CrossRef]
- Bechar, A.; Meyer, J.; Edan, Y. An objective function to evaluate performance of human-robot systems for target recognition tasks. In Proceedings of the 2007 IEEE International Conference on Systems, Man and Cybernetics, Montreal, QC, Canada, 7–10 October 2007; pp. 967–972. [Google Scholar]
- Bechar, A.; Meyer, J.; Edan, Y. An objective function to evaluate performance of human-robot collaboration in target recognition tasks. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2009, 39, 611–620. [Google Scholar] [CrossRef]
- Tkach, I.; Edan, Y.; Bechar, A. Algorithms for dynamic switching of collaborative human-robot system in target recognition tasks. IFAC Proc. Vol. 2009, 42, 2179–2184. [Google Scholar] [CrossRef]
- Tkach, I.; Bechar, A.; Edan, Y. Switching between collaboration levels in a human-robot target recognition system. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 2011, 41, 955–967. [Google Scholar] [CrossRef]
- Berenstein, R.; Edan, Y. Human-robot cooperative precision spraying: Collaboration levels and optimization function. In IFAC Proceedings Volumes (IFAC-PapersOnline); IFAC Secretariat: Dubrovnik, Croatia, 2012; Volume 45, pp. 799–804. [Google Scholar]
- Adamides, G.; Christou, G.; Katsanos, C.; Xenos, M.; Hadzilacos, T. Usability guidelines for the design of robot teleoperation: A taxonomy. IEEE Trans. Hum. Mach. Syst. 2015, 45, 256–262. [Google Scholar] [CrossRef]
- Cheein, F.A.; Herrera, D.; Gimenez, J.; Carelli, R.; Torres-Torriti, M.; Rosell-Polo, J.R.; Escola, A.; Arno, J. Human-robot interaction in precision agriculture: Sharing the workspace with service units. In Proceedings of the IEEE International Conference on Industrial Technology, Seville, Spain, 17–19 March 2015; Volume 2015, pp. 289–295. [Google Scholar]
- Adamides, G.; Katsanos, C.; Constantinou, I.; Christou, G.; Xenos, M.; Hadzilacos, T.; Edan, Y. Design and development of a semi-autonomous agricultural vineyard sprayer: Human-robot interaction aspects. J. Field Robot. 2017, 34, 1407–1426. [Google Scholar] [CrossRef]
- Berenstein, R.; Edan, Y. Human-robot collaborative site-specific sprayer. J. Field Robot. 2017, 34, 1519–1530. [Google Scholar] [CrossRef]
- Montesdeoca, J.C.; Toibero, M.; Carelli, R. Person-following based on social navigation into the sensorized environments. In Proceedings of the 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO), Macao, China, 5–8 December 2017; pp. 799–803. [Google Scholar]
- Guo, P.; Dusadeerungsikul, P.O.; Nof, S.Y. Agricultural cyber physical system collaboration for greenhouse stress management. Comput. Electron. Agric. 2018, 150, 439–454. [Google Scholar] [CrossRef]
- Baxter, P.; Cielniak, G.; Hanheide, M.; From, P.J. Safe Human-Robot Interaction in Agriculture. In Proceedings of the ACM/IEEE International Conference, New York, NY, USA, 5–8 March 2018. [Google Scholar]
- Dusadeerungsikul, P.O.; Nof, S.Y. A collaborative control protocol for agricultural robot routing with online adaptation. Comput. Ind. Eng. 2019, 135, 456–466. [Google Scholar] [CrossRef]
- Huuskonen, J.; Oksanen, T. Augmented Reality for Supervising Multirobot System in Agricultural Field Operation. IFAC-PapersOnLine 2019, 52, 367–372. [Google Scholar] [CrossRef]
- Rysz, M.; Ganesh, P.; Burks, T.F.; Mehta, S.S. Risk-averse Optimization for Improving Harvesting Efficiency of Autonomous Systems through Human Collaboration. IFAC-PapersOnLine 2019, 52, 207–212. [Google Scholar] [CrossRef]
- Dusadeerungsikul, P.O.; Nof, S.; Bechar, A.; Tao, Y. Collaborative Control Protocol for Agricultural Cyber-Physical System. In Proceedings of the 25th International Conference on Production Research Manufacturing Innovation: Cyber Physical Manufacturing, Chicago, IL, USA, 9–14 August 2019. [Google Scholar]
- Seyyedhasani, H.; Peng, C.; Jang, W.; Vougioukas, S.G. Collaboration of human pickers and crop-transporting robots during harvesting—Part I: Model and simulator development. Comput. Electron. Agric. 2020, 172, 105324. [Google Scholar] [CrossRef]
- Seyyedhasani, H.; Peng, C.; Jang, W.; Vougioukas, S.G. Collaboration of human pickers and crop-transporting robots during harvesting—Part II: Simulator evaluation and robot-scheduling case-study. Comput. Electron. Agric. 2020, 172, 105323. [Google Scholar] [CrossRef]
- Huang, Z.; Miyauchi, G.; Gomez, S.A.; Bird, R.; Amar, S.K.; Jansen, C.; Liu, Z.; Parsons, S.; Sklar, E. Toward robot co-labourers for intelligent farming. In Proceedings of the Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020. [Google Scholar]
- Lai, Y.-L.; Chen, P.-L.; Yen, P.-L. A Human-Robot Cooperative Vehicle for Tea Plucking. In Proceedings of the 2020 7th International Conference on Control, Decision and Information Technologies (CoDIT), Prague, Czech Republic, 29 June 2020–2 July 2020; Volume 1, pp. 217–222. [Google Scholar]
- Anagnostis, A.; Benos, L.; Tsaopoulos, D.; Tagarakis, A.; Tsolakis, N.; Bochtis, D. Human activity recognition through recurrent neural networks for human-robot interaction in agriculture. Appl. Sci. 2021, 11, 2188. [Google Scholar] [CrossRef]
- Rysz, M.W.; Mehta, S.S. A risk-averse optimization approach to human-robot collaboration in robotic fruit harvesting. Comput. Electron. Agric. 2021, 182, 106018. [Google Scholar] [CrossRef]
- Benos, L.; Kokkotis, C.; Tsatalas, T.; Karampina, E.; Tsaopoulos, D.; Bochtis, D. Biomechanical Effects on Lower Extremities in Human-Robot Collaborative Agricultural Tasks. Appl. Sci. 2021, 11, 11742. [Google Scholar] [CrossRef]
- Tagarakis, A.C.; Benos, L.; Aivazidou, E.; Anagnostis, A.; Kateris, D.; Bochtis, D. Wearable Sensors for Identifying Activity Signatures in Human-Robot Collaborative Agricultural Environments. Eng. Proc. 2021, 9, 5. [Google Scholar] [CrossRef]
- Aivazidou, E.; Tsolakis, N. Transitioning towards human–robot synergy in agriculture: A systems thinking perspective. Syst. Res. Behav. Sci. 2023, 40, 536–551. [Google Scholar] [CrossRef]
- Mallas, A.; Rigou, M.; Xenos, M. Comparing the Performance and Evaluation of Computer Experts and Farmers when Operating Agricultural Robots: A Case of Tangible vs Mouse-Based UIs. Hum. Behav. Emerg. Technol. 2022, 2022, 6070285. [Google Scholar] [CrossRef]
- Sheridan, T.; Verplank, W. Human and Computer Control of Undersea Teleoperators; Technical Reports; MIT Man-Machine Systems Laboratory: Cambridge, MA, USA, 1978. [Google Scholar]
- Huang, Z.; Gomez, A.S.; Bird, R.; Kalsi, A.S.; Jansen, C.; Liu, Z.; Miyauchi, G.; Parsons, S.; Sklar, E.I. Understanding human responses to errors in a collaborative human-robot selective harvesting task. In Proceedings of the UKRAS20 Conference: “Robots into the Real World”, Lincoln, UK, 17 April 2020. [Google Scholar]
- Benos, L.; Tagarakis, A.C.; Dolias, G.; Berruto, R.; Kateris, D.; Bochtis, D. Machine Learning in Agriculture: A Comprehensive Updated Review. Sensors 2021, 21, 3758. [Google Scholar] [CrossRef] [PubMed]
- Sun, J.-H.; Ji, T.-T.; Zhang, S.-B.; Yang, J.-K.; Ji, G.-R. Research on the Hand Gesture Recognition Based on Deep Learning. In Proceedings of the 2018 12th International Symposium on Antennas, Propagation and EM Theory (ISAPE), Hangzhou, China, 3–6 December 2018; pp. 1–4. [Google Scholar]
- Hussain, S.; Saxena, R.; Han, X.; Khan, J.A.; Shin, H. Hand gesture recognition using deep learning. In Proceedings of the 2017 International SoC Design Conference (ISOCC), Seoul, Republic of Korea, 5–8 November 2017; pp. 48–49. [Google Scholar]
- Jin, B.; Cruz, L.; Gonçalves, N. Pseudo RGB-D Face Recognition. IEEE Sens. J. 2022, 22, 21780–21794. [Google Scholar] [CrossRef]
- Hong, Z.; Hong, M.; Wang, N.; Ma, Y.; Zhou, X.; Wang, W. A wearable-based posture recognition system with AI-assisted approach for healthcare IoT. Futur. Gener. Comput. Syst. 2022, 127, 286–296. [Google Scholar] [CrossRef]
- Sørensen, L.B.; Germundsson, L.B.; Hansen, S.R.; Rojas, C.; Kristensen, N.H. What Skills Do Agricultural Professionals Need in the Transition towards a Sustainable Agriculture? A Qualitative Literature Review. Sustainability 2021, 13, 13556. [Google Scholar] [CrossRef]
- European Parliament Ethical Aspects of Cyber-Physical Systems. Available online: https://www.europarl.europa.eu/thinktank/en/document/EPRS_STU(2016)563501 (accessed on 8 December 2021).
- Ferland, F.; Reveleau, A.; Leconte, F.; Létourneau, D.; Michaud, F. Coordination mechanism for integrated design of Human-Robot Interaction scenarios. Paladyn J. Behav. Robot. 2017, 8, 100–111. [Google Scholar] [CrossRef] [Green Version]
- Cammarata, A.; Sinatra, R.; Maddio, P.D. A Two-Step Algorithm for the Dynamic Reduction of Flexible Mechanisms BT—Mechanism Design for Robotics; Gasparetto, A., Ceccarelli, M., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 25–32. [Google Scholar]
Items 1–10 correspond to the external and internal validity criteria of the tool of Hoy et al. [47]; Y = yes, N = no, C = cannot say.

Reference | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | Overall Quality
---|---|---|---|---|---|---|---|---|---|---|---
[52] | Y | N | Y | Y | Y | Y | Y | Y | Y | Y | ++
[55] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[56] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[57] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[58] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[53] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[59] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[60] | C | C | C | C | C | Y | Y | C | Y | Y | ++
[61] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[20] | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | ++
[62] | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | ++
[63] | Y | N | Y | Y | Y | Y | Y | Y | Y | Y | ++
[64] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[65] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[66] | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | ++
[67] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[68] | C | Y | C | Y | Y | Y | Y | Y | Y | Y | ++
[69] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[70] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[71] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[72] | C | Y | Y | Y | Y | Y | Y | Y | Y | Y | ++
[21] | C | C | C | C | C | Y | Y | C | C | Y | ++
[73] | C | N | C | Y | Y | Y | Y | Y | Y | Y | ++
[74] | N | N | Y | Y | Y | Y | Y | Y | Y | Y | ++
[75] | Y | N | Y | Y | Y | Y | Y | Y | Y | Y | ++
[76] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[77] | Y | Y | Y | Y | Y | Y | Y | Y | Y | Y | ++
[78] | Y | N | C | Y | Y | Y | Y | Y | Y | Y | ++
[24] | C | N | C | Y | Y | Y | Y | Y | Y | Y | ++
[79] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
[80] | C | Y | N | C | Y | Y | Y | Y | Y | Y | ++
[54] | C | C | C | C | C | Y | Y | Y | Y | Y | ++
Automation levels refer to Sheridan’s scale (Section 2.5); “exp” = experiment; N/A = not applicable.

Reference | Subject | Method | Crop | Interaction Mode | Automation Level | Main Results
---|---|---|---|---|---|---
[52] | Target detection | Lab exp | Melon | Isolation; Collaboration | 1; 3–4; 5–7; 10 | Synergy increased performance by 4% and 14% compared with solely manual and solely autonomous detection, respectively
[55] | Target detection | Simulation | Melon | Isolation; Collaboration | 1; 3–4; 5–7; 10 | An objective function was developed for evaluating system performance; the optimal collaboration level may change depending on human and robot sensitivities
[56] | Target detection | Simulation | Melon | Isolation; Collaboration | 1; 3–4; 5–7; 10 | The best system performance and collaboration level depend on the environment, the task, and the system characteristics
[57] | Target detection | Simulation | Melon | Isolation; Collaboration | 1; 3–4; 5–7; 10 | Real-time switching of the synergistic levels was accomplished by the developed algorithms, increasing system performance
[58] | Target detection | Simulation | Melon | Isolation; Collaboration | 1; 3–4; 5–7; 10 | Real-time switching of the synergistic levels was achieved, improving system performance by more than 90%
[53] | Target detection | Simulation | Melon | Isolation; Collaboration | 1; 3–4; 5–7; 10 | Operational costs were studied, showing that human decision time strongly affects performance
[59] | Target detection/Precision spraying | Lab exp/Simulation | Grape | Isolation; Collaboration | 1; 3–4; 5–7; 10 | Four levels of HRI were developed and tested, as well as a spraying coverage optimization function
[60] | Robot navigation | Design principles | N/A | N/A | N/A | A taxonomy was presented and evaluated in terms of an existing user interface for robot teleoperation
[61] | Movement identification | Design principles | Olive | N/A | N/A | Guidelines are described for addressing problems in shared human-robot environments
[20] | Robot navigation/Target detection/Precision spraying | Field and lab exp | Grape | Synchronization | 1–2 | Multiple views, a head-mounted display, and a PC keyboard contributed to higher perceived usability
[62] | Robot navigation/Target detection/Precision spraying | Field and lab exp | Grape | Synchronization | 1–2 | Similar results to [20]; camera placement on the top-back of the robot and on the end-effector improved awareness of the surroundings and the activity
[63] | Target detection/Precision spraying | Field exp | Grape | Isolation; Collaboration | 1; 3–4; 5–7; 10 | The collaborative spraying system reduces the sprayed material by half
[64] | Social navigation | Simulation | N/A | Coexistence | N/A | A controller modifies the length of personal space and the velocity in order to keep a social distance
[65] | Stress management | Simulation | N/A | Isolation; Cooperation | 1–3; 10 | Collaboration allows for saving time
[66] | Load lifting and carrying | Field exp | Strawberry | Cooperation | N/A | The pilot study showed that experienced workers viewed the cooperation positively and considered it safe
[67] | Stress management | Simulation | N/A | Cooperation; Collaboration | 3–5 | The developed protocol provides the highest efficiency compared to a system without synergy
[68] | Fleet of robots (tele-)operation | Field exp | N/A | Collaboration | 3–7 | The AR system improves the situational awareness of a human managing a fleet of robots
[69] | Harvesting | Simulation | Orange | N/A | N/A | The developed risk-averse solution minimizes economic costs
[70] | Stress management | Simulation | N/A | Cooperation; Collaboration | 3–5 | H-R synergy can respond to emergency stress situations quickly and effectively
[71] | Harvesting | Simulation | Strawberries and grapes | Cooperation | N/A | A model and simulator were developed to predict the efficiency of coupled operations involving manual harvesting and robot transport
[72] | Harvesting | Field exp/Simulation | Strawberry | Cooperation | N/A | The robustness of the simulator of [71] was evaluated; five robots transporting trays for 25 pickers improved efficiency by 10.2%
[21] | Ergonomics and safety | Design principles | N/A | N/A | N/A | A combined approach is proposed that redefines practical limits, reprioritizes safety measures, and determines the riskiest postures
[73] | Target detection | Lab exp | Strawberry | Collaboration | 2–5 | Both experienced and non-experienced groups opted for robots producing more false positive results
[74] | Harvesting | Field exp | Tea | Cooperation | N/A | The robot maintained a side-by-side route with two workers
[75] | Human activity recognition | Field exp | N/A | Cooperation | N/A | Prediction of the defined sub-activities demonstrated an 85.6% average accuracy, while fusing the data from all sensors yielded the maximum accuracy
[76] | Harvesting | Simulation | Citrus varieties | N/A | N/A | H-R collaboration can optimize the economic viability of robotic harvesters, especially when it occurs in the early stages of harvesting
[77] | Ergonomics | Lab exp | N/A | Cooperation | N/A | A robot deposit height of 90 cm was suggested to avoid large lumbar flexion
[78] | Human activity recognition | Field exp | N/A | Cooperation | N/A | Six continuous activities were recorded with wearable sensors in several variants of an HRI scenario, producing a dataset for ergonomics research
[24] | Human activity recognition | Field exp | Pistacia | Cooperation | 5 | A real-time skeleton-based recognition framework using five hand gestures was developed and successfully tested in field experiments
[79] | Transition toward H-R synergy | Design principles | N/A | N/A | N/A | The interplay among socio-economic factors and the underlying mental models driving the shift from pure automation to HRI is presented via a systems thinking approach
[80] | Robot navigation/Precision spraying | Field exp/Simulation | Grape | Collaboration | 1–3 | Both groups (computer experts and farmers) made effective use of the user interfaces, with the tangible one receiving more positive evaluations
[54] | Load lifting and carrying | Simulation | Avocado | Cooperation | 5 | H-R synergy increases production but requires slightly more energy during harvesting
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Benos, L.; Moysiadis, V.; Kateris, D.; Tagarakis, A.C.; Busato, P.; Pearson, S.; Bochtis, D. Human–Robot Interaction in Agriculture: A Systematic Review. Sensors 2023, 23, 6776. https://doi.org/10.3390/s23156776