Advanced Applications of Industrial Robotics: New Trends and Possibilities
Abstract
1. Introduction
2. Main Robotisation Strategies
2.1. Classical Robotisation Strategy
2.2. Modern Robotisation Strategy
3. Recent Achievements in Industrial Robotics Classified according to Implementation Area
3.1. Human–Machine Interaction
3.2. Object Recognition
Objective | Technology | Approach | Improvement | Ref. |
---|---|---|---|---|
Extend the default “program from demonstration” feature of collaborative robots to adapt them to environments with moving objects. | Franka Emika Panda cobot with 7 degrees of freedom, with an Intel RealSense D435 RGB-D camera mounted on the end-effector. | A demonstrated grasping method fine-tuned using reinforcement learning techniques. | The system can grasp various objects from a single demonstration, regardless of their position and orientation, after less than 5 min of training time. | [35,46] |
Introduce a set of metrics for primary comparison of robotic systems’ detailed functionality and performance. | Robots with different grippers. | Recognition and grasping methods. | Original robot performance metrics were developed and tested on four robot systems used in the Amazon Robotics Challenge. The analysis revealed differences between the systems and identified promising directions for further improvement. | [36,45,47] |
To build a low-cost system for identifying shapes to program industrial robots for the 2D welding process. | ABB IRB 140 robot with a digital camera that detects contours on a 2D surface. | A binarisation and contour recognition method. | A low-cost industrial-vision system was developed and implemented for simple programming of the movement path. | [48,49] |
Patch-based density forecasting networks (PDFNs) directly forecast crowd density maps of future frames instead of the trajectories of each moving person in the crowd. | Fixed surveillance camera. | Density forecasting in image space. Density forecasting in latent space. PDFNs. Spatio-temporal patch-based Gaussian filter. | The proposed patch-based models, PDFN-S and PDFN-ST, outperformed baselines on all the datasets. PDFN-ST successfully forecasted the dynamics of individuals, small groups, and crowds. The approach cannot always forecast sudden changes in walking direction, especially when they happen in later frames. | [45] |
To separate the objects from a set according to their colour. | Pneumatic Robot arm | Force in response to applied pressure. | The proposed robotic arm may be considered for sorting. Servo motors and image processing cameras can be used to achieve higher repeatability and accuracy. | [37,50] |
An image processing-based method for coal and gangue sorting. Development of a positioning and identification system. | Coal and gangue sorting robot. | Threshold segmentation methods. Clustering method. Morphological corrosion and expansion methods. Centre-of-mass method. | Efficiency was evaluated on images of coal and gangue picked randomly from the production environment. The average coordinate errors in the x and y directions are 2.73% and 2.72%, the identification accuracy for coal and gangue samples is 88.3% and 90.0%, respectively, and the total time for identification, positioning, and opening the camera averaged 0.130 s per sample. | [41,51,52] |
A computer vision-based robotic sorter capable of simultaneously detecting and sorting objects by colour and height. The vision-based process encompasses identifying, manipulating, selecting, and sorting objects depending on colour and geometry. | A 5- or 6-DOF robotic arm and a camera with computer vision software detecting various colours, heights, and geometries. | Computer vision methods with the Haar Cascade algorithm. The Canny edge detection algorithm is used for shape identification. | A robotic arm picks and places objects based on colour and height. In the proposed system, colour and height sorting efficiency is around 99%, demonstrating the effectiveness, high accuracy, and low cost of computer vision with a robotic arm for sorting by colour and shape (an illustrative code sketch follows this table). | [38,53,54] |
A novel multimodal convolutional neural network for RGB-D object detection. | A solid waste sorting system consisting of a server, vision sensors, an industrial robot, and a rotational speedometer. | Comparison with single-modal methods; evaluated on the Washington RGB-D object recognition benchmark. | Meets real-time requirements while ensuring high precision: 49.1% mean average precision, processing images in real time at 35.3 FPS on a single Nvidia GTX 1080 GPU. A novel dataset was also introduced. | [40,55] |
Practicality and feasibility of a Faster R-CNN model using a dataset containing images of symmetric objects. | Five-DOF robot arm “OWI Robotic Arm Edge”. | CNN learning algorithm that processes images with multiple layers (filters) and classifies objects in images. Region Proposal Network (RPN). | Accuracy and precision improved steadily; the accuracy of detecting defective and non-defective objects was successfully improved by increasing the training dataset to 400 images of defective and non-defective objects. | [39,56,57] |
An automatic sorting robot that uses height maps and near-infrared (NIR) hyperspectral images to locate objects’ regions of interest (ROIs) and conduct online statistical pixel-based classification within contours. 24/7 monitoring. | A robotic system with four modules: (1) the main conveyor, (2) a detection module, (3) a light source module, and (4) a manipulator. Mask R-CNN and YOLOv3 algorithms. | Method for an automatic sorting robot. Identification methods include pixel-, sub-pixel-, and object-based approaches. | The prototype machine can automatically sort construction and demolition waste with a size range of 0.05–0.5 m. The sorting efficiency reaches 2028 picks/h, and the online recognition accuracy nearly reaches 100%. The approach can also be applied in land-monitoring technology. | [41,58,59] |
Overcoming current limitations on the existing robotic solutions for picking objects in cluttered environments. | Intelligent autonomous robots for picking different kinds of objects. Universal jamming gripper. | A comparative study of the algorithmic performance of the proposed method. | When a corner is detected, it takes just 0.003 s to output the target point. With lines, the required time depends on the object’s configuration, ranging from 0.02 s, when objects have almost the same depth, to 0.06 s in the worst-case scenario. | [43,60,61,62] |
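Most of the colour- and shape-based sorting systems above (e.g., [37,38,48,53,54]) follow the same vision pipeline: convert the image to a suitable colour space, threshold it, clean the mask morphologically, and extract contours whose centroids are handed to the robot. The following minimal Python/OpenCV sketch illustrates that generic pipeline; the HSV range, minimum blob area, and input image are illustrative assumptions, not parameters from any cited system.

```python
import cv2
import numpy as np

# Illustrative HSV range for a "red" target class (assumed values).
LOWER_RED = np.array([0, 120, 70])
UPPER_RED = np.array([10, 255, 255])

def detect_objects(frame_bgr, min_area=500):
    """Threshold by colour, then extract contours and simple shape cues."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_RED, UPPER_RED)
    # Morphological opening removes speckle noise before contour extraction.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    detections = []
    for c in contours:
        area = cv2.contourArea(c)
        if area < min_area:
            continue  # ignore noise blobs
        x, y, w, h = cv2.boundingRect(c)
        # Vertex count of the approximated polygon is a crude shape label.
        approx = cv2.approxPolyDP(c, 0.04 * cv2.arcLength(c, True), True)
        detections.append({"centre": (x + w // 2, y + h // 2),
                           "area": area, "vertices": len(approx)})
    return detections

if __name__ == "__main__":
    frame = cv2.imread("workspace.png")  # hypothetical test image
    if frame is not None:
        for det in detect_objects(frame):
            print(det)
```

In a deployed cell, the pixel centroids returned here would still have to be mapped into robot coordinates through camera calibration before a pick can be commanded.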
3.3. Medical Application
3.4. Path Planning, Path Optimisation
3.5. Food Industry
3.6. Agricultural Applications
3.7. Civil Engineering Industry
4. Discussion
5. Conclusions
- development of intelligent companion equipment for robots (sensors, grippers, and servo-applications);
- AI-based solutions for signal processing and decision making;
- the redesign of general objects and the related features for robotic applications;
- provision of psychological solutions for robot–human collaboration and acceptance of robots in the workplace.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- ISO—ISO 8373:2012—Robots and Robotic Devices—Vocabulary. Available online: https://www.iso.org/standard/55890.html (accessed on 7 April 2021).
- IFR Presents World Robotics Report 2020—International Federation of Robotics. Available online: https://ifr.org/ifr-press-releases/news/record-2.7-million-robots-work-in-factories-around-the-globe (accessed on 7 April 2021).
- ScienceDirect Search Results—Keywords (Industrial Robot). Available online: https://www.sciencedirect.com/search?qs=Industrial%20robot (accessed on 7 April 2021).
- Dekle, R. Robots and industrial labor: Evidence from Japan. J. Jpn. Int. Econ. 2020, 58, 101108. [Google Scholar] [CrossRef]
- Olivares-Alarcos, A.; Foix, S.; Alenyà, G. On inferring intentions in shared tasks for industrial collaborative robots. Electronics 2019, 8, 1306. [Google Scholar] [CrossRef] [Green Version]
- Smith, R.; Cucco, E.; Fairbairn, C. Robotic Development for the Nuclear Environment: Challenges and Strategy. Robotics 2020, 9, 94. [Google Scholar] [CrossRef]
- Rojas, R.A.; Wehrle, E.; Vidoni, R. A Multicriteria Motion Planning Approach for Combining Smoothness and Speed in Collaborative Assembly Systems. Appl. Sci. 2020, 10, 5086. [Google Scholar] [CrossRef]
- Ivanov, S.; Seyitoğlu, F.; Markova, M. Hotel managers’ perceptions towards the use of robots: A mixed-methods approach. Inf. Technol. Tour. 2020, 22, 505–535. [Google Scholar] [CrossRef]
- Colim, A.; Sousa, N.; Carneiro, P.; Costa, N.; Arezes, P.; Cardoso, A. Ergonomic intervention on a packing workstation with robotic aid-case study at a furniture manufacturing industry. Work 2020, 66, 229–237. [Google Scholar] [CrossRef] [PubMed]
- Giusti, A.; Guzzi, J.; Ciresan, D.C.; He, F.L.; Rodriguez, J.P.; Fontana, F.; Faessler, M.; Forster, C.; Schmidhuber, J.; Di Caro, G.; et al. A Machine Learning Approach to Visual Perception of Forest Trails for Mobile Robots. IEEE Robot. Autom. Lett. 2016, 1, 661–667. [Google Scholar] [CrossRef] [Green Version]
- Elsisi, M.; Mahmoud, K.; Lehtonen, M.; Darwish, M.M.F. Effective Nonlinear Model Predictive Control Scheme Tuned by Improved NN for Robotic Manipulators. IEEE Access 2021, 9, 64278–64290. [Google Scholar] [CrossRef]
- Elsisi, M.; Mahmoud, K.; Lehtonen, M.; Darwish, M.M.F. An improved neural network algorithm to efficiently track various trajectories of robot manipulator arms. IEEE Access 2021, 9, 11911–11920. [Google Scholar] [CrossRef]
- A Brief History of Collaborative Robots|Material Handling and Logistics. Available online: https://www.mhlnews.com/technology-automation/article/21124077/a-brief-history-of-collaborative-robots (accessed on 8 April 2021).
- Colgate, J.E.; Peshkin, M.A. Cobots. U.S. Patent 5,952,796, 14 September 1999. [Google Scholar]
- Galin, R.; Meshcheryakov, R. Automation and robotics in the context of Industry 4.0: The shift to collaborative robots. IOP Conf. Ser. Mater. Sci. Eng. 2019, 537, 032073. [Google Scholar] [CrossRef]
- Tran, M.Q.; Elsisi, M.; Mahmoud, K.; Liu, M.K.; Lehtonen, M.; Darwish, M.M.F. Experimental Setup for Online Fault Diagnosis of Induction Machines via Promising IoT and Machine Learning: Towards Industry 4.0 Empowerment. IEEE Access 2021, 9, 115429–115441. [Google Scholar] [CrossRef]
- Elsisi, M.; Mahmoud, K.; Lehtonen, M.; Darwish, M.M.F. Reliable Industry 4.0 Based on Machine Learning and IoT for Analyzing, Monitoring, and Securing Smart Meters. Sensors 2021, 21, 487. [Google Scholar] [CrossRef]
- Rao, S.K.; Prasad, R. Impact of 5G Technologies on Industry 4.0. Wirel. Pers. Commun. 2018, 100, 145–159. [Google Scholar] [CrossRef]
- Pérez, L.; Rodríguez-Jiménez, S.; Rodríguez, N.; Usamentiaga, R.; García, D.F.; Wang, L. Symbiotic human–robot collaborative approach for increased productivity and enhanced safety in the aerospace manufacturing industry. Int. J. Adv. Manuf. Technol. 2020, 106, 851–863. [Google Scholar] [CrossRef]
- Song, J.; Chen, Q.; Li, Z. A peg-in-hole robot assembly system based on Gauss mixture model. Robot. Comput. Integr. Manuf. 2021, 67, 101996. [Google Scholar] [CrossRef]
- De Pace, F.; Manuri, F.; Sanna, A.; Fornaro, C. A systematic review of Augmented Reality interfaces for collaborative industrial robots. Comput. Ind. Eng. 2020, 149, 106806. [Google Scholar] [CrossRef]
- Matheson, E.; Minto, R.; Zampieri, E.G.G.; Faccio, M.; Rosati, G. Human-robot collaboration in manufacturing applications: A review. Robotics 2019, 8, 100. [Google Scholar] [CrossRef] [Green Version]
- ISO—ISO/TS 15066:2016—Robots and Robotic Devices—Collaborative Robots. Available online: https://www.iso.org/standard/62996.html (accessed on 2 December 2021).
- Tannous, M.; Miraglia, M.; Inglese, F.; Giorgini, L.; Ricciardi, F.; Pelliccia, R.; Milazzo, M.; Stefanini, C. Haptic-based touch detection for collaborative robots in welding applications. Robot. Comput. Integr. Manuf. 2020, 64, 101952. [Google Scholar] [CrossRef]
- Tannous, M.; Bologna, F.; Stefanini, C. Load cell torques and force data collection during tele-operated robotic gas tungsten arc welding in presence of collisions. Data Br. 2020, 31, 105981. [Google Scholar] [CrossRef]
- Knudsen, M.; Kaivo-oja, J. Collaborative Robots: Frontiers of Current Literature. J. Intell. Syst. Theory Appl. 2020, 3, 13–20. [Google Scholar] [CrossRef]
- Ghosh, A.; Soto, D.A.P.; Veres, S.M.; Rossiter, A. Human robot interaction for future remote manipulations in industry 4.0. Proc. IFAC-Pap. 2020, 53, 10223–10228. [Google Scholar] [CrossRef]
- Ghosh, A.; Veres, S.M.; Paredes-Soto, D.; Clarke, J.E.; Rossiter, J.A. Intuitive programming with remotely instructed robots inside future gloveboxes. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020; pp. 209–211. [Google Scholar]
- Weidemann, A.; Rußwinkel, N. The Role of Frustration in Human–Robot Interaction—What Is Needed for a Successful Collaboration? Front. Psychol. 2021, 12, 707. [Google Scholar] [CrossRef]
- Spezialetti, M.; Placidi, G.; Rossi, S. Emotion Recognition for Human-Robot Interaction: Recent Advances and Future Perspectives. Front. Robot. AI 2020, 7, 532279. [Google Scholar] [CrossRef] [PubMed]
- Ge, S.; Wang, P.; Liu, H.; Lin, P.; Gao, J.; Wang, R.; Iramina, K.; Zhang, Q.; Zheng, W. Neural Activity and Decoding of Action Observation Using Combined EEG and fNIRS Measurement. Front. Hum. Neurosci. 2019, 13, 357. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Mavridis, N. A review of verbal and non-verbal human–robot interactive communication. Robot. Auton. Syst. 2015, 63, 22–35. [Google Scholar] [CrossRef] [Green Version]
- Dzedzickis, A.; Kaklauskas, A.; Bucinskas, V. Human emotion recognition: Review of sensors and methods. Sensors 2020, 20, 592. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Shubha, P. A review of multi-object recognition based on deep learning. Int. J. Eng. Technol. Res. Manag. 2020, 2, 27–33. [Google Scholar]
- De Coninck, E.; Verbelen, T.; Van Molle, P.; Simoens, P.; Dhoedt, B. Learning robots to grasp by demonstration. Robot. Auton. Syst. 2020, 127, 103474. [Google Scholar] [CrossRef]
- Fujita, M.; Domae, Y.; Noda, A.; Garcia Ricardez, G.A.; Nagatani, T.; Zeng, A.; Song, S.; Rodriguez, A.; Causo, A.; Chen, I.M.; et al. What are the important technologies for bin picking? Technology analysis of robots in competitions based on a set of performance metrics. Adv. Robot. 2020, 34, 560–574. [Google Scholar] [CrossRef]
- Sughashini, K.R.; Sunanthini, V.; Johnsi, J.; Nagalakshmi, R.; Sudha, R. A pneumatic robot arm for sorting of objects with chromatic sensor module. Mater. Today Proc. 2021, 45, 6364–6368. [Google Scholar] [CrossRef]
- Shaikat, A.S.; Akter, S.; Salma, U. Computer Vision Based Industrial Robotic Arm for Sorting Objects by Color and Height. J. Eng. Adv. 2020, 1, 116–122. [Google Scholar] [CrossRef]
- Chen, P.; Elangovan, V. Object Sorting using Faster R-CNN. Int. J. Artif. Intell. Appl. 2020, 11, 27–36. [Google Scholar] [CrossRef]
- Yu, Y.; Zou, S.; Yin, K. A novel detection fusion network for solid waste sorting. Int. J. Adv. Robot. Syst. 2020, 17, 172988142094177. [Google Scholar] [CrossRef]
- Xiao, W.; Yang, J.; Fang, H.; Zhuang, J.; Ku, Y.; Zhang, X. Development of an automatic sorting robot for construction and demolition waste. Clean Technol. Environ. Policy 2020, 22, 1829–1841. [Google Scholar] [CrossRef]
- Li, M.; Duan, Y.; He, X.; Yang, M. Image positioning and identification method and system for coal and gangue sorting robot. Int. J. Coal Prep. Util. 2020, 1–19. [Google Scholar] [CrossRef]
- D’Avella, S.; Tripicchio, P.; Avizzano, C.A. A study on picking objects in cluttered environments: Exploiting depth features for a custom low-cost universal jamming gripper. Robot. Comput. Integr. Manuf. 2020, 63, 101888. [Google Scholar] [CrossRef]
- Ciszak, O.; Juszkiewicz, J.; Suszyński, M. Programming of Industrial Robots Using the Recognition of Geometric Signs in Flexible Welding Process. Symmetry 2020, 12, 1429. [Google Scholar] [CrossRef]
- Minoura, H.; Yonetani, R.; Nishimura, M.; Ushiku, Y. Crowd Density Forecasting by Modeling Patch-Based Dynamics. IEEE Robot. Autom. Lett. 2021, 6, 287–294. [Google Scholar] [CrossRef]
- De Coninck, E.; Verbelen, T.; Van Molle, P.; Simoens, P.; Idlab, B.D. Learning to Grasp Arbitrary Household Objects from a Single Demonstration. IEEE Int. Conf. Intell. Robot. Syst. 2019, 2372–2377. [Google Scholar] [CrossRef]
- Kaya, O.; Tağlıoğlu, G.B.; Ertuğrul, Ş. The Series Elastic Gripper Design, Object Detection, and Recognition by Touch. J. Mech. Robot. 2022, 14, 014501. [Google Scholar] [CrossRef]
- Kulkarni, R.G. Robot Path Planning with Sensor Feedback for Industrial Applications; Wichita State University: Wichita, KS, USA, 2021. [Google Scholar]
- Abdalrahman, M.; Brice, A.; Hanson, L. New Era of Automation in Scania’s Manufacturing Systems—A Method to Automate a Manual Assembly Process; Libraries at Lund University: Lund, Sweden, 2021. [Google Scholar]
- Thike, A.; Moe San, Z.Z.; Min Oo, D.Z. Design and Development of an Automatic Color Sorting Machine on Belt Conveyor. Int. J. Sci. Eng. Appl. 2019, 8, 176–179. [Google Scholar] [CrossRef]
- Wang, Z.; Xie, S.; Chen, G.; Chi, W.; Ding, Z.; Wang, P. An Online Flexible Sorting Model for Coal and Gangue Based on Multi-Information Fusion. IEEE Access 2021, 9, 90816–90827. [Google Scholar] [CrossRef]
- Sun, Z.; Huang, L.; Jia, R. Coal and gangue separating robot system based on computer vision. Sensors 2021, 21, 1349. [Google Scholar] [CrossRef] [PubMed]
- Fadhil, A.T.; Abbar, K.A.; Qusay, A.M. Computer Vision-Based System for Classification and Sorting Color Objects. IOP Conf. Ser. Mater. Sci. Eng. 2020, 745, 012030. [Google Scholar] [CrossRef]
- Peršak, T.; Viltužnik, B.; Hernavs, J.; Klancnik, S. Vision-Based Sorting Systems for Transparent Plastic Granulate. Appl. Sci. 2020, 10, 4269. [Google Scholar] [CrossRef]
- Sun, L.; Zhao, C.; Yan, Z.; Liu, P.; Duckett, T.; Stolkin, R. A novel weakly-supervised approach for RGB-D-based nuclear waste object detection. IEEE Sens. J. 2019, 19, 3487–3500. [Google Scholar] [CrossRef] [Green Version]
- Albinali, H.; Alzahrani, F.A. Faster R-CNN for detecting regions in human-annotated micrograph images. In Proceedings of the 2021 International Conference of Women in Data Science at Taif University (WiDSTaif), Taif, Saudi Arabia, 30–31 March 2021. [Google Scholar]
- Li, S.; Zhao, X.; Li, W. Analysis of Object Detection Performance Based on Faster R-CNN. J. Phys. Conf. Ser. 2021, 1827, 012085. [Google Scholar] [CrossRef]
- Cipta Ramadhan Kete, S.; Darma Tarigan, S.; Effendi, H. Land use classification based on object and pixel using Landsat 8 OLI in Kendari City, Southeast Sulawesi Province, Indonesia. IOP Conf. Ser. Earth Environ. Sci. 2019, 284, 012019. [Google Scholar] [CrossRef]
- Hespeler, S.C.; Nemati, H.; Dehghan-Niri, E. Non-destructive thermal imaging for object detection via advanced deep learning for robotic inspection and harvesting of chili peppers. Artif. Intell. Agric. 2021, 5, 102–117. [Google Scholar] [CrossRef]
- Birglen, L.; Schlicht, T. A statistical review of industrial robotic grippers. Robot. Comput. Integr. Manuf. 2018, 49, 88–97. [Google Scholar] [CrossRef]
- Shim, M.; Kim, J.H. Design and optimization of a robotic gripper for the FEM assembly process of vehicles. Mech. Mach. Theory 2018, 129, 1–16. [Google Scholar] [CrossRef]
- Linghu, C.; Zhang, S.; Wang, C.; Yu, K.; Li, C.; Zeng, Y.; Zhu, H.; Jin, X.; You, Z.; Song, J. Universal SMP gripper with massive and selective capabilities for multiscaled, arbitrarily shaped objects. Sci. Adv. 2020, 6, eaay5120. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Richter, F.; Orosco, R.K.; Yip, M.C. Open-Sourced Reinforcement Learning Environments for Surgical Robotics. arXiv 2019, arXiv:1903.02090. [Google Scholar]
- Kassahun, Y.; Yu, B.; Tibebu, A.T.; Stoyanov, D.; Giannarou, S.; Metzen, J.H.; Vander Poorten, E. Surgical robotics beyond enhanced dexterity instrumentation: A survey of machine learning techniques and their role in intelligent and autonomous surgical actions. Int. J. Comput. Assist. Radiol. Surg. 2016, 11, 553–568. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Pierson, H.A.; Gashler, M.S. Deep learning in robotics: A review of recent research. Adv. Robot. 2017, 31, 821–835. [Google Scholar] [CrossRef] [Green Version]
- Downey, J.E.; Weiss, J.M.; Muelling, K.; Venkatraman, A.; Valois, J.S.; Hebert, M.; Bagnell, J.A.; Schwartz, A.B.; Collinger, J.L. Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping. J. Neuroeng. Rehabil. 2016, 13, 28. [Google Scholar] [CrossRef] [Green Version]
- Fong, J.; Ocampo, R.; Gross, D.P.; Tavakoli, M. Intelligent Robotics Incorporating Machine Learning Algorithms for Improving Functional Capacity Evaluation and Occupational Rehabilitation. J. Occup. Rehabil. 2020, 30, 362–370. [Google Scholar] [CrossRef] [PubMed]
- Rudovic, O.; Lee, J.; Dai, M.; Schuller, B.; Picard, R.W. Personalized machine learning for robot perception of affect and engagement in autism therapy. Sci. Robot. 2018, 3, eaao6760. [Google Scholar] [CrossRef] [Green Version]
- Grischke, J.; Johannsmeier, L.; Eich, L.; Griga, L.; Haddadin, S. Dentronics: Towards robotics and artificial intelligence in dentistry. Dent. Mater. 2020, 36, 765–778. [Google Scholar] [CrossRef]
- Ma, Q.; Kobayashi, E.; Wang, J.; Hara, K.; Suenaga, H.; Sakuma, I.; Masamune, K. Development and preliminary evaluation of an autonomous surgical system for oral and maxillofacial surgery. Int. J. Med. Robot. Comput. Assist. Surg. 2019, 15, e1997. [Google Scholar] [CrossRef]
- Otani, T.; Raigrodski, A.J.; Mancl, L.; Kanuma, I.; Rosen, J. In vitro evaluation of accuracy and precision of automated robotic tooth preparation system for porcelain laminate veneers. J. Prosthet. Dent. 2015, 114, 229–235. [Google Scholar] [CrossRef] [Green Version]
- Lang, T.; Staufer, S.; Jennes, B.; Gaengler, P. Clinical validation of robot simulation of toothbrushing—Comparative plaque removal efficacy. BMC Oral Health 2014, 14, 82. [Google Scholar] [CrossRef] [Green Version]
- Nelson, C.A.; Hossain, S.G.M.; Al-Okaily, A.; Ong, J. A novel vending machine for supplying root canal tools during surgery. J. Med. Eng. Technol. 2012, 36, 102–116. [Google Scholar] [CrossRef] [PubMed]
- Lepidi, L.; Chen, Z.; Ravida, A.; Lan, T.; Wang, H.L.; Li, J. A Full-Digital Technique to Mount a Maxillary Arch Scan on a Virtual Articulator. J. Prosthodont. 2019, 28, 335–338. [Google Scholar] [CrossRef]
- Zhang, Y.; De Jiang, J.G.; Liang, T.; Hu, W.P. Kinematics modeling and experimentation of the multi-manipulator tooth-arrangement robot for full denture manufacturing. J. Med. Syst. 2011, 35, 1421–1429. [Google Scholar] [CrossRef] [PubMed]
- Spin-Neto, R.; Mudrak, J.; Matzen, L.H.; Christensen, J.; Gotfredsen, E.; Wenzel, A. Cone beam CT image artefacts related to head motion simulated by a robot skull: Visual characteristics and impact on image quality. Dentomaxillofacial Radiol. 2013, 42, 32310645. [Google Scholar] [CrossRef] [Green Version]
- Li, C.; Gu, X.; Xiao, X.; Lim, C.M.; Duan, X.; Ren, H. A Flexible Transoral Robot Towards COVID-19 Swab Sampling. Front. Robot. AI 2021, 8, 51. [Google Scholar] [CrossRef]
- Jose, K.; Pratihar, D.K. Task allocation and collision-free path planning of centralized multi-robots system for industrial plant inspection using heuristic methods. Rob. Auton. Syst. 2016, 80, 34–42. [Google Scholar] [CrossRef]
- Das, P.K.; Jena, P.K. Multi-robot path planning using improved particle swarm optimization algorithm through novel evolutionary operators. Appl. Soft Comput. J. 2020, 92, 106312. [Google Scholar] [CrossRef]
- Fascista, A.; Coluccia, A.; Ricci, G. A Pseudo Maximum likelihood approach to position estimation in dynamic multipath environments. Signal Processing 2021, 181, 107907. [Google Scholar] [CrossRef]
- Karaagac, A.; Haxhibeqiri, J.; Ridolfi, M.; Joseph, W.; Moerman, I.; Hoebeke, J. Evaluation of accurate indoor localization systems in industrial environments. In Proceedings of the 2017 22nd IEEE International Conference on Emerging Technologies and Factory Automation (ETFA), Limassol, Cyprus, 12–15 September 2017; pp. 1–8. [Google Scholar]
- Makomo, T.J.; Erin, K.; Boru, B. Real Time Application for Automatic Object and 3D Position Detection and Sorting with Robotic Manipulator. Sak. Univ. J. Sci. 2020, 24, 703–711. [Google Scholar] [CrossRef]
- Hermansson, T.; Carlson, J.S.; Linn, J.; Kressin, J. Quasi-static path optimization for industrial robots with dress packs. Robot. Comput. Integr. Manuf. 2021, 68, 102055. [Google Scholar] [CrossRef]
- Nguyen, V.; Melkote, S. Hybrid statistical modelling of the frequency response function of industrial robots. Robot. Comput. Integr. Manuf. 2021, 70, 102134. [Google Scholar] [CrossRef]
- Jiao, J.; Tian, W.; Zhang, L.; Li, B.; Hu, J.; Li, Y.; Li, D.; Zhang, J. Variable stiffness identification and configuration optimization of industrial robots for machining tasks. Res. Sq. 2020. [Google Scholar] [CrossRef]
- Ding, L.; Jiang, W.; Zhou, Y.; Zhou, C.; Liu, S. BIM-based task-level planning for robotic brick assembly through image-based 3D modeling. Adv. Eng. Inform. 2020, 43, 100993. [Google Scholar] [CrossRef]
- Leroux, M.; Raison, M.; Adadja, T.; Achiche, S. Combination of eyetracking and computer vision for robotics control. In Proceedings of the IEEE Conference on Technologies for Practical Robot Applications, TePRA, Woburn, MA, USA, 11–12 May 2015; IEEE Computer Society: Washington, DC, USA, 2015. [Google Scholar]
- Xu, Y.; Fang, G.; Lv, N.; Chen, S.; Jia Zou, J. Computer vision technology for seam tracking in robotic GTAW and GMAW. Robot. Comput. Integr. Manuf. 2015, 32, 25–36. [Google Scholar] [CrossRef]
- Rojas, R.A.; Garcia, M.A.R.; Gualtieri, L.; Rauch, E. Combining safety and speed in collaborative assembly systems—An approach to time optimal trajectories for collaborative robots. Procedia CIRP 2021, 97, 308–312. [Google Scholar] [CrossRef]
- Roveda, L.; Magni, M.; Cantoni, M.; Piga, D.; Bucca, G. Human–robot collaboration in sensorless assembly task learning enhanced by uncertainties adaptation via Bayesian Optimization. Rob. Auton. Syst. 2021, 136, 103711. [Google Scholar] [CrossRef]
- Fu, G.; Gu, T.; Gao, H.; Lu, C. A postprocessing and path optimization based on nonlinear error for multijoint industrial robot-based 3D printing. Int. J. Adv. Robot. Syst. 2020, 17, 172988142095224. [Google Scholar] [CrossRef]
- Cvitanic, T.; Nguyen, V.; Melkote, S.N. Pose optimization in robotic machining using static and dynamic stiffness models. Robot. Comput. Integr. Manuf. 2020, 66, 101992. [Google Scholar] [CrossRef]
- Wang, Z.; Zhang, R.; Keogh, P. Real-Time Laser Tracker Compensation of Robotic Drilling and Machining. J. Manuf. Mater. Process. 2020, 4, 79. [Google Scholar] [CrossRef]
- Schultz, U.P. Reversible control of robots. In Reversible Computation: Extending Horizons of Computing. RC 2020. Lecture Notes in Computer Science; Ulidowski, I., Lanese, I., Schultz, U., Ferreira, C., Eds.; Springer: Berlin/Heidelberg, Germany, 2020; Volume 12070, pp. 177–186. [Google Scholar] [CrossRef]
- Jiang, J.; Huang, Z.; Bi, Z.; Ma, X.; Yu, G. State-of-the-Art control strategies for robotic PiH assembly. Robot. Comput. Integr. Manuf. 2020, 65, 101894. [Google Scholar] [CrossRef]
- Kumar, S.; Singhal, P.; Krovi, V.N. Computer-vision-based decision support in surgical robotics. IEEE Des. Test 2015, 32, 89–97. [Google Scholar] [CrossRef]
- Bader, F.; Rahimifard, S. Challenges for industrial robot applications in food manufacturing. In Proceedings of the 2nd International Symposium on Computer Science and Intelligent Control, Stockholm, Sweden, 21–23 September 2018. [Google Scholar]
- Grobbelaar, W.; Verma, A.; Shukla, V.K. Analyzing human robotic interaction in the food industry. J. Phys. Conf. Ser. 2021, 1714, 012032. [Google Scholar] [CrossRef]
- Sandey, K.K.; Qureshi, M.A.; Meshram, B.D.; Agrawal, A.; Uprit, S. Robotics—An Emerging Technology in Dairy Industry. Int. J. Eng. Trends Technol. 2017, 43, 58–62. [Google Scholar]
- Wang, Z.; Or, K.; Hirai, S. A dual-mode soft gripper for food packaging. Rob. Auton. Syst. 2020, 125, 103427. [Google Scholar] [CrossRef]
- Blöcher, K.; Alt, R. AI and robotics in the European restaurant sector: Assessing potentials for process innovation in a high-contact service industry. Electron. Mark. 2020, 31, 529–551. [Google Scholar] [CrossRef]
- Bader, F.; Rahimifard, S. A methodology for the selection of industrial robots in food handling. Innov. Food Sci. Emerg. Technol. 2020, 64, 102379. [Google Scholar] [CrossRef]
- Boschetti, G.; Carbone, G. Advances in Italian Mechanism Science; Springer: Cham, Switzerland, 2017; Volume 18, ISBN 9783030558062. [Google Scholar]
- Zhang, B.; Xie, Y.; Zhou, J.; Wang, K.; Zhang, Z. State-of-the-art robotic grippers, grasping and control strategies, as well as their applications in agricultural robots: A review. Comput. Electron. Agric. 2020, 177, 105694. [Google Scholar] [CrossRef]
- Chang, C.-L.; Lin, K.-M. Smart Agricultural Machine with a Computer Vision-Based Weeding and Variable-Rate Irrigation Scheme. Robotics 2018, 7, 38. [Google Scholar] [CrossRef] [Green Version]
- Patrício, D.I.; Rieder, R. Computer vision and artificial intelligence in precision agriculture for grain crops: A systematic review. Comput. Electron. Agric. 2018, 153, 69–81. [Google Scholar] [CrossRef] [Green Version]
- Tankova, T.; da Silva, L.S. Robotics and Additive Manufacturing in the Construction Industry. Curr. Robot. Rep. 2020, 1, 13–18. [Google Scholar] [CrossRef] [Green Version]
- Davila Delgado, J.M.; Oyedele, L.; Ajayi, A.; Akanbi, L.; Akinade, O.; Bilal, M.; Owolabi, H. Robotics and automated systems in construction: Understanding industry-specific challenges for adoption. J. Build. Eng. 2019, 26, 100868. [Google Scholar] [CrossRef]
- Robinson, G. Global Construction Market to Grow $8 Trillion by 2030: Driven by China, US and India; Global Construction Perspectives and Oxford Economics: London, UK, 2016; Volume 44, pp. 1–3. [Google Scholar]
- Aparicio, C.C.; Balzan, A.; Trabucco, D. Robotics in construction: Framework and future directions. Int. J. High-Rise Build. 2020, 9, 105–111. [Google Scholar]
- Follini, C.; Magnago, V.; Freitag, K.; Terzer, M.; Marcher, C.; Riedl, M.; Giusti, A.; Matt, D.T. Bim-integrated collaborative robotics for application in building construction and maintenance. Robotics 2021, 10, 2. [Google Scholar] [CrossRef]
- Parascho, S.; Han, I.X.; Walker, S.; Beghini, A.; Bruun, E.P.G.; Adriaenssens, S. Robotic vault: A cooperative robotic assembly method for brick vault construction. Constr. Robot. 2020, 4, 117–126. [Google Scholar] [CrossRef]
- Kazemian, A.; Yuan, X.; Davtalab, O.; Khoshnevis, B. Computer vision for real-time extrusion quality monitoring and control in robotic construction. Autom. Constr. 2019, 101, 92–98. [Google Scholar] [CrossRef]
- Gautam, M.; Fagerlund, H.; Greicevci, B.; Christophe, F.; Havula, J. Collaborative Robotics in Construction: A Test Case on Screwing Gypsum Boards on Ceiling. In Proceedings of the 2020 5th International Conference on Green Technology and Sustainable Development, Ho Chi Minh City, Vietnam, 27–28 November 2020; pp. 88–93. [Google Scholar]
- Balzan, A.; Aparicio, C.C.; Trabucco, D. Robotics in construction: State-of-art of on-site advanced devices. Int. J. High-Rise Build. 2020, 9, 95–104. [Google Scholar]
- Ghasempourabadi, M.; Taraz, M. Human-robot interaction in construction: A literature review. Malays. J. Sustain. Environ. 2021, 8, 49–74. [Google Scholar]
- Bodea, S.; Mindermann, P.; Gresser, G.T.; Menges, A. Additive Manufacturing of Large Coreless Filament Wound Composite Elements for Building Construction. 3D Print. Addit. Manuf. 2021. ahead of print. [Google Scholar] [CrossRef]
- Zhang, M.; Yan, J. A data-driven method for optimizing the energy consumption of industrial robots. J. Clean. Prod. 2021, 285, 124862. [Google Scholar] [CrossRef]
- Aksoy, S.; Ozan, E. Robots and Their Applications. Int. Res. J. Eng. Technol. 2020. [Google Scholar] [CrossRef] [Green Version]
Objective | Technology | Approach | Improvement | Ref. |
---|---|---|---|---|
To improve the flexibility, productivity, and quality of a multi-pass gas tungsten arc welding (GTAW) process performed by a collaborative robot. | A haptic interface. 6-axis robotic arm (Mitsubishi MELFA RV-13FM-D). End effector with a GTAW torch. A monitoring camera (Xiris XVC-1000). A load cell (ATI Industrial Automation Mini45-E) to evaluate tool force interactions with workpieces. | A haptic-based approach designed and tested in a manufacturing scenario, proposing light, low-cost real-time algorithms for “touch” detection. | Two criteria were analysed to assess performance: the 3-Sigma rule and the Hampel identifier (both sketched in code after this table). Experimental results showed better performance for the 3-Sigma rule in terms of precision percentage (mean value of 99.9%) and miss rate (mean value of 10%) compared with the Hampel identifier. Results confirmed the influence of the dataset’s contamination level. This algorithm adds significant advances enabling the use of light, simple machine learning approaches in real-time applications. | [24,25] |
To produce more advanced or complex forms of interaction by enabling cobots with semantic understanding capabilities or AI-aided anticipation skills. | Collaborative robots | Artificial intelligence. | The overview provides hints of future cobot developments and identifies future research frontiers related to economic, social, and technological dimensions. | [26] |
To strike a balance to find a suitable level of autonomy for human operators. | Model of Remotely Instructed Robots (RIRs). | Modelling method. | A model was developed in which the robot is autonomous in task execution but also aids the operator’s ultimate decision-making about what to do next. Presenting the robot’s own model of the work scene enables corrections to be made by the robot and can enhance the operator’s confidence in the robot’s work. | [27,28] |
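The touch-detection row above [24,25] compares two classical outlier tests on force signals: the 3-Sigma rule (sliding mean and standard deviation) and the Hampel identifier (sliding median and MAD). A brief NumPy sketch of both tests on a synthetic force trace is given below; the window length, thresholds, and signal are illustrative assumptions, not the published experimental setup.

```python
import numpy as np

def three_sigma_outliers(signal, window=50):
    """Flag samples more than 3 std devs from the mean of a sliding window."""
    flags = np.zeros(len(signal), dtype=bool)
    for i in range(window, len(signal)):
        w = signal[i - window:i]
        mu, sigma = w.mean(), w.std()
        flags[i] = abs(signal[i] - mu) > 3 * sigma
    return flags

def hampel_outliers(signal, window=50, n_sigmas=3.0):
    """Hampel identifier: median/MAD-based analogue of the 3-sigma rule."""
    k = 1.4826  # scale factor relating MAD to std dev for Gaussian data
    flags = np.zeros(len(signal), dtype=bool)
    for i in range(window, len(signal)):
        w = signal[i - window:i]
        med = np.median(w)
        mad = k * np.median(np.abs(w - med))
        flags[i] = abs(signal[i] - med) > n_sigmas * mad
    return flags

# Synthetic force trace: smooth load with one injected collision spike.
t = np.linspace(0, 10, 1000)
force = 5 + 0.2 * np.sin(t) + np.random.normal(0, 0.05, t.size)
force[600] += 4.0  # simulated "touch" event
print("3-sigma detects:", np.flatnonzero(three_sigma_outliers(force)))
print("Hampel detects: ", np.flatnonzero(hampel_outliers(force)))
```

The practical difference is robustness: because the Hampel identifier uses the median and MAD, pre-existing outliers in the window contaminate its statistics far less than they do the mean and standard deviation, which is why the study's contamination-level finding matters.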
Objective | Interaction | Approach | Solution | Ref. |
---|---|---|---|---|
Frustration | Close cooperative work | Controlled coordination | Sense of control of frustration, affective computing. | [29] |
Emotion recognition | By collecting different kinds of data. | Discrete models describing emotions, facial expression analysis, camera positioning. | Affective computing. Empowering robots to observe, interpret, and express emotions. Endowing robots with emotional intelligence. | [30] |
Decoding of action observation | Elucidating the neural mechanisms of action observation and intention understanding. | Decoding the underlying neural processes. | The dynamic involvement of the mirror neuron system (MNS) and the theory of mind (ToM)/mentalising network during action observation. | [31] |
Verbal and non-verbal communication | Interactive communication. | Symbol grounding | Composition of grounded semantics, online negotiation of meaning, affective interaction and closed-loop affective dialogue, mixed speech-motor planning, massive acquisition of data-driven models for human–robot communication through crowd-sourced online games, real-time exploitation of online information and services for enhanced human–robot communication. | [32] |
Objective | Technology | Approach | Improvement | Ref. |
---|---|---|---|---|
Create a bridge between the reinforcement learning and surgical robotics communities by presenting the first open-sourced reinforcement learning environments for surgical da Vinci robots (a toy environment in this style is sketched after this table). | Patient Side Manipulator (PSM) arm. da Vinci® Surgical Robot. Large Needle Driver (LND) with a jaw gripper to grab objects such as a suturing needle. | Reinforcement learning: OpenAI Gym, DDPG (Deep Deterministic Policy Gradients), HER (Hindsight Experience Replay), V-REP physics simulator. | A new reinforcement learning environment was developed for fast and effective training of surgical da Vinci robots for autonomous operations. | [63] |
A method of shared control in which the user controls a prosthetic arm through a brain–machine interface and receives assistance with positioning the hand as it approaches an object. | Brain–machine interface system. Robotic arm. RGB-D camera mounted above the arm base. | Shared control system. An autonomous robotic grasping system. | A shared control system for a robotic manipulator that makes control more accurate, more efficient, and less difficult than direct control alone. | [66] |
A personalised deep learning framework that can adapt robot perception of children’s affective states and engagement to different cultures and individuals. | Unobtrusive audiovisual sensors and wearable sensors providing the child’s heart rate, skin conductance (EDA), body temperature, and accelerometer data. | Feed-forward multilayer neural networks. GPA-net. | Achieved an average agreement of ~60% with human experts when estimating affect and engagement. | [68] |
Overviews of existing applications and concepts of robotic systems and artificial intelligence in dentistry, functional capacity evaluations, the role of ML in surgery using surgical robotics, and deep learning vis-à-vis physical robotic systems, focused on contemporary research. | An overview | An overview | An overview | [64,65,67,69] |
Transoral robot towards COVID-19 swab sampling. | Flexible manipulator, an endoscope with a monitor, a master device. | Teleoperated configuration for swab sampling | A flexible transoral robot with a teleoperated configuration is proposed to address the surgeons’ risks during the face-to-face COVID-19 swab sampling. | [77] |
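To make the reinforcement learning row above [63] concrete, the sketch below outlines a toy goal-reaching environment in the classic OpenAI Gym style (observation/action spaces, `reset`, `step`). It is a simplified stand-in, not the published da Vinci environment: the point-mass kinematics, dense reward shaping, and tolerance are all assumed for illustration, and the classic pre-Gymnasium API is used.

```python
import gym
import numpy as np
from gym import spaces

class ReachEnv(gym.Env):
    """Toy 3-D point-reach task in the spirit of surgical RL environments."""

    def __init__(self, tol=0.01):
        super().__init__()
        self.action_space = spaces.Box(-1.0, 1.0, shape=(3,), dtype=np.float32)
        # Observation: current tip position concatenated with goal position.
        self.observation_space = spaces.Box(-1.0, 1.0, shape=(6,), dtype=np.float32)
        self.tol = tol

    def reset(self):
        self.pos = np.zeros(3, dtype=np.float32)
        self.goal = np.random.uniform(-0.5, 0.5, size=3).astype(np.float32)
        return np.concatenate([self.pos, self.goal])

    def step(self, action):
        # Point-mass "tip" moves a small step in the commanded direction.
        self.pos = np.clip(self.pos + 0.05 * np.asarray(action, np.float32), -1.0, 1.0)
        dist = float(np.linalg.norm(self.pos - self.goal))
        done = dist < self.tol
        reward = -dist  # dense shaping; HER would instead relabel goals
        return np.concatenate([self.pos, self.goal]), reward, done, {}

env = ReachEnv()
obs = env.reset()
for _ in range(100):
    obs, reward, done, _ = env.step(env.action_space.sample())
    if done:
        break
```

An off-the-shelf agent such as DDPG can then be trained against this interface; HER, as used in the cited work, additionally relabels failed episodes with the goals they actually reached, which is what makes sparse-reward reach tasks trainable.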
Objective | Technology | Approach | Improvement | Ref. |
---|---|---|---|---|
The position of objects: a possible trajectory to an object in real time. | A robotic system consisting of an ABB IRB 120 robot equipped with a gripper and a 3D Kinect sensor. | Detection of the workpieces. Object recognition techniques applied using available algorithms in MATLAB’s Computer Vision and Image Acquisition Toolboxes. | An algorithm for finding 3D object position from colour segmentation in real time, with the main focus on finding an object’s depth from the Kinect sensor (the underlying back-projection step is sketched in code after this table). The Kinect could distinguish colour correctly, and the robot could accurately navigate to the detected object. | [82] |
The combination of eye-tracking and computer vision automate the approach of a robot to its targeted point by acquiring its 3D location. | Eye-tracking device, webcam. | Image analysis and geometrical reconstruction. | The computed coordinates of the target 3D localisation have an average error of 5.5 cm, which is 92% more accurate than eye-tracking only for the point of gaze calculation, with an estimated error of 72 cm. | [87] |
Computer vision technology for real-time seam tracking in robotic gas tungsten arc welding (GTAW). | Welding robot GTAW—the robot arm, the robot controller, the vision system, isolation unit, the weld power supply, and the host computer. Passive vision system. | Passive vision system image processing. | The developed method is feasible and sufficient to meet the specific precision requirements of some applications in robotic seam tracking. | [88] |
A higher fidelity model for predicting the entire pose-dependent FRF of an industrial robot by combining the advantages of Experimental Modal Analysis (EMA) with Operational Modal Analysis for milling processes. | KUKA KR500-3 6 DOF industrial robot | Hybrid statistical modelling: Frequency Response Function (FRF) modelling method. | A Bayesian inference and hyperparameter updating approach for updating the EMA-calibrated GPR models of the robot FRF with OMA-based FRF data improved the model’s compliance RMSE by 26% and 27% in the x and y direction tool paths, respectively, compared to only EMA-based calibration. The methodology reduced the average number of iterations and calibration times required to determine the optimal GPR model hyperparameters by 50.3% and 31.3%, respectively. | [84] |
Safe trajectories without neglecting cognitive ergonomics and production efficiency aspects. | UR3 lightweight robot | Experimental tasks | The task’s execution time was reduced by 13.1% compared with the robot’s default planner and by 19.6% compared with the minimum-jerk smooth collaboration planner. This new approach is highly relevant for manufacturers of collaborative robots (e.g., for integration as a path option in the robot pendant software) and for users (e.g., an online service for calculating the optimal path and transferring it to the robot). | [89] |
An industrial robot moving between stud welding operations in a stud welding station. | Industrial robot | Quasi-static path optimisation for an industrial robot | The method was successfully applied to a stud welding station with an industrial robot moving between two stud welding operations. Even in a difficult case, the optimised path reduced the internal force in the dress pack and kept the dressed robot clear of the surrounding geometry with a prescribed safety clearance during the entire robot motion. | [83] |
An industrial assembly task for learning and optimisation, considering uncertainties. | A Franka Emika Panda manipulator | Task trajectory learning approach. Task optimisation approach. | The proposed approach made the robot learn the task execution and compensate for task uncertainties. The HMM + BO methodology was compared with the HMM algorithm without optimisation, showing the capability of the optimisation stage to compensate for task uncertainties: the HMM + BO methodology achieved an assembly task success rate of 93%, while the HMM algorithm alone achieved only 19%. | [90] |
Postprocessing and path optimisation based on non-linear errors to improve the accuracy of multi-joint industrial robot-based 3D printing. | Multi-joint industrial robot for 3D printing | Path smoothing method | Multi-joint industrial robot-based 3D printing can be used for high-precision printing of complex freeform surfaces. However, an industrial robot with only three joints is used, and solutions of the joint angles for the tool orientations, which are essential for printing freeform surfaces, are not proposed. | [91] |
A comparative study of robot pose optimisation using static and dynamic stiffness models for different cutting scenarios. | KUKA KR 500–3 industrial robot, aluminium 6061 | Complete pose (CP) and the decoupled partial pose (DPP) methods. Effect of optimisation method on machining accuracy | A dynamic model-based robot pose optimisation yields significant improvement over a static model-based optimisation for cutting conditions where the time-varying cutting forces approach the robot’s natural frequencies. A static model-based optimisation is sufficient when the frequency content of the cutting forces is not close to the robot’s natural frequencies. | [92] |
The feasibility and validity of proposed stiffness identification and configuration optimisation methods. | KUKA KR500 industrial robot | Robot stiffness characteristics and optimisation methods. Point selection method | The smooth processing strategy improves optimisation efficiency, ensuring minimal stiffness loss. According to the machining results of a cylinder head of a vehicle engine, the milling quality was improved obviously after the configuration optimisation, and the validity of these methods are verified. | [85] |
Real-time compensation setups. | A standard KUKA KR120R2500 PRO industrial robot with a spindle end-effector | Real-time Closed Loop Compensation method | Real-time metrology feedback cannot fully compensate for the sudden error spikes caused by the backlash. The mitigation strategy of automatically reducing feed rate (ASC) was demonstrated to reduce backlash error significantly. However, ASC considerably increases the cycle time for a toolpath that involves many direction reversals and leads to uneven cutter chip load and variation in surface finish. Backlash, therefore, remains the largest source of residual error for a robot under real-time metrology compensation. | [93] |
Building Information Model (BIM)-based robotic assembly model that contains all the required information for planning. | ABB IRB6700-235 robot (6 DOF), a construction plane (approximately 1.5 m × 0.9 m), a scene modelling camera (Sony a5100), and a modelling computer (Dell Precise). | Image-based 3D modelling method. Experimental method | A general IFC model for robotic assembly contains all the information needed for task-level planning; BIM and image-based modelling are used to calibrate robot pose for the unification of the robot coordinate system, construction area, and assembly task; a simple conversion process is presented to convert the 3D placement point coordinates of each brick into the robotic control instructions. In the process of experimental verification, task-level planning can maintain the same accuracy as that of the traditional method but saves time when facing more complex tasks. | [86] |
A model of reversibly controlled industrial robots based on abstract semantics. | Robotic assembly | Error recovery using reverse execution | A programming model which enables robot assembly programs to be executed in reverse. Temporarily switching the direction of program execution can be an efficient error recovery mechanism. Additional benefits arise from supporting reversibility in robotic assembly language, namely, increased code reuse and automatically derived disassembly sequences. | [94] |
The control strategies for robotic PiH assemblies and the limitations of current robotic assembly technologies. | Robotic PiH assembly | Typical peg-in-hole (PiH) assembly methods | The system outperforms an operator performing the same task with magnified visual feedback in terms of both completion time and the number of successful insertions. The proposed strategies can correctly diagnose position errors in the assembly process and effectively realise error recovery. | [95] |
An overview of computer vision for preoperative, intraoperative, and postoperative surgical stages to assist with planning, tool detection, identification, pose tracking, and augmented reality, for surgical skill assessment and retrospective analysis of the procedure. | An overview | An overview | An overview | [96] |
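Several of the vision-guided entries above (e.g., the Kinect-based system [82]) reduce to the same geometric step: segment the object in the image, read its depth, and back-project the pixel through a pinhole camera model to a 3-D position. The sketch below illustrates that step; the intrinsic parameters and the synthetic mask and depth map are assumed values for illustration, not the calibration of the cited setup.

```python
import numpy as np

# Assumed pinhole intrinsics for an RGB-D sensor (illustrative values).
FX, FY, CX, CY = 525.0, 525.0, 319.5, 239.5

def pixel_to_camera(u, v, depth_m):
    """Back-project an image pixel with known depth to camera coordinates."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])

def locate_object(mask, depth_map):
    """Centroid of a colour-segmented mask plus median depth -> 3-D point."""
    vs, us = np.nonzero(mask)
    if len(us) == 0:
        return None  # nothing segmented in this frame
    u, v = int(us.mean()), int(vs.mean())
    z = float(np.median(depth_map[vs, us]))  # median is robust to depth holes
    return pixel_to_camera(u, v, z)

# Synthetic example: a 40x40 pixel blob at a depth of 0.8 m.
mask = np.zeros((480, 640), dtype=bool)
mask[200:240, 300:340] = True
depth = np.full((480, 640), 0.8)
print(locate_object(mask, depth))  # camera-frame coordinates in metres
```

The resulting camera-frame point must still be transformed into the robot base frame through a hand-eye calibration before it can serve as a motion target.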
Objective | Technology | Approach | Improvement | Ref. |
---|---|---|---|---|
The applications of industrial robots in the food industry and their automation prospects. A 4-step Food Industrial Robot Methodology for selecting industrial robots for food processing operations. | Articulated robot, parallel robot, Cartesian robot | The four steps within the Food Industrial Robot Methodology (FIRM). | The FIRM presented in this paper outlined the ability to classify industrial robot capabilities and match them to specific characteristics of foodstuffs and requirements for their processing based on four steps that navigate eight tasks. This work also identified many factors that should lay the groundwork for future research in the application of industrial robots within food manufacturing. | [102] |
Identification, analysis, and understanding robotics in one of the largest sectors, the food chain. | Robots in the food chain | Case study of a Delivery Bot | The emergence of robotics in business is widely seen across the world. However, the trust in human–robotic interaction appears to be underdeveloped. Reducing the number of repetitive jobs by replacing them with robots is not replacing jobs but paving the way for more intelligent jobs. | [98] |
Maximise performance while utilising fewer resources. | Dual-mode soft gripper for food packaging | Grasp-and-suck process for various types of objects weighing up to 1 kg | The proposed dual-mode gripper can perform grasp and suck functions for multiple types of food items. Possible further improvements include automatic switching of the gripper finger configuration and distance adjustment. | [100] |
Challenges in the application of industrial robots in the food industry | An overview | An overview | An overview | [97] |
Path planning optimisation technique in the food industry | EPSON T6 SCARA robot | The proposed optimisation technique is based on the use of an off-axis tool | This path optimisation technique shortens the cycle time and reduces energy consumption. | [103] |
Objective | Technology | Approach | Improvement | Ref. |
---|---|---|---|---|
The potential applications in agriculture by presenting a variety of manipulators and various forms of sensors. | Parallel grippers, angular grippers, and biologically inspired grippers manufactured by Festo. Various sensors | Application methods. | State-of-the-art robotic grippers, grasping and control strategies, and their applications in agricultural robots. Applications of robotic grippers in food, agricultural, and bio-system engineering were summarised in detail. | [104] |
A scheme combining computer vision and multi-tasking processes to develop a small-scale smart agricultural machine that can automatically weed and perform variable-rate irrigation within a cultivated field. | The machine frame, the weeding and watering mechanisms, the image and soil moisture sensors, the actuators, and the graphical user interface (GUI). | Image processing methods such as HSV (hue (H), saturation (S), value (V)) colour conversion, threshold estimation during binary image segmentation, and morphology operators (a related vegetation-segmentation step is sketched in code after this table). Fuzzy logic, multi-tasking processes. | The system can classify plants and weeds in real time with an average classification rate of 90% or higher, allowing the machine to weed and water while maintaining the moisture content of the deep soil at 80 ± 10% and an average weeding rate of 90%. | [105] |
A systematic overview aiming to identify the applicability of computer vision in precision agriculture for producing the five most-produced grains in the world: maize, rice, wheat, soybean, and barley. Different approaches to disease detection, grain quality, and phenotyping. | An overview | An overview | An overview | [106] |
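The weeding machine above [105] rests on first separating vegetation from soil before any fuzzy-logic decision is made. A minimal sketch of such a segmentation step is shown below using the excess-green index (ExG = 2g − r − b) with Otsu thresholding; ExG is a common stand-in for the HSV-based segmentation named in the row, and the input image and kernel size are assumptions.

```python
import cv2
import numpy as np

def vegetation_mask(frame_bgr):
    """Excess-green index (ExG = 2g - r - b) thresholded with Otsu's method."""
    b, g, r = cv2.split(frame_bgr.astype(np.float32) / 255.0)
    exg = 2 * g - r - b
    # Rescale to 8-bit so Otsu thresholding can be applied.
    exg8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, mask = cv2.threshold(exg8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Morphological closing fills small gaps inside leaves.
    return cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((7, 7), np.uint8))

def coverage_ratio(mask):
    """Fraction of the field of view covered by vegetation."""
    return float(np.count_nonzero(mask)) / mask.size

frame = cv2.imread("field_row.png")  # hypothetical field image
if frame is not None:
    mask = vegetation_mask(frame)
    print(f"vegetation coverage: {coverage_ratio(mask):.1%}")
```

Downstream, the cited system would still need to distinguish crop rows from weeds within this mask (e.g., by position relative to the planting line) before actuating the weeding mechanism.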
Objective | Technology | Approach | Improvement | Ref. |
---|---|---|---|---|
A novel fabrication process for the assembly of full-scale masonry vaults without falsework. | Two industrial robotic arms (ABB 4600 2.55). The prototype of the robotically assembled brick vault. | The fabrication method is based on a cooperative assembly approach in which two robots alternate between placement and support first to build a stable central arch. | Cooperative robotic assembly methods can be applied to constructing a spanning structure built without a temporary falsework. Where traditional manufacturing techniques require geometric guides, this project shows how it can instead leverage the robots’ precision to accurately place bricks in bespoke orientations. | [112] |
Computer vision for real-time extrusion quality monitoring during robotic building construction. | Laboratory-scale concrete printer. Logitech 720p camera to capture extrusion videos. The extrusion videos are processed in real time by a Raspberry Pi 3B. | OpenCV library; a shape-based approach; Gaussian filter (a simplified width check is sketched in code after this table). | The developed system can print up to ten 120 cm long concrete layers. It uses an extrusion mechanism similar to the Contour Crafting machine to print layers with a height of 3.81 cm and a width of 2.54 cm from concrete and mortar at different linear speeds (up to 10 cm/s) and deposition rates. The vision system detected all designed variation levels (±5 to ±15 L/m3 change in the water content of the mixture). In terms of accuracy and responsiveness to material variations, the experimental results imply excellent potential for using computer vision for automated quality monitoring of construction-scale 3D printing. | [113] |
Presents the possibilities of applying lightweight cobots to individual tasks in the construction sector. | Pairing lightweight robotics with 3D printing technology provides the advantage of rapid prototyping to test ideas and applications. | The simplest visual system was used, following a simplified approach that can be controlled directly by a robot controller. | Future research on increasing the dynamics of torsional tasks using a mobile robot with a scissor lift could result in the cobot and mobile platform covering the entire construction area. | [114] |
To determine if improved robotic technologies have also been used in the building industry. | An overview | An overview | An overview | [115] |
To determine how robotic automation can help the construction industry. | A common framework for current technological innovation in this field and a development plan were outlined. | The projected impacts on traditional processes, construction sites, emerging technologies, and related professions are summarised to identify future implications and directions toward self-sufficiency. | Artificial intelligence is identified as a decisive factor for the successful involvement of robotic devices in the construction industry. | [110] |
Provide a systematic overview of human-robot interactions concerning various types of robots | Human–robot interaction, human–robot cooperation (HRC). | An overview | Further investigation of multi-function robots, human–robot interaction in robotic fabrication, and multipurpose robots. | [116] |
The main goal is to fully describe feedback based on sensor-informed programs for process monitoring and fabrication data collection and analysis. | Additive manufacturing. | An overview | Effective robotic production still requires the communication and management of progressively improving materials and building systems. | [117] |
Application of a Building Information Modelling (BIM) method for efficient and simple deployment of robot systems for building construction and operation | BIM integrative, collaborative robotics. | The robot is provided with a priori geometric and semantic information about the environment with the help of the BIM system. | Future improvements consist of the assessment of the actual applicability of the system on the construction site and closing the gap between robotic systems and the construction site. | [111] |
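The extrusion-monitoring system above [113] checks in real time that the printed layer keeps its nominal cross-section (3.81 cm high, 2.54 cm wide). A minimal sketch of such a width check on a binarised side-view image is given below; the pixel scale, tolerance, and synthetic mask are illustrative assumptions, with the nominal width being the only value taken from the row above.

```python
import numpy as np

def layer_widths(mask, px_per_cm):
    """Measured extrusion width (cm) in each image column of a binary mask."""
    return mask.sum(axis=0) / px_per_cm  # column-wise count of layer pixels

def flag_deviations(widths_cm, nominal_cm=2.54, tol=0.15):
    """Flag columns whose width deviates more than 15% from nominal."""
    return np.abs(widths_cm - nominal_cm) > tol * nominal_cm

# Synthetic layer: nominal 2.54 cm wide at 20 px/cm, with one thinned patch.
mask = np.zeros((120, 400), dtype=np.uint8)
mask[40:91, :] = 1            # ~51 px, i.e. about 2.55 cm of material
mask[60:91, 150:180] = 0      # thinned region simulating under-extrusion
widths = layer_widths(mask, px_per_cm=20.0)
print("deviating columns:", np.flatnonzero(flag_deviations(widths)))
```

In the cited system, a deviation signal of this kind is what allows the controller to correct the deposition rate online rather than discovering defects after the layer has cured.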