Article

Virtual Simulation and Experiment of Quality Inspection Robot Workstation

1 College of Aerospace Engineering, Shenyang Aerospace University, Shenyang 110136, China
2 College of Aeronautical Machinery Manufacturing, Airforce Aviation Repair Institute of Technology, Changsha 410124, China
* Authors to whom correspondence should be addressed.
Appl. Sci. 2024, 14(13), 5778; https://doi.org/10.3390/app14135778
Submission received: 29 May 2024 / Revised: 24 June 2024 / Accepted: 27 June 2024 / Published: 2 July 2024
(This article belongs to the Special Issue Cross Applications of Interactive Smart System and Virtual Reality)

Abstract

(1) Background: Quality inspection robots are widely used in automated production lines. However, their design cycles are long, iteration costs are high, and algorithm development is challenging, making effective validation difficult during the design phase. Applying virtual reality technology to simulate quality inspection robot workstations offers a new approach to addressing these issues. (2) Methods: The research creates a simulation platform for quality inspection robot workstations based on a virtual reality architecture. The platform provides an immersive operation interface for the workstation and supports testing of the inspection process, thereby validating the rationality of the workstation design. Building upon this foundation, we conducted experimental comparisons of various defect detection algorithms. (3) Results: Compared to the traditional YOLOv7 algorithm, the improved YOLOv7 algorithm achieved an 18.1% increase in recognition precision. Experimental results demonstrate that the quality inspection robot workstation simulation platform can be applied to validating workstation design proposals. (4) Conclusions: The platform helps reduce the research and development costs of quality inspection robot workstations and shortens the defect recognition algorithm development cycle.

1. Introduction

With the rapid development of industrial automation and next-generation information technology, robots have become a key research area in intelligent manufacturing [1]. Quality inspection robot workstations offer advantages such as high detection efficiency, accuracy, and flexibility, playing a significant role in improving quality, inspection efficiency, and accuracy on production lines. However, the long design cycle and high construction costs of quality inspection robot workstations make it challenging to perform effective simulation validation during the design phase.
The introduction of virtual reality presents new opportunities for industrial simulation. Virtual reality technology precisely simulates industrial production environments and processes, including machine operations and changes in physical conditions. It creates an immersive, three-dimensional virtual environment. Engineers can accurately simulate the operation process of robot workstations in this virtual environment, assisting designers in rapidly constructing product models and evaluating and optimizing the performance of robot workstations [2,3,4,5]. Applying VR technology for simulation testing is a cost-effective and efficient approach, providing an effective means for evaluating industrial quality inspection robot workstations.
Currently, virtual reality technology is widely applied in the industrial sector. Scholars have conducted research in areas such as production and manufacturing. Zhu Zhaoju [6] constructed a digital twin of a robot using the Unity3D platform. Through the use of virtual-to-real mapping techniques, the simulation achieved real-time visualization and monitoring of robot milling motion. Elisa Prati [7] introduced an integrated approach to designing human–robot collaborative workstations in industrial workshops. By leveraging virtual reality (VR) technology, this method enables designers to create interactive workstation prototypes and validate design outcomes during the early stages of development. Wei Xia [8] devised an image acquisition technique for a 6-DOF serial robot, utilizing industrial cameras. The detection system’s digital twin unit was established, and the proposed UR10 robot kinematics model and acquisition trajectory planning method were implemented. The Unity3D software facilitated the configuration of the digital twin environment for the detection system. Kamali [9] proposed a deep reinforcement learning (DRL) approach to address the motion planning problem for robot arms. This method enables the reflection of human hand movements onto the robotic arm. By constructing a virtual reality environment platform, users can smoothly control the position and orientation of the robot’s end effector through hand movements. Gustavo Caiza [10] proposed the use of immersive digital twin (DT) to implement an industrial process monitoring and control architecture for manufacturing execution systems (MES). Cyberphysical systems, MES, robotics, the Internet of Things, virtual reality, and Open Platform Communications Unified Architecture (OPC UA) communication protocols are used to integrate these technologies and enhance the capabilities of the DT by providing higher performance. 
However, research on virtual reality technology in the field of quality inspection robot workstations is relatively scarce.
To address the aforementioned issues, we propose a method that utilizes VR technology to develop a simulation platform for quality inspection robot workstations. Through this virtual simulation platform, we conduct testing of quality inspection robot workstations, validate the rationality of workstation designs, and accelerate the design, testing, and deployment process of quality inspection robot workstations.
Our key contributions are as follows:
  • Design of a virtual reality framework for quality inspection robot workstations: We develop a virtual reality framework for quality inspection robot workstations. With the virtual reality framework, we achieve accurate and robust reconstructions suitable for a variety of industrial applications.
  • Development of a virtual simulation platform: We build an immersive interactive virtual simulation platform for quality inspection robot workstations, and apply the defect detection algorithm to the virtual simulation platform for the first time.
  • Defect detection experiments in the virtual environment: We conduct experiments on different defect detection algorithms on the virtual simulation platform, and confirm the effectiveness of the improved YOLOv7 algorithm.

2. Virtual Reality Framework for Quality Inspection Robot Workstation

A quality inspection robot workstation is an operational system primarily centered around quality inspection robots. It utilizes machine vision to perform quality inspections on workpieces. The virtual simulation platform for quality inspection robot workstations can be divided into five levels: User Service, Mechanism Model, Decision Algorithm, Data Process, and Physical Device, as shown in Figure 1 [11].
Physical devices primarily include components such as robotic arms, mobile rails, vision systems, sensors, and AGVs used in quality inspection workstations. The mechanism model is constructed through geometric modeling and model rendering to create three-dimensional representations of the physical equipment. This includes kinematic models and topological structures. By simulating geometric features, motion models, and decision systems, the digital model achieves a comprehensive mapping of the physical equipment information. The User Service layer, leveraging the effective virtual-to-real mapping mechanism between physical and digital spaces, enables process simulation, solution validation, and algorithm optimization for quality inspection workstations in the virtual environment. Digital space and physical space interact through a data management system, enabling data collection, transmission, computation, and feedback for devices. Quality inspection robots capture images of the workpiece surface using cameras. The decision system invokes defect detection algorithms to analyze the quality of the workpiece surface. After processing, the results are fed back to the user service end, thus achieving the simulation-driven quality inspection robot workstation system.

3. Development of Virtual Simulation Platform

3.1. Digital Model Construction

Developing a simulation platform for quality inspection robot workstations based on virtual reality platform technology architecture starts with constructing the digital model of the quality inspection robot workstation, as shown in Figure 2 [12,13,14]. Digital model construction involves integrating physical information to map virtual units to the geometry, structure, properties, behavior, rules, logic, and entire product lifecycle of physical entities. This process includes three main components: setting up workstations, constructing geometric models, and defining physical rules for robots. Workstation scene construction involves setting up virtual scenes for production areas, inspection zones, and finished product areas by configuring scene cameras, UI elements, scripts, and lighting. This process aims to achieve a highly accurate representation of factory environments. Geometric model construction involves building the geometric model of a quality inspection robot using the 3D modeling software CATIA, ensuring that the model components have the same structure, shape, size, and assembly relationships as the actual robot. The model is then imported into 3Ds Max, a 3D rendering software, to apply high-fidelity texture materials and color attributes, enhancing the visual effect of the model. Robot physical rule construction involves loading forward and inverse kinematics models onto the three-dimensional model of the robot within the virtual simulation engine Unity3D. This process enables the robot to simulate movement in digital space, ensuring a genuine constraint relationship between the behavior of the physical entity and the established rules.

3.2. Robot Simulation Drive

In the virtual reality environment, the industrial quality inspection robot is driven by simulation. Based on robot kinematics theory, we establish the relationships among the relevant kinematic parameters and impose rotation-angle and speed constraints on each joint through scripts to simulate its physical behavior. Forward kinematics analysis of the robot structure determines the pose of the robot's end-link coordinate system relative to the fixed base coordinate system. Following the forward kinematics of the robot and its transformation principle, the D–H coordinate system of the industrial quality inspection robot is established, and the robot's mechanical structure is converted into an analytical mathematical model through the D–H coordinate system, enabling the robot to be driven within the intelligent virtual simulation platform. The established robot D–H coordinate system and its original state are shown in Figure 3.
The pose transformation between the local coordinate systems of adjacent links requires two rotational and two translational homogeneous transformations. The transformation matrix from frame {i−1} to frame {i} is expressed as:

$T_i = \mathrm{Rot}(z, \theta_i)\,\mathrm{Trans}(0, 0, d_i)\,\mathrm{Trans}(a_i, 0, 0)\,\mathrm{Rot}(x, \alpha_i)$

where $\theta_i$, $d_i$, $a_i$, and $\alpha_i$ are the link parameters of the robot: $\theta_i$ is the angle between axis $X_{i-1}$ and axis $X_i$ about axis $Z_{i-1}$; $d_i$ is the signed distance between axis $X_{i-1}$ and axis $X_i$ along axis $Z_{i-1}$; $a_i$ is the signed distance between axis $Z_{i-1}$ and axis $Z_i$ along axis $X_i$; and $\alpha_i$ is the angle between axis $Z_{i-1}$ and axis $Z_i$ about axis $X_i$. Multiplying the adjacent link transformation matrices in sequence yields the coordinate transformation from the robot's base coordinate system to the end-effector coordinate system, as shown in Equation (2):

${}^{0}T_6 = T_1(\theta_1)\, T_2(\theta_2)\, T_3(\theta_3)\, T_4(\theta_4)\, T_5(\theta_5)\, T_6(\theta_6)$
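The chained D–H transformation above can be sketched in plain Python. The D–H parameter table below is a placeholder for illustration only, not the parameters of the robot in the paper:

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Homogeneous transform of one link: Rot(z,theta)·Trans(0,0,d)·Trans(a,0,0)·Rot(x,alpha)."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(A, B):
    """4x4 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)] for i in range(4)]

def forward_kinematics(joint_angles, dh_params):
    """Chain the link transforms: T = T1(θ1)·T2(θ2)·...·Tn(θn)."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]  # 4x4 identity
    for theta, (d, a, alpha) in zip(joint_angles, dh_params):
        T = mat_mul(T, dh_matrix(theta, d, a, alpha))
    return T

# Hypothetical (d, a, alpha) per joint -- placeholder values for a 6-DOF arm.
DH = [(0.1, 0.0, math.pi / 2), (0.0, 0.4, 0.0), (0.0, 0.3, 0.0),
      (0.2, 0.0, math.pi / 2), (0.0, 0.0, -math.pi / 2), (0.1, 0.0, 0.0)]
pose = forward_kinematics([0.0] * 6, DH)  # base-to-end-effector transform at the zero pose
```

In Unity, the same per-joint angles would then be written to the corresponding `ArticulationBody` drive targets each frame.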
The industrial quality inspection robot is driven by six independent servo motors, organized hierarchically from the fixed base according to its six degrees of freedom [15,16]. In the Unity3D Hierarchy panel, the virtual reality model is divided into Base, Shoulder, Elbow, Wrist1, Wrist2, and Wrist3 levels. After completing this hierarchical division of the equipment structure, joint entities are added to the robot's joints and links: in the Hierarchy attribute panel, the "Articulation Body" component is assigned to the robot, adding rigid-body and collision properties to the joints and links. In a six-axis robot, each joint rotates around a single axis, so the Shoulder, Elbow, Wrist1, Wrist2, and Wrist3 parts are set to "Revolute" rotation constraints, allowing them to rotate around the parent anchor point's axis. After assigning the kinematic and controller attributes, the robot's motion control is driven by a C# script, ultimately achieving the simulation drive of an industrial inspection robot on a virtual reality platform.

3.3. Human-Computer Interaction System Development

In the virtual reality environment, the user interacts with the quality inspection robot in various forms and controls the robot to carry out quality inspection tasks [17,18,19]. By adding ‘Button’ and ‘Slider’ components to the Canvas panel, users can operate the industrial quality inspection robot through virtual buttons, sliders, and other controls. This includes adjusting the robot’s running speed, selecting inspection items, and viewing inspection results. By importing plugins such as the SteamVR Plugin SDK, users can control the system using handheld controllers. By adding an Audio Source component to set up a real-time feedback system, users can obtain information such as the status of the robot, inspection progress, and results based on auditory feedback. In the VR environment, users can also inspect quality assessment results and closely examine defect details, including specific information about non-conforming products, defect types, and their locations.

4. Workflow Simulation Testing

4.1. Experiment Environment Setting

The simulation testing equipment primarily includes the Vive Pro head-mounted device, Vive controllers, base stations, the quality inspection robot workstation simulation platform, and a computer (with Windows 11, AMD Ryzen 5 5600 G CPU, NVIDIA GeForce RTX 3060Ti GPU, and 16 GB RAM). When using virtual reality devices, it is necessary to set up a virtual interaction area of at least 2 m × 1.5 m. Each base station has a 120° field of view, ensuring it tilts downward between 30° and 45° to fully cover the designated virtual interaction area.

4.2. Quality Inspection Robot Workstation Test Process

The simulation tests are carried out according to the operation process of quality inspection robot workstation in real work. The workflow can be broken down into the following steps: First, the Automated Guided Vehicle (AGV) picks up parts from the inspection area using an automated loading and unloading system. The AGV then transports these picked parts to the quality inspection robot. At the quality inspection robot workstation, the detection robot utilizes a high-resolution camera and sophisticated image processing algorithms to perform defect inspection on the parts. Once the inspection is complete, the AGV separately transports the qualified parts and the non-conforming parts to their respective locations. Qualified parts are sent to the finished product area, while non-conforming parts are routed to a specialized processing area for further handling. During the inspection process, data such as part inspection results and inspection time are automatically stored and uploaded to the Manufacturing Execution System (MES).
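The loading, inspection, sorting, and MES-upload loop described above can be sketched as a simple routing function. The part identifiers, destination names, and toy detector below are illustrative assumptions, not artifacts of the actual platform:

```python
def route_part(inspection_result):
    """Route a part after inspection: defect-free parts go to the finished-product
    area, non-conforming parts to the specialized processing area."""
    return "finished_product_area" if inspection_result["defects"] == [] else "processing_area"

def run_workstation(parts, detect):
    """Simulate one pass of the workflow: each part is inspected by the detector,
    the result is logged (standing in for the MES upload), and the part is
    assigned the destination that the AGV would deliver it to."""
    mes_log = []
    for part in parts:
        defects = detect(part)                      # camera + defect-detection algorithm
        result = {"part": part, "defects": defects}
        result["destination"] = route_part(result)
        mes_log.append(result)                      # uploaded to MES in the real system
    return mes_log

# Toy detector standing in for the vision model: flags one part with a defect.
log = run_workstation(["p1", "p2"], lambda p: ["oil_spot"] if p == "p1" else [])
```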

4.3. Test Results and Analysis

In this experiment, the workflow of the quality inspection robot workstation was verified in a virtual environment, and the simulation results are shown in Figure 4.
From the simulation results, the quality inspection robot is capable of defect detection on the parts carried by the AGV. After identifying the defect type, the data are uploaded to the Manufacturing Execution System (MES). The AGV receives this data and transports the parts to their respective units, achieving the expected outcome. Through experimentation, it has been verified that the quality inspection robot workstation provides a favorable human–computer interaction environment, and the workflow operates seamlessly. The design proposal for the inspection robot workstation is feasible.

5. Defect Detection Experiments in the Virtual Environment

5.1. Defect Detection Algorithm Model

The industrial quality inspection robot utilizes machine vision to recognize defect types on workpieces and employs the improved YOLOv7 algorithm for part defect detection. YOLO-series object detection algorithms are widely used in computer vision for their efficiency and real-time performance. YOLOv7 is the most advanced algorithm in the YOLO series, improving on both accuracy and speed [20,21]. For defect detection, however, YOLOv7 still struggles when the detected targets are small and their features are not distinctive. Building on YOLOv7, the improved algorithm incorporates the Convolutional Block Attention Module (CBAM) into the feature fusion network. CBAM combines spatial attention and channel attention mechanisms, progressively refining feature maps so that important features are emphasized while less relevant information is suppressed. The model architecture is illustrated in Figure 5. The improved YOLOv7 algorithm effectively addresses the small defect areas on metal surfaces and the significant variation in their distribution across images, greatly improving the accuracy of defect detection.
After incorporating the Convolutional Block Attention Module (CBAM) into the feature fusion network of the improved YOLOv7, the structure of the defect detection algorithm is depicted in Figure 6.
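To make the attention idea concrete, here is a minimal pure-Python sketch of the channel-attention half of CBAM. It is deliberately simplified: the learned shared MLP of real CBAM is omitted, and the pooled statistics feed the sigmoid directly; the feature map values are made up:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def channel_attention(feature_map):
    """Simplified CBAM-style channel attention: pool each channel globally
    (average and max), turn the pooled statistic into a per-channel weight in
    (0, 1) via a sigmoid, then rescale the channel. `feature_map` is a list of
    C channels, each an HxW nested list of floats."""
    out = []
    for channel in feature_map:
        h, w = len(channel), len(channel[0])
        avg = sum(sum(row) for row in channel) / (h * w)
        mx = max(max(row) for row in channel)
        weight = sigmoid(avg + mx)   # real CBAM: sigmoid(MLP(avg_pool) + MLP(max_pool))
        out.append([[v * weight for v in row] for row in channel])
    return out

# Two toy 2x2 channels: one informative, one all-zero.
fmap = [[[1.0, 2.0], [3.0, 4.0]], [[0.0, 0.0], [0.0, 0.0]]]
attended = channel_attention(fmap)
```

The informative channel keeps nearly its full magnitude (its weight saturates near 1), while a flat channel is damped, which is the "emphasize important features" behavior described above.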

5.2. Experiment’s Data Processing

The defect detection experiment utilized the GC10-DET dataset as the data source for algorithm testing. GC10-DET is a collection of metallic surface defect data gathered from real-world industrial scenarios [22]. It encompasses four types of surface defects: oil spots, water spots, welding lines, and inclusions, as illustrated in Figure 7. The dataset comprises 1584 grayscale images, randomly divided into training, validation, and test sets in an 8:1:1 ratio.
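The 8:1:1 random split described above can be sketched as follows. The image file names are hypothetical placeholders; only the dataset size (1584 images) comes from the text:

```python
import random

def split_dataset(items, ratios=(0.8, 0.1, 0.1), seed=42):
    """Shuffle and split a list into train/val/test subsets by the given ratios."""
    assert abs(sum(ratios) - 1.0) < 1e-9
    items = list(items)
    random.Random(seed).shuffle(items)  # fixed seed for a reproducible split
    n = len(items)
    n_train = int(n * ratios[0])
    n_val = int(n * ratios[1])
    return items[:n_train], items[n_train:n_train + n_val], items[n_train + n_val:]

# GC10-DET contains 1584 grayscale images; the names below are placeholders.
images = [f"img_{i:04d}.jpg" for i in range(1584)]
train, val, test = split_dataset(images)
```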

5.3. Evaluation Indicators

To better compare the experimental performance of the improved network models on the dataset and to describe and analyze the experimental results, we select commonly used evaluation indicators in the field of object detection as the evaluation criteria for this experiment [23]. These indicators include precision (P), recall rate (R), average precision (AP), and mean average precision (mAP). The relevant calculation formulas are given below:

$P = \dfrac{TP}{TP + FP}$

$R = \dfrac{TP}{TP + FN}$

$AP = \displaystyle\int_0^1 P(R)\,dR$

$mAP = \dfrac{1}{N}\displaystyle\sum_{i=1}^{N} AP(i)$
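The metric definitions above translate directly into code. The AP integral is approximated here by the trapezoidal rule over sampled (recall, precision) points; real detection toolkits use interpolated variants, so this is a sketch of the definition rather than any benchmark's exact protocol:

```python
def precision(tp, fp):
    """P = TP / (TP + FP)."""
    return tp / (tp + fp) if tp + fp else 0.0

def recall(tp, fn):
    """R = TP / (TP + FN)."""
    return tp / (tp + fn) if tp + fn else 0.0

def average_precision(recalls, precisions):
    """AP = ∫₀¹ P(R) dR, approximated by the trapezoidal rule over sampled
    (recall, precision) points sorted by increasing recall."""
    ap = 0.0
    for i in range(1, len(recalls)):
        ap += (recalls[i] - recalls[i - 1]) * (precisions[i] + precisions[i - 1]) / 2.0
    return ap

def mean_average_precision(aps):
    """mAP = mean of the per-class AP values."""
    return sum(aps) / len(aps)
```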

5.4. Model Training and Deployment

Using the PyTorch 1.13.1 + CUDA 11.7 deep learning framework, we train the algorithmic model and subsequently deploy it to the quality inspection robot workstation's virtual simulation platform, where we conduct defect detection experiments. First, we export the model's weight file (.weights) and configuration file (.cfg) to the ONNX format. Next, in Unity3D, we utilize the Barracuda library from ML-Agents to load the ONNX model. Through C# scripts, we load the model, process input images captured by the camera, and invoke the model for inference. The output of the YOLO model includes detected object bounding boxes, class labels, and confidence scores. Further scripts parse these outputs and transform the detection results into visual elements within the Unity scene, creating UI elements that display the detection boxes and category labels and thereby enabling defect detection for workpieces.
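The output-parsing step can be sketched in Python. The row layout `[x, y, w, h, confidence, class_id]` and the class list are assumptions for illustration; the actual tensor layout depends on the exported model, and in the platform itself this logic lives in the C# scripts that feed the Unity UI:

```python
# Defect classes of the GC10-DET subset used in the experiment.
CLASS_NAMES = ["oil_spot", "water_spot", "welding_line", "inclusion"]

def parse_detections(raw, conf_threshold=0.5):
    """Turn raw detector rows [x, y, w, h, confidence, class_id] into labelled
    boxes, discarding detections below the confidence threshold."""
    detections = []
    for x, y, w, h, conf, cls in raw:
        if conf >= conf_threshold:
            detections.append({
                "box": (x, y, w, h),
                "label": CLASS_NAMES[int(cls)],
                "confidence": conf,
            })
    return detections

# Two hypothetical raw rows: one confident detection, one below threshold.
raw = [
    [120, 80, 40, 30, 0.91, 0],
    [300, 200, 25, 25, 0.32, 2],
]
dets = parse_detections(raw)
```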

5.5. Experimental Results and Analysis

When conducting defect detection experiments for parts on the quality inspection robot workstation’s virtual simulation platform, we utilize machine vision to identify part defects. Under the same dataset split, we compare the currently popular YOLO series algorithms with the proposed improved YOLOv7 algorithm in this study. These comparative algorithms include YOLOv5s, YOLOv5m, YOLOv5l, YOLOv5x, YOLOv7, and YOLOv7-X. The comparative experimental results are shown in Table 1, with the optimal values highlighted in bold.
From the experimental results in Table 1, the precision (P) of the YOLOv7-X algorithm is the highest, reaching 73.1%. The improved YOLOv7 algorithm achieves the highest recall rate (R) and mean average precision (mAP), at 72.7% and 73.3%, respectively. Compared with the YOLOv7 algorithm, the precision (P) of the improved YOLOv7 algorithm increased by 18.1%, the recall rate (R) improved by 3.6%, and the mean average precision (mAP) increased by 1.4%. The test results in Figure 8 show that the robot can accurately detect surface defects, with a high precision rate across the various defect types. The proposed method is superior to the baseline YOLOv7 algorithm in precision, recall rate, and mAP, demonstrating the effectiveness of the improved YOLOv7 algorithm for defect detection.

6. Conclusions

In response to the high development costs, lengthy design cycles, and challenges in validating algorithm effectiveness for quality inspection robot workstations, we developed a virtual simulation platform for quality inspection robot workstations based on virtual reality technology. Within this virtual environment, we rapidly constructed product models, performed performance analysis, and optimized the system. We created virtual simulation work scenarios, designed an improved YOLOv7 algorithm, and tested the quality inspection robot workstation workflow within the virtual simulation environment. Building upon this foundation, we conducted defect recognition experiments for the quality inspection robot using virtual reality technology. By comparing experimental results, we demonstrated the effectiveness of the improved YOLOv7 algorithm. Compared to the YOLOv7 model, the precision (P) has increased by 18.1%, the recall rate (R) has improved by 3.6%, and the mean average precision (mAP) has increased by 1.4%. Through experiments, we validated the feasibility of the quality inspection robot workstation design. Modeling, simulating, and optimizing complex systems of quality inspection robot workstations using virtual reality technology holds significant potential for industrial production.
Our future work includes lightweight improvements to the object detection algorithm. We also plan to improve the model fidelity of the industrial quality inspection robot between the VR interface and the physical systems.

Author Contributions

Conceptualization, Z.L.; methodology, W.Z.; software, H.N.; data curation, Y.L.; writing—original draft preparation, D.W.; writing—review and editing, J.Q. All authors have read and agreed to the published version of the manuscript.

Funding

This paper was supported by Industry-academia cooperation and collaborative education project of Chinese Ministry of Education under grant No. 202101078067 and Shenyang Science and Technology Plan Project under grant No. 23407330.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding authors.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Zhang, Q.; Xiao, R.; Liu, Z.; Duan, J.; Qin, J. Process Simulation and Optimization of Arc Welding Robot Workstation Based on Digital Twin. Machines 2023, 11, 53. [Google Scholar] [CrossRef]
  2. Li, X.; Fan, D.; Deng, Y.; Lei, Y.; Omalley, O. Sensor fusion-based virtual reality for enhanced physical training. Robot. Intell. Autom. 2024, 44, 48–67. [Google Scholar] [CrossRef]
  3. Liu, Z.; Wang, D.; Wang, H.; Qi, M.; Zhu, W.; Sun, Y. Design and Development of Aircraft Final Assembly Pulsating Production Line Based on Virtual Reality Technology. Aviat. Precis. Manuf. Technol. 2024, 60, 1–4. [Google Scholar] [CrossRef]
  4. Yang, L.; Han, C.; Liu, C.; He, X. Vision-Based Virtual Simulation Platform for Planetary Rovers. J. Aerosp. Inf. Syst. 2023, 20, 849–858. [Google Scholar] [CrossRef]
  5. Zhou, T.; Zhu, Q.; Du, J. Intuitive robot teleoperation for civil engineering operations with virtual reality and deep learning scene reconstruction. Adv. Eng. Inform. 2020, 46, 101170. [Google Scholar] [CrossRef]
  6. Zhu, Z.; Lin, Z.; Huang, J.; Zheng, L.; He, B. A digital twin-based machining motion simulation and visualization monitoring system for milling robot. Int. J. Adv. Manuf. Technol. 2023, 127, 4387–4399. [Google Scholar] [CrossRef]
  7. Prati, E.; Villani, V.; Peruzzini, M.; Sabattini, L. An Approach Based on VR to Design Industrial Human-Robot Collaborative Workstations. Appl. Sci. 2021, 11, 11773. [Google Scholar] [CrossRef]
  8. Xia, W.; Liu, X.; Yue, C.; Li, H.; Li, R.; Wei, X. Tool Wear Image On-Machine Detection Based on Trajectory Planning of 6-DOF Serial Robot Driven by Digital Twin. Int. J. Adv. Manuf. Technol. 2023, 125, 3761–3775. [Google Scholar] [CrossRef]
  9. Kamali, K.; Bonev, I.A.; Desrosiers, C. Real-time Motion Planning for Robotic Teleoperation Using Dynamic-goal Deep Reinforcement Learning. In Proceedings of the 2020 17th Conference on Computer and Robot Vision (CRV), Ottawa, ON, Canada, 13–15 May 2020; pp. 182–189. [Google Scholar] [CrossRef]
  10. Caiza, G.; Sanz, R. An Immersive Digital Twin Applied to a Manufacturing Execution System for the Monitoring and Control of Industry 4.0 Processes. Appl. Sci. 2024, 14, 4125. [Google Scholar] [CrossRef]
  11. Yun, H.; Jun, M.B. Immersive and Interactive Cyber-Physical System (I2CPS) and Virtual Reality Interface for Human Involved Robotic Manufacturing. J. Manuf. Syst. 2022, 62, 234–248. [Google Scholar] [CrossRef]
  12. Zhu, K.; Zong, Q.; Zhang, R. Real-time Virtual Simulation Platform for Multi-UVA hunting target using Deep Reinforcement Learning. In Proceedings of the 2021 40th Chinese Control Conference (CCC), Shanghai, China, 26–28 July 2021; pp. 4978–4983. [Google Scholar]
  13. Zhang, X.; Yin, Y.; Wan, F. Underwater Manipulation Training Simulation System for Manned DeepSubmarine Vehicle. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1271–1272. [Google Scholar]
  14. Wang, J. Design and Research of Robot Simulation based on Virtual Reality. In Proceedings of the 2022 IEEE 4th International Conference on Power, Intelligent Computing and Systems (ICPICS), Shenyang, China, 29–31 July 2022; pp. 516–520. [Google Scholar]
  15. Yang, M.; Liu, J. Research on Six-Degree-of-Freedom Refueling Robotic Arm Positioning and Docking Based on RGB-D Visual Guidance. Appl. Sci. 2024, 14, 4904. [Google Scholar] [CrossRef]
  16. Lettera, G.; Natale, C. An Integrated Architecture for Robotic Assembly and Inspection of a Composite Fuselage Panel with an Industry 5.0 Perspective. Machines 2024, 12, 103. [Google Scholar] [CrossRef]
  17. Lan, G.; Lai, Q.; Bai, B.; Zhao, Z.; Hao, Q. A Virtual Reality Training System for Automotive Engines Assembly and Disassembly. IEEE Trans. Learn. Technol. 2024, 17, 754–764. [Google Scholar] [CrossRef]
  18. Dalmasso, V.; Moretti, M.; De’sperati, C. Quasi-3D: Reducing convergence effort improves visual comfort of head-mounted stereoscopic displays. Virtual Real. 2024, 28, 49. [Google Scholar] [CrossRef]
  19. Zhang, Z.; Ji, Y.; Tang, D.; Chen, J.; Liu, C. Enabling collaborative assembly between humans and robots using a digital twin system. Robot. Comput. Manuf. 2024, 86, 102691. [Google Scholar] [CrossRef]
  20. Wang, C.Y.; Bochkovskiy, A.; Liao, H.Y.M. YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Vancouver, BC, Canada, 18–22 June 2023; pp. 7464–7475. [Google Scholar]
  21. Wang, H.; Wang, F.; Gong, X.; Zhu, D.; Wang, R.; Wang, P. Detection of Small Targets in Photovoltaic Cell Defect Polarization Imaging Based on Improved YOLOv7. Appl. Sci. 2024, 14, 3899. [Google Scholar] [CrossRef]
  22. Lv, X.; Duan, F.; Jiang, J.-J.; Fu, X.; Gan, L. Deep Metallic Surface Defect Detection: The New Benchmark and Detection Network. Sensors 2020, 20, 1562. [Google Scholar] [CrossRef] [PubMed]
  23. Lei, L.; Sun, S.; Zhang, Y.; Liu, H.; Xie, H. Segmented Embedded Rapid Defect Detection Method for Bearing Surface Defects. Machines 2021, 9, 40. [Google Scholar] [CrossRef]
Figure 1. The system architecture of the virtual simulation platform.
Figure 2. Digital model construction.
Figure 3. Robot D–H coordinate system and original state diagram.
Figure 4. Quality inspection robot workstation workflow simulation.
Figure 5. CBAM model structure.
Figure 6. Improved YOLOv7 algorithm structure.
Figure 7. GC10-DET dataset.
Figure 8. Quality inspection robot workstation detection result.
Table 1. Comparison of detection performance with different models (best value in each column marked in the original).

Method      P (%)   R (%)   mAP (%)
YOLOv5s     51.2    67.6    65.6
YOLOv5m     55.8    69.6    67.9
YOLOv5l     61.5    59.2    60.8
YOLOv5x     60.6    63.4    68.0
YOLOv7      59.8    70.9    71.9
YOLOv7-X    73.1    50.1    65.7
Ours        70.6    72.7    73.3
Share and Cite

Liu, Z.; Wang, D.; Li, Y.; Zhu, W.; Ni, H.; Qi, J. Virtual Simulation and Experiment of Quality Inspection Robot Workstation. Appl. Sci. 2024, 14, 5778. https://doi.org/10.3390/app14135778