MomentumNet-CD: Real-Time Collision Detection for Industrial Robots Based on Momentum Observer with Optimized BP Neural Network
Abstract
1. Introduction
1.1. Related Work
1.2. Motivation
1.3. Main Contributions
- A complete data acquisition system integrating both hardware and software is designed. The hardware architecture consists of five key components: the MR-J2S-70A servo driver (Mitsubishi Electric, Tokyo, Japan), the Q8-USB data acquisition card (Quanser, Markham, ON, Canada), the ATI six-dimensional F/T sensor (ATI Industrial Automation, Apex, NC, USA), the ISO-U2-P1-F8 isolation transmitter (Shenzhen Sunyuan Technology Co., Ltd., Shenzhen, China), and a metal-oxide-semiconductor (MOS) tube-triggered switch module (Elecrow, Shenzhen, China). The software is built on MATLAB/Simulink R2022b and the QUARC 2.7 real-time control development environment, which enables high-frequency real-time acquisition and provides a reliable hardware foundation for the subsequent neural network training and collision detection experiments.
- A novel collision detection method is proposed: collision state features are first obtained through a momentum observer, a robust observation model is then constructed using the Mahalanobis distance, and the extracted features are finally fed into a three-layer BP neural network optimized by the Levenberg–Marquardt (LM) algorithm to achieve high-precision collision identification.
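As a rough illustration of the Mahalanobis-distance stage of this pipeline, the sketch below scores observer residuals against statistics estimated from collision-free motion. All names (`fit_reference`, `mahalanobis`, the synthetic residuals) are illustrative stand-ins, not the paper's implementation:

```python
import numpy as np

def fit_reference(residuals):
    """Estimate mean and inverse covariance of collision-free observer residuals."""
    mu = residuals.mean(axis=0)
    cov = np.cov(residuals, rowvar=False)
    return mu, np.linalg.inv(cov)

def mahalanobis(x, mu, cov_inv):
    """Mahalanobis distance of one residual sample from the reference statistics."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

# Synthetic collision-free residuals (two observer channels, std ~0.1).
rng = np.random.default_rng(0)
free = rng.normal(0.0, 0.1, size=(1000, 2))
mu, cov_inv = fit_reference(free)

print(mahalanobis(np.array([0.0, 0.0]), mu, cov_inv))  # small: consistent with free motion
print(mahalanobis(np.array([1.0, 1.0]), mu, cov_inv))  # large: candidate collision feature
```

A distance threshold (or, as in the paper, a downstream classifier fed with such features) then separates collision from non-collision states.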
1.4. Organization
2. CollisionSense DAQ System Design
2.1. Hardware Integration Architecture
2.2. QUARC-Based Control System
2.3. Sampling Frequency Determination
3. Modeling and Characterization of Collision State Systems
3.1. Dynamics Modeling
3.2. Collision Feature Extraction
3.3. Collision State Assessment Based on Mahalanobis Distance
4. Neural Network Detection Optimization and Implementation
4.1. Network Architecture Design
4.2. LM Algorithm Optimization
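This section's heading names LM optimization of the BP network. As a hedged, self-contained sketch of the LM update itself, damped Gauss–Newton steps θ ← θ − (JᵀJ + λI)⁻¹Jᵀr with adaptive damping λ, the code below fits a two-parameter curve rather than the full network; all function names are illustrative:

```python
import numpy as np

def residuals(theta, x, y):
    """Residual vector r(theta) for a toy model y = a * exp(b * x)."""
    a, b = theta
    return y - a * np.exp(b * x)

def jacobian(theta, x, y, eps=1e-6):
    """Forward-difference Jacobian of the residual vector w.r.t. theta."""
    J = np.zeros((len(x), len(theta)))
    r0 = residuals(theta, x, y)
    for j in range(len(theta)):
        t = theta.copy()
        t[j] += eps
        J[:, j] = (residuals(t, x, y) - r0) / eps
    return J

def lm_fit(x, y, theta, lam=1e-2, iters=50):
    """Levenberg–Marquardt: damped Gauss–Newton with accept/reject damping."""
    for _ in range(iters):
        r = residuals(theta, x, y)
        J = jacobian(theta, x, y)
        step = np.linalg.solve(J.T @ J + lam * np.eye(len(theta)), J.T @ r)
        new = theta - step
        if np.sum(residuals(new, x, y) ** 2) < np.sum(r ** 2):
            theta, lam = new, lam * 0.5   # step reduced the error: trust the model more
        else:
            lam *= 2.0                    # step failed: increase damping (gradient-like)
    return theta

x = np.linspace(0.0, 1.0, 50)
y = 2.0 * np.exp(1.5 * x)                 # noise-free data with true params [2.0, 1.5]
print(lm_fit(x, y, np.array([1.0, 1.0])))  # ≈ [2.0, 1.5]
```

Training the BP network replaces the two-parameter θ with the full weight vector and the toy residuals with per-sample output errors; the update rule is otherwise the same.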
5. Experimental Research
5.1. Data Acquisition
Algorithm 1 CollisionSense DAQ System
Require: Q8-USB card, QUARC environment, sampling frequency, Ntotal (total samples), ratio (collision sample ratio)
Ensure: Complete dataset containing collision and non-collision states
1: function Data_Acquisition()
2:   Initialize QUARC environment
3:   Configure Q8-USB card channels:
4:     analog_inputs ← [0:7]
5:     encoder ← 0
6:     analog_output ← 0
7:     digital_output ← 0
8:   Set up hardware components:
9:     servo_driver ← Setup_MR_J2S_70A(position_control, 131,072 kpps)
10:    ft_sensor ← Setup_ATI_force_sensor()
11:    iso_transmitter ← Setup_isolation_transmitter(0–10 V, 0–2.5 kHz)
12:   Create_Simulink_model()
13:   non_collision_count ← 0
14:   collision_count ← 0
15:   non_collision_target ← Ntotal × (1 − ratio)
16:   collision_target ← Ntotal × ratio
17:   Set_collision_mode(false)
18:   Start_data_acquisition()
19:   while non_collision_count < non_collision_target do
20:     position ← Read_encoder_position()
21:     velocity ← Read_encoder_velocity()
22:     current ← Read_analog_input([0,1])
23:     hysteresis ← Read_hysteresis_pulse()
24:     Store_non_collision_data(position, velocity, current, hysteresis)
25:     non_collision_count ← non_collision_count + 1
26:   end while
27:   Set_collision_mode(true)
28:   while collision_count < collision_target do
29:     position ← Read_encoder_position()
30:     velocity ← Read_encoder_velocity()
31:     current ← Read_analog_input([0,1])
32:     hysteresis ← Read_hysteresis_pulse()
33:     ft_data ← Read_analog_input([2–7])
34:     Store_collision_data(position, velocity, current, hysteresis, ft_data)
35:     collision_count ← collision_count + 1
36:   end while
37:   Stop_data_acquisition()
38:   Save_dataset('data.mat')
39:   return dataset
40: end function
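The two-phase sampling loop above can be mirrored in plain Python as follows. Hardware reads are stubbed with random numbers, and every function name here is a hypothetical stand-in for the QUARC/Simulink calls, not an actual API:

```python
import random

def read_joint_state():
    """Stub for the encoder/analog reads: position, velocity, current, hysteresis."""
    return [random.random() for _ in range(4)]

def acquire(n_total, ratio):
    """Collect non-collision samples first, then collision samples with 6-axis F/T data."""
    non_collision = [read_joint_state()
                     for _ in range(int(n_total * (1 - ratio)))]
    collision = [read_joint_state() + [random.random() for _ in range(6)]
                 for _ in range(int(n_total * ratio))]
    return {"non_collision": non_collision, "collision": collision}

data = acquire(n_total=1000, ratio=0.1)
print(len(data["non_collision"]), len(data["collision"]))  # 900 100
```

Collecting the F/T channels only in the collision phase mirrors the pseudocode, where the force sensor serves as ground-truth reference for collision samples.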
5.2. Model Training
5.3. Experimental Validation
5.4. Comparison with Existing Method
6. Conclusions and Future Work
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Pinout | Location and Pin Number | Function | Output Range
---|---|---|---
MO1 | CN3 4 | Analog monitor voltage output (parameter No. 17) between MO1 and LG | 0–8 V
MO2 | CN3 14 | Analog monitor voltage output (parameter No. 17) between MO2 and LG | 0–8 V
LA/LAR | CN1A 6/16 | Encoder A-phase pulses (differential drive) | -
LB/LBR | CN1A 7/17 | Encoder B-phase pulses (differential drive) | -
LZ/LZR | CN1A 5/15 | Encoder Z-phase pulses (differential drive) | -
LG | CN1A 1 | Common ground for output interface | -
Function | Goal | Channel Selection | Setting Range
---|---|---|---
Configurable-range 16-bit analog input (ADC) | Collect the joint motor current information output by the AC servo driver and the joint motor detent pulse signal. | [0,1] | 0–10 V
 | Collect the voltages of the six channels of the ATI six-dimensional F/T sensor. | [2–7] | 0–10 V
Configurable-range 16-bit digital-to-analog converter (DAC) | Input to the analog-to-pulse-frequency conversion module. | [0] | 0–10 V
Digital output interface | Control the on/off switching of the MOS field-effect transistor (FET) trigger switch module. | [0] | -
Single-ended encoder input | Collect the position and speed of the joint motor. | [0] | -
Component | Function | Use in the System |
---|---|---|
MR-J2S-70A servo driver | Control of robot joints, processing of different signals through three connectors (CN1A/B, CN3, CN2). | Output motor current, hysteresis pulse information, and differential pulse signals to provide joint position and velocity data. |
Q8-USB data acquisition card | Provides multiple inputs and outputs, including 8 channels of 16-bit ADC/DAC and encoder inputs. | Acquisition of joint information, sensor data, and control of joint motors via analog outputs. |
ATI six-dimensional F/T sensor | Measurement of external forces using silicon strain-gauge technology, with a safety factor of up to 4080% and a transmission rate of 28.5 kHz. | Connected to the Q8-USB card via analog outputs to provide collision force reference data.
ISO-U2-P1-F8 isolation transmitter | Converts analog DC voltage signals into digital pulse frequency signals for triple isolation. | Receives Q8-USB card output, converts and controls robot position. |
MOS tube-triggered switch module | Provides high current output (10 A, up to 15 A with heat sink) and handles PWM signals up to 2.5 kHz. | Controlled by the digital output of the Q8-USB card as a switching interface between system components. |
Name | Function
---|---
HIL Initialize | Initializes acquisition card parameters to associate the system with a specific HIL card (the Q8-USB card).
HIL Read Analog | When executed, reads the analog signal of the specified channel.
HIL Read Other | When executed, reads the encoder pulse signal of the specified channel.
HIL Write Analog | When executed, writes the corresponding analog signal to the specified channel.
To Workspace | Writes input signal data to the workspace.
Joint Velocity | NC (Non-Collision Samples) | CC (Correctly Detected Collisions) | FN (False Negatives) | FP (False Positives) | Accuracy/%
---|---|---|---|---|---
10°/s | 900 | 848 | 52 | 0 | 94.26 |
12°/s | 900 | 847 | 53 | 0 | 94.21 |
15°/s | 900 | 844 | 56 | 0 | 93.85 |
20°/s | 900 | 840 | 60 | 1 | 93.34 |
25°/s | 900 | 833 | 67 | 1 | 92.57 |
Total | 4500 | 4212 | 288 | 2 | 93.65 |
Method | DD (Detection Delay) | FP (False Positives)
---|---|---
MomentumNet-CD | 12.16 ms | 0 |
CollisionNet | 20.75 ms | 2 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ye, J.; Fan, Y.; Kang, Q.; Liu, X.; Wu, H.; Zheng, G. MomentumNet-CD: Real-Time Collision Detection for Industrial Robots Based on Momentum Observer with Optimized BP Neural Network. Machines 2025, 13, 334. https://doi.org/10.3390/machines13040334