Article

An Augmented Reality Based Human-Robot Interaction Interface Using Kalman Filter Sensor Fusion

1 ASTUTE 2020 (Advanced Sustainable Manufacturing Technologies), Swansea University, Swansea SA1 8EN, UK
2 School of Automation and Electrical Engineering, Qingdao University of Science and Technology, Qingdao 266042, China
3 Department of Electrical Power and Machines, Helwan University, Helwan 11795, Egypt
* Author to whom correspondence should be addressed.
Sensors 2019, 19(20), 4586; https://doi.org/10.3390/s19204586
Submission received: 20 August 2019 / Revised: 30 September 2019 / Accepted: 17 October 2019 / Published: 22 October 2019
(This article belongs to the Section Intelligent Sensors)

Abstract

This paper presents an Augmented Reality (AR) based interface for the control and adjustment of robots, with the aim of making remote robot interaction and adjustment easier and more accurate. A LeapMotion sensor is used to track the movement of the operator's hands; from its data, hand gestures and the position of the palm's central point are detected and tracked. A Kinect V2 camera measures the corresponding motion velocities in the x, y, and z directions after the proposed post-processing algorithm is applied. Unreal Engine 4 is used to create an AR environment in which the user can monitor the control process immersively. A Kalman filter (KF) is employed to fuse the position signals from the LeapMotion sensor with the velocity signals from the Kinect camera. The fused estimates are then sent over the User Datagram Protocol (UDP) to teleoperate a Baxter robot in real time. Several experiments have been conducted to validate the proposed method.
Keywords: augmented reality; human-robot interaction; LeapMotion sensor; Kinect sensor; Kalman filter sensor fusion
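The fusion step described in the abstract can be illustrated with a minimal sketch: a per-axis Kalman filter whose state is [position, velocity], updated with a position measurement (as from the LeapMotion sensor) and a velocity measurement (as from the Kinect camera). This is not the authors' implementation; the state model, noise covariances, and time step below are assumed purely for illustration.

```python
import numpy as np

class PositionVelocityKF:
    """Illustrative 1-axis Kalman filter fusing a measured position
    with a measured velocity (assumed constant-velocity model)."""

    def __init__(self, dt=0.02):
        self.x = np.zeros(2)                    # state: [position, velocity]
        self.P = np.eye(2)                      # state covariance
        self.F = np.array([[1.0, dt],
                           [0.0, 1.0]])         # constant-velocity transition
        self.H = np.eye(2)                      # both states are measured directly
        self.Q = np.eye(2) * 1e-4               # process noise (assumed value)
        self.R = np.diag([1e-2, 5e-2])          # measurement noise (assumed values)

    def step(self, pos_meas, vel_meas):
        # Predict: propagate state and covariance through the motion model
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update: correct with the stacked measurement [position, velocity]
        z = np.array([pos_meas, vel_meas])
        y = z - self.H @ self.x                 # innovation
        S = self.H @ self.P @ self.H.T + self.R # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0], self.x[1]             # fused position and velocity
```

In practice one such filter would run per axis (x, y, z), and the fused position could then be serialized and sent to the robot over UDP, as the paper describes.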

Share and Cite

MDPI and ACS Style

Li, C.; Fahmy, A.; Sienz, J. An Augmented Reality Based Human-Robot Interaction Interface Using Kalman Filter Sensor Fusion. Sensors 2019, 19, 4586. https://doi.org/10.3390/s19204586

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.

