Article

SteadEye-Head—Improving MARG-Sensor Based Head Orientation Measurements Through Eye Tracking Data

Group of Sensors and Actuators, Department of Electrical Engineering and Applied Sciences, Westphalian University of Applied Sciences, 45877 Gelsenkirchen, Germany
* Author to whom correspondence should be addressed.
Sensors 2020, 20(10), 2759; https://doi.org/10.3390/s20102759
Submission received: 14 April 2020 / Revised: 9 May 2020 / Accepted: 9 May 2020 / Published: 12 May 2020
(This article belongs to the Special Issue Low-Cost Sensors and Biological Signals)

Abstract

This paper presents the use of eye tracking data in Magnetic, Angular Rate, and Gravity (MARG)-sensor based head orientation estimation. The approach presented here can be deployed in any motion measurement setup that includes MARG and eye tracking sensors (e.g., rehabilitation robotics or medical diagnostics). The challenge in these mostly indoor applications is the presence of magnetic field disturbances at the location of the MARG-sensor. In this work, eye tracking data (visual fixations) are used to enable zero orientation change updates in the MARG-sensor data fusion chain. The approach is based on a MARG-sensor data fusion filter, an online visual fixation detection algorithm, and a dynamic angular rate threshold estimation for low-latency and adaptive head motion noise parameterization. In this work, we use an adaptation of Madgwick's gradient descent filter for MARG-sensor data fusion, but the approach could be used with any other data fusion process. The presented approach does not rely on additional stationary or local environmental references and is therefore self-contained. The proposed system is benchmarked against a Qualisys motion capture system, a gold standard in human motion analysis, and reduces the heading error of the MARG-sensor data fusion by up to a factor of 0.5 (i.e., halves it) while magnetic disturbance is present.
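The core idea of the abstract — holding the heading estimate (a zero orientation change update) whenever the eye tracker reports a visual fixation and the measured head angular rate stays below a threshold — can be illustrated with a minimal sketch. This is not the authors' implementation; the function name, the simple exponentially smoothed rate threshold, and the scalar yaw state are assumptions made for illustration only.

```python
def fixation_gated_yaw_update(yaw_est, gyro_z, dt, fixation, rate_threshold):
    """Zero orientation change update gated by visual fixation (illustrative).

    If the eye tracker reports a fixation AND the head angular rate is below
    the (possibly adaptive) threshold, the head is assumed stationary and the
    heading is held, suppressing drift from magnetic disturbance. Otherwise
    the gyroscope rate is integrated as usual.
    """
    if fixation and abs(gyro_z) < rate_threshold:
        return yaw_est                 # hold heading: zero orientation change
    return yaw_est + gyro_z * dt       # normal gyroscope integration


def adapt_rate_threshold(threshold, gyro_z, fixation, alpha=0.05, margin=1.5):
    """Toy adaptive threshold: track head motion noise observed during
    fixations via exponential smoothing, with a safety margin."""
    if fixation:
        return (1.0 - alpha) * threshold + alpha * margin * abs(gyro_z)
    return threshold
```

A full implementation would apply this gating inside a quaternion-based fusion filter (e.g., the Madgwick adaptation named in the abstract) rather than on a scalar yaw angle, but the gating logic is the same.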
Keywords: data fusion; MARG; IMU; eye tracker; self-contained; head motion measurement

Share and Cite

MDPI and ACS Style

Wöhle, L.; Gebhard, M. SteadEye-Head—Improving MARG-Sensor Based Head Orientation Measurements Through Eye Tracking Data. Sensors 2020, 20, 2759. https://doi.org/10.3390/s20102759

