*Article* **Intensity/Inertial Integration-Aided Feature Tracking on Event Cameras**

**Zeyu Li <sup>1</sup>, Yong Liu <sup>2,\*</sup>, Feng Zhou <sup>1</sup> and Xiaowan Li <sup>3</sup>**


**Abstract:** Achieving efficient and accurate feature tracking on event cameras is a fundamental step for practical high-level applications such as simultaneous localization and mapping (SLAM), structure from motion (SfM), and visual odometry (VO) in GNSS (Global Navigation Satellite System)-denied environments. Although many asynchronous tracking methods relying purely on the event stream have been proposed, they suffer from high computational demand and drift. In this paper, event information is still processed in the form of synthetic event frames to better suit practical demands. Weighted fusion of multiple hypothesis testing with batch processing (WF-MHT-BP) is proposed based on a loose integration of event, intensity, and inertial information. More specifically, with inertial information acting as a prior, multiple hypothesis testing with batch processing (MHT-BP) produces coarse feature-tracking solutions on event frames in a batch-processing fashion. A weighted fusion mechanism with a time-related stochastic model then fuses the feature-tracking solutions from the event and intensity frames. Compared with other state-of-the-art feature-tracking methods on event cameras, evaluation on public datasets shows significant improvements in accuracy and efficiency and comparable performance in terms of feature-tracking length.

**Keywords:** event camera; feature tracking; intensity/inertial integration
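The weighted fusion idea in the abstract can be illustrated with a minimal sketch: two 2-D position estimates for the same feature, one from the event frames and one from the intensity frames, are combined with a weight that depends on time. The function name `fuse_tracks`, the exponential decay form, and the constant `tau` are illustrative assumptions for this sketch, not the paper's actual stochastic model.

```python
import math

def fuse_tracks(p_event, p_intensity, dt_intensity, tau=0.05):
    """Fuse two 2-D feature-position estimates with a time-related weight.

    p_event, p_intensity : (x, y) estimates from event and intensity frames.
    dt_intensity         : seconds elapsed since the last intensity frame.
    tau                  : illustrative decay constant (an assumption, not
                           a value from the paper).
    """
    # Down-weight the intensity-frame estimate as it ages; a fresh
    # intensity frame (dt = 0) dominates, a stale one is ignored.
    w_int = math.exp(-dt_intensity / tau)
    w_evt = 1.0 - w_int
    return tuple(w_evt * e + w_int * i for e, i in zip(p_event, p_intensity))
```

With `dt_intensity = 0` the fused position equals the intensity-frame estimate; as `dt_intensity` grows, the fused position converges to the event-frame estimate, which mirrors the abstract's intent of trusting each source according to its temporal freshness.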
