Article

Sensor-Fused Nighttime System for Enhanced Pedestrian Detection in ADAS and Autonomous Vehicles

College of Engineering, Kettering University, Flint, MI 48504, USA
* Author to whom correspondence should be addressed.
Sensors 2024, 24(14), 4755; https://doi.org/10.3390/s24144755
Submission received: 17 June 2024 / Revised: 5 July 2024 / Accepted: 19 July 2024 / Published: 22 July 2024
(This article belongs to the Special Issue Sensors and Sensor Fusion in Autonomous Vehicles)

Abstract

A safe nighttime environmental perception system relies on the early detection of vulnerable road users with minimal delay and high precision. This paper presents a sensor-fused nighttime environmental perception system that integrates data from thermal and RGB cameras. A new alignment algorithm is proposed to fuse the data from the two camera sensors; this alignment procedure is crucial for effective sensor fusion. To develop a robust Deep Neural Network (DNN) system, nighttime thermal and RGB images were collected under various scenarios, creating a labeled dataset of 32,000 image pairs. Using transfer learning, three fusion techniques were explored alongside two single-sensor models that use only RGB or only thermal data, yielding five DNN models in total. Experimental results show that the fused models outperform their non-fusion counterparts. The late-fusion system was selected for its optimal balance of accuracy and response time. For real-time inferencing, the best model was further optimized, achieving 33 fps on an embedded edge computing device, an 83.33% improvement in inference speed over the unoptimized system. These findings are valuable for advancing Advanced Driver Assistance Systems (ADASs) and autonomous vehicle technologies, enhancing pedestrian detection at night to improve road safety and reduce accidents.
Keywords: ADAS; nighttime object detection; sensor-fusion; image alignment; Deep Neural Network; transfer learning; embedded devices
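
The paper's own alignment algorithm and trained models are not reproduced here. As a purely illustrative sketch of the late-fusion idea named in the abstract, the Python snippet below pools pedestrian detections produced independently from aligned RGB and thermal frames and merges overlapping boxes with greedy non-maximum suppression. All names and values (late_fuse, iou_thresh = 0.5, the sample boxes) are assumptions for illustration, not the authors' implementation.

# Hedged sketch of late fusion: each modality yields (box, score)
# candidates independently; the fused output pools both candidate
# sets and applies greedy non-maximum suppression (NMS).
# The detections below are placeholders -- the paper's actual DNNs,
# alignment step, and decision logic are not reproduced here.
from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2)

def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def late_fuse(rgb_dets: List[Tuple[Box, float]],
              thermal_dets: List[Tuple[Box, float]],
              iou_thresh: float = 0.5) -> List[Tuple[Box, float]]:
    """Pool detections from both sensors; keep the highest-scoring
    box in each overlapping cluster (greedy NMS)."""
    pooled = sorted(rgb_dets + thermal_dets, key=lambda d: d[1], reverse=True)
    kept: List[Tuple[Box, float]] = []
    for box, score in pooled:
        if all(iou(box, k) < iou_thresh for k, _ in kept):
            kept.append((box, score))
    return kept

if __name__ == "__main__":
    # Hypothetical detections on one aligned RGB/thermal frame pair.
    rgb = [((100, 80, 140, 200), 0.62)]
    thermal = [((102, 82, 141, 198), 0.91), ((300, 90, 330, 190), 0.75)]
    for box, score in late_fuse(rgb, thermal):
        print(box, round(score, 2))

Per the abstract, late fusion was preferred for its balance of accuracy and response time; score-level weighting or learned fusion heads are common alternatives to the simple NMS merge sketched above.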

Share and Cite

MDPI and ACS Style

Park, J.; Thota, B.K.; Somashekar, K. Sensor-Fused Nighttime System for Enhanced Pedestrian Detection in ADAS and Autonomous Vehicles. Sensors 2024, 24, 4755. https://doi.org/10.3390/s24144755


Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
