A Comprehensive Review of Autonomous Driving Algorithms: Tackling Adverse Weather Conditions, Unpredictable Traffic Violations, Blind Spot Monitoring, and Emergency Maneuvers
Abstract
1. Introduction
2. Autonomous Driving in Adverse Weather Conditions
2.1. Impact of Adverse Weather on Sensor Performance
2.1.1. Optical Sensors (Cameras, LiDAR) in Rain, Snow, and Fog
2.1.2. Radar Performance in Extreme Weather
2.1.3. Other Sensor Types and Their Weather-Related Limitations
2.2. Algorithmic Challenges in Adverse Weather
2.2.1. Object Detection and Classification Issues
2.2.2. Lane Detection and Road Boundary Identification
2.2.3. Path Planning and Decision-Making Complexities
2.3. Current Solutions and Advancements
2.3.1. Multi-Sensor Data Fusion Techniques
2.3.2. Deep Learning Models for Adverse Weather
2.3.3. Vehicle-to-Infrastructure Cooperative Perception Systems
2.4. Future Research Directions
2.4.1. Optimizing Multi-Sensor Fusion Algorithms
2.4.2. Developing Robust Deep Learning Models for Extreme Weather
2.4.3. Creating Specialized Datasets for Adverse Weather Conditions
2.5. Dataset Analysis for Adverse-Weather Challenges in Autonomous Driving
2.5.1. Datasets for Adverse Weather Conditions
2.5.2. Analysis of Algorithm Performance
2.5.3. Insights from the Comparative Analysis
2.5.4. Implications for Model Selection
3. Algorithms for Managing Complex Traffic Scenarios and Violations
3.1. Types of Complex Traffic Scenarios and Violations
3.1.1. Unpredictable Pedestrian Behavior
3.1.2. Aggressive Driving and Sudden Lane Changes
3.1.3. Traffic Signal and Sign Violations
3.1.4. Blind Spot Challenges and Multiple Moving Objects
3.2. Detection and Prediction Methods
3.2.1. Computer Vision-Based Approaches
3.2.2. Behavioral Prediction Models
3.2.3. Sensor Fusion for Improved Detection
3.2.4. Machine Learning for Object Classification and Movement Prediction
3.3. Response Algorithms and Decision-Making Processes
3.3.1. Emergency Braking Systems and Evasive Maneuver Planning
3.3.2. Risk Assessment and Mitigation Strategies
3.3.3. Ethical Considerations in Decision-Making
3.4. Challenges and Future Work
4. Emergency Maneuver Strategies and Blind Spot Management
4.1. Types of Emergency Scenarios and Blind Spot Challenges
4.2. Current Technologies and Algorithms for Emergency Response and Blind Spot Management
4.2.1. Emergency Response Technologies
4.2.2. Blind Spot Detection and Monitoring
4.3. Advanced Algorithmic Approaches
4.4. Testing, Validation, and Future Directions
5. Comparative Analysis and Future Outlook
5.1. ADAS vs. Full Self-Driving (FSD) Systems
5.2. Algorithmic Differences and Development Trajectories
5.3. Regulatory and Ethical Considerations
5.4. Future Research Directions and Societal Impact
6. Conclusions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Sezgin, F.; Vriesman, D.; Steinhauser, D.; Lugner, R.; Brandmeier, T. Safe Autonomous Driving in Adverse Weather: Sensor Evaluation and Performance Monitoring. In Proceedings of the IEEE Intelligent Vehicles Symposium, Anchorage, AK, USA, 4–7 June 2023. [Google Scholar]
- Dey, K.C.; Mishra, A.; Chowdhury, M. Potential of intelligent transportation systems in mitigating adverse weather impacts on road mobility: A review. IEEE Trans. Intell. Transp. Syst. 2015, 16, 1107–1119. [Google Scholar] [CrossRef]
- Yang, G.; Song, X.; Huang, C.; Deng, Z.; Shi, J.; Zhou, B. Drivingstereo: A large-scale dataset for stereo matching in autonomous driving scenarios. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019. [Google Scholar]
- Wang, W.; You, X.; Chen, L.; Tian, J.; Tang, F.; Zhang, L. A Scalable and Accurate De-Snowing Algorithm for LiDAR Point Clouds in Winter. Remote Sens. 2022, 14, 1468. [Google Scholar] [CrossRef]
- Musat, V.; Fursa, I.; Newman, P.; Cuzzolin, F.; Bradley, A. Multi-weather city: Adverse weather stacking for autonomous driving. In Proceedings of the IEEE International Conference on Computer Vision, Virtual, 11–17 October 2021. [Google Scholar]
- Ha, M.H.; Kim, C.H.; Park, T.H. Object Recognition for Autonomous Driving in Adverse Weather Condition Using Polarized Camera. In Proceedings of the 2022 10th International Conference on Control, Mechatronics and Automation, ICCMA 2022, Luxembourg, 9–12 November 2022. [Google Scholar]
- Kim, Y.; Shin, J. Efficient and Robust Object Detection Against Multi-Type Corruption Using Complete Edge Based on Lightweight Parameter Isolation. IEEE Trans. Intell. Veh. 2024, 9, 3181–3194. [Google Scholar] [CrossRef]
- Bijelic, M.; Gruber, T.; Ritter, W. Benchmarking Image Sensors under Adverse Weather Conditions for Autonomous Driving. In Proceedings of the IEEE Intelligent Vehicles Symposium, Changshu, China, 26–30 June 2018. [Google Scholar]
- Wu, C.H.; Tai, T.C.; Lai, C.F. Semantic Image Segmentation in Similar Fusion Background for Self-driving Vehicles. Sens. Mater. 2022, 34, 467–491. [Google Scholar] [CrossRef]
- Du, Y.; Yang, T.; Chang, Q.; Zhong, W.; Wang, W. Enhancing Lidar and Radar Fusion for Vehicle Detection in Adverse Weather via Cross-Modality Semantic Consistency. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2024. [Google Scholar]
- Wang, J.; Wu, Z.; Liang, Y.; Tang, J.; Chen, H. Perception Methods for Adverse Weather Based on Vehicle Infrastructure Cooperation System: A Review. Sensors 2024, 24, 374. [Google Scholar] [CrossRef]
- Mu, M.; Wang, C.; Liu, X.; Bi, H.; Diao, H. AI monitoring and warning system for low visibility of freeways using variable weight combination model. Adv. Control Appl. Eng. Ind. Syst. 2024, e195. [Google Scholar] [CrossRef]
- Almalioglu, Y.; Turan, M.; Trigoni, N.; Markham, A. Deep learning-based robust positioning for all-weather autonomous driving. Nat. Mach. Intell. 2022, 4, 749–760. [Google Scholar] [CrossRef]
- Sun, P.P.; Zhao, X.M.; Jiang, Y.D.; Wen, S.Z.; Min, H.G. Experimental Study of Influence of Rain on Performance of Automotive LiDAR. Zhongguo Gonglu Xuebao/China J. Highw. Transp. 2022, 35, 318–328. [Google Scholar]
- Nahata, D.; Othman, K. Exploring the challenges and opportunities of image processing and sensor fusion in autonomous vehicles: A comprehensive review. AIMS Electron. Electr. Eng. 2023, 7, 271–321. [Google Scholar] [CrossRef]
- Al-Haija, Q.A.; Gharaibeh, M.; Odeh, A. Detection in Adverse Weather Conditions for Autonomous Vehicles via Deep Learning. AI 2022, 3, 303–317. [Google Scholar] [CrossRef]
- Sheeny, M.; de Pellegrin, E.; Mukherjee, S.; Ahrabian, A.; Wang, S.; Wallace, A. Radiate: A Radar Dataset for Automotive Perception in Bad Weather. In Proceedings of the IEEE International Conference on Robotics and Automation, Xi’an, China, 30 May–5 June 2021. [Google Scholar]
- Gupta, H.; Kotlyar, O.; Andreasson, H.; Lilienthal, A.J. Robust Object Detection in Challenging Weather Conditions. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Waikoloa, HI, USA, 3–8 January 2024. [Google Scholar]
- Yoneda, K.; Suganuma, N.; Yanase, R.; Aldibaja, M. Automated driving recognition technologies for adverse weather conditions. IATSS Res. 2019, 43, 253–262. [Google Scholar] [CrossRef]
- Hannan Khan, A.; Tahseen, S.; Rizvi, R.; Dengel, A. Real-Time Traffic Object Detection for Autonomous Driving. 31 January 2024. Available online: https://arxiv.org/abs/2402.00128v2 (accessed on 24 October 2024).
- Hassaballah, M.; Kenk, M.A.; Muhammad, K.; Minaee, S. Vehicle Detection and Tracking in Adverse Weather Using a Deep Learning Framework. IEEE Trans. Intell. Transp. Syst. 2021, 22, 4230–4242. [Google Scholar] [CrossRef]
- Chougule, A.; Chamola, V.; Sam, A.; Yu, F.R.; Sikdar, B. A Comprehensive Review on Limitations of Autonomous Driving and Its Impact on Accidents and Collisions. IEEE Open J. Veh. Technol. 2024, 5, 142–161. [Google Scholar] [CrossRef]
- Scheiner, N.; Kraus, F.; Appenrodt, N.; Dickmann, J.; Sick, B. Object detection for automotive radar point clouds—A comparison. AI Perspect. 2021, 3, 6. [Google Scholar] [CrossRef]
- Aloufi, N.; Alnori, A.; Basuhail, A. Enhancing Autonomous Vehicle Perception in Adverse Weather: A Multi Objectives Model for Integrated Weather Classification and Object Detection. Electronics 2024, 13, 3063. [Google Scholar] [CrossRef]
- Gehrig, S.; Schneider, N.; Stalder, R.; Franke, U. Stereo vision during adverse weather—Using priors to increase robustness in real-time stereo vision. Image Vis. Comput. 2017, 68, 28–39. [Google Scholar] [CrossRef]
- Peng, L.; Wang, H.; Li, J. Uncertainty Evaluation of Object Detection Algorithms for Autonomous Vehicles. Automot. Innov. 2021, 4, 241–252. [Google Scholar] [CrossRef]
- Hasirlioglu, S.; Doric, I.; Lauerer, C.; Brandmeier, T. Modeling and simulation of rain for the test of automotive sensor systems. In Proceedings of the IEEE Intelligent Vehicles Symposium, Gothenburg, Sweden, 19–22 June 2016. [Google Scholar]
- Bernardin, F.; Bremond, R.; Ledoux, V.; Pinto, M.; Lemonnier, S.; Cavallo, V.; Colomb, M. Measuring the effect of the rainfall on the windshield in terms of visual performance. Accid. Anal. Prev. 2014, 63, 83–88. [Google Scholar] [CrossRef]
- Heinzler, R.; Schindler, P.; Seekircher, J.; Ritter, W.; Stork, W. Weather influence and classification with automotive lidar sensors. In Proceedings of the IEEE Intelligent Vehicles Symposium, Paris, France, 9–12 June 2019. [Google Scholar]
- Bijelic, M.; Gruber, T.; Ritter, W. A Benchmark for Lidar Sensors in Fog: Is Detection Breaking Down? In Proceedings of the IEEE Intelligent Vehicles Symposium, Changshu, Suzhou, China, 26–30 June 2018. [Google Scholar]
- Rasshofer, R.H.; Spies, M.; Spies, H. Influences of weather phenomena on automotive laser radar systems. Adv. Radio Sci. 2011, 9, 49–60. [Google Scholar] [CrossRef]
- Yang, H.; Ding, M.; Carballo, A.; Zhang, Y.; Ohtani, K.; Niu, Y.; Ge, M.; Feng, Y.; Takeda, K. Synthesizing Realistic Snow Effects in Driving Images Using GANs and Real Data with Semantic Guidance. In Proceedings of the 2023 IEEE Intelligent Vehicles Symposium (IV), Anchorage, AK, USA, 4–7 June 2023. [Google Scholar]
- Han, C.; Huo, J.; Gao, Q.; Su, G.; Wang, H. Rainfall Monitoring Based on Next-Generation Millimeter-Wave Backhaul Technologies in a Dense Urban Environment. Remote Sens. 2020, 12, 1045. [Google Scholar] [CrossRef]
- Lee, S.; Lee, D.; Choi, P.; Park, D. Accuracy–power controllable lidar sensor system with 3d object recognition for autonomous vehicle. Sensors 2020, 20, 5706. [Google Scholar] [CrossRef] [PubMed]
- Ehrnsperger, M.G.; Siart, U.; Moosbühler, M.; Daporta, E.; Eibert, T.F. Signal degradation through sediments on safety-critical radar sensors. Adv. Radio Sci. 2019, 17, 91–100. [Google Scholar] [CrossRef]
- Zhang, Y.; Carballo, A.; Yang, H.; Takeda, K. Perception and sensing for autonomous vehicles under adverse weather conditions: A survey. ISPRS J. Photogramm. Remote Sens. 2023, 196, 146–177. [Google Scholar] [CrossRef]
- Chowdhuri, S.; Pankaj, T.; Zipser, K. MultiNet: Multi-Modal Multi-Task Learning for Autonomous Driving. In Proceedings of the 2019 IEEE Winter Conference on Applications of Computer Vision, WACV, Waikoloa Village, HI, USA, 7–11 January 2019; pp. 1496–1504. Available online: https://arxiv.org/abs/1709.05581v4 (accessed on 24 October 2024).
- Huang, Z.; Lv, C.; Xing, Y.; Wu, J. Multi-Modal Sensor Fusion-Based Deep Neural Network for End-to-End Autonomous Driving with Scene Understanding. IEEE Sens. J. 2021, 21, 11781–11790. [Google Scholar] [CrossRef]
- Efrat, N.; Bluvstein, M.; Oron, S.; Levi, D.; Garnett, N.; Shlomo, B.E. 3D-LaneNet+: Anchor Free Lane Detection using a Semi-Local Representation. arXiv 2020, arXiv:2011.01535. [Google Scholar]
- Wang, Z.; Wu, Y.; Niu, Q. Multi-Sensor Fusion in Automated Driving: A Survey. IEEE Access 2020, 8, 2847–2868. [Google Scholar] [CrossRef]
- Gamba, M.T.; Marucco, G.; Pini, M.; Ugazio, S.; Falletti, E.; lo Presti, L. Prototyping a GNSS-Based Passive Radar for UAVs: An Instrument to Classify the Water Content Feature of Lands. Sensors 2015, 15, 28287–28313. [Google Scholar] [CrossRef]
- Grigorescu, S.; Trasnea, B.; Cocias, T.; Macesanu, G. A survey of deep learning techniques for autonomous driving. J. Field Robot. 2020, 37, 362–386. [Google Scholar] [CrossRef]
- Rasouli, A.; Tsotsos, J.K. Autonomous vehicles that interact with pedestrians: A survey of theory and practice. IEEE Trans. Intell. Transp. Syst. 2020, 21, 900–918. [Google Scholar] [CrossRef]
- Cui, G.; Zhang, W.; Xiao, Y.; Yao, L.; Fang, Z. Cooperative Perception Technology of Autonomous Driving in the Internet of Vehicles Environment: A Review. Sensors 2022, 22, 5535. [Google Scholar] [CrossRef]
- Xu, M.; Niyato, D.; Chen, J.; Zhang, H.; Kang, J.; Xiong, Z.; Mao, S.; Han, Z. Generative AI-Empowered Simulation for Autonomous Driving in Vehicular Mixed Reality Metaverses. IEEE J. Sel. Top. Signal Process. 2023, 17, 1064–1079. [Google Scholar] [CrossRef]
- Hasirlioglu, S.; Kamann, A.; Doric, I.; Brandmeier, T. Test methodology for rain influence on automotive surround sensors. In Proceedings of the IEEE Conference on Intelligent Transportation Systems, Rio de Janeiro, Brazil, 1–4 November 2016. [Google Scholar]
- Zhu, M.; Wang, X.; Wang, Y. Human-like autonomous car-following model with deep reinforcement learning. Transp. Res. Part C Emerg. Technol. 2018, 97, 348–368. [Google Scholar] [CrossRef]
- Ferranti, L.; Brito, B.; Pool, E.; Zheng, Y.; Ensing, R.M.; Happee, R.; Shyrokau, B.; Kooij, J.F.P.; Alonso-Mora, J.; Gavrila, D.M. SafeVRU: A research platform for the interaction of self-driving vehicles with vulnerable road users. In Proceedings of the IEEE Intelligent Vehicles Symposium, Paris, France, 9–12 June 2019. [Google Scholar]
- Mozaffari, S.; Al-Jarrah, O.Y.; Dianati, M.; Jennings, P.; Mouzakitis, A. Deep Learning-Based Vehicle Behavior Prediction for Autonomous Driving Applications: A Review. IEEE Trans. Intell. Transp. Syst. 2022, 23, 33–47. [Google Scholar] [CrossRef]
- Wang, C.; Sun, Q.; Li, Z.; Zhang, H. Human-like lane change decision model for autonomous vehicles that considers the risk perception of drivers in mixed traffic. Sensors 2020, 20, 2259. [Google Scholar] [CrossRef]
- Shaik, F.A.; Malreddy, A.; Billa, N.R.; Chaudhary, K.; Manchanda, S.; Varma, G. IDD-AW: A Benchmark for Safe and Robust Segmentation of Drive Scenes in Unstructured Traffic and Adverse Weather. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Waikoloa, HI, USA, 3–8 January 2024; pp. 4602–4611. [Google Scholar]
- El-Shair, Z.A.; Abu-Raddaha, A.; Cofield, A.; Alawneh, H.; Aladem, M.; Hamzeh, Y.; Rawashdeh, S.A. SID: Stereo Image Dataset for Autonomous Driving in Adverse Conditions. In Proceedings of the NAECON 2024-IEEE National Aerospace and Electronics Conference, Dayton, OH, USA, 15–18 July 2024; pp. 403–408. [Google Scholar]
- Kenk, M.A.; Hassaballah, M. DAWN: Vehicle Detection in Adverse Weather Nature. arXiv 2020, arXiv:2008.05402. [Google Scholar]
- Marathe, A.; Ramanan, D.; Walambe, R.; Kotecha, K. WEDGE: A Multi-Weather Autonomous Driving Dataset Built from Generative Vision-Language Models. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, Vancouver, BC, Canada, 17–24 June 2023. [Google Scholar]
- Caesar, H.; Bankiti, V.; Lang, A.H.; Vora, S.; Liong, V.E.; Xu, Q.; Krishnan, A.; Pan, Y.; Baldan, G.; Beijbom, O. NuScenes: A Multimodal Dataset for Autonomous Driving. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 14–19 June 2020. [Google Scholar]
- Sun, P.; Kretzschmar, H.; Dotiwalla, X.; Chouard, A.; Patnaik, V.; Tsui, P.; Guo, J.; Zhou, Y.; Chai, Y.; Caine, B.; et al. Scalability in Perception for Autonomous Driving: Waymo Open Dataset. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 14–19 June 2020. [Google Scholar]
- Wang, Y.; Zheng, K.; Tian, D.; Duan, X.; Zhou, J. Pre-Training with Asynchronous Supervised Learning for Reinforcement Learning Based Autonomous Driving. Front. Inf. Technol. Electron. Eng. 2021, 22, 673–686. [Google Scholar] [CrossRef]
- Tahir, N.U.A.; Zhang, Z.; Asim, M.; Chen, J.; ELAffendi, M. Object Detection in Autonomous Vehicles Under Adverse Weather: A Review of Traditional and Deep Learning Approaches. Algorithms 2024, 17, 103. [Google Scholar] [CrossRef]
- Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y. Generative Adversarial Networks. Commun. ACM 2020, 63, 139–144. [Google Scholar] [CrossRef]
- Bavirisetti, D.P.; Martinsen, H.R.; Kiss, G.H.; Lindseth, F. A Multi-Task Vision Transformer for Segmentation and Monocular Depth Estimation for Autonomous Vehicles. IEEE Open J. Intell. Transp. Syst. 2023, 4, 909–928. [Google Scholar] [CrossRef]
- Alahi, A.; Goel, K.; Ramanathan, V.; Robicquet, A.; Li, F.F.; Savarese, S. Social LSTM: Human Trajectory Prediction in Crowded Spaces. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016. [Google Scholar]
- Yurtsever, E.; Lambert, J.; Carballo, A.; Takeda, K. A Survey of Autonomous Driving: Common Practices and Emerging Technologies. IEEE Access 2020, 8, 58443–58469. [Google Scholar] [CrossRef]
- Aoude, G.S.; Desaraju, V.R.; Stephens, L.H.; How, J.P. Driver Behavior Classification at Intersections and Validation on Large Naturalistic Data Set. IEEE Trans. Intell. Transp. Syst. 2012, 13, 724–736. [Google Scholar] [CrossRef]
- Wang, Z.; Zheng, L.; Liu, Y.; Li, Y.; Wang, S. Towards Real-Time Multi-Object Tracking. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2020. [Google Scholar]
- Xu, H.; Gao, Y.; Yu, F.; Darrell, T. End-to-End Learning of Driving Models from Large-Scale Video Datasets. In Proceedings of the 30th IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2017), Honolulu, HI, USA, 21–26 July 2017. [Google Scholar]
- Jocher, G. ultralytics/yolov5: v7.0-YOLOv5 SOTA Real-Time Instance Segmentation (v7.0). Available online: https://github.com/ultralytics/yolov5/tree/v7.0 (accessed on 24 October 2024).
- Han, H.; Xie, T. Lane Change Trajectory Prediction of Vehicles in Highway Interweaving Area Using Seq2Seq-Attention Network. Zhongguo Gonglu Xuebao/China J. Highw. Transp. 2020, 33, 106–118. [Google Scholar]
- Shalev-Shwartz, S.; Shammah, S.; Shashua, A. Safe, Multi-Agent, Reinforcement Learning for Autonomous Driving. arXiv 2016, arXiv:1610.03295v1. [Google Scholar]
- Hang, P.; Lv, C.; Huang, C.; Cai, J.; Hu, Z.; Xing, Y. An Integrated Framework of Decision Making and Motion Planning for Autonomous Vehicles Considering Social Behaviors. IEEE Trans. Veh. Technol. 2020, 69, 14458–14469. [Google Scholar] [CrossRef]
- Xie, S.; Girshick, R.; Dollár, P.; Tu, Z.; He, K. Aggregated Residual Transformations for Deep Neural Networks. In Proceedings of the 30th IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2017), Honolulu, HI, USA, 21–26 July 2017. [Google Scholar]
- Xu, X.; Wang, X.; Wu, X.; Hassanin, O.; Chai, C. Calibration and Evaluation of the Responsibility-Sensitive Safety Model of Autonomous Car-Following Maneuvers Using Naturalistic Driving Study Data. Transp. Res. Part C Emerg. Technol. 2021, 123, 102988. [Google Scholar] [CrossRef]
- Rjoub, G.; Wahab, O.A.; Bentahar, J.; Bataineh, A.S. Improving Autonomous Vehicles Safety in Snow Weather Using Federated YOLO CNN Learning. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2021. [Google Scholar]
- Lefèvre, S.; Vasquez, D.; Laugier, C. A Survey on Motion Prediction and Risk Assessment for Intelligent Vehicles. ROBOMECH J. 2014, 1, 1. [Google Scholar] [CrossRef]
- Bolano, A. “Moral Machine Experiment”: Large-Scale Study Reveals Regional Differences in Ethical Preferences for Self-Driving Cars. Sci. Trends 2018. [Google Scholar] [CrossRef]
- Awad, E.; Dsouza, S.; Kim, R.; Schulz, J.; Henrich, J.; Shariff, A.; Bonnefon, J.-F.; Rahwan, I. The Moral Machine Experiment. Nature 2018, 563, 59–64. [Google Scholar] [CrossRef]
- Djuric, N.; Radosavljevic, V.; Cui, H.; Nguyen, T.; Chou, F.C.; Lin, T.H.; Schneider, J. Motion Prediction of Traffic Actors for Autonomous Driving Using Deep Convolutional Networks. arXiv 2018, arXiv:1808.05819. [Google Scholar]
- Chandra, R.; Bhattacharya, U.; Bera, A.; Manocha, D. Traphic: Trajectory Prediction in Dense and Heterogeneous Traffic Using Weighted Interactions. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019. [Google Scholar]
- Li, S.; Shu, K.; Chen, C.; Cao, D. Planning and Decision-Making for Connected Autonomous Vehicles at Road Intersections: A Review. Chin. J. Mech. Eng. 2021, 34, 1–18. [Google Scholar] [CrossRef]
- Schwarting, W.; Alonso-Mora, J.; Rus, D. Planning and Decision-Making for Autonomous Vehicles. Annu. Rev. Control Robot. Auton. Syst. 2018, 1, 187–210. [Google Scholar] [CrossRef]
- Wang, J.; Wu, J.; Li, Y. The Driving Safety Field Based on Driver-Vehicle-Road Interactions. IEEE Trans. Intell. Transp. Syst. 2015, 16, 2203–2214. [Google Scholar] [CrossRef]
- Jahromi, B.S.; Tulabandhula, T.; Cetin, S. Real-Time Hybrid Multi-Sensor Fusion Framework for Perception in Autonomous Vehicles. Sensors 2019, 19, 4357. [Google Scholar] [CrossRef] [PubMed]
- Choi, D.; An, T.H.; Ahn, K.; Choi, J. Driving Experience Transfer Method for End-to-End Control of Self-Driving Cars. arXiv 2018, arXiv:1809.01822. [Google Scholar]
- Li, G.; Yang, Y.; Zhang, T.; Qu, X.; Cao, D.; Cheng, B.; Li, K. Risk Assessment Based Collision Avoidance Decision-Making for Autonomous Vehicles in Multi-Scenarios. Transp. Res. Part C Emerg. Technol. 2021, 122, 102820. [Google Scholar] [CrossRef]
- Ye, B.-L.; Wu, W.; Gao, H.; Lu, Y.; Cao, Q.; Zhu, L. Stochastic Model Predictive Control for Urban Traffic Networks. Appl. Sci. 2017, 7, 588. [Google Scholar] [CrossRef]
- Galceran, E.; Cunningham, A.G.; Eustice, R.M.; Olson, E. Multipolicy Decision-Making for Autonomous Driving via Change-Point-Based Behavior Prediction: Theory and Experiment. Auton. Robot. 2017, 41, 1367–1382. [Google Scholar] [CrossRef]
- Benderius, O.; Berger, C.; Malmsten Lundgren, V. The Best Rated Human-Machine Interface Design for Autonomous Vehicles in the 2016 Grand Cooperative Driving Challenge. IEEE Trans. Intell. Transp. Syst. 2018, 19, 1302–1307. [Google Scholar] [CrossRef]
- Patole, S.M.; Torlak, M.; Wang, D.; Ali, M. Automotive Radars: A Review of Signal Processing Techniques. IEEE Signal Process. Mag. 2017, 34, 22–35. [Google Scholar] [CrossRef]
- Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.Y.; Berg, A.C. SSD: Single Shot Multibox Detector. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Berlin/Heidelberg, Germany, 2016. [Google Scholar]
- Hu, X.; Chen, L.; Tang, B.; Cao, D.; He, H. Dynamic Path Planning for Autonomous Driving on Various Roads with Avoidance of Static and Moving Obstacles. Mech. Syst. Signal Process. 2018, 100, 482–500. [Google Scholar] [CrossRef]
- Feng, D.; Rosenbaum, L.; Timm, F.; Dietmayer, K. Leveraging Heteroscedastic Aleatoric Uncertainties for Robust Real-Time LiDAR 3D Object Detection. In Proceedings of the 2019 IEEE Intelligent Vehicles Symposium (IV), Paris, France, 9–12 June 2019. [Google Scholar]
- Dixit, V.V.; Chand, S.; Nair, D.J. Autonomous Vehicles: Disengagements, Accidents and Reaction Times. PLoS ONE 2016, 11, e0168054. [Google Scholar] [CrossRef] [PubMed]
- Kiran, B.R.; Sobh, I.; Talpaert, V.; Mannion, P.; Sallab, A.A.A.; Yogamani, S.; Perez, P. Deep Reinforcement Learning for Autonomous Driving: A Survey. IEEE Trans. Intell. Transp. Syst. 2022, 23, 4909–4926. [Google Scholar] [CrossRef]
- Malik, S.; Khan, M.A.; Aadam; El-Sayed, H.; Iqbal, F.; Khan, J.; Ullah, O. CARLA+: An Evolution of the CARLA Simulator for Complex Environment Using a Probabilistic Graphical Model. Drones 2023, 7, 111. [Google Scholar] [CrossRef]
- Zhao, D.; Lam, H.; Peng, H.; Bao, S.; LeBlanc, D.J.; Nobukawa, K.; Pan, C.S. Accelerated Evaluation of Automated Vehicles Safety in Lane-Change Scenarios Based on Importance Sampling Techniques. IEEE Trans. Intell. Transp. Syst. 2017, 18, 595–607. [Google Scholar] [CrossRef]
- Wang, J.; Zhang, L.; Zhang, D.; Li, K. An Adaptive Longitudinal Driving Assistance System Based on Driver Characteristics. IEEE Trans. Intell. Transp. Syst. 2013, 14, 1–12. [Google Scholar] [CrossRef]
- Bijelic, M.; Gruber, T.; Mannan, F.; Kraus, F.; Ritter, W.; Dietmayer, K.; Heide, F. Seeing through Fog without Seeing Fog: Deep Multimodal Sensor Fusion in Unseen Adverse Weather. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 14–19 June 2020. [Google Scholar]
- Petrovskaya, A.; Thrun, S. Model-Based Vehicle Detection and Tracking for Autonomous Urban Driving. Auton. Robot. 2009, 26, 123–139. [Google Scholar] [CrossRef]
- Boban, M.; Kousaridas, A.; Manolakis, K.; Eichinger, J.; Xu, W. Connected Roads of the Future: Use Cases, Requirements, and Design Considerations for Vehicle-To-Everything Communications. IEEE Veh. Technol. Mag. 2018, 13, 110–123. [Google Scholar] [CrossRef]
- Bengler, K.; Dietmayer, K.; Farber, B.; Maurer, M.; Stiller, C.; Winner, H. Three Decades of Driver Assistance Systems: Review and Future Perspectives. IEEE Intell. Transp. Syst. Mag. 2014, 6, 6–22. [Google Scholar] [CrossRef]
- Kuutti, S.; Bowden, R.; Jin, Y.; Barber, P.; Fallah, S. A Survey of Deep Learning Applications to Autonomous Vehicle Control. IEEE Trans. Intell. Transp. Syst. 2021, 22, 712–733. [Google Scholar] [CrossRef]
- Mueller, A.S.; Cicchino, J.B.; Zuby, D.S. What Humanlike Errors Do Autonomous Vehicles Need to Avoid to Maximize Safety? J. Saf. Res. 2020, 75, 310–318. [Google Scholar] [CrossRef]
- Hou, Y.; Edara, P.; Sun, C. Modeling Mandatory Lane Changing Using Bayes Classifier and Decision Trees. IEEE Trans. Intell. Transp. Syst. 2013, 15, 647–655. [Google Scholar] [CrossRef]
- Chen, X.; Ma, H.; Wan, J.; Li, B.; Xia, T. Multi-View 3D Object Detection Network for Autonomous Driving. In Proceedings of the 30th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2017, Honolulu, HI, USA, 21–26 July 2017. [Google Scholar]
- Zhao, J.; Liang, B.; Chen, Q. The Key Technology Toward the Self-Driving Car. Int. J. Intell. Unmanned Syst. 2018, 6, 2–20. [Google Scholar] [CrossRef]
- Fagnant, D.J.; Kockelman, K. Preparing a Nation for Autonomous Vehicles: Opportunities, Barriers and Policy Recommendations. Transp. Res. Part A Policy Pract. 2015, 77, 167–181. [Google Scholar] [CrossRef]
- Taeihagh, A.; Lim, H.S.M. Governing Autonomous Vehicles: Emerging Responses for Safety, Liability, Privacy, Cybersecurity, and Industry Risks. Transp. Rev. 2019, 39, 103–128. [Google Scholar] [CrossRef]
- Mendiboure, L.; Benzagouta, M.L.; Gruyer, D.; Sylla, T.; Adedjouma, M.; Hedhli, A. Operational Design Domain for Automated Driving Systems: Taxonomy Definition and Application. In Proceedings of the 2023 IEEE Intelligent Vehicles Symposium (IV), Anchorage, AK, USA, 4–7 June 2023; pp. 1–6. [Google Scholar]
- de Bruyne, J.; Werbrouck, J. Merging Self-Driving Cars with the Law. Comput. Law Secur. Rev. 2018, 34, 1150–1153. [Google Scholar] [CrossRef]
- Badue, C.; Guidolini, R.; Carneiro, R.V.; Azevedo, P.; Cardoso, V.B.; Forechi, A.; Jesus, L.; Berriel, R.; Paixão, T.M.; Mutz, F.; et al. Self-Driving Cars: A Survey. Expert Syst. Appl. 2021, 165, 113816. [Google Scholar] [CrossRef]
- Liu, S.; Yu, B.; Tang, J.; Zhu, Q. Invited: Towards Fully Intelligent Transportation Through Infrastructure-Vehicle Cooperative Autonomous Driving: Challenges and Opportunities. In Proceedings of the 2021 58th ACM/IEEE Design Automation Conference (DAC), San Francisco, CA, USA, 5–9 December 2021; pp. 1323–1326. [Google Scholar]
- Fadadu, S.; Pandey, S.; Hegde, D.; Shi, Y.; Chou, F.-C.; Djuric, N.; Vallespi-Gonzalez, C. Multi-View Fusion of Sensor Data for Improved Perception and Prediction in Autonomous Driving. In Proceedings of the 2022 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), Waikoloa, HI, USA, 3–8 January 2022; pp. 3292–3300. [Google Scholar]
- Li, Y.; Liu, W.; Liu, Q.; Zheng, X.; Sun, K.; Huang, C. Complying with ISO 26262 and ISO/SAE 21434: A Safety and Security Co-Analysis Method for Intelligent Connected Vehicle. Sensors 2024, 24, 1848. [Google Scholar] [CrossRef]
- Ye, L.; Yamamoto, T. Evaluating the Impact of Connected and Autonomous Vehicles on Traffic Safety. Phys. A Stat. Mech. Its Appl. 2019, 526, 121009. [Google Scholar] [CrossRef]
- Muhammad, K.; Ullah, A.; Lloret, J.; del Ser, J.; de Albuquerque, V.H.C. Deep Learning for Safe Autonomous Driving: Current Challenges and Future Directions. IEEE Trans. Intell. Transp. Syst. 2021, 22, 4316–4336. [Google Scholar] [CrossRef]
- Hou, G. Evaluating Efficiency and Safety of Mixed Traffic with Connected and Autonomous Vehicles in Adverse Weather. Sustainability 2023, 15, 3138. [Google Scholar] [CrossRef]
- Wang, M.; Su, T.; Chen, S.; Yang, W.; Liu, J.; Wang, Z. Automatic Model-Based Dataset Generation for High-Level Vision Tasks of Autonomous Driving in Haze Weather. IEEE Trans. Ind. Inform. 2023, 19, 9071–9081. [Google Scholar] [CrossRef]
- Litman, T. Autonomous Vehicle Implementation Predictions: Implications for Transport Planning. In Proceedings of the Transportation Research Board Annual Meeting 2014, Washington, DC, USA, 12–16 January 2014; p. 42. Available online: https://www.vtpi.org/avip.pdf (accessed on 24 October 2024).
- Patel, A.R.; Roscia, M.; Vucinic, D. Legal Implications for Autonomous Vehicles Mobility in Future Smart Cities. In Proceedings of the 2023 IEEE International Smart Cities Conference (ISC2), Bucharest, Romania, 24–27 September 2023; pp. 1–5. [Google Scholar]
Level | Name | Algorithmic Focus | Key Algorithms | Capabilities | Challenges
---|---|---|---|---|---
Level 0 | Fully Manual Driving Vehicle | No automation; basic driver assistance | None | Manual control by the driver | No algorithmic support; safety relies entirely on human input
Level 1 | Partial Driver Assistance Vehicle | Single-task automation: cruise control, braking assistance | PID controllers, simple rule-based algorithms | Basic ADAS features, lane-keeping, adaptive cruise control | Limited automation, requires constant human supervision
Level 2 | Combined Driver Assistance Vehicle | Automation of multiple tasks, driver still engaged | Sensor fusion, basic computer vision, simple machine learning | Partial automation, control steering, speed control | Contextual understanding is weak; handover between machine and driver is critical |
Level 3 | Conditional Automation Vehicle | Conditional automation, vehicle can handle dynamic tasks | Advanced sensor fusion, decision trees, reinforcement learning | Can handle driving tasks autonomously under certain conditions | Requires quick human intervention in complex or unforeseen scenarios |
Level 4 | High Automation Vehicle | Full automation in specific conditions | Deep Learning, reinforcement learning, model predictive control | Can perform all driving tasks autonomously in predefined scenarios | Limited by operational design domain (ODD), challenges in managing unexpected situations |
Level 5 | Full Automation Vehicle | Full automation in all conditions | End-to-end deep learning, AI-based prediction, SLAM | Completely driverless in all conditions and environments | High computational demands, ethical decision-making, managing all edge cases |
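The "PID controllers" entry for Level 1 above can be made concrete with a textbook closed-loop sketch. The gains, actuator limits, and first-order vehicle model below are illustrative assumptions for a toy cruise-control simulation, not values from any production ADAS.

```python
# Minimal PID cruise-control sketch (illustrative gains, toy vehicle model).
def make_pid(kp, ki, kd, dt):
    state = {"integral": 0.0, "prev_error": None}

    def step(setpoint, measurement):
        error = setpoint - measurement
        # Clamp the integral term as simple anti-windup.
        state["integral"] = max(-5.0, min(5.0, state["integral"] + error * dt))
        derivative = 0.0 if state["prev_error"] is None else (error - state["prev_error"]) / dt
        state["prev_error"] = error
        return kp * error + ki * state["integral"] + kd * derivative

    return step


def simulate_cruise(target_speed=25.0, steps=400, dt=0.05):
    """Toy first-order longitudinal model: accel = throttle - linear drag."""
    pid = make_pid(kp=1.2, ki=0.3, kd=0.05, dt=dt)
    speed = 0.0
    for _ in range(steps):
        throttle = pid(target_speed, speed)
        throttle = max(-3.0, min(3.0, throttle))  # actuator saturation
        drag = 0.05 * speed
        speed += (throttle - drag) * dt
    return speed
```

Running `simulate_cruise()` for 20 simulated seconds settles the toy vehicle near the 25 m/s setpoint; the integral term is what compensates for the steady-state drag, which a pure proportional controller could not eliminate.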
Dataset | Number of Images | Weather Conditions | Annotations | Focus |
---|---|---|---|---|
IDD-AW | 5000 pairs | Rain, fog, snow, low light | Pixel-level segmentation | Safety in unstructured traffic environments, providing a variety of road scenes in adverse conditions [51]. |
SID | 178,000 pairs | Snow, rain, nighttime | Depth estimation | Designed for stereo vision tasks, emphasizing depth perception in low-visibility scenarios [52]. |
DAWN | 1000 images | Fog, rain, snow | Bounding boxes | Specialized in vehicle detection accuracy in low-visibility conditions, allowing models to improve object tracking and localization [53]. |
WEDGE | 3360 synthetic images | Simulated extreme conditions | Bounding boxes, SSIM metrics | Focuses on training models in rare and extreme weather scenarios like hurricanes, which are difficult to capture in real-world datasets [54]. |
nuScenes | 1.4 million images | Urban, night, rain | 3D bounding boxes, tracking | Provides multi-sensor data (camera, LiDAR, radar) for complex urban driving scenarios, making it suitable for sensor fusion research [55]. |
Waymo Open Dataset | 10 million images | Diverse geographical conditions | 3D object detection, trajectory | Offers high-resolution data with extensive labeling for objects, enabling long-range perception analysis [56]. |
Aspect | Adverse Weather Conditions | Complex Traffic Scenarios | Emergency Maneuvers | Blind Spot Management |
---|---|---|---|---|
**Environmental Challenges** | | | |
Key Scenarios | Heavy rain, snow, fog, glare | Pedestrian jaywalking, aggressive driving, traffic violations | Sudden obstacles, system failures | Hidden vehicles, pedestrians in blind spots |
Visibility Reduction | Up to 95% in heavy fog | 10–30% in urban environments | Varies widely | 100% in blind spots |
Impact on Sensor Performance | LiDAR: −50% range in heavy rain | Camera: −20% accuracy in crowded scenes | Minimal impact | Radar: −10% accuracy for moving objects |
**Perception Technologies** | | | |
Primary Sensor | Fusion of LiDAR and radar systems | High-resolution cameras | LiDAR (range: 200 m) | Short-range radar (30 m) |
Secondary Sensor | Infrared cameras | LiDAR for 3D mapping | Stereo cameras (150 m) | Wide-angle cameras (50 m) |
Tertiary Sensor | Ultrasonic for close-range | GPS/IMU for localization | Long-range radar (160 m) | Ultrasonic sensors (5 m) |
Sensor Fusion Technique | Adaptive multi-sensor fusion | Spatio-temporal fusion | Low-latency sensor fusion | Cross-modal fusion |
**Algorithmic Approaches** | | | |
Main Algorithm | DeepWet-Net for rain removal | Social-LSTM for trajectory prediction | Model predictive control | YOLOv5 for object detection |
Auxiliary Algorithms | Fog density estimation | Intention-aware motion planning | Reinforcement learning for decision-making | Graph neural networks for spatial reasoning
Accuracy (Standard/Adverse) | 92%/78% | 95%/85% | 94%/85% | 97%/90% |
Computational Complexity | O(n²) for image dehazing [20] | O(n log n) for multi-object tracking | O(n²) for MPC | O(n) for single-stage detection
**Testing and Validation** | | | |
Simulation Environments | CARLA with weather modules | SUMO for urban traffic | PreScan for ADAS testing | SynCity for diverse scenarios |
Real-World Testing | Dedicated bad weather tracks | Urban and highway environments | Closed courses with obstacles | Specialized blind spot test tracks |
Key Performance Metrics | Weather condition classification accuracy | Prediction accuracy of road user behavior | Collision avoidance success rate | False positive/negative rates |
Benchmark Datasets | BDD100K Weather | Waymo Open Dataset | Euro NCAP AEB scenarios | nuScenes
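The "Sensor Fusion Technique" row above can be illustrated with a toy example: fusing two independent range estimates of the same target (e.g., short-range radar and a wide-angle camera in the blind-spot column) by inverse-variance weighting, which is the static special case of a Kalman filter update. The sensor noise variances below are illustrative assumptions, not measured sensor specifications.

```python
# Inverse-variance fusion of two independent range measurements (toy sketch).
def fuse(z1, var1, z2, var2):
    """Combine two noisy estimates of the same distance.

    Each measurement is weighted by the inverse of its variance, so the
    more reliable sensor dominates; the fused variance is always smaller
    than either input variance.
    """
    w1 = 1.0 / var1
    w2 = 1.0 / var2
    fused = (w1 * z1 + w2 * z2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var


# Radar reports 12.0 m (var 0.04 m^2); camera reports 12.6 m (var 0.36 m^2).
distance, variance = fuse(12.0, 0.04, 12.6, 0.36)
```

Because the radar variance is nine times smaller, the fused distance (12.06 m) lands close to the radar reading, while the fused variance (0.036 m²) improves on both sensors, which is the basic motivation for cross-modal fusion in blind-spot monitoring.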
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Xu, C.; Sankar, R. A Comprehensive Review of Autonomous Driving Algorithms: Tackling Adverse Weather Conditions, Unpredictable Traffic Violations, Blind Spot Monitoring, and Emergency Maneuvers. Algorithms 2024, 17, 526. https://doi.org/10.3390/a17110526