Ego-Vehicle Speed Correction for Automotive Radar Systems Using Convolutional Neural Networks
Abstract
1. Introduction
- The proposed model is independent of the radar mounting angle on the vehicle.
- The model is independent of the angular performance of the automotive radar system.
- The loss in maximum detection distance is reduced because the model does not require operating with a downward-tilted or wide elevation beam.
- The complexity is reduced by simplifying the model and reducing the parameters, enabling the application of the model in embedded systems.
2. Related Work
- Orthogonal distance regression (ODR): In a previous study [18], ODR algorithms were used to analyze the correlation between the speed of an ego vehicle and the relative speed/angle of stationary objects to predict the ego-vehicle speed. When there was significant noise in the measured values or when stationary objects were not detected, the ego-vehicle speed was predicted using the Kalman filter.
- Hough transform: Lim and Lee [19] estimated the ego-vehicle speed by examining the distribution of stationary objects in the angle–velocity domain and identifying the intersection point produced by the Hough transform.
- Elevated objects: Kingery and Song [20] proposed a method to compensate for the elevation angle and estimate the ego-vehicle speed by analyzing the radial velocity difference in elevated objects, such as buildings and tall trees.
- Ground backscattering: Klinefelter and Nanzer [21] proposed a technique for estimating the ego-vehicle speed by analyzing backscattering signals reflected from the ground.
- Random sample consensus (RANSAC): Kellner et al. [22] estimated the ego-vehicle speed by distinguishing stationary objects with the RANSAC algorithm and interpreting the radial-velocity errors of those objects.
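The stationary-object approaches above share one model: for straight motion, a stationary target at azimuth θ returns the radial velocity v_r = −v_ego · cos θ, while moving targets violate this relation. The sketch below (function name, thresholds, and scene values are illustrative, not taken from any of the cited papers) recovers the ego speed from noisy detections with a RANSAC-style loop in the spirit of [22]:

```python
import numpy as np

def estimate_ego_speed(angles_rad, radial_velocities, n_iter=200, tol=0.2, seed=0):
    """RANSAC-style ego-speed estimate from radar detections.

    A stationary object at azimuth theta appears with radial velocity
    v_r = -v_ego * cos(theta); moving objects violate this model and
    are rejected as outliers.
    """
    rng = np.random.default_rng(seed)
    cos_t = np.cos(angles_rad)
    best_v, best_inliers = 0.0, 0
    for _ in range(n_iter):
        i = rng.integers(len(cos_t))           # minimal sample: one detection
        v = -radial_velocities[i] / cos_t[i]   # candidate ego speed
        inliers = np.abs(radial_velocities + v * cos_t) < tol
        if inliers.sum() > best_inliers:
            best_inliers = inliers.sum()
            # refine with least squares over the inlier set
            best_v = -np.dot(cos_t[inliers], radial_velocities[inliers]) \
                     / np.dot(cos_t[inliers], cos_t[inliers])
    return best_v

# synthetic scene: ego speed 15 m/s, six stationary objects plus two movers
theta = np.deg2rad(np.array([-40.0, -20.0, -5.0, 10.0, 25.0, 35.0, 0.0, 15.0]))
v_r = -15.0 * np.cos(theta)
v_r[6] += 8.0   # moving target
v_r[7] -= 5.0   # moving target
print(round(estimate_ego_speed(theta, v_r), 2))
```

Because a single detection fully determines a candidate speed, the minimal RANSAC sample size is one, which keeps the loop cheap even for dense detection lists.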
3. Materials and Methods
3.1. Measurement Dataset
3.1.1. Configuration of the Vehicle and Radar System
3.1.2. Waveform Specifications of the Automotive Radar System
3.1.3. Measurement Data Results
3.2. Simulation Dataset
3.2.1. Simulation Methods
3.2.2. Simulation Results
3.3. Implementation of CNN
3.3.1. Preprocessing
3.3.2. Network Architecture Considerations
3.3.3. Network Architecture
3.3.4. Postprocessing
3.3.5. Comparison of AVSD Net Models
4. Results and Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Bilik, I.; Longman, O.; Villeval, S.; Tabrikian, J. The Rise of Radar for Autonomous Vehicles: Signal Processing Solutions and Future Research Directions. IEEE Signal Process. Mag. 2019, 36, 20–31.
- Meinel, H.H. Evolving Automotive Radar—From the Very Beginnings into the Future. In Proceedings of the 8th European Conference on Antennas and Propagation (EuCAP 2014), The Hague, The Netherlands, 6–11 April 2014; pp. 3107–3114.
- Parliament Approves EU Rules Requiring Life-saving Technologies in Vehicles. Available online: https://www.europarl.europa.eu/news/en/press-room/20190410IPR37528/parliament-approves-eu-rules-requiring-life-saving-technologies-in-vehicles (accessed on 8 July 2024).
- NHTSA Finalizes Key Safety Rule to Reduce Crashes and Save Lives. Available online: https://www.nhtsa.gov/press-releases/nhtsa-fmvss-127-automatic-emergency-braking-reduce-crashes (accessed on 8 July 2024).
- Mohammed, A.S.; Amamou, A.; Ayevide, F.K.; Kelouwani, S.; Agbossou, K.; Zioui, N. The perception system of intelligent ground vehicles in all weather conditions: A systematic literature review. Sensors 2020, 20, 6532.
- Klotz, M.; Rohling, H. 24 GHz Radar Sensors for Automotive Applications. In Proceedings of the 13th International Conference on Microwaves, Radar and Wireless Communications (MIKON-2000), Wroclaw, Poland, 22–24 May 2000; pp. 359–362.
- Song, H.; Shin, H.C. Classification and spectral mapping of stationary and moving objects in road environments using FMCW radar. IEEE Access 2020, 8, 22955–22963.
- Yang, W.; Zhang, X.; Lei, Q.; Cheng, X. Research on longitudinal active collision avoidance of autonomous emergency braking pedestrian system (AEB-P). Sensors 2019, 19, 4671.
- Zhang, Y.; Lin, Y.; Qin, Y.; Dong, M.; Gao, L.; Hashemi, E. A new adaptive cruise control considering crash avoidance for intelligent vehicle. IEEE Trans. Ind. Electron. 2024, 71, 688–696.
- Grimm, C.; Breddermann, T.; Farhoud, R.; Fei, T.; Warsitz, E.; Haeb-Umbach, R. Discrimination of Stationary from Moving Targets with Recurrent Neural Networks in Automotive Radar. In Proceedings of the 2018 IEEE MTT-S International Conference on Microwaves for Intelligent Mobility (ICMIM), Munich, Germany, 15–17 April 2018; pp. 1–4.
- Raj, S.; Ghosh, D. Optimized DBSCAN with Improved Static Clutter Removal for High Resolution Automotive Radars. In Proceedings of the 2022 19th European Radar Conference (EuRAD), Milan, Italy, 28–30 September 2022; pp. 1–4.
- Kopp, J.; Kellner, D.; Piroli, A.; Dietmayer, K. Fast Rule-Based Clutter Detection in Automotive Radar Data. In Proceedings of the 2021 IEEE International Intelligent Transportation Systems Conference (ITSC), Indianapolis, IN, USA, 19–22 September 2021; pp. 3010–3017.
- Hernandez, W. Improving the response of a wheel speed sensor by using frequency-domain adaptive filtering. IEEE Sens. J. 2003, 3, 404–413.
- Hernandez, W. Improving the response of a wheel speed sensor by using a RLS lattice algorithm. Sensors 2006, 6, 64–79.
- Magnusson, M.; Trobro, C. Improving Wheel Speed Sensing and Estimation. Master's Thesis, Department of Automatic Control, Lund Institute of Technology, Lund, Sweden, December 2003.
- Isermann, R.; Wesemeier, D. Indirect vehicle tire pressure monitoring with wheel and suspension sensors. IFAC Proc. Vol. 2009, 42, 917–922.
- Spangenberg, M.; Calmettes, V.; Tourneret, J.Y. Fusion of GPS, INS and Odometric Data for Automotive Navigation. In Proceedings of the 2007 15th European Signal Processing Conference, Poznan, Poland, 3–7 September 2007; pp. 886–890.
- Grimm, C.; Farhoud, R.; Fei, T.; Warsitz, E.; Haeb-Umbach, R. Detection of Moving Targets in Automotive Radar with Distorted Ego-Velocity Information. In Proceedings of the 2017 IEEE Microwaves, Radar and Remote Sensing Symposium (MRRS), Kiev, Ukraine, 29–31 August 2017; pp. 111–116.
- Lim, S.; Lee, S. Hough transform based ego-velocity estimation in automotive radar system. Electron. Lett. 2021, 57, 80–82.
- Kingery, A.; Song, D. Improving ego-velocity estimation of low-cost Doppler radars for vehicles. IEEE Robot. Autom. Lett. 2022, 7, 9445–9452.
- Klinefelter, E.; Nanzer, J.A. Automotive velocity sensing using millimeter-wave interferometric radar. IEEE Trans. Microw. Theory Techn. 2021, 69, 1096–1104.
- Kellner, D.; Barjenbruch, M.; Klappstein, J.; Dickmann, J.; Dietmayer, K. Instantaneous Ego-Motion Estimation Using Doppler Radar. In Proceedings of the 16th International IEEE Conference on Intelligent Transportation Systems (ITSC 2013), The Hague, The Netherlands, 6–9 October 2013; pp. 869–874.
- Cho, H.W.; Choi, S.; Cho, Y.R.; Kim, J. Complex-valued channel attention and application in ego-velocity estimation with automotive radar. IEEE Access 2021, 9, 17717–17727.
- Cho, H.W.; Choi, S.; Cho, Y.R.; Kim, J. Deep Complex-Valued Network for Ego-Velocity Estimation with Millimeter-Wave Radar. In Proceedings of the 2020 IEEE SENSORS, Rotterdam, The Netherlands, 25–28 October 2020; pp. 1–4.
- Automotive, Second-Generation 76-GHz to 81-GHz High-Performance SoC for Corner and Long-Range Radar. Available online: https://www.ti.com/product/AWR2944 (accessed on 10 July 2024).
- DCA1000 Evaluation Module for Real-Time Data Capture and Streaming. Available online: https://www.ti.com/tool/DCA1000EVM (accessed on 10 July 2024).
- Iovescu, C.; Rao, S. The fundamentals of millimeter wave sensors. Tex. Instrum. 2017, 1–8. Available online: https://www.ti.com/lit/wp/spyy005a/spyy005a.pdf (accessed on 10 July 2024).
- Rao, S. Introduction to mmWave sensing: FMCW radars. Tex. Instrum. (TI) Mmwave Train. Ser. 2017, 1–11. Available online: https://www.ti.com/content/dam/videos/external-videos/zh-tw/2/3816841626001/5415203482001.mp4/subassets/mmwaveSensing-FMCW-offlineviewing_0.pdf (accessed on 10 July 2024).
- Lee, Y.; Park, B. Nonlinear regression-based GNSS multipath modelling in deep urban area. Mathematics 2022, 10, 412.
- Liu, F.; Han, H.; Cheng, X.; Li, B. Performance of tightly coupled integration of GPS/BDS/MEMS-INS/Odometer for real-time high-precision vehicle positioning in urban degraded and denied environment. J. Sens. 2020, 2020, 8670262.
- Wang, G.; Li, S. A Novel Longitudinal Speed Estimator for Fully Automation Ground Vehicle on Cornering Maneuver. In Proceedings of the 2009 International Conference on Artificial Intelligence and Computational Intelligence, Shanghai, China, 7–8 November 2009; pp. 433–437.
- User's Guide: AWR2944 Evaluation Module. Available online: https://www.ti.com/lit/ug/spruj22b/spruj22b.pdf (accessed on 14 July 2024).
- Damnjanović, V.; Petrović, M.; Milovanović, V. On Hardware Implementations of Two-Dimensional Fast Fourier Transform for Radar Signal Processing. In Proceedings of the IEEE EUROCON 2023—20th International Conference on Smart Technologies, Torino, Italy, 6 July 2023; pp. 400–405.
- Ioffe, S.; Szegedy, C. Batch normalization: Accelerating deep network training by reducing internal covariate shift. In Proceedings of the International Conference on Machine Learning (ICML), Lille, France, 6–11 July 2015.
- Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556.
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 30 June 2016; pp. 770–778.
- Huang, G.; Liu, Z.; Van Der Maaten, L.; Weinberger, K.Q. Densely connected convolutional networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 4700–4708.
- Gao, G.; Zhong, Y.; Gao, S.; Gao, B. Double-Channel Sequential Probability Ratio Test for Failure Detection in Multisensor Integrated Systems. IEEE Trans. Instrum. Meas. 2021, 70, 3514814.
- Gao, G.; Gao, B.; Gao, S.; Hu, G.; Zhong, Y. A Hypothesis Test-Constrained Robust Kalman Filter for INS/GNSS Integration With Abnormal Measurement. IEEE Trans. Veh. Technol. 2023, 72, 1662–1673.
- Zhao, Y.; Zhang, J.; Hu, G.; Zhong, Y. Set-Membership Based Hybrid Kalman Filter for Nonlinear State Estimation under Systematic Uncertainty. Sensors 2020, 20, 627.
- Gao, B.; Hu, G.; Zhu, X.; Zhong, Y. A Robust Cubature Kalman Filter with Abnormal Observations Identification Using the Mahalanobis Distance Criterion for Vehicular INS/GNSS Integration. Sensors 2019, 19, 5149.
- Gao, B.; Hu, G.; Zhong, Y.; Zhu, X. Cubature Kalman Filter With Both Adaptability and Robustness for Tightly-Coupled GNSS/INS Integration. IEEE Sens. J. 2021, 21, 14997–15011.
- Gao, B.; Gao, S.; Zhong, Y.; Hu, G.; Gu, C. Interacting multiple model estimation-based adaptive robust unscented Kalman filter. Int. J. Control Autom. Syst. 2017, 15, 2013–2025.
Study | Mounting Angle | Angular Performance | Downward or Wide Beam | Complexity |
---|---|---|---|---|
ODR | Dependent | Dependent | Unnecessary | Simple |
Hough transform | Dependent | Dependent | Unnecessary | Simple |
Elevated objects | Dependent | Dependent | Necessary | Simple |
Ground backscattering | Dependent | Dependent | Necessary | Simple |
RANSAC | Independent | Dependent | Unnecessary | Simple |
CVNN | Independent | Independent | Unnecessary | Complex |
Parameter | Value | Parameter | Value |
---|---|---|---|
Start frequency | 77.0991 GHz | Number of samples per chirp | 256
End frequency | 77.3104 GHz | Number of chirps per frame | 128
Bandwidth (BW) | 211.3536 MHz | Frame period | 50 ms
Sample rate (SR) | 20 MHz | Range resolution | 0.7092 m
Chirp sampling time | 12.8 µs | Maximum range | 90.7802 m
Chirp slope | 16.512 MHz/µs | Velocity resolution | 0.3792 m/s
Pulse repetition interval (PRI) | 40 µs | Maximum velocity | 24.2693 m/s
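The derived quantities in the waveform table follow from the standard FMCW relations. A quick numeric check (the mid-sweep carrier frequency and the 128-range-bin count are our assumptions, inferred from the tabulated values):

```python
# Sanity-check the derived FMCW parameters against the waveform table.
C = 299_792_458.0                 # speed of light, m/s

f_start, f_end = 77.0991e9, 77.3104e9
bw = 211.3536e6                   # sweep bandwidth from the table
n_chirps, pri = 128, 40e-6        # chirps per frame, pulse repetition interval
wavelength = C / ((f_start + f_end) / 2)   # mid-sweep carrier (assumption)

range_res = C / (2 * bw)                      # range resolution
max_range = 128 * range_res                   # assuming 128 usable range bins
vel_res = wavelength / (2 * n_chirps * pri)   # velocity resolution
max_vel = wavelength / (4 * pri)              # unambiguous velocity

print(f"{range_res:.4f} m, {max_range:.2f} m, {vel_res:.4f} m/s, {max_vel:.4f} m/s")
```

The computed values reproduce the table entries (0.7092 m, 90.78 m, 0.3792 m/s, 24.2693 m/s) to the printed precision.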
Range | X (m) | Y Start (m) | Y End (m) | Y Direction Spacing (m) | Number of Stationary Object Sets |
---|---|---|---|---|---|
a | −15 | 1 | 70 | 1 | 1 |
b | +15 | 3 | 90 | - | 2 |
Range | X (m) | Y (m) | Radial Velocity (m/s) | Number of Extraneous Objects |
---|---|---|---|---|
a | −10 | 1 | −23 | 20 |
b | +10 | 90 | +23 | 25 |
Datasets | Measurement Data | Simulation Data |
---|---|---|
Dataset 1 | 34,538 | 30,000 |
Dataset 2 | 38,909 | 30,000 |
Total | 73,447 | 60,000 |
Layer Name | Uniform Stride (2 × 2) | Non-Uniform Stride (2 × 1) | ||||||
---|---|---|---|---|---|---|---|---|
Input Size | Output Size | Channels (Input) | Channels (Output) | Input Size | Output Size | Channels (Input) | Channels (Output) | |
Input | - | 128 × 65 | - | 1 | - | 128 × 65 | - | 1 |
Conv 1 | 128 × 65 | 64 × 33 | 1 | 8 | 128 × 65 | 64 × 65 | 1 | 8 |
Conv 2 | 64 × 33 | 32 × 17 | 8 | 16 | 64 × 65 | 32 × 65 | 8 | 8 |
Conv 3 | 32 × 17 | 16 × 9 | 16 | 16 | 32 × 65 | 16 × 65 | 8 | 8
Conv 4 | 16 × 9 | 8 × 5 | 16 | 16 | 16 × 65 | 8 × 65 | 8 | 8 |
Conv 5 | 8 × 5 | 4 × 3 | 16 | 16 | 8 × 65 | 4 × 65 | 8 | 8 |
Conv 6 | 4 × 3 | 2 × 2 | 16 | 16 | 4 × 65 | 2 × 65 | 8 | 8 |
Conv 7 | - | - | - | - | 2 × 65 | 1 × 65 | 8 | 1 |
FC | 64 | 7 | - | - | 65 | 7 | - | - |
Params | 11,306 | 3633 | ||||||
Error | 7.09% | 4.26% |
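The spatial sizes in the stride-comparison table follow the usual convolution size formula out = ⌊(in + 2p − k)/s⌋ + 1. A short sketch, assuming 3 × 3 kernels with padding 1 (an assumption that reproduces every listed size):

```python
def conv_out(size, kernel=3, stride=2, pad=1):
    """Output length of a strided convolution along one axis."""
    return (size + 2 * pad - kernel) // stride + 1

def trace(shape, strides, n_layers):
    """Print the feature-map size after each conv layer."""
    for _ in range(n_layers):
        shape = tuple(conv_out(s, stride=st) for s, st in zip(shape, strides))
        print(shape)

trace((128, 65), (2, 2), n_layers=6)  # uniform stride: halves both axes
trace((128, 65), (2, 1), n_layers=7)  # non-uniform: halves rows, keeps 65 columns
```

With the uniform 2 × 2 stride the Doppler axis collapses from 65 to 2 after six layers, whereas the 2 × 1 stride preserves all 65 Doppler bins into the fully connected stage, which is consistent with the lower error reported for the non-uniform variant.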
Layer Name | Input Size | Output Size | Kernel Size (5 × 5) | Kernel Size (3 × 3) | ||
---|---|---|---|---|---|---|
Channels (Input) | Channels (Output) | Channels (Input) | Channels (Output) | |||
Input | - | 128 × 65 | - | 1 | - | 1 |
Conv 1 | 128 × 65 | 64 × 65 | 1 | 16 | 1 | 16 |
Conv 2 | 64 × 65 | 32 × 65 | 16 | 32 | 16 | 32 |
Conv 3 | 32 × 65 | 16 × 65 | 32 | 32 | 32 | 32 |
Conv 4 | 16 × 65 | 8 × 65 | 32 | 32 | 32 | 32 |
Conv 5 | 8 × 65 | 4 × 65 | 32 | 32 | 32 | 32 |
Conv 6 | 4 × 65 | 2 × 65 | 32 | 16 | 32 | 16 |
Conv 7 | 2 × 65 | 1 × 65 | 16 | 1 | 16 | 1 |
FC | 65 | 7 | - | - | - | - |
Params | - | 104,145 | 38,097 | |||
Error | - | 4.22% | 3.99% |
Layer Name | Input Size | Output Size | AVSD NET-84k | AVSD NET-38k | AVSD NET-12k | AVSD NET-3k | AVSD NET-1k | AVSD NET-717 | ||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
Channels | Channels | Channels | Channels | Channels | Channels | |||||||||
In | Out | In | Out | In | Out | In | Out | In | Out | In | Out | |||
Input | - | 128 × 65 | - | 1 | - | 1 | - | 1 | - | 1 | - | 1 | - | 1 |
Conv 1 | 128 × 65 | 64 × 65 | 1 | 16 | 1 | 16 | 1 | 16 | 1 | 8 | 1 | 4 | 1 | 2 |
Conv 2 | 64 × 65 | 32 × 65 | 16 | 32 | 16 | 32 | 16 | 16 | 8 | 8 | 4 | 4 | 2 | 2 |
Conv 3 | 32 × 65 | 16 × 65 | 32 | 64 | 32 | 32 | 16 | 16 | 8 | 8 | 4 | 4 | 2 | 2 |
Conv 4 | 16 × 65 | 8 × 65 | 64 | 64 | 32 | 32 | 16 | 16 | 8 | 8 | 4 | 4 | 2 | 2 |
Conv 5 | 8 × 65 | 4 × 65 | 64 | 32 | 32 | 32 | 16 | 16 | 8 | 8 | 4 | 4 | 2 | 2 |
Conv 6 | 4 × 65 | 2 × 65 | 32 | 16 | 32 | 16 | 16 | 16 | 8 | 8 | 4 | 4 | 2 | 2 |
Conv 7 | 2 × 65 | 1 × 65 | 16 | 1 | 16 | 1 | 16 | 1 | 8 | 1 | 4 | 1 | 2 | 1 |
FC | 65 | 7 | - | - | - | - | - | - | - | - | - | - | - | - |
Softmax | 7 | 7 | - | - | - | - | - | - | - | - | - | - | - | - |
Parameters | - | 84,369 | 38,097 | 12,561 | 3633 | 1329 | 717 | |||||||
FLOPs | - | 55,141 k | 28,533 k | 10,550 k | 2954 k | 897 k | 304 k |
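The parameter totals in the table are consistent with 3 × 3 convolutions (with bias), each followed by batch normalization, plus the final 65 → 7 fully connected layer; the batch-normalization term is our inference from the totals, not stated in the table. A small checker:

```python
def avsd_params(channels, kernel=3, fc_in=65, fc_out=7):
    """Parameter count for an AVSD Net-style stack of 2D convolutions.

    `channels` lists the output channel width of each of the seven conv
    layers (the input has 1 channel). Each conv uses a kernel x kernel
    filter with bias and is assumed to be followed by batch
    normalization (2 * out_channels parameters); a fully connected
    layer maps fc_in features to fc_out speed classes.
    """
    total, c_in = 0, 1
    for c_out in channels:
        total += c_out * (c_in * kernel * kernel + 1)  # conv weights + bias
        total += 2 * c_out                             # batch-norm gamma/beta
        c_in = c_out
    total += fc_in * fc_out + fc_out                   # fully connected layer
    return total

variants = {
    "AVSD Net-84k": [16, 32, 64, 64, 32, 16, 1],
    "AVSD Net-38k": [16, 32, 32, 32, 32, 16, 1],
    "AVSD Net-12k": [16, 16, 16, 16, 16, 16, 1],
    "AVSD Net-3k":  [8, 8, 8, 8, 8, 8, 1],
    "AVSD Net-1k":  [4, 4, 4, 4, 4, 4, 1],
    "AVSD Net-717": [2, 2, 2, 2, 2, 2, 1],
}
for name, ch in variants.items():
    print(name, avsd_params(ch))
```

Under these assumptions the checker reproduces all six tabulated totals, from 84,369 down to 717 parameters.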
Model | Processing Time on CPU | Model Size
---|---|---|
AVSD Net-84k | 2.50 ms | 355 kilobytes |
AVSD Net-38k | 2.34 ms | 174 kilobytes |
AVSD Net-12k | 1.99 ms | 74 kilobytes |
AVSD Net-3k | 1.71 ms | 37 kilobytes |
AVSD Net-1k | 1.44 ms | 29 kilobytes |
AVSD Net-717 | 1.33 ms | 26 kilobytes |
Model | Number of Parameters | Training Error Rate | Testing Error Rate |
---|---|---|---|
AVSD Net-84k | 84,369 | 0.02% | 5.29% |
AVSD Net-38k | 38,097 | 0.10% | 5.01% |
AVSD Net-12k | 12,561 | 0.17% | 6.09% |
AVSD Net-3k | 3633 | 0.32% | 5.89% |
AVSD Net-1k | 1329 | 0.64% | 7.31% |
AVSD Net-717 | 717 | 1.45% | 7.39% |
Model | Training Error Rate | Mean of Speed Ratio | Standard Deviation of Speed Ratio |
---|---|---|---|
AVSD Net-84k | 7.00% | 1.01813 | |
AVSD Net-38k | 9.47% | 1.01816 | |
AVSD Net-12k | 9.29% | 1.01813 | |
AVSD Net-3k | 7.71% | 1.01817 | |
AVSD Net-1k | 8.82% | 1.01826 | |
AVSD Net-717 | 14.94% | 1.01823 |
Window Size | Mean of Speed Ratio | Standard Deviation of Speed Ratio | Rate of Abnormal Value |
---|---|---|---|
150 | 1.01813 | 0.00% | |
100 | 1.01822 | 0.00% | |
50 | 1.01869 | 2.65% | |
25 | 1.01994 | 8.55% |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Moon, S.; Kim, D.; Kim, Y. Ego-Vehicle Speed Correction for Automotive Radar Systems Using Convolutional Neural Networks. Sensors 2024, 24, 6409. https://doi.org/10.3390/s24196409