Integrated Modeling and Target Classification Based on mmWave SAR and CNN Approach
Abstract
1. Introduction
- Reflectivity Mapping and Standardization of Online Data Set: This study collects 1000 online images of weapons from various resources and generates 3605 samples, applying reflectivity mapping and standardization during the preprocessing phase. These reflectivity-added images are then used for image reconstruction with SAR imaging techniques.
- Utilization of mm-Wave FMCW Radar with 2D SAR Imaging: Preprocessed images are reconstructed using mm-Wave FMCW radar combined with 2D SAR imaging techniques.
- ML Classification Approach: The reconstructed images are classified using CNNs, enhancing classification accuracy.
- Complete Analytical Approach: The approach employs SAR imaging for high-resolution reconstruction of preprocessed weapon images and CNNs for accurate classification. It achieves over 98% accuracy, offering a non-invasive, high-resolution alternative to traditional security screening methods with enhanced capabilities to detect weapons through non-metallic surfaces.
2. Framework Overview
2.1. Phase 1: Image Preprocessing
2.2. Phase 2: mm-Wave FMCW Image Reconstruction Using SAR
2.3. Phase 3: Classification Using Machine Learning
3. Data Collection and Preprocessing to Generate Reflectivity-Added Image
4. mm-Wave FMCW Radar Modeling
4.1. SAR Scanning Area
4.2. Numerical Modeling Approach
4.3. Tx and Rx Signal Modeling
5. Machine Learning Technique to Classify Reconstructed Images
Convolutional Neural Network Architecture
- P(i, j) is the output at position (i, j) in the pooled feature map.
- x(m, n) is the input value at position (m, n) in the input feature map (before pooling).
- (m, n) ∈ R(i, j) denotes that the coordinates (m, n) lie within the specified pooling region R(i, j), typically a small square region such as 2 × 2 or 3 × 3.
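A minimal NumPy sketch of the max-pooling operation these terms define (the helper name `max_pool2d`, the 2 × 2 window, and the sample feature map are illustrative, not code from the paper):

```python
import numpy as np

def max_pool2d(x, k=2):
    """Max pooling: P(i, j) = max of x(m, n) over the k x k region R(i, j)."""
    h, w = x.shape
    h2, w2 = h // k, w // k
    # Reshape so each k x k pooling region gets its own pair of axes.
    regions = x[:h2 * k, :w2 * k].reshape(h2, k, w2, k)
    return regions.max(axis=(1, 3))

fmap = np.array([[1, 3, 2, 4],
                 [5, 6, 1, 0],
                 [7, 2, 9, 8],
                 [3, 4, 6, 5]], dtype=float)
print(max_pool2d(fmap))  # [[6. 4.] [7. 9.]]
```

Each output cell keeps only the strongest response in its window, which is what gives pooling its translation tolerance.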
- y is the output vector after applying the activation function.
- W is the weight matrix, containing the learned weights during training.
- x is the input vector, typically a flattened feature vector from the preceding layer.
- b is the bias vector, which is added to the weighted sum to enhance model fitting by shifting the activation function.
- Activation is the function applied to each element of the weighted sum plus bias, introducing non-linearity into the model (common functions include ReLU, sigmoid, and softmax).
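The fully connected relation y = Activation(Wx + b) can be sketched as follows; the specific weights, inputs, and the ReLU default are illustrative assumptions, not values from the paper:

```python
import numpy as np

def dense(x, W, b, activation=lambda z: np.maximum(z, 0)):
    """Fully connected layer: y = Activation(W @ x + b); ReLU by default."""
    return activation(W @ x + b)

x = np.array([1.0, 2.0, 0.5])       # flattened feature vector from the preceding layer
W = np.array([[0.2, 0.4, 0.1],
              [-0.5, 0.3, 0.8]])    # learned weight matrix (2 outputs, 3 inputs)
b = np.array([0.05, -0.1])          # bias vector shifting the activation
print(dense(x, W, b))               # [1.1 0.4]
```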
- σ(z)_i denotes the softmax output for the i-th class, representing the probability that the input belongs to class i.
- e^{z_i} is the exponential of the i-th element of the input vector z.
- Σ_j e^{z_j} is the normalization factor that ensures all output probabilities sum to one; it is the sum of exponentials over every element of the input vector z.
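A small sketch of the softmax defined above; the max-shift is a standard numerical-stability trick, and the example logits are illustrative:

```python
import numpy as np

def softmax(z):
    """sigma(z)_i = exp(z_i) / sum_j exp(z_j); shifting by max(z) avoids overflow."""
    e = np.exp(z - z.max())
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])
p = softmax(logits)
print(p, p.sum())  # probabilities over the classes, summing to 1
```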
- y(i, j) is the output of the convolution operation at position (i, j) in the output feature map.
- x(i + m, j + n) is the input image or feature-map value at position (i + m, j + n).
- w(m, n) is the convolution kernel (filter) value at position (m, n).
- Σ_m Σ_n denotes the summation across all positions (m, n) of the convolution kernel.
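The summation y(i, j) = Σ_m Σ_n x(i + m, j + n) · w(m, n) can be written out directly; this is the "valid", cross-correlation form commonly implemented in CNN frameworks, and the kernel below is an illustrative example:

```python
import numpy as np

def conv2d_valid(x, w):
    """y(i, j) = sum_m sum_n x(i + m, j + n) * w(m, n), 'valid' padding."""
    kh, kw = w.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    y = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Slide the kernel over the input and sum the elementwise products.
            y[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return y

x = np.arange(16, dtype=float).reshape(4, 4)
w = np.array([[1.0, 0.0], [0.0, -1.0]])  # simple diagonal-difference kernel
print(conv2d_valid(x, w))  # constant -5 everywhere on this ramp input
```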
- L(y, ŷ) is the cross-entropy loss between the true labels y and the predicted probabilities ŷ.
- y_k is the true label for the k-th class, typically a binary value (0 or 1) indicating the correct class.
- ŷ_k is the predicted probability of the k-th class, as output by the model.
- Σ_k denotes the summation over all classes k.
- log(ŷ_k) is the natural logarithm of the predicted probability for the k-th class.
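Putting these terms together, L(y, ŷ) = −Σ_k y_k log(ŷ_k) can be sketched as below; the one-hot label, the example probabilities, and the small epsilon guard are illustrative:

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """L(y, y_hat) = -sum_k y_k * log(y_hat_k); eps guards against log(0)."""
    return -np.sum(y_true * np.log(y_pred + eps))

y_true = np.array([0.0, 1.0, 0.0])   # one-hot true label (correct class: 1)
y_pred = np.array([0.1, 0.7, 0.2])   # model's softmax output
print(cross_entropy(y_true, y_pred))  # -log(0.7) ~ 0.357
```

With a one-hot label only the correct class contributes, so the loss shrinks exactly as the model grows more confident in the right class.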
6. Discussion and Results
6.1. Reconstruction of Reflectivity-Added Images
6.2. CNN Performance Results for SAR-Reconstructed Images
Training and Evaluation
7. Comparison with Previous Works
8. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Alvarez, L.; Schell, S.; Schell, A. Millimeter-wave radar for automotive applications. IEEE Microw. Mag. 2019, 20, 72–80.
- Smith, A.; Lee, B.; Kim, J. Radar Imaging Techniques for Autonomous Vehicles. IEEE Trans. Veh. Technol. 2022, 71, 2345–2356.
- Li, Z.; Zhang, R.; Liang, C.; Zhang, X.; Zhang, S. Wireless Wearable Sensors in Smart Healthcare Monitoring: Current Trends and Future Challenges. Sensors 2020, 20, 6474.
- Zhang, W.; Liu, Z.; Feng, Y. Neural Network Approaches to SAR Image Reconstruction. IEEE Trans. Neural Netw. Learn. Syst. 2022, 33, 2118–2130.
- Guo, L.; Li, T.; Zhou, X. Deep Learning for Synthetic Aperture Radar Imaging. IEEE Access 2021, 9, 43425–43437.
- Liu, Y.; Zhang, Z.; Wang, Y. Deep Learning-Based Target Classification Using mmWave FMCW Radar. IEEE Trans. Geosci. Remote Sens. 2021, 59, 3891–3904.
- Chen, X.; Zhao, D.; Huang, S. Image Reconstruction Using SAR with Deep Learning. IEEE Trans. Image Process. 2021, 30, 2938–2949.
- Park, S.; Lee, J.; Choi, H. High-Resolution Radar Imaging with Deep Learning. Remote Sens. 2020, 12, 1584.
- Cianca, E.; Dio, C.D.; Ruggieri, M. Radar for Detection and Imaging of Non-Metallic Weapons: A Review. IEEE Trans. Aerosp. Electron. Syst. 2013, 49, 1993–2011.
- Lee, H.; Kim, S.; Park, Y. Radar-Based Imaging Techniques for Security Applications. IEEE Access 2020, 8, 123456–123468.
- Sun, Y.; Liu, Y. Machine Learning for Radar Image Processing: Advances and Challenges. IEEE Access 2019, 7, 155432–155444.
- Gao, F.; Xu, H.; Yang, J. Synthetic Aperture Radar Imaging Techniques for High-Resolution Target Detection. IEEE Trans. Aerosp. Electron. Syst. 2020, 56, 1043–1056.
- Zheng, L.; Wang, L.; Li, Y. Robust Radar Target Recognition Based on Deep Learning Techniques. IEEE Trans. Signal Process. 2019, 67, 2743–2755.
- Huang, J.; Wang, L.; Zhang, D. Millimeter-Wave Radar for High-Resolution Object Classification. IEEE Sens. J. 2022, 22, 3561–3572.
- Liu, R.; Chen, F.; Zhou, Y. Dynamic Object Detection with mmWave Radar and Deep Learning. IEEE Access 2023, 11, 1203–1214.
- Zhang, H.; Xu, J.; Li, T. Radar Signal Processing with Deep Learning for Target Classification. IEEE Trans. Signal Process. 2023, 72, 1123–1135.
- Yu, S.; Fan, W.; Luo, Q. Machine Learning in Radar Applications: A Survey of Recent Advances. IEEE Commun. Surv. Tutor. 2023, 25, 15–37.
- Liu, C.; Ma, Y.; Gao, J. Fusion of Radar and Optical Data for Improved Target Detection. IEEE Trans. Geosci. Remote Sens. 2022, 60, 3145–3157.
- Li, G.; Sun, Y.; Yu, K. Deep Learning Techniques for SAR Image Analysis. IEEE Trans. Neural Netw. Learn. Syst. 2022, 33, 4001–4012.
- Chen, L.; Zhang, Z.; Dong, Y. Real-Time Radar Imaging with Deep Learning. IEEE Trans. Geosci. Remote Sens. 2023, 61, 453–465.
- Gao, G. Synthetic Aperture Radar Imaging with Deep Learning Techniques. IEEE Trans. Image Process. 2020, 29, 5627–5638.
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep Residual Learning for Image Recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 770–778.
- Wadde, C.; Karvande, R.S.; Kumar, R. Modeling of mmWave FMCW Radar System for 2D SAR Imaging. In Proceedings of the 2024 IEEE Space, Aerospace and Defence Conference (SPACE), Bangalore, India, 22–23 July 2024.
- Wadde, C.; Routhu, G.; Karvande, R.S.; Kumar, R. Preliminary Analysis of mmWave SAR Model and Machine Learning Approach. In Proceedings of the 2024 IEEE Space, Aerospace and Defence Conference (SPACE), Bangalore, India, 22–23 July 2024.
- Smith, J.; Johnson, A. Synthetic Aperture Radar Imaging: Principles and Applications; Springer: Berlin/Heidelberg, Germany, 2022.
- Brocker, D.M.; Hohn, C.H.; Polcawich, A.C. High-resolution imaging with millimeter-wave radar through various clothing materials. In Proceedings of the IEEE Radar Conference, Pasadena, CA, USA, 4–8 May 2009; pp. 1–6.
- Guan, J.; Madani, S.; Jog, S.; Gupta, S.; Hassanieh, H. Through fog high-resolution imaging using millimeter wave radar. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 14–19 June 2020.
- Isom, B.; Palmer, R.; Kelley, R.; Meier, J.; Bodine, D.; Yeary, M.; Cheong, B.L.; Zhang, Y.; Yu, T. Real-time high-resolution 3D radar imaging using a MEMS-based programmable FMCW array radar. In Proceedings of the 2012 IEEE Radar Conference, Atlanta, GA, USA, 7–11 May 2012.
- Niu, Y.; Li, Y.; Jin, D.; Su, L.; Vasilakos, A.V. A Survey of Millimeter Wave (mmWave) Communications for 5G: Opportunities and Challenges. IEEE J. Sel. Areas Commun. 2017, 35, 1909–1935.
- Park, J.; Kim, S. Synthetic Aperture Radar Signal Processing with MATLAB Algorithms; Wiley: Hoboken, NJ, USA, 2015.
- Hastie, T.; Tibshirani, R.; Friedman, J. The Elements of Statistical Learning: Data Mining, Inference, and Prediction; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2009.
- Murphy, K.P. Machine Learning: A Probabilistic Perspective; MIT Press: Cambridge, MA, USA, 2012.
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444.
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1097–1105.
- Simonyan, K.; Zisserman, A. Very Deep Convolutional Networks for Large-Scale Image Recognition. arXiv 2014, arXiv:1409.1556.
- Zeiler, M.D.; Fergus, R. Visualizing and understanding convolutional networks. In European Conference on Computer Vision; Springer: Berlin/Heidelberg, Germany, 2014; pp. 818–833.
- Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2016.
- Nair, V.; Hinton, G.E. Rectified linear units improve restricted boltzmann machines. In Proceedings of the 27th International Conference on Machine Learning (ICML-10), Haifa, Israel, 21–24 June 2010; pp. 807–814.
- Szegedy, C.; Liu, W.; Jia, Y.; Sermanet, P.; Reed, S.; Anguelov, D.; Erhan, D.; Vanhoucke, V.; Rabinovich, A. Going deeper with convolutions. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 1–9.
- Scherer, D.; Müller, A.; Behnke, S. Evaluation of pooling operations in convolutional architectures for object recognition. In International Conference on Artificial Neural Networks; Springer: Berlin/Heidelberg, Germany, 2010; pp. 92–101.
- LeCun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-based learning applied to document recognition. Proc. IEEE 1998, 86, 2278–2324.
- Bishop, C.M. Pattern Recognition and Machine Learning; Springer: Berlin/Heidelberg, Germany, 2006.
- He, K.; Zhang, X.; Ren, S.; Sun, J. Convolutional neural networks at constrained time cost. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 5353–5360.
- Zeiler, M.D.; Fergus, R. Stochastic pooling for regularization of deep convolutional neural networks. arXiv 2013, arXiv:1301.3557.
- Huang, G.; Liu, Z.; Maaten, L.V.D.; Weinberger, K.Q. Densely connected convolutional networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 4700–4708.
- Glorot, X.; Bordes, A.; Bengio, Y. Deep sparse rectifier neural networks. In Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, Fort Lauderdale, FL, USA, 11–13 April 2011; pp. 315–323.
- Long, J.; Shelhamer, E.; Darrell, T. Fully convolutional networks for semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 3431–3440.
- Ioffe, S.; Szegedy, C. Batch normalization: Accelerating deep network training by reducing internal covariate shift. In Proceedings of the International Conference on Machine Learning, Lille, France, 6–11 July 2015; pp. 448–456.
- Ruder, S. An overview of gradient descent optimization algorithms. arXiv 2016, arXiv:1609.04747.
- Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980.
- Shorten, C.; Khoshgoftaar, T.M. A survey on image data augmentation for deep learning. J. Big Data 2019, 6, 60.
- Srivastava, N.; Hinton, G.; Krizhevsky, A.; Sutskever, I.; Salakhutdinov, R. Dropout: A simple way to prevent neural networks from overfitting. J. Mach. Learn. Res. 2014, 15, 1929–1958.
- Wu, Y.; Zhang, L.; Li, H. FMCW radar signal processing for enhanced image resolution. IEEE Trans. Signal Process. 2023, 71, 923–933.
- Kang, D.; Oh, S.; Yoon, J. Real-time SAR imaging with deep learning methods. IEEE Trans. Geosci. Remote Sens. 2021, 60, 210423–210434.
- Wang, T.; Liu, B.; Shi, J. SAR image target classification using convolutional neural networks. IEEE Geosci. Remote Sens. Lett. 2018, 15, 154–158.
- Li, M.; Wu, J.; Du, Q. Target recognition in SAR images based on deep learning techniques. IEEE Trans. Geosci. Remote Sens. 2017, 55, 3684–3695.
Step-by-Step Overview Workflow

1. Online images collection and preprocessing
2. mm-Wave FMCW radar modeling and image reconstruction
3. Generate reflectivity-added images
4. Reconstruct images using the SAR imaging technique
5. Save the reconstructed images in a folder
6. Import the reconstructed images for classification
7. Normalize, reshape, and design the CNN architecture
8. Performance evaluation and analysis
| S.No | Parameter | Value | Property |
|---|---|---|---|
| 1 | Object reflectivity | 0.8 to 1 | Near-perfect reflector |
| 2 | Background reflectivity | 0 to 0.3 | Near-perfect absorber |
| 3 | Image size | m × n | Online resource image |
| 4 | grid_size | 100 × 100 | Reflectivity-added image |
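A minimal sketch of the reflectivity-mapping step using the ranges in the table above; the thresholding rule, the nearest-neighbor resize, and the helper name `reflectivity_map` are assumptions for illustration, not the paper's exact preprocessing code:

```python
import numpy as np

def reflectivity_map(gray, grid_size=100, threshold=0.5):
    """Map a grayscale image (values in [0, 1]) onto the table's reflectivity
    ranges: object pixels -> [0.8, 1.0], background pixels -> [0.0, 0.3].
    The image is resampled to grid_size x grid_size by nearest-neighbor indexing."""
    h, w = gray.shape
    rows = np.arange(grid_size) * h // grid_size
    cols = np.arange(grid_size) * w // grid_size
    g = gray[np.ix_(rows, cols)]
    obj = g >= threshold
    out = np.empty_like(g)
    out[obj] = 0.8 + 0.2 * g[obj]   # scale object pixels into [0.8, 1.0]
    out[~obj] = 0.3 * g[~obj]       # scale background pixels into [0.0, 0.3]
    return out

img = np.zeros((200, 160))
img[60:140, 40:120] = 1.0           # synthetic weapon-like silhouette
r = reflectivity_map(img)
print(r.shape, r.max(), r.min())    # 100 x 100 reflectivity-added image
```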
| S.No | Parameter | Description | Value |
|---|---|---|---|
| 1 | c | Speed of light | 3 × 10^8 m/s |
| 2 | max_range | Maximum range for the radar system | 100 mm |
| 3 | object_range | Range of the object | 50 mm |
| 4 | range_resolution | Range resolution of the radar system | 3.8 mm |
| 5 | fc | Carrier frequency | 77 GHz |
| 6 | B | Chirp bandwidth | |
| 7 | Tchirp | Chirp time | |
| 8 | slope | Chirp slope | |
| 9 | Nd | Number of scanning points along the x-axis | 100 |
| 10 | Nr | Number of scanning points along the y-axis | 100 |
| 11 | grid_size | Total grid size | 100 × 100 |
| 12 | step_sizex | Step size along the x-axis | 1.9 mm |
| 13 | step_sizey | Step size along the y-axis | 1.9 mm |
| 14 | num_freq | Number of frequencies to sum over | 10 |
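The FMCW range-estimation chain implied by these parameters can be sketched as follows. The chirp bandwidth is derived here from the stated 3.8 mm range resolution via B = c / (2 · ΔR); the chirp time `Tchirp` and IF sampling rate `fs` are assumed values for illustration, not taken from the paper:

```python
import numpy as np

c = 3e8                         # speed of light (m/s)
fc = 77e9                       # carrier frequency (from the table)
range_resolution = 3.8e-3       # 3.8 mm (from the table)
B = c / (2 * range_resolution)  # chirp bandwidth implied by the range resolution
Tchirp = 40e-6                  # chirp time -- assumed for illustration
slope = B / Tchirp              # chirp slope (Hz/s)
R = 0.05                        # object_range = 50 mm (from the table)

tau = 2 * R / c                 # round-trip delay to the target
fs = 2e6                        # IF sampling rate -- assumed for illustration
t = np.arange(0, Tchirp, 1 / fs)
# Low-pass mixer output (beat signal): frequency slope * tau, constant phase fc * tau.
beat = np.cos(2 * np.pi * (slope * tau * t + fc * tau))

spec = np.abs(np.fft.rfft(beat))
f_peak = np.argmax(spec[1:]) + 1        # skip the DC bin
f_hat = f_peak * fs / len(t)            # beat-frequency estimate
R_hat = f_hat * c / (2 * slope)         # recovered target range
print(f"estimated range = {R_hat * 1e3:.1f} mm")
```

The FFT bin spacing fs / N maps to a range bin of exactly c / (2B), so the recovered range lands within one range-resolution cell of the true 50 mm target.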
| Parameter | Previous Works 1 [2,3,4,8,22] | Previous Works 2 [7,10,12,13,14,19] | Previous Works 3 [11,20,26,53,54,55,56] | This Research Work |
|---|---|---|---|---|
| Type of Radar | mmWave FMCW | mmWave FMCW | SAR Radar | mmWave FMCW |
| SAR Technique | Range-Doppler SAR | Spotlight SAR | Synthetic SAR | Range-Doppler SAR |
| Distance | 20–30 m | 10–50 m | 5–20 m | 10–50 m |
| Bandwidth | 76–81 GHz | 77–81 GHz | 75–80 GHz | 77–78 GHz |
| Data Collection | Extensive real-world data | Limited real-world data | Simulated data | Simulated + online data |
| Reflectivity | Complex multi-material | Single material | Single material | Background: 0–0.3; Object: 0.8–1 |
| ML Technique | ResNet, VGG-19 | VGG-16, ResNet | SVM, CNN | CNN |
| System Complexity | High | Moderate | Low | Moderate |
| Radar Configuration | Monostatic | Bistatic | Monostatic | Monostatic |
| CNN Network Size | 50+ layers | 16–50 layers | N/A | 10 layers |
| Signal Processing | Advanced DSP | Basic DSP | Basic DSP | Advanced DSP |
| Post-Processing Requirements | Extensive | Moderate | Minimal | Minimal |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wadde, C.; Routhu, G.; Clemente-Arenas, M.; Gummadi, S.P.; Kumar, R. Integrated Modeling and Target Classification Based on mmWave SAR and CNN Approach. Sensors 2024, 24, 7934. https://doi.org/10.3390/s24247934