Breast Cancer Diagnosis Using YOLO-Based Multiscale Parallel CNN and Flattened Threshold Swish
Abstract
1. Introduction
1.1. Current State-of-the-Art in BCD
1.2. Research Gap Analysis
1.3. Contributions
1.4. Organization
2. Material and Methods
2.1. Datasets for Performance Evaluation
2.2. Image Preprocessing
2.3. Breast Cancer Diagnosis Framework
2.4. Proposed Backbone CNN Architecture
2.5. Activation Function
2.6. Loss Function
2.7. Performance Evaluation Metrics
2.8. Experimental Setup and Implementation
3. Results and Discussion
3.1. Breast Cancer Classification Comparative Analysis
3.2. Ablation Experiment for Breast Cancer Detection
3.3. Breast Cancer Detection Comparative Analysis
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Jeon, W.; Ko, G.; Lee, J.; Lee, H.; Ha, D.; Ro, W.W. Deep learning with GPUs. Adv. Comput. 2021, 122, 167–215.
- Anand, V.; Gupta, S.; Koundal, D.; Singh, K. Fusion of U-Net and CNN model for segmentation and classification of skin lesion from dermoscopy images. Expert Syst. Appl. 2023, 213, 119230.
- Hasan, A.M.; Al-Waely, N.K.N.; Aljobouri, H.K.; Jalab, H.A.; Ibrahim, R.W.; Meziane, F. Molecular subtypes classification of breast cancer in DCE-MRI using deep features. Expert Syst. Appl. 2024, 236, 121371.
- Van Do, Q.; Hoang, H.T.; Van Vu, N.; De Jesus, D.A.; Brea, L.S.; Nguyen, H.X.; Nguyen, A.T.L.; Le, T.N.; Dinh, D.T.M.; Nguyen, M.T.B.; et al. Segmentation of hard exudate lesions in color fundus image using two-stage CNN-based methods. Expert Syst. Appl. 2024, 241, 122742.
- Baccouche, A.; Garcia-Zapirain, B.; Olea, C.C.; Elmaghraby, A.S. Breast lesions detection and classification via YOLO-based fusion models. Comput. Mater. Contin. 2021, 69, 1407–1425.
- Sechopoulos, I.; Teuwen, J.; Mann, R. Artificial intelligence for breast cancer detection in mammography and digital breast tomosynthesis: State of the art. Semin. Cancer Biol. 2021, 72, 214–225.
- Jessica, T. Contrast-Enhanced Spectral Mammography (CESM) versus Breast Magnetic Resonance Imaging (MRI) in Breast Cancer Detection among Patients with Newly Diagnosed Breast Cancer: A Systematic Review. J. Med. Imaging Radiat. Sci. 2023, 54, S35–S36.
- Koh, J.; Yoon, Y.; Kim, S.; Han, K.; Kim, E.K. Deep Learning for the Detection of Breast Cancers on Chest Computed Tomography. Clin. Breast Cancer 2022, 22, 26–31.
- Kratkiewicz, K.; Pattyn, A.; Alijabbari, N.; Mehrmohammadi, M. Ultrasound and Photoacoustic Imaging of Breast Cancer: Clinical Systems, Challenges, and Future Outlook. J. Clin. Med. 2022, 11, 1165.
- Moloney, B.M.; McAnena, P.F.; Elwahab, S.M.A.; Fasoula, A.; Duchesne, L.; Gil Cano, J.D.; Glynn, C.; O’Connell, A.; Ennis, R.; Lowery, A.J.; et al. Microwave Imaging in Breast Cancer—Results from the First-In-Human Clinical Investigation of the Wavelia System. Acad. Radiol. 2022, 29, S211–S222.
- Zhao, J.; Chen, T.; Cai, B. A computer-aided diagnostic system for mammograms based on YOLOv3. Multimed. Tools Appl. 2022, 81, 19257–19281.
- Belhaj Soulami, K.; Kaabouch, N.; Nabil Saidi, M. Breast cancer: Classification of suspicious regions in digital mammograms based on capsule network. Biomed. Signal Process. Control 2022, 76, 103696.
- Elkorany, A.S.; Elsharkawy, Z.F. Efficient breast cancer mammograms diagnosis using three deep neural networks and term variance. Sci. Rep. 2023, 13, 2663.
- Kang, D.; Gweon, H.M.; Eun, N.L.; Youk, J.H.; Kim, J.A.; Son, E.J. A convolutional deep learning model for improving mammographic breast-microcalcification diagnosis. Sci. Rep. 2021, 11, 23925.
- Haq, I.U.; Ali, H.; Wang, H.Y.; Lei, C.; Ali, H. Feature fusion and Ensemble learning-based CNN model for mammographic image classification. J. King Saud Univ.—Comput. Inf. Sci. 2022, 34, 3310–3318.
- Escorcia-Gutierrez, J.; Mansour, R.F.; Beleño, K.; Jiménez-Cabas, J.; Pérez, M.; Madera, N.; Velasquez, K. Automated Deep Learning Empowered Breast Cancer Diagnosis Using Biomedical Mammogram Images. Comput. Mater. Contin. 2022, 71, 4221–4235.
- Sannasi Chakravarthy, S.R.; Rajaguru, H. Automatic Detection and Classification of Mammograms Using Improved Extreme Learning Machine with Deep Learning. IRBM 2022, 43, 49–61.
- Salama, W.M.; Aly, M.H. Deep learning in mammography images segmentation and classification: Automated CNN approach. Alex. Eng. J. 2021, 60, 4701–4709.
- Xie, L.; Zhang, L.; Hu, T.; Huang, H.; Yi, Z. Neural networks model based on an automated multi-scale method for mammogram classification. Knowl. Based Syst. 2020, 208, 106465.
- Zhang, L.; Li, Y.; Chen, H.; Wu, W.; Chen, K.; Wang, S. Anchor-free YOLOv3 for mass detection in mammogram. Expert Syst. Appl. 2022, 191, 116273.
- Meng, M.; Zhang, M.; Shen, D.; He, G.; Guo, Y. Detection and classification of breast lesions with You Only Look Once version 5. Future Oncol. 2022, 18, 4361–4370.
- Su, Y.; Liu, Q.; Xie, W.; Hu, P. YOLO-LOGO: A transformer-based YOLO segmentation model for breast mass detection and segmentation in digital mammograms. Comput. Methods Programs Biomed. 2022, 221, 106903.
- Ashraf, M.H.; Jabeen, F.; Alghamdi, H.; Zia, M.S.; Almutairi, M.S. HVD-Net: A Hybrid Vehicle Detection Network for Vision-Based Vehicle Tracking and Speed Estimation. J. King Saud Univ.—Comput. Inf. Sci. 2023, 35, 101657.
- Loizidou, K.; Elia, R.; Pitris, C. Computer-aided breast cancer detection and classification in mammography: A comprehensive review. Comput. Biol. Med. 2023, 153, 106554.
- Vijayan, D.; Lavanya, R. Integration of Global and Local Descriptors for Mass Characterization in Mammograms. Procedia Comput. Sci. 2023, 218, 393–405.
- Irshad Khan, A.; Abushark, Y.B.; Alsolami, F.; Almalawi, A.; Alam, M.; Kshirsagar, P.; Khan, R.A. Prediction of breast cancer based on computer vision and artificial intelligence techniques. Measurement 2023, 218, 113230.
- Selvi, A.; Thilagamani, S. Scale Invariant Feature Transform with Crow Optimization for Breast Cancer Detection. Intell. Autom. Soft Comput. 2023, 36.
- Lemons, K. A Comparison Between Naïve Bayes and Random Forest to Predict Breast Cancer. Int. J. Undergrad. Res. Creat. Act. 2023, 12, 10.
- Malebary, S.J.; Hashmi, A. Automated Breast Mass Classification System Using Deep Learning and Ensemble Learning in Digital Mammogram. IEEE Access 2021, 9, 55312–55328.
- Wen, X.; Guo, X.; Wang, S.; Lu, Z.; Zhang, Y. Breast cancer diagnosis: A systematic review. Biocybern. Biomed. Eng. 2024, 44, 119–148.
- Abduljabbar, M.; Al Bayati, Z.; Çakmak, M. Real-Time Vehicle Detection for Surveillance of River Dredging Areas Using Convolutional Neural Networks. Int. J. Image Graph. Signal Process. 2023, 5, 17–28.
- Saber, A.; Sakr, M.; Abo-Seida, O.M.; Keshk, A.; Chen, H. A Novel Deep-Learning Model for Automatic Detection and Classification of Breast Cancer Using the Transfer-Learning Technique. IEEE Access 2021, 9, 71194–71209.
- Ibrokhimov, B.; Kang, J.Y. Two-Stage Deep Learning Method for Breast Cancer Detection Using High-Resolution Mammogram Images. Appl. Sci. 2022, 12, 4616.
- Alghamdi, H.; Turki, T. PDD-Net: Plant Disease Diagnoses Using Multilevel and Multiscale Convolutional Neural Network Features. Agriculture 2023, 13, 1072.
- Moreira, I.C.; Amaral, I.; Domingues, I.; Cardoso, A.; Cardoso, M.J.; Cardoso, J.S. INbreast: Toward a Full-field Digital Mammographic Database. Acad. Radiol. 2012, 19, 236–248.
- Lee, R.S.; Gimenez, F.; Hoogi, A.; Miyake, K.K.; Gorovoy, M.; Rubin, D.L. A curated mammography data set for use in computer-aided detection and diagnosis research. Sci. Data 2017, 4, 170177.
- Szegedy, C.; Vanhoucke, V.; Ioffe, S.; Shlens, J.; Wojna, Z. Rethinking the Inception Architecture for Computer Vision. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 2818–2826.
- Tharwat, A. Classification assessment methods. Appl. Comput. Inform. 2018, 17, 168–192.
- Padilla, R.; Netto, S.L.; Da Silva, E.A.B. A Survey on Performance Metrics for Object-Detection Algorithms. In Proceedings of the 2020 International Conference on Systems, Signals and Image Processing (IWSSIP), Niteroi, Brazil, 1–3 July 2020; pp. 237–242.
Author | Methods | Dataset | Limitations |
---|---|---|---|
Vijayan et al. [25] | They proposed a feature fusion strategy utilizing global (e.g., morphological features and GLCM) and local (e.g., SIFT and LBP) features for BCD. | CBIS-DDSM | The reliance on predefined feature sets limits the adaptability and generalization of models across diverse mammogram datasets, and the complex shapes and sizes of lesions present significant challenges for models trained on specific features. |
Irshad et al. [26] | Utilized the HOG and GLCM handcrafted features for BCD. | BreakHis | HOG features are sensitive to geometric and photometric transformations, while GLCM-derived texture features are particularly vulnerable to noise in mammogram images. |
Selvi et al. [27] | Utilized the Kalman filter for noise removal and SIFT features in their BCD framework. | CBIS-DDSM | The SIFT mechanism’s reliance on gradient orientation and magnitude can lead to limited effectiveness in capturing the complex textural details and slight intensity variations often present in mammographic images of breast cancer tissues. |
Baccouche et al. [5] | Proposed a YOLO V2-based framework that utilized Darknet19 CNN architecture for feature extraction and BCD. | CBIS-DDSM and INbreast. | DarkNet-19 is a feed-forward CNN that overlooks the importance of multi-level and multi-scale features. This oversight can lead to challenges such as gradient vanishing or exploding, limiting its effectiveness in managing class variation crucial for accurate BCD [23]. |
Zhao et al. [11] | Proposed a YOLO v3-based framework that utilized Darknet53 CNN architecture for feature extraction and BCD. | CBIS-DDSM | Although DarkNet-53 utilizes multilevel features to mitigate the gradient vanishing problem, it still ignores multi-scale features, which are vital for a scale-invariant BCD framework. |
Xie et al. [19] | Utilized DenseNet and MobileNet CNN architectures for BCD. | INbreast | DenseNet’s computational complexity and extensive use of dense connection blocks can lead to model overfitting; on the other hand, MobileNet, while more lightweight, tends to overlook multi-level features. |
Meng et al. [21] | Proposed a YOLO v5-based framework that utilized Cross Stage Partial-DarkNet53 CNN architecture. | DCE-MRI | The CSP-DarkNet53 architecture is computationally expensive, which hinders deployment in resource-constrained settings. Its feature integration does not optimally capture multi-scale details, which are critical for detecting subtle breast cancer signs. |
Su et al. [22] | Proposed the YOLO-LOGO framework based on YOLOV5 CNN architecture. | CBIS-DDSM and INbreast | The YOLOv5 CNN architecture is computationally expensive and ignores multiscale features. |
Saber et al. [32] | Proposed a transfer learning-based BCD framework that utilized VGG-16 CNN architecture. | MIAS | VGG-16 is a simple CNN architecture that ignores multilevel and multiscale features. |
Ibrokhimov et al. [33] | Proposed a Faster-RCNN-based framework for BCD. | INbreast | The Faster-RCNN architecture, as a two-stage detector, first extracts RoIs and then performs classification, which slows down the detection process and increases the overall system complexity.
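
A recurring limitation in the table above is that the surveyed backbones ignore multi-scale or multi-level features. As a minimal illustrative sketch only, and not the authors' backbone from Section 2.4, the PyTorch block below applies parallel convolutions with different kernel sizes to the same input and concatenates their responses, in the spirit of the Inception architecture cited in the references (Szegedy et al.); the class name and channel counts are hypothetical.

```python
# Minimal sketch of a multi-scale parallel convolution block (Inception-style).
# Channel counts are illustrative, not those of the proposed backbone.
import torch
import torch.nn as nn

class MultiScaleParallelBlock(nn.Module):
    def __init__(self, in_channels: int, branch_channels: int = 32):
        super().__init__()
        # Parallel branches with different receptive fields (1x1, 3x3, 5x5).
        self.branch1 = nn.Conv2d(in_channels, branch_channels, kernel_size=1)
        self.branch3 = nn.Conv2d(in_channels, branch_channels, kernel_size=3, padding=1)
        self.branch5 = nn.Conv2d(in_channels, branch_channels, kernel_size=5, padding=2)
        self.act = nn.LeakyReLU(0.1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Concatenate the multi-scale responses along the channel dimension.
        return torch.cat(
            [self.act(self.branch1(x)), self.act(self.branch3(x)), self.act(self.branch5(x))],
            dim=1,
        )

if __name__ == "__main__":
    block = MultiScaleParallelBlock(in_channels=3)
    out = block(torch.randn(1, 3, 224, 224))
    print(out.shape)  # torch.Size([1, 96, 224, 224])
```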
Framework | Activation Function | Precision (%) | Recall (%) | F1-Score (%) | Accuracy (%) |
---|---|---|---|---|---|
Faster RCNN Ibrokhimov et al. [33] | ReLU | 84.36 | 85.23 | 84.70 | 87.80 |
YOLO V3 Zhang et al. [20] | Leaky ReLU | 90.94 | 89.83 | 90.36 | 92.16 |
YOLO V5 Meng et al. [21] | Leaky ReLU | 92.08 | 91.31 | 91.67 | 93.72 |
YOLO-LOGO Su et al. [22] | Leaky ReLU | 95.53 | 95.47 | 95.50 | 96.51 |
YOLO-BCD (Proposed) | Leaky ReLU | 96.39 | 97.03 | 96.70 | 97.56 |
YOLO-BCD (Proposed) | FTS | 97.13 | 97.93 | 97.51 | 98.08 |
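
The last two rows of the table isolate the effect of the activation function within the proposed framework. For reference, the sketch below implements only the two standard baselines visible in the table, Leaky ReLU and the plain Swish from which the Flattened Threshold Swish (FTS) takes its name; the FTS function itself is defined in Section 2.5 of the paper and is deliberately not reproduced here.

```python
# Baseline activation functions referenced in the comparison (not the proposed FTS).
import numpy as np

def leaky_relu(x: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    # f(x) = x for x > 0, alpha * x otherwise.
    return np.where(x > 0, x, alpha * x)

def swish(x: np.ndarray) -> np.ndarray:
    # Standard Swish: f(x) = x * sigmoid(x).
    return x / (1.0 + np.exp(-x))

if __name__ == "__main__":
    x = np.linspace(-5.0, 5.0, 11)
    print(leaky_relu(x))
    print(swish(x))
```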
Multi-Scale Features | Multi-Level Features | CLAHE | Activation Function | Loss Function | Mass AP (%) | Calcification AP (%) | mAP (%) |
---|---|---|---|---|---|---|---|
✘ | ✔ | ✔ | FTS | Scale-invariant | 87.17 | 86.98 | 87.08 |
✔ | ✘ | ✔ | FTS | Scale-invariant | 88.44 | 87.59 | 88.02 |
✔ | ✔ | ✘ | FTS | Scale-invariant | 89.14 | 88.17 | 88.66 |
✔ | ✔ | ✔ | FTS | YOLO Loss | 89.77 | 90.11 | 89.94 |
✔ | ✔ | ✔ | Leaky ReLU | Scale-invariant | 90.55 | 90.64 | 90.60 |
✔ | ✔ | ✔ | FTS | Scale-invariant | 90.95 | 91.35 | 91.15 |
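
One of the ablated components above is CLAHE preprocessing (removed in the third row). A minimal sketch of applying CLAHE to a grayscale mammogram with OpenCV is shown below; the file path and the clip-limit and tile-grid settings are illustrative assumptions, not the exact configuration described in Section 2.2.

```python
# Minimal CLAHE preprocessing sketch with OpenCV; parameters are illustrative.
import cv2

def apply_clahe(image_path: str, clip_limit: float = 2.0, tile_grid: int = 8):
    # Load the mammogram as a single-channel grayscale image.
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(f"Could not read image: {image_path}")
    # Contrast Limited Adaptive Histogram Equalization.
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=(tile_grid, tile_grid))
    return clahe.apply(gray)

if __name__ == "__main__":
    enhanced = apply_clahe("mammogram.png")  # hypothetical input file
    cv2.imwrite("mammogram_clahe.png", enhanced)
```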