An Integrated Multimodal-Based CAD System for Breast Cancer Diagnosis
Simple Summary
Abstract
1. Introduction
2. Literature Review
Ref. | Year | Image Processing | Classifier | Classification | Dataset | Dataset Type | Dataset Size | Views | ACC | Limitations
---|---|---|---|---|---|---|---|---|---|---
[20] | 2020 | Wiener filter; K-means clustering | CNN | Binary (benign, malignant) | MIAS | Mammographic images | 322 images | - | 97.14% | Limited accessibility of large datasets
[25] | 2020 | EM segmentation; WBCT feature extraction; SVM-RFE with CBR for feature reduction | Q-classifier | Binary (benign, malignant) | DDSM, MIAS | Mammographic images | 1000 images | - | 98.16% | -
[26] | 2021 | ABC; WOA; RP, LM, and GD back-propagation | ANN | Binary (benign, malignant) | WBCD, WDBC, WPBC, DDSM, MIAS, INbreast | Mammographic images | 1750 images | - | 99.2% | Complex and requires more computational time
[23] | 2021 | Zernike moments for shape feature extraction; Monogenic Local Binary Pattern for texture feature extraction; optimized fusion | CNN, SVM, ANN, KNN | Binary (benign, malignant) | DDSM | Mammographic images | 520 images | - | 99.5% | Small dataset
[32] | 2021 | Convolution filters; data augmentation | SVM, VGG | Binary (benign, malignant) | MIAS | Mammographic images | 322 images | - | 98.67% | Small dataset
[24] | 2021 | OKMT-SGO for segmentation; CapsNet for feature extraction | BPNN | Binary (benign, malignant) | MIAS, DDSM | Mammographic images | 322, 13,128 | - | 98.16% | -
[22] | 2021 | - | CNN | Binary (benign, malignant) | Mammograms (MIAS, DDSM, INbreast); ultrasound (BUS-1, BUS-2) | Mammograms, ultrasound | MIAS: 50, DDSM: 450, INbreast: 35, BUS-1: 100, BUS-2: 210 | - | 96.55% | -
[33] | 2022 | Cross-entropy for feature extraction; SCL for reduction; data augmentation | CNN | Binary (benign, malignant) | DDSM | Mammographic images | 1371 images | MLO, CC | 73.55% | Lack of publicly available breast mammography databases; insufficient feature extraction ability from breast mammography
[31] | 2022 | CLAHE; energy layer for texture feature extraction; ECfA for feature selection; CSID fusion algorithm | TTCNN | Binary (benign, malignant) | DDSM, INbreast, MIAS | Mammographic images | DDSM: 981 images, INbreast: 269 images, MIAS: 119 images | - | 99.08% | -
[21] | 2022 | Gaussian filter; Tsallis entropy for segmentation; ResNet-34 for feature extraction; COA for parameter tuning | WNN | Multi-class (normal, benign, malignant) | MIAS | Mammographic images | 322 images | - | 96.07% | Small dataset
[29] | 2021 | PCA; feature fusion | SVM | Binary (benign, malignant) | CBIS-DDSM, MIAS | Mammographic images | 891, 322 | - | 97.8% | -
[30] | 2019 | CLAHE; SRG; statistical feature extraction | K-NN, DT, RF, ensemble | Binary (normal, abnormal) | MIAS, Digital Mammography DREAM Challenge | Mammographic images | 322, (34, 466) | - | 99.5% | -
[28] | 2024 | Noise reduction; image normalization; contrast enhancement | CNN, Inception, EfficientNet | Multi-class (benign, malignant, normal) | MaMaTT2 | Mammographic images | 408 | - | 98.46% | Data accessibility and the challenge of data imbalance
[27] | 2024 | YOLO for segmentation | BreastNet-SVM | Multi-class (benign, malignant, normal) | CBIS-DDSM | Mammographic images | 6165 | - | 99.16% | High number of trainable parameters; needs further validation with larger datasets
3. Materials and Methods
3.1. Dataset and Preprocessing
3.1.1. Statistic Information Dataset
3.1.2. Images Dataset
- Checking that each record includes MLO and CC views for both breasts.
- Image cleaning: patient information and radiologist annotations on the mammograms were removed manually using paint tools and Photoshop, preserving image quality.
- Image resizing: the original mammograms' dimensions vary with source and device, so Photoshop was used to resize all mammograms to 1800 × 1800 pixels.
- Data augmentation: The dataset initially contained 100 records for BI-RADS-1, 100 for BI-RADS-2, and only 56 for BI-RADS-5, leaving the classes imbalanced. To address this, random rotation and flipping were applied to the BI-RADS-5 mammograms, raising that class to 100 records. For the statistical data, slight random alterations were introduced to balance the classes while preserving feature integrity. Figure 3 provides an overview of the preprocessing steps for the images.
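The rotation/flip augmentation used to balance the minority class can be sketched as follows. This is a minimal illustration, not the authors' implementation: the paper does not specify the rotation angles or the sampling scheme, so this sketch assumes random flips plus 90-degree rotations applied to randomly chosen originals until the class reaches the target size.

```python
import numpy as np

def augment(image: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """Random flip plus a random 90-degree rotation (assumed parameters;
    the exact angles are not stated in the text)."""
    out = image
    if rng.random() < 0.5:
        out = np.fliplr(out)   # horizontal flip
    if rng.random() < 0.5:
        out = np.flipud(out)   # vertical flip
    k = int(rng.integers(0, 4))  # rotate by 0, 90, 180, or 270 degrees
    return np.rot90(out, k)

def balance_class(images: list, target: int, seed: int = 0) -> list:
    """Grow a minority class (e.g., the 56 BI-RADS-5 records) to `target`
    records by appending augmented copies of randomly chosen originals."""
    rng = np.random.default_rng(seed)
    out = list(images)
    while len(out) < target:
        src = images[int(rng.integers(0, len(images)))]
        out.append(augment(src, rng))
    return out

# Example: grow 56 placeholder 'mammograms' to 100 records
minority = [np.zeros((8, 8)) for _ in range(56)]
balanced = balance_class(minority, target=100)
print(len(balanced))  # 100
```

In practice the augmented copies would be written back into the dataset alongside the originals so that each BI-RADS class contributes 100 records to training.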
3.2. Proposed Method
3.2.1. Experiment Setup
Statistical Information-Based Model
Mammograms-Based Model
- Uncleaned images: the model processes images that still contain text, with four mammograms per patient, covering two views per breast side.
- Cleaned images: the same model is then applied to images with the text removed, again with four mammograms per patient and two views per breast side.
- The better-performing of the uncleaned and cleaned scenarios is rerun using only the two views of the affected breast side.
Combining Approaches
- Soft voting approach: Following the ensemble model proposed by B. Kurian, which combines classifiers through soft and hard voting, both schemes are applied to the classifiers used in this study [38]. Soft voting combines the predictions of the CNN and the ML models by taking the weighted average of their predicted class probabilities: each trained model produces probabilities for the test data, and these probabilities are averaged across models to yield the final tumor class. This can improve the accuracy and robustness of the ensemble, especially when individual models perform well on different subsets of the data.
- Hard voting approach: Hard voting is a simple ensemble technique in which multiple models are trained separately on the same data and their individual predictions are combined by majority vote. The final tumor class is the most common prediction among the individual models. This approach simplifies integration while still leveraging the strengths of each model.
- Concatenating approach: Concatenating the CNN and the MLP combines the two neural network architectures: the output of the CNN is fed into a fully connected layer of the MLP, which can then learn from the CNN-extracted image features in conjunction with the statistical data features, potentially improving the overall performance of the model. This approach is common in deep learning applications such as image recognition. For example, D. Kwon combined a CNN operating on panoramic views with an MLP using patient clinical data to predict the time needed to extract a mandibular third molar, achieving high accuracy in clinical practice [39].
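The two voting schemes above reduce to a few lines of array arithmetic. The sketch below is a generic illustration (uniform weights assumed; the study's actual weighting is not specified): soft voting averages per-model class probabilities before taking the argmax, while hard voting takes the argmax per model first and then the majority label.

```python
import numpy as np

def soft_vote(prob_list, weights=None):
    """Weighted average of per-model class probabilities.
    prob_list: list of (n_samples, n_classes) arrays, one per model."""
    probs = np.stack(prob_list)                 # (n_models, n_samples, n_classes)
    if weights is None:
        weights = np.ones(len(prob_list))
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    avg = np.tensordot(weights, probs, axes=1)  # (n_samples, n_classes)
    return avg.argmax(axis=1)

def hard_vote(pred_list):
    """Majority vote over per-model label predictions."""
    preds = np.stack(pred_list)                 # (n_models, n_samples)
    n_classes = preds.max() + 1
    counts = np.apply_along_axis(np.bincount, 0, preds, minlength=n_classes)
    return counts.argmax(axis=0)                # most common label per sample

# Toy example: a 'CNN' and an 'MLP' scoring three samples on two classes
cnn_probs = np.array([[0.9, 0.1], [0.4, 0.6], [0.2, 0.8]])
mlp_probs = np.array([[0.6, 0.4], [0.7, 0.3], [0.1, 0.9]])
print(soft_vote([cnn_probs, mlp_probs]))  # [0 0 1]
print(hard_vote([cnn_probs.argmax(1), mlp_probs.argmax(1)]))  # [0 0 1]
```

Note that the two schemes can disagree: on the second sample the CNN alone predicts class 1, but averaging its 0.6 probability with the MLP's confident 0.7 for class 0 flips the soft-voted decision to class 0.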
- Processor: Intel(R) Core(TM) i7-10510U CPU @ 1.80 GHz
- RAM: 16.00 GB
- OS Edition: Windows 11 Home
4. Results and Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Tang, J.; Member, S.; Rangayyan, R.M.; Xu, J.; El Naqa, I. Computer-Aided Detection and Diagnosis of Breast Cancer with Mammography: Recent Advances. IEEE Trans. Inf. Technol. Biomed. 2009, 13, 236–251. [Google Scholar] [CrossRef] [PubMed]
- WHO. Cancer. [Online]. 2022. Available online: https://www.who.int/news-room/fact-sheets/detail/cancer (accessed on 24 February 2023).
- Cherath, L.; Sullivan, M. Cancer. In The GALE ENCYCLOPEDIA of Science, 5th ed.; Gale Group: Farmington Hills, MI, USA, 2014. [Google Scholar]
- Odle, D.D.T.G.; Davidson, A.M.T. Cancer. In The Gale Encyclopedia of Alternative Medicine, 3rd ed.; Gale Group: Farmington Hills, MI, USA, 2008. [Google Scholar]
- Karaman, S.; Detmar, M. Mechanisms of lymphatic metastasis. J. Clin. Investig. 2014, 124, 922–928. [Google Scholar] [CrossRef] [PubMed]
- Kaplan, W. Cancer and Cancer Therapeutics. In Priority Medicines for Europe and the World; WHO: Geneva, Switzerland, 2013; pp. 1–5. [Google Scholar]
- Chaurasia, V.; Pal, S. A Novel Approach for Breast Cancer Detection using Data Mining Techniques. Int. J. Innov. Res. Comput. Commun. Eng. 2017, 2, 2456–2465. [Google Scholar]
- Han, S.; Kang, H.-K.; Jeong, J.-Y.; Park, M.-H.; Kim, W.; Bang, W.-C.; Seong, Y.-K. A Deep Learning Framework for Supporting the Classification of Breast Lesions in Ultrasound Images. Phys. Med. Biol. 2017, 62, 7714–7728. [Google Scholar] [CrossRef] [PubMed]
- Keleş, A.; Keleş, A.; Yavuz, U. Expert system based on neuro-fuzzy rules for diagnosis of breast cancer. Expert Syst. Appl. 2011, 38, 5719–5726. [Google Scholar] [CrossRef]
- Ly, D.; Forman, D.; Ferlay, J.; Brinton, L.A.; Cook, M.B. An international comparison of male and female breast cancer incidence rates. Int. J. Cancer 2013, 132, 1918–1926. [Google Scholar] [CrossRef]
- Alotaibi, R.M.; Rezk, H.R.; Juliana, C.I.; Guure, C. Breast cancer mortality in Saudi Arabia: Modelling observed and unobserved factors. PLoS ONE 2018, 13, e0206148. [Google Scholar] [CrossRef] [PubMed]
- Shah, R. Pathogenesis, prevention, diagnosis and treatment of breast cancer. World J. Clin. Oncol. 2014, 5, 283. [Google Scholar] [CrossRef] [PubMed]
- American Cancer Society. About Breast Cancer; American Cancer Society, Inc.: Atlanta, GA, USA, 2018; p. 17. [Google Scholar]
- Nees, A.V. Digital Mammography. Are There Advantages in Screening for Breast Cancer? Acad. Radiol. 2008, 15, 401–407. [Google Scholar] [CrossRef]
- Feig, S.A.; Yaffe, M.J. Digital mammography, computer-aided diagnosis, and telemammography. Radiol. Clin. North Am. 1995, 33, 1205–1230. [Google Scholar] [CrossRef]
- Guo, Z.; Xie, J.; Wan, Y.; Zhang, M.; Qiao, L.; Yu, J.; Chen, S.; Li, B.; Yao, Y. A Review of the Current State of the Computer-Aided Diagnosis (CAD) Systems for Breast Cancer Diagnosis. Open Life Sci. 2022, 17, 1600–1611. [Google Scholar] [CrossRef]
- Jalalian, A.; Mashohor, S.; Mahmud, R.; Karasfi, B.; Saripan, M.I.B.; Ramli, A.R.B. Foundation and Methodologies in Computer-Aided Diagnosis Systems for Breast Cancer Detection. EXCLI J. 2017, 16, 113–137. [Google Scholar] [CrossRef] [PubMed]
- Alshammari, M.M.; Almuhanna, A.; Alhiyafi, J. Mammography Image-Based Diagnosis of Breast Cancer Using Machine Learning: A Pilot Study. Sensors 2022, 22, 203. [Google Scholar] [CrossRef]
- Arika, R.N.; Mindila, A.; Cheruiyo, W. Machine Learning Algorithms for Breast Cancer Diagnosis: Challenges, Prospects and Future Research Directions. J. Oncol. Res. 2022, 5, 1–13. [Google Scholar] [CrossRef]
- Albalawi, U.; Manimurugan, S.; Varatharajan, R. Classification of breast cancer mammogram images using convolution neural network. Concurr. Comput. Pract. Exp. 2020, 34, e5803. [Google Scholar] [CrossRef]
- Escorcia-Gutierrez, J.; Mansour, R.F.; Beleño, K.; Jiménez-Cabas, J.; Pérez, M.; Madera, N.; Velasquez, K. Automated Deep Learning Empowered Breast Cancer Diagnosis Using Biomedical Mammogram Images. Comput. Mater. Contin. 2022, 71, 4221–4235. [Google Scholar] [CrossRef]
- Muduli, D.; Dash, R.; Majhi, B. Automated diagnosis of breast cancer using multi-modal datasets: A deep convolution neural network based approach. Biomed. Signal Process. Control 2021, 71, 102825. [Google Scholar] [CrossRef]
- Gargouri, N.; Mokni, R.; Damak, A.; Sellami, D.; Abid, R. Combination of Texture and Shape Features Using Machine and Deep Learning Algorithms for Breast Cancer Diagnosis. Res. Sq. 2021. Preprint. [Google Scholar]
- Kavitha, T.; Mathai, P.P.; Karthikeyan, C.; Ashok, M.; Kohar, R.; Avanija, J.; Neelakandan, S. Deep Learning Based Capsule Neural Network Model for Breast Cancer Diagnosis Using Mammogram Images. Interdiscip. Sci.—Comput. Life Sci. 2021, 14, 113–129. [Google Scholar] [CrossRef]
- Eltrass, A.S.; Salama, M.S. Fully automated scheme for computer-aided detection and breast cancer diagnosis using digitised mammograms. IET Image Process. 2020, 14, 495–505. [Google Scholar] [CrossRef]
- Stephan, P.; Stephan, T.; Kannan, R.; Abraham, A. A hybrid artificial bee colony with whale optimization algorithm for improved breast cancer diagnosis. Neural Comput. Appl. 2021, 33, 13667–13691. [Google Scholar] [CrossRef]
- Ahmad, J.; Akram, S.; Jaffar, A.; Ali, Z.; Bhatti, S.M.; Ahmad, A.; Rehman, S.U. Deep Learning Empowered Breast Cancer Diagnosis: Advancements in Detection and Classification. PLoS ONE 2024, 19, e0304757. [Google Scholar] [CrossRef] [PubMed]
- Dada, E.G.; Oyewola, D.O.; Misra, S. Computer-aided diagnosis of breast cancer from mammogram images using deep learning algorithms. J. Electr. Syst. Inf. Technol. 2024, 11, 38. [Google Scholar] [CrossRef]
- Ragab, D.A.; Attallah, O.; Sharkas, M.; Ren, J.; Marshall, S. A framework for breast cancer classification using Multi-DCNNs. Comput. Biol. Med. 2021, 131, 104245. [Google Scholar] [CrossRef] [PubMed]
- Ragab, D.A.; Sharkas, M.; Attallah, O. Breast cancer diagnosis using an efficient CAD system based on multiple classifiers. Diagnostics 2019, 9, 165. [Google Scholar] [CrossRef]
- Maqsood, S.; Damaševičius, R.; Maskeliūnas, R. TTCNN: A Breast Cancer Detection and Classification towards Computer-Aided Diagnosis Using Digital Mammography in Early Stages. Appl. Sci. 2022, 12, 3273. [Google Scholar] [CrossRef]
- Jayandhi, G.; Jasmine, J.S.L.; Joans, S.M. Mammogram Learning System for Breast Cancer Diagnosis Using Deep Learning SVM. Comput. Syst. Sci. Eng. 2021, 40, 491–503. [Google Scholar] [CrossRef]
- Sun, L.; Wen, J.; Wang, J.; Zhang, Z.; Zhao, Y.; Zhang, G.; Xu, Y. Breast Mass Classification Based on Supervised Contrastive Learning and Multi-View Consistency Penalty on Mammography. IET Biom. 2022, 11, 588–600. [Google Scholar] [CrossRef]
- Wang, L. Mammography with deep learning for breast cancer detection. Front. Oncol. 2024, 14, 1281922. [Google Scholar] [CrossRef]
- Łukasiewicz, S.; Czeczelewski, M.; Forma, A.; Baj, J.; Sitarz, R.; Stanisławek, A. Breast Cancer—Epidemiology, Risk Factors, Classification, Prognostic Markers, and Current Treatment Strategies—An Updated Review. Cancers 2021, 13, 4287. [Google Scholar] [CrossRef]
- Hashim, M.S.; Yassin, A.A. Breast Cancer Prediction Using Soft Voting Classifier Based on Machine Learning Models. Int. J. Comput. Sci. 2023, 50, 705–714. [Google Scholar]
- Puttagunta, M.; Ravi, S. Medical image analysis based on deep learning approach. Multimed. Tools Appl. 2023, 80, 24365–24398. [Google Scholar] [CrossRef] [PubMed]
- Kurian, B.; Jyothi, V.L. Breast cancer prediction using ensemble voting classifiers in next-generation sequences. Soft Comput. 2023, 1–7. [Google Scholar] [CrossRef]
- Kwon, D.; Ahn, J.; Kim, C.S.; Kang, D.O.; Paeng, J.Y. A deep learning model based on concatenation approach to predict the time to extract a mandibular third molar tooth. BMC Oral Health 2022, 22, 1–8. [Google Scholar] [CrossRef]
- Ghafari, S.; Tarnik, M.G.; Yazdi, H.S. Robustness of convolutional neural network models in hyperspectral noisy datasets with loss functions. Comput. Electr. Eng. 2021, 90, 107009. [Google Scholar] [CrossRef]
- Gerami, R.; Joni, S.S.; Akhondi, N.; Etemadi, A.; Fosouli, M.; Eghbal, A.F.; Souri, Z. A Literature Review on the Imaging Methods for Breast Cancer. Int. J. Physiol. Pathophysiol. Pharmacol. 2022, 14, 171–176. [Google Scholar]
# | Feature | Type | Description | Feature Selection | Modifications
---|---|---|---|---|---
1 | Ser | Numeric | Serial number | Yes | -
2 | MRN | Numeric | Patient’s record ID | No | -
3 | Age | Numeric | Patient’s age | Yes | -
4 | Menopause | Numeric | Menopause status (0 = pre, 1 = post) | Yes | -
5 | Gender | Numeric | Male or female | Yes | -
6 | Year | Numeric | 0 = 2014, 1 = 2015, 2 = 2016, 3 = 2017, 4 = 2018 | Yes | -
7 | CC | Numeric | Patient’s complaint (0 = screening, 1 = pain, 2 = lump, 3 = scar, 4 = moles, 5 = skin retraction, 6 = tissue thickening, 7 = nipple discharge, 8 = nipple inversion or retraction, 9 = other) | Yes | -
8 | Age_menarche | Numeric | Age at menarche | Yes | -
9 | Age_1st_birth | Numeric | Age at first birth | Yes | -
10 | HRT | Numeric | Taking HRT (0 = no, 1 = yes) | Yes | -
11 | Fhx | Numeric | Family history (0 = no, 1 = yes) | Yes | -
12 | Nationality | String | Nationality (s = Saudi) | No | -
13 | Density | String | Breast density (A, B, C, D) | Yes | Changed to numeric values: A = 1, B = 2, C = 3, D = 4
14 | Bi_RADS | String | BI-RADS category (1, 2, or 5) | Yes | -
15 | Biopsy | String | Biopsy result (0 = normal, 1 = malignant) | Yes | -
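The only transformation the table states explicitly is the Density conversion (A = 1 through D = 4). A minimal sketch of that step, using hypothetical record dictionaries rather than the study's actual data structures:

```python
# Density mapping as stated in the table; the record layout here is
# illustrative, not the authors' actual data format.
density_map = {"A": 1, "B": 2, "C": 3, "D": 4}

records = [
    {"Density": "B", "Menopause": 1, "Age": 52},
    {"Density": "D", "Menopause": 0, "Age": 44},
]
for rec in records:
    rec["Density"] = density_map[rec["Density"]]

print([r["Density"] for r in records])  # [2, 4]
```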
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Sunba, A.; AlShammari, M.; Almuhanna, A.; Alkhnbashi, O.S. An Integrated Multimodal-Based CAD System for Breast Cancer Diagnosis. Cancers 2024, 16, 3740. https://doi.org/10.3390/cancers16223740