The Role and Impact of Deep Learning Methods in Computer-Aided Diagnosis Using Gastrointestinal Endoscopy
Abstract
1. Introduction
1.1. Deep Learning
1.2. Diagnosis of Gastrointestinal Endoscopy Based on Deep Learning
2. Developments of Deep Learning Methods in CAD for Gastrointestinal Endoscopy
2.1. Diagnosis of Gastric Cancer and HP Infection
2.1.1. Diagnosis of Gastric Cancer
2.1.2. Diagnosis of HP Infection
2.2. Classification and Detection of Colon Polyps
2.3. Diagnosis of ESCC and EAC in Esophagus
2.3.1. Diagnosis of ESCC
2.3.2. Diagnosis of EAC
3. Discussion
4. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
1. Global Burden of Disease Cancer Collaboration; Fitzmaurice, C.; Akinyemiju, T.F.; Al Lami, F.H.; Alam, T.; Alizadeh-Navaei, R.; Allen, C.; Alsharif, U.; Alvis-Guzman, N.; Amini, E.; et al. Global, regional, and national cancer incidence, mortality, years of life lost, years lived with disability, and disability-adjusted life-years for 29 cancer groups, 1990 to 2016: A systematic analysis for the global burden of disease study. JAMA Oncol. 2018, 4, 1553–1568.
2. Ikenoyama, Y.; Hirasawa, T.; Ishioka, M.; Namikawa, K.; Yoshimizu, S.; Horiuchi, Y.; Ishiyama, A.; Yoshio, T.; Tsuchida, T.; Takeuchi, Y.; et al. Detecting early gastric cancer: Comparison between the diagnostic ability of convolutional neural networks and endoscopists. Dig. Endosc. 2021, 33, 141–150.
3. Hirasawa, T.; Aoyama, K.; Tanimoto, T.; Ishihara, S.; Shichijo, S.; Ozawa, T.; Ohnishi, T.; Fujishiro, M.; Matsuo, K.; Fujisaki, J.; et al. Application of artificial intelligence using a convolutional neural network for detecting gastric cancer in endoscopic images. Gastric Cancer 2018, 21, 653–660.
4. Sakai, Y.; Takemoto, S.; Hori, K.; Nishimura, M.; Ikematsu, H.; Yano, T.; Yokota, H. Automatic detection of early gastric cancer in endoscopic images using a transferring convolutional neural network. In Proceedings of the 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018; Volume 2018, pp. 4138–4141.
5. Cao, G.; Song, W.; Zhao, Z. Gastric cancer diagnosis with mask R-CNN. In Proceedings of the 2019 11th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), Hangzhou, China, 24–25 August 2019; Volume 1, pp. 60–63.
6. Li, L.; Chen, Y.; Shen, Z.; Zhang, X.; Sang, J.; Ding, Y.; Yang, X.; Li, J.; Chen, M.; Jin, C.; et al. Convolutional neural network for the diagnosis of early gastric cancer based on magnifying narrow band imaging. Gastric Cancer 2020, 23, 126–132.
7. Shibata, T.; Teramoto, A.; Yamada, H.; Ohmiya, N.; Saito, K.; Fujita, H. Automated detection and segmentation of early gastric cancer from endoscopic images using mask R-CNN. Appl. Sci. 2020, 10, 3842.
8. Zhang, X.; Hu, W.; Chen, F.; Liu, J.; Yang, Y.; Wang, L.; Duan, H.; Si, J. Gastric precancerous diseases classification using CNN with a concise model. PLoS ONE 2017, 12, e0185508.
9. Tahara, T.; Arisawa, T.; Shibata, T.; Wang, F.Y.; Nakamura, M.; Sakata, M.; Nagasaka, M.; Takagi, T.; Kamiya, Y.; Fujita, H.; et al. Risk prediction of gastric cancer by analysis of aberrant DNA methylation in non-neoplastic gastric epithelium. Digestion 2007, 75, 54–61.
10. Uemura, N.; Okamoto, S.; Yamamoto, S.; Matsumura, N.; Yamaguchi, S.; Yamakido, M.; Taniyama, K.; Sasaki, N.; Schlemper, R.J. Helicobacter pylori Infection and the Development of Gastric Cancer. N. Engl. J. Med. 2001, 345, 784–789.
11. Goodwin, C.S. Helicobacter pylori gastritis, peptic ulcer, and gastric cancer: Clinical and molecular aspects. Clin. Infect. Dis. 1997, 25, 1017–1019.
12. Nomura, S.; Terao, S.; Adachi, K.; Kato, T.; Ida, K.; Watanabe, H.; Shimbo, T.; The Research Group for Establishment of Endoscopic Diagnosis of Chronic Gastritis. Endoscopic diagnosis of gastric mucosal activity and inflammation: Endoscopic features of chronic gastritis. Dig. Endosc. 2013, 25, 136–146.
13. Tahara, T.; Shibata, T.; Horiguchi, N.; Kawamura, T.; Okubo, M.; Ishizuka, T.; Nagasaka, M.; Nakagawa, Y.; Ohmiya, N. A possible link between gastric mucosal atrophy and gastric cancer after Helicobacter pylori eradication. PLoS ONE 2016, 11, e0163700.
14. Shichijo, S.; Nomura, S.; Aoyama, K.; Nishikawa, Y.; Miura, M.; Shinagawa, T.; Takiyama, H.; Tanimoto, T.; Ishihara, S.; Matsuo, K.; et al. Application of convolutional neural networks in the diagnosis of Helicobacter pylori infection based on endoscopic images. EBioMedicine 2017, 25, 106–111.
15. Itoh, T.; Kawahira, H.; Nakashima, H.; Yata, N. Deep learning analyzes Helicobacter pylori infection by upper gastrointestinal endoscopy images. Endosc. Int. Open 2018, 6, 139–144.
16. Nakashima, H.; Kawahira, H.; Kawachi, H.; Sakaki, N. Artificial intelligence diagnosis of Helicobacter pylori infection using blue laser imaging-bright and linked color imaging: A single-center prospective study. Ann. Gastroenterol. 2018, 31, 462–468.
17. Bray, F.; Ferlay, J.; Soerjomataram, I.; Siegel, R.L.; Torre, L.A.; Jemal, A. Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA Cancer J. Clin. 2018, 68, 394–424.
18. Morson, B. The polyp-cancer sequence in the large bowel. Proc. R. Soc. Med. 1974, 67, 451–457.
19. Butterly, L.F.; Chase, M.P.; Pohl, H.; Fiarman, G.S. Prevalence of clinically important histology in small adenomas. Clin. Gastroenterol. Hepatol. 2006, 4, 343–348.
20. Pohl, J.; Nguyen-Tat, M.; Pech, O.; May, A.; Rabenstein, T.; Ell, C. Computed virtual chromoendoscopy for classification of small colorectal lesions: A prospective comparative study. Am. J. Gastroenterol. 2008, 103, 562–569.
21. Tajbakhsh, N.; Gurudu, S.R.; Liang, J. Automatic polyp detection in colonoscopy videos using an ensemble of convolutional neural networks. In Proceedings of the 2015 IEEE 12th International Symposium on Biomedical Imaging (ISBI), Brooklyn, NY, USA, 16–19 April 2015; pp. 79–83.
22. Yu, L.; Chen, H.; Dou, Q.; Qin, J.; Ann Heng, P. Integrating online and offline three-dimensional deep learning for automated polyp detection in colonoscopy videos. IEEE J. Biomed. Health Inform. 2017, 21, 65–75.
23. Mohammed, A.; Yildirim, S.; Farup, I.; Pedersen, M.; Hovde, Ø. Y-Net: A deep convolutional neural network for polyp detection. arXiv 2018, arXiv:1806.01907.
24. Haj-Manouchehri, A.; Mohammadi, H.M. Polyp detection using CNNs in colonoscopy video. IET Comput. Vis. 2020, 14, 241–247.
25. Pan, S.J.; Yang, Q. A survey on transfer learning. IEEE Trans. Knowl. Data Eng. 2009, 22, 1345–1359.
26. Shin, H.-C.; Roth, H.R.; Gao, M.; Lu, L.; Xu, Z.; Nogues, I.; Yao, J.; Mollura, D.; Summers, R.M. Deep convolutional neural networks for computer-aided detection: CNN architectures, dataset characteristics and transfer learning. IEEE Trans. Med. Imaging 2016, 35, 1285–1298.
27. Shie, C.-K.; Chuang, C.-H.; Chou, C.-N.; Wu, M.-H.; Chang, E.Y. Transfer representation learning for medical image analysis. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; Volume 2015, pp. 711–714.
28. Zhang, R.; Zheng, Y.; Mak, T.W.C.; Yu, R.; Wong, S.H.; Lau, J.Y.W.; Poon, C.C.Y. Automatic detection and classification of colorectal polyps by transferring low-level CNN features from nonmedical domain. IEEE J. Biomed. Health Inform. 2017, 21, 41–47.
29. Horie, Y.; Yoshio, T.; Aoyama, K.; Yoshimizu, S.; Horiuchi, Y.; Ishiyama, A.; Hirasawa, T.; Tsuchida, T.; Ozawa, T.; Ishihara, S.; et al. Diagnostic outcomes of esophageal cancer by artificial intelligence using convolutional neural networks. Gastrointest. Endosc. 2019, 89, 25–32.
30. Cai, S.-L.; Li, B.; Tan, W.-M.; Niu, X.-J.; Yu, H.-H.; Yao, L.-Q.; Zhou, P.-H.; Yan, B.; Zhong, Y.-S. Using a deep learning system in endoscopy for screening of early esophageal squamous cell carcinoma (with video). Gastrointest. Endosc. 2019, 90, 745–753.e2.
31. Guo, L.; Xiao, X.; Wu, C.; Zeng, X.; Zhang, Y.; Du, J.; Bai, S.; Xie, J.; Zhang, Z.; Li, Y.; et al. Real-time automated diagnosis of precancerous lesions and early esophageal squamous cell carcinoma using a deep learning model (with videos). Gastrointest. Endosc. 2020, 91, 41–51.
32. Ohmori, M.; Ishihara, R.; Aoyama, K.; Nakagawa, K.; Iwagami, H.; Matsuura, N.; Shichijo, S.; Yamamoto, K.; Nagaike, K.; Nakahara, M.; et al. Endoscopic detection and differentiation of esophageal lesions using a deep neural network. Gastrointest. Endosc. 2020, 91, 301–309.
33. Tokai, Y.; Yoshio, T.; Aoyama, K.; Horie, Y.; Yoshimizu, S.; Horiuchi, Y.; Ishiyama, A.; Tsuchida, T.; Sakakibara, Y.; Yamada, T.; et al. Application of artificial intelligence using convolutional neural networks in determining the invasion depth of esophageal squamous cell carcinoma. Esophagus 2020, 17, 250–256.
34. Arnold, M.; Laversanne, M.; Brown, L.M.; Devesa, S.S.; Bray, F. Predicting the future burden of esophageal cancer by histological subtype: International trends in incidence up to 2030. Am. J. Gastroenterol. 2017, 112, 1247–1255.
35. Zhang, Y. Epidemiology of esophageal cancer. World J. Gastroenterol. 2013, 19, 5598–5606.
36. Mendel, R.; Ebigbo, A.; Probst, A.; Messmann, H.; Palm, C. Barrett’s Esophagus Analysis Using Convolutional Neural Networks; Springer: Berlin/Heidelberg, Germany, 2017; pp. 80–85.
37. Hashimoto, R.; Requa, J.; Dao, T.; Ninh, A.; Tran, E.; Mai, D.; Lugo, M.; Chehade, N.E.-H.; Chang, K.J.; Karnes, W.E.; et al. Artificial intelligence using convolutional neural networks for real-time detection of early esophageal neoplasia in Barrett’s esophagus (with video). Gastrointest. Endosc. 2020, 91, 1264–1271.e1.
38. Fonollà, R.; Scheeve, T.; Struyvenberg, M.R.; Curvers, W.L.; de Groof, A.J.; van der Sommen, F.; Schoon, E.J.; Bergman, J.J.G.H.M.; de With, P.H.N. Ensemble of deep convolutional neural networks for classification of early Barrett’s neoplasia using volumetric laser endomicroscopy. Appl. Sci. 2019, 9, 2183.
39. Ghatwary, N.; Zolgharni, M.; Ye, X. Early esophageal adenocarcinoma detection using deep learning methods. Int. J. Comput. Assist. Radiol. Surg. 2019, 14, 611–621.
40. Zeiler, M.D.; Fergus, R. Visualizing and understanding convolutional networks. In Proceedings of the European Conference on Computer Vision (ECCV), Zurich, Switzerland, 6–12 September 2014; Volume 8689, pp. 818–833.
41. Zhou, B.; Khosla, A.; Lapedriza, A.; Oliva, A.; Torralba, A. Learning deep features for discriminative localization. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016.
Table 1. Summary of deep learning methods for the diagnosis of gastric cancer and Helicobacter pylori (HP) infection.

Study | Aim | Method | Performance | Train Dataset | Test Dataset |
---|---|---|---|---|---|
Ikenoyama et al. (2021) [2] | Comparison between CNN and endoscopists | CNN based on SSD | CNN vs. endoscopists: Sensitivity: 58.4% vs. 31.9%; Specificity: 87.3% vs. 97.2%; PPV: 26.0% vs. 46.2% | 10,474 early-stage and 3110 advanced-stage gastric cancer images | 209 gastric cancer images and 2731 normal images |
Hirasawa et al. (2018) [3] | Detection | CNN based on SSD | Sensitivity: 92.2%; Accuracy: 98.6%; PPV: 30.6% | 13,584 gastric cancer images | 2296 gastric cancer images |
Sakai et al. (2018) [4] | Detection | CNN based on GoogLeNet | Accuracy: 87.6%; Sensitivity: 80.0%; Specificity: 94.8% | 9587 gastric cancer images and 9800 normal images | 4653 gastric cancer images and 4997 normal images |
Cao et al. (2019) [5] | Detection + segmentation | Mask R-CNN | AP: 61.2% | 1000 positive and 250 negative samples | 120 positive and 29 negative samples |
Li et al. (2020) [6] | Classification | CNN + M-NBI | Accuracy: 90.91%; Sensitivity: 91.18%; Specificity: 90.64% | 1702 gastric cancer images and 386 normal images | 170 gastric cancer images and 171 normal images |
Shibata et al. (2020) [7] | Detection | Mask R-CNN | Average Dice: 71.0%; Sensitivity: 96.0% | 533 gastric cancer images and 1208 normal images | Five-fold cross-validation |
Zhang et al. (2017) [8] | Classification | GPDNet | Accuracy: 88.9% | 921 erosion, 918 polyp, and 944 ulcer images | 300 erosion, 300 polyp, and 300 ulcer images |
Shichijo et al. (2017) [14] | Classification | First and second CNNs based on GoogLeNet | First/second CNN: AUC: 83.1%/87.7%; Sensitivity: 81.9%/88.9%; Specificity: 83.4%/87.4% | 32,208 images either positive or negative for HP | 11,481 images |
Itoh et al. (2018) [15] | Detection | CNN based on GoogLeNet | AUC: 95.6%; Sensitivity: 86.7%; Specificity: 86.7% | 596 images | 30 images |
Nakashima et al. (2018) [16] | Classification | CNN based on GoogLeNet | AUC: 66.0% (WLI), 96.0% (BLI-bright), 95.0% (LCI) | 648 images each for WLI, BLI-bright, and LCI | 60 separate images each for WLI, BLI-bright, and LCI |
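
Most of the classifiers in Table 1 follow the same transfer-learning recipe: start from an ImageNet-pretrained backbone such as GoogLeNet, replace the final fully connected layer, and fine-tune on labeled endoscopic frames. The sketch below illustrates that recipe only in general terms; it assumes PyTorch/torchvision (>= 0.13), a hypothetical `endoscopy_frames/` folder layout, and placeholder hyperparameters, and it is not the pipeline of any of the cited studies.

```python
# Minimal transfer-learning sketch for binary classification of endoscopic frames.
# Dataset path, class names, and hyperparameters are illustrative placeholders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Standard ImageNet preprocessing; frames are resized to the backbone's input size.
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Hypothetical layout: endoscopy_frames/train/{lesion,normal}/*.jpg
train_set = datasets.ImageFolder("endoscopy_frames/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# ImageNet-pretrained GoogLeNet; swap the classification head for two classes.
model = models.googlenet(weights=models.GoogLeNet_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 2)  # lesion vs. normal

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(5):  # placeholder number of epochs
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Freezing the early convolutional layers and training only the new head is a common variant when, as in several of these studies, the labeled endoscopic dataset is small relative to ImageNet.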
Table 2. Summary of deep learning methods for the classification and detection of colon polyps.

Study | Aim | Method | Performance | Train Dataset | Test Dataset |
---|---|---|---|---|---|
Tajbakhsh et al. (2015) [21] | Detection | Three-way image presentation + CNN | Sensitivity: about 75.0% | 20 collected short colonoscopy videos (10 positive, 10 negative) | 20 collected colonoscopy videos (10 positive, 10 negative) |
Yu et al. (2017) [22] | Detection | Offline and online 3D FCN | F1-score: 78.6%; F2-score: 73.9%; Precision: 88.1%; Recall: 71.0% | ASU-Mayo Clinic database (20 colonoscopy videos) | ASU-Mayo Clinic database (18 short colonoscopy videos) |
Mohammed et al. (2018) [23] | Detection | Y-Net | F1-score: 85.9%; F2-score: 85.0%; Precision: 87.4%; Recall: 84.4% | ASU-Mayo Clinic database (20 colonoscopy videos) | ASU-Mayo Clinic database (18 short colonoscopy videos) |
Haj-Manouchehri et al. (2020) [24] | Detection + segmentation | CNN based on VGG; FCN + post-processing | Detection accuracy: 86.0%; Segmentation F2-score: 82.0% | Two collected colonoscopy videos for detection; CVC-CLINIC and ETIS-LARIB datasets for segmentation | One collected colonoscopy video for detection; CVC-CLINIC and ETIS-LARIB datasets for segmentation |
Zhang et al. (2017) [28] | Detection + classification | CNN based on CaffeNet | Precision: 87.3%; Recall: 87.6%; Accuracy: 85.9% | Source: ImageNet and Places205 databases; Target: PWH database and new polyp database | 50 images per class (nonpolyp, hyperplasia, adenoma) + 10 images per class (hyperplasia, serrated adenoma, adenoma) over 5 trials |
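
The polyp-detection studies above are compared mainly through precision, recall, and the F1/F2 scores. The small helper below (plain Python, with made-up counts for illustration only) shows how these numbers relate; F2 weights recall more heavily than precision, which matches the clinical preference for missing as few polyps as possible.

```python
# Sketch of the detection metrics reported in Table 2.
def precision_recall_fbeta(tp: int, fp: int, fn: int, beta: float = 1.0):
    """Return (precision, recall, F-beta) from true/false positive and false negative counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    if precision == 0.0 and recall == 0.0:
        return precision, recall, 0.0
    b2 = beta ** 2
    fbeta = (1 + b2) * precision * recall / (b2 * precision + recall)
    return precision, recall, fbeta

# Hypothetical per-frame counts from a colonoscopy video
tp, fp, fn = 71, 10, 29
p, r, f1 = precision_recall_fbeta(tp, fp, fn, beta=1.0)
_, _, f2 = precision_recall_fbeta(tp, fp, fn, beta=2.0)  # F2 favours recall
print(f"precision={p:.3f} recall={r:.3f} F1={f1:.3f} F2={f2:.3f}")
```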
Table 3. Summary of deep learning methods for the diagnosis of ESCC and EAC in the esophagus.

Study | Aim | Method | Performance | Train Dataset | Test Dataset |
---|---|---|---|---|---|
Horie et al. (2019) [29] | Detection of ESCC + EAC | CNN based on SSD | Per-case sensitivity: 97.0% (ESCC), 100.0% (EAC) | 8428 EC images | 1118 images (EC + normal) |
Cai et al. (2019) [30] | Detection of ESCC | DNN-CAD | Accuracy: 91.4%; Sensitivity: 97.8%; Specificity: 85.4% | 2428 esophagoscopic images (1332 abnormal, 1096 normal) | 187 images |
Guo et al. (2020) [31] | Detection of ESCC | CNN based on SegNet | AUC: 98.9%; Sensitivity: 98.04%; Specificity: 95.03% | 2770 images (precancerous and early-stage ESCC); 3703 images (noncancerous) | Dataset A: 1480 images (precancerous + ESCC); B: 5191 images (noncancerous); C: 27 videos (precancerous + ESCC); D: 33 videos (noncancerous) |
Ohmori et al. (2020) [32] | Detection of ESCC | CNN based on SSD | Sensitivity: 100.0% (non-ME), 98.0% (ME) | 9591 non-ME + 7844 ME images of superficial ESCC; 564 non-ME + 2744 ME images of noncancerous lesions; 1128 non-ME + 691 ME images of normal esophagus | 255 non-ME WLI images; 268 non-ME NBI/BLI images; 204 ME NBI/BLI images |
Tokai et al. (2020) [33] | Invasion depth of ESCC | CNN based on SSD (detection) and GoogLeNet (depth estimation) | Detection rate: 95.5%; Depth estimation: Sensitivity: 84.1%; Accuracy: 80.9% | 1751 images of ESCC | 291 test images |
Mendel et al. (2017) [36] | Diagnosis of EAC | CNN based on ResNet | Sensitivity: 94.0%; Specificity: 88.0% | 4157 noncancerous and 3666 cancerous region patches | Leave-one-patient-out cross-validation |
Hashimoto et al. (2020) [37] | Detection of EAC | Model based on Xception and YOLO v2 | Accuracy: 95.4%; Sensitivity: 96.4%; Specificity: 94.2% | 916 BE images (high-grade dysplasia/T1 cancer) and 919 BE images (non-high-grade dysplasia) | 458 test images (225 dysplasia, 233 non-dysplasia) |
Fonollà et al. (2019) [38] | Diagnosis of EAC | Ensemble of three DCNNs based on VGG16 | AUC: 96.0%; Sensitivity: 95.0%; Specificity: 85.0% | 134 NDBE and 38 HGD/EAC regions; 8772 images in total | 99 NDBE and 42 HGD/EAC regions; 7191 images in total |
Ghatwary et al. (2019) [39] | Comparison of detection models | CNN based on R-CNN/Fast R-CNN/Faster R-CNN/SSD | Best (SSD): Sensitivity: 96.0%; Specificity: 92.0%; F-measure: 94.0% | 50 EAC and 50 noncancerous images before data augmentation | Leave-one-patient-out cross-validation |
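
Several of the detectors in Tables 1 and 3 are built on single-shot detectors (SSD). As a rough illustration of how such a detector is applied frame by frame, the sketch below runs torchvision's generic COCO-pretrained SSD300-VGG16 on one image and keeps boxes above a confidence threshold. The image path is a placeholder and the weights are not an endoscopy model, so this mirrors only the inference pattern, not the trained systems of the cited papers.

```python
# Illustrative SSD inference sketch (torchvision >= 0.13); generic COCO weights, placeholder image path.
import torch
from torchvision.io import read_image
from torchvision.models.detection import ssd300_vgg16, SSD300_VGG16_Weights
from torchvision.transforms.functional import convert_image_dtype

weights = SSD300_VGG16_Weights.DEFAULT
model = ssd300_vgg16(weights=weights).eval()

# Load a frame as a float tensor in [0, 1]; "endoscopy_frame.jpg" is a placeholder file name.
frame = convert_image_dtype(read_image("endoscopy_frame.jpg"), torch.float)

with torch.no_grad():
    detections = model([frame])[0]  # dict with 'boxes', 'labels', 'scores'

keep = detections["scores"] > 0.5  # simple confidence threshold
for box, score in zip(detections["boxes"][keep], detections["scores"][keep]):
    print([round(v, 1) for v in box.tolist()], float(score))
```

In the cited studies the same single-shot design is retrained on endoscopic images so that the predicted boxes mark suspected lesions rather than COCO object categories, and the confidence threshold is tuned to trade sensitivity against false positives.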
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Pang, X.; Zhao, Z.; Weng, Y. The Role and Impact of Deep Learning Methods in Computer-Aided Diagnosis Using Gastrointestinal Endoscopy. Diagnostics 2021, 11, 694. https://doi.org/10.3390/diagnostics11040694