Machine Learning in Dentistry: A Scoping Review
Abstract
1. Introduction
2. Materials and Methods
2.1. Search Strategy and Selection Criteria
- (1) Studies with a dental/oral focus, including technical papers.
- (2) Studies employing ML, for example, SVM, RF, artificial neural networks, or CNNs.
- (3) Studies published between 1 January 2015 and 31 May 2021, as we aimed to gather recent studies and specifically to include deep learning, currently the most rapidly evolving field of ML.
- Population: All types of data with a dental or oral component.
- Intervention/Comparison: ML techniques applied with a dental or oral focus for the diagnosis, management, or prognosis of dental conditions, or for improving data quality, at the patient, tooth, surface, or pixel level.
- Outcome: Performance evaluation of the ML models in terms of metrics, for example, accuracy, IoU, sensitivity, precision, area under the receiver operating characteristic curve (AUROC), F indices, specificity, negative predictive value, rank-N recognition rate, error estimates, correlation coefficients, etc.; a minimal sketch of how several of these metrics can be computed is given after this list.
- Study design type: For this review, we considered all kinds of studies except reviews, editorials, and technical standards, with no language restrictions.
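For illustration only, the following is a minimal sketch, assuming a binary tooth-level classifier and scikit-learn, of how several of the listed metrics can be computed; the labels and predicted probabilities below are hypothetical placeholders, not data from any included study.

```python
import numpy as np
from sklearn.metrics import (accuracy_score, confusion_matrix, f1_score,
                             jaccard_score, precision_score, recall_score,
                             roc_auc_score)

# Hypothetical ground truth and model outputs for a binary task
# (e.g., carious vs. sound); replace with real study data.
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
y_prob = np.array([0.9, 0.2, 0.7, 0.4, 0.1, 0.3, 0.8, 0.6, 0.95, 0.05])
y_pred = (y_prob >= 0.5).astype(int)

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

print("accuracy     ", accuracy_score(y_true, y_pred))
print("sensitivity  ", recall_score(y_true, y_pred))     # true positive rate
print("specificity  ", tn / (tn + fp))
print("precision    ", precision_score(y_true, y_pred))  # positive predictive value
print("NPV          ", tn / (tn + fn))
print("F1 score     ", f1_score(y_true, y_pred))
print("AUROC        ", roc_auc_score(y_true, y_prob))
print("IoU (Jaccard)", jaccard_score(y_true, y_pred))
```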
2.2. Data Collection, Items, and Pre-Processing
2.3. Quality Assessment
2.4. Data Synthesis
- In ML, classification refers to a predictive modeling problem where a class label is predicted for a given example of input data, for example, classifying a given handwritten character as one of the known characters. Algorithms popularly used for classification in the included studies were logistic regression, k-Nearest Neighbors, Decision Trees, Naïve Bayes, RF, Gradient Boosting, etc.; a minimal illustrative sketch of such a classification workflow is given after this list.
- In object detection tasks, one attempts to identify and locate objects within an image or video. Specifically, object detection draws bounding boxes around the detected objects, which makes it possible to localize them. Given the complexity of handling image data, deep learning models based on CNNs, such as the Region-based CNN (R-CNN), Fast R-CNN, You Only Look Once (YOLO), and the Single Shot MultiBox Detector (SSD), are popularly used for this task; a sketch of the bounding-box output format of such a detector also follows this list.
- In image segmentation tasks, one aims to identify the exact outline of a detected object in an image. There are two types of segmentation tasks: semantic segmentation and instance segmentation. Semantic segmentation classifies each pixel in the image into a particular class and does not differentiate between different instances of the same object. For example, if there are two cats in an image, semantic segmentation gives the same label, for instance, ‘cat’, to all the pixels of both cats. Instance segmentation differs in that it gives a unique label to every instance of a particular object in the image; in the example of an image containing two cats, each cat would receive a distinct label, for instance, ‘cat1’ and ‘cat2’. Currently, the most popular models for image segmentation are fully convolutional networks (FCNs) and their variants such as U-Net, DeepLab, PointNet, etc.; a minimal semantic segmentation sketch follows this list as well.
- Another type of ML task is a generation task, which is not predictive in nature. Such tasks involve the generation of new images from input images, for example, the generation of artifact-free CT images from those containing metal artifacts.
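As an illustration of the classification task described above, the sketch below trains a Random Forest on synthetic tabular data; the simulated features and two classes are placeholders standing in for, e.g., patient- or tooth-level variables, not data from any included study.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic stand-in for patient- or tooth-level tabular features.
X, y = make_classification(n_samples=500, n_features=12, n_informative=6,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

# Fit a Random Forest and report precision, recall (sensitivity), and F1
# on the held-out split.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```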
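For the object detection task, the sketch below uses torchvision's Faster R-CNN interface only to show the bounding-box output format; the untrained model, the three hypothetical classes, and the random input tensor are assumptions made for illustration.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Untrained Faster R-CNN; num_classes counts the background plus two
# hypothetical object classes. No pretrained weights are downloaded.
model = fasterrcnn_resnet50_fpn(weights=None, weights_backbone=None,
                                num_classes=3)
model.eval()

# A random tensor (C, H, W, values in [0, 1]) stands in for a real image.
image = torch.rand(3, 512, 512)
with torch.no_grad():
    predictions = model([image])

# Each prediction is a dict of bounding boxes, class labels, and scores.
print(predictions[0]["boxes"].shape,
      predictions[0]["labels"],
      predictions[0]["scores"])
```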
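For semantic segmentation, the sketch below runs torchvision's FCN (a fully convolutional network) on a random input and assigns every pixel a class label via an argmax over the class dimension; the class count and the input are again placeholder assumptions.

```python
import torch
from torchvision.models.segmentation import fcn_resnet50

# Untrained FCN with three hypothetical classes; no weights are downloaded.
model = fcn_resnet50(weights=None, weights_backbone=None, num_classes=3)
model.eval()

image = torch.rand(1, 3, 256, 256)      # batch of one random "image"
with torch.no_grad():
    logits = model(image)["out"]        # shape: (1, num_classes, 256, 256)

# Semantic segmentation: each pixel gets exactly one class label; separate
# instances of the same class are not distinguished.
mask = logits.argmax(dim=1)             # shape: (1, 256, 256)
print(mask.shape, mask.unique())
```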
3. Results
3.1. Study Selection and Characteristics
3.2. Risk of Bias and Applicability Concerns
3.3. Adherence to Reporting Standards
3.4. Tasks, Metrics, and Findings of the Studies
4. Discussion
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Sun, M.-L.; Liu, Y.; Liu, G.-M.; Cui, D.; Heidari, A.A.; Jia, W.-Y.; Ji, X.; Chen, H.-L.; Luo, Y.-G. Application of Machine Learning to Stomatology: A Comprehensive Review. IEEE Access 2020, 8, 184360–184374. [Google Scholar] [CrossRef]
- Schwendicke, F.; Chaurasia, A.; Arsiwala, L.; Lee, J.-H.; Elhennawy, K.; Jost-Brinkmann, P.-G.; Demarco, F.; Krois, J. Deep learning for cephalometric landmark detection: Systematic review and meta-analysis. Clin. Oral Investig. 2021, 25, 4299–4309. [Google Scholar] [CrossRef] [PubMed]
- Farhadian, M.; Shokouhi, P.; Torkzaban, P. A decision support system based on support vector machine for diagnosis of periodontal disease. BMC Res. Notes 2020, 13, 337. [Google Scholar] [CrossRef]
- Abdalla-Aslan, R.; Yeshua, T.; Kabla, D.; Leichter, I.; Nadler, C. An artificial intelligence system using machine-learning for automatic detection and classification of dental restorations in panoramic radiography. Oral Surg. Oral Med. Oral Pathol. Oral Radiol. 2020, 130, 593–602. [Google Scholar] [CrossRef]
- Li, B.; Feridooni, T.; Cuen-Ojeda, C.; Kishibe, T.; de Mestral, C.; Mamdani, M.; Al-Omran, M. Machine learning in vascular surgery: A systematic review and critical appraisal. NPJ Digit. Med. 2022, 5, 7. [Google Scholar] [CrossRef]
- Schwendicke, F.; Tzschoppe, M.; Paris, S. Radiographic caries detection: A systematic review and meta-analysis. J. Dent. 2015, 43, 924–933. [Google Scholar] [CrossRef]
- Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G.; PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 2009, 6, e1000097. [Google Scholar] [CrossRef] [Green Version]
- Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 Statement: An Updated Guideline for Reporting Systematic Reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
- Whiting, P.F.; Rutjes, A.W.S.; Westwood, M.E.; Mallett, S.; Deeks, J.J.; Reitsma, J.B.; Leeflang, M.M.G.; Sterne, J.A.C.; Bossuyt, P.M.M.; QUADAS-2 Group. QUADAS-2: A Revised Tool for the Quality Assessment of Diagnostic Accuracy Studies. Ann. Intern. Med. 2011, 155, 529–536. [Google Scholar] [CrossRef]
- Wolff, R.F.; Moons, K.G.; Riley, R.; Whiting, P.F.; Westwood, M.; Collins, G.S.; Reitsma, J.B.; Kleijnen, J.; Mallett, S.; for the PROBAST Group. PROBAST: A Tool to Assess the Risk of Bias and Applicability of Prediction Model Studies. Ann. Intern. Med. 2019, 170, 51–58. [Google Scholar] [CrossRef]
- Collins, G.S.; Reitsma, J.B.; Altman, D.G.; Moons, K.G.M. Transparent reporting of a multivariable prediction model for individual prognosis or diagnosis (TRIPOD): The TRIPOD Statement. BMC Med. 2015, 13, 214. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Aliaga, I.J.; Vera, V.; De Paz, J.F.; García, A.E.; Mohamad, M.S. Modelling the Longevity of Dental Restorations by means of a CBR System. BioMed Res. Int. 2015, 2015, 540306. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Gupta, A.; Kharbanda, O.P.; Sardana, V.; Balachandran, R.; Sardana, H.K. A knowledge-based algorithm for automatic detection of cephalometric landmarks on CBCT images. Int. J. Comput. Assist. Radiol. Surg. 2015, 10, 1737–1752. [Google Scholar] [CrossRef] [PubMed]
- Gupta, A.; Kharbanda, O.P.; Sardana, V.; Balachandran, R.; Sardana, H.K. Accuracy of 3D cephalometric measurements based on an automatic knowledge-based landmark detection algorithm. Int. J. Comput. Assist. Radiol. Surg. 2016, 11, 1297–1309. [Google Scholar] [CrossRef]
- Hadley, A.J.; Krival, K.R.; Ridgel, A.L.; Hahn, E.C.; Tyler, D.J. Neural Network Pattern Recognition of Lingual–Palatal Pressure for Automated Detection of Swallow. Dysphagia 2015, 30, 176–187. [Google Scholar] [CrossRef]
- Kavitha, M.S.; An, S.-Y.; An, C.-H.; Huh, K.-H.; Yi, W.-J.; Heo, M.-S.; Lee, S.-S.; Choi, S.-C. Texture analysis of mandibular cortical bone on digital dental panoramic radiographs for the diagnosis of osteoporosis in Korean women. Oral Surg. Oral Med. Oral Pathol. Oral Radiol. 2015, 119, 346–356. [Google Scholar] [CrossRef]
- Mansoor, A.; Patsekin, V.; Scherl, D.; Robinson, J.P.; Rajwa, B. A Statistical Modeling Approach to Computer-Aided Quantification of Dental Biofilm. IEEE J. Biomed. Health Inform. 2015, 19, 358–366. [Google Scholar] [CrossRef] [Green Version]
- Jung, S.-K.; Kim, T.-W. New approach for the diagnosis of extractions with neural network machine learning. Am. J. Orthod. Dentofac. Orthop. 2016, 149, 127–133. [Google Scholar] [CrossRef] [Green Version]
- Kavitha, M.S.; Kumar, P.G.; Park, S.-Y.; Huh, K.-H.; Heo, M.-S.; Kurita, T.; Asano, A.; An, S.-Y.; Chien, S.-I. Automatic detection of osteoporosis based on hybrid genetic swarm fuzzy classifier approaches. Dentomaxillofac. Radiol. 2016, 45, 20160076. [Google Scholar] [CrossRef] [Green Version]
- Mahmoud, Y.E.; Labib, S.S.; Mokhtar, H.M.O. Teeth periapical lesion prediction using machine learning techniques. In Proceedings of the 2016 SAI Computing Conference (SAI), London, UK, 13–15 July 2016; pp. 129–134. [Google Scholar] [CrossRef]
- Wang, L.; Li, S.; Chen, R.; Liu, S.-Y.; Chen, J.-C. An Automatic Segmentation and Classification Framework Based on PCNN Model for Single Tooth in MicroCT Images. PLoS ONE 2016, 11, e0157694. [Google Scholar] [CrossRef]
- De Tobel, J.; Radesh, P.; Vandermeulen, D.; Thevissen, P.W. An automated technique to stage lower third molar development on panoramic radiographs for age estimation: A pilot study. J. Forensic Odonto-Stomatol. 2017, 35, 42–54. [Google Scholar]
- Hwang, J.J.; Lee, J.-H.; Han, S.-S.; Kim, Y.H.; Jeong, H.-G.; Choi, Y.J.; Park, W. Strut analysis for osteoporosis detection model using dental panoramic radiography. Dentomaxillofac. Radiol. 2017, 46, 20170006. [Google Scholar] [CrossRef] [Green Version]
- Imangaliyev, S.; van der Veen, M.H.; Volgenant, C.; Loos, B.G.; Keijser, B.J.; Crielaard, W.; Levin, E. Classification of quantitative light-induced fluorescence images using convolutional neural network. arXiv 2017, arXiv:1705.09193. [Google Scholar]
- Johari, M.; Esmaeili, F.; Andalib, A.; Garjani, S.; Saberkari, H. Detection of vertical root fractures in intact and endodontically treated premolar teeth by designing a probabilistic neural network: An ex vivo study. Dentomaxillofac. Radiol. 2017, 46, 20160107. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Liu, Y.; Li, Y.; Fu, Y.; Liu, T.; Liu, X.; Zhang, X.; Fu, J.; Guan, X.; Chen, T.; Chen, X.; et al. Quantitative prediction of oral cancer risk in patients with oral leukoplakia. Oncotarget 2017, 8, 46057–46064. [Google Scholar] [CrossRef] [Green Version]
- Miki, Y.; Muramatsu, C.; Hayashi, T.; Zhou, X.; Hara, T.; Katsumata, A.; Fujita, H. Classification of teeth in cone-beam CT using deep convolutional neural network. Comput. Biol. Med. 2017, 80, 24–29. [Google Scholar] [CrossRef]
- Oktay, A.B. Tooth detection with Convolutional Neural Networks. In Proceedings of the 2017 Medical Technologies National Congress (TIPTEKNO), Trabzon, Turkey, 12–14 October 2017; pp. 1–4. [Google Scholar] [CrossRef]
- Prajapati, A.S.; Nagaraj, R.; Mitra, S. Classification of dental diseases using CNN and transfer learning. In Proceedings of the 2017 5th International Symposium on Computational and Business Intelligence (ISCBI), Dubai, United Arab Emirates, 11–14 August 2017; pp. 70–74. [Google Scholar]
- Raith, S.; Vogel, E.P.; Anees, N.; Keul, C.; Güth, J.-F.; Edelhoff, D.; Fischer, H. Artificial Neural Networks as a powerful numerical tool to classify specific features of a tooth based on 3D scan data. Comput. Biol. Med. 2017, 80, 65–76. [Google Scholar] [CrossRef]
- Rana, A.; Yauney, G.; Wong, L.C.; Gupta, O.; Muftu, A.; Shah, P. Automated segmentation of gingival diseases from oral images. In Proceedings of the 2017 IEEE Healthcare Innovations and Point of Care Technologies (HI-POCT), Bethesda, MD, USA, 6–8 November 2017; pp. 144–147. [Google Scholar] [CrossRef]
- Srivastava, M.M.; Kumar, P.; Pradhan, L.; Varadarajan, S. Detection of tooth caries in bitewing radiographs using deep learning. arXiv 2017, arXiv:1711.07312. [Google Scholar]
- Štepanovský, M.; Ibrová, A.; Buk, Z.; Velemínská, J. Novel age estimation model based on development of permanent teeth compared with classical approach and other modern data mining methods. Forensic Sci. Int. 2017, 279, 72–82. [Google Scholar] [CrossRef]
- Yilmaz, E.; Kayikcioglu, T.; Kayipmaz, S. Computer-aided diagnosis of periapical cyst and keratocystic odontogenic tumor on cone beam computed tomography. Comput. Methods Programs Biomed. 2017, 146, 91–100. [Google Scholar] [CrossRef]
- Du, X.; Chen, Y.; Zhao, J.; Xi, Y. A Convolutional Neural Network Based Auto-Positioning Method for Dental Arch In Rotational Panoramic Radiography. In Proceedings of the 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018; pp. 2615–2618. [Google Scholar] [CrossRef]
- Egger, J.; Pfarrkirchner, B.; Gsaxner, C.; Lindner, L.; Schmalstieg, D.; Wallner, J. Fully Convolutional Mandible Segmentation on a valid Ground-Truth Dataset. In Proceedings of the 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018; pp. 656–660. [Google Scholar] [CrossRef]
- Fakhriy, N.A.A.; Ardiyanto, I.; Nugroho, H.A.; Pratama, G.N.P. Machine Learning Algorithms for Classifying Abscessed and Impacted Tooth: Comparison Study. In Proceedings of the 2018 2nd International Conference on Biomedical Engineering (IBIOMED), Bali, Indonesia, 24–26 July 2018; pp. 88–93. [Google Scholar] [CrossRef]
- Fariza, A.; Arifin, A.Z.; Astuti, E.R. Interactive Segmentation of Conditional Spatial FCM with Gaussian Kernel-Based for Panoramic Radiography. In Proceedings of the 2018 International Symposium on Advanced Intelligent Informatics (SAIN), Yogyakarta, Indonesia, 29–30 August 2018; pp. 157–161. [Google Scholar] [CrossRef]
- Gavinho, L.G.; Araujo, S.A.; Bussadori, S.K.; Silva, J.V.P.; Deana, A.M. Detection of white spot lesions by segmenting laser speckle images using computer vision methods. Lasers Med. Sci. 2018, 33, 1565–1571. [Google Scholar] [CrossRef]
- Ha, S.-R.; Park, H.S.; Kim, E.-H.; Kim, H.-K.; Yang, J.-Y.; Heo, J.; Yeo, I.-S.L. A pilot study using machine learning methods about factors influencing prognosis of dental implants. J. Adv. Prosthodont. 2018, 10, 395–400. [Google Scholar] [CrossRef] [Green Version]
- Heinrich, A.; Güttler, F.; Wendt, S.; Schenkl, S.; Hubig, M.; Wagner, R.; Mall, G.; Teichgräber, U. Forensic Odontology: Automatic Identification of Persons Comparing Antemortem and Postmortem Panoramic Radiographs Using Computer Vision. Rofo 2018, 190, 1152–1158. [Google Scholar] [CrossRef] [Green Version]
- Jader, G.; Fontineli, J.; Ruiz, M.; Abdalla, K.; Pithon, M.; Oliveira, L. Deep Instance Segmentation of Teeth in Panoramic X-ray Images. In Proceedings of the 2018 31st SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI), Parana, Brazil, 29 October–1 November 2018; pp. 400–407. [Google Scholar] [CrossRef]
- Jiang, M.-X.; Chen, Y.-M.; Huang, W.-H.; Huang, P.-H.; Tsai, Y.-H.; Huang, Y.-H.; Chiang, C.-K. Teeth-Brushing Recognition Based on Deep Learning. In Proceedings of the 2018 IEEE International Conference on Consumer Electronics-Taiwan (ICCE-TW), Taichung, Taiwan, 19–21 May 2018; pp. 1–2. [Google Scholar] [CrossRef]
- Kim, D.W.; Kim, H.; Nam, W.; Kim, H.J.; Cha, I.-H. Machine learning to predict the occurrence of bisphosphonate-related osteonecrosis of the jaw associated with dental extraction: A preliminary report. Bone 2018, 116, 207–214. [Google Scholar] [CrossRef]
- Lee, J.-H.; Kim, D.-H.; Jeong, S.-N.; Choi, S.-H. Diagnosis and prediction of periodontally compromised teeth using a deep learning-based convolutional neural network algorithm. J. Periodontal Implant. Sci. 2018, 48, 114–123. [Google Scholar] [CrossRef] [Green Version]
- Lee, J.-H.; Kim, D.-H.; Jeong, S.-N.; Choi, S.-H. Detection and diagnosis of dental caries using a deep learning-based convolutional neural network algorithm. J. Dent. 2018, 77, 106–111. [Google Scholar] [CrossRef]
- Lu, S.; Yang, J.; Wang, W.; Li, Z.; Lu, Z. Teeth Classification Based on Extreme Learning Machine. In Proceedings of the 2018 Second World Conference on Smart Trends in Systems, Security and Sustainability (WorldS4), London, UK, 30–31 October 2018; pp. 198–202. [Google Scholar] [CrossRef]
- Shah, H.; Hernandez, P.; Budin, F.; Chittajallu, D.; Vimort, J.B.; Walters, R.; Mol, A.; Khan, A.; Paniagua, B. Automatic quantification framework to detect cracks in teeth. Proc. SPIE Int. Soc. Opt. Eng. 2018, 10578, 105781K. [Google Scholar] [CrossRef] [Green Version]
- Song, B.; Sunny, S.; Uthoff, R.; Patrick, S.; Suresh, A.; Kolur, T.; Keerthi, G.; Anbarani, A.; Wilder-Smith, P.; Kuriakose, M.A.; et al. Automatic classification of dual-modality, smartphone-based oral dysplasia and malignancy images using deep learning. Biomed. Opt. Express 2018, 9, 5318–5329. [Google Scholar] [CrossRef]
- Thanathornwong, B. Bayesian-Based Decision Support System for Assessing the Needs for Orthodontic Treatment. Health Inform. Res. 2018, 24, 22–28. [Google Scholar] [CrossRef]
- Yang, J.; Xie, Y.; Liu, L.; Xia, B.; Cao, Z.; Guo, C. Automated Dental Image Analysis by Deep Learning on Small Dataset. In Proceedings of the 2018 IEEE 42nd Annual Computer Software and Applications Conference (COMPSAC), Tokyo, Japan, 23–27 July 2018; pp. 492–497. [Google Scholar] [CrossRef]
- Yoon, S.; Odlum, M.; Lee, Y.; Choi, T.; Kronish, I.M.; Davidson, K.W.; Finkelstein, J. Applying Deep Learning to Understand Predictors of Tooth Mobility Among Urban Latinos. Stud. Health Technol. Inform. 2018, 251, 241–244. [Google Scholar]
- Zakirov, A.; Ezhov, M.; Gusarev, M.; Alexandrovsky, V.; Shumilov, E. Dental pathology detection in 3D cone-beam CT. arXiv 2018, arXiv:1810.10309. [Google Scholar]
- Zanella-Calzada, L.A.; Galván-Tejada, C.E.; Chávez-Lamas, N.M.; Rivas-Gutierrez, J.; Magallanes-Quintanar, R.; Celaya-Padilla, J.M.; Galván-Tejada, J.I.; Gamboa-Rosales, H. Deep Artificial Neural Networks for the Diagnostic of Caries Using Socioeconomic and Nutritional Features as Determinants: Data from NHANES 2013–2014. Bioengineering 2018, 5, 47. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Zhang, K.; Wu, J.; Chen, H.; Lyu, P. An effective teeth recognition method using label tree with cascade network structure. Comput. Med. Imaging Graph. 2018, 68, 61–70. [Google Scholar] [CrossRef] [PubMed]
- Ali, H.; Khursheed, M.; Fatima, S.K.; Shuja, S.M.; Noor, S. Object Recognition for Dental Instruments Using SSD-MobileNet. In Proceedings of the 2019 International Conference on Information Science and Communication Technology (ICISCT), Karachi, Pakistan, 9–10 March 2019; pp. 1–6. [Google Scholar] [CrossRef]
- Alkaabi, S.; Yussof, S.; Al-Mulla, S. Evaluation of Convolutional Neural Network based on Dental Images for Age Estimation. In Proceedings of the 2019 International Conference on Electrical and Computing Technologies and Applications (ICECTA), Ras Al Khaimah, United Arab Emirates, 19–21 November 2019; pp. 1–5. [Google Scholar] [CrossRef]
- Askarian, B.; Tabei, F.; Tipton, G.A.; Chong, J.W. Smartphone-Based Method for Detecting Periodontal Disease. In Proceedings of the 2019 IEEE Healthcare Innovations and Point of Care Technologies, (HI-POCT), Bethesda, MD, USA, 20–22 November 2019; pp. 53–55. [Google Scholar] [CrossRef]
- Bouchahma, M.; Ben Hammouda, S.; Kouki, S.; Alshemaili, M.; Samara, K. An Automatic Dental Decay Treatment Prediction using a Deep Convolutional Neural Network on X-ray Images. In Proceedings of the 2019 IEEE/ACS 16th International Conference on Computer Systems and Applications (AICCSA), Abu Dhabi, United Arab Emirates, 3–7 November 2019; pp. 1–4. [Google Scholar] [CrossRef]
- Casalegno, F.; Newton, T.; Daher, R.; Abdelaziz, M.; Lodi-Rizzini, A.; Schürmann, F.; Krejci, I.; Markram, H. Caries Detection with Near-Infrared Transillumination Using Deep Learning. J. Dent. Res. 2019, 98, 1227–1233. [Google Scholar] [CrossRef] [Green Version]
- Chen, H.; Zhang, K.; Lyu, P.; Li, H.; Zhang, L.; Wu, J.; Lee, C.-H. A deep learning approach to automatic teeth detection and numbering based on object detection in dental periapical films. Sci. Rep. 2019, 9, 3840. [Google Scholar] [CrossRef] [Green Version]
- Cheng, B.; Wang, W. Dental hard tissue morphological segmentation with sparse representation-based classifier. Med. Biol. Eng. Comput. 2019, 57, 1629–1643. [Google Scholar] [CrossRef]
- Chin, C.-L.; Lin, J.-W.; Wei, C.-S.; Hsu, M.-C. Dentition Labeling and Root Canal Recognition Using GAN and Rule-Based System. In Proceedings of the 2019 International Conference on Technologies and Applications of Artificial Intelligence (TAAI), Kaohsiung, Taiwan, 21–23 November 2019; pp. 1–6. [Google Scholar] [CrossRef]
- Choi, H.-I.; Jung, S.-K.; Baek, S.-H.; Lim, W.H.; Ahn, S.-J.; Yang, I.-H.; Kim, T.-W. Artificial Intelligent Model with Neural Network Machine Learning for the Diagnosis of Orthognathic Surgery. J. Craniofac. Surg. 2019, 30, 1986–1989. [Google Scholar] [CrossRef]
- Cui, Z.; Li, C.; Wang, W. ToothNet: Automatic Tooth Instance Segmentation and Identification from Cone Beam CT Images. In Proceedings of the 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), Long Beach, CA, USA, 15–20 June 2019; pp. 6361–6370. [Google Scholar] [CrossRef]
- Dasanayaka, C.; Dharmasena, B.; Bandara, W.R.; Dissanayake, M.B.; Jayasinghe, R. Segmentation of Mental Foramen in Dental Panoramic Tomography using Deep Learning. In Proceedings of the 2019 14th Conference on Industrial and Information Systems (ICIIS), Kandy, Sri Lanka, 18–20 December 2019; pp. 81–84. [Google Scholar] [CrossRef]
- Cruz, J.C.D.; Garcia, R.G.; Cueto, J.C.C.V.; Pante, S.C.; Toral, C.G.V. Automated Human Identification through Dental Image Enhancement and Analysis. In Proceedings of the 2019 IEEE 11th International Conference on Humanoid, Nanotechnology, Information Technology, Communication and Control, Environment, and Management (HNICEM), Laoag, Philippines, 29 November–1 December 2019; pp. 1–6. [Google Scholar] [CrossRef]
- Duong, D.Q.; Nguyen, K.-C.T.; Kaipatur, N.R.; Lou, E.H.M.; Noga, M.; Major, P.W.; Punithakumar, K.; Le, L.H. Fully Automated Segmentation of Alveolar Bone Using Deep Convolutional Neural Networks from Intraoral Ultrasound Images. In Proceedings of the 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 6632–6635. [Google Scholar] [CrossRef]
- Ekert, T.; Krois, J.; Meinhold, L.; Elhennawy, K.; Emara, R.; Golla, T.; Schwendicke, F. Deep Learning for the Radiographic Detection of Apical Lesions. J. Endod. 2019, 45, 917–922.e5. [Google Scholar] [CrossRef]
- Hatvani, J.; Basarab, A.; Tourneret, J.-Y.; Gyongy, M.; Kouame, D. A Tensor Factorization Method for 3-D Super Resolution with Application to Dental CT. IEEE Trans. Med. Imaging 2019, 38, 1524–1531. [Google Scholar] [CrossRef] [Green Version]
- Hatvani, J.; Horvath, A.; Michetti, J.; Basarab, A.; Kouame, D.; Gyongy, M. Deep Learning-Based Super-Resolution Applied to Dental Computed Tomography. IEEE Trans. Radiat. Plasma Med. Sci. 2019, 3, 120–128. [Google Scholar] [CrossRef] [Green Version]
- Hegazy, M.A.A.; Cho, M.H.; Cho, M.H.; Lee, S.Y. U-net based metal segmentation on projection domain for metal artifact reduction in dental CT. Biomed. Eng. Lett. 2019, 9, 375–385. [Google Scholar] [CrossRef] [PubMed]
- Hiraiwa, T.; Ariji, Y.; Fukuda, M.; Kise, Y.; Nakata, K.; Katsumata, A.; Fujita, H.; Ariji, E. A deep-learning artificial intelligence system for assessment of root morphology of the mandibular first molar on panoramic radiography. Dentomaxillofac. Radiol. 2019, 48, 20180218. [Google Scholar] [CrossRef] [PubMed]
- Hu, Z.; Jiang, C.; Sun, F.; Zhang, Q.; Ge, Y.; Yang, Y.; Liu, X.; Zheng, H.; Liang, D. Artifact correction in low-dose dental CT imaging using Wasserstein generative adversarial networks. Med. Phys. 2019, 46, 1686–1696. [Google Scholar] [CrossRef] [PubMed]
- Hung, M.; Voss, M.W.; Rosales, M.N.; Li, W.; Su, W.; Xu, J.; Bounsanga, J.; Ruiz-Negrón, B.; Lauren, E.; Licari, F.W. Application of machine learning for diagnostic prediction of root caries. Gerodontology 2019, 36, 395–404. [Google Scholar] [CrossRef] [PubMed]
- Ilic, I.; Vodanovic, M.; Subasic, M. Gender Estimation from Panoramic Dental X-ray Images using Deep Convolutional Networks. In Proceedings of the IEEE EUROCON 2019 - 18th International Conference on Smart Technologies, Novi Sad, Serbia, 1–4 July 2019; pp. 1–5. [Google Scholar] [CrossRef]
- Kats, L.; Vered, M.; Zlotogorski-Hurvitz, A.; Harpaz, I. Atherosclerotic carotid plaque on panoramic radiographs: Neural network detection. Int. J. Comput. Dent. 2019, 22, 163–169. [Google Scholar]
- Kats, L.; Vered, M.; Zlotogorski-Hurvitz, A.; Harpaz, I. Atherosclerotic carotid plaques on panoramic imaging: An automatic detection using deep learning with small dataset. arXiv 2018, arXiv:1808.08093. [Google Scholar]
- Kim, D.W.; Lee, S.; Kwon, S.; Nam, W.; Cha, I.-H.; Kim, H.J. Deep learning-based survival prediction of oral cancer patients. Sci. Rep. 2019, 9, 6994. [Google Scholar] [CrossRef] [Green Version]
- Kim, J.; Lee, H.-S.; Song, I.-S.; Jung, K.-H. DeNTNet: Deep Neural Transfer Network for the detection of periodontal bone loss using panoramic dental radiographs. Sci. Rep. 2019, 9, 17615. [Google Scholar] [CrossRef] [Green Version]
- Kise, Y.; Shimizu, M.; Ikeda, H.; Fujii, T.; Kuwada, C.; Nishiyama, M.; Funakoshi, T.; Ariji, Y.; Fujita, H.; Katsumata, A.; et al. Usefulness of a deep learning system for diagnosing Sjögren’s syndrome using ultrasonography images. Dentomaxillofac. Radiol. 2020, 49, 20190348. [Google Scholar] [CrossRef]
- Koch, T.L.; Perslev, M.; Igel, C.; Brandt, S.S. Accurate segmentation of dental panoramic radiographs with U-NETS. In Proceedings of the 2019 IEEE 16th International Symposium on Biomedical Imaging (ISBI 2019), Venice, Italy, 8–11 April 2019; pp. 15–19. [Google Scholar]
- Krois, J.; Ekert, T.; Meinhold, L.; Golla, T.; Kharbot, B.; Wittemeier, A.; Dörfer, C.; Schwendicke, F. Deep Learning for the Radiographic Detection of Periodontal Bone Loss. Sci. Rep. 2019, 9, 8495. [Google Scholar] [CrossRef] [Green Version]
- Lee, J.-S.; Adhikari, S.; Liu, L.; Jeong, H.-G.; Kim, H.; Yoon, S.-J. Osteoporosis detection in panoramic radiographs using a deep convolutional neural network-based computer-assisted diagnosis system: A preliminary study. Dentomaxillofac. Radiol. 2019, 48, 20170344. [Google Scholar] [CrossRef] [PubMed]
- Li, X.; Zhang, Y.; Cui, Q.; Yi, X.; Zhang, Y. Tooth-Marked Tongue Recognition Using Multiple Instance Learning and CNN Features. IEEE Trans. Cybern. 2019, 49, 380–387. [Google Scholar] [CrossRef] [PubMed]
- Liu, L.; Xu, J.; Huan, Y.; Zou, Z.; Yeh, S.-C.; Zheng, L.-R. A Smart Dental Health-IoT Platform Based on Intelligent Hardware, Deep Learning, and Mobile Terminal. IEEE J. Biomed. Health Inform. 2020, 24, 898–906. [Google Scholar] [CrossRef] [PubMed]
- Liu, Y.; Shang, X.; Shen, Z.; Hu, B.; Wang, Z.; Xiong, G. 3D Deep Learning for 3D Printing of Tooth Model. In Proceedings of the 2019 IEEE International Conference on Service Operations and Logistics, and Informatics (SOLI), Zhengzhou, China, 6–8 November 2019; pp. 274–279. [Google Scholar] [CrossRef]
- Milosevic, D.; Vodanovic, M.; Galic, I.; Subasic, M. Estimating Biological Gender from Panoramic Dental X-ray Images. In Proceedings of the 2019 11th International Symposium on Image and Signal Processing and Analysis (ISPA), Dubrovnik, Croatia, 23–25 September 2019; pp. 105–110. [Google Scholar] [CrossRef]
- Minnema, J.; van Eijnatten, M.; Hendriksen, A.A.; Liberton, N.; Pelt, D.M.; Batenburg, K.J.; Forouzanfar, T.; Wolff, J. Segmentation of dental cone-beam CT scans affected by metal artifacts using a mixed-scale dense convolutional neural network. Med. Phys. 2019, 46, 5027–5035. [Google Scholar] [CrossRef] [PubMed]
- Moriyama, Y.; Lee, C.; Date, S.; Kashiwagi, Y.; Narukawa, Y.; Nozaki, K.; Murakami, S. Evaluation of Dental Image Augmentation for the Severity Assessment of Periodontal Disease. In Proceedings of the 2019 International Conference on Computational Science and Computational Intelligence (CSCI), Las Vegas, NV, USA, 5–7 December 2019; pp. 924–929. [Google Scholar] [CrossRef]
- Moutselos, K.; Berdouses, E.; Oulis, C.; Maglogiannis, I. Recognizing Occlusal Caries in Dental Intraoral Images Using Deep Learning. In Proceedings of the 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 1617–1620. [Google Scholar] [CrossRef]
- Murata, M.; Ariji, Y.; Ohashi, Y.; Kawai, T.; Fukuda, M.; Funakoshi, T.; Kise, Y.; Nozawa, M.; Katsumata, A.; Fujita, H.; et al. Deep-learning classification using convolutional neural network for evaluation of maxillary sinusitis on panoramic radiography. Oral Radiol. 2019, 35, 301–307. [Google Scholar] [CrossRef]
- Patcas, R.; Bernini, D.; Volokitin, A.; Agustsson, E.; Rothe, R.; Timofte, R. Applying artificial intelligence to assess the impact of orthognathic treatment on facial attractiveness and estimated age. Int. J. Oral Maxillofac. Surg. 2019, 48, 77–83. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Patcas, R.; Timofte, R.; Volokitin, A.; Agustsson, E.; Eliades, T.; Eichenberger, M.; Bornstein, M.M. Facial attractiveness of cleft patients: A direct comparison between artificial-intelligence-based scoring and conventional rater groups. Eur. J. Orthod. 2019, 41, 428–433. [Google Scholar] [CrossRef]
- Sajad, M.; Shafi, I.; Ahmad, J. Automatic Lesion Detection in Periapical X-rays. In Proceedings of the 2019 International Conference on Electrical, Communication, and Computer Engineering (ICECCE), Swat, Pakistan, 24–25 July 2019; pp. 1–6. [Google Scholar] [CrossRef]
- Senirkentli, G.B.; Sen, S.; Farsak, O.; Bostanci, E. A Neural Expert System Based Dental Trauma Diagnosis Application. In Proceedings of the 2019 Medical Technologies Congress (TIPTEKNO), Izmir, Turkey, 3–5 October 2019; pp. 1–4. [Google Scholar] [CrossRef]
- Stark, B.; Samarah, M. Ensemble and Deep Learning for Real-time Sensors: Evaluation of algorithms for real-time sensors with application for detecting brushing location. In Proceedings of the 2019 IEEE 5th International Conference on Computer and Communications (ICCC), Chengdu, China, 5–9 December 2019; pp. 555–559. [Google Scholar] [CrossRef]
- Tian, S.; Dai, N.; Zhang, B.; Yuan, F.; Yu, Q.; Cheng, X. Automatic Classification and Segmentation of Teeth on 3D Dental Model Using Hierarchical Deep Learning Networks. IEEE Access 2019, 7, 84817–84828. [Google Scholar] [CrossRef]
- Tuzoff, D.V.; Tuzova, L.N.; Bornstein, M.M.; Krasnov, A.S.; Kharchenko, M.A.; Nikolenko, S.I.; Sveshnikov, M.M.; Bednenko, G.B. Tooth detection and numbering in panoramic radiographs using convolutional neural networks. Dentomaxillofac. Radiol. 2019, 48, 20180051. [Google Scholar] [CrossRef]
- Vinayahalingam, S.; Xi, T.; Bergé, S.; Maal, T.; de Jong, G. Automated detection of third molars and mandibular nerve by deep learning. Sci. Rep. 2019, 9, 9007. [Google Scholar] [CrossRef] [Green Version]
- Woo, J.; Xing, F.; Prince, J.L.; Stone, M.; Green, J.R.; Goldsmith, T.; Reese, T.G.; Wedeen, V.J.; El Fakhri, G. Differentiating post-cancer from healthy tongue muscle coordination patterns during speech using deep learning. J. Acoust. Soc. Am. 2019, 145, EL423–EL429. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Xu, X.; Liu, C.; Zheng, Y. 3D Tooth Segmentation and Labeling Using Deep Convolutional Neural Networks. IEEE Trans. Vis. Comput. Graph. 2019, 25, 2336–2348. [Google Scholar] [CrossRef]
- Yamaguchi, S.; Lee, C.; Karaer, O.; Ban, S.; Mine, A.; Imazato, S. Predicting the Debonding of CAD/CAM Composite Resin Crowns with AI. J. Dent. Res. 2019, 98, 1234–1238. [Google Scholar] [CrossRef] [PubMed]
- Yauney, G.; Rana, A.; Wong, L.C.; Javia, P.; Muftu, A.; Shah, P. Automated Process Incorporating Machine Learning Segmentation and Correlation of Oral Diseases with Systemic Health. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 3387–3393. [Google Scholar] [CrossRef] [Green Version]
- Alalharith, D.M.; Alharthi, H.M.; Alghamdi, W.M.; Alsenbel, Y.M.; Aslam, N.; Khan, I.U.; Shahin, S.Y.; Dianišková, S.; Alhareky, M.S.; Barouch, K.K. A Deep Learning-Based Approach for the Detection of Early Signs of Gingivitis in Orthodontic Patients Using Faster Region-Based Convolutional Neural Networks. Int. J. Environ. Res. Public Health 2020, 17, 8447. [Google Scholar] [CrossRef] [PubMed]
- Aliaga, I.; Vera, V.; Vera, M.; García, E.; Pedrera, M.; Pajares, G. Automatic computation of mandibular indices in dental panoramic radiographs for early osteoporosis detection. Artif. Intell. Med. 2020, 103, 101816. [Google Scholar] [CrossRef]
- Banar, N.; Bertels, J.; Laurent, F.; Boedi, R.M.; De Tobel, J.; Thevissen, P.; Vandermeulen, D. Towards fully automated third molar development staging in panoramic radiographs. Int. J. Leg. Med. 2020, 134, 1831–1841. [Google Scholar] [CrossRef]
- Cantu, A.G.; Gehrung, S.; Krois, J.; Chaurasia, A.; Rossi, J.G.; Gaudin, R.; Elhennawy, K.; Schwendicke, F. Detecting caries lesions of different radiographic extension on bitewings using deep learning. J. Dent. 2020, 100, 103425. [Google Scholar] [CrossRef]
- Chang, H.-J.; Lee, S.-J.; Yong, T.-H.; Shin, N.-Y.; Jang, B.-G.; Kim, J.-E.; Huh, K.-H.; Lee, S.-S.; Heo, M.-S.; Choi, S.-C.; et al. Deep Learning Hybrid Method to Automatically Diagnose Periodontal Bone Loss and Stage Periodontitis. Sci. Rep. 2020, 10, 7531. [Google Scholar] [CrossRef]
- Chen, S.; Wang, L.; Li, G.; Wu, T.-H.; Diachina, S.; Tejera, B.; Kwon, J.J.; Lin, F.-C.; Lee, Y.-T.; Xu, T.; et al. Machine Learning in Orthodontics: Introducing a 3D Auto-segmentation and Auto-landmark Finder of CBCT Images to Assess Maxillary Constriction in Unilateral Impacted Canine Patients. Angle Orthod. 2020, 90, 77–84. [Google Scholar] [CrossRef] [Green Version]
- Chen, Y.; Du, H.; Yun, Z.; Yang, S.; Dai, Z.; Zhong, L.; Feng, Q.; Yang, W. Automatic Segmentation of Individual Tooth in Dental CBCT Images from Tooth Surface Map by a Multi-Task FCN. IEEE Access 2020, 8, 97296–97309. [Google Scholar] [CrossRef]
- Chung, M.; Lee, M.; Hong, J.; Park, S.; Lee, J.; Lee, J.; Yang, I.-H.; Lee, J.; Shin, Y.-G. Pose-aware instance segmentation framework from cone beam CT images for tooth segmentation. Comput. Biol. Med. 2020, 120, 103720. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Endres, M.; Hillen, F.; Salloumis, M.; Sedaghat, A.; Niehues, S.; Quatela, O.; Hanken, H.; Smeets, R.; Beck-Broichsitter, B.; Rendenbach, C.; et al. Development of a Deep Learning Algorithm for Periapical Disease Detection in Dental Radiographs. Diagnostics 2020, 10, 430. [Google Scholar] [CrossRef] [PubMed]
- Fan, F.; Ke, W.; Wu, W.; Tian, X.; Lyu, T.; Liu, Y.; Liao, P.; Dai, X.; Chen, H.; Deng, Z. Automatic human identification from panoramic dental radiographs using the convolutional neural network. Forensic Sci. Int. 2020, 314, 110416. [Google Scholar] [CrossRef]
- Fujima, N.; Andreu-Arasa, V.C.; Meibom, S.K.; Mercier, G.A.; Salama, A.R.; Truong, M.T.; Sakai, O. Deep learning analysis using FDG-PET to predict treatment outcome in patients with oral cavity squamous cell carcinoma. Eur. Radiol. 2020, 30, 6322–6330. [Google Scholar] [CrossRef]
- Fukuda, M.; Ariji, Y.; Kise, Y.; Nozawa, M.; Kuwada, C.; Funakoshi, T.; Muramatsu, C.; Fujita, H.; Katsumata, A.; Ariji, E. Comparison of 3 deep learning neural networks for classifying the relationship between the mandibular third molar and the mandibular canal on panoramic radiographs. Oral Surg. Oral Med. Oral Pathol. Oral Radiol. 2020, 130, 336–343. [Google Scholar] [CrossRef] [PubMed]
- Fukuda, M.; Inamoto, K.; Shibata, N.; Ariji, Y.; Yanashita, Y.; Kutsuna, S.; Nakata, K.; Katsumata, A.; Fujita, H.; Ariji, E. Evaluation of an artificial intelligence system for detecting vertical root fracture on panoramic radiography. Oral Radiol. 2020, 36, 337–343. [Google Scholar] [CrossRef]
- Geetha, V.; Aprameya, K.S.; Hinduja, D.M. Dental caries diagnosis in digital radiographs using back-propagation neural network. Health Inf. Sci. Syst. 2020, 8, 8. [Google Scholar] [CrossRef] [PubMed]
- Hung, M.; Hon, E.S.; Ruiz-Negron, B.; Lauren, E.; Moffat, R.; Su, W.; Xu, J.; Park, J.; Prince, D.; Cheever, J.; et al. Exploring the Intersection between Social Determinants of Health and Unmet Dental Care Needs Using Deep Learning. Int. J. Environ. Res. Public Health 2020, 17, 7286. [Google Scholar] [CrossRef]
- Hung, M.; Li, W.; Hon, E.S.; Su, S.; Su, W.; He, Y.; Sheng, X.; Holubkov, R.; Lipsky, M.S. Prediction of 30-Day Hospital Readmissions for All-Cause Dental Conditions using Machine Learning. Risk Manag. Health Policy 2020, 13, 2047–2056. [Google Scholar] [CrossRef]
- Jaskari, J.; Sahlsten, J.; Järnstedt, J.; Mehtonen, H.; Karhu, K.; Sundqvist, O.; Hietanen, A.; Varjonen, V.; Mattila, V.; Kaski, K. Deep Learning Method for Mandibular Canal Segmentation in Dental Cone Beam Computed Tomography Volumes. Sci. Rep. 2020, 10, 5842. [Google Scholar] [CrossRef] [Green Version]
- Jeong, S.H.; Yun, J.P.; Yeom, H.-G.; Lim, H.J.; Lee, J.; Kim, B.C. Deep learning based discrimination of soft tissue profiles requiring orthognathic surgery by facial photographs. Sci. Rep. 2020, 10, 16235. [Google Scholar] [CrossRef] [PubMed]
- Joshi, S.V.; Kanphade, R.D. Deep Learning Based Person Authentication Using Hand Radiographs: A Forensic Approach. IEEE Access 2020, 8, 95424–95434. [Google Scholar] [CrossRef]
- Kats, L.; Vered, M.; Blumer, S.; Kats, E. Neural Network Detection and Segmentation of Mental Foramen in Panoramic Imaging. J. Clin. Pediatr. Dent. 2020, 44, 168–173. [Google Scholar] [CrossRef] [PubMed]
- Khan, H.A.; Haider, M.A.; Ansari, H.A.; Ishaq, H.; Kiyani, A.; Sohail, K.; Muhammad, M.; Khurram, S.A. Automated feature detection in dental periapical radiographs by using deep learning. Oral Surg. Oral Med. Oral Pathol. Oral Radiol. 2021, 131, 711–720. [Google Scholar] [CrossRef]
- Kim, H.; Shim, E.; Park, J.; Kim, Y.-J.; Lee, U.; Kim, Y. Web-based fully automated cephalometric analysis by deep learning. Comput. Methods Programs Biomed. 2020, 194, 105513. [Google Scholar] [CrossRef] [PubMed]
- Kim, I.; Misra, D.; Rodriguez, L.; Gill, M.; Liberton, D.K.; Almpani, K.; Lee, J.S.; Antani, S. Malocclusion Classification on 3D Cone-Beam CT Craniofacial Images Using Multi-Channel Deep Learning Models. In Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, 20–24 July 2020; pp. 1294–1298. [Google Scholar] [CrossRef]
- Kim, J.-E.; Nam, N.-E.; Shim, J.-S.; Jung, Y.-H.; Cho, B.-H.; Hwang, J.J. Transfer Learning via Deep Neural Networks for Implant Fixture System Classification Using Periapical Radiographs. J. Clin. Med. 2020, 9, 1117. [Google Scholar] [CrossRef] [Green Version]
- Kunz, F.; Stellzig-Eisenhauer, A.; Zeman, F.; Boldt, J. Artificial intelligence in orthodontics: Evaluation of a fully automated cephalometric analysis using a customized convolutional neural network. J. Orofac. Orthop. 2020, 81, 52–68. [Google Scholar] [CrossRef]
- Kuramoto, N.; Ichimura, K.; Jayatilake, D.; Shimokakimoto, T.; Hidaka, K.; Suzuki, K. Deep Learning-Based Swallowing Monitor for Realtime Detection of Swallow Duration. In Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, 20–24 July 2020; pp. 4365–4368. [Google Scholar] [CrossRef]
- Kuwada, C.; Ariji, Y.; Fukuda, M.; Kise, Y.; Fujita, H.; Katsumata, A.; Ariji, E. Deep learning systems for detecting and classifying the presence of impacted supernumerary teeth in the maxillary incisor region on panoramic radiographs. Oral Surg. Oral Med. Oral Pathol. Oral Radiol. 2020, 130, 464–469. [Google Scholar] [CrossRef]
- Kwak, G.H.; Kwak, E.-J.; Song, J.M.; Park, H.R.; Jung, Y.-H.; Cho, B.-H.; Hui, P.; Hwang, J.J. Automatic mandibular canal detection using a deep convolutional neural network. Sci. Rep. 2020, 10, 5711. [Google Scholar] [CrossRef] [Green Version]
- Lee, D.; Park, C.; Lim, Y.; Cho, H. A Metal Artifact Reduction Method Using a Fully Convolutional Network in the Sinogram and Image Domains for Dental Computed Tomography. J. Digit. Imaging 2020, 33, 538–546. [Google Scholar] [CrossRef]
- Lee, J.-H.; Han, S.-S.; Kim, Y.H.; Lee, C.; Kim, I. Application of a fully deep convolutional neural network to the automation of tooth segmentation on panoramic radiographs. Oral Surg. Oral Med. Oral Pathol. Oral Radiol. 2020, 129, 635–642. [Google Scholar] [CrossRef] [PubMed]
- Lee, J.-H.D.; Jeong, S.-N. Efficacy of deep convolutional neural network algorithm for the identification and classification of dental implant systems, using panoramic and periapical radiographs: A pilot study. Medicine 2020, 99, e20787. [Google Scholar] [CrossRef] [PubMed]
- Lee, J.-H.; Kim, D.-H.; Jeong, S.-N. Diagnosis of cystic lesions using panoramic and cone beam computed tomographic images based on deep learning neural network. Oral Dis. 2020, 26, 152–158. [Google Scholar] [CrossRef]
- Lee, J.-H.; Kim, Y.-T.; Lee, J.-B.; Jeong, S.-N. A Performance Comparison between Automated Deep Learning and Dental Professionals in Classification of Dental Implant Systems from Dental Imaging: A Multi-Center Study. Diagnostics 2020, 10, 910. [Google Scholar] [CrossRef]
- Lee, J.-H.; Yu, H.-J.; Kim, M.-J.; Kim, J.-W.; Choi, J. Automated cephalometric landmark detection with confidence regions using Bayesian convolutional neural networks. BMC Oral Health 2020, 20, 270. [Google Scholar] [CrossRef] [PubMed]
- Lee, K.-S.; Jung, S.-K.; Ryu, J.-J.; Shin, S.-W.; Choi, J. Evaluation of Transfer Learning with Deep Convolutional Neural Networks for Screening Osteoporosis in Dental Panoramic Radiographs. J. Clin. Med. 2020, 9, 392. [Google Scholar] [CrossRef]
- Lee, S.; Woo, S.; Yu, J.; Seo, J.; Lee, J.; Lee, C. Automated CNN-Based Tooth Segmentation in Cone-Beam CT for Dental Implant Planning. IEEE Access 2020, 8, 50507–50518. [Google Scholar] [CrossRef]
- Li, C.; Zhang, D.; Chen, S. Research about Tongue Image of Traditional Chinese Medicine (TCM) Based on Artificial Intelligence Technology. In Proceedings of the 2020 IEEE 5th Information Technology and Mechatronics Engineering Conference (ITOEC), Chongqing, China, 12–14 June 2020; pp. 633–636. [Google Scholar] [CrossRef]
- Li, Q.; Chen, K.; Han, L.; Zhuang, Y.; Li, J.; Lin, J. Automatic tooth roots segmentation of cone beam computed tomography image sequences using U-net and RNN. J. X-ray Sci. Technol. 2020, 28, 905–922. [Google Scholar] [CrossRef]
- Li, S.; Pang, Z.; Song, W.; Guo, Y.; You, W.; Hao, A.; Qin, H. Low-Shot Learning of Automatic Dental Plaque Segmentation Based on Local-to-Global Feature Fusion. In Proceedings of the 2020 IEEE 17th International Symposium on Biomedical Imaging (ISBI), Iowa City, IA, USA, 3–7 April 2020; pp. 664–668. [Google Scholar] [CrossRef]
- Lian, C.; Wang, L.; Wu, T.-H.; Wang, F.; Yap, P.-T.; Ko, C.-C.; Shen, D. Deep Multi-Scale Mesh Feature Learning for Automated Labeling of Raw Dental Surfaces From 3D Intraoral Scanners. IEEE Trans. Med. Imaging 2020, 39, 2440–2450. [Google Scholar] [CrossRef]
- Mahdi, F.P.; Motoki, K.; Kobashi, S. Optimization technique combined with deep learning method for teeth recognition in dental panoramic radiographs. Sci. Rep. 2020, 10, 19261. [Google Scholar] [CrossRef]
- Mallishery, S.; Chhatpar, P.; Banga, K.S.; Shah, T.; Gupta, P. The precision of case difficulty and referral decisions: An innovative automated approach. Clin. Oral Investig. 2020, 24, 1909–1915. [Google Scholar] [CrossRef] [PubMed]
- Matsuda, S.; Miyamoto, T.; Yoshimura, H.; Hasegawa, T. Personal identification with orthopantomography using simple convolutional neural networks: A preliminary study. Sci. Rep. 2020, 10, 13559. [Google Scholar] [CrossRef] [PubMed]
- Boedi, R.M.; Banar, N.; De Tobel, J.; Bertels, J.; Vandermeulen, D.; Thevissen, P.W. Effect of Lower Third Molar Segmentations on Automated Tooth Development Staging using a Convolutional Neural Network. J. Forensic Sci. 2020, 65, 481–486. [Google Scholar] [CrossRef] [PubMed]
- Ngoc, V.T.N.; Agwu, A.C.; Son, L.H.; Tuan, T.M.; Giap, C.N.; Thanh, M.T.G.; Duy, H.B.; Ngan, T.T. The Combination of Adaptive Convolutional Neural Network and Bag of Visual Words in Automatic Diagnosis of Third Molar Complications on Dental X-ray Images. Diagnostics 2020, 10, 209. [Google Scholar] [CrossRef] [PubMed]
- Oh, J.H.; Pouryahya, M.; Iyer, A.; Apte, A.P.; Deasy, J.O.; Tannenbaum, A. A novel kernel Wasserstein distance on Gaussian measures: An application of identifying dental artifacts in head and neck computed tomography. Comput. Biol. Med. 2020, 120, 103731. [Google Scholar] [CrossRef]
- Orhan, K.; Bayrakdar, I.S.; Ezhov, M.; Kravtsov, A.; Özyürek, T. Evaluation of artificial intelligence for detecting periapical pathosis on cone-beam computed tomography scans. Int. Endod. J. 2020, 53, 680–689. [Google Scholar] [CrossRef] [PubMed]
- Ren, J.; Fan, H.; Yang, J.; Ling, H. Detection of Trabecular Landmarks for Osteoporosis Prescreening in Dental Panoramic Radiographs. In Proceedings of the 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, 20–24 July 2020; pp. 2194–2197. [Google Scholar] [CrossRef]
- Schwendicke, F.; Elhennawy, K.; Paris, S.; Friebertshäuser, P.; Krois, J. Deep learning for caries lesion detection in near-infrared light transillumination images: A pilot study. J. Dent. 2020, 92, 103260. [Google Scholar] [CrossRef]
- Setzer, F.C.; Shi, K.J.; Zhang, Z.; Yan, H.; Yoon, H.; Mupparapu, M.; Li, J. Artificial Intelligence for the Computer-aided Detection of Periapical Lesions in Cone-beam Computed Tomographic Images. J. Endod. 2020, 46, 987–993. [Google Scholar] [CrossRef]
- Sukegawa, S.; Yoshii, K.; Hara, T.; Yamashita, K.; Nakano, K.; Yamamoto, N.; Nagatsuka, H.; Furuki, Y. Deep Neural Networks for Dental Implant System Classification. Biomolecules 2020, 10, 984. [Google Scholar] [CrossRef]
- Sun, D.; Pei, Y.; Song, G.; Guo, Y.; Ma, G.; Xu, T.; Zha, H. Tooth Segmentation and Labeling from Digital Dental Casts. In Proceedings of the 2020 IEEE 17th International Symposium on Biomedical Imaging (ISBI), Iowa City, IA, USA, 3–7 April 2020; pp. 669–673. [Google Scholar] [CrossRef]
- Takahashi, T.; Nozaki, K.; Gonda, T.; Ikebe, K. A system for designing removable partial dentures using artificial intelligence. Part 1. Classification of partially edentulous arches using a convolutional neural network. J. Prosthodont. Res. 2021, 65, 115–118. [Google Scholar] [CrossRef]
- Tang, W.; Gao, Y.; Liu, L.; Xia, T.; He, L.; Zhang, S.; Guo, J.; Li, W.; Xu, Q. An Automatic Recognition of Tooth-Marked Tongue Based on Tongue Region Detection and Tongue Landmark Detection via Deep Learning. IEEE Access 2020, 8, 153470–153478. [Google Scholar] [CrossRef]
- Thanathornwong, B.; Suebnukarn, S. Automatic detection of periodontal compromised teeth in digital panoramic radiographs using faster regional convolutional neural networks. Imaging Sci. Dent. 2020, 50, 169–174. [Google Scholar] [CrossRef] [PubMed]
- Vila-Blanco, N.; Carreira, M.J.; Varas-Quintana, P.; Balsa-Castro, C.; Tomas, I. Deep Neural Networks for Chronological Age Estimation From OPG Images. IEEE Trans. Med. Imaging 2020, 39, 2374–2384. [Google Scholar] [CrossRef] [PubMed]
- Vranckx, M.; Van Gerven, A.; Willems, H.; Vandemeulebroucke, A.; Leite, A.F.; Politis, C.; Jacobs, R. Artificial Intelligence (AI)-Driven Molar Angulation Measurements to Predict Third Molar Eruption on Panoramic Radiographs. Int. J. Environ. Res. Public Health 2020, 17, 3716. [Google Scholar] [CrossRef]
- Wang, X.; Liu, J.; Wu, C.; Liu, J.; Li, Q.; Chen, Y.; Wang, X.; Chen, X.; Pang, X.; Chang, B.; et al. Artificial intelligence in tongue diagnosis: Using deep convolutional neural network for recognizing unhealthy tongue with tooth-mark. Comput. Struct. Biotechnol. J. 2020, 18, 973–980. [Google Scholar] [CrossRef] [PubMed]
- Wang, Y.; Hays, R.D.; Marcus, M.; Maida, C.A.; Shen, J.; Xiong, D.; Coulter, I.D.; Lee, S.Y.; Spolsky, V.W.; Crall, J.J.; et al. Developing Children’s Oral Health Assessment Toolkits Using Machine Learning Algorithm. JDR Clin. Transl. Res. 2020, 5, 233–243. [Google Scholar] [CrossRef] [PubMed]
- Welch, M.L.; McIntosh, C.; Purdie, T.G.; Wee, L.; Traverso, A.; Dekker, A.; Haibe-Kains, B.; Jaffray, D.A. Automatic classification of dental artifact status for efficient image veracity checks: Effects of image resolution and convolutional neural network depth. Phys. Med. Biol. 2020, 65, 015005. [Google Scholar] [CrossRef]
- Welch, M.L.; McIntosh, C.; Traverso, A.; Wee, L.; Purdie, T.G.; Dekker, A.; Haibe-Kains, B.; Jaffray, D.A. External validation and transfer learning of convolutional neural networks for computed tomography dental artifact classification. Phys. Med. Biol. 2020, 65, 035017. [Google Scholar] [CrossRef]
- Welikala, R.A.; Remagnino, P.; Lim, J.H.; Chan, C.S.; Rajendran, S.; Kallarakkal, T.G.; Zain, R.B.; Jayasinghe, R.D.; Rimal, J.; Kerr, A.R.; et al. Automated Detection and Classification of Oral Lesions Using Deep Learning for Early Detection of Oral Cancer. IEEE Access 2020, 8, 132677–132693. [Google Scholar] [CrossRef]
- Yang, H.; Jo, E.; Kim, H.J.; Cha, I.-H.; Jung, Y.-S.; Nam, W.; Kim, J.-Y.; Kim, J.-K.; Kim, Y.H.; Oh, T.G.; et al. Deep Learning for Automated Detection of Cyst and Tumors of the Jaw in Panoramic Radiographs. J. Clin. Med. 2020, 9, 1839. [Google Scholar] [CrossRef]
- You, W.; Hao, A.; Li, S.; Wang, Y.; Xia, B. Deep learning-based dental plaque detection on primary teeth: A comparison with clinical assessments. BMC Oral Health 2020, 20, 141. [Google Scholar] [CrossRef] [PubMed]
- Zhang, X.; Liang, Y.; Li, W.; Liu, C.; Gu, D.; Sun, W.; Miao, L. Development and evaluation of deep learning for screening dental caries from oral photographs. Oral Dis. 2022, 28, 173–181. [Google Scholar] [CrossRef] [PubMed]
- Zheng, Z.; Yan, H.; Setzer, F.C.; Shi, K.J.; Mupparapu, M.; Li, J. Anatomically Constrained Deep Learning for Automating Dental CBCT Segmentation and Lesion Detection. IEEE Trans. Autom. Sci. Eng. 2021, 18, 603–614. [Google Scholar] [CrossRef]
- Zhu, G.; Piao, Z.; Kim, S.C. Tooth Detection and Segmentation with Mask R-CNN. In Proceedings of the 2020 International Conference on Artificial Intelligence in Information and Communication (ICAIIC), Fukuoka, Japan, 19–21 February 2020; pp. 070–072. [Google Scholar] [CrossRef]
- Alkhadar, H.; Macluskey, M.; White, S.; Ellis, I.; Gardner, A. Comparison of machine learning algorithms for the prediction of five-year survival in oral squamous cell carcinoma. J. Oral Pathol. Med. 2021, 50, 378–384. [Google Scholar] [CrossRef] [PubMed]
- Leite, A.F.; Van Gerven, A.; Willems, H.; Beznik, T.; Lahoud, P.; Gaêta-Araujo, H.; Vranckx, M.; Jacobs, R. Artificial intelligence-driven novel tool for tooth detection and segmentation on panoramic radiographs. Clin. Oral Investig. 2021, 25, 2257–2267. [Google Scholar] [CrossRef]
- Machado, R.A.; De Oliveira Silva, C.; Martelli-Junior, H.; das Neves, L.T.; Coletta, R.D. Machine learning in prediction of genetic risk of nonsyndromic oral clefts in the Brazilian population. Clin. Oral Investig. 2021, 25, 1273–1280. [Google Scholar] [CrossRef]
- Muramatsu, C.; Morishita, T.; Takahashi, R.; Hayashi, T.; Nishiyama, W.; Ariji, Y.; Zhou, X.; Hara, T.; Katsumata, A.; Ariji, E.; et al. Tooth detection and classification on panoramic radiographs for automatic dental chart filing: Improved classification by multi-sized input data. Oral Radiol. 2021, 37, 13–19. [Google Scholar] [CrossRef]
- Schwendicke, F.; Rossi, J.; Göstemeyer, G.; Elhennawy, K.; Cantu, A.; Gaudin, R.; Chaurasia, A.; Gehrung, S.; Krois, J. Cost-effectiveness of Artificial Intelligence for Proximal Caries Detection. J. Dent. Res. 2021, 100, 369–376. [Google Scholar] [CrossRef]
- Yasa, Y.; Çelik, Ö.; Bayrakdar, I.S.; Pekince, A.; Orhan, K.; Akarsu, S.; Atasoy, S.; Bilgir, E.; Odabaş, A.; Aslan, A.F. An artificial intelligence proposal to automatic teeth detection and numbering in dental bite-wing radiographs. Acta Odontol. Scand. 2021, 79, 275–281. [Google Scholar] [CrossRef]
- Rischke, R.; Schneider, L.; Müller, K.; Samek, W.; Schwendicke, F.; Krois, J. Federated Learning in Dentistry: Chances and Challenges. J. Dent. Res. 2022, 101, 1269–1273. [Google Scholar] [CrossRef]
- ITU/WHO. Focus Group on “Artificial Intelligence for Health”. 2018. Available online: https://www.itu.int/en/ITU-T/focusgroups/ai4h/Pages/default.aspx (accessed on 31 October 2022).
- Schwendicke, F.; Singh, T.; Lee, J.-H.; Gaudin, R.; Chaurasia, A.; Wiegand, T.; Uribe, S.; Krois, J.; on behalf of the IADR e-Oral Health Network; the ITU WHO Focus Group AI for Health. Artificial intelligence in dental research: Checklist for authors, reviewers, readers. J. Dent. 2021, 107, 103610. [Google Scholar] [CrossRef] [PubMed]
- Lones, M. How to avoid machine learning pitfalls: A guide for academic researchers. arXiv 2021, arXiv:2108.02497. [Google Scholar]
- Stevens, L.M.; Mortazavi, B.J.; Deo, R.C.; Curtis, L.; Kao, D.P. Recommendations for Reporting Machine Learning Analyses in Clinical Research. Circ. Cardiovasc. Qual. Outcomes 2020, 13, e006556. [Google Scholar] [CrossRef] [PubMed]
- Vermeire, T.; Brughmans, D.; Goethals, S.; de Oliveira, R.M.B.; Martens, D. Explainable image classification with evidence counterfactual. Pattern Anal. Appl. 2022, 25, 315–335. [Google Scholar] [CrossRef]
- Nguyen, K.; Duong, D.; Almeida, F.; Major, P.; Kaipatur, N.; Pham, T.; Lou, E.; Noga, M.; Punithakumar, K.; Le, L. Alveolar Bone Segmentation in Intraoral Ultrasonographs with Machine Learning. J. Dent. Res. 2020, 99, 1054–1061. [Google Scholar] [CrossRef]
- Min, X.; Haijin, C. Research on Rapid Detection of Tooth Profile Parameters of the Clothing Wires Based on Image Processing. In Proceedings of the 2020 IEEE International Conference on Artificial Intelligence and Computer Applications (ICAICA), Dalian, China, 27–29 June 2020; pp. 586–590. [Google Scholar] [CrossRef]
- Rasteau, S.; Sigaux, N.; Louvrier, A.; Bouletreau, P. Three-dimensional acquisition technologies for facial soft tissues—Applications and prospects in orthognathic surgery. J. Stomatol. Oral Maxillofac. Surg. 2020, 121, 721–728. [Google Scholar] [CrossRef]
- Dot, G.; Rafflenbeul, F.; Arbotto, M.; Gajny, L.; Rouch, P.; Schouman, T. Accuracy and reliability of automatic three-dimensional cephalometric landmarking. Int. J. Oral Maxillofac. Surg. 2020, 49, 1367–1378. [Google Scholar] [CrossRef]
- Kapralos, V.; Koutroulis, A.; Irinakis, E.; Kouros, P.; Lyroudia, K.; Pitas, I.; Mikrogeorgis, G. Digital subtraction radiography in detection of vertical root fractures: Accuracy evaluation for root canal filling, fracture orientation and width variables. An ex-vivo study. Clin. Oral Investig. 2020, 24, 3671–3681. [Google Scholar] [CrossRef]
- Tanaka, R.; Tanaka, T.; Yeung, A.W.K.; Taguchi, A.; Katsumata, A.; Bornstein, M.M. Mandibular Radiomorphometric Indices and Tooth Loss as Predictors for the Risk of Osteoporosis using Panoramic Radiographs. Oral Health Prev. Dent. 2020, 18, 773–782. [Google Scholar] [CrossRef]
- Laishram, A.; Thongam, K. Detection and Classification of Dental Pathologies using Faster-RCNN in Orthopantomogram Radiography Image. In Proceedings of the 7th International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India, 27–28 February 2020; pp. 423–428. [Google Scholar] [CrossRef]
- Rao, G.K.L.; Mokhtar, N.; Iskandar, Y.H.P.; Srinivasa, A.C. Learning Orthodontic Cephalometry through Augmented Reality: A Conceptual Machine Learning Validation Approach. In Proceedings of the 2018 International Conference on Electrical Engineering and Informatics (ICELTICs), Banda Aceh, Indonesia, 19–20 September 2018; pp. 133–138. [Google Scholar] [CrossRef]
- Damiani, G.; Grossi, E.; Berti, E.; Conic, R.; Radhakrishna, U.; Pacifico, A.; Bragazzi, N.; Piccinno, R.; Linder, D. Artificial neural networks allow response prediction in squamous cell carcinoma of the scalp treated with radiotherapy. J. Eur. Acad. Dermatol. Venereol. 2020, 34, 1369–1373. [Google Scholar] [CrossRef]
- Yoon, S.; Choi, T.; Odlum, M.; Mitchell, D.A.; Kronish, I.M.; Davidson, K.W.; Finkelstein, J. Machine Learning to Identify Behavioral Determinants of Oral Health in Inner City Older Hispanic Adults. Stud. Health Technol. Inform. 2018, 251, 253–256. [Google Scholar] [CrossRef] [PubMed]
- Yatabe, M.; Prieto, J.C.; Styner, M.; Zhu, H.; Ruellas, A.C.; Paniagua, B.; Budin, F.; Benavides, E.; Shoukri, B.; Michoud, L.; et al. 3D superimposition of craniofacial imaging—The utility of multicentre collaborations. Orthod. Craniofacial Res. 2019, 22 (Suppl. S1), 213–220. [Google Scholar] [CrossRef] [PubMed]
- Hung, M.; Lauren, E.; Hon, E.S.; Birmingham, W.C.; Xu, J.; Su, S.; Hon, S.D.; Park, J.; Dang, P.; Lipsky, M.S. Social Network Analysis of COVID-19 Sentiments: Application of Artificial Intelligence. J. Med. Internet Res. 2020, 22, e22590. [Google Scholar] [CrossRef]
- Suhail, Y.; Upadhyay, M.; Chhibber, A.; Kshitiz. Machine Learning for the Diagnosis of Orthodontic Extractions: A Computational Analysis Using Ensemble Learning. Bioengineering 2020, 7, 55. [Google Scholar] [CrossRef]
- Mehandru, N.; Hicks, W.L.; Singh, A.K.; Markiewicz, M.R. Machine Learning for Identification of Craniomaxillofacial Radiographic Lesions. J. Oral Maxillofac. Surg. 2020, 78, 2106–2107. [Google Scholar] [CrossRef]
Sr. No. [Citation] | Data Selection: Risk of Bias/Applicability Concerns | Index Test: Risk of Bias/Applicability Concerns | Reference Standard: Risk of Bias/Applicability Concerns | Flow and Timing: Risk of Bias |
---|---|---|---|---|
1. [12] | high/high | low/high | high/high | low |
2. [13] | low/low | low/low | low/low | low |
3. [14] | high/low | low/low | low/low | low |
4. [15] | low/low | low/high | high/high | low |
5. [16] | low/low | low/low | low/low | low |
6. [17] | high/high | low/high | high/high | low |
7. [18] | high/high | low/low | high/low | low |
8. [19] | low/low | low/high | low/low | low |
9. [20] | low/low | low/low | low/high | low |
10. [21] | high/high | low/low | high/low | low |
11. [22] | high/high | low/low | high/high | low |
12. [23] | high/low | high/low | high/low | low |
13. [24] | low/high | low/low | high/high | low |
14. [25] | high/high | high/low | low/low | low |
15. [26] | low/low | high/low | low/low | low |
16. [27] | high/low | low/low | high/low | low |
17. [28] | high/high | low/low | high/low | low |
18. [29] | high/low | low/low | high/low | low |
19. [30] | high/high | low/low | high/low | low |
20. [31] | high/high | low/high | high/low | low |
21. [32] | high/high | high/high | high/high | low |
22. [33] | low/low | low/low | low/low | low |
23. [34] | low/high | low/low | low/high | low |
24. [35] | high/high | low/low | low/low | low |
25. [36] | low/low | low/low | low/low | low |
26. [37] | high/high | low/low | high/low | low |
27. [38] | high/high | low/low | high/low | low |
28. [39] | high/high | low/low | high/low | low |
29. [40] | high/high | high/low | high/low | low |
30. [41] | low/low | low/low | low/low | low |
31. [42] | high/low | high/low | low/low | low |
32. [43] | low/high | low/high | low/high | low |
33. [44] | low/low | high/low | high/low | low |
34. [45] | high/high | low/high | low/high | low |
35. [46] | high/low | low/low | low/low | low |
36. [47] | high/high | low/low | low/low | low |
37. [48] | high/high | low/high | low/high | low |
38. [49] | low/low | low/low | high/low | low |
39. [50] | low/high | low/low | high/low | high |
40. [51] | low/high | low/low | low/low | low |
41. [52] | high/low | low/high | high/low | low |
42. [53] | high/high | low/low | low/low | high |
43. [54] | low/low | low/high | low/high | low |
44. [55] | high/high | low/low | high/low | low |
45. [56] | high/high | low/high | high/low | low |
46. [57] | high/high | low/low | high/high | low |
47. [58] | high/high | high/high | high/high | low |
48. [59] | low/high | low/low | high/high | low |
49. [60] | low/high | low/low | high/high | low |
50. [61] | low/low | low/low | high/low | high |
51. [62] | high/high | low/low | high/low | low |
52. [63] | low/high | low/high | high/high | low |
53. [64] | high/high | high/high | high/high | low |
54. [65] | high/high | low/low | high/low | low |
55. [66] | low/high | low/low | high/low | low |
56. [67] | high/high | low/high | low/high | low |
57. [68] | low/high | low/low | low/low | high |
58. [69] | low/low | low/low | low/low | low |
59. [70] | high/high | low/low | low/low | low |
60. [71] | low/low | low/low | low/low | low |
61. [72] | low/high | low/low | high/low | low |
62. [73] | low/low | low/low | high/low | low |
63. [74] | low/low | low/low | low/low | low |
64. [75] | low/low | low/low | low/low | low |
65. [76] | low/low | low/low | low/low | low |
66. [77] | high/high | high/low | high/low | low |
67. [78] | high/low | high/low | high/low | low |
68. [79] | high/low | high/low | high/low | low |
69. [80] | high/low | high/low | low/low | low |
70. [81] | low/low | low/low | low/low | low |
71. [82] | low/low | low/low | high/low | low |
72. [83] | low/low | low/low | low/low | low |
73. [84] | high/low | low/low | high/low | low |
74. [85] | low/low | low/low | low/low | high |
75. [86] | high/high | low/low | low/low | low |
76. [87] | high/high | high/low | low/low | low |
77. [88] | low/low | low/low | low/low | low |
78. [89] | high/high | high/high | high/high | low |
79. [90] | high/high | high/high | high/high | low |
80. [91] | high/high | low/low | high/low | low |
81. [92] | low/low | low/low | high/low | low |
82. [93] | low/high | low/low | high/high | low |
83. [94] | low/low | low/low | low/low | high |
84. [95] | high/high | high/low | high/high | low |
85. [96] | low/high | high/low | high/high | low |
86. [97] | high/high | low/high | low/high | low |
87. [98] | high/high | low/low | low/low | low |
88. [99] | low/high | low/high | high/high | low |
89. [100] | low/high | low/high | high/high | low |
90. [101] | low/high | low/low | low/high | low |
91. [102] | high/high | low/low | high/low | low |
92. [103] | low/low | low/low | low/low | low |
93. [4] | high/low | low/high | high/high | low |
94. [104] | low/low | low/low | high/low | low |
95. [105] | high/high | low/high | high/low | low |
96. [106] | low/high | low/low | low/high | low |
97. [107] | low/low | low/low | high/low | low |
98. [108] | low/low | low/low | low/low | low |
99. [109] | high/high | high/low | high/low | low |
100. [110] | low/low | low/low | high/low | low |
101. [111] | low/low | low/low | high/low | low |
102. [112] | high/low | high/low | high/high | low |
103. [113] | high/high | low/low | low/high | high |
104. [3] | low/high | low/low | low/low | low |
105. [114] | low/low | low/low | low/low | low |
106. [115] | low/low | low/low | low/low | low |
107. [116] | high/high | high/low | high/low | low |
108. [117] | high/low | high/low | low/low | low |
109. [118] | high/high | low/low | high/low | low |
110. [119] | low/low | low/low | low/low | low |
111. [120] | low/low | low/high | high/high | low |
112. [121] | low/low | low/low | high/low | low |
113. [122] | high/high | high/low | low/low | low |
114. [123] | low/low | low/low | low/low | low |
115. [124] | low/high | low/low | high/low | low |
116. [125] | high/high | low/low | low/high | low |
117. [126] | high/low | high/low | high/low | high |
118. [127] | high/high | low/low | high/low | low |
119. [128] | low/low | high/low | low/low | low |
120. [129] | high/low | low/low | low/low | low |
121. [130] | high/high | low/low | high/low | high |
122. [131] | high/low | high/low | high/low | low |
123. [132] | high/high | low/low | high/low | low |
124. [133] | high/high | low/low | high/low | high |
125. [134] | low/high | high/low | high/low | low |
126. [135] | high/low | high/low | low/low | low |
127. [136] | high/low | high/low | high/low | low |
128. [137] | high/low | high/high | low/low | low |
129. [138] | low/high | low/high | high/low | low |
130. [139] | high/low | low/low | low/low | low |
131. [140] | high/low | low/high | high/high | low |
132. [141] | low/low | low/low | high/low | low |
133. [142] | high/high | low/low | high/low | low |
134. [143] | high/high | low/low | low/low | low |
135. [144] | high/high | low/low | high/low | low |
136. [145] | high/high | high/low | high/low | low |
137. [146] | high/high | low/low | high/low | low |
138. [147] | high/low | high/low | low/low | low |
139. [148] | high/high | low/low | high/low | low |
140. [149] | high/high | low/high | high/high | low |
141. [150] | high/high | low/high | high/high | low |
142. [151] | low/high | low/low | high/high | low |
143. [152] | high/high | low/high | high/high | low |
144. [153] | high/low | low/low | high/low | low |
145. [154] | low/low | low/high | high/high | low |
146. [155] | low/low | high/low | low/low | low |
147. [156] | low/high | low/low | low/low | low |
148. [157] | high/high | high/low | high/low | high |
149. [158] | low/low | low/low | low/low | low |
150. [159] | low/high | low/high | low/high | low |
151. [160] | high/low | low/high | low/low | low |
152. [161] | low/low | high/low | high/low | high |
153. [162] | high/low | low/low | low/high | low |
154. [163] | low/low | low/high | low/high | low |
155. [164] | high/low | low/low | high/low | low |
156. [165] | low/low | low/low | high/low | low |
157. [166] | low/high | low/high | high/high | high |
158. [167] | low/low | low/low | low/low | low |
159. [168] | low/low | low/low | high/low | low |
160. [169] | low/high | high/low | high/high | low |
161. [170] | high/high | low/low | low/low | low |
162. [171] | low/low | low/low | high/low | low |
163. [172] | low/low | low/low | low/low | low |
164. [173] | low/low | low/low | high/low | low |
165. [174] | low/low | low/low | low/low | low |
166. [175] | high/high | high/high | high/high | low |
167. [176] | high/high | high/low | low/low | low |
168. [177] | high/high | low/low | high/low | low |
 | Classification Task | Object Detection Task | Semantic Segmentation Task | Instance Segmentation Task | Generation Task |
---|---|---|---|---|---|
n | 85 | 22 | 37 | 19 | 5 |
Field of dentistry, n (%) | | | | | |
Restorative dentistry and endodontics | 13 (15%) | 1 (4%) | 9 (24%) | 2 (11%) | 0 (0%) |
Oral medicine | 19 (22%) | 5 (23%) | 1 (3%) | 0 (0%) | 0 (0%) |
Oral radiology | 3 (4%) | 0 (0%) | 2 (5%) | 2 (11%) | 4 (80%) |
Orthodontics | 10 (12%) | 3 (14%) | 1 (3%) | 3 (15%) | 1 (20%) |
Oral surgery and implantology | 11 (13%) | 3 (14%) | 3 (8%) | 0 (0%) | 0 (0%) |
Periodontology | 9 (11%) | 2 (9%) | 7 (19%) | 1 (5%) | 0 (0%) |
Prosthodontics | 2 (2%) | 0 (0%) | 0 (0%) | 0 (0%) | 0 (0%) |
Others (non-specific field, general dentistry) | 18 (21%) | 8 (36%) | 14 (38%) | 11 (58%) | 0 (0%) |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).