Review

Applications and Prospects of Artificial Intelligence-Assisted Endoscopic Ultrasound in Digestive System Diseases

Department of Gastroenterology and Hepatology, Tianjin Medical University General Hospital, No. 154, Anshan Road, Heping District, Tianjin 300052, China
* Author to whom correspondence should be addressed.
Diagnostics 2023, 13(17), 2815; https://doi.org/10.3390/diagnostics13172815
Submission received: 1 August 2023 / Revised: 22 August 2023 / Accepted: 27 August 2023 / Published: 30 August 2023
(This article belongs to the Section Machine Learning and Artificial Intelligence in Diagnostics)

Abstract: Endoscopic ultrasound (EUS) has emerged as a widely utilized tool in the diagnosis of digestive diseases. In recent years, the potential of artificial intelligence (AI) in healthcare has been gradually recognized, and its superiority in the field of EUS is becoming apparent. Machine learning (ML) and deep learning (DL) are the two main AI algorithms. This paper aims to outline the applications and prospects of artificial intelligence-assisted endoscopic ultrasound (EUS-AI) in digestive diseases over the past decade. The results demonstrated that EUS-AI has shown superiority or at least equivalence to traditional methods in the diagnosis, prognosis, and quality control of subepithelial lesions, early esophageal cancer, early gastric cancer, and pancreatic diseases including pancreatic cystic lesions, autoimmune pancreatitis, and pancreatic cancer. The implementation of EUS-AI has opened up new avenues for individualized precision medicine and has introduced novel diagnostic and treatment approaches for digestive diseases.

1. Introduction

Artificial intelligence (AI) refers to computer programs developed to mimic the human brain's abilities to think, judge, and react. Since the term "artificial intelligence" was first proposed by John McCarthy in 1956, the field has undergone a significant transition from artificial narrow intelligence (ANI) toward artificial general intelligence (AGI). While ANI refers to AI trained to perform specific tasks, AGI is a theoretical form of AI that possesses self-awareness and the ability to solve problems, learn, and plan for the future. Artificial super intelligence (ASI) would surpass human intelligence and capabilities, but it remains entirely theoretical; its realization is the ultimate goal of AI research. Currently, machine learning (ML) and deep learning (DL) are the main AI architectures widely used in medical image recognition. ML enables machines to learn without explicit programming and can quantify the features of medical images based on their inherent regular patterns. Various ML techniques, such as decision trees, random forest (RF), logistic regression, and artificial neural networks (ANN), are commonly employed in diagnosing medical images. DL is a newer subfield of ML that imitates the mechanisms of the human brain and is realized in three phases: data collection and annotation, construction of the DL architecture, and training and verification of its capabilities. One of the most notable advantages of DL is its ability to automatically detect and objectively identify features of interest in medical images. Compared with ML, DL is easier to implement and offers higher accuracy. The convolutional neural network (CNN) is the most commonly used DL architecture and is currently considered one of the best-performing algorithms for image recognition [1,2,3] (Figure 1).
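The core operation a CNN applies to an image is the convolution: a small learned kernel is slid across the image to produce a feature map. The following minimal sketch shows that operation in pure Python; the 4×4 image and the 3×3 vertical-edge kernel are illustrative toy values, not taken from any EUS dataset.

```python
# Minimal sketch of the 2D convolution at the heart of a CNN layer.

def conv2d(image, kernel):
    """Valid (no-padding) 2D cross-correlation, as used in CNN layers."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw)
            )
    return out

# A vertical-edge kernel responds strongly where pixel intensity
# changes from left to right -- e.g., at a lesion border.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
kernel = [
    [-1, 0, 1],
    [-1, 0, 1],
    [-1, 0, 1],
]
feature_map = conv2d(image, kernel)  # every window straddles the edge
```

In a real CNN the kernel weights are not hand-designed as here but learned from annotated training images, and many such layers are stacked with nonlinearities and pooling.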
In recent studies, the focus has been on evaluating the diagnostic performance of individual DL models for various diseases. However, each model may have its own unique strengths in feature learning. Consequently, some researchers have begun exploring whether diagnostic accuracy can be enhanced through the fusion of multiple models. Gunasekaran et al. proposed an ensemble model (GIT-NET) that combines pre-trained ResNet50, DenseNet201, and Inception v3 models. By utilizing these models to extract features from the KVASIR v2 dataset, which covers eight classes of digestive diseases, the authors achieved accuracies of 92.96% and 95.00% through model averaging and weight averaging, respectively, surpassing the baseline models [4]. Similarly, Ramamurthy et al. introduced an approach that employed a pre-trained EfficientNet B0 backbone and a custom CNN (Effimix) to automatically classify gastrointestinal diseases using the HyperKvasir benchmark dataset. By leveraging the combined features of these two networks, the proposed model achieved an accuracy of 97.99%, along with F1 score, precision, and recall values of 97%, 97%, and 98%, respectively [5]. Zhao et al. developed an improved YOLOX model that incorporated YOLOX, group normalization, and video adjacent-frame association to achieve real-time detection of endometrial polyps. The improved model had per-lesion sensitivities of 100% and 92.0% in the internal and external test sets, respectively, surpassing those of the original YOLOX model, which stood at 95.83% and 77.33%. These findings highlight the potential of the improved model in reducing the risk of endometrial cancer by effectively identifying and localizing endometrial polyps [6].
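Model averaging and weight averaging, as used in the GIT-NET study above, can be sketched at the prediction level: each model emits a vector of class probabilities, and the ensemble fuses them by a plain or weighted mean before taking the argmax. The probability vectors and weights below are made-up toy values, not figures from the cited work.

```python
# Sketch of ensemble fusion by output averaging.

def ensemble_predict(prob_lists, weights=None):
    """Fuse per-model class-probability vectors; return the argmax class."""
    n_models = len(prob_lists)
    n_classes = len(prob_lists[0])
    if weights is None:                      # plain model averaging
        weights = [1.0 / n_models] * n_models
    fused = [
        sum(w * probs[c] for w, probs in zip(weights, prob_lists))
        for c in range(n_classes)
    ]
    return max(range(n_classes), key=fused.__getitem__)

# Three hypothetical models scoring one image over 3 classes:
model_probs = [
    [0.60, 0.30, 0.10],   # model A
    [0.20, 0.50, 0.30],   # model B
    [0.25, 0.45, 0.30],   # model C
]
pred_avg = ensemble_predict(model_probs)                   # -> class 1
pred_wtd = ensemble_predict(model_probs, [0.6, 0.2, 0.2])  # -> class 0
```

Note how the weighted variant lets a trusted model (here model A, weight 0.6) override the majority, which is the lever weight averaging tunes.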
In the early stages of disease, patients often lack obvious clinical signs and symptoms, making diagnosis challenging. Traditional imaging techniques, such as ultrasonography (US), computed tomography (CT), and magnetic resonance imaging (MRI), may not be able to detect smaller lesions. However, with advancements in gastrointestinal endoscopy technology, the potential advantages of endoscopic ultrasound (EUS) over other imaging modalities have been recognized. Yoshida et al. observed that EUS exhibited a median sensitivity of 93–94% in detecting pancreatic lesions, surpassing the sensitivity of MRI at 67% and CT at 53% (for detecting lesions <30 mm, n = 49) [7]. EUS combines ultrasound technology with endoscopic visualization, allowing high-resolution, real-time visualization of the digestive tract lumen, ultrasonic imaging of the gastrointestinal tract and adjacent organs, and assessment of the depth of tumor invasion and the presence of enlarged lymph nodes. As a result, EUS has become an important detection tool for various digestive diseases, enhancing the ability to accurately assess the nature and extent of lesions and improving disease detection rates [8,9].
The diagnostic accuracy of EUS is closely related to the knowledge, experience, and operative skill of the endoscopist and is therefore subjective to a certain extent; some diseases are challenging to diagnose with EUS alone [10]. AI can process large amounts of data with high accuracy, and when combined with EUS, it offers an objective, simple, and rapid examination. Tonozuka et al. developed a computer-aided diagnosis (CAD) system utilizing EUS images and assessed its efficacy in distinguishing between pancreatic ductal adenocarcinoma (PDAC), chronic pancreatitis (CP), and normal pancreas. The CAD system showed exceptional performance in detecting PDAC, with a sensitivity of 92.4% and a specificity of 84.1% [11]. In a study by Wang et al., the diagnostic value of endoscopy alone and artificial intelligence-assisted endoscopic ultrasound (EUS-AI) in early esophageal cancer and precancerous lesions was compared. The researchers also described the diagnostic accuracy of two different models, namely a CNN model and a Cascade region-based convolutional neural network (Cascade RCNN) model. The Cascade RCNN model, which was based on the soft-NMS algorithm, exhibited better diagnostic performance than the CNN model, and its performance was similar to the gold standard established by endoscopists. Additionally, the Cascade RCNN model reduced detection time and significantly improved efficiency. The detection rates of the Cascade RCNN model, the CNN model, and endoscopic detection alone in early esophageal cancer and precancerous lesions were 88.8% (71/80), 56.3% (45/80), and 44.1% (35/80), respectively [12].
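The sensitivity, specificity, and accuracy figures quoted throughout this review all derive from a 2×2 confusion matrix of model predictions against the pathological gold standard. As a reference, here is a minimal sketch with hypothetical counts (not data from any cited study):

```python
# Diagnostic metrics from confusion-matrix counts:
#   tp = true positives, fp = false positives,
#   fn = false negatives, tn = true negatives.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),            # true-positive rate
        "specificity": tn / (tn + fp),            # true-negative rate
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical test set: 100 diseased and 100 healthy cases.
m = diagnostic_metrics(tp=90, fp=10, fn=10, tn=90)
```

Sensitivity and specificity are properties of the test itself, whereas accuracy also depends on how diseased and healthy cases are balanced in the study population, which is one reason papers report all three.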
Given the extensive training required for EUS and the intricate nature of the techniques involved, there is growing interest among researchers in EUS-AI. EUS-AI has the potential to aid novice practitioners in their training, significantly shortening their learning curve. Additionally, it can serve as a tool for quality control in pancreatic examinations by providing information about the current scanning location and suggesting the appropriate next step, including assisting with endoscopic ultrasound-guided fine-needle aspiration/biopsy procedures. However, the utilization of EUS in primary hospitals is relatively limited, resulting in a lack of operational experience among endoscopists; consequently, the incorporation of EUS-AI in primary hospitals could prove highly advantageous in enhancing overall disease detection rates. This paper provides a comprehensive review of the diagnosis, prognosis, and quality control achieved with EUS-AI over the last ten years in digestive diseases such as subepithelial lesions (SELs), early esophageal cancer, early gastric cancer (EGC), and pancreatic diseases including pancreatic cystic lesions, autoimmune pancreatitis (AIP), and pancreatic cancer (PC), with the objective of facilitating the advancement of this examination method in medical care. Additionally, this paper sheds light on the limitations and future prospects of EUS-AI in the field of digestive diseases (Figure 2).

2. Methods

2.1. Search Strategy

For this review, we searched the PubMed database for articles published within the past ten years (2013–2023) to assess the latest advancements in this area. Search terms in the title, abstract, and keywords were as follows: ("artificial intelligence" OR "AI" OR "machine learning" OR "deep learning" OR "convolutional neural network" OR "computer-assisted" OR "computer-aided" OR "neural network" OR "digital image analysis" OR "digital image processing") AND ("endoscopic ultrasound" OR "endoscopic ultrasonography" OR "endosonography" OR "EUS"). To avoid omissions, specific digestive system diseases were not included in the retrieval strategy. The search was limited to studies written in English.
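The boolean query above is two OR-groups joined by AND; assembling it programmatically makes the strategy reproducible and easy to extend. The sketch below rebuilds the exact string from the term lists quoted in the search strategy:

```python
# Rebuild the PubMed boolean query from its two term groups.

ai_terms = [
    "artificial intelligence", "AI", "machine learning", "deep learning",
    "convolutional neural network", "computer-assisted", "computer-aided",
    "neural network", "digital image analysis", "digital image processing",
]
eus_terms = [
    "endoscopic ultrasound", "endoscopic ultrasonography",
    "endosonography", "EUS",
]

def or_group(terms):
    """Quote each term and join the group with OR inside parentheses."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

query = or_group(ai_terms) + " AND " + or_group(eus_terms)
```

Keeping the groups as plain lists also documents precisely which synonyms were covered, which helps when a reader wants to rerun or update the search.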

2.2. Inclusion and Exclusion Criteria

We carefully selected articles that accurately described the applications and prospects of EUS-AI for digestive system diseases. Studies that met all of the following inclusion criteria were selected: (1) evaluated EUS-AI for the diagnosis, prognosis, and quality control of digestive system diseases; (2) the final diagnosis was established by the histopathological diagnosis after surgical or endoscopic resection (ER) or EUS-guided fine-needle aspiration/biopsy; (3) an AI algorithm was applied to the diagnosis of patients with digestive system diseases using EUS; (4) study results demonstrated the diagnostic performance of CAD algorithms, including area under the curve (AUC), sensitivity, specificity, or accuracy, with or without EUS experts as controls.
Studies were excluded if they met any of the following criteria: (1) did not evaluate digestive system diseases by EUS-AI; (2) were conference proceedings, abstracts, editorials, case reports, or letters; (3) were duplicate articles; (4) had incomplete data available.

3. Types of Digestive System Diseases Diagnosed and Prognosed by EUS-AI

3.1. Subepithelial Lesions

SELs of the gastrointestinal tract are masses or mass-like structures originating from the non-mucosal layers, including gastrointestinal stromal tumors (GISTs), leiomyoma, schwannoma, neuroendocrine tumor (NET), ectopic pancreas, lipoma, and hemangioma, of which the first two are the most common. GISTs originate from the interstitial cells of Cajal, or from stem cells with a tendency to differentiate into interstitial cells of Cajal, and they usually occur in the stomach and small intestine. GISTs have variable malignant potential, accounting for 1–3% of gastrointestinal malignancies. Surgical resection of small lesions without metastasis might be the only possible way to cure GISTs [13,14,15]. Therefore, distinguishing them from benign SELs at an early stage is the key to treating GISTs.
CT and MRI are the tests more commonly used to diagnose GISTs [16]. The risk of postoperative recurrence of GISTs is assessed with the modified US National Institutes of Health classification, which categorizes GISTs into very low-, low-, intermediate-, and high-risk grades based on size, site of origin, mitotic count, and whether the tumor has ruptured [17]. In the very low-risk and low-risk groups, GISTs are mostly rounded with well-defined borders; on CT, the lesions show low or equal density, and on MRI they generally appear hypointense or isointense on T1-weighted imaging, slightly hyperintense on T2-weighted imaging, and evenly hyperintense on fat-suppressed T2-weighted imaging. Intermediate-risk and high-risk lesions tend to be irregularly shaped with blurred borders and appear heterogeneous on both CT and MRI. A consensus report from the German GIST Imaging Working Group suggested MRI as an option in cases of liver-specific problems or contraindications to CT [18]. CT and MRI are usually used to examine larger tumors, and EUS has become the diagnostic procedure of choice for small GISTs [19].
EUS can clearly depict the size and morphology of gastrointestinal tumors, the structure of each layer of the gastrointestinal wall, and invasion of adjacent organs, and it has become the best diagnostic modality for SELs in recent years [20]. The typical EUS features of GISTs are hypoechoic solid masses with well-defined borders originating from the fourth sonographic layer, while the fifth sonographic layer remains clear and intact. However, it is difficult to distinguish GISTs from leiomyoma with EUS alone. Several studies have input filtered, high-quality EUS images as a training set into an AI model, extracted meaningful image features for model construction, and then input EUS images from a validation set into the model to verify its ability to distinguish GISTs from non-GISTs. These studies conclude that EUS-AI has the potential to be a good choice for diagnosing SELs (Table 1).
In 2020, Minoda et al. studied the diagnostic accuracy of EUS-AI for distinguishing GISTs from non-GISTs among gastric SELs, showing that EUS-AI diagnoses GISTs ≥20 mm accurately [21]. In 2022, Minoda et al. verified that this AI model could also distinguish GISTs from non-GISTs among non-gastric SELs. The authors collected EUS images of 52 patients with non-gastric SELs (esophagus, n = 15; duodenum, n = 26; colon, n = 11), and they noticed that the diagnostic accuracy of EUS-AI improved as lesion size increased, independent of lesion location [22]. A recent meta-analysis (including 7 studies with 2431 patients) reported pooled sensitivity and specificity of CNN-based EUS-AI in diagnosing GISTs of 0.92 (95% CI, 0.89–0.95) and 0.82 (95% CI, 0.75–0.87), respectively, which were higher than those of endoscopists. Two of the studies assessed the ability to predict the malignant potential of GISTs; very low-risk and low-risk GISTs were classified as the low-risk group, while intermediate-risk and high-risk GISTs were classified as the high-risk group. The pooled sensitivity and specificity for diagnosing high-risk GISTs were 0.84 (95% CI, 0.68–0.94) and 0.81 (95% CI, 0.73–0.86), and the summary diagnostic odds ratio was 28.80 (95% CI, 3.48–238.31), indicating that the EUS-AI model could accurately predict the malignant potential of GISTs [23].
Studies have also tested the accuracy of AI algorithms applied to contrast-enhanced harmonic endoscopic ultrasound (CH-EUS) for diagnosing SELs. Tanaka et al. retrospectively examined 53 patients with GISTs and leiomyomas to evaluate diagnostic accuracy using DL with a residual neural network and leave-one-out cross-validation, combined with the SiamMask technique to track and trim lesions in CH-EUS videos. The sensitivity, specificity, and accuracy of AI in diagnosing GISTs were 90.5%, 90.9%, and 90.6%, respectively, and those of endoscopists were 90.5%, 81.8%, and 88.7%, respectively (p = 0.683), indicating that AI and endoscopists were comparable in diagnosing CH-EUS images [24].
Notably, Hirai et al. conducted a multicenter retrospective study to develop an EUS-AI model for common SELs (GISTs, leiomyoma, schwannoma, NET, and ectopic pancreas) and assessed the diagnostic accuracy of the model against endoscopists. This study was the first to combine AI with EUS images of SELs for classification and recognition. The EUS-AI had a diagnostic accuracy of 86.1% across the five SEL categories, significantly higher than that of endoscopists; moreover, the model had high sensitivity (98.8%) and accuracy (89.3%) in distinguishing GISTs from non-GISTs, noticeably higher than those of endoscopists [25].
Table 1. Studies assessing the sensitivity, specificity, and diagnostic accuracy of AI models for GISTs.
| Study | Study Design | AI Model | Patient Population | Research Object | Outcomes for the AI Model |
|---|---|---|---|---|---|
| Minoda et al. [21] | Retrospective (Japan) | CNN | SELs < 20 mm: total = 30 (GISTs = 23, leiomyoma = 5, schwannoma = 1, ectopic pancreas = 1); SELs ≥ 20 mm: total = 30 (GISTs = 24, leiomyoma = 4, schwannoma = 1, ectopic pancreas = 1) | EUS images | Recognition of GISTs in SELs < 20 mm: sensitivity = 86.3%, specificity = 62.5%, accuracy = 86.3%, AUC = 0.861; in SELs ≥ 20 mm: sensitivity = 83.3%, specificity = 91.7%, accuracy = 90.0%, AUC = 0.965 |
| Minoda et al. [22] | Retrospective (Japan) | CNN | Total = 52 (GISTs = 36, leiomyoma = 14, ectopic pancreas = 1, appendiceal mucocele = 1) | EUS images | Recognition of GISTs: sensitivity = 100%, specificity = 86.1%, accuracy = 94.4%, AUC = 0.980 |
| Tanaka et al. [24] | Retrospective (Japan) | DL | Total = 53 (GISTs = 42, leiomyoma = 11) | CH-EUS images | Recognition of GISTs: sensitivity = 90.5%, specificity = 90.9%, accuracy = 90.6% |
| Hirai et al. [25] | Retrospective (Japan) | CNN, DCGAN, semi-supervised learning | Total = 631 (GISTs = 435, non-GISTs = 196: leiomyoma = 97, schwannoma = 33, NET = 47, ectopic pancreas = 19) | EUS images | Recognition of GISTs: sensitivity = 98.8%, specificity = 67.6%, accuracy = 89.3% |
Abbreviation: AI, artificial intelligence; CNN, convolutional neural network; GISTs, gastrointestinal stromal tumors; SELs, subepithelial lesions; EUS, endoscopic ultrasound; AUC, area under the curve; DL, deep learning; CH-EUS, contrast-enhanced harmonic endoscopic ultrasound; DCGAN, deep convolutional generative adversarial network.

3.2. Early Esophageal Cancer

Early esophageal cancer refers to invasive carcinoma confined to the mucosal layer, regardless of whether regional lymph node metastasis is present. The symptoms of early esophageal cancer are relatively insidious and usually appear in the middle to late stages of the disease. The prognosis of esophageal cancer is closely related to its stage: the five-year survival rate of patients with advanced esophageal cancer is only about 10%, while that of early esophageal cancer can reach 90% after surgery [26]. Therefore, early diagnosis and treatment of esophageal cancer are of great significance for improving patient prognosis. EUS is the most accurate imaging modality for T staging of esophageal cancer [27]. T stage is determined by the number of layers of the esophageal wall invaded by the primary tumor and the depth of adjacent tissue infiltration: Tis is high-grade dysplasia confined to the epithelium that does not penetrate the lamina propria, T1a tumors invade the lamina propria or muscularis mucosae, and T1b tumors invade the submucosa [28]. The stage of the tumor determines its treatment: the American College of Gastroenterology guidelines recommend endoscopic treatment for low-grade intraepithelial neoplasia, Tis, T1a, and superficial low-risk T1b esophageal adenocarcinoma, and surgery for high-risk T1b, while advanced tumors require esophagectomy with perioperative chemotherapy or chemoradiotherapy [29]. However, EUS staging is relatively subjective and varies between operators, which might lead to misdiagnosis [30]. A study by Wang et al. noted that the sensitivity of EUS for T staging of esophageal cancer ranged between 0% and 70.8%, while the specificity ranged between 71.0% and 100.0%, both depending on clinicopathological stage; the overall accuracy of EUS T staging was 58.6% [31].
This has led to an exploration of whether EUS-AI could help solve the above diagnostic dilemma.
Knabe et al. developed an EUS-AI system based on a deep convolutional neural network (DCNN), for which 1020 EUS images from 577 patients with esophageal adenocarcinoma were retrospectively collected for training and internal validation. The results showed that the AI identified mucosal carcinoma (T1a) with a sensitivity of 72%, specificity of 64%, and accuracy of 68%, while the sensitivity, specificity, and accuracy for identifying submucosal carcinoma (T1b) were 31%, 78%, and 67%, respectively. The sensitivity, specificity, and accuracy for differentiating T1a from T1b were 66%, 49%, and 55%, respectively. This study suggests that AI may help endoscopists detect and diagnose early esophageal cancer in the future and may also provide guidance on treatment [32].

3.3. Early Gastric Cancer

Gastric cancer ranks fifth in cancer incidence and fourth in mortality [33]. The invasive depth of the tumor is a significant factor affecting the staging and survival of gastric cancer [34], and accurate staging prior to treatment is crucial for EGC. Common treatment options include endoscopic submucosal dissection (ESD) or endoscopic mucosal resection (EMR) and surgery. The absolute indication for ER in EGC is a differentiated intramucosal carcinoma ≤2 cm in diameter without ulceration; the relative indications include (1) differentiated intramucosal carcinoma >2 cm in diameter without ulceration; (2) differentiated intramucosal carcinoma ≤3 cm in diameter with ulceration; (3) undifferentiated intramucosal carcinoma ≤2 cm in diameter without ulceration [35]. Surgical treatment is recommended when submucosal infiltration is highly suspected on preoperative evaluation [36]. So far, CT and EUS are the common imaging methods for accurate preoperative staging of gastric cancer. On CT, gastric cancer presents as focal or diffuse heterogeneous gastric wall thickening. EUS can visualize the entire gastric wall, has higher accuracy in distinguishing EGC from advanced gastric cancer, and has become the preferred tool for local staging of gastric cancer [37]. Previous studies have reported an accuracy of 67% to 72% for EUS in determining the depth of infiltration in EGC. EUS is highly dependent on the operator's experience, while the prevalence of gastric cancer is closely related to regional distribution, which has led to a worldwide imbalance in the operating experience and technical expertise of endoscopists diagnosing gastric cancer [38]; thus, EUS-AI in EGC has attracted more and more attention.
The size, ulceration, differentiation, and location of EGC are major factors that affect the accuracy of EUS T staging. Kim et al. used decision tree analysis to explore factors affecting the accuracy of EUS T staging and to identify factors leading to overestimation and underestimation in EGC diagnosis. The accuracy of EUS T staging of EGC differed greatly under different conditions, ranging from 34.0% to 74.6%. For lesions >3 cm, the presence of ulcers was associated with overestimation; for lesions ≤3 cm, the type of differentiation and the location of the tumor had a greater impact on EUS T staging. In well-differentiated EGC, location was the main factor affecting the accuracy of EUS T staging, and EGC was easily underestimated when the diameter was less than 3 cm and the lesion was located in the upper or middle part of the stomach. The focus of this study was to predict the accuracy of EUS T staging in patients with EGC, but the predictive accuracy of the influencing factors was not particularly high, and the study did not establish which patients would benefit from EUS. Because the AI algorithm only used EUS findings and excluded gastroscopy results, the researchers suggested incorporating gastroscopy findings into the decision tree, which could confirm the necessity of EUS for some patients and thus improve the prediction accuracy of EUS [39].
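A decision tree of the kind Kim et al. used is simply a nested sequence of threshold tests on these factors. The following sketch is illustrative only: a hand-written tree that mirrors the error patterns described above (large ulcerated lesions prone to overestimation; small, differentiated, proximal lesions prone to underestimation). The rules and labels are simplified from the prose, not the published model, which was fit from data.

```python
# Illustrative hand-written decision tree for EUS T-staging error risk.
# Thresholds and labels paraphrase the factors reported by Kim et al.;
# a real decision tree would learn these splits from training data.

def t_staging_risk(size_cm, ulcerated, differentiated, location):
    if size_cm > 3:
        # Large lesions: ulceration is linked to overestimation.
        return "overestimation risk" if ulcerated else "lower risk"
    # Smaller lesions: differentiation and location dominate.
    if differentiated and location in ("upper", "middle"):
        return "underestimation risk"
    return "lower risk"
```

In a learned tree the split order and thresholds are chosen to maximize class purity at each node, but the resulting model reads exactly like this kind of if/else cascade, which is why decision trees are popular when interpretability matters.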

3.4. Pancreatic Diseases

At present, the common examinations used to diagnose pancreatic diseases in clinical practice are US, CT, MRI, positron emission tomography-computed tomography (PET/CT), and EUS. Their results are often very similar, differing only in subtle ways, yet the treatment and prognosis of different pancreatic diseases diverge widely, which demands a high level of expertise from the physician. PET/CT assesses the functional activity of a lesion and is mainly used to evaluate the efficacy of neoadjuvant chemotherapy and tumor recurrence after surgical resection. EUS not only provides high-resolution images in real time, but also allows rapid on-site evaluation (ROSE) by endoscopic ultrasound-guided fine-needle aspiration (EUS-FNA) and endoscopic ultrasound-guided fine-needle biopsy (EUS-FNB) to predict the type of disease and accurately characterize it once pathology results are returned.

3.4.1. Pancreatic Cystic Lesions

Pancreatic cystic lesions (PCLs) are abnormal inflammatory or proliferative lesions of the pancreas with a prevalence of up to 42% [40,41]. Most are benign, but some subtypes, typically characterized by a mucinous phenotype, are highly likely to develop into malignant tumors; examples include intraductal papillary mucinous neoplasm (IPMN) and mucinous cystic neoplasm (MCN) [42]. Early identification of PCLs has significant value for treatment and prognosis. However, there are limitations in identifying the types of PCLs by EUS alone, and interobserver agreement is poor owing to the varying skill levels of endoscopists. Therefore, EUS-AI is gradually coming into the limelight and has been studied by many investigators. Nguon et al. developed a CNN model using retrospectively collected EUS images from 59 MCN and 49 serous cystic neoplasm (SCN) patients; this model distinguished the two types of isolated pancreatic cystic neoplasms with an overall accuracy of 82.75%, comparable to classification by experienced endoscopists [43]. Vilas-Boas et al. focused more on group classification and used a high-precision CNN algorithm for the automatic identification of mucinous pancreatic cysts. A total of 5505 EUS images were extracted, of which 3725 depicted mucinous lesions and 1780 showed non-mucinous lesions. All images were divided into a training data set and a validation data set, the latter used to evaluate the diagnostic effectiveness of the CNN algorithm in distinguishing mucinous from non-mucinous lesions. The authors ultimately concluded that the overall accuracy, sensitivity, and specificity of the model were 98.5%, 98.3%, and 98.9%, respectively, and the AUC was 1.0.
This study provided a timely estimate of the likelihood of lesion malignancy by distinguishing PCLs as mucinous or non-mucinous lesions, which might make the EUS-AI an important tool for risk stratification of PCLs in clinical practice, facilitating the management of patients and subsequent follow-up [44].
IPMN is a precursor of invasive PC [45]. A retrospective study using a CNN model collected static EUS images of 50 patients with IPMN to evaluate its performance in diagnosing malignancy. The accuracy of AI in predicting malignant IPMN was 94%, higher than that of conventional EUS (40–60%) and endoscopists' diagnoses (56%) [46]. Endoscopic ultrasound-guided needle-based confocal laser endomicroscopy (EUS-nCLE), an emerging technology, passes a confocal laser probe through a 19 G puncture needle into the cystic cavity to form real-time microscopic tissue imaging, which further improves the diagnostic accuracy for pancreatic cystic tumors [47,48]. Several studies have demonstrated the feasibility of EUS-nCLE for differentiating PCL types [49,50,51]. Therefore, a single-center prospective study assessed the diagnostic performance of AI combined with EUS-nCLE for advanced IPMN and explored whether it could be used for risk stratification of IPMN. Machicado et al. designed two CNN-based CAD algorithms: the CNN-CAD system in the overall model automatically extracted all nCLE features and predicted high-grade dysplasia and/or adenocarcinoma (HGD-Ca), while the CNN-CAD system in the segmentation model was trained to identify papillary structures and measure the thickness and darkness of the papillary epithelium to distinguish low- or intermediate-grade dysplasia (LGD) from HGD-Ca. Compared with the Fukuoka and AGA guidelines for risk stratification, the two nCLE-guided CNN-CAD algorithms had higher sensitivity and accuracy, with comparable specificity, in diagnosing HGD-Ca, and they were also more accurate for IPMN risk stratification, with similar specificity.
This study demonstrated that the two CNN-CAD algorithms based on the nCLE model could diagnose advanced tumors in IPMN more accurately, while being feasible and accurate in terms of risk stratification [52].

3.4.2. Autoimmune Pancreatitis

AIP is an immune-mediated fibroinflammatory subtype of CP [53,54]. Typical AIP presents as diffuse pancreatic enlargement, whereas atypical AIP presents as a focal mass; the distinction between atypical (focal) AIP and pancreatic malignancies (especially PDAC) is therefore extremely critical. Existing guidelines consider MRI and CT the principal tests for the diagnosis of AIP, and EUS is mainly used to obtain cytohistological specimens despite providing a wealth of morphological features [55]. EUS-guided tissue acquisition techniques include EUS-FNA and EUS-FNB. In EUS-FNA, cells are aspirated from the target tissue with a conventional straight needle, and cytopathologists determine the type of lesion by observing abnormal cells and their characteristics in the aspirated sample; however, the diagnostic accuracy of this test depends on the availability of ROSE and on the diagnostic experience of the cytopathologists. EUS-FNB, by contrast, uses a newer generation of core needle that preserves tissue architecture and allows not only cytological but also histological examination, making an effective diagnosis of AIP possible. EUS-FNB is therefore increasingly used as an alternative to EUS-FNA [56]. Thomsen et al. examined the utility of pancreatic EUS-FNB in a large single-center study of 852 specimens from 723 patients, finding that pancreatic EUS-FNB had an accuracy of 0.992 (95% CI, 0.983–0.997) for AIP, with a sensitivity of 0.833 (95% CI, 0.586–0.964) and a specificity of 0.995 (95% CI, 0.988–0.999). This suggests the promising potential of EUS-FNB in diagnosing AIP [57].
The diagnostic accuracy for AIP is highly dependent on the endoscopist's technique and the cytopathologist's experience, and both EUS-FNA and EUS-FNB are invasive; even though adverse events are rare, a risk of complications remains. Some studies have therefore combined EUS images or EUS videos with AI to investigate the diagnostic accuracy for AIP. Guo et al. conducted a retrospective study using multivariate stepwise logistic regression and receiver operating characteristic (ROC) analyses. Ninety patients with focal autoimmune pancreatitis (FAIP) and 196 patients with PC were enrolled and randomly divided into a derivation group and a validation group. A predictive model was constructed from all EUS characteristics in the derivation group, and its performance in discriminating the two diseases was verified in the validation group. Diffuse hypoechogenicity, bile duct wall thickening, and hyperechoic foci/strands emerged as three independent predictors, with an AUC of 0.975 (95% CI 0.959–0.990). Because distinguishing diffuse from focal hypoechogenicity is subjective for endoscopists, the authors excluded these two characteristics and designed a second prediction model by multivariate stepwise logistic regression. In this model, main pancreatic duct dilation, common bile duct dilation, bile duct wall thickening, and hyperechoic foci/strands were independent predictors, with an AUC of 0.951 (95% CI 0.929–0.974). At the optimal cutoff value, the sensitivity and specificity of the model were 83.7–91.8% and 93.3–95.6% [58]. Marya et al. used static EUS images and video databases to develop a CNN model. After training, the model could analyze EUS videos in real time and accurately distinguish AIP from PDAC and benign pancreatic conditions (CP and normal pancreas).
The sensitivity and specificity for differentiating AIP from PDAC were 90% and 93%, respectively; for AIP versus CP, 94% and 71%. The specificity for AIP versus normal pancreas was 98%, and the sensitivity and specificity for AIP versus non-AIP were 90% and 85%, respectively [59].
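The AUC and "optimal cutoff" figures in studies like Guo et al.'s come from standard ROC machinery. A self-contained sketch (the scores below are invented, not the study's data) computes the AUC as the Mann–Whitney probability that a positive case outscores a negative one, and picks the cutoff maximizing Youden's index (sensitivity + specificity − 1):

```python
def roc_auc(scores_pos, scores_neg):
    """AUC = P(score of a positive > score of a negative); ties count 1/2."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

def youden_cutoff(scores_pos, scores_neg):
    """Cutoff t maximizing Youden's J = sensitivity + specificity - 1."""
    best_t, best_j = None, -1.0
    for t in sorted(set(scores_pos + scores_neg)):
        sens = sum(s >= t for s in scores_pos) / len(scores_pos)
        spec = sum(s < t for s in scores_neg) / len(scores_neg)
        j = sens + spec - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Illustrative model scores: FAIP cases (positive) vs. PC cases (negative).
pos = [0.91, 0.84, 0.77, 0.66, 0.45]
neg = [0.52, 0.38, 0.30, 0.21, 0.15]
auc = roc_auc(pos, neg)          # 24 of 25 pairs ordered correctly -> 0.96
cut, j = youden_cutoff(pos, neg)
```

A logistic regression model such as Guo et al.'s outputs one probability per patient; the published sensitivity/specificity pair is simply this curve evaluated at the chosen cutoff.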

3.4.3. Pancreatic Cancer

PC is a highly lethal malignancy with a global five-year overall survival rate of less than 10%; it is difficult to detect because patients are often asymptomatic or have only mild symptoms [60]. PDAC is the most common type of PC, and the majority of cases are diagnosed at an advanced stage with a poor prognosis. Early detection and timely excision of small lesions could improve the five-year survival rate to 80.4% [61]. Traditional imaging techniques such as CT and MRI may miss smaller lesions, whereas EUS is the most sensitive modality for detecting small solid pancreatic lesions, especially those smaller than 20 mm [62,63]. Therefore, several studies have evaluated EUS-AI in PC settings (Table 2).
To distinguish PC from noncarcinomatous pancreatic lesions, Kuwahara et al. collected 22,000 EUS images from 933 patients to evaluate the diagnostic performance of an AI model developed by DL. The AUC, sensitivity, specificity, and accuracy (95% CI) for the diagnosis of PC were 0.90 (0.84–0.97), 0.94 (0.88–0.98), 0.82 (0.88–0.92), and 0.91 (0.85–0.95), respectively. The model could also distinguish PDAC, pancreatic adenosquamous carcinoma, acinar cell carcinoma, metastatic pancreatic tumor, neuroendocrine carcinoma, NET, solid pseudopapillary neoplasm, CP, and AIP [64], although further external validation is still required. Tonozuka et al. developed a CAD system using EUS images and evaluated its ability to discriminate PDAC from CP and a normal pancreas. The model performed excellently in detecting PDAC, with AUCs of 0.924 and 0.940 in the validation and test settings, respectively [11]. In addition, a systematic review of 11 studies investigating the role of EUS-AI in the diagnosis of PC found overall accuracy, sensitivity, and specificity in the ranges of 80–97.5%, 83–100%, and 50–99%, respectively [65].
EUS-FNA and EUS-FNB are commonly used to diagnose pancreatic diseases by cytohistopathology, but uneven levels of skill and experience among endoscopists lead to differences in the quality of the tissue samples obtained. To address this challenge, AI has emerged in recent years as a promising tool to improve the accuracy and efficiency of EUS-guided tissue sampling. AI can potentially assist EUS-FNA/FNB by giving the endoscopist real-time feedback during the procedure: helping select the appropriate type and size of puncture needle, guiding the optimal location and depth of the puncture, and assessing the quality of the sample obtained. AI can thus reduce the number of punctures required to obtain an adequate sample, improve puncture accuracy, and minimize the risk of complications [66]. In addition, with advances in AI image recognition and in cytopathology, some researchers have applied AI to the analysis of EUS-FNA/FNB pathological specimens. Zhang et al. performed EUS-FNA biopsies in a PC group and a non-PC group (mild atypical lesions, other tumors, or no tumor) and applied ROSE to the stained specimens. Internal testing and external validation of a deep convolutional neural network (DCNN) system on the stained pathological sections both yielded AUCs above 0.9, comparable to the diagnostic ability of cytopathologists. Because some hospitals lack cytopathologists, the DCNN model was also compared with endoscopists who had received standardized pathology training, and its sensitivity proved higher than theirs [67]. Ishikawa et al. found that a DL model's diagnostic accuracy on unstained biopsy specimens of pancreatic diseases was lower than that of macroscopic on-site evaluation (MOSE).
Nevertheless, after specimen staining, the AI model's diagnostic accuracy was comparable to that of MOSE: the sensitivity, specificity, and accuracy of MOSE after staining were 88.97%, 53.5%, and 83.24%, versus 90.34%, 53.5%, and 84.39% for AI-assisted EUS-FNB [68].
EUS has numerous modes, including B-mode, CH-EUS, and endoscopic ultrasound elastography (EUS-EG). CH-EUS and EUS-EG can serve as complementary tools for characterizing focal pancreatic lesions: although CH-EUS does not improve the detection rate of lesions, it can assist in their differential diagnosis [69,70,71,72,73]. CH-EUS, which uses contrast agents combined with tissue harmonic imaging to depict the microvascular system in real time, has shown promise in differentiating benign from malignant pancreatic masses [74]. The steep learning curve of EUS requires operators skilled in human anatomy and manipulation [75], creating an urgent need for new techniques that can objectively identify and classify CH-EUS images to assist diagnosis. In addition, a time-intensity curve (TIC) can be generated from the temporal change in echo enhancement intensity, and there is evidence that CH-EUS with TIC analysis is very effective in differentiating various pancreatic lesions [76]. Tang et al. constructed a novel AI-assisted diagnostic system (CH-EUS MASTER) based on DCNN and RF algorithms and, using retrospectively collected CH-EUS images and videos, applied it to two models: (1) real-time dynamic identification and tracking of pancreatic masses; (2) differentiation between PC and CP by TIC analysis. Compared with manual annotation by endoscopists, model 1 achieved an average overlap rate of 0.708 with an accuracy of 87.8% at an image overlap threshold of 0.50, and model 2 identified PC with an accuracy of 88.9%. This is a promising AI system for discriminating malignant from benign pancreatic masses [77]. EUS-EG is a diagnostic approach based on tissue stiffness measurement. Săftoiu et al.
conducted a prospective, blinded, multicenter study using AI-assisted EUS-EG to discriminate between CP and PC; the model had a training accuracy of 0.9114 (95% CI 0.8987–0.9242) and a test accuracy of 0.8427 (95% CI 0.8309–0.8544), with sensitivity, specificity, PPV, and NPV of 0.88, 0.83, 0.96, and 0.57, respectively. This study suggests that an ANN can provide a rapid and accurate diagnosis of pancreatic malignancies [78].
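TIC analysis, as used for the second CH-EUS MASTER model, reduces each contrast-enhanced video to a curve of mean echo intensity within the lesion region over time, from which scalar features can feed a classifier such as an RF. The sketch below runs on synthetic frames; the function names and the particular features (peak enhancement, time to peak, wash-out slope) are illustrative choices, not the published pipeline:

```python
def time_intensity_curve(frames, roi):
    """Mean pixel intensity inside a rectangular ROI for each frame.
    frames: list of 2D intensity grids; roi: (r0, r1, c0, c1), half-open."""
    r0, r1, c0, c1 = roi
    curve = []
    for f in frames:
        vals = [f[r][c] for r in range(r0, r1) for c in range(c0, c1)]
        curve.append(sum(vals) / len(vals))
    return curve

def tic_features(curve, fps):
    """Peak enhancement, time to peak, and wash-out slope after the peak."""
    peak = max(curve)
    i_peak = curve.index(peak)
    t_peak = i_peak / fps
    tail = curve[i_peak:]
    # Slope of intensity decline from peak to last frame (units per second).
    washout = (tail[-1] - tail[0]) * fps / (len(tail) - 1) if len(tail) > 1 else 0.0
    return peak, t_peak, washout

# Synthetic 2x2 frames at 1 frame/s: enhancement rises to a peak, then fades.
frames = [[[v, v], [v, v]] for v in (10, 40, 80, 60, 40)]
curve = time_intensity_curve(frames, (0, 2, 0, 2))
peak, t_peak, washout = tic_features(curve, fps=1)
# peak = 80.0, t_peak = 2.0 s, washout = -20.0 intensity units/s
```

Hypervascular and hypovascular lesions produce differently shaped curves, which is why a handful of such scalar features can already separate PC from CP reasonably well.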
Table 2. Studies assessing the sensitivity, specificity, and diagnostic accuracy of AI models for PC.
| Study | Study Design | AI Model | Patient Population | Research Object | Outcomes for the AI Model |
|---|---|---|---|---|---|
| Kuwahara et al. [64] | Retrospective (Japan) | DL | Total patients = 694; PC = 524; non-cancer patients = 170 (PDAC = 518, PASC = 5, ACC = 1, MPT = 8, NEC = 6, NET = 57, SPN = 6, CP = 58, AIP = 35) | EUS images | Recognition of PC: sensitivity = 94%; specificity = 82%; accuracy = 91%; AUC = 0.90 |
| Tonozuka et al. [11] | Retrospective (Japan) | CNN | Total patients = 139; PDAC = 76; CP = 34; NP = 29 | EUS images | Recognition of PC: sensitivity = 92.4%; specificity = 84.1%; AUC = 0.940 |
| Goyal et al. [65] | Systematic review (United States) | ANN; CNN; SVM | Total patients = 2292; PC = 1409; non-cancer patients = 883 | EUS images; EUS videos; EUS-EG | Recognition of PC: sensitivity = 83–100%; specificity = 50–99%; accuracy = 80–97.5% |
| Zhang et al. [67] | Retrospective (China) | DCNN | Total patients = 194; PC = 110; non-cancer patients = 84 | Stained EUS-FNA specimens | Recognition of PC: sensitivity = 92.8–94.4%; specificity = 87.5–97.1%; accuracy = 91.2–95.8%; AUC = 0.948–0.976 |
| Ishikawa et al. [68] | Retrospective (Japan) | Contrastive learning (unsupervised learning) | Total patients = 97; PDAC = 66; MFP = 13; AIP = 11; pancreatic neuroendocrine tumor = 3; MPT = 3; IPMC = 1 | Stained EUS-FNB specimens | Recognition of pancreatic diseases: sensitivity = 90.34%; specificity = 53.5%; accuracy = 84.39% |
| Tang et al. [77] | Prospective (China) | Model 1: DCNN; model 2: RF algorithm | Model 1: total patients = 950 (PC = 760, benign pancreatic masses = 190); model 2: total patients = 295 (PC = 167, pancreatitis = 128) | Model 1: CH-EUS images; model 2: CH-EUS videos | Model 1: average overlap rate = 0.708, accuracy = 87.8%; model 2: sensitivity = 100%, specificity = 75%, accuracy = 88.9% |
| Săftoiu et al. [78] | Prospective (Europe) | ANN | Total patients = 258; PC = 211; CP = 47 | Hue histogram data extracted from dynamic sequences of EUS-EG | Recognition of pancreatic diseases: sensitivity = 87.59%; specificity = 82.94%; accuracy = 84.27% |
Abbreviations: AI, artificial intelligence; DL, deep learning; PC, pancreatic cancer; PDAC, pancreatic ductal adenocarcinoma; PASC, pancreatic adenosquamous carcinoma; ACC, acinar cell carcinoma; MPT, metastatic pancreatic tumors; NEC, neuroendocrine carcinoma; NET, neuroendocrine tumors; SPN, solid pseudopapillary neoplasms; CP, chronic pancreatitis; AIP, autoimmune pancreatitis; EUS, endoscopic ultrasound; AUC, area under the curve; CNN, convolutional neural network; NP, normal pancreas; ANN, artificial neural network; SVM, support vector machine; EUS-EG, endoscopic ultrasound elastography; DCNN, deep convolutional neural network; EUS-FNA, endoscopic ultrasound-guided fine-needle aspiration biopsy; MFP, mass-forming pancreatitis; IPMC, intraductal papillary mucinous carcinoma; EUS-FNB, endoscopic ultrasound-guided fine-needle biopsy; RF, random forest; CH-EUS, contrast-enhanced harmonic endoscopic ultrasound.

4. EUS-AI in Quality Control

In addition to building CAD models from EUS images, EUS videos, and cytohistological smears to improve the diagnostic accuracy for benign and malignant pancreatic diseases, EUS-AI can also provide quality control of the pancreatic scanning process, addressing missed diagnoses caused by blind spots in the field of view. Existing studies have established standard EUS scanning procedures for the pancreas guided by systematic scanning at separate stations [79,80]. However, the complex anatomy encountered at EUS makes image interpretation difficult. Zhang et al. therefore developed a system called BP MASTER comprising a station classification model and a segmentation model, which underwent internal and external validation. The classification model identified the current scan station and guided the operator to the next, while the segmentation model monitored the pancreas, abdominal aorta, and portal confluence in real time; if the pancreas and important vessels remained out of view, the system recommended returning to the previous station for rescanning. The classification model's station-recognition accuracy was 94.2% on internal and 82.4% on external validation, and the segmentation model's mean F1 index (Dice) was 0.836 and 0.715, respectively. When 396 extracted video clips were run through the station classification model, per-frame accuracy reached 86.2%. Moreover, the BP MASTER system was shown to shorten the EUS learning curve for inexperienced trainees. Building on the previous study, a prospective study enrolled eight students with one year of experience in gastroenterology who had not undergone EUS operation training.
The trainees' recognition accuracy on processed EUS videos increased significantly from 67.2% to 78.4% (95% CI 0.058–1.663; p < 0.01) [75].
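The "mean F1 index (Dice)" reported for the segmentation model, like the overlap rate used for CH-EUS MASTER, is a mask-overlap score between a predicted segmentation and a reference annotation. A minimal sketch on toy binary masks (hypothetical data) computes Dice and the closely related IoU:

```python
def dice(pred, truth):
    """Dice / F1 overlap: 2|A∩B| / (|A| + |B|) for binary masks."""
    inter = sum(p and t for p, t in zip(pred, truth))
    return 2 * inter / (sum(pred) + sum(truth))

def iou(pred, truth):
    """Intersection over union; related by Dice = 2*IoU / (1 + IoU)."""
    inter = sum(p and t for p, t in zip(pred, truth))
    union = sum(p or t for p, t in zip(pred, truth))
    return inter / union

# Hypothetical flattened 1x8 masks for a pancreas segmentation:
# prediction overlaps the reference on 3 of the marked pixels.
pred  = [0, 1, 1, 1, 1, 0, 0, 0]
truth = [0, 0, 1, 1, 1, 1, 0, 0]
d = dice(pred, truth)  # 2*3 / (4 + 4) = 0.75
j = iou(pred, truth)   # 3 / 5 = 0.6
```

A system like BP MASTER would average this score over frames to obtain the reported mean Dice; Dice is always at least as large as IoU, so the two are not interchangeable when comparing studies.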

5. Discussion and Prospects

In summary, the diagnostic accuracy of EUS-AI for digestive system diseases is comparable to, or even better than, that of endoscopists. EUS, with its high-resolution imaging, can effectively visualize the lesion site, giving it a diagnostic advantage over other imaging tests such as US, CT, MRI, and PET/CT. Nevertheless, an endoscopist's diagnostic ability is highly correlated with knowledge, clinical experience, and operative proficiency; as procedure volumes increase, endoscopists may miss diagnoses through fatigue or inattention, and manual diagnosis is inherently subjective. EUS-AI, which combines AI with EUS examinations, enables early diagnosis of disease while accelerating the treatment process and improving patient prognosis. By implementing quality control during the examination, EUS-AI could alleviate the overall medical burden on individuals and healthcare systems worldwide. Moreover, AI can shorten the learning curve for novice doctors, who typically require extensive training by experienced endoscopists to standardize their EUS skills. These findings may also dispel the "automation bias" some individuals hold toward EUS-AI, representing a significant advance for CAD in clinical diagnosis.
In recent years, there has been a growing interest among researchers in the use of EUS-FNA/FNB instead of postoperative cytopathology and histopathology. AI-assisted EUS-FNA/FNB has shown promising results, as it allows for the precise localization of lesions and avoids important vessels during puncture. By optimizing the sampling site, angle, and number of times, the prognostic risk for patients can be significantly reduced. Additionally, to address the issue of missed diagnosis resulting from the blind field of view during operations, some studies have utilized EUS-AI to create pancreatic segmentation and classification models. These models provide real-time information about the current location and guide subsequent examination methods, representing a relatively novel and valuable tool. If this technique is applied to examine other digestive diseases, this could be a major step forward in the field of disease screening.
The application of AI in EUS has certain limitations. First, most current studies are retrospective and single-center, so the datasets lack broadly representative data sources, and models built on such data are prone to information bias. Second, anonymized or de-identified data often need to be traced back to the patient for diagnosis and treatment, introducing a potential risk of data breaches and unauthorized access to patient information. Third, the standardization of data collection and analysis is insufficient; to ensure accurate diagnosis and broad applicability, standardized procedures for data acquisition and analysis should be established. Fourth, the advent of EUS-AI as a machine–human collaborative examination has significantly transformed the conventional doctor–patient relationship: responsibility for a misdiagnosis involving AI falls not only on the doctors but also on the model developers and software platform vendors, and adaptable laws and regulations are still relatively lacking. Fifth, a limitation specific to AI is the "black box problem" [81]: only the input and output layers are visible, while the operations in the hidden layers are opaque. This makes it difficult for doctors and model developers to explain potential biases, errors, and unintended consequences, posing a great challenge in the context of evidence-based medicine.
In the information age, emerging technologies such as AI are still at an early stage. While there are limitations in the diagnosis of diseases through EUS-AI, researchers can take certain steps to enhance its application in clinical settings. In the future, AI algorithms should be refined to create visualized AI decision-making processes. Additionally, large-sample, multi-center prospective studies should be conducted to cover a range of diseases. The incorporation of high-quality images or videos should be maximized, and integrated models should be flexibly applied. Furthermore, an open, quality-monitored data collection server should be established to enable global sharing, while ensuring data confidentiality to protect patient privacy. Clear accountability policies need to be developed to regulate AI technology effectively, ensuring its reasonable and legal application, and minimizing or avoiding harm caused by AI errors. It is important to note that EUS-AI exhibits great potential in healthcare, but it does not imply that endoscopists will be replaced by AI. Instead, the collaboration between the two can lead to more accurate decision-making in the diagnosis and treatment process, thus improving the efficacy of disease diagnosis and facilitating the further development of individualized precision medicine.

Author Contributions

Conceptualization, J.H., X.F. and W.L.; data curation, J.H. and X.F.; writing—original draft preparation, J.H.; writing—review and editing, X.F. and W.L.; supervision, W.L. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by Tianjin Medical University General Hospital Clinical Research Program, grant number 22ZYYLCCG09.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest. The sponsors had no role in the design, execution, interpretation, or writing of the study.

References

1. Samuel, A.L. Some Studies in Machine Learning Using the Game of Checkers. IBM J. Res. Dev. 1959, 3, 210–229.
2. Kuwahara, T.; Hara, K.; Mizuno, N.; Haba, S.; Okuno, N.; Koda, H.; Miyano, A.; Fumihara, D. Current status of artificial intelligence analysis for endoscopic ultrasonography. Dig. Endosc. 2021, 33, 298–305.
3. Tonozuka, R.; Mukai, S.; Itoi, T. The Role of Artificial Intelligence in Endoscopic Ultrasound for Pancreatic Disorders. Diagnostics 2020, 11, 18.
4. Gunasekaran, H.; Ramalakshmi, K.; Swaminathan, D.K.; Mazzara, M. GIT-Net: An Ensemble Deep Learning-Based GI Tract Classification of Endoscopic Images. Bioengineering 2023, 10, 809.
5. Ramamurthy, K.; George, T.T.; Shah, Y.; Sasidhar, P. A Novel Multi-Feature Fusion Method for Classification of Gastrointestinal Diseases Using Endoscopy Images. Diagnostics 2022, 12, 2316.
6. Zhao, A.; Du, X.; Yuan, S.; Shen, W.; Zhu, X.; Wang, W. Automated Detection of Endometrial Polyps from Hysteroscopic Videos Using Deep Learning. Diagnostics 2023, 13, 1409.
7. Yoshida, T.; Yamashita, Y.; Kitano, M. Endoscopic Ultrasound for Early Diagnosis of Pancreatic Cancer. Diagnostics 2019, 9, 81.
8. Sooklal, S.; Chahal, P. Endoscopic Ultrasound. Surg. Clin. N. Am. 2020, 100, 1133–1150.
9. Dye, C.E.; Waxman, I. Endoscopic ultrasound. Gastroenterol. Clin. N. Am. 2002, 31, 863–879.
10. Yin, H.; Yang, X.; Sun, L.; Pan, P.; Peng, L.; Li, K.; Zhang, D.; Cui, F.; Xia, C.; Huang, H.; et al. The value of artificial intelligence techniques in predicting pancreatic ductal adenocarcinoma with EUS images: A meta-analysis and systematic review. Endosc. Ultrasound 2023, 12, 50–58.
11. Tonozuka, R.; Itoi, T.; Nagata, N.; Kojima, H.; Sofuni, A.; Tsuchiya, T.; Ishii, K.; Tanaka, R.; Nagakawa, Y.; Mukai, S. Deep learning analysis for the detection of pancreatic cancer on endosonographic images: A pilot study. J. Hepatobiliary Pancreat. Sci. 2021, 28, 95–104.
12. Wang, L.; Song, H.; Wang, M.; Wang, H.; Ge, R.; Shen, Y.; Yu, Y. Utilization of Ultrasonic Image Characteristics Combined with Endoscopic Detection on the Basis of Artificial Intelligence Algorithm in Diagnosis of Early Upper Gastrointestinal Cancer. J. Healthc. Eng. 2021, 2021, 2773022.
13. Akahoshi, K.; Oya, M.; Koga, T.; Shiratsuchi, Y. Current clinical management of gastrointestinal stromal tumor. World J. Gastroenterol. 2018, 24, 2806–2817.
14. Jaros, D.; Bozic, B.; Sebesta, C. [Gastrointestinal stromal tumors (GIST)]. Wien Med. Wochenschr. 2023, 173, 201–205.
15. Joensuu, H.; Hohenberger, P.; Corless, C.L. Gastrointestinal stromal tumour. Lancet 2013, 382, 973–983.
16. Davila, R.E. A Gastroenterologist’s Approach to the Diagnosis and Management of Gastrointestinal Stromal Tumors. Gastroenterol. Clin. N. Am. 2022, 51, 609–624.
17. Joensuu, H. Risk stratification of patients diagnosed with gastrointestinal stromal tumor. Hum. Pathol. 2008, 39, 1411–1419.
18. Kalkmann, J.; Zeile, M.; Antoch, G.; Berger, F.; Diederich, S.; Dinter, D.; Fink, C.; Janka, R.; Stattaus, J. Consensus report on the radiological management of patients with gastrointestinal stromal tumours (GIST): Recommendations of the German GIST Imaging Working Group. Cancer Imaging 2012, 12, 126–135.
19. Landi, B. Gastrointestinal stromal tumors: Clinical features and diagnosis. Bull. Acad. Natl. Med. 2012, 196, 845–852; discussion 852–853.
20. Panbude, S.N.; Ankathi, S.K.; Ramaswamy, A.T.; Saklani, A.P. Gastrointestinal Stromal Tumor (GIST) from esophagus to anorectum—Diagnosis, response evaluation and surveillance on computed tomography (CT) scan. Indian J. Radiol. Imaging 2019, 29, 133–140.
21. Minoda, Y.; Ihara, E.; Komori, K.; Ogino, H.; Otsuka, Y.; Chinen, T.; Tsuda, Y.; Ando, K.; Yamamoto, H.; Ogawa, Y. Efficacy of endoscopic ultrasound with artificial intelligence for the diagnosis of gastrointestinal stromal tumors. J. Gastroenterol. 2020, 55, 1119–1126.
22. Minoda, Y.; Ihara, E.; Fujimori, N.; Nagatomo, S.; Esaki, M.; Hata, Y.; Bai, X.; Tanaka, Y.; Ogino, H.; Chinen, T.; et al. Efficacy of ultrasound endoscopy with artificial intelligence for the differential diagnosis of non-gastric gastrointestinal stromal tumors. Sci. Rep. 2022, 12, 16640.
23. Zhang, B.; Zhu, F.; Li, P.; Zhu, J. Artificial intelligence-assisted endoscopic ultrasound in the diagnosis of gastrointestinal stromal tumors: A meta-analysis. Surg. Endosc. 2023, 37, 1649–1657.
24. Tanaka, H.; Kamata, K.; Ishihara, R.; Handa, H.; Otsuka, Y.; Yoshida, A.; Yoshikawa, T.; Ishikawa, R.; Okamoto, A.; Yamazaki, T.; et al. Value of artificial intelligence with novel tumor tracking technology in the diagnosis of gastric submucosal tumors by contrast-enhanced harmonic endoscopic ultrasonography. J. Gastroenterol. Hepatol. 2022, 37, 841–846.
25. Hirai, K.; Kuwahara, T.; Furukawa, K.; Kakushima, N.; Furune, S.; Yamamoto, H.; Marukawa, T.; Asai, H.; Matsui, K.; Sasaki, Y.; et al. Artificial intelligence-based diagnosis of upper gastrointestinal subepithelial lesions on endoscopic ultrasonography images. Gastric Cancer 2022, 25, 382–391.
26. Min, Y.W.; Lee, H.; Song, B.G.; Min, B.H.; Kim, H.K.; Choi, Y.S.; Lee, J.H.; Hwang, N.Y.; Carriere, K.C.; Rhee, P.L.; et al. Comparison of endoscopic submucosal dissection and surgery for superficial esophageal squamous cell carcinoma: A propensity score-matched analysis. Gastrointest. Endosc. 2018, 88, 624–633.
27. Van de Ven, S.E.M.; Spaander, M.C.W.; Pouw, R.E.; Tang, T.J.; Houben, M.; Schoon, E.J.; de Jonge, P.J.F.; Bruno, M.J.; Koch, A.D. Favorable effect of endoscopic reassessment of clinically staged T2 esophageal adenocarcinoma: A multicenter prospective cohort study. Endoscopy 2022, 54, 163–169.
28. Schmidlin, E.J.; Gill, R.R. New frontiers in esophageal radiology. Ann. Transl. Med. 2021, 9, 904.
29. Shaheen, N.J.; Falk, G.W.; Iyer, P.G.; Souza, R.F.; Yadlapati, R.H.; Sauer, B.G.; Wani, S. Diagnosis and Management of Barrett’s Esophagus: An Updated ACG Guideline. Am. J. Gastroenterol. 2022, 117, 559–587.
30. Meining, A.; Rösch, T.; Wolf, A.; Lorenz, R.; Allescher, H.D.; Kauer, W.; Dittler, H.J. High interobserver variability in endosonographic staging of upper gastrointestinal cancers. Z. Gastroenterol. 2003, 41, 391–394.
31. Wang, M.; Zhu, Y.; Li, Z.; Su, P.; Gao, W.; Huang, C.; Tian, Z. Impact of endoscopic ultrasonography on the accuracy of T staging in esophageal cancer and factors associated with its accuracy: A retrospective study. Medicine 2022, 101, e28603.
32. Knabe, M.; Welsch, L.; Blasberg, T.; Müller, E.; Heilani, M.; Bergen, C.; Herrmann, E.; May, A. Artificial intelligence-assisted staging in Barrett’s carcinoma. Endoscopy 2022, 54, 1191–1197.
33. Sung, H.; Ferlay, J.; Siegel, R.L.; Laversanne, M.; Soerjomataram, I.; Jemal, A.; Bray, F. Global Cancer Statistics 2020: GLOBOCAN Estimates of Incidence and Mortality Worldwide for 36 Cancers in 185 Countries. CA Cancer J. Clin. 2021, 71, 209–249.
34. Tsujii, Y.; Hayashi, Y.; Ishihara, R.; Yamaguchi, S.; Yamamoto, M.; Inoue, T.; Nagai, K.; Ogiyama, H.; Yamada, T.; Nakahara, M.; et al. Diagnostic value of endoscopic ultrasonography for the depth of gastric cancer suspected of submucosal invasion: A multicenter prospective study. Surg. Endosc. 2023, 37, 3018–3028.
35. Ono, H.; Yao, K.; Fujishiro, M.; Oda, I.; Uedo, N.; Nimura, S.; Yahagi, N.; Iishi, H.; Oka, M.; Ajioka, Y.; et al. Guidelines for endoscopic submucosal dissection and endoscopic mucosal resection for early gastric cancer (second edition). Dig. Endosc. 2021, 33, 4–20.
36. Min, Y.W.; Lee, J.H. Endoscopic Resection for Early Gastric Cancer beyond Absolute Indication with Emphasis on Controversial Issues. J. Gastric Cancer 2014, 14, 7–14.
37. Giandola, T.; Maino, C.; Marrapodi, G.; Ratti, M.; Ragusi, M.; Bigiogera, V.; Talei Franzesi, C.; Corso, R.; Ippolito, D. Imaging in Gastric Cancer: Current Practice and Future Perspectives. Diagnostics 2023, 13, 1276.
38. Nam, J.Y.; Chung, H.J.; Choi, K.S.; Lee, H.; Kim, T.J.; Soh, H.; Kang, E.A.; Cho, S.J.; Ye, J.C.; Im, J.P.; et al. Deep learning model for diagnosing gastric mucosal lesions using endoscopic images: Development, validation, and method comparison. Gastrointest. Endosc. 2022, 95, 258–268.e210.
39. Kim, J.; Chung, H.; Kim, J.L.; Lee, E.; Kim, S.G. Hierarchical Analysis of Factors Associated with T Staging of Gastric Cancer by Endoscopic Ultrasound. Dig. Dis. Sci. 2021, 66, 612–618.
40. Garcea, G.; Ong, S.L.; Rajesh, A.; Neal, C.P.; Pollard, C.A.; Berry, D.P.; Dennison, A.R. Cystic lesions of the pancreas. A diagnostic and management dilemma. Pancreatology 2008, 8, 236–251.
41. Moris, M.; Bridges, M.D.; Pooley, R.A.; Raimondo, M.; Woodward, T.A.; Stauffer, J.A.; Asbun, H.J.; Wallace, M.B. Association Between Advances in High-Resolution Cross-Section Imaging Technologies and Increase in Prevalence of Pancreatic Cysts From 2005 to 2014. Clin. Gastroenterol. Hepatol. 2016, 14, 585–593.e583.
42. Oh, S.; Kim, Y.J.; Park, Y.T.; Kim, K.G. Automatic Pancreatic Cyst Lesion Segmentation on EUS Images Using a Deep-Learning Approach. Sensors 2021, 22, 245.
43. Nguon, L.S.; Seo, K.; Lim, J.H.; Song, T.J.; Cho, S.H.; Park, J.S.; Park, S. Deep Learning-Based Differentiation between Mucinous Cystic Neoplasm and Serous Cystic Neoplasm in the Pancreas Using Endoscopic Ultrasonography. Diagnostics 2021, 11, 1052.
44. Vilas-Boas, F.; Ribeiro, T.; Afonso, J.; Cardoso, H.; Lopes, S.; Moutinho-Ribeiro, P.; Ferreira, J.; Mascarenhas-Saraiva, M.; Macedo, G. Deep Learning for Automatic Differentiation of Mucinous versus Non-Mucinous Pancreatic Cystic Lesions: A Pilot Study. Diagnostics 2022, 12, 2041.
45. Matsumoto, I. Invited Editorial: Comprehensive Analysis of Molecular Biological Characteristics of Pancreatic Ductal Adenocarcinoma Concomitant with Intraductal Papillary Mucinous Neoplasm. Ann. Surg. Oncol. 2022, 29, 4683–4685.
46. Kuwahara, T.; Hara, K.; Mizuno, N.; Okuno, N.; Matsumoto, S.; Obata, M.; Kurita, Y.; Koda, H.; Toriyama, K.; Onishi, S.; et al. Usefulness of Deep Learning Analysis for the Diagnosis of Malignancy in Intraductal Papillary Mucinous Neoplasms of the Pancreas. Clin. Transl. Gastroenterol. 2019, 10, e00045.
47. Nakai, Y.; Iwashita, T.; Park, D.H.; Samarasena, J.B.; Lee, J.G.; Chang, K.J. Diagnosis of pancreatic cysts: EUS-guided, through-the-needle confocal laser-induced endomicroscopy and cystoscopy trial: DETECT study. Gastrointest. Endosc. 2015, 81, 1204–1214.
48. Kamboj, A.K.; Modi, R.M.; Swanson, B.; Conwell, D.L.; Krishna, S.G. A comprehensive examination of the novel techniques used for in vivo and ex vivo confocal laser endomicroscopy of pancreatic cystic lesions. VideoGIE 2016, 1, 6–7.
49. Chin, Y.K.; Wu, C.C.H.; Tan, D.M.Y. The Role of Needle-Based Confocal Laser Endomicroscopy in the Evaluation of Pancreatic Cystic Lesions: A Systematic Review. Clin. Endosc. 2021, 54, 38–47.
50. Facciorusso, A.; Buccino, V.R.; Sacco, R. Needle-based confocal laser endomicroscopy in pancreatic cysts: A meta-analysis. Eur. J. Gastroenterol. Hepatol. 2020, 32, 1084–1090.
51. Napoleon, B.; Krishna, S.G.; Marco, B.; Carr-Locke, D.; Chang, K.J.; Ginès, À.; Gress, F.G.; Larghi, A.; Oppong, K.W.; Palazzo, L.; et al. Confocal endomicroscopy for evaluation of pancreatic cystic lesions: A systematic review and international Delphi consensus report. Endosc. Int. Open 2020, 8, E1566–E1581.
52. Machicado, J.D.; Chao, W.L.; Carlyn, D.E.; Pan, T.Y.; Poland, S.; Alexander, V.L.; Maloof, T.G.; Dubay, K.; Ueltschi, O.; Middendorf, D.M.; et al. High performance in risk stratification of intraductal papillary mucinous neoplasms by confocal laser endomicroscopy image analysis with convolutional neural networks (with video). Gastrointest. Endosc. 2021, 94, 78–87.e72.
53. Dahiya, D.S.; Al-Haddad, M.; Chandan, S.; Gangwani, M.K.; Aziz, M.; Mohan, B.P.; Ramai, D.; Canakis, A.; Bapaye, J.; Sharma, N. Artificial Intelligence in Endoscopic Ultrasound for Pancreatic Cancer: Where Are We Now and What Does the Future Entail? J. Clin. Med. 2022, 11, 7476.
54. Mack, S.; Flattet, Y.; Bichard, P.; Frossard, J.L. Recent advances in the management of autoimmune pancreatitis in the era of artificial intelligence. World J. Gastroenterol. 2022, 28, 6867–6874.
55. Tacelli, M.; Zaccari, P.; Petrone, M.C.; Della Torre, E.; Lanzillotta, M.; Falconi, M.; Doglioni, C.; Capurso, G.; Arcidiacono, P.G. Differential EUS findings in focal type 1 autoimmune pancreatitis and pancreatic cancer: A proof-of-concept study. Endosc. Ultrasound 2022, 11, 216–222.
  56. Yousaf, M.N.; Chaudhary, F.S.; Ehsan, A.; Suarez, A.L.; Muniraj, T.; Jamidar, P.; Aslanian, H.R.; Farrell, J.J. Endoscopic ultrasound (EUS) and the management of pancreatic cancer. BMJ Open Gastroenterol. 2020, 7, e000408. [Google Scholar] [CrossRef]
  57. Thomsen, M.M.; Larsen, M.H.; Di Caterino, T.; Hedegaard Jensen, G.; Mortensen, M.B.; Detlefsen, S. Accuracy and clinical outcomes of pancreatic EUS-guided fine-needle biopsy in a consecutive series of 852 specimens. Endosc. Ultrasound 2022, 11, 306–318. [Google Scholar] [CrossRef]
  58. Guo, T.; Xu, T.; Zhang, S.; Lai, Y.; Wu, X.; Wu, D.; Feng, Y.; Jiang, Q.; Wang, Q.; Qian, J.; et al. The role of EUS in diagnosing focal autoimmune pancreatitis and differentiating it from pancreatic cancer. Endosc. Ultrasound 2021, 10, 280–287. [Google Scholar] [CrossRef] [PubMed]
  59. Marya, N.B.; Powers, P.D.; Chari, S.T.; Gleeson, F.C.; Leggett, C.L.; Abu Dayyeh, B.K.; Chandrasekhara, V.; Iyer, P.G.; Majumder, S.; Pearson, R.K.; et al. Utilisation of artificial intelligence for the development of an EUS-convolutional neural network model trained to enhance the diagnosis of autoimmune pancreatitis. Gut 2021, 70, 1335–1344. [Google Scholar] [CrossRef] [PubMed]
  60. McGuigan, A.; Kelly, P.; Turkington, R.C.; Jones, C.; Coleman, H.G.; McCain, R.S. Pancreatic cancer: A review of clinical diagnosis, epidemiology, treatment and outcomes. World J. Gastroenterol. 2018, 24, 4846–4861. [Google Scholar] [CrossRef]
  61. Spadaccini, M.; Koleth, G.; Emmanuel, J.; Khalaf, K.; Facciorusso, A.; Grizzi, F.; Hassan, C.; Colombo, M.; Mangiavillano, B.; Fugazza, A.; et al. Enhanced endoscopic ultrasound imaging for pancreatic lesions: The road to artificial intelligence. World J. Gastroenterol. 2022, 28, 3814–3824. [Google Scholar] [CrossRef] [PubMed]
  62. Machicado, J.D.; Obuch, J.C.; Goodman, K.A.; Schefter, T.E.; Frakes, J.; Hoffe, S.; Latifi, K.; Simon, V.C.; Santangelo, T.; Ezekwe, E.; et al. Endoscopic Ultrasound Placement of Preloaded Fiducial Markers Shortens Procedure Time Compared to Back-Loaded Markers. Clin. Gastroenterol. Hepatol. 2019, 17, 2749–2758.e2742. [Google Scholar] [CrossRef] [PubMed]
  63. Goggins, M.; Overbeek, K.A.; Brand, R.; Syngal, S.; Del Chiaro, M.; Bartsch, D.K.; Bassi, C.; Carrato, A.; Farrell, J.; Fishman, E.K.; et al. Management of patients with increased risk for familial pancreatic cancer: Updated recommendations from the International Cancer of the Pancreas Screening (CAPS) Consortium. Gut 2020, 69, 7–17. [Google Scholar] [CrossRef] [PubMed]
  64. Kuwahara, T.; Hara, K.; Mizuno, N.; Haba, S.; Okuno, N.; Kuraishi, Y.; Fumihara, D.; Yanaidani, T.; Ishikawa, S.; Yasuda, T.; et al. Artificial intelligence using deep learning analysis of endoscopic ultrasonography images for the differential diagnosis of pancreatic masses. Endoscopy 2023, 55, 140–149. [Google Scholar] [CrossRef]
  65. Goyal, H.; Sherazi, S.A.A.; Gupta, S.; Perisetti, A.; Achebe, I.; Ali, A.; Tharian, B.; Thosani, N.; Sharma, N.R. Application of artificial intelligence in diagnosis of pancreatic malignancies by endoscopic ultrasound: A systemic review. Therap. Adv. Gastroenterol. 2022, 15, 17562848221093873. [Google Scholar] [CrossRef]
  66. Khalaf, K.; Terrin, M.; Jovani, M.; Rizkala, T.; Spadaccini, M.; Pawlak, K.M.; Colombo, M.; Andreozzi, M.; Fugazza, A.; Facciorusso, A.; et al. A Comprehensive Guide to Artificial Intelligence in Endoscopic Ultrasound. J. Clin. Med. 2023, 12, 3757. [Google Scholar] [CrossRef] [PubMed]
  67. Zhang, S.; Zhou, Y.; Tang, D.; Ni, M.; Zheng, J.; Xu, G.; Peng, C.; Shen, S.; Zhan, Q.; Wang, X.; et al. A deep learning-based segmentation system for rapid onsite cytologic pathology evaluation of pancreatic masses: A retrospective, multicenter, diagnostic study. EBioMedicine 2022, 80, 104022. [Google Scholar] [CrossRef] [PubMed]
  68. Ishikawa, T.; Hayakawa, M.; Suzuki, H.; Ohno, E.; Mizutani, Y.; Iida, T.; Fujishiro, M.; Kawashima, H.; Hotta, K. Development of a Novel Evaluation Method for Endoscopic Ultrasound-Guided Fine-Needle Biopsy in Pancreatic Diseases Using Artificial Intelligence. Diagnostics 2022, 12, 434. [Google Scholar] [CrossRef]
  69. D’Onofrio, M.; Mansueto, G.; Falconi, M.; Procacci, C. Neuroendocrine pancreatic tumor: Value of contrast enhanced ultrasonography. Abdom. Imaging 2004, 29, 246–258. [Google Scholar] [CrossRef]
  70. D’Onofrio, M.; Zamboni, G.; Faccioli, N.; Capelli, P.; Pozzi Mucelli, R. Ultrasonography of the pancreas. 4. Contrast-enhanced imaging. Abdom. Imaging 2007, 32, 171–181. [Google Scholar] [CrossRef]
  71. Dietrich, C.F.; Ignee, A.; Braden, B.; Barreiros, A.P.; Ott, M.; Hocke, M. Improved differentiation of pancreatic tumors using contrast-enhanced endoscopic ultrasound. Clin. Gastroenterol. Hepatol. 2008, 6, 590–597.e591. [Google Scholar] [CrossRef] [PubMed]
  72. Dietrich, C.F.; Braden, B.; Hocke, M.; Ott, M.; Ignee, A. Improved characterisation of solitary solid pancreatic tumours using contrast enhanced transabdominal ultrasound. J. Cancer Res. Clin. Oncol. 2008, 134, 635–643. [Google Scholar] [CrossRef] [PubMed]
  73. Cho, I.R.; Jeong, S.H.; Kang, H.; Kim, E.J.; Kim, Y.S.; Jeon, S.; Cho, J.H. Diagnostic performance of endoscopic ultrasound elastography for differential diagnosis of solid pancreatic lesions: A propensity score-matched analysis. Pancreatology 2023, 23, 105–111. [Google Scholar] [CrossRef] [PubMed]
  74. Li, Y.; Jin, H.; Liao, D.; Qian, B.; Zhang, Y.; Xu, M.; Han, S. Contrast-enhanced harmonic endoscopic ultrasonography for the differential diagnosis of pancreatic masses: A systematic review and meta-analysis. Mol. Clin. Oncol. 2019, 11, 425–433. [Google Scholar] [CrossRef]
  75. Zhang, J.; Zhu, L.; Yao, L.; Ding, X.; Chen, D.; Wu, H.; Lu, Z.; Zhou, W.; Zhang, L.; An, P.; et al. Deep learning-based pancreas segmentation and station recognition system in EUS: Development and validation of a useful training tool (with video). Gastrointest. Endosc. 2020, 92, 874–885.e873. [Google Scholar] [CrossRef]
  76. Takada, S.; Kato, H.; Saragai, Y.; Muro, S.; Uchida, D.; Tomoda, T.; Matsumoto, K.; Horiguchi, S.; Tanaka, N.; Okada, H. Contrast-enhanced harmonic endoscopic ultrasound using time-intensity curve analysis predicts pathological grade of pancreatic neuroendocrine neoplasm. J. Med. Ultrason. 2019, 46, 449–458. [Google Scholar] [CrossRef]
  77. Tang, A.; Tian, L.; Gao, K.; Liu, R.; Hu, S.; Liu, J.; Xu, J.; Fu, T.; Zhang, Z.; Wang, W.; et al. Contrast-enhanced harmonic endoscopic ultrasound (CH-EUS) MASTER: A novel deep learning-based system in pancreatic mass diagnosis. Cancer Med. 2023, 12, 7962–7973. [Google Scholar] [CrossRef]
  78. Săftoiu, A.; Vilmann, P.; Gorunescu, F.; Janssen, J.; Hocke, M.; Larsen, M.; Iglesias-Garcia, J.; Arcidiacono, P.; Will, U.; Giovannini, M.; et al. Efficacy of an artificial neural network-based approach to endoscopic ultrasound elastography in diagnosis of focal pancreatic masses. Clin. Gastroenterol. Hepatol. 2012, 10, 84–90.e81. [Google Scholar] [CrossRef]
  79. Wani, S.; Keswani, R.N.; Petersen, B.; Edmundowicz, S.A.; Walsh, C.M.; Huang, C.; Cohen, J.; Cote, G. Training in EUS and ERCP: Standardizing methods to assess competence. Gastrointest. Endosc. 2018, 87, 1371–1382. [Google Scholar] [CrossRef]
  80. Irisawa, A.; Yamao, K. Curved linear array EUS technique in the pancreas and biliary tree: Focusing on the stations. Gastrointest. Endosc. 2009, 69, S84–S89. [Google Scholar] [CrossRef]
  81. Price, W.N. Big data and black-box medical algorithms. Sci. Transl. Med. 2018, 10, eaao5333. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Machine learning and deep learning are the two prominent AI architectures widely employed in medicine. The figure presents typical examples of techniques belonging to each architecture.
Figure 2. The applications of EUS-AI in gastrointestinal and pancreatic diseases, including identifying and evaluating lesions, predicting prognosis, planning treatment, and controlling quality in the diagnostic process. The figure also points out the broad prospects of EUS-AI.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Huang, J.; Fan, X.; Liu, W. Applications and Prospects of Artificial Intelligence-Assisted Endoscopic Ultrasound in Digestive System Diseases. Diagnostics 2023, 13, 2815. https://doi.org/10.3390/diagnostics13172815

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
