Review

Artificial Intelligence in Endoscopic Ultrasonography-Guided Fine-Needle Aspiration/Biopsy (EUS-FNA/B) for Solid Pancreatic Lesions: Opportunities and Challenges

Department of Gastroenterology, Ruijin Hospital, School of Medicine, Shanghai Jiao Tong University, Shanghai 200025, China
*
Authors to whom correspondence should be addressed.
Diagnostics 2023, 13(19), 3054; https://doi.org/10.3390/diagnostics13193054
Submission received: 1 August 2023 / Revised: 6 September 2023 / Accepted: 6 September 2023 / Published: 26 September 2023
(This article belongs to the Special Issue 2nd Edition: AI/ML-Based Medical Image Processing and Analysis)

Abstract
Solid pancreatic lesions (SPLs) encompass a variety of benign and malignant diseases, and accurate diagnosis is crucial for guiding appropriate treatment decisions. Endoscopic ultrasonography-guided fine-needle aspiration/biopsy (EUS-FNA/B) serves as a front-line diagnostic tool for pancreatic mass lesions and is widely used in clinical practice. Artificial intelligence (AI) is a computational technique that automates the learning and recognition of data patterns. Its strong self-learning ability and unbiased nature have led to its gradual adoption in the medical field. In this paper, we describe the fundamentals of AI and summarize reports on AI in EUS-FNA/B to help endoscopists understand and realize its potential in improving pathological diagnosis and guiding targeted EUS-FNA/B. However, AI models have limitations and shortcomings that need to be addressed before clinical use. Furthermore, as most AI studies are retrospective, large-scale prospective clinical trials are necessary to evaluate their clinical usefulness accurately. Although AI in EUS-FNA/B is still in its infancy, the constant input of clinical data and advancements in computer technology are expected to make computer-aided diagnosis and treatment increasingly feasible.

1. Introduction

Solid pancreatic lesions (SPLs) comprise a variety of both benign and malignant diseases, including pancreatic ductal adenocarcinoma (PDAC), pancreatic neuroendocrine tumors (PNETs), focal pancreatitis, pancreatic tuberculosis, and pancreatic metastasis [1]. Accurate diagnosis plays a crucial role in guiding appropriate treatment decisions. The clinical presentation of solid pancreatic lesions is highly variable and primarily depends on the histological pattern, location, and size of the lesion. Typically, a pancreatic head tumor reduces exocrine function or obstructs the biliary duct, both of which can result in jaundice [2]. In contrast, a mass in the pancreatic body or tail is frequently asymptomatic [3]. PDAC is one of the most frequent types of pancreatic lesions and the fifth leading cause of cancer death worldwide, with an overall 5-year survival rate of less than 5% [4]. Only around 10% of patients qualify for potentially curative surgery at diagnosis, survival gains have been negligible, and patients commonly suffer from disease recurrence [5]. If the SPL is a PNET, particularly a functional one, its symptoms relate to the hormones it produces, such as insulin, gastrin, vasoactive intestinal peptide, glucagon, somatostatin, and serotonin, which makes it easier to identify [6]. Because the clinical course and prognosis of pancreatic masses vary, therapeutic choices rely largely on the capacity to identify or rule out malignancy. Therefore, establishing an optimal treatment strategy depends heavily on a correct diagnosis. Endoscopic ultrasound (EUS) is a well-established tool for evaluating pancreatic lesions. EUS offers high-spatial-resolution observation of the pancreas owing to the transducer’s proximity to the organ; its sensitivity for identifying pancreatic cancer has reached 94% [7,8].
Endoscopists can also perform fine-needle aspiration/biopsy (FNA/B) of tumors under EUS guidance to obtain cytological and histological diagnoses. Additionally, contrast-enhanced harmonic EUS (CH-EUS) and EUS elastography provide complementary information to conventional EUS in diagnosing pancreatic lesions, leading to more accurate diagnoses [8]. The use of contrast agents in EUS can provide valuable information on the microvasculature and perfusion within organs of interest, and hypoenhancing masses have been shown to be indicators of malignant tumors [9]. Across previously reported studies, CH-EUS shows a sensitivity of 80% to 96% and a specificity of 64% to 100% for SPLs, particularly in differentiating PDACs from other pancreatic masses [10]. Using CH-EUS as an adjunct to EUS-FNA yields higher diagnostic sensitivity in the assessment of pancreatic masses than standard EUS-FNA alone (CH-EUS-FNA, 84.6%; EUS-FNA, 75.3%) [11]. EUS elastography, meanwhile, serves as a valuable complement to tissue sampling: it is used to guide fine-needle punctures and to aid in determining further clinical treatment plans [12]. Elastography has been reported to demonstrate extremely high sensitivity (92% to 98%) in the detection of malignant pancreatic tumors [13]. In particular, it has an exceptionally high negative predictive value for diagnosing PDAC in small pancreatic lesions [14].
EUS-FNA has been considered a safe and accurate procedure for diagnosing pancreatic lesions since the study reported by Vilmann et al. in 1992 [15]. According to the guidelines of the European Society of Gastrointestinal Endoscopy (ESGE) and the National Comprehensive Cancer Network (NCCN), SPL patients without a definitive diagnosis should undergo EUS-FNA for pathological diagnosis [16,17]. Furthermore, EUS-FNA is recommended for pathological diagnosis before neoadjuvant therapy is administered and for patients with locally advanced, unresectable pancreatic cancer or metastatic disease [17]. The reported sensitivity and specificity of FNA for pancreatic cancer are 85–92% and 96–98%, respectively [18,19]. Compared to EUS-FNA, EUS-FNB can collect a larger amount of tissue and preserve the associated tissue architecture, which aids in the definitive diagnosis of suspicious pancreatic lesions. Recent studies have found that EUS-FNB has better diagnostic accuracy than EUS-FNA for suspicious pancreatic lesions and requires fewer needle passes, leading to a shorter time to diagnosis [20,21]. However, these studies did not standardize needle sizes or lesion locations. When analyses are restricted to a 22G needle, the diagnostic accuracy of FNB has not been proven significantly better than that of FNA [22]. While a network meta-analysis suggests that Franseen and fork-tip needles may offer superior performance in the tissue sampling of pancreatic masses, the confidence in these estimates remains low [23]. Therefore, EUS-FNA/B serves as a first-line minimally invasive diagnostic tool for pancreatic mass lesions and is widely used in clinical practice, especially when the diagnosis or staging of the disease is unclear or when neoadjuvant therapy is planned.
In the past decade, an increasing number of medical centers have performed EUS-FNA/B to improve diagnostic accuracy for pancreatic masses. EUS-FNA/B is a multistep procedure and its accuracy is influenced by various factors, including rapid on-site evaluation (ROSE) with cytopathologist involvement [24]. ROSE is a technique in which experienced cytopathologists assess the quality of aspirate or biopsy samples on-site through rapid smear creation and staining. In EUS-guided tissue acquisition, ROSE is used to assess sample adequacy and the nature of lesions in real time, which can enhance diagnostic accuracy, reduce needle passes, and decrease the proportion of inadequate samples [25,26]. However, not all EUS centers have cytopathologists on staff and, even when they are available, the ability, experience, and attentiveness of the cytopathologist are critical to accurate lesion recognition. Thus, there is a pressing need for new technologies that address objective recognition and image processing to assist in disease diagnosis.
Artificial intelligence (AI) has recently been gradually adopted in the medical field owing to its strong self-learning ability and unbiased nature. Although the use of AI for SPLs is still developing compared with other fields, its application to EUS-FNA/B has shown promising potential [27,28,29]. In this paper, we provide a comprehensive review of the progress and current prospects of AI-assisted EUS-FNA/B for the diagnosis and differential diagnosis of SPLs.

2. Definitions of Artificial Intelligence, Machine Learning, and Deep Learning

Several publications have discussed AI, machine learning (ML), and deep learning (DL), yet confusion around the terminology still exists. These terms are closely related but cannot be used interchangeably (Figure 1). We therefore aim to explain these concepts to the clinical audience in an accessible manner that avoids technical jargon.
AI is the ability of digital computers or computer-controlled robots to interpret information or perform tasks commonly associated with human intelligence [30]. It can be classified into three categories: artificial narrow intelligence (ANI, also known as weak AI), artificial general intelligence (AGI, also known as strong AI), and artificial superintelligence (ASI) [31]. ANI is goal-oriented and utilized to perform particular or limited tasks. Almost all AI systems, including medical AI systems, belong to the ANI category [32]. In contrast, AGI remains a theoretical concept that has not yet been achieved. An AGI system would consist of thousands of ANI systems and possess human-level cognitive function, allowing it to solve problems without human intervention [33]. Furthermore, ASI holds the potential to surpass human civilization and is the ultimate goal of AI creation [34]. However, it is unlikely to become a reality within the foreseeable future.
ML is a branch of AI that involves the use of algorithms to extract features from available data in order to make accurate predictions [35]. The demand for biomedical images and automatic analysis has led to significant advances in ML over the past decade, with a wide range of techniques, such as support vector machines (SVMs), random forests (RFs), decision trees, logistic regression analysis, and neural networks, being employed [36]. DL is a branch of ML based on multilayered neural networks; it is characterized by a large number of interconnected elements that can automatically extract features from data, akin to the functioning of the human brain [36]. Convolutional neural networks (CNNs) are a common DL method primarily used to handle data with a grid-like topology, such as images (a 2D grid of pixels) or videos (a 3D grid of pixels). Initially, the CNN model is trained using a large collection of labeled images. Composed of multiple convolutional layers, activation functions, and pooling layers, the CNN automatically extracts features from the data [37]. Once trained, it can quickly and efficiently analyze new input images. CNNs have shown exceptional performance in analyzing and classifying medical images. In specific tasks, such as the detection of skin cancer or the identification of PDAC, CNNs trained on annotated datasets have been shown to exceed the accuracy of human experts [38,39,40,41].
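The convolutional layer, activation function, and pooling layer mentioned above can be illustrated with a toy example. The following pure-Python sketch applies one convolution, a ReLU activation, and one max-pooling step to a tiny synthetic image; the image, kernel, and sizes are illustrative choices, not taken from any cited study:

```python
def conv2d(image, kernel):
    """Valid (no-padding) 2D convolution of a 2D list by a 3x3 kernel."""
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - 2):
        row = []
        for j in range(w - 2):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(3) for dj in range(3))
            row.append(s)
        out.append(row)
    return out

def relu(feature_map):
    """Element-wise rectified linear activation: keep positive responses."""
    return [[max(0.0, v) for v in row] for row in feature_map]

def max_pool(feature_map, size=2):
    """Non-overlapping max pooling; halves each spatial dimension."""
    h, w = len(feature_map), len(feature_map[0])
    return [[max(feature_map[i + di][j + dj]
                 for di in range(size) for dj in range(size))
             for j in range(0, w - size + 1, size)]
            for i in range(0, h - size + 1, size)]

# A 6x6 grayscale "image" with a bright vertical stripe.
image = [[1.0 if 2 <= j <= 3 else 0.0 for j in range(6)] for _ in range(6)]
# Sobel-like vertical-edge kernel (an arbitrary illustrative choice).
kernel = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]

# conv -> relu -> pool: a single "layer" of the kind a CNN stacks many of.
features = max_pool(relu(conv2d(image, kernel)))
```

Real CNNs stack many such layers with learned (not hand-written) kernels and run on optimized frameworks; the point here is only that each building block is a simple, local numeric operation.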

3. Use of Artificial Intelligence in EUS-FNA/B

As computational power continues to increase and clinical demand grows, there have been significant advances in the utilization of AI to interpret complex images, particularly in EUS-FNA/B. In the field of pancreatic EUS-FNA/B, AI is predominantly used to aid in pathological diagnosis and is used less frequently in real-time puncture site guidance. The following sections will delve into the topics mentioned above in greater detail.

3.1. AI and Digital Pathology

Pathological images are a crucial form of biomedical imagery used for clinical pathological diagnoses, offering intuitive and valuable insights. Microscopic examination of these images is considered the gold standard for accurately determining the nature and presence of diseases during the diagnostic process [42]. Historically, the analysis of pancreatic specimens obtained via FNA or FNB has been the domain of professional pathologists. However, achieving precise pathological diagnoses and classifications is time- and labor-intensive, requiring pathologists to identify cellular and tissue characteristics and patterns indicative of pathological changes. Although training and standard guidelines can facilitate the harmonization of analytical processes, the subjectivity of pathological analysis and differences in visual perception, data integration, and judgment among independent observers inherently limit its reliability [43]. As a result, even pathologists with equivalent training may encounter diagnostic inconsistencies and discrepancies in opinion.
Digital pathology (DP) is the process of digitizing pathology information, including its acquisition, management, sharing, and interpretation, in a digital environment [44]. This technology enables the transformation of glass slides into digital ones that can be viewed on a computer monitor, offering two main benefits: improved efficiency and productivity and the integration of computer-aided diagnostic techniques [45]. With DP, team annotation of slides is possible, providing pathologists with greater flexibility in work schedules and remote access to pathology data. This technology also facilitates faster consultation telepathology turnaround times; delivers immediate access to previously archived digital slides; and streamlines data retrieval, matching, and organization [46]. Moreover, digital pathology algorithms enable the automatic quantification and analysis of pathology data, providing greater consistency and diagnostic accuracy than light microscopy and glass slides [47]. Given these advantages, DP is seeing increasing use for diagnostic, educational, and research purposes and is on the verge of becoming a mainstream option for routine diagnostics [48].
The potential for AI development in supporting pathology diagnosis, particularly in image analysis and disease detection, is significant. When applied to DP, AI algorithms can enhance the accuracy and reproducibility of morphological variables that pathologists traditionally assess. These algorithms can mine image features from DP slides, including visible morphology and spatial features, such as nuclear and gland size, shape, and tissue architecture. Furthermore, AI can extract features that pathologists may not recognize, such as intensity, texture, and spectral features [49]. These complex features can then be utilized to train models and perform specific segmentation, diagnostic, or prognostic tasks.
Feature extraction in AI involves two general approaches: supervised learning and unsupervised learning [50]. In supervised learning, features are identified based on the regions of interest (ROIs) annotated by pathologists in the images. These identified ROIs can be linked to specific, measurable attributes in the image and have some degree of explainability. Conversely, unsupervised learning uses algorithms to recognize patterns and similarities in image properties among training exemplars. These patterns may coincide with existing morphological classifiers, but in some cases they may be unknown to pathologists. To ensure reliable diagnostic performance of AI algorithms in the medical field, it is crucial to evaluate their diagnostic reliability rigorously. Interpretability methods, such as Grad-CAM and AGF-Visualization, generate visual explanations for corresponding class labels, increasing the transparency of AI algorithms and enabling human scrutiny to detect undesirable AI behavior (Figure 2) [51,52]. The combination of AI and DP can provide objective measurement criteria and metrics for each case, renewing pathologists’ interest in AI-assisted evaluation. This technology has already been authorized for clinical use in some regions [53].
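To make the Grad-CAM idea concrete, its core combination step can be sketched in a few lines: each convolutional feature map is weighted by the global average of the class-score gradients flowing into it, and the weighted sum is passed through a ReLU to keep only regions that argue for the class. The feature maps and gradients below are made-up stand-ins for what a trained network would produce:

```python
def grad_cam(feature_maps, gradients):
    """feature_maps, gradients: lists of 2D lists, one per channel k.

    Returns the class-discriminative heatmap ReLU(sum_k w_k * A_k),
    where w_k is the global-average-pooled gradient for channel k.
    """
    h = len(feature_maps[0])
    w = len(feature_maps[0][0])
    heatmap = [[0.0] * w for _ in range(h)]
    for a_k, g_k in zip(feature_maps, gradients):
        # w_k: global average pooling of the gradient map for channel k.
        w_k = sum(sum(row) for row in g_k) / (h * w)
        for i in range(h):
            for j in range(w):
                heatmap[i][j] += w_k * a_k[i][j]
    # ReLU: keep only locations with a positive influence on the class.
    return [[max(0.0, v) for v in row] for row in heatmap]

# Two hypothetical 2x2 feature maps and their class-score gradients.
maps  = [[[1.0, 0.0], [0.0, 2.0]], [[0.0, 3.0], [1.0, 0.0]]]
grads = [[[0.4, 0.4], [0.4, 0.4]], [[-0.2, -0.2], [-0.2, -0.2]]]
cam = grad_cam(maps, grads)
# Channel 0 (positive weight) highlights its active regions; channel 1's
# negative weight suppresses locations that argue against the class.
```

Overlaying such a heatmap on the input image is what produces the visual explanations referred to above, letting a pathologist check whether the model attended to diagnostically meaningful regions.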

3.2. AI in Assisting with Pathological Diagnosis

Recent domestic and international studies demonstrate that AI’s diagnostic accuracy for DP images is comparable to that of senior pathologists, providing faster, more accurate, efficient, and collaborative pathological diagnoses [39,54,55]. AI can be particularly valuable in supporting clinical pathological diagnosis when pathologists are unavailable. As early as 1998, AI was reported to assist in diagnosing DP slides, marking a starting point in the pursuit of computer-aided early diagnosis of pancreatic cancer [56]. The limited research to date integrating AI with pancreatic pathological diagnosis focuses on extracting nuclear features related to DNA content and chromatin distribution from ERCP cytological specimens and surgically resected histological specimens [57,58,59]. For instance, Song et al. developed and assessed an SVM model for automatically diagnosing and grading PDAC based on morphologic features found on histology slides, achieving an accuracy of 94.38% in binary classification between PDAC and normal tissue [60]. This outcome suggests tremendous potential for this model as a valuable supplement to the morphological evaluation of tumor biological characteristics. However, there remains a significant gap in AI-assisted diagnosis of DP images from specimens obtained via EUS-guided sampling.
Table 1 summarizes published studies that have used artificial intelligence to analyze DP images of EUS-FNA/B samples, particularly those of SPLs. In 2017, Momeni-Boroujeni et al. reported the use of a multilayer perceptron neural network (MNN) to classify pancreatic specimens obtained by EUS-FNA as benign or malignant, the first study of cytological analysis using FNA/FNB samples [61]. The process involved using a K-means clustering algorithm to segment images of cell clusters collected from FNA and extracting their morphological features. The MNN was then trained on differences in significant morphological features between malignant and benign images, such as contour, perimeter, and area. The MNN achieved 100% accuracy in discriminating between benign and malignant pancreatic cytology, while 77% accuracy was achieved on the atypical dataset. Additionally, a few original research papers and conference abstracts on the pathological classification of solid pancreatic masses have been published. These studies used small samples of cytopathological slides obtained through EUS-FNA and showed limited diagnostic performance in single-center validation (accuracy range: 80–94%) [54,55,62]. Hyperspectral imaging (HSI) is a new optical diagnostic technology that combines imaging with spectroscopy. It measures the interaction between tissue and light through an HSI camera, capturing spectral features that conventional imaging modalities cannot obtain [63]. In this way, HSI can provide additional diagnostic information for identification and differentiation. Qin et al. developed a CNN model combined with HSI technology that used informative spectral features to distinguish benign from malignant pancreatic cytology [64].
Comparing the AI model’s diagnostic performance on HSI images with that on conventional RGB images shows that the spectral information made it easier for the CNN model to identify PDAC cells in cytological slides (HSI accuracy, 88.05%; RGB accuracy, 82.47%). The HSI-based model was also shown to generalize well (internal test dataset: accuracy, 92.04%; external test dataset: accuracy, 92.27%). In 2022, Zhang et al. used a novel deep CNN (DCNN) system to segment stained cell clusters and identify pancreatic cancer during ROSE in EUS-FNA [39]. This is the first and largest known study to establish a deep learning system for identifying PDAC in ROSE, including 6667 images from 194 cases and achieving an accuracy of 94.4% on the internal testing dataset. Additionally, the DCNN system demonstrated outstanding generalization on external testing datasets, with accuracies of 91.2–95.8%. Moreover, its accuracy was comparable to that of cytopathologists, with high sensitivity and negative predictive value (NPV). These results suggest that deploying the DCNN system in clinical settings to support ROSE may increase the diagnostic yield of EUS-FNA. In the same year, Lin et al. reported a ROSE-AI model that substitutes for manual ROSE during EUS-FNA [65]. It performed well in detecting cancer cells, with 83.4% accuracy on the internal validation dataset and a similar result on the external validation dataset (88.7%). Implementing the ROSE-AI model can speed up slide evaluation and shorten endoscopists’ wait times. Although AI has achieved promising results in ROSE, prospective validation studies are necessary to provide high-level evidence in actual clinical practice.
We believe that future AI strategies will alleviate the problem of insufficient pathological resources and aid endoscopists in performing ROSE, thereby improving the accuracy of pancreatic disease diagnoses.
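The segmentation-then-features pipeline described for the Momeni-Boroujeni et al. study can be sketched schematically. In this toy example, a hypothetical 4×4 grayscale patch is clustered by intensity with a tiny K-means to separate darkly stained cell clusters from background, and a simple morphological feature (area as pixel count) is measured; the published pipeline used real cytology images, richer features (contour, perimeter), and a trained MNN classifier:

```python
def kmeans_1d(values, k=2, iters=20):
    """Tiny 1D K-means over pixel intensities; returns sorted centroids.

    Initialization uses min/max, which is adequate for this k=2 sketch.
    """
    centroids = [min(values), max(values)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda c: abs(v - centroids[c]))
            clusters[idx].append(v)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

def segment(image, centroids):
    """Label each pixel 1 (darkly stained cell cluster) or 0 (background)."""
    dark, bright = centroids
    return [[1 if abs(v - dark) < abs(v - bright) else 0 for v in row]
            for row in image]

# Hypothetical 4x4 grayscale patch: low values = darkly stained cells.
patch = [[0.1, 0.2, 0.9, 0.8],
         [0.1, 0.1, 0.9, 0.9],
         [0.2, 0.1, 0.8, 0.9],
         [0.9, 0.9, 0.9, 0.8]]
cents = kmeans_1d([v for row in patch for v in row])
mask = segment(patch, cents)
area = sum(sum(row) for row in mask)  # simple morphological feature
```

Features like `area` (and, in the published work, contour and perimeter) then become the inputs on which a classifier such as an MNN is trained to separate benign from malignant cytology.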
The development of FNB needles has made it possible to collect larger tissue samples with fewer needle passes. With the introduction of EUS-FNB, several researchers contend that ROSE may no longer be required to minimize the number of needle passes [71,72]. In 2021, Naito et al. developed a CNN model for evaluating PDAC in EUS-FNB whole-slide images (WSIs), achieving a high ROC-AUC of 0.984 and an accuracy of 94.17% [68]. This model can assist in obtaining accurate histopathological diagnoses while avoiding interference from abundant blood, inflammatory cells, and digestive tract cells. However, a global survey of ROSE indicates that only 50% of Asian endoscopy centers meet the qualification standards for this technique [73]. In such cases, researchers suggest that a fairer and more reproducible evaluation method should be developed. Macroscopic on-site evaluation (MOSE) refers to the visual assessment of samples obtained during EUS-FNA/B and can serve as an alternative to ROSE [74]. Ordinarily, tumor tissues are white or flesh-colored while blood clots are red. During MOSE, endoscopists transfer the puncture samples onto a glass slide and make a preliminary separation to observe the length of the white tissue cores, thereby assessing the adequacy of the aspiration or biopsy. This step provides an important basis for ensuring diagnostic accuracy. Nonetheless, evaluating specimen adequacy currently relies on the endoscopist’s subjective judgment, which largely depends on their level of experience. As such, it is crucial to develop more objective and reproducible evaluation methods for MOSE. In 2022, Ishikawa et al. reported a contrastive learning-based CNN model that achieved an accuracy (84.4%) comparable to endoscopists in evaluating the diagnosability of EUS-FNB specimens in MOSE [70]. This suggests that, in the future, AI-based evaluation methods may replace manual MOSE, resulting in significant time savings and increased productivity.

3.3. AI in Guiding Targeted EUS-FNA/B

Although AI has made significant strides in pathological diagnosis, few reports have investigated its potential in lesion recognition and localization during EUS-FNA/B. CH-EUS is a cutting-edge technology that uses microbubble contrast agents to visualize microvessels and parenchymal perfusion, resulting in better characterization of pancreatic lesions detected by EUS [75]. Compared to conventional EUS, CH-EUS enhances the observation of pancreatic tumors and assists in identifying the various pathological areas within pancreatic lesions. Combining CH-EUS with EUS-FNA notably captures subtle lesions that are not distinguishable on conventional EUS, thereby avoiding the sampling of necrotic areas and reducing the need for additional needle passes [76,77]. In a retrospective study comparing diagnostic accuracy and sampling adequacy between CH-EUS-FNA and conventional EUS-FNA groups, biopsy specimens were more frequently obtained in the CH-EUS-FNA group (96.6%) than in the EUS-FNA group (86.7%), with no significant difference in diagnostic accuracy [78]. Additionally, time-intensity curve (TIC) analysis has been used to achieve objective quantitative analysis of SPLs during CH-EUS; it includes variables such as maximum intensity gain, echo intensity reduction rate, and time to peak, enabling SPL classification based on enhancement patterns [79,80,81]. In 2023, Tang et al. developed an innovative auxiliary diagnosis system (CH-EUS MASTER) that uses AI models to guide targeted EUS-FNA/B procedures, the first time this technology has been used in this way [82]. By employing DCNN and RF algorithms, CH-EUS MASTER achieves three crucial functions: real-time pancreatic mass capture and segmentation under CH-EUS, identification of benign and malignant pancreatic masses according to TIC characteristics, and identification of and guidance toward the target area for EUS-FNA.
Endoscopists can perform further puncture procedures based on the ROIs predicted by CH-EUS MASTER with a remarkable accuracy rate of 93.8%, a sensitivity rate of 90.9%, and a specificity rate of 100%. Significantly, CH-EUS MASTER-guided EUS-FNA can improve the first-pass diagnostic yield (80.0% vs. 33.3%) compared to traditional EUS-FNA. Therefore, AI has the potential to assist in the pathological diagnosis of EUS-FNA/B and play a crucial role in guiding puncture sites, allowing inexperienced endoscopists to shorten their learning cycles. Although AI use in guiding targeted EUS-FNA/B is a relatively new field, future research in this area could produce innovative advancements.
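The TIC variables named above (maximum intensity gain, time to peak, and echo intensity reduction rate) can be sketched as simple computations over a sampled curve. The definitions and the 10-second washout window below are simplified working assumptions for illustration, not the exact formulas used in the cited studies, and the curve itself is hypothetical:

```python
def tic_features(times, intensities, washout_window=10.0):
    """Compute simplified TIC variables from a sampled time-intensity curve.

    times: seconds after contrast injection; intensities: echo intensity
    in arbitrary units. Assumes the curve peaks before its final sample.
    """
    baseline = intensities[0]
    peak = max(intensities)
    peak_idx = intensities.index(peak)
    max_gain = peak - baseline          # maximum intensity gain
    time_to_peak = times[peak_idx]      # time to peak
    # Echo intensity reduction rate: intensity lost per second over the
    # washout window after the peak (sample closest to peak + window).
    end_idx = min(range(len(times)),
                  key=lambda i: abs(times[i] - (times[peak_idx] + washout_window)))
    reduction_rate = (peak - intensities[end_idx]) / (times[end_idx] - times[peak_idx])
    return max_gain, time_to_peak, reduction_rate

# Hypothetical TIC sampled every 2 s after contrast injection.
t = [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
i = [5, 20, 45, 60, 55, 48, 42, 37, 33, 30]
gain, ttp, rate = tic_features(t, i)
# Under the enhancement-pattern logic described above, hypoenhancement
# with rapid washout (a high reduction rate) would point toward PDAC.
```

Quantities of this kind are what an RF classifier, as in CH-EUS MASTER, could take as input features when distinguishing benign from malignant masses by enhancement pattern.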

4. The Limitations and Shortcomings of Artificial Intelligence in EUS-FNA/B

Although AI models are still in their infancy, they have already proven to be quite useful in organizing patient treatment procedures and assisting with medical decision making. However, many challenges remain to be addressed, particularly in terms of achieving an accurate diagnosis of specimens in EUS-FNA/B. Like any diagnostic tool, AI-assisted diagnostic models have their own set of limitations and shortcomings that need to be overcome before they can be considered reliable diagnostic methods for SPLs.
Building confidence in AI-assisted diagnostic models as a valuable tool in modern medicine requires addressing one of the most significant limitations, known as the “opaqueness” of AI, where the reasoning and recognition of the computer are not visible, leading to the “black box problem” [83]. This phenomenon can result in misdiagnosis without a clear understanding of why a particular decision was made, creating a fatal flaw in evidence-driven medicine. One potential solution to this issue is using inherently interpretable models that allow visualization of the regions recognized by AI as being important [84]. Another suggestion is conducting meticulous quality assessments before implementing AI models in the clinic to prevent physicians from relying solely on AI models to evaluate clinical outcomes [85]. This raises intricate regulatory and ethical considerations. AI systems designed for medical applications must be subjected to rigorous scrutiny and long-term validation to secure official certification.
Another major concern is the lack of standardization of the input data used to train AI models [86]. Establishing uniform data collection, processing, storage, reproduction, and analysis protocols is essential to ensure consistency. Without standardized data, the same AI model could produce vastly different outcomes for the same patients, reinforcing bias and leading to poor prognoses, which may reduce the acceptance of AI technology. Additionally, an AI model trained in a specific environment may not perform equally well in different environments or on different devices. For instance, a deep learning CNN trained to accurately classify pancreatic biopsies stained with hematoxylin and eosin (H&E) may perform poorly or not at all on pancreatic biopsies prepared and stained with Papanicolaou (PAP). Staining quality is another factor that affects standardization. Creating pathological smears requires experienced cytotechnologists to ensure that smears are uniform in thickness, well concentrated, and evenly distributed, facilitating observation. Moreover, during staining, operators should ensure that the cells are fully covered by dye solutions to avoid blurry staining results; untimely drying after staining can also lead to uneven staining, affecting image quality. Although creating uniform protocols for input data will be laborious and expensive, they are necessary to improve the generalization of AI models.
Other factors that limit the development of AI in the field of EUS-FNA/B include the inability to fully utilize image information and the high cost of image annotation. In particular, algorithm operation relies on the graphics processing unit (GPU), but current GPU memory capacity is limited, making it challenging to fully utilize all the information in whole-slide images (WSIs) or other image formats, which can result in the loss of useful information [87,88]. Additionally, supervised learning is the common approach for most CNNs used in deep learning, requiring pathologists to accurately label ROIs in the images, which adds to the cost [89]. Furthermore, AI analysis relies on high-quality training datasets, which require a substantial number of training images and can be time-consuming to prepare [90].
Regarding the application of AI in guiding puncture sites during EUS-FNA/B, several limitations and shortcomings need to be considered beyond the previously discussed data standardization issue. One critical limitation is that dynamic image recognition remains challenging for AI [91]. EUS images are susceptible to external factors that cause image jitter and displacement, such as a patient’s breathing and heartbeat. AI models must perform real-time correction and registration of EUS images to compensate for these discrepancies.
Currently, there is a growing desire to use artificial intelligence as an alternative to tissue sampling, thereby eliminating the procedure and its associated adverse events. As such, there is increasing interest in using AI-assisted EUS for the diagnosis of pancreatic lesions, mainly due to its relatively low cost and minimal invasiveness. Although an increasing body of research supports the superiority of AI-assisted EUS over traditional human interpretation in diagnostic accuracy, most clinicians remain cautious about its widespread application in clinical practice [86]. However, with ongoing improvements in AI algorithms and the quality of EUS images, AI-assisted EUS models have the potential to replace traditional EUS-FNA/B as the gold standard for diagnosing SPLs.
Despite the progress made by AI in the field of digestive endoscopy, its application in actual clinical practice has been limited by insufficient medical data and the need for high accuracy. Figure 3 provides an overview of the limitations linked to AI in EUS-FNA/B. To accelerate the utilization of AI for clinical diagnosis and treatment, it is essential to conduct prospective and multicenter research studies encompassing a wide range of medical images for AI model processing and analysis [92]. Such an approach would ensure the representativeness of the collected data and enhance the recognition of diagnostic results in the medical community.

5. Conclusions and Prospect

In summary, AI has been widely applied in the EUS-FNA/B field, significantly advancing automated pathological image diagnosis. Some AI models demonstrate diagnostic accuracy comparable to that of experienced pathologists. AI-assisted interpretation will further mitigate empirical misjudgments among pathologists, improving work efficiency and credibility. AI technology can also substitute for cytopathologists in ROSE, alleviating resource constraints, and can assess the diagnosable rate of puncture specimens by MOSE in place of endoscopists, saving operating time. Additionally, AI can identify the nature of the puncture site and guide targeted puncture areas during EUS-FNA/B, shortening endoscopists’ learning curves. Nevertheless, current clinical AI applications in this field remain limited. Thus, future studies should focus on multicenter research to analyze and process data, facilitate the clinical adoption of AI, and make computer-aided diagnosis and treatment more feasible.

Author Contributions

Conceptualization, C.Z. and D.Z.; methodology, C.Z. and T.R.; formal analysis, X.Q. and Y.Z.; data curation, X.Q. and T.R.; writing—original draft preparation, X.Q., T.R. and Y.C.; writing—review and editing, C.Z., Y.Z., D.W. and D.Z.; project administration, D.Z.; funding acquisition, D.W. and D.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Science and Technology Commission of Shanghai Municipality, grant numbers No. 21S31903500 and No. 21Y11908100.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets analyzed during the present research are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Iglesias-Garcia, J.; de la Iglesia-Garcia, D.; Olmos-Martinez, J.M.; Larino-Noia, J.; Dominguez-Munoz, J.E. Differential diagnosis of solid pancreatic masses. Minerva Gastroenterol. Dietol. 2020, 66, 70–81. [Google Scholar] [CrossRef] [PubMed]
  2. Strasberg, S.M.; Gao, F.; Sanford, D.; Linehan, D.C.; Hawkins, W.G.; Fields, R.; Carpenter, D.H.; Brunt, E.M.; Phillips, C. Jaundice: An important, poorly recognized risk factor for diminished survival in patients with adenocarcinoma of the head of the pancreas. HPB 2014, 16, 150–156. [Google Scholar] [CrossRef] [PubMed]
  3. Guarneri, G.; Gasparini, G.; Crippa, S.; Andreasi, V.; Falconi, M. Diagnostic strategy with a solid pancreatic mass. Presse Med. 2019, 48, e125–e145. [Google Scholar] [CrossRef] [PubMed]
  4. McGuigan, A.; Kelly, P.; Turkington, R.C.; Jones, C.; Coleman, H.G.; McCain, R.S. Pancreatic cancer: A review of clinical diagnosis, epidemiology, treatment and outcomes. World J. Gastroenterol. 2018, 24, 4846–4861. [Google Scholar] [CrossRef] [PubMed]
  5. Lucas, A.L.; Kastrinos, F. Screening for Pancreatic Cancer. JAMA 2019, 322, 407–408. [Google Scholar] [CrossRef]
  6. Ma, Z.Y.; Gong, Y.F.; Zhuang, H.K.; Zhou, Z.X.; Huang, S.Z.; Zou, Y.P.; Huang, B.W.; Sun, Z.H.; Zhang, C.Z.; Tang, Y.Q.; et al. Pancreatic neuroendocrine tumors: A review of serum biomarkers, staging, and management. World J. Gastroenterol. 2020, 26, 2305–2322. [Google Scholar] [CrossRef]
  7. Iglesias-Garcia, J.; Larino-Noia, J.; Dominguez-Munoz, J.E. When to puncture, when not to puncture: Pancreatic masses. Endosc. Ultrasound 2014, 3, 91–97. [Google Scholar] [CrossRef]
  8. Kitano, M.; Yoshida, T.; Itonaga, M.; Tamura, T.; Hatamaru, K.; Yamashita, Y. Impact of endoscopic ultrasonography on diagnosis of pancreatic cancer. J. Gastroenterol. 2019, 54, 19–32. [Google Scholar] [CrossRef]
  9. Iglesias-Garcia, J.; Lindkvist, B.; Larino-Noia, J.; Abdulkader-Nallib, I.; Dominguez-Munoz, J.E. Differential diagnosis of solid pancreatic masses: Contrast-enhanced harmonic (CEH-EUS), quantitative-elastography (QE-EUS), or both? United Eur. Gastroenterol. J. 2017, 5, 236–246. [Google Scholar] [CrossRef]
  10. Gong, T.T.; Hu, D.M.; Zhu, Q. Contrast-enhanced EUS for differential diagnosis of pancreatic mass lesions: A meta-analysis. Gastrointest. Endosc. 2012, 76, 301–309. [Google Scholar] [CrossRef]
  11. Facciorusso, A.; Mohan, B.P.; Crino, S.F.; Ofosu, A.; Ramai, D.; Lisotti, A.; Chandan, S.; Fusaroli, P. Contrast-enhanced harmonic endoscopic ultrasound-guided fine-needle aspiration versus standard fine-needle aspiration in pancreatic masses: A meta-analysis. Expert Rev. Gastroenterol. Hepatol. 2021, 15, 821–828. [Google Scholar] [CrossRef] [PubMed]
  12. Iglesias-Garcia, J.; de la Iglesia-Garcia, D.; Larino-Noia, J.; Dominguez-Munoz, J.E. Endoscopic Ultrasound (EUS) Guided Elastography. Diagnostics 2023, 13, 1686. [Google Scholar] [CrossRef] [PubMed]
  13. Dietrich, C.F.; Burmeister, S.; Hollerbach, S.; Arcidiacono, P.G.; Braden, B.; Fusaroli, P.; Hocke, M.; Iglesias-Garcia, J.; Kitano, M.; Larghi, A.; et al. Do we need elastography for EUS? Endosc. Ultrasound 2020, 9, 284–290. [Google Scholar] [CrossRef]
  14. Ignee, A.; Jenssen, C.; Arcidiacono, P.G.; Hocke, M.; Moller, K.; Saftoiu, A.; Will, U.; Fusaroli, P.; Iglesias-Garcia, J.; Ponnudurai, R.; et al. Endoscopic ultrasound elastography of small solid pancreatic lesions: A multicenter study. Endoscopy 2018, 50, 1071–1079. [Google Scholar] [CrossRef]
  15. Vilmann, P.; Jacobsen, G.K.; Henriksen, F.W.; Hancke, S. Endoscopic ultrasonography with guided fine needle aspiration biopsy in pancreatic disease. Gastrointest. Endosc. 1992, 38, 172–173. [Google Scholar] [CrossRef] [PubMed]
  16. Pouw, R.E.; Barret, M.; Biermann, K.; Bisschops, R.; Czako, L.; Gecse, K.B.; de Hertogh, G.; Hucl, T.; Iacucci, M.; Jansen, M.; et al. Endoscopic tissue sampling–Part 1: Upper gastrointestinal and hepatopancreatobiliary tracts. European Society of Gastrointestinal Endoscopy (ESGE) Guideline. Endoscopy 2021, 53, 1174–1188. [Google Scholar] [CrossRef]
  17. Tempero, M.A.; Malafa, M.P.; Al-Hawary, M.; Asbun, H.; Bain, A.; Behrman, S.W.; Benson, A.B., 3rd; Binder, E.; Cardin, D.B.; Cha, C.; et al. Pancreatic Adenocarcinoma, Version 2.2017, NCCN Clinical Practice Guidelines in Oncology. J. Natl. Compr. Canc. Netw. 2017, 15, 1028–1061. [Google Scholar] [CrossRef]
  18. Hewitt, M.J.; McPhail, M.J.; Possamai, L.; Dhar, A.; Vlavianos, P.; Monahan, K.J. EUS-guided FNA for diagnosis of solid pancreatic neoplasms: A meta-analysis. Gastrointest. Endosc. 2012, 75, 319–331. [Google Scholar] [CrossRef]
  19. Chen, G.; Liu, S.; Zhao, Y.; Dai, M.; Zhang, T. Diagnostic accuracy of endoscopic ultrasound-guided fine-needle aspiration for pancreatic cancer: A meta-analysis. Pancreatology 2013, 13, 298–304. [Google Scholar] [CrossRef]
  20. de Moura, D.T.H.; McCarty, T.R.; Jirapinyo, P.; Ribeiro, I.B.; Hathorn, K.E.; Madruga-Neto, A.C.; Lee, L.S.; Thompson, C.C. Evaluation of endoscopic ultrasound fine-needle aspiration versus fine-needle biopsy and impact of rapid on-site evaluation for pancreatic masses. Endosc. Int. Open 2020, 8, E738–E747. [Google Scholar] [CrossRef]
  21. Hassan, G.M.; Laporte, L.; Paquin, S.C.; Menard, C.; Sahai, A.V.; Masse, B.; Trottier, H. Endoscopic Ultrasound Guided Fine Needle Aspiration versus Endoscopic Ultrasound Guided Fine Needle Biopsy for Pancreatic Cancer Diagnosis: A Systematic Review and Meta-Analysis. Diagnostics 2022, 12, 2951. [Google Scholar] [CrossRef]
  22. Facciorusso, A.; Bajwa, H.S.; Menon, K.; Buccino, V.R.; Muscatiello, N. Comparison between 22G aspiration and 22G biopsy needles for EUS-guided sampling of pancreatic lesions: A meta-analysis. Endosc. Ultrasound 2020, 9, 167–174. [Google Scholar] [CrossRef]
  23. Gkolfakis, P.; Crino, S.F.; Tziatzios, G.; Ramai, D.; Papaefthymiou, A.; Papanikolaou, I.S.; Triantafyllou, K.; Arvanitakis, M.; Lisotti, A.; Fusaroli, P.; et al. Comparative diagnostic performance of end-cutting fine-needle biopsy needles for EUS tissue sampling of solid pancreatic masses: A network meta-analysis. Gastrointest. Endosc. 2022, 95, 1067–1077. [Google Scholar] [CrossRef]
  24. Cho, J.H.; Kim, J.; Lee, H.S.; Ryu, S.J.; Jang, S.I.; Kim, E.J.; Kang, H.; Lee, S.S.; Song, T.J.; Bang, S. Factors Influencing the Diagnostic Performance of Repeat Endoscopic Ultrasound-Guided Fine-Needle Aspiration/Biopsy after the First Inconclusive Diagnosis of Pancreatic Solid Lesions. Gut Liver 2023, 17. [Google Scholar] [CrossRef]
  25. Iglesias-Garcia, J.; Dominguez-Munoz, J.E.; Abdulkader, I.; Larino-Noia, J.; Eugenyeva, E.; Lozano-Leon, A.; Forteza-Vila, J. Influence of on-site cytopathology evaluation on the diagnostic accuracy of endoscopic ultrasound-guided fine needle aspiration (EUS-FNA) of solid pancreatic masses. Am. J. Gastroenterol. 2011, 106, 1705–1710. [Google Scholar] [CrossRef] [PubMed]
  26. Yang, F.; Liu, E.; Sun, S. Rapid on-site evaluation (ROSE) with EUS-FNA: The ROSE looks beautiful. Endosc. Ultrasound 2019, 8, 283–287. [Google Scholar]
  27. Spadaccini, M.; Koleth, G.; Emmanuel, J.; Khalaf, K.; Facciorusso, A.; Grizzi, F.; Hassan, C.; Colombo, M.; Mangiavillano, B.; Fugazza, A.; et al. Enhanced endoscopic ultrasound imaging for pancreatic lesions: The road to artificial intelligence. World J. Gastroenterol. 2022, 28, 3814–3824. [Google Scholar] [CrossRef] [PubMed]
  28. Tonozuka, R.; Mukai, S.; Itoi, T. The Role of Artificial Intelligence in Endoscopic Ultrasound for Pancreatic Disorders. Diagnostics 2020, 11, 18. [Google Scholar] [CrossRef] [PubMed]
  29. Wang, L.M.; Ang, T.L. Optimizing endoscopic ultrasound guided fine needle aspiration through artificial intelligence. J. Gastroenterol. Hepatol. 2023, 38, 839–840. [Google Scholar] [CrossRef] [PubMed]
  30. Hamet, P.; Tremblay, J. Artificial intelligence in medicine. Metabolism 2017, 69, S36–S40. [Google Scholar] [CrossRef]
  31. Pohl, J. Artificial Superintelligence: Extinction or Nirvana? In Proceedings of the InterSymp-2015, IIAS, 27th International Conference on Systems Research, Informatics, and Cybernetics, Baden-Baden, Germany, 3 August 2015. [Google Scholar]
  32. Abonamah, A.A.; Tariq, M.U.; Shilbayeh, S. On the Commoditization of Artificial Intelligence. Front. Psychol. 2021, 12, 696346. [Google Scholar] [CrossRef] [PubMed]
  33. Jeste, D.V.; Graham, S.A.; Nguyen, T.T.; Depp, C.A.; Lee, E.E.; Kim, H.C. Beyond artificial intelligence: Exploring artificial wisdom. Int. Psychogeriatr. 2020, 32, 993–1001. [Google Scholar] [CrossRef] [PubMed]
  34. Bostrom, N. Superintelligence; Dunod: Malakoff, France, 2017. [Google Scholar]
  35. Robert, C.J.C. Machine Learning, a Probabilistic Perspective; The MIT Press: Cambridge, MA, USA, 2014; Volume 27, pp. 62–63. [Google Scholar]
  36. Choi, R.Y.; Coyner, A.S.; Kalpathy-Cramer, J.; Chiang, M.F.; Campbell, J.P. Introduction to Machine Learning, Neural Networks, and Deep Learning. Transl. Vis. Sci. Technol. 2020, 9, 14. [Google Scholar]
  37. Yamashita, R.; Nishio, M.; Do, R.K.G.; Togashi, K. Convolutional neural networks: An overview and application in radiology. Insights Imaging 2018, 9, 611–629. [Google Scholar] [CrossRef] [PubMed]
  38. Yin, M.; Liu, L.; Gao, J.; Lin, J.; Qu, S.; Xu, W.; Liu, X.; Xu, C.; Zhu, J. Deep learning for pancreatic diseases based on endoscopic ultrasound: A systematic review. Int. J. Med. Inform. 2023, 174, 105044. [Google Scholar] [CrossRef] [PubMed]
  39. Zhang, S.; Zhou, Y.; Tang, D.; Ni, M.; Zheng, J.; Xu, G.; Peng, C.; Shen, S.; Zhan, Q.; Wang, X.; et al. A deep learning-based segmentation system for rapid onsite cytologic pathology evaluation of pancreatic masses: A retrospective, multicenter, diagnostic study. EBioMedicine 2022, 80, 104022. [Google Scholar] [CrossRef]
  40. Mahmoudi, T.; Kouzahkanan, Z.M.; Radmard, A.R.; Kafieh, R.; Salehnia, A.; Davarpanah, A.H.; Arabalibeik, H.; Ahmadian, A. Segmentation of pancreatic ductal adenocarcinoma (PDAC) and surrounding vessels in CT images using deep convolutional neural networks and texture descriptors. Sci. Rep. 2022, 12, 3092. [Google Scholar] [CrossRef]
  41. Beltrami, E.J.; Brown, A.C.; Salmon, P.J.M.; Leffell, D.J.; Ko, J.M.; Grant-Kels, J.M. Artificial intelligence in the detection of skin cancer. J. Am. Acad. Dermatol. 2022, 87, 1336–1342. [Google Scholar] [CrossRef]
  42. Khened, M.; Kori, A.; Rajkumar, H.; Krishnamurthi, G.; Srinivasan, B. A generalized deep learning framework for whole-slide image segmentation and analysis. Sci. Rep. 2021, 11, 11579. [Google Scholar] [CrossRef]
  43. Bera, K.; Schalper, K.A.; Rimm, D.L.; Velcheti, V.; Madabhushi, A. Artificial intelligence in digital pathology—New tools for diagnosis and precision oncology. Nat. Rev. Clin. Oncol. 2019, 16, 703–715. [Google Scholar] [CrossRef]
  44. Mubarak, M. Move from Traditional Histopathology to Digital and Computational Pathology: Are we Ready? Indian. J. Nephrol. 2022, 32, 414–415. [Google Scholar] [CrossRef] [PubMed]
  45. Nam, S.; Chong, Y.; Jung, C.K.; Kwak, T.Y.; Lee, J.Y.; Park, J.; Rho, M.J.; Go, H. Introduction to digital pathology and computer-aided pathology. J. Pathol. Transl. Med. 2020, 54, 125–134. [Google Scholar] [CrossRef] [PubMed]
  46. Jahn, S.W.; Plass, M.; Moinfar, F. Digital Pathology: Advantages, Limitations and Emerging Perspectives. J. Clin. Med. 2020, 9, 3697. [Google Scholar] [CrossRef] [PubMed]
  47. Van Es, S.L. Digital pathology: Semper ad meliora. Pathology 2019, 51, 1–10. [Google Scholar] [CrossRef]
  48. Pallua, J.D.; Brunner, A.; Zelger, B.; Schirmer, M.; Haybaeck, J. The future of pathology is digital. Pathol. Res. Pract. 2020, 216, 153040. [Google Scholar] [CrossRef]
  49. Forsch, S.; Klauschen, F.; Hufnagl, P.; Roth, W. Artificial Intelligence in Pathology. Dtsch. Arztebl. Int. 2021, 118, 194–204. [Google Scholar] [CrossRef]
  50. Loewenstein, Y.; Raviv, O.; Ahissar, M. Dissecting the Roles of Supervised and Unsupervised Learning in Perceptual Discrimination Judgments. J. Neurosci. 2021, 41, 757–765. [Google Scholar] [CrossRef]
  51. Linardatos, P.; Papastefanopoulos, V.; Kotsiantis, S. Explainable AI: A Review of Machine Learning Interpretability Methods. Entropy 2020, 23, 18. [Google Scholar] [CrossRef]
  52. Gur, S.; Ali, A.; Wolf, L. Visualization of Supervised and Self-Supervised Neural Networks via Attribution Guided Factorization. In Proceedings of the AAAI Conference on Artificial Intelligence, Virtually Held, 2–9 February 2021; AAAI: Washington, DC, USA, 2021. [Google Scholar]
  53. Benjamens, S.; Dhunnoo, P.; Mesko, B. The state of artificial intelligence-based FDA-approved medical devices and algorithms: An online database. NPJ. Digit. Med. 2020, 3, 118. [Google Scholar] [CrossRef]
  54. Hashimoto, Y. 44 Prospective Comparison Study of EUS-FNA Onsite Cytology Diagnosis by Pathologist versus Our Designed Deep Learning Algorhythm in Suspected Pancreatic Cancer. Gastroenterology 2020, 158, S17. [Google Scholar] [CrossRef]
  55. Patel, J.; Bhakta, D.; Elzamly, S.; Moreno, V.; Joldoshova, A.; Ghosh, A.; Shitawi, M.; Lin, M.; Aakash, N.; Guha, S. ID: 3526830 Artificial Intelligence Based Rapid Onsite Cytopathology Evaluation (Rose-AIDTM) vs. Physician Interpretation of Cytopathology Images of Endoscopic Ultrasound-Guided Fine-Needle Aspiration (EUS-FNA) of Pancreatic Solid Lesions. Intell. Based Med. 2021, 93, AB193–AB194. [Google Scholar] [CrossRef]
  56. Yeaton, P.; Sears, R.J.; Ledent, T.; Salmon, I.; Kiss, R.; Decaestecker, C. Discrimination between chronic pancreatitis and pancreatic adenocarcinoma using artificial intelligence-related algorithms based on image cytometry-generated variables. Cytometry 1998, 32, 309–316. [Google Scholar] [CrossRef]
  57. Biesterfeld, S.; Deacu, L. DNA image cytometry in the differential diagnosis of benign and malignant lesions of the bile duct, the pancreatic duct and the papilla of Vater. Anticancer Res. 2009, 29, 1579–1584. [Google Scholar] [PubMed]
  58. Okon, K.; Tomaszewska, R.; Nowak, K.; Stachura, J. Application of neural networks to the classification of pancreatic intraductal proliferative lesions. Anal. Cell. Pathol. 2001, 23, 129–136. [Google Scholar] [CrossRef]
  59. Bloom, G.; Yang, I.V.; Boulware, D.; Kwong, K.Y.; Coppola, D.; Eschrich, S.; Quackenbush, J.; Yeatman, T.J. Multi-platform, multi-site, microarray-based human tumor classification. Am. J. Pathol. 2004, 164, 9–16. [Google Scholar] [CrossRef] [PubMed]
  60. Song, J.W.; Lee, J.H. New morphological features for grading pancreatic ductal adenocarcinomas. Biomed. Res. Int. 2013, 2013, 175271. [Google Scholar] [CrossRef]
  61. Momeni-Boroujeni, A.; Yousefi, E.; Somma, J. Computer-assisted cytologic diagnosis in pancreatic FNA: An application of neural networks to image analysis. Cancer Cytopathol. 2017, 125, 926–933. [Google Scholar] [CrossRef]
  62. Hashimoto, Y.; Ohno, I.; Imaoka, H.; Takahashi, H.; Ikeda, M. Mo1296 Preliminary Result of Computer Aided Diagnosis (CAD) Performance Using Deep Learning in EUS-FNA Cytology of Pancreatic Cancer. Gastrointest. Endosc. 2018, 87, AB434. [Google Scholar] [CrossRef]
  63. Halicek, M.; Fabelo, H.; Ortega, S.; Callico, G.M.; Fei, B. In-Vivo and Ex-Vivo Tissue Analysis through Hyperspectral Imaging Techniques: Revealing the Invisible Features of Cancer. Cancers 2019, 11, 756. [Google Scholar] [CrossRef]
  64. Qin, X.; Zhang, M.; Zhou, C.; Ran, T.; Pan, Y.; Deng, Y.; Xie, X.; Zhang, Y.; Gong, T.; Zhang, B.; et al. A deep learning model using hyperspectral image for EUS-FNA cytology diagnosis in pancreatic ductal adenocarcinoma. Cancer Med. 2023. [Google Scholar] [CrossRef]
  65. Lin, R.; Sheng, L.P.; Han, C.Q.; Guo, X.W.; Wei, R.G.; Ling, X.; Ding, Z. Application of artificial intelligence to digital-rapid on-site cytopathology evaluation during endoscopic ultrasound-guided fine needle aspiration: A proof-of-concept study. J. Gastroenterol. Hepatol. 2023, 38, 883–887. [Google Scholar] [CrossRef] [PubMed]
  66. Kong, F.; Kong, X.; Zhu, J.; Sun, T.; Du, Y.; Wang, K.; Jin, Z.; Li, Z.; Wang, D. A prospective comparison of conventional cytology and digital image analysis for the identification of pancreatic malignancy in patients undergoing EUS-FNA. Endosc. Ultrasound 2019, 8, 269–276. [Google Scholar]
  67. Thosani, N.; Patel, J.; Moreno, V.; Bhakta, D.; Patil, P.; Guha, S.; Zhang, S. Development and validation of artificial intelligence based rapid onsite cytopathology evaluation (rose-aidtm) for endoscopic ultrasound-guided fine-needle aspiration (eus-fna) of pancreatic solid lesions. Gastroenterology 2021, 160, S-17. [Google Scholar] [CrossRef]
  68. Naito, Y.; Tsuneki, M.; Fukushima, N.; Koga, Y.; Higashi, M.; Notohara, K.; Aishima, S.; Ohike, N.; Tajiri, T.; Yamaguchi, H.; et al. A deep learning model to detect pancreatic ductal adenocarcinoma on endoscopic ultrasound-guided fine-needle biopsy. Sci. Rep. 2021, 11, 8454. [Google Scholar] [CrossRef]
  69. Yamada, R.; Nakane, K.; Kadoya, N.; Matsuda, C.; Imai, H.; Tsuboi, J.; Hamada, Y.; Tanaka, K.; Tawara, I.; Nakagawa, H. Development of “Mathematical Technology for Cytopathology”, an Image Analysis Algorithm for Pancreatic Cancer. Diagnostics 2022, 12, 1149. [Google Scholar] [CrossRef]
  70. Ishikawa, T.; Hayakawa, M.; Suzuki, H.; Ohno, E.; Mizutani, Y.; Iida, T.; Fujishiro, M.; Kawashima, H.; Hotta, K. Development of a Novel Evaluation Method for Endoscopic Ultrasound-Guided Fine-Needle Biopsy in Pancreatic Diseases Using Artificial Intelligence. Diagnostics 2022, 12, 434. [Google Scholar] [CrossRef]
  71. Mohamadnejad, M.; Mullady, D.; Early, D.S.; Collins, B.; Marshall, C.; Sams, S.; Yen, R.; Rizeq, M.; Romanas, M.; Nawaz, S.; et al. Increasing Number of Passes Beyond 4 Does Not Increase Sensitivity of Detection of Pancreatic Malignancy by Endoscopic Ultrasound-Guided Fine-Needle Aspiration. Clin. Gastroenterol. Hepatol. 2017, 15, 1071–1078. [Google Scholar] [CrossRef] [PubMed]
  72. Cheng, B.; Zhang, Y.; Chen, Q.; Sun, B.; Deng, Z.; Shan, H.; Dou, L.; Wang, J.; Li, Y.; Yang, X.; et al. Analysis of Fine-Needle Biopsy vs Fine-Needle Aspiration in Diagnosis of Pancreatic and Abdominal Masses: A Prospective, Multicenter, Randomized Controlled Trial. Clin. Gastroenterol. Hepatol. 2018, 16, 1314–1321. [Google Scholar] [CrossRef] [PubMed]
  73. van Riet, P.A.; Cahen, D.L.; Poley, J.W.; Bruno, M.J. Mapping international practice patterns in EUS-guided tissue sampling: Outcome of a global survey. Endosc. Int. Open 2016, 4, E360–E370. [Google Scholar] [CrossRef]
  74. Iwashita, T.; Yasuda, I.; Mukai, T.; Doi, S.; Nakashima, M.; Uemura, S.; Mabuchi, M.; Shimizu, M.; Hatano, Y.; Hara, A.; et al. Macroscopic on-site quality evaluation of biopsy specimens to improve the diagnostic accuracy during EUS-guided FNA using a 19-gauge needle for solid lesions: A single-center prospective pilot study (MOSE study). Gastrointest. Endosc. 2015, 81, 177–185. [Google Scholar] [CrossRef]
  75. Kitano, M.; Sakamoto, H.; Kudo, M. Contrast-enhanced endoscopic ultrasound. Dig. Endosc. 2014, 26 (Suppl. S1), 79–85. [Google Scholar] [CrossRef] [PubMed]
  76. Otsuka, Y.; Kamata, K.; Kudo, M. Contrast-Enhanced Harmonic Endoscopic Ultrasound-Guided Puncture for the Patients with Pancreatic Masses. Diagnostics 2023, 13, 1039. [Google Scholar] [CrossRef] [PubMed]
  77. Sugimoto, M.; Takagi, T.; Suzuki, R.; Konno, N.; Asama, H.; Watanabe, K.; Nakamura, J.; Kikuchi, H.; Waragai, Y.; Takasumi, M.; et al. Contrast-enhanced harmonic endoscopic ultrasonography in gallbladder cancer and pancreatic cancer. Fukushima J. Med. Sci. 2017, 63, 39–45. [Google Scholar] [CrossRef] [PubMed]
  78. Hou, X.; Jin, Z.; Xu, C.; Zhang, M.; Zhu, J.; Jiang, F.; Li, Z. Contrast-enhanced harmonic endoscopic ultrasound-guided fine-needle aspiration in the diagnosis of solid pancreatic lesions: A retrospective study. PLoS ONE 2015, 10, e0121236. [Google Scholar] [CrossRef]
  79. Kitano, M.; Kamata, K.; Imai, H.; Miyata, T.; Yasukawa, S.; Yanagisawa, A.; Kudo, M. Contrast-enhanced harmonic endoscopic ultrasonography for pancreatobiliary diseases. Dig. Endosc. 2015, 27 (Suppl. S1), 60–67. [Google Scholar] [CrossRef]
  80. Imazu, H.; Kanazawa, K.; Mori, N.; Ikeda, K.; Kakutani, H.; Sumiyama, K.; Hino, S.; Ang, T.L.; Omar, S.; Tajiri, H. Novel quantitative perfusion analysis with contrast-enhanced harmonic EUS for differentiation of autoimmune pancreatitis from pancreatic carcinoma. Scand. J. Gastroenterol. 2012, 47, 853–860. [Google Scholar] [CrossRef]
  81. Saftoiu, A.; Vilmann, P.; Dietrich, C.F.; Iglesias-Garcia, J.; Hocke, M.; Seicean, A.; Ignee, A.; Hassan, H.; Streba, C.T.; Ioncica, A.M.; et al. Quantitative contrast-enhanced harmonic EUS in differential diagnosis of focal pancreatic masses (with videos). Gastrointest. Endosc. 2015, 82, 59–69. [Google Scholar] [CrossRef]
  82. Tang, A.; Tian, L.; Gao, K.; Liu, R.; Hu, S.; Liu, J.; Xu, J.; Fu, T.; Zhang, Z.; Wang, W.; et al. Contrast-enhanced harmonic endoscopic ultrasound (CH-EUS) MASTER: A novel deep learning-based system in pancreatic mass diagnosis. Cancer Med. 2023, 12, 7962–7973. [Google Scholar] [CrossRef]
  83. Wadden, J.J. Defining the undefinable: The black box problem in healthcare artificial intelligence. J. Med. Ethics 2021, 48, 764–768. [Google Scholar] [CrossRef]
  84. van der Velden, B.H.M.; Kuijf, H.J.; Gilhuijs, K.G.A.; Viergever, M.A. Explainable artificial intelligence (XAI) in deep learning-based medical image analysis. Med. Image Anal. 2022, 79, 102470. [Google Scholar] [CrossRef]
  85. Pierce, R.L.; Van Biesen, W.; Van Cauwenberge, D.; Decruyenaere, J.; Sterckx, S. Explainability in medicine in an era of AI-based clinical decision support systems. Front. Genet. 2022, 13, 903600. [Google Scholar] [CrossRef] [PubMed]
  86. Dahiya, D.S.; Al-Haddad, M.; Chandan, S.; Gangwani, M.K.; Aziz, M.; Mohan, B.P.; Ramai, D.; Canakis, A.; Bapaye, J.; Sharma, N. Artificial Intelligence in Endoscopic Ultrasound for Pancreatic Cancer: Where Are We Now and What Does the Future Entail? J. Clin. Med. 2022, 11, 7476. [Google Scholar] [CrossRef] [PubMed]
  87. Smistad, E.; Falch, T.L.; Bozorgi, M.; Elster, A.C.; Lindseth, F. Medical image segmentation on GPUs—A comprehensive review. Med. Image Anal. 2015, 20, 1–18. [Google Scholar] [CrossRef] [PubMed]
  88. Hu, X.; Yang, W.; Wen, H.; Liu, Y.; Peng, Y. A Lightweight 1-D Convolution Augmented Transformer with Metric Learning for Hyperspectral Image Classification. Sensors 2021, 21, 1751. [Google Scholar] [CrossRef]
  89. Qu, L.; Liu, S.; Liu, X.; Wang, M.; Song, Z. Towards label-efficient automatic diagnosis and analysis: A comprehensive survey of advanced deep learning-based weakly-supervised, semi-supervised and self-supervised techniques in histopathological image analysis. Phys. Med. Biol. 2022, 67, 20TR01. [Google Scholar] [CrossRef]
  90. Sarker, I.H. Deep Learning: A Comprehensive Overview on Techniques, Taxonomy, Applications and Research Directions. SN Comput. Sci. 2021, 2, 420. [Google Scholar] [CrossRef]
  91. Lee, I.; Kim, D.; Wee, D.; Lee, S. An Efficient Human Instance-Guided Framework for Video Action Recognition. Sensors 2021, 21, 8309. [Google Scholar] [CrossRef]
  92. Sinz, F.H.; Pitkow, X.; Reimer, J.; Bethge, M.; Tolias, A.S. Engineering a Less Artificial Intelligence. Neuron 2019, 103, 967–979. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Types of artificial intelligence.
Figure 2. Commonly used interpretability methods to visualize pathological images. (A) An original pathological image; (B) the corresponding class activation map generated with Grad-CAM; (C) another original pathological image; (D) the corresponding class activation map generated with AGF-Visualization. The high-intensity (red) areas reflect the regions of interest to the AI model.
Figure 3. The limitations and shortages of artificial intelligence in EUS-FNA/B.
Table 1. Application of AI in EUS-FNA/B for the pathological diagnosis of solid pancreatic lesions.
| Year/Journal | Author | Ref. | Purpose | Data Source | Sample Size | Algorithm | Diagnostic Performance |
|---|---|---|---|---|---|---|---|
| 2017/Cancer Cytopathology | Momeni-Boroujeni et al. | [61] | Distinguish benign and malignant pancreatic cytology | EUS-FNA | 277 images from 75 pancreatic FNA cases | MNN | Benign and malignant categories: accuracy 100%; atypical cases: accuracy 77% |
| 2018/Gastrointestinal Endoscopy | Hashimoto et al. | [62] | PDAC identification | EUS-FNA | 450 images | CNN | Accuracy 80% |
| 2019/Endoscopic Ultrasound | Kong et al. | [66] | PC detection | EUS-FNA | 142 cases | DIA | Accuracy (83%) comparable to conventional cytology (78%) |
| 2020/Gastroenterology | Hashimoto et al. | [54] | Distinguish benign and malignant in ROSE | EUS-FNA | 1440 cytology specimens retrospectively collected; 400 cytology specimens retrospectively validated | CNN | Accuracy (93–94%) comparable to an onsite pathologist (98–99%) |
| 2021/Gastroenterology | Thosani et al. | [67] | Interpretation for adequacy and identification of SPLs in ROSE | EUS-FNA | 400 cases for training and 77 images for validation | ML | Onsite adequacy testing: accuracy 87.25%; cytopathological diagnosis: accuracy 81.8% |
| 2021/Gastrointestinal Endoscopy | Patel et al. | [55] | Comparison of AI and subspecialty physicians for identification of SPLs | EUS-FNA | 77 images | ML | Accuracy (87%) on par with or superior to most physicians (36–96%) |
| 2021/Scientific Reports | Naito et al. | [68] | PDAC detection in WSIs | EUS-FNB | 532 WSIs | CNN | Accuracy 94.17%, AUC 0.9836 |
| 2022/Diagnostics (Basel) | Yamada et al. | [69] | Distinguish PDAC and benign pancreatic cytology | EUS-FNA/B | 246 specimens | DL | Accuracy 74% |
| 2022/Diagnostics (Basel) | Ishikawa et al. | [70] | Evaluation of diagnosable EUS-FNB specimens in MOSE | EUS-FNB | 271 specimens from 159 patients | CNN | Accuracy (84.4%) comparable to endoscopists (82.1–83.2%) |
| 2022/EBioMedicine | Zhang et al. | [39] | Identification of PDAC in ROSE | EUS-FNA | 6667 images from 194 cases | DCNN | Accuracy 94.4% (AUC 0.958), comparable to cytopathologists (91.7%) |
| 2022/Journal of Gastroenterology and Hepatology | Lin et al. | [65] | Detection of cancer cells in pancreatic or other celiac lesions in ROSE | EUS-FNA | 1160 images from 51 cases | CNN | Internal validation dataset: accuracy 83.4%; external validation dataset: accuracy 88.7% |
| 2023/Cancer Medicine | Qin et al. | [64] | Distinguish benign and malignant masses via pancreatic cytology | EUS-FNA | 1913 images from 72 cases | CNN | Internal test dataset: accuracy 92.04%; external test dataset: accuracy 92.27% |
Abbreviations: PDAC, pancreatic ductal adenocarcinoma; MNN, multilayer perceptron neural network; ML, machine learning; DL, deep learning; WSI, whole slide image; AUC, area under the ROC curves; DCNN, deep convolutional neural network.
