Review

Computer-Aided Detection for Pancreatic Cancer Diagnosis: Radiological Challenges and Future Directions

by Mark Ramaekers 1,*,†, Christiaan G. A. Viviers 2,†, Boris V. Janssen 3,4, Terese A. E. Hellström 2, Lotte Ewals 5, Kasper van der Wulp 5, Joost Nederend 5, Igor Jacobs 6, Jon R. Pluyter 7, Dimitrios Mavroeidis 8, Fons van der Sommen 2, Marc G. Besselink 3,4 and Misha D. P. Luyer 1 on behalf of the E/MTIC Oncology Collaborative Group

1 Department of Surgery, Catharina Cancer Institute, Catharina Hospital Eindhoven, 5623 EJ Eindhoven, The Netherlands
2 Department of Electrical Engineering, Eindhoven University of Technology, 5612 AZ Eindhoven, The Netherlands
3 Department of Surgery, Amsterdam UMC, University of Amsterdam, 1105 AZ Amsterdam, The Netherlands
4 Cancer Center Amsterdam, 1081 HV Amsterdam, The Netherlands
5 Department of Radiology, Catharina Cancer Institute, Catharina Hospital Eindhoven, 5623 EJ Eindhoven, The Netherlands
6 Department of Hospital Services and Informatics, Philips Research, 5656 AE Eindhoven, The Netherlands
7 Department of Experience Design, Philips Design, 5656 AE Eindhoven, The Netherlands
8 Department of Data Science, Philips Research, 5656 AE Eindhoven, The Netherlands
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
J. Clin. Med. 2023, 12(13), 4209; https://doi.org/10.3390/jcm12134209
Submission received: 21 April 2023 / Revised: 8 June 2023 / Accepted: 19 June 2023 / Published: 22 June 2023

Abstract

Radiological imaging plays a crucial role in the detection and treatment of pancreatic ductal adenocarcinoma (PDAC). However, there are several challenges associated with the use of these techniques in daily clinical practice. Determination of the presence or absence of cancer using radiological imaging is difficult and requires specific expertise, especially after neoadjuvant therapy. Early detection and characterization of tumors would potentially increase the number of patients who are eligible for curative treatment. Over the last decades, artificial intelligence (AI)-based computer-aided detection (CAD) has rapidly evolved as a means for improving the radiological detection of cancer and the assessment of the extent of disease. Although the results of AI applications seem promising, widespread adoption in clinical practice has not taken place. This narrative review provides an overview of current radiological CAD systems in pancreatic cancer, highlights challenges that are pertinent to clinical practice, and discusses potential solutions for these challenges.

1. Introduction

Pancreatic ductal adenocarcinoma (PDAC) is one of the leading causes of cancer-related deaths worldwide, with a dismal prognosis and an overall 5-year survival rate of only 9% [1,2]. Although pancreatic cancer treatment has improved over the past years through centralization and optimization of treatment strategies, overall survival has not significantly improved [3,4]. Pancreatic cancer often causes only a few non-specific symptoms before it progresses to an advanced stage of disease. The most common presenting symptoms in patients with PDAC are pain, jaundice, steatorrhea, and weight loss [5]. As a result, more than 75% of patients present with irresectable or metastatic disease [6,7]. Early detection of pancreatic tumors therefore holds significant promise, as it enables potentially curative treatment [8]. Subsequent characterization of pancreatic tumors is important in order to tailor specific treatments, determine surgical resectability, and select patients for curative treatment as accurately as possible.
Radiological imaging modalities such as computed tomography (CT) and magnetic resonance imaging (MRI) are key in establishing the presence of disease and the tumor's relation to the vessels surrounding the pancreas, which determines resectability [9]. Standardized resectability criteria are used to determine the need for neoadjuvant therapy and to select patients for a minimally invasive approach [10,11,12]. However, determining resectability, especially after neoadjuvant therapy, is extremely difficult and mostly inaccurate at this time [13,14]. Tumor regression after neoadjuvant treatment is rarely visible on CT, and the extent of vascular involvement tends to be overestimated [15].
Artificial intelligence (AI) offers a unique opportunity to improve the early detection and characterization of pancreatic cancer. Over the past decades, deep learning-based algorithms have been developed that can provide pixel-level segmentations of relevant anatomy. Deep learning refers to the use of neural networks with multiple layers that learn to recognize features directly from input data. However, implementation in healthcare proceeds slowly, and the majority of AI solutions remain in the testing and prototyping phases [16,17].
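To make the notion of a multi-layer network concrete, the sketch below shows a minimal slice-level classifier. It is an illustration only, assuming PyTorch and a hypothetical input of single-channel 2D CT slices; it is not the architecture of any model discussed in this review.

```python
import torch
import torch.nn as nn

class SliceClassifier(nn.Module):
    """Minimal CNN: stacked convolutional layers learn image features,
    a final linear layer maps them to a tumor/no-tumor score."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 1)  # logit for "tumor present"

    def forward(self, x):                   # x: (batch, 1, H, W) CT slice
        h = self.features(x).flatten(1)
        return self.classifier(h)

model = SliceClassifier()
logit = model(torch.randn(2, 1, 256, 256))  # two dummy 256 x 256 slices
prob = torch.sigmoid(logit)                 # probability of tumor per slice
```

In practice, such a network is trained end to end on annotated examples, so the intermediate layers learn which image features are informative rather than relying on hand-crafted descriptors.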
Here, we provide an overview of current AI applications in the radiological detection of pancreatic cancer, highlight challenges in clinical practice, elaborate on the current state of computer-aided detection, and discuss future perspectives.

2. Rationale for Early Detection of Pancreatic Cancer

2.1. Radiological Detection

Pancreatic cancer often goes undetected until it has progressed to an advanced stage, and therefore only a minority of patients are eligible for surgical resection at the time of diagnosis [18]. The diagnostic work-up of pancreatic tumors usually starts with CT imaging. Multi-phase (arterial and portal-venous) contrast-enhanced multi-detector computed tomography (MDCT) is standard practice in evaluating pancreatic cancer and carries a sensitivity of at least 90% in expert hands [19,20,21]. In general, pancreatic tumors appear hypodense compared to normal pancreatic parenchyma, although other imaging findings may also be predictive of pancreatic cancer. The multi-phase protocol provides good spatial resolution and allows proper visualization of the relevant structures and vasculature to assess staging and resectability [18].
In addition to CT imaging, MRI allows for successful tumor detection. Owing to its superior soft-tissue contrast, MRI has been shown to be as sensitive as CT in diagnosing and staging pancreatic cancer, with a sensitivity of 89% [22]. While MRI is not widely used as an initial imaging technique due to cost and availability, there are several specific situations in which it is preferred. The soft-tissue contrast of MRI may be beneficial in the recognition of small tumors and the characterization of indeterminate liver lesions [23].
Lastly, endoscopic ultrasound (EUS) is often used to discriminate between PDAC and other pancreatic diseases. EUS offers several advantages for pancreatic tumor detection, including its ability to provide detailed imaging of the pancreas and its surrounding structures, as well as the possibility of obtaining tissue samples for histological analysis [24]. EUS has shown high sensitivity in detecting pancreatic cancer, and several studies have reported a sensitivity ranging from 80% to 95% [25]. However, the sensitivity of EUS can be influenced by various factors, including the size and location of the tumor and the experience of the endoscopist performing the procedure.

2.2. Restrictions for Early Detection and Characterization of Pancreatic Cancer

Early detection and characterization of pancreatic tumors represent one of the most promising strategies to improve the prognosis and overall survival of pancreatic cancer patients [26,27]. This can be attributed to several factors. First, patients diagnosed at an early disease stage often have smaller tumors (<2.0 cm) with less vascular involvement. Although only a minority of patients (10–15%) are diagnosed at an early stage with stage-1 disease, they have a much higher 3-year survival rate (82%) than patients diagnosed at later disease stages with larger tumors (17%) [28,29]. Second, pancreatic imaging requires specific expertise, and with small and iso-attenuating tumors sometimes being barely visible, the radiologist has to rely on other patterns that might indicate malignant disease. Although CT and MRI generally achieve acceptable sensitivity in diagnosing pancreatic cancer, subtle pancreatic changes may be missed on abdominal imaging, especially in asymptomatic patients [30,31]. Radiologists' sensitivity for detecting small and iso-attenuating PDACs (smaller than 2 cm) on CT has been reported to be between 58% and 77% [32]. Lack of expertise may result in delayed recognition and prevent patients from receiving curative treatment. This may be even more pertinent in hospitals without specific pancreatic expertise [33]. Over the years, various studies have reported the presence of visible secondary features prior to the actual diagnosis [34,35,36]. Kang et al. demonstrated that secondary signs are present in 88% of cases; the most common secondary sign was pancreatic duct dilation, and vascular invasion was the most commonly missed [31]. In addition, studies have reported that changes indicative of PDAC are visible on imaging 6–18 months prior to the actual diagnosis in 50% of patients [35,37]. Third, pancreatic cancer treatment is centralized, which limits expertise to certain hospitals. Previous studies have shown that patients with non-metastasized pancreatic cancer had a greater likelihood of receiving surgical treatment when the diagnosis was established in an expert pancreatic cancer center rather than in a non-expert hospital [38]. The centralization of pancreatic surgery may further enhance this discrepancy between expert and non-expert hospitals. Multidisciplinary team meetings may preserve expertise in various treatment techniques; however, patients still need to be identified before expert assessment can be performed.
For the characterization of PDAC, resectability is graded as resectable, borderline resectable, or irresectable based on the contact between the tumor and the surrounding vasculature [39]. However, the assessment of resectability based on a CT scan remains challenging, with considerable interobserver variability even among the most experienced clinicians [40,41]. Furthermore, between 4% and 13% of patients with operable pancreatic cancer are found to be unresectable at the time of surgery due to missed metastases or inadequate resectability assessment [42]. Following neoadjuvant treatment, the assessment of resectability becomes even more difficult. Tumor regression after neoadjuvant treatment is rarely visible on radiological assessment, and the literature shows that the diagnostic performance of radiologists in predicting resectability or irresectability on CT is poor [13,15]. As such, clinicians are unable to accurately assess tumor resectability after neoadjuvant treatment. Additionally, the extent of vascular involvement tends to be overestimated, and several studies have indicated considerable variability in resectability assessment between medical experts, which could affect treatment strategies [40,43]. Meanwhile, different strategies of neoadjuvant treatment followed by surgery have been shown to improve overall survival in patients with borderline resectable pancreatic cancer [44,45]. Based on these promising results, the evaluation of response to neoadjuvant treatment is expected to become an increasingly common challenge.

3. Computer-Aided Detection as a Potential Solution

3.1. Pancreatic Cancer Detection Using CAD

Artificial intelligence (AI) has emerged as a powerful tool in healthcare; by analyzing large amounts of medical imaging data, AI algorithms can identify subtle changes in the pancreas that may be missed by human observers. A tool that may facilitate the improvement of pancreatic tumor detection and response evaluation is computer-aided detection (CAD) based on AI techniques. Over the past decades, deep learning has rapidly advanced the development of CAD methods across various domains, and it is expected to continue to act as a driver of technological innovation in the foreseeable future [46]. The first steps towards automated pancreatic tumor detection on CT have already been taken. Overall, two main approaches exist for CAD development: (1) deep learning and (2) radiomics.
In deep learning, artificial neural networks are employed as a machine learning technique, and these models are trained to detect and localize pancreatic cancer based on a large number of annotated examples. In radiomics, large numbers of hand-crafted image features, known as radiomic features, are extracted from digital images. These features then serve as input for conventional machine learning models to make predictions about the data.
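To illustrate the second approach, the sketch below outlines a radiomics-style pipeline under simplifying assumptions: a handful of hypothetical first-order intensity features stand in for a full radiomic feature set, the cohort is synthetic, and a random forest serves as the conventional machine learning model. It is not the pipeline of any study cited here.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.ensemble import RandomForestClassifier

def first_order_features(roi_hu):
    """Hand-crafted ("radiomic") descriptors of the voxel intensities
    inside a segmented region of interest (HU values as a 1-D array)."""
    return [
        roi_hu.mean(), roi_hu.std(),
        skew(roi_hu), kurtosis(roi_hu),
        np.percentile(roi_hu, 10), np.percentile(roi_hu, 90),
    ]

# Hypothetical cohort: one feature vector per lesion ROI, label 1 = PDAC
rng = np.random.default_rng(0)
rois = [rng.normal(loc=40 + 10 * y, scale=15, size=500) for y in (0, 1) * 50]
labels = [0, 1] * 50
X = np.array([first_order_features(r) for r in rois])

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X, labels)                       # conventional ML on radiomic features
print(clf.predict_proba(X[:2])[:, 1])    # predicted PDAC probability
```

The contrast with deep learning is that here the features are fixed by the designer, whereas a neural network learns its features from the annotated images themselves.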
In recent years, several studies have demonstrated promising methods and results using AI to detect pancreatic cancer on CT scans. An overview of the most notable results and architectures is summarized in Table 1. Although several studies reported notable results, it is important to consider that some studies only performed binary classification of CT scans, meaning that they only classified tumor presence or absence. To provide patients with adequate treatment, clinicians often require additional information, such as the location of the tumor. Additionally, AI-based tools can be leveraged to distinguish PDAC from autoimmune pancreatitis; various studies have demonstrated impressive results in this task, exploiting both deep learning- and radiomics-based approaches [47,48]. Recently, Rigiroli et al. took a first step in investigating whether tumor-related and perivascular CT radiomic features improve the preoperative assessment of arterial involvement in patients with surgically proven PDAC. The model showed a sensitivity and specificity of 0.620 and 0.770, respectively, and performed better than the radiologists' assessment [49].
Quantitative MRI techniques such as T1 and T2 mapping allow for accurate tissue characterization and provide early indicators of biological changes [63]. Additionally, MRI offers an alternative that does not involve ionizing radiation. However, the availability of high-quality MRI data is limited, and the literature on the detection of PDAC using MRI is scarce. Kaissis et al. provided extensive work by applying machine learning to MR images to preoperatively predict survival and molecular subtypes in patients with PDAC [64,65,66]. Their survival model achieved impressive results, with a sensitivity and specificity of 0.870 and 0.800, respectively, and an area under the curve (AUC) of 0.90 for the prediction of above- and below-median overall survival (OS) [64]. Liang et al. specifically aimed to develop a deep learning algorithm for automatic segmentation of the gross tumor volume and reported performance similar to that of expert radiation oncologists [67]. Over the years, only a few other studies have reported MRI-based machine learning models, mainly focused on the identification and characterization of pancreatic abnormalities. An overview of the most notable results is summarized in Table 2.
AI-based algorithms can be integrated with EUS to assist in the interpretation and analysis of the acquired images. By training AI models on large datasets of EUS images, these algorithms can learn to recognize patterns and features indicative of pancreatic tumors. In addition, they can help reduce inter-observer variability and improve the diagnostic consistency of EUS, as endoscopist experience plays an important role in the accuracy of this imaging technique. The added value of AI to discriminate between benign and malignant tissue during EUS has been investigated in a considerable number of studies. Several studies have explored the use of AI in differentiating between PDAC and the normal pancreas, and all models achieved an overall accuracy of at least 90% [72,73,74]. Additionally, several studies focused on differentiating between PDAC and chronic pancreatitis using EUS, with all models accurately predicting PDAC in over 80% of the cases [75,76,77].
Lastly, the assessment of vascular involvement and the determination of treatment response remain growing challenges for which accurate assessment tools are lacking. Different scoring systems have been proposed to predict vascular involvement and, thus, resectability status in pancreatic cancer patients [78,79]. Given the beneficial effect of neoadjuvant treatment on cancer-specific survival, a recent study developed tumor-vessel interface criteria to predict vascular involvement and resectability in borderline resectable pancreatic cancer patients [80]. The diagnostic performance for predicting vascular involvement was evaluated for two readers and showed an AUC of 0.85–0.88 for arterial invasion and 0.87–0.92 for venous invasion. Furthermore, CT texture analysis for predicting resectability after neoadjuvant treatment has been introduced, providing important information on tumor characteristics by quantifying tissue heterogeneity and texture coarseness [81,82].
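To give an impression of what such texture quantification involves, the sketch below computes grey-level co-occurrence matrix (GLCM) descriptors for a hypothetical tumor region of interest using scikit-image. The ROI is synthetic and the chosen descriptors are generic examples, not the feature set of the cited studies.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops  # scikit-image >= 0.19

# Hypothetical tumor ROI from a CT slice, rescaled to 64 grey levels
rng = np.random.default_rng(0)
roi = rng.integers(0, 64, size=(48, 48), dtype=np.uint8)

# Grey-level co-occurrence matrix at 1-pixel offsets in four directions
glcm = graycomatrix(roi, distances=[1],
                    angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                    levels=64, symmetric=True, normed=True)

# Texture descriptors of the kind used to quantify heterogeneity/coarseness
features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}
print(features)
```

In a texture-analysis study, descriptors like these would be computed per tumor and related to outcomes such as resectability after neoadjuvant treatment.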

3.2. AI and the Use of Biomarkers for Pancreatic Cancer Detection

In addition to imaging, there is ongoing research to identify biomarkers that can aid in the early detection of pancreatic cancer. In contrast to other malignancies, there are few sensitive circulating biomarkers, and currently there is no standardized screening strategy. Several conventional biomarkers have shown promise in the detection of pancreatic cancer. Carbohydrate antigen 19-9 (CA19-9) and carcinoembryonic antigen (CEA) are the most widely used biomarkers for the screening of PDAC [83]. Unfortunately, both biomarkers have been proven to lack sensitivity and specificity for early-stage detection, and they are mainly used for monitoring treatment response and detecting recurrent pancreatic cancer [84]. Combined measurements can potentially increase sensitivity and specificity; for example, Yang et al. developed a neural network for detecting pancreatic cancer that combines multiple tumor markers (CA19-9, CEA, and CA125). In a retrospective analysis of 913 serum specimens, the performance of the neural network was superior to that of each serum tumor marker alone, with an AUC of 0.905 and a diagnostic accuracy of 83.5% [85].
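The sketch below illustrates the general idea of combining several serum markers in a small neural network classifier. It uses synthetic marker values and a generic scikit-learn multilayer perceptron; it does not reproduce the architecture or data of Yang et al.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical serum panel per patient: [CA19-9, CEA, CA125]; label 1 = PDAC
rng = np.random.default_rng(1)
controls = rng.lognormal(mean=[2.5, 0.5, 2.5], sigma=0.6, size=(100, 3))
cancer = rng.lognormal(mean=[4.5, 1.2, 3.2], sigma=0.8, size=(100, 3))
X = np.vstack([controls, cancer])
y = np.array([0] * 100 + [1] * 100)

# Small feed-forward network that learns a joint decision from the markers
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                    random_state=1))
model.fit(X, y)
print(model.predict_proba(X[:3])[:, 1])  # combined-marker PDAC probability
```

The point of the combination is that no single marker needs to be decisive; the network weighs the markers jointly, which is what allows the combined test to outperform each marker alone.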
Additionally, other novel biomarkers have been identified that are present in early pancreatic adenocarcinomas, such as microRNAs (miRNAs). MiRNAs are small RNA molecules that regulate gene expression. Specific miRNAs such as miR-21, miR-155, and miR-196a are present in pancreatic fluids, have been identified as dysregulated in pancreatic cancer, and may serve as diagnostic biomarkers [86]. However, they tend to have the same limitations in sensitivity and specificity as conventional biomarkers. An interesting study by Cao et al. used machine learning to identify two diagnostic panels based on miRNA expression in plasma to differentiate between pancreatic cancer and chronic pancreatitis. Using 361 plasma samples from six centers in China, they achieved an accuracy of 83.6% for distinguishing pancreatic cancer from chronic pancreatitis. In addition, they demonstrated that the diagnostic value of both panels was comparable to that of CA19-9 [87].
To date, AI-based algorithms have shown great promise in medical image analysis, including CT scans and MRIs. The integration of multiple biomarkers and AI-based image analysis could enable a more comprehensive evaluation of the disease, facilitating the identification of high-risk individuals and potentially detecting pancreatic cancer at earlier stages when treatment options are more effective. Therefore, the most promising strategy for developing complete screening tests for pancreatic cancer should include a combination of imaging data, biomarkers, and patient data. AI-based algorithms can leverage this combined information to learn patterns and features, potentially leading to improved diagnostic accuracy and comprehensive screening of patients.

3.3. Representative Data, Bias, and Confounders

Earlier forms of AI applications failed to achieve widespread adoption, mainly due to their limited technical performance [88,89]. AI's unique strength is its ability to learn from complex datasets and recognize subtle patterns; however, it also suffers from a host of imperfections, including poor transferability to different populations, bias, and the accidental fitting of confounders [16].
Most AI models are far from achieving reliable generalizability, because building a trustworthy model requires a representative and diverse dataset that resembles the target population expected during use. Models are trained using variable methodologies on different populations with their respective characteristics and are therefore not automatically transferable to other hospitals, as illustrated by a recent study on the detection of abnormal chest radiographs: at a fixed operating point, specificity varied widely, from 0.566 to 1.000, across five independent datasets [90]. Independent local datasets that represent a sample of their own population could be used as supplementary local training data to adapt an existing algorithm before formal testing. For less complicated tasks such as medical image classification, this problem may be less crucial and might be overcome using large, heterogeneous, multicenter datasets [91].
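As a rough illustration of such local adaptation, the sketch below freezes the feature extractor of a (here, mock) pretrained classifier and re-fits only its final decision layer on a small local dataset. PyTorch, the toy architecture, and the synthetic local data are all assumptions made for the example; they do not describe any specific deployed system.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical: `pretrained_model` was trained elsewhere; `local_ds` holds a
# small sample of the local population (images + labels) for adaptation.
pretrained_model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
    nn.Flatten(), nn.Linear(16, 1),
)
local_ds = TensorDataset(torch.randn(32, 1, 64, 64),
                         torch.randint(0, 2, (32, 1)).float())

# Freeze the learned feature extractor, re-fit only the final decision layer
for p in pretrained_model[:-1].parameters():
    p.requires_grad = False
head = pretrained_model[-1]
opt = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for epoch in range(5):                       # brief local adaptation pass
    for x, y in DataLoader(local_ds, batch_size=8, shuffle=True):
        opt.zero_grad()
        loss = loss_fn(pretrained_model(x), y)
        loss.backward()
        opt.step()
```

After such an adaptation step, performance would still need to be verified on an untouched local test set before any clinical use.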
Another major concern is that the datasets used to train AI models may contain unintended biases present in historical data. These include biases related to missing data, sample size, underestimation, misclassification, and measurement error. There is also a general concern that biases in the data used by machine learning algorithms may contribute to socioeconomic disparities in healthcare [92]. Algorithms introduced in both medical and non-medical fields have already produced problematic recommendations that reflect biases inherent in the training data [93]. Prediction of an offender's risk of recidivism has shown a tendency towards racial discrimination, as has prediction of the risk of cardiovascular events in non-white populations [94,95]. Similar biases could inadvertently be built into healthcare algorithms. To reduce such risks, the prediction model risk of bias assessment tool (PROBAST) criteria can be used to assess the risk of bias in prediction model studies [96].
Finally, machine learning algorithms may exploit confounding factors and spurious correlations to achieve the best possible performance [16]. In this way, an algorithm may draw conclusions based on accidental correlations found in the training data, impairing its ability to generalize to new datasets. Remarkable examples of accidentally fitted confounders in healthcare are the presence of a ruler or surgical skin markings, which algorithms have correlated with an increased likelihood of a cancerous skin lesion [97,98]. Therefore, ongoing development is required to understand the specific features and possible biases that are being learned by AI models before adopting them in healthcare.

3.4. Acceptance of AI in Daily Workflow

Over the last decades, AI has rapidly become one of the major academic research subjects in medical image interpretation, and the number of AI studies has grown at an exceptional rate. Thousands of AI solutions for radiology have been developed in research labs, and, to date, some of these perform on par with or even better than clinicians [99,100,101]. However, a recent study showed that most AI applications remain within the testing and prototyping environment, and surprisingly few applications successfully make their way from the research lab into clinical practice [102]. This may be explained in several ways.
Most of the current evidence is based on retrospective studies with extensive benchmarking of the algorithm's performance against that of experts [103]. A higher level of evidence is needed to understand the true performance of AI systems when they encounter real clinical data. Although some studies provide excellent results, a pure focus on the algorithm's performance can come at the expense of study quality and increase the risk of bias. Hence, regulatory guidelines have been proposed by various parties to guarantee patient safety [104,105]. Recently, Van de Sande et al. proposed a step-by-step approach to improve the quality, safety, and transparency of AI research and to improve clinicians' understanding of AI technology [102].
Another well-recognized cause of this low adoption and actual use is a lack of human-centered AI design. A poor fit with the working lives of healthcare providers and a lack of trust or trustworthiness have been shown to hamper the uptake of AI into the daily clinical workflow [16,17]. Physicians often work on tight schedules, and if AI creates additional complexity, they will have a hard time embracing new innovations.
In an environment where physicians have to make split-second decisions with potentially far-reaching consequences, trust in new technology is important but may also be fragile. This is especially pertinent as AI poses unique potential trust issues, such as the frequent lack of explainability of its results (the black box) and a tendency to occasionally produce unexpected recommendations. It is essential to advance AI from the lab into a clinical setting while keeping humans (patients and physicians) central to the development and evaluation process. Finally, due to the lack of substantial evidence on the added value of AI applications, the acquisition of funds can be uncertain.

3.5. Discussion for Future Development

AI has emerged as a high-potential technology that will shape the future of healthcare, and deep learning with convolutional neural networks (CNNs), transformers, and diffusion models has rapidly raised the standard for AI applications. However, translating research techniques into effective clinical deployment presents a new opportunity for clinical, machine learning, and human-computer interaction design research. Considering the current issues, we believe the following aspects will become increasingly important for the further progression of AI in pancreatic cancer detection:
First, training a deep learning algorithm for complicated tasks requires large amounts of high-quality labeled data; due to the relatively low prevalence of pancreatic cancer, this is one of the biggest hurdles hampering the training of AI-based algorithms. The first indications of the potential of AI applications are clearly visible in retrospective research on small, homogeneous datasets, where these systems demonstrated impressive results in the detection of pancreatic tumors. Since datasets are the primary drivers of AI-based algorithms, collaboration and data sharing between expert centers for pancreatic surgery would provide the opportunity to deal with the scarcity of high-quality labeled datasets.
Second, for a proper evaluation of the model, the results should be presented transparently, following clear standards. To avoid misconceptions, it should be clear which metrics were used to evaluate the performance of the AI model; metrics such as accuracy, sensitivity, specificity, and the area under the receiver operating characteristic curve should be described [102,106]. Clear guidelines can be followed, such as the transparent reporting of a multivariable prediction model for individual prognosis or diagnosis (TRIPOD) statement [107,108]. Additionally, a targeted calculation of the sample size of the test set, as proposed by Riley et al., which is needed to accurately determine the quality of early-stage algorithms, could facilitate an unbiased evaluation [109].
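For clarity on how these metrics relate to each other, the sketch below derives accuracy, sensitivity, specificity, and AUC from a hypothetical held-out test set using scikit-learn; the labels, scores, and the 0.5 decision threshold are illustrative assumptions.

```python
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix, roc_auc_score

# Hypothetical held-out test set: true labels and model output probabilities
y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1, 1, 0])
y_score = np.array([0.10, 0.40, 0.80, 0.35, 0.20, 0.90, 0.60, 0.70, 0.55, 0.05])
y_pred = (y_score >= 0.5).astype(int)       # report the threshold used

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
metrics = {
    "accuracy": accuracy_score(y_true, y_pred),
    "sensitivity": tp / (tp + fn),          # true-positive rate (recall)
    "specificity": tn / (tn + fp),          # true-negative rate
    "auc": roc_auc_score(y_true, y_score),  # threshold-independent summary
}
print(metrics)
```

Reporting the operating threshold alongside the threshold-independent AUC is what allows readers to judge how a model would behave at a clinically relevant working point.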
Third, for an algorithm to be of added value in clinical practice, aspects such as reliability, uncertainty, and robustness against variation will become increasingly important. This may call for novel approaches involving strong collaboration between clinicians and AI. While the detection of cancer is the first essential step, follow-up tasks such as tumor resection planning require extremely detailed segmentations of the tumor and the surrounding anatomical structures. Currently existing approaches do not yet meet this high accuracy requirement. The current research trend is to design better models to address these shortcomings, while it is possible that there will always be unaccounted-for edge cases or differing opinions on the correct segmentation. A possible solution to bridge the gap between what is technically possible and what is clinically necessary is interactive AI (IAI), where a user can provide the AI with additional input. By implementing IAI, the pattern recognition abilities of AI and the domain knowledge of the clinician can be combined, resulting in more accurate and robust results [110]. This could enable clinicians to achieve highly accurate annotations of the structures of interest and could improve the effectiveness and implementation of AI algorithms in clinical practice.
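One simple way to picture such interaction is a model-produced probability map that is locally corrected by clinician clicks. The sketch below implements this idea in a deliberately naive form: the click influence function, its decay length, and the 0.5 threshold are illustrative assumptions rather than a published IAI method.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def refine_with_clicks(prob_map, fg_clicks, bg_clicks, weight=0.3):
    """Blend a model's tumor-probability map with clinician clicks:
    foreground clicks pull nearby pixels towards tumor, background
    clicks push them away; influence decays with distance."""
    def influence(clicks):
        not_clicked = np.ones_like(prob_map, dtype=bool)
        for r, c in clicks:
            not_clicked[r, c] = False
        dist = distance_transform_edt(not_clicked)   # distance to nearest click
        return np.exp(-dist / 10.0)                  # ~10-pixel decay length

    adjusted = (prob_map + weight * influence(fg_clicks)
                         - weight * influence(bg_clicks))
    return np.clip(adjusted, 0.0, 1.0) >= 0.5        # refined binary mask

# Hypothetical 64 x 64 probability map and one corrective click of each type
prob = np.random.default_rng(0).random((64, 64)) * 0.4
mask = refine_with_clicks(prob, fg_clicks=[(32, 32)], bg_clicks=[(5, 5)])
```

In a real IAI system, the clicks would typically be fed back into the network itself, but the loop is the same: the clinician supplies targeted corrections, and the model updates its segmentation accordingly.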
Fourth, multidimensional radiological techniques, such as PET-CT and PET-MRI, may offer significant added value in combination with AI-based approaches. Through the use of positron emission tomography (PET), these techniques provide complementary quantitative information, such as metabolic activity, perfusion, and microenvironment, to traditional CT and MRI imaging [111]. This allows for a more accurate and comprehensive assessment of pancreatic tumors. AI-based algorithms can leverage this combined information to learn patterns and features, potentially leading to improved diagnostic accuracy [112,113]. Additionally, intelligent 4D imaging on ultra-high-sensitivity PET, such as Large Axial Field of View (LAFOV) PET, could enhance the detection of small lesions and improve image quality. 4D imaging refers to the acquisition of imaging data over time, allowing for the assessment of dynamic processes within the body [114]. Potentially, AI algorithms can analyze and interpret the data obtained from LAFOV PET, which could enable a more accurate assessment of tumor behavior or prediction of disease progression. Integration of AI with multidimensional imaging techniques, including 4D imaging, may also aid in the development of personalized treatment strategies, as it can provide valuable insights into tumor biology, heterogeneity, and response to therapy.
Lastly, the development and implementation of AI require an interdisciplinary approach between technological and medical partners. A strong collaboration between medical doctors and engineers is necessary to create a common understanding of the possibilities and limitations of AI. After all, it has already been shown that humans assisted by AI performed better than either alone in a study of diabetic retinopathy screening [115]. Understanding human-algorithm interactions and the importance of human-centered AI will be critical to the future adoption of AI applications. The engine of the AI model, as well as the user interface (UI) through which the physician and the AI interact, must be optimized for the performance of this physician-AI team. Choices in AI development and UI design influence how physicians work with AI to leverage the collective intelligence of the physician and the AI [116]. Additionally, facilitating smooth integration will be crucial for clinical adoption. Medical practitioners are often hesitant to adopt new technology that interrupts their traditional practice patterns [117]. The additional time required to implement a new technology is a significant barrier to clinician acceptance, since high workloads leave clinicians little room for an adequate implementation.

4. Conclusions

AI is a valuable approach to overcoming the current challenges in the diagnosis and optimal treatment of pancreatic cancer. By facilitating applications for early detection, assessing resectability, and determining treatment response to neoadjuvant therapy, AI can revolutionize pancreatic cancer detection. However, AI in pancreatic cancer is still in its infancy, and more collaboration between technical experts and clinical practice is needed to design the applications that will improve diagnostic accuracy. In order to progress, future research should focus on three key challenges: (1) the use of representative, heterogeneous datasets; (2) the transparent presentation of results; and (3) understanding the importance of human-centered AI. Over the past decades, the exponential increase in the amount of visual data and computational power has enabled AI to demonstrate its vast capabilities. Clinical adoption of AI will be catalyzed by high-quality prospective studies and optimal human-AI interaction.

Author Contributions

Conceptualization, M.R., C.G.A.V., B.V.J., M.G.B. and M.D.P.L.; methodology, M.R., C.G.A.V. and B.V.J.; validation, J.N., F.v.d.S., I.J., J.R.P., D.M., M.G.B. and M.D.P.L.; formal analysis, M.R., C.G.A.V. and B.V.J.; investigation, M.R., C.G.A.V., B.V.J., L.E., T.A.E.H. and K.v.d.W.; data curation, M.R., C.G.A.V., B.V.J., L.E. and T.A.E.H.; funding acquisition, I.J.; writing—original draft preparation, M.R., C.G.A.V. and B.V.J.; writing—review and editing, T.A.E.H., L.E., K.v.d.W., J.N., F.v.d.S., I.J., J.R.P., D.M., M.G.B. and M.D.P.L.; supervision, M.G.B. and M.D.P.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by an Eindhoven AI Systems Institute (EAISI) grant (RVO characteristic TKI2112P08).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

We want to thank all other team members from the Eindhoven MedTech Innovation Centre (E/MTIC) Oncology Collaborative group for their input.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Rahib, L.; Smith, B.D.; Aizenberg, R.; Rosenzweig, A.B.; Fleshman, J.M.; Matrisian, L.M. Projecting Cancer Incidence and Deaths to 2030: The Unexpected Burden of Thyroid, Liver, and Pancreas Cancers in the United States. Cancer Res. 2014, 74, 2913–2921.
2. American Cancer Society. Facts & Figures 2019; American Cancer Society: Atlanta, GA, USA, 2019; pp. 1–76.
3. Conroy, T.; Hammel, P.; Hebbar, M.; Ben Abdelghani, M.; Wei, A.C.; Raoul, J.-L.; Choné, L.; Francois, E.; Artru, P.; Biagi, J.J.; et al. FOLFIRINOX or Gemcitabine as Adjuvant Therapy for Pancreatic Cancer. N. Engl. J. Med. 2018, 379, 2395–2406.
4. Latenstein, A.E.J.; van der Geest, L.G.M.; Bonsing, B.A.; Groot Koerkamp, B.; Haj Mohammad, N.; de Hingh, I.H.J.T.; de Meijer, V.E.; Molenaar, I.Q.; van Santvoort, H.C.; van Tienhoven, G.; et al. Nationwide Trends in Incidence, Treatment and Survival of Pancreatic Ductal Adenocarcinoma. Eur. J. Cancer 2020, 125, 83–93.
5. De La Cruz, M.S.D.; Young, A.P.; Ruffin, M.T. Diagnosis and Management of Pancreatic Cancer. Am. Fam. Physician 2014, 89, 626–632.
6. Van der Geest, L.G.M.; Lemmens, V.E.P.P.; de Hingh, I.H.J.T.; van Laarhoven, C.J.H.M.; Bollen, T.L.; Nio, C.Y.; van Eijck, C.H.J.; Busch, O.R.C.; Besselink, M.G.; Dutch Pancreatic Cancer Group. Nationwide Outcomes in Patients Undergoing Surgical Exploration without Resection for Pancreatic Cancer. Br. J. Surg. 2017, 104, 1568–1577.
7. Gheorghe, G.; Bungau, S.; Ilie, M.; Behl, T.; Vesa, C.M.; Brisc, C.; Bacalbasa, N.; Turi, V.; Costache, R.S.; Diaconu, C.C. Early Diagnosis of Pancreatic Cancer: The Key for Survival. Diagnostics 2020, 10, 869.
8. Allan, B.J.; Novak, S.M.; Hogg, M.E.; Zeh, H.J. Robotic Vascular Resections during Whipple Procedure. J. Vis. Surg. 2018, 4, 13.
9. Zhang, L.; Sanagapalli, S.; Stoita, A. Challenges in Diagnosis of Pancreatic Cancer. World J. Gastroenterol. 2018, 24, 2047–2060.
10. DPCG. CT Staging for Adenocarcinoma of the Pancreatic Head and Uncinate Process. 2012. Available online: https://dpcg.nl/wp-content/uploads/2020/04/Criteria_resectabiliteit.pdf (accessed on 12 February 2023).
11. Tempero, M.A.; Malafa, M.P.; Al-Hawary, M.; Behrman, S.W.; Benson, A.B.; Cardin, D.B.; Chiorean, E.G.; Chung, V.; Czito, B.; Del Chiaro, M.; et al. Pancreatic Adenocarcinoma, Version 2.2021, NCCN Clinical Practice Guidelines in Oncology. J. Natl. Compr. Canc. Netw. 2021, 19, 439–457.
12. Asbun, H.J.; Moekotte, A.L.; Vissers, F.L.; Kunzler, F.; Cipriani, F.; Alseidi, A.; D'Angelica, M.I.; Balduzzi, A.; Bassi, C.; Björnsson, B.; et al. The Miami International Evidence-Based Guidelines on Minimally Invasive Pancreas Resection. Ann. Surg. 2020, 271, 1–14.
13. Cassinotto, C.; Mouries, A.; Lafourcade, J.-P.; Terrebonne, E.; Belleannée, G.; Blanc, J.-F.; Lapuyade, B.; Vendrely, V.; Laurent, C.; Chiche, L.; et al. Locally Advanced Pancreatic Adenocarcinoma: Reassessment of Response with CT after Neoadjuvant Chemotherapy and Radiation Therapy. Radiology 2014, 273, 108–116.
14. White, R.R.; Paulson, E.K.; Freed, K.S.; Keogan, M.T.; Hurwitz, H.I.; Lee, C.; Morse, M.A.; Gottfried, M.R.; Baillie, J.; Branch, M.S.; et al. Staging of Pancreatic Cancer before and after Neoadjuvant Chemoradiation. J. Gastrointest. Surg. 2001, 5, 626–633.
15. Cassinotto, C.; Sa-Cunha, A.; Trillaud, H. Radiological Evaluation of Response to Neoadjuvant Treatment in Pancreatic Cancer. Diagn. Interv. Imaging 2016, 97, 1225–1232.
16. Kelly, C.J.; Karthikesalingam, A.; Suleyman, M.; Corrado, G.; King, D. Key Challenges for Delivering Clinical Impact with Artificial Intelligence. BMC Med. 2019, 17, 195.
17. Strohm, L.; Hehakaya, C.; Ranschaert, E.R.; Boon, W.P.C.; Moors, E.H.M. Implementation of Artificial Intelligence (AI) Applications in Radiology: Hindering and Facilitating Factors. Eur. Radiol. 2020, 30, 5525–5532.
18. Mizrahi, J.D.; Surana, R.; Valle, J.W.; Shroff, R.T. Pancreatic Cancer. Lancet 2020, 395, 2008–2020.
19. Miura, F.; Takada, T.; Amano, H.; Yoshida, M.; Furui, S.; Takeshita, K. Diagnosis of Pancreatic Cancer. HPB 2006, 8, 337–342.
20. Al-Hawary, M.M.; Francis, I.R.; Chari, S.T.; Fishman, E.K.; Hough, D.M.; Lu, D.S.; Macari, M.; Megibow, A.J.; Miller, F.H.; Mortele, K.J.; et al. Pancreatic Ductal Adenocarcinoma Radiology Reporting Template: Consensus Statement of the Society of Abdominal Radiology and the American Pancreatic Association. Gastroenterology 2014, 146, 291–304.e1.
21. Lee, E.S.; Lee, J.M. Imaging Diagnosis of Pancreatic Cancer: A State-of-the-Art Review. World J. Gastroenterol. 2014, 20, 7864–7877.
22. Treadwell, J.R.; Zafar, H.M.; Mitchell, M.D.; Tipton, K.; Teitelbaum, U.; Jue, J. Imaging Tests for the Diagnosis and Staging of Pancreatic Adenocarcinoma: A Meta-Analysis. Pancreas 2016, 45, 789–795.
23. Raman, S.P.; Horton, K.M.; Fishman, E.K. Multimodality Imaging of Pancreatic Cancer-Computed Tomography, Magnetic Resonance Imaging, and Positron Emission Tomography. Cancer J. 2012, 18, 511–522.
24. Yousaf, M.N.; Chaudhary, F.S.; Ehsan, A.; Suarez, A.L.; Muniraj, T.; Jamidar, P.; Aslanian, H.R.; Farrell, J.J. Endoscopic Ultrasound (EUS) and the Management of Pancreatic Cancer. BMJ Open Gastroenterol. 2020, 7, e000408.
25. Kitano, M.; Yoshida, T.; Itonaga, M.; Tamura, T.; Hatamaru, K.; Yamashita, Y. Impact of Endoscopic Ultrasonography on Diagnosis of Pancreatic Cancer. J. Gastroenterol. 2019, 54, 19–32.
26. Agarwal, B.; Correa, A.M.; Ho, L. Survival in Pancreatic Carcinoma Based on Tumor Size. Pancreas 2008, 36, e15–e20.
27. Siegel, R.L.; Miller, K.D.; Jemal, A. Cancer Statistics, 2019. CA. Cancer J. Clin. 2019, 69, 7–34.
28. Ardengh, J.C.; de Paulo, G.A.; Ferrari, A.P. Pancreatic Carcinomas Smaller than 3.0 Cm: Endosonography (EUS) in Diagnosis, Staging and Prediction of Resectability. HPB 2003, 5, 226–230.
29. Yamaguchi, K.; Mizumoto, K.; Noshiro, H.; Sugitani, A.; Shimizu, S.; Chijiiwa, K.; Tanaka, M. Pancreatic Carcinoma: ≤2 cm versus >2 cm in Size. Int. Surg. 1999, 84, 213–219.
30. Elbanna, K.Y.; Jang, H.-J.; Kim, T.K. Imaging Diagnosis and Staging of Pancreatic Ductal Adenocarcinoma: A Comprehensive Review. Insights Imaging 2020, 11, 58.
31. Kang, J.D.; Clarke, S.E.; Costa, A.F. Factors Associated with Missed and Misinterpreted Cases of Pancreatic Ductal Adenocarcinoma. Eur. Radiol. 2021, 31, 2422–2432.
32. Yoon, S.H.; Lee, J.M.; Cho, J.Y.; Lee, K.B.; Kim, J.E.; Moon, S.K.; Kim, S.J.; Baek, J.H.; Kim, S.H.; Kim, S.H.; et al. Small (≤20 mm) Pancreatic Adenocarcinomas: Analysis of Enhancement Patterns and Secondary Signs with Multiphasic Multidetector CT. Radiology 2011, 259, 442–452.
33. Wong, J.C.; Raman, S. Surgical Resectability of Pancreatic Adenocarcinoma: CTA. Abdom. Imaging 2010, 35, 471–480.
34. Gangi, S.; Fletcher, J.G.; Nathan, M.A.; Christensen, J.A.; Harmsen, W.S.; Crownhart, B.S.; Chari, S.T. Time Interval between Abnormalities Seen on CT and the Clinical Diagnosis of Pancreatic Cancer: Retrospective Review of CT Scans Obtained before Diagnosis. AJR. Am. J. Roentgenol. 2004, 182, 897–903.
35. Jang, K.M.; Kim, S.H.; Kim, Y.K.; Song, K.D.; Lee, S.J.; Choi, D. Missed Pancreatic Ductal Adenocarcinoma: Assessment of Early Imaging Findings on Prediagnostic Magnetic Resonance Imaging. Eur. J. Radiol. 2015, 84, 1473–1479.
36. Ahn, S.S.; Kim, M.-J.; Choi, J.-Y.; Hong, H.-S.; Chung, Y.E.; Lim, J.S. Indicative Findings of Pancreatic Cancer in Prediagnostic CT. Eur. Radiol. 2009, 19, 2448–2455.
37. Singh, D.P.; Sheedy, S.; Goenka, A.H.; Wells, M.; Lee, N.J.; Barlow, J.; Sharma, A.; Kandlakunta, H.; Chandra, S.; Garg, S.K.; et al. Computerized Tomography Scan in Pre-Diagnostic Pancreatic Ductal Adenocarcinoma: Stages of Progression and Potential Benefits of Early Intervention: A Retrospective Study. Pancreatology 2020, 20, 1495–1501.
38. Bakens, M.J.A.M.; van Gestel, Y.R.B.M.; Bongers, M.; Besselink, M.G.H.; Dejong, C.H.C.; Molenaar, I.Q.; Busch, O.R.C.; Lemmens, V.E.P.P.; de Hingh, I.H.J.T.; Dutch Pancreatic Cancer Group. Hospital of Diagnosis and Likelihood of Surgical Treatment for Pancreatic Cancer. Br. J. Surg. 2015, 102, 1670–1675.
39. Tran Cao, H.S.; Balachandran, A.; Wang, H.; Nogueras-González, G.M.; Bailey, C.E.; Lee, J.E.; Pisters, P.W.T.; Evans, D.B.; Varadhachary, G.; Crane, C.H.; et al. Radiographic Tumor-Vein Interface as a Predictor of Intraoperative, Pathologic, and Oncologic Outcomes in Resectable and Borderline Resectable Pancreatic Cancer. J. Gastrointest. Surg. 2014, 18, 269–278; discussion 278.
40. Versteijne, E.; Gurney-Champion, O.J.; van der Horst, A.; Lens, E.; Kolff, M.W.; Buijsen, J.; Ebrahimi, G.; Neelis, K.J.; Rasch, C.R.N.; Stoker, J.; et al. Considerable Interobserver Variation in Delineation of Pancreatic Cancer on 3DCT and 4DCT: A Multi-Institutional Study. Radiat. Oncol. 2017, 12, 58.
41. Joo, I.; Lee, J.M.; Lee, E.S.; Son, J.-Y.; Lee, D.H.; Ahn, S.J.; Chang, W.; Lee, S.M.; Kang, H.-J.; Yang, H.K. Preoperative CT Classification of the Resectability of Pancreatic Cancer: Interobserver Agreement. Radiology 2019, 293, 343–349.
42. Ausania, F.; Vallance, A.E.; Manas, D.M.; Prentis, J.M.; Snowden, C.P.; White, S.A.; Charnley, R.M.; French, J.J.; Jaques, B.C. Double Bypass for Inoperable Pancreatic Malignancy at Laparotomy: Postoperative Complications and Long-Term Outcome. Ann. R. Coll. Surg. Engl. 2012, 94, 563–568.
43. Giannone, F.; Capretti, G.; Abu Hilal, M.; Boggi, U.; Campra, D.; Cappelli, C.; Casadei, R.; De Luca, R.; Falconi, M.; Giannotti, G.; et al. Resectability of Pancreatic Cancer Is in the Eye of the Observer. Ann. Surg. Open 2021, 2, e087.
44. Versteijne, E.; Vogel, J.A.; Besselink, M.G.; Busch, O.R.C.; Wilmink, J.W.; Daams, J.G.; van Eijck, C.H.J.; Groot Koerkamp, B.; Rasch, C.R.N.; van Tienhoven, G.; et al. Meta-Analysis Comparing Upfront Surgery with Neoadjuvant Treatment in Patients with Resectable or Borderline Resectable Pancreatic Cancer. Br. J. Surg. 2018, 105, 946–958.
45. Janssen, Q.P.; Buettner, S.; Suker, M.; Beumer, B.R.; Addeo, P.; Bachellier, P.; Bahary, N.; Bekaii-Saab, T.; Bali, M.A.; Besselink, M.G.; et al. Neoadjuvant FOLFIRINOX in Patients with Borderline Resectable Pancreatic Cancer: A Systematic Review and Patient-Level Meta-Analysis. J. Natl. Cancer Inst. 2019, 111, 782–794.
46. Hosny, A.; Parmar, C.; Quackenbush, J.; Schwartz, L.H.; Aerts, H.J.W.L. Artificial Intelligence in Radiology. Nat. Rev. Cancer 2018, 18, 500–510.
47. Park, S.; Chu, L.C.; Hruban, R.H.; Vogelstein, B.; Kinzler, K.W.; Yuille, A.L.; Fouladi, D.F.; Shayesteh, S.; Ghandili, S.; Wolfgang, C.L.; et al. Differentiating Autoimmune Pancreatitis from Pancreatic Ductal Adenocarcinoma with CT Radiomics Features. Diagn. Interv. Imaging 2020, 101, 555–564.
48. Ziegelmayer, S.; Kaissis, G.; Harder, F.; Jungmann, F.; Müller, T.; Makowski, M.; Braren, R. Deep Convolutional Neural Network-Assisted Feature Extraction for Diagnostic Discrimination and Feature Visualization in Pancreatic Ductal Adenocarcinoma (PDAC) versus Autoimmune Pancreatitis (AIP). J. Clin. Med. 2020, 9, 4013.
49. Rigiroli, F.; Hoye, J.; Lerebours, R.; Lafata, K.J.; Li, C.; Meyer, M.; Lyu, P.; Ding, Y.; Schwartz, F.R.; Mettu, N.B.; et al. CT Radiomic Features of Superior Mesenteric Artery Involvement in Pancreatic Ductal Adenocarcinoma: A Pilot Study. Radiology 2021, 301, 610–622.
50. Chu, L.C.; Park, S.; Kawamoto, S.; Wang, Y.; Zhou, Y.; Shen, W.; Zhu, Z.; Xia, Y.; Xie, L.; Liu, F.; et al. Application of Deep Learning to Pancreatic Cancer Detection: Lessons Learned from Our Initial Experience. J. Am. Coll. Radiol. 2019, 16, 1338–1342.
51. Liu, S.-L.; Li, S.; Guo, Y.-T.; Zhou, Y.-P.; Zhang, Z.-D.; Li, S.; Lu, Y. Establishment and Application of an Artificial Intelligence Diagnosis System for Pancreatic Cancer with a Faster Region-Based Convolutional Neural Network. Chin. Med. J. 2019, 132, 2795–2803.
52. Zhu, Z.; Xia, Y.; Xie, L.; Fishman, E.K.; Yuille, A.L. Multi-Scale Coarse-to-Fine Segmentation for Screening Pancreatic Ductal Adenocarcinoma. In Proceedings of the Medical Image Computing and Computer Assisted Intervention–MICCAI 2019: 22nd International Conference, Shenzhen, China, 13–17 October 2019; pp. 3–12.
53. Chu, L.C.; Park, S.; Kawamoto, S.; Fouladi, D.F.; Shayesteh, S.; Zinreich, E.S.; Graves, J.S.; Horton, K.M.; Hruban, R.H.; Yuille, A.L.; et al. Utility of CT Radiomics Features in Differentiation of Pancreatic Ductal Adenocarcinoma from Normal Pancreatic Tissue. AJR. Am. J. Roentgenol. 2019, 213, 349–357.
54. Liu, K.-L.; Wu, T.; Chen, P.-T.; Tsai, Y.M.; Roth, H.; Wu, M.-S.; Liao, W.-C.; Wang, W. Deep Learning to Distinguish Pancreatic Cancer Tissue from Non-Cancerous Pancreatic Tissue: A Retrospective Study with Cross-Racial External Validation. Lancet. Digit. Health 2020, 2, e303–e313.
55. Zhang, Z.; Li, S.; Wang, Z.; Lu, Y. A Novel and Efficient Tumor Detection Framework for Pancreatic Cancer via CT Images. In Proceedings of the 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, 20–24 July 2020.
56. Ma, H.; Liu, Z.-X.; Zhang, J.-J.; Wu, F.-T.; Xu, C.-F.; Shen, Z.; Yu, C.-H.; Li, Y.-M. Construction of a Convolutional Neural Network Classifier Developed by Computed Tomography Images for Pancreatic Cancer Diagnosis. World J. Gastroenterol. 2020, 26, 5156–5168.
57. Si, K.; Xue, Y.; Yu, X.; Zhu, X.; Li, Q.; Gong, W.; Liang, T.; Duan, S. Fully End-to-End Deep-Learning-Based Diagnosis of Pancreatic Tumors. Theranostics 2021, 11, 1982–1990.
58. Qiu, J.-J.; Yin, J.; Qian, W.; Liu, J.-H.; Huang, Z.-X.; Yu, H.-P.; Ji, L.; Zeng, X.-X. A Novel Multiresolution-Statistical Texture Analysis Architecture: Radiomics-Aided Diagnosis of PDAC Based on Plain CT Images. IEEE Trans. Med. Imaging 2021, 40, 12–25.
59. Ebrahimian, S.; Singh, R.; Netaji, A.; Madhusudhan, K.S.; Homayounieh, F.; Primak, A.; Lades, F.; Saini, S.; Kalra, M.K.; Sharma, S. Characterization of Benign and Malignant Pancreatic Lesions with DECT Quantitative Metrics and Radiomics. Acad. Radiol. 2022, 29, 705–713.
60. Viviers, C.G.A.; Ramaekers, M.; de With, P.H.N.; Mavroeidis, D.; Nederend, J.; Luyer, M.; van der Sommen, F. Improved Pancreatic Tumor Detection by Utilizing Clinically-Relevant Secondary Features. In Proceedings of the First International Workshop, CaPTion 2022, Held in Conjunction with MICCAI 2022, Singapore, 22 September 2022.
61. Alves, N.; Schuurmans, M.; Litjens, G.; Bosma, J.S.; Hermans, J.; Huisman, H. Fully Automatic Deep Learning Framework for Pancreatic Ductal Adenocarcinoma Detection on Computed Tomography. Cancers 2022, 14, 376.
62. Chen, P.-T.; Wu, T.; Wang, P.; Chang, D.; Liu, K.-L.; Wu, M.-S.; Roth, H.R.; Lee, P.-C.; Liao, W.-C.; Wang, W. Pancreatic Cancer Detection on CT Scans with Deep Learning: A Nationwide Population-Based Study. Radiology 2023, 306, 172–182.
63. Kenner, B.; Chari, S.T.; Kelsen, D.; Klimstra, D.S.; Pandol, S.J.; Rosenthal, M.; Rustgi, A.K.; Taylor, J.A.; Yala, A.; Abul-Husn, N.; et al. Artificial Intelligence and Early Detection of Pancreatic Cancer: 2020 Summative Review. Pancreas 2021, 50, 251–279.
64. Kaissis, G.; Ziegelmayer, S.; Lohöfer, F.; Algül, H.; Eiber, M.; Weichert, W.; Schmid, R.; Friess, H.; Rummeny, E.; Ankerst, D.; et al. A Machine Learning Model for the Prediction of Survival and Tumor Subtype in Pancreatic Ductal Adenocarcinoma from Preoperative Diffusion-Weighted Imaging. Eur. Radiol. Exp. 2019, 3, 41.
65. Kaissis, G.; Ziegelmayer, S.; Lohöfer, F.; Steiger, K.; Algül, H.; Muckenhuber, A.; Yen, H.-Y.; Rummeny, E.; Friess, H.; Schmid, R.; et al. A Machine Learning Algorithm Predicts Molecular Subtypes in Pancreatic Ductal Adenocarcinoma with Differential Response to Gemcitabine-Based versus FOLFIRINOX Chemotherapy. PLoS ONE 2019, 14, e0218642.
66. Kaissis, G.A.; Ziegelmayer, S.; Lohöfer, F.K.; Harder, F.N.; Jungmann, F.; Sasse, D.; Muckenhuber, A.; Yen, H.-Y.; Steiger, K.; Siveke, J.; et al. Image-Based Molecular Phenotyping of Pancreatic Ductal Adenocarcinoma. J. Clin. Med. 2020, 9, 724.
67. Liang, Y.; Schott, D.; Zhang, Y.; Wang, Z.; Nasief, H.; Paulson, E.; Hall, W.; Knechtges, P.; Erickson, B.; Li, X.A. Auto-Segmentation of Pancreatic Tumor in Multi-Parametric MRI Using Deep Convolutional Neural Networks. Radiother. Oncol. 2020, 145, 193–200.
68. Gao, X.; Wang, X. Deep Learning for World Health Organization Grades of Pancreatic Neuroendocrine Tumors on Contrast-Enhanced Magnetic Resonance Images: A Preliminary Study. Int. J. Comput. Assist. Radiol. Surg. 2019, 14, 1981–1991.
69. Corral, J.E.; Hussein, S.; Kandel, P.; Bolan, C.W.; Bagci, U.; Wallace, M.B. Deep Learning to Classify Intraductal Papillary Mucinous Neoplasms Using Magnetic Resonance Imaging. Pancreas 2019, 48, 805–810.
70. Gao, X.; Wang, X. Performance of Deep Learning for Differentiating Pancreatic Diseases on Contrast-Enhanced Magnetic Resonance Imaging: A Preliminary Study. Diagn. Interv. Imaging 2020, 101, 91–100.
71. Deng, Y.; Ming, B.; Zhou, T.; Wu, J.-L.; Chen, Y.; Liu, P.; Zhang, J.; Zhang, S.-Y.; Chen, T.-W.; Zhang, X.-M. Radiomics Model Based on MR Images to Discriminate Pancreatic Ductal Adenocarcinoma and Mass-Forming Chronic Pancreatitis Lesions. Front. Oncol. 2021, 11, 620981.
72. Zhang, M.-M.; Yang, H.; Jin, Z.-D.; Yu, J.-G.; Cai, Z.-Y.; Li, Z.-S. Differential Diagnosis of Pancreatic Cancer from Normal Tissue with Digital Imaging Processing and Pattern Recognition Based on a Support Vector Machine of EUS Images. Gastrointest. Endosc. 2010, 72, 978–985.
73. Das, A.; Nguyen, C.C.; Li, F.; Li, B. Digital Image Analysis of EUS Images Accurately Differentiates Pancreatic Cancer from Chronic Pancreatitis and Normal Tissue. Gastrointest. Endosc. 2008, 67, 861–867.
74. Ozkan, M.; Cakiroglu, M.; Kocaman, O.; Kurt, M.; Yilmaz, B.; Can, G.; Korkmaz, U.; Dandil, E.; Eksi, Z. Age-Based Computer-Aided Diagnosis Approach for Pancreatic Cancer on Endoscopic Ultrasound Images. Endosc. Ultrasound 2016, 5, 101–107.
75. Norton, I.D.; Zheng, Y.; Wiersema, M.S.; Greenleaf, J.; Clain, J.E.; Dimagno, E.P. Neural Network Analysis of EUS Images to Differentiate between Pancreatic Malignancy and Pancreatitis. Gastrointest. Endosc. 2001, 54, 625–629.
76. Zhu, M.; Xu, C.; Yu, J.; Wu, Y.; Li, C.; Zhang, M.; Jin, Z.; Li, Z. Differentiation of Pancreatic Cancer and Chronic Pancreatitis Using Computer-Aided Diagnosis of Endoscopic Ultrasound (EUS) Images: A Diagnostic Test. PLoS ONE 2013, 8, e63820.
77. Săftoiu, A.; Vilmann, P.; Dietrich, C.F.; Iglesias-Garcia, J.; Hocke, M.; Seicean, A.; Ignee, A.; Hassan, H.; Streba, C.T.; Ioncică, A.M.; et al. Quantitative Contrast-Enhanced Harmonic EUS in Differential Diagnosis of Focal Pancreatic Masses (with Videos). Gastrointest. Endosc. 2015, 82, 59–69.
78. Marinelli, T.; Filippone, A.; Tavano, F.; Fontana, A.; Pellegrini, F.; Köninger, J.; Richter, G.M.; Bonomo, L.; Büchler, M.W.; di Sebastiano, P.; et al. A Tumour Score with Multidetector Spiral CT for Venous Infiltration in Pancreatic Cancer: Influence on Borderline Resectable. Radiol. Med. 2014, 119, 334–342.
79. Klauss, M.; Mohr, A.; von Tengg-Kobligk, H.; Friess, H.; Singer, R.; Seidensticker, P.; Kauczor, H.U.; Richter, G.M.; Kauffmann, G.W.; Grenacher, L. A New Invasion Score for Determining the Resectability of Pancreatic Carcinomas with Contrast-Enhanced Multidetector Computed Tomography. Pancreatology 2008, 8, 204–210.
80. Ahmed, S.A.; Mourad, A.F.; Hassan, R.A.; Ibrahim, M.A.E.; Soliman, A.; Aboeleuon, E.; Elbadee, O.M.A.; Hetta, H.F.; Jabir, M.A. Preoperative CT Staging of Borderline Pancreatic Cancer Patients after Neoadjuvant Treatment: Accuracy in the Prediction of Vascular Invasion and Resectability. Abdom. Radiol. 2021, 46, 280–289.
81. Kim, B.R.; Kim, J.H.; Ahn, S.J.; Joo, I.; Choi, S.-Y.; Park, S.J.; Han, J.K. CT Prediction of Resectability and Prognosis in Patients with Pancreatic Ductal Adenocarcinoma after Neoadjuvant Treatment Using Image Findings and Texture Analysis. Eur. Radiol. 2019, 29, 362–372.
82. Yip, C.; Landau, D.; Kozarski, R.; Ganeshan, B.; Thomas, R.; Michaelidou, A.; Goh, V. Primary Esophageal Cancer: Heterogeneity as Potential Prognostic Biomarker in Patients Treated with Definitive Chemotherapy and Radiation Therapy. Radiology 2014, 270, 141–148.
83. Locker, G.Y.; Hamilton, S.; Harris, J.; Jessup, J.M.; Kemeny, N.; Macdonald, J.S.; Somerfield, M.R.; Hayes, D.F.; Bast, R.C.; ASCO. ASCO 2006 Update of Recommendations for the Use of Tumor Markers in Gastrointestinal Cancer. J. Clin. Oncol. 2006, 24, 5313–5327.
84. Zhang, Y.; Yang, J.; Li, H.; Wu, Y.; Zhang, H.; Chen, W. Tumor Markers CA19-9, CA242 and CEA in the Diagnosis of Pancreatic Cancer: A Meta-Analysis. Int. J. Clin. Exp. Med. 2015, 8, 11683–11691.
85. Yang, Y.; Chen, H.; Wang, D.; Luo, W.; Zhu, B.; Zhang, Z. Diagnosis of Pancreatic Carcinoma Based on Combined Measurement of Multiple Serum Tumor Markers Using Artificial Neural Network Analysis. Chin. Med. J. 2014, 127, 1891–1896.
86. Schultz, N.A.; Dehlendorff, C.; Jensen, B.V.; Bjerregaard, J.K.; Nielsen, K.R.; Bojesen, S.E.; Calatayud, D.; Nielsen, S.E.; Yilmaz, M.; Holländer, N.H.; et al. MicroRNA Biomarkers in Whole Blood for Detection of Pancreatic Cancer. JAMA 2014, 311, 392–404.
87. Cao, Z.; Liu, C.; Xu, J.; You, L.; Wang, C.; Lou, W.; Sun, B.; Miao, Y.; Liu, X.; Wang, X.; et al. Plasma MicroRNA Panels to Diagnose Pancreatic Cancer: Results from a Multicenter Study. Oncotarget 2016, 7, 41575–41583.
88. Van Ginneken, B.; Schaefer-Prokop, C.M.; Prokop, M. Computer-Aided Diagnosis: How to Move from the Laboratory to the Clinic. Radiology 2011, 261, 719–732.
89. Kohli, A.; Jha, S. Why CAD Failed in Mammography. J. Am. Coll. Radiol. 2018, 15, 535–537.
90. Hwang, E.J.; Park, S.; Jin, K.-N.; Kim, J.I.; Choi, S.Y.; Lee, J.H.; Goo, J.M.; Aum, J.; Yim, J.-J.; Cohen, J.G.; et al. Development and Validation of a Deep Learning-Based Automated Detection Algorithm for Major Thoracic Diseases on Chest Radiographs. JAMA Netw. Open 2019, 2, e191095.
91. Chilamkurthy, S.; Ghosh, R.; Tanamala, S.; Biviji, M.; Campeau, N.G.; Venugopal, V.K.; Mahajan, V.; Rao, P.; Warier, P. Deep Learning Algorithms for Detection of Critical Findings in Head CT Scans: A Retrospective Study. Lancet 2018, 392, 2388–2396.
92. Gianfrancesco, M.A.; Tamang, S.; Yazdany, J.; Schmajuk, G. Potential Biases in Machine Learning Algorithms Using Electronic Health Record Data. JAMA Intern. Med. 2018, 178, 1544–1547.
93. Char, D.S.; Shah, N.H.; Magnus, D. Implementing Machine Learning in Health Care—Addressing Ethical Challenges. N. Engl. J. Med. 2018, 378, 981–983.
94. Angwin, J.; Larson, J.; Mattu, S.; Kirchner, L. Machine Bias. Available online: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing (accessed on 8 July 2022).
95. Gijsberts, C.M.; Groenewegen, K.A.; Hoefer, I.E.; Eijkemans, M.J.C.; Asselbergs, F.W.; Anderson, T.J.; Britton, A.R.; Dekker, J.M.; Engström, G.; Evans, G.W.; et al. Race/Ethnic Differences in the Associations of the Framingham Risk Factors with Carotid IMT and Cardiovascular Events. PLoS ONE 2015, 10, e0132321.
  96. Moons, K.G.M.; Wolff, R.F.; Riley, R.D.; Whiting, P.F.; Westwood, M.; Collins, G.S.; Reitsma, J.B.; Kleijnen, J.; Mallett, S. PROBAST: A Tool to Assess Risk of Bias and Applicability of Prediction Model Studies: Explanation and Elaboration. Ann. Intern. Med. 2019, 170, W1–W33. [Google Scholar] [CrossRef] [Green Version]
  97. Esteva, A.; Kuprel, B.; Novoa, R.A.; Ko, J.; Swetter, S.M.; Blau, H.M.; Thrun, S. Dermatologist-Level Classification of Skin Cancer with Deep Neural Networks. Nature 2017, 542, 115–118. [Google Scholar] [CrossRef]
  98. Winkler, J.K.; Fink, C.; Toberer, F.; Enk, A.; Deinlein, T.; Hofmann-Wellenhof, R.; Thomas, L.; Lallas, A.; Blum, A.; Stolz, W.; et al. Association Between Surgical Skin Markings in Dermoscopic Images and Diagnostic Performance of a Deep Learning Convolutional Neural Network for Melanoma Recognition. JAMA Dermatol. 2019, 155, 1135–1141. [Google Scholar] [CrossRef]
99. Liu, X.; Faes, L.; Kale, A.U.; Wagner, S.K.; Fu, D.J.; Bruynseels, A.; Mahendiran, T.; Moraes, G.; Shamdas, M.; Kern, C.; et al. A Comparison of Deep Learning Performance against Health-Care Professionals in Detecting Diseases from Medical Imaging: A Systematic Review and Meta-Analysis. Lancet Digit. Health 2019, 1, e271–e297. [Google Scholar] [CrossRef]
  100. Lee, H.; Tajmir, S.; Lee, J.; Zissen, M.; Yeshiwas, B.A.; Alkasab, T.K.; Choy, G.; Do, S. Fully Automated Deep Learning System for Bone Age Assessment. J. Digit. Imaging 2017, 30, 427–441. [Google Scholar] [CrossRef] [Green Version]
  101. De Groof, A.J.; Struyvenberg, M.R.; van der Putten, J.; van der Sommen, F.; Fockens, K.N.; Curvers, W.L.; Zinger, S.; Pouw, R.E.; Coron, E.; Baldaque-Silva, F.; et al. Deep-Learning System Detects Neoplasia in Patients with Barrett’s Esophagus With Higher Accuracy Than Endoscopists in a Multistep Training and Validation Study With Benchmarking. Gastroenterology 2020, 158, 915–929.e4. [Google Scholar] [CrossRef]
  102. Van de Sande, D.; Van Genderen, M.E.; Smit, J.M.; Huiskens, J.; Visser, J.J.; Veen, R.E.R.; van Unen, E.; Ba, O.H.; Gommers, D.; van Bommel, J. Developing, Implementing and Governing Artificial Intelligence in Medicine: A Step-by-Step Approach to Prevent an Artificial Intelligence Winter. BMJ Health Care Inform. 2022, 29, e100495. [Google Scholar] [CrossRef]
  103. Recht, M.P.; Dewey, M.; Dreyer, K.; Langlotz, C.; Niessen, W.; Prainsack, B.; Smith, J.J. Integrating Artificial Intelligence into the Clinical Practice of Radiology: Challenges and Recommendations. Eur. Radiol. 2020, 30, 3576–3584. [Google Scholar] [CrossRef] [Green Version]
104. The Lancet Digital Health. Walking the Tightrope of Artificial Intelligence Guidelines in Clinical Practice. Lancet Digit. Health 2019, 1, e100. [CrossRef]
  105. European Commission. Regulation of The European Parliament and of The Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) And Amending Certain Union Legislative Acts; European Commission: Brussels, Belgium, 2021. [Google Scholar]
  106. Bouwmeester, W.; Zuithoff, N.P.A.; Mallett, S.; Geerlings, M.I.; Vergouwe, Y.; Steyerberg, E.W.; Altman, D.G.; Moons, K.G.M. Reporting and Methods in Clinical Prediction Research: A Systematic Review. PLoS Med. 2012, 9, e1001221. [Google Scholar] [CrossRef] [Green Version]
  107. Collins, G.S.; Reitsma, J.B.; Altman, D.G.; Moons, K.G.M. Transparent Reporting of a Multivariable Prediction Model for Individual Prognosis or Diagnosis (TRIPOD): The TRIPOD Statement. Br. J. Surg. 2015, 102, 148–158. [Google Scholar] [CrossRef] [Green Version]
  108. Collins, G.S.; Moons, K.G.M. Reporting of Artificial Intelligence Prediction Models. Lancet 2019, 393, 1577–1579. [Google Scholar] [CrossRef] [PubMed]
  109. Riley, R.D.; Ensor, J.; Snell, K.I.E.; Harrell, F.E.; Martin, G.P.; Reitsma, J.B.; Moons, K.G.M.; Collins, G.; van Smeden, M. Calculating the Sample Size Required for Developing a Clinical Prediction Model. BMJ 2020, 368, m441. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  110. Luo, X.; Wang, G.; Song, T.; Zhang, J.; Aertsen, M.; Deprest, J.; Ourselin, S.; Vercauteren, T.; Zhang, S. MIDeepSeg: Minimally Interactive Segmentation of Unseen Objects from Medical Images Using Deep Learning. Med. Image Anal. 2021, 72, 102102. [Google Scholar] [CrossRef] [PubMed]
  111. Oriuchi, N.; Higuchi, T.; Ishikita, T.; Miyakubo, M.; Hanaoka, H.; Iida, Y.; Endo, K. Present Role and Future Prospects of Positron Emission Tomography in Clinical Oncology. Cancer Sci. 2006, 97, 1291–1297. [Google Scholar] [CrossRef] [PubMed]
  112. Xing, H.; Hao, Z.; Zhu, W.; Sun, D.; Ding, J.; Zhang, H.; Liu, Y.; Huo, L. Preoperative Prediction of Pathological Grade in Pancreatic Ductal Adenocarcinoma Based on 18F-FDG PET/CT Radiomics. EJNMMI Res. 2021, 11, 19. [Google Scholar] [CrossRef]
  113. Yao, Y.; Chen, Y.; Gou, S.; Chen, S.; Zhang, X.; Tong, N. Auto-Segmentation of Pancreatic Tumor in Multi-Modal Image Using Transferred DSMask R-CNN Network. Biomed. Signal Process. Control 2023, 83, 104583. [Google Scholar] [CrossRef]
  114. Dimitrakopoulou-Strauss, A.; Pan, L.; Sachpekidis, C. Long Axial Field of View (LAFOV) PET-CT: Implementation in Static and Dynamic Oncological Studies. Eur. J. Nucl. Med. Mol. Imaging 2023. [Google Scholar] [CrossRef]
  115. Sayres, R.; Taly, A.; Rahimy, E.; Blumer, K.; Coz, D.; Hammel, N.; Krause, J.; Narayanaswamy, A.; Rastegar, Z.; Wu, D.; et al. Using a Deep Learning Algorithm and Integrated Gradients Explanation to Assist Grading for Diabetic Retinopathy. Ophthalmology 2019, 126, 552–564. [Google Scholar] [CrossRef] [Green Version]
116. Cai, C.J.; Reif, E.; Hegde, N.; Hipp, J.; Kim, B.; Smilkov, D.; Wattenberg, M.; Viegas, F.; Corrado, G.S.; Stumpe, M.C.; et al. Human-Centered Tools for Coping with Imperfect Algorithms during Medical Decision-Making. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, 4–9 May 2019. [Google Scholar]
  117. Yarbrough, A.K.; Smith, T.B. Technology Acceptance among Physicians: A New Take on TAM. Med. Care Res. Rev. 2007, 64, 650–672. [Google Scholar] [CrossRef]
Table 1. Studies investigating artificial intelligence-based pancreatic ductal adenocarcinoma detection using CT.

| Author (Year) | Aim of Model | Type of Model | Dataset (n) | Sensitivity (%) | Specificity (%) | AUC | Accuracy (%) |
|---|---|---|---|---|---|---|---|
| Chu et al. (2019) [50] | Detection of PDAC | 3D U-Net (CNN) | 156 PDAC, 300 control | 94.1 | 98.5 | N.A. | N.A. |
| Liu et al. (2019) [51] | Detection of pancreatic cancer | Faster R-CNN | 338 pancreatic cancer patients | N.A. | N.A. | 0.963 | N.A. |
| Zhu et al. (2019) [52] | Detection of PDAC | 3D U-Net (CNN) | 136 PDAC, 303 control | 94.1 | 98.5 | N.A. | 57.3 |
| Chu et al. (2019) [53] | Detection of PDAC | Radiomics | 190 PDAC, 190 control | 100 | 98.5 | 0.999 | 99.2 |
| Liu et al. (2020) [54] | Detection of pancreatic cancer | VGG-CNN | 370 pancreatic cancer, 320 control | 97.3 | 100 | 0.997 | 98.6 |
| Zhang et al. (2020) [55] | Detection of pancreatic cancer | Custom CNN feature pyramid | 2890 CT images | 83.7 | 91.7 | 0.945 | 90.2 |
| Ma et al. (2020) [56] | Detection of pancreatic cancer | Encoder-only CNN | 222 PDAC, 190 control | 91.6 | 98.3 | 0.965 | 95.5 |
| Si et al. (2021) [57] | Detection of pancreatic cancer | ResNet and U-Net (CNN) | 319 patients | 86.8 | 69.5 | 0.872 | 87.6 |
| Qiu et al. (2021) [58] | Diagnostic analysis of PDAC | Radiomics | 312 patients | N.A. | N.A. | 0.880 | 81.2 |
| Ebrahimian et al. (2022) [59] | Differentiating malignant and benign pancreatic lesions | Radiomics | 103 patients | 84.0 | 95.0 | 0.990 | 0.91 |
| Viviers et al. (2022) [60] | Detection of PDAC | 3D U-Net (CNN) | 99 PDAC, 97 control | 99.0 | 99.0 | N.A. | N.A. |
| Alves et al. (2022) [61] | Detection of PDAC | 3D U-Net (CNN) | 119 PDAC, 123 control | N.A. | N.A. | 0.909 | N.A. |
| Chen et al. (2023) [62] | Detection of pancreatic cancer | 3D U-Net (CNN) | 546 pancreatic cancer patients, 733 control | 89.7 | 92.8 | 0.950 | N.A. |

Legend: CT, computed tomography; AUC, area under the curve; CNN, convolutional neural network; VGG, Visual Geometry Group; PDAC, pancreatic ductal adenocarcinoma.
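Most of the detection models listed in Table 1 are 3D U-Net-style convolutional networks that take a CT volume (or patch) as input and output a per-voxel tumor map. The sketch below is a deliberately minimal, hypothetical PyTorch example of such an encoder-decoder with skip connections; the depth, channel counts, and patch size are illustrative assumptions and do not reproduce the architecture of any cited study.

```python
# Minimal, illustrative sketch of a 3D U-Net-style encoder-decoder, the model
# family most frequently reported in Table 1. All layer sizes are assumptions
# chosen for brevity, not the configuration of any published model.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3x3 convolutions with batch normalisation and ReLU.
    return nn.Sequential(
        nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm3d(out_ch),
        nn.ReLU(inplace=True),
        nn.Conv3d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm3d(out_ch),
        nn.ReLU(inplace=True),
    )

class TinyUNet3D(nn.Module):
    def __init__(self, in_channels=1, num_classes=2, base=16):
        super().__init__()
        self.enc1 = conv_block(in_channels, base)
        self.enc2 = conv_block(base, base * 2)
        self.pool = nn.MaxPool3d(2)
        self.bottleneck = conv_block(base * 2, base * 4)
        self.up2 = nn.ConvTranspose3d(base * 4, base * 2, kernel_size=2, stride=2)
        self.dec2 = conv_block(base * 4, base * 2)
        self.up1 = nn.ConvTranspose3d(base * 2, base, kernel_size=2, stride=2)
        self.dec1 = conv_block(base * 2, base)
        self.head = nn.Conv3d(base, num_classes, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)                                      # full resolution
        e2 = self.enc2(self.pool(e1))                          # 1/2 resolution
        b = self.bottleneck(self.pool(e2))                     # 1/4 resolution
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))    # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))   # skip connection
        return self.head(d1)                                   # per-voxel class logits

if __name__ == "__main__":
    ct_patch = torch.randn(1, 1, 64, 64, 64)   # one single-channel CT patch
    logits = TinyUNet3D()(ct_patch)
    print(logits.shape)                        # torch.Size([1, 2, 64, 64, 64])
```

In practice, the voxel-wise logits are thresholded (after softmax) to obtain a tumor mask, from which patient-level detection decisions and the metrics in Table 1 can be derived.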
Table 2. Studies investigating artificial intelligence-based pancreatic cancer diagnosis using MRI.

| Author (Year) | Aim of Model | Type of Model | Dataset (n) | Sensitivity (%) | Specificity (%) | AUC | Accuracy (%) |
|---|---|---|---|---|---|---|---|
| Gao et al. (2019) [68] | Histopathological WHO grade prediction of pNET | CNN encoder | 96 patients | N.A. | N.A. | 0.885 | 81.1 |
| Corral et al. (2019) [69] | Identify neoplasia in IPMN | CNN feature representation + SVM | 139 cases | 75.0 | 78.0 | 0.780 | N.A. |
| Kaissis et al. (2019) [64] | Survival prediction of PDAC | Random forest machine learning | 102 PDAC patients | 87.0 | 80.0 | 0.900 | N.A. |
| Kaissis et al. (2019) [65] | Molecular subtype prediction of PDAC | Radiomics | 55 PDAC patients | 90.0 | 92.0 | 0.930 | N.A. |
| Gao et al. (2020) [70] | Differentiate pancreatic diseases | Inception v4 (CNN) | 398 patients | N.A. | N.A. | 0.864 | 76.8 |
| Kaissis et al. (2020) [66] | Molecular subtype prediction of PDAC | Random forest machine learning | 207 PDAC patients | 84.0 | 92.0 | 0.930 | N.A. |
| Deng et al. (2021) [71] | Differentiate PDAC from MFP | Radiomics | 52 PDAC patients, 13 MFP patients | N.A. | N.A. | 0.945 | 79.5 |

Legend: MRI, magnetic resonance imaging; AUC, area under the curve; WHO, World Health Organization; CNN, convolutional neural network; PDAC, pancreatic ductal adenocarcinoma; IPMN, intraductal papillary mucinous neoplasm; MFP, mass-forming pancreatitis; pNET, pancreatic neuroendocrine tumor; SVM, support vector machine.
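Tables 1 and 2 report performance as sensitivity, specificity, AUC, and accuracy. As a reference point for how such figures are typically obtained from a trained binary classifier, the short sketch below computes them from predicted probabilities and ground-truth labels; the probabilities and labels are synthetic values chosen only for illustration, not data from any cited study.

```python
# Minimal sketch of how the metrics reported in Tables 1 and 2 (sensitivity,
# specificity, AUC, accuracy) can be derived from model outputs. Synthetic data.
import numpy as np
from sklearn.metrics import roc_auc_score, confusion_matrix

y_true = np.array([1, 1, 1, 0, 0, 0, 1, 0])            # 1 = PDAC, 0 = control
y_prob = np.array([0.92, 0.81, 0.40, 0.15, 0.30, 0.05, 0.77, 0.60])
y_pred = (y_prob >= 0.5).astype(int)                    # operating threshold of 0.5

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)                            # true positive rate
specificity = tn / (tn + fp)                            # true negative rate
accuracy = (tp + tn) / (tp + tn + fp + fn)
auc = roc_auc_score(y_true, y_prob)                     # threshold-independent

print(f"Sensitivity {sensitivity:.2f}, specificity {specificity:.2f}, "
      f"accuracy {accuracy:.2f}, AUC {auc:.2f}")
```

Note that sensitivity, specificity, and accuracy depend on the chosen operating threshold, whereas the AUC summarizes performance across all thresholds; this partly explains why the studies above cannot always be compared directly on a single metric.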
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
