Diagnostic Performance of an Artificial Intelligence Software for the Evaluation of Bone X-Ray Examinations Referred from the Emergency Department
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Type and Subjects
2.2. Acquisition Protocol
2.3. Reading and Data Collection Protocol
2.4. Study Variables
- Fracture.
- Dislocation.
- Joint effusion (only available for elbow radiographs).
- Sequelae of a fracture, defined as a chronic fracture or dislocation.
- Arthropathy, which includes osteoarthritis and erosive arthritis; our sample contained no cases of erosive arthritis, however, so in this study arthropathy refers exclusively to osteoarthritis.
- Focal lesion.
- Anatomical variant.
- Other findings: any finding not covered by the previous categories (a schematic encoding of these variables is sketched below).
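For illustration only, and not the authors' actual data-collection form, the per-examination variables above could be encoded as follows; the class, enum, and field names are hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional


class Call(Enum):
    """Reading of a single variable; both readers could mark a case as doubtful."""
    NEGATIVE = "negative"
    POSITIVE = "positive"
    DOUBTFUL = "doubtful"


@dataclass
class BoneXrayReading:
    """Hypothetical per-examination record of the study variables listed above."""
    fracture: Call = Call.NEGATIVE            # acute fracture
    dislocation: Call = Call.NEGATIVE         # acute joint dislocation
    joint_effusion: Optional[Call] = None     # recorded only for elbow radiographs
    fracture_sequelae: bool = False           # chronic fracture or dislocation
    arthropathy: bool = False                 # osteoarthritis only in this sample
    focal_lesion: bool = False
    anatomical_variant: bool = False
    other_findings: list[str] = field(default_factory=list)
```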
3. Results
3.1. Demographic Characteristics
- Small joints, which included the hand, foot, fingers and toes, calcaneus, wrist, and ankle. This was the most frequent group, accounting for 51.4% of cases.
- Large joints, which included the shoulder, elbow, pelvis or hip, and knee. This was the second most frequent group (43.3% of cases).
- Flat or long bones, which included the clavicle, humerus, forearm, femur, and tibia. This was the least frequent group (5.3% of cases); a minimal sketch of this grouping follows.
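The grouping can be written out as a simple mapping over the region names used in Table 1; the dictionary and helper function below are illustrative and not part of the study's methodology.

```python
# Grouping of examined regions into the three subgroups used in the analysis.
JOINT_GROUPS = {
    "small joints": {"hand", "foot", "fingers", "toes", "calcaneus", "wrist", "ankle"},
    "large joints": {"shoulder", "elbow", "pelvis-hip", "knee"},
    "flat/long bones": {"clavicle", "humerus", "forearm", "femur", "tibia"},
}


def joint_group(region: str) -> str:
    """Return the subgroup for an examined region (illustrative helper)."""
    for group, regions in JOINT_GROUPS.items():
        if region.lower() in regions:
            return group
    raise ValueError(f"unrecognised region: {region!r}")


print(joint_group("Ankle"))  # -> 'small joints'
```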
3.2. Prevalence
3.3. Analysis of the Resident’s and AI’s Performance Validity Compared to the Gold Standard
3.3.1. Acute Fracture
3.3.2. Acute Joint Dislocation
3.3.3. Elbow Joint Effusion
3.3.4. Degree of Agreement Between the Resident and AI
3.4. Analysis of the “Other Findings” That Milvue Has Not Been Trained to Detect
4. Discussion
- Ankle and foot: On six occasions, Milvue marked the fracture variable as doubtful in cases with a bipartite medial sesamoid (two patients), an accessory sesamoid at the base of the 5th metatarsal, symphalangism, os peroneum, and os naviculare (Figure 8).
- Hand: Milvue marked the fracture variable as doubtful in a case with multiple accessory ossicles.
- Wrist: On four occasions, Milvue marked the fracture variable as doubtful in cases of os paranaviculare, os trapezium secundarium, os ulnar styloid, and os paratrapezium. However, Milvue did not detect fractures in three cases of os ulnar styloid, two cases of accessory ulnar styloid, or in cases of os hypolunatum and os epilunatum.
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
|  | Cases (n = 792) |
| --- | --- |
| Patient's age (years), median [Q1; Q3] * | 48.0 [33.0; 61.5] |
| Gender (n, %) |  |
| Men | 385 (48.6) |
| Women | 407 (51.4) |
| Radiological projections (n, %) |  |
| 1 | 62 (7.8) |
| 2 | 714 (90.2) |
| 3 | 12 (1.5) |
| 4 | 4 (0.5) |
| X-ray image quality (n, %) |  |
| Optimal | 774 (97.8) |
| Average | 18 (2.2) |
| Joint groups (n, %) |  |
| Small joints | 407 (51.4) |
| Large joints | 343 (43.3) |
| Flat/long bones | 42 (5.3) |
| Breakdown by joint (n, %) |  |
| Knee | 151 (19.1) |
| Ankle | 127 (16.0) |
| Shoulder | 99 (12.5) |
| Foot | 77 (9.7) |
| Wrist | 68 (8.6) |
| Pelvis–Hip | 65 (8.2) |
| Hand | 53 (6.7) |
| Fingers | 47 (5.9) |
| Toes | 31 (3.9) |
| Elbow | 28 (3.5) |
| Tibia | 11 (1.4) |
| Forearm | 10 (1.3) |
| Humerus | 9 (1.1) |
| Clavicle | 6 (0.8) |
| Femur | 6 (0.8) |
| Calcaneus | 4 (0.5) |
| Prevalence * (n, % [95% CI]) | Overall Cases (n = 792) | Large Joints (n = 343) | Small Joints (n = 407) | Flat/Long Bones (n = 42) |
| --- | --- | --- | --- | --- |
| Acute fracture | 134 (16.9 [14.0–19.2]) | 37 (10.8 [7.1–13.8]) | 84 (20.6 [16.0–24.5]) | 13 (30.9 [18–48.1]) |
| Acute joint dislocation | 20 (2.5 [1.4–3.6]) | 14 (4.1 [2–6.4]) | 3 (0.7 [0.15–2.1]) | 3 (7.1 [0.6–16.5]) |
| Chronic fractures | 25 (3.2 [2–4.5]) | 9 (2.6 [1–4.6]) | 14 (3.4 [1.9–5.7]) | 2 (4.8 [0.6–16.2]) |
| Arthropathy | 157 (19.8 [17–22.8]) | 99 (28.9 [24–34]) | 49 (12.4 [9–15.6]) | 9 (21.3 [10–36.8]) |
| Focal lesion | 15 (1.9 [1.1–3.1]) | 9 (2.6 [0.9–4.3]) | 5 (1.2 [0.4–2.8]) | 1 (2.4 [0–7.2]) |
| Anatomical variant | 100 (12.6 [10–15.2]) | 40 (11.7 [8.5–15.6]) | 60 (16.8 [11–18.6]) | 0 (0) |
| Elbow joint effusion (n = 28) | 7 (25.0 [11–44.9]) |  |  |  |
| Other findings | 195 (24.6 [21–27.6]) | 109 (31.8 [27–36.8]) | 77 (18.9 [15–22.9]) | 9 (21.4 [10–36.8]) |
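The prevalence figures above are simple proportions with 95% confidence intervals. The paper does not state which interval method was used; the sketch below uses the Wilson score interval as an assumption, with the counts taken from the first row of the table.

```python
from math import sqrt


def wilson_ci(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% confidence interval for a proportion of k positives out of n."""
    p = k / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half


# Acute fracture in the overall sample: 134 of 792 examinations (first row of the table).
k, n = 134, 792
lo, hi = wilson_ci(k, n)
print(f"prevalence = {k / n:.1%}, 95% CI {lo:.1%}-{hi:.1%}")  # 16.9%, roughly 14.5%-19.7%
```

The Wilson interval (about 14.5–19.7%) is close to, but not identical with, the 14.0–19.2% reported above, so the authors presumably used a different interval method; the sketch only illustrates the arithmetic.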
| Acute fracture | Overall Cases (n = 792): Resident | Overall: AI Software | Large Joints (n = 343): Resident | Large Joints: AI Software | Small Joints (n = 407): Resident | Small Joints: AI Software | Flat/Long Bones (n = 42): Resident | Flat/Long Bones: AI Software |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Ratio of doubtful */certain cases | 12/780 | 58/734 | 5/338 | 13/330 | 6/401 | 41/366 | 1/41 | 4/38 |
| Positivity of doubtful cases (n, %) | 6 (50) | 15 (25.9) | 3 (60) | 4 (30.8) | 3 (50) | 9 (21.9) | 0 (0) | 2 (50) |
| Sensitivity (%, 95% CI) | 90.6 (84.2–95.1) | 95.8 (90.5–98.6) | 94.1 (80.3–99.3) | 93.9 (79.8–99.3) | 87.7 (78.5–93.9) | 96 (88.8–99.2) | 100 (75.3–100) | 100 (71.5–100) |
| Specificity (%, 95% CI) | 98.0 (96.6–98.9) | 97.6 (96.0–98.6) | 100 (98.8–100) | 99.7 (98.1–100) | 95.9 (93.2–97.8) | 95.5 (92.5–97.6) | 100 (87.7–100) | 96.3 (81–99.9) |
| PPV (%, 95% CI) | 89.9 (83.4–94.5) | 88.4 (81.5–93.3) | 100 (89.1–100) | 96.9 (83.8–99.9) | 84.5 (75–91.5) | 84.7 (75.3–91.6) | 100 (75.3–100) | 91.7 (61.5–99.8) |
| NPV (%, 95% CI) | 98.2 (96.8–99.0) | 99.2 (98.1–99.7) | 99.3 (97.7–99.9) | 99.3 (97.6–99.9) | 96.8 (94.3–98.5) | 98.9 (96.9–99.8) | 100 (87.7–100) | 100 (86.8–100) |
| AUC (95% CI) | 0.943 (0.917–0.969) | 0.967 (0.948–0.986) | 0.971 (0.93–1.000) | 0.968 (0.927–1.000) | 0.918 (0.88–0.956) | 0.958 (0.932–0.983) | 1 (1–1) | 0.981 (0.945–1.000) |
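The validity metrics above follow from the 2×2 cross-tabulation of each reader against the gold standard. A minimal sketch is shown below; the counts are illustrative and are not the study's raw numbers. Note that for a single binary operating point the AUC reduces to the mean of sensitivity and specificity, which is consistent with the table: (0.906 + 0.980)/2 ≈ 0.943 for the resident and (0.958 + 0.976)/2 ≈ 0.967 for the AI software in the overall sample.

```python
def validity_metrics(tp: int, fp: int, fn: int, tn: int) -> dict[str, float]:
    """Sensitivity, specificity, PPV, NPV, and single-threshold AUC from a 2x2 table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    auc = (sensitivity + specificity) / 2  # AUC of one binary operating point
    return {"sensitivity": sensitivity, "specificity": specificity,
            "PPV": ppv, "NPV": npv, "AUC": auc}


# Illustrative counts only (not the study's raw 2x2 table).
for name, value in validity_metrics(tp=120, fp=14, fn=12, tn=646).items():
    print(f"{name}: {value:.3f}")
```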
| Acute joint dislocation | Overall Cases (n = 792): Resident | Overall: AI Software | Large Joints (n = 343): Resident | Large Joints: AI Software | Small Joints (n = 407): Resident | Small Joints: AI Software | Flat/Long Bones (n = 42): Resident | Flat/Long Bones: AI Software |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Ratio of doubtful */certain cases | 3/789 | 2/790 | 2/341 | 2/341 | 0/407 | 0/407 | 1/41 | 0/42 |
| Positivity of doubtful cases (n, %) | 2 (66.7) | 0 (0) | 1 (50) | 0 (0) | 0 (0) | 0 (0) | 1 (100) | 0 (0) |
| Sensitivity (%, 95% CI) | 77.8 (52.4–93.6) | 35.0 (15.4–59.2) | 84.6 (54.6–98.1) | 35.7 (12.8–64.9) | 66.7 (9.43–99.2) | 66.7 (9.43–99.2) | 50 (1.26–98.7) | NC |
| Specificity (%, 95% CI) | 100 (99.5–100) | 99.7 (99.1–100) | 100 (98.9–100) | 100 (98.9–100) | 100 (99.1–100) | 99.5 (98.2–99.9) | 100 (91–100) | NC |
| PPV (%, 95% CI) | 100 (76.8–100) | 77.8 (40–97.2) | 100 (71.5–100) | 100 (47.8–100) | 100 (15.8–100) | 50 (6.76–93.2) | 100 (2.5–100) | NC |
| NPV (%, 95% CI) | 99.5 (98.7–99.9) | 98.3 (97.2–99.1) | 99.4 (97.8–99.9) | 97.3 (95–98.8) | 99.8 (98.6–100) | 99.8 (98.6–100) | 97.5 (86.8–99.9) | NC |
| AUC (95% CI) | 0.889 (0.79–0.988) | 0.674 (0.566–0.781) | 0.923 (0.821–1) | 0.679 (0.548–0.809) | 0.833 (0.507–1) | 0.831 (0.504–1) | 0.75 (0.26–1) | NC |
| Elbow joint effusion | Overall Cases (n = 792): Resident | Overall: AI Software |
| --- | --- | --- |
| Ratio of doubtful */certain cases | 0/28 | 4/24 |
| Positivity of doubtful cases (n, %) | 0 (0) | 1 (25) |
| Sensitivity (%, 95% CI) | 100 (59–100) | 100 (54.1–100) |
| Specificity (%, 95% CI) | 90.5 (69.6–98.8) | 94.4 (72.7–99.9) |
| PPV (%, 95% CI) | 77.8 (40–97.2) | 85.7 (42.1–99.6) |
| NPV (%, 95% CI) | 100 (82.4–100) | 100 (80.5–100) |
| AUC (95% CI) | 0.952 (0.888–1.000) | 0.972 (0.918–1.000) |
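The wide intervals reflect the small number of elbow studies (28, with 7 effusions). The lower bounds of the 100% sensitivities are consistent with exact (Clopper-Pearson) intervals, although the paper does not state the method: 7 of 7 detected effusions gives a lower bound of about 59%, and 6 of 6 (an assumption consistent with one of the seven effusions falling among the AI's doubtful cases) gives about 54.1%. A minimal sketch for these boundary cases:

```python
def clopper_pearson_extreme(k: int, n: int, alpha: float = 0.05) -> tuple[float, float]:
    """Exact binomial CI when all (k == n) or none (k == 0) of the cases are positive."""
    if k == n:
        return (alpha / 2) ** (1 / n), 1.0
    if k == 0:
        return 0.0, 1 - (alpha / 2) ** (1 / n)
    raise ValueError("intermediate counts need the beta distribution (e.g. scipy.stats.beta)")


print(clopper_pearson_extreme(7, 7))  # ~(0.59, 1.0): the 59-100% interval above
print(clopper_pearson_extreme(6, 6))  # ~(0.54, 1.0): matches the 54.1-100% interval
```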
The AI software has not been trained to detect the findings below, so its results are NA throughout; the values shown are for the radiology resident.

| Radiology Resident (AI Software: NA) | Overall Cases (n = 792) | Large Joints (n = 343) | Small Joints (n = 407) | Flat/Long Bones (n = 42) |
| --- | --- | --- | --- | --- |
| Chronic fracture |  |  |  |  |
| Ratio of doubtful */certain cases | 4/788 | 2/341 | 2/405 | 0/42 |
| Positivity of doubtful cases (n, %) | 1 (25) | 1 (50) | 0 (0) | 0 (0) |
| Sensitivity (%, 95% CI) | 29.2 (12.6–51.1) | 12.5 (0.316–52.7) | 35.7 (12.8–64.9) | 50 (1.26–98.7) |
| Specificity (%, 95% CI) | 99.9 (99.3–100) | 100 (98.9–100) | 99.7 (98.6–100) | 100 (91.2–100) |
| PPV (%, 95% CI) | 87.5 (47.3–99.7) | 100 (2.5–100) | 83.3 (35.9–99.6) | 100 (2.5–100) |
| NPV (%, 95% CI) | 97.8 (96.5–98.7) | 97.9 (95.8–99.2) | 97.7 (95.8–99) | 97.6 (87.1–99.9) |
| AUC (95% CI) | 0.645 (0.552–0.738) | 0.563 (0.44–0.685) | 0.677 (0.547–0.808) | 0.75 (0.26–1.0) |
| Arthropathy |  |  |  |  |
| Ratio of doubtful */certain cases | 0/792 | 0/343 | 0/407 | 0/42 |
| Positivity of doubtful cases (n, %) | 0 (0) | 0 (0) | 0 (0) | 0 (0) |
| Sensitivity (%, 95% CI) | 70.1 (62.2–77.1) | 79.8 (70.5–87.2) | 51 (36.3–65.6) | 66.7 (29.9–92.5) |
| Specificity (%, 95% CI) | 97.6 (96.1–98.7) | 98.4 (95.9–99.6) | 96.9 (94.6–98.5) | 100 (89.4–100) |
| PPV (%, 95% CI) | 88 (81–93.1) | 95.2 (88.1–98.7) | 69.4 (51.9–83.7) | 100 (54.1–100) |
| NPV (%, 95% CI) | 93 (90.7–94.8) | 92.3 (88.4–95.2) | 93.5 (90.5–95.8) | 91.7 (77.5–98.2) |
| AUC (95% CI) | 0.839 (0.802–0.875) | 0.891 (0.85–0.931) | 0.74 (0.668–0.811) | 0.833 (0.67–0.997) |
| Focal lesion |  |  |  |  |
| Ratio of doubtful */certain cases | 1/791 | 1/342 | 0/407 | 0/42 |
| Positivity of doubtful cases (n, %) | 0 (0) | 0 (0) | 0 (0) | 0 (0) |
| Sensitivity (%, 95% CI) | 6.67 (0.169–31.9) | NA | 20 (0.505–71.6) | NA |
| Specificity (%, 95% CI) | 99.4 (98.5–99.8) | NA | 98.8 (97.1–99.6) | NA |
| PPV (%, 95% CI) | 16.7 (0.421–64.1) | NA | 16.7 (0.421–64.1) | NA |
| NPV (%, 95% CI) | 98.2 (97–99) | NA | 99 (97.5–99.7) | NA |
| AUC (95% CI) | 0.53 (0.465–0.596) | NA | 0.594 (0.398–0.79) | NA |
| Anatomical variant |  |  |  |  |
| Ratio of doubtful */certain cases | 0/792 | 0/342 | 0/407 | 0/42 |
| Positivity of doubtful cases (n, %) | 0 (0) | 0 (0) | 0 (0) | 0 (0) |
| Sensitivity (%, 95% CI) | 64 (53.8–73.4) | 90 (76.3–97.2) | 46.7 (33.7–60) | NA |
| Specificity (%, 95% CI) | 98.6 (97.4–99.3) | 99 (97.1–99.8) | 98 (95.9–99.2) | NA |
| PPV (%, 95% CI) | 86.5 (76.5–93.3) | 92.3 (79.1–98.4) | 80 (63.1–91.6) | NA |
| NPV (%, 95% CI) | 95 (93.1–96.5) | 98.7 (96.7–99.6) | 91.4 (88.1–94) | NA |
| AUC (95% CI) | 0.813 (0.765–0.86) | 0.945 (0.898–0.992) | 0.723 (0.659–0.787) | NA |