Department Wide Validation in Digital Pathology—Experience from an Academic Teaching Hospital Using the UK Royal College of Pathologists’ Guidance
Abstract
1. Introduction
2. Materials and Methods
2.1. Digitisation of Surgical Pathology in the Laboratory
2.2. Validation for Full DP Reporting
3. Results
3.1. Cases with Discordances
3.2. Potential Pitfalls of Digital Reporting That Did Not Cause Discordance
3.3. Learning Curves over Time
3.4. Diagnostic Areas That Potentially May Be Easier on the Digital Platform
3.5. Pathologists’ Experiences of Digital Reporting
3.6. Diagnostic Confidence and Diagnostic Modality Preference
4. Discussion
4.1. Engagement and Time to Validation
4.2. Number of Cases Needed for Stage 2 Validation
4.3. Causes of Discordance and Mitigating Strategies
4.4. Present and Future Perspectives
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Duration | Number of Pathologists |
|---|---|
| <6 months | 5 |
| 6 months–1 year | 3 |
| 12–18 months | 8 |
| >18 months | 4 |
| | Across All Specialities | Urology * | Breast | Renal | GI * | H&N | Skin | Gynaecology | Respiratory |
|---|---|---|---|---|---|---|---|---|---|
| Total no. of cases | 3777 | 566 | 660 | 138 | 1287 | 87 | 726 | 180 | 133 |
| Technical deferral rate to glass | 2.6% | 2.1% | 3.5% | 0% | 4% | 5.8% | 1.2% | 0.6% | 0% |
| Cases with discordances | 1.3% | 3.9% | 1.5% | 1.5% | 0.8% | 0% | 0.8% | 0% | 0% |
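As a quick arithmetic check on the figures in the table above (ours, not a calculation reported by the authors), the per-specialty case counts sum exactly to the across-all total, and the 1.3% overall discordance rate corresponds to roughly 49 cases:

$$566 + 660 + 138 + 1287 + 87 + 726 + 180 + 133 = 3777, \qquad 0.013 \times 3777 \approx 49.$$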
| Type of Discordance | B1 | B2 | B3 | N/A |
|---|---|---|---|---|
| Percentage of cases | 0% (0/3777) | 0.1% (3/3777) | 0.4% (16/3777) | 0.8% (30/3777) |
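Relating this breakdown to the overall figure in the previous table (our inference, since the excerpt does not state it explicitly), the B2 and B3 categories alone account for about 0.5% of cases, so the 1.3% overall rate appears to include the N/A category:

$$\frac{3 + 16}{3777} \approx 0.5\%, \qquad \frac{3 + 16 + 30}{3777} = \frac{49}{3777} \approx 1.3\%.$$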
| Speciality | Pitfall | Free Text Comments, Including the Discordance Category |
|---|---|---|
| Urology Germ cell | Pagetoid spread into rete (one case) | B3. Case where pagetoid spread into rete missed on digital and seen on glass. Not clear if it was a digital issue as it was very subtle. Will know to look specifically for this in the future. |
| Prostate | Dysplasia identification (three cases) | N/A. Case where there was a possible focus of PIN on digital which was less convincing on glass. It was difficult on either platform so would require a glass check. N/A. Case of PIN seen digitally and needed glass for confirmation. N/A. Case where digitally reported as benign and on glass reported by another pathologist as small foci of PIN and a focus of ASAP. |
| | Subjective finding/Reported by another pathologist (four cases) | B3. Case where a tiny focus of tumour at the prostate base margin was missed on digital but spotted on glass. Not clear if it was a digital issue as tumour is often harder to see at the base. No clinical difference. B3. Case with a minor difference in Gleason scoring—subjective. B3. Case of ASAP identified digitally and, when reported by another pathologist, identified as benign. N/A. Case where it was digitally reported as benign but on glass reported as ASAP by another pathologist. |
| | No details included (one case) | N/A. |
| BKP | Muscularis propria identification (one case) | N/A. Case where the pathologist missed the presence of muscularis propria as it was their first case reported on digital and they were concentrating on tumour grade assessment. Diagnosis of tumour, grade and stage were correct and there was no impact on clinical management. |
| | Reactive atypia identification (two cases) | B2. Case deferred to check if it was benign due to the inflammatory changes. N/A. Case deferred to check florid reactive changes were definitely reactive as atypia stands out more on digital. |
| | Grading of dysplasia (two cases) | N/A. Case deferred to clarify G2 vs. G3. N/A. Case deferred to clarify high or low grade of small foci of papillary urothelial carcinoma amongst abundant radiotherapy changes. |
| | Grading (two cases) | B3. Case of minor difference in grading, G2 versus G3. B2. Case of urothelial carcinoma falling short of high grade change digitally and reported as just meeting the criteria for high grade. |
| | Assessment of invasion (two cases) | B3. Case showed a minor difference in staging: very suspicious of T1 versus T1. B3. Case reported digitally as Ta but reported on glass as early suspicious invasion by another pathologist. |
| | Subjective finding (two cases) | B3. Case of PUNLMP versus low grade urothelial cell carcinoma, subjective difference. B3. Case of minor difference in grading between digital and glass. Subjective and reported by a different pathologist. |
| | Unusual or complex case (two cases) | N/A. Case of complex bilateral renal tumours requiring glass for interpretation. N/A. Case was difficult and required glass and opinions from other pathologists. |
| Breast | Mitotic count (three cases) | N/A. Case deferred as difficult to see mitoses on digital. Easy to find on glass and clearly enough to make the tumour grade 3. N/A. Case deferred as unable to score mitoses on digital. N/A. Case where the mitotic count was slightly undercalled on digital. This did not affect the grade. |
| | Identification of dysplasia (one case) | N/A. Case of possible atypia in one duct. Review of glass showed no atypia. Team opinion—agreed. |
| | Pleomorphism (one case) | N/A. Pleomorphism 3 on digital versus 2 on glass. This did not affect the grade. |
| | Calcium oxalate (one case) | N/A. Case deferred as could not see calcium oxalate confidently on digital and had to defer to glass to be able to polarise. Calcium phosphate is easy to see on digital. |
| | Her2 positivity amplified (three cases) | N/A. Case deferred as digital amplifies immunohistochemistry positivity, which needs to be taken into account when assessing. B3. Case deferred as digital interpretation more difficult due to apparent enhanced intensity of immunohistochemistry. B3. Case deferred as assessed as just reaching 2+ on digital (with low confidence) but on glass staining clearly less intense and in category. |
| | No details included (one case) | N/A. |
| Renal | Necrosis identification (one case) | B3. Case where a small area of eosinophilic necrosis was overlooked on the digital slide, easier to spot on glass. I now know I need to raise the threshold of suspicion or adjust the image so that eosinophilic areas of necrosis are more readily spotted. |
| | Unusual or complex case (one case) | N/A. Case where diagnosis was deferred until after glass review due to limited experience with lesions of acute glomerular thrombotic microangiopathy on digital. Interestingly, mesangiolysis appeared crisper and microthrombi were better defined on digital. |
| GI | Grading of tumour (one case) | N/A. Case with a slight difference in grade. Focal intermediate grade was better seen on glass, but also seen more clearly on unscanned slides. |
| | Helicobacter pylori identification (two cases) | N/A. Case where Helicobacter pylori organisms were more easily visible on glass. Will need to always check glass if morphology suggests Helicobacter pylori but organisms not seen on digital. N/A. Case of very scanty H. pylori organisms visible on toluidine blue. |
| | Identification of tiny focus of neuroendocrine tumour (one case) | N/A. Case deferred to check if a tiny focus of neuroendocrine tumour was present. |
| | Identification of metastatic focus (one case) | B2. Missed focus of metastatic disease in a lymph node. This did not change the staging as other lymph nodes were positive. |
| | Assessment of invasion (one case) | N/A. Case deferred as possible nodal involvement and adventitial involvement were very focal and not definitely invasive tumour. Post-neoadjuvant tumour difficult on the digital platform—need to see a few more of these before being confident without review on glass. |
| | Mucin (one case) | B3. More mucinous differentiation identified on glass review. This changed the diagnosis from an adenocarcinoma with 40% mucinous differentiation to a mucinous adenocarcinoma. No difference to treatment or prognosis. |
| | No details included (two cases) | N/A. |
| Liver | Reactive atypia and assessment of invasion (one case) | N/A. Case where atypia was more likely reactive on the basis of appearance on glass slides (but difficult on both). Need glass review due to uncertainty with regard to invasion. |
| Skin | Mitotic count (one case) | N/A. Case deferred to count mitoses. |
| | Measurements of margins and Breslow thickness (three cases) | B3. Case with a slight difference in clearance margin, but this was not clinically significant. B3. Case with a closer margin as needed to view at higher power on digital. B3. Case where Breslow thickness and deep margin changed on glass review. |
| | Incidental finding (one case) | N/A. Case where an incidental benign naevus was seen on glass which was not seen on digital. The main pathology was the basal cell carcinoma excision, however, and the parameters of the tumour were not affected, so the clinical outcome did not change. |
| Speciality | Learning Curve |
|---|---|
| General | Speed of reporting; Navigating the slide; Use of scanning magnification/panning for screening; Realising when to slow down and what parameters to check; Identifying potential pitfalls and when to defer to glass slides; Grading of dysplasia; Annotating lymph nodes |
| Urology Germ Cell | Assessment of rete invasion; Assessment of hilar invasion; Assessment of lymphovascular invasion; Identification of GCNIS; Identification of small foci of seminoma |
| Breast | Her2 scoring 1+/2+ cases |
| Renal | Identification of tubulitis in rejection and tubulointerstitial nephritis |
| GI | Identification of Helicobacter pylori on Toluidine blue staining |
| Respiratory | Identification of giant cells; Counting IgG4-positive plasma cells |