A High-Resolution Digital Pathological Image Staining Style Transfer Model Based on Gradient Guidance
Abstract
1. Introduction
- (1) By using only grayscale images to train and test the model, the problem caused by color differences is avoided. Although this method enhances the generalization of the model to a certain extent, it also reduces the classification performance of the model itself.
- (2) Staining-based data augmentation is applied to digital pathological images so that the model learns more of the potential staining variation during training. This forces the model to rely more on texture features; however, much of the information in digital pathological images is carried by the staining itself, so this method may also reduce model performance while enhancing generalization [34].
- (3) All pathological images are mapped to a fixed color combination during preprocessing. This method may introduce artifacts at the preprocessing stage when the stains are not properly separated. Moreover, it requires substantial prior knowledge and is therefore ill-suited to CAD systems.
- (4) The staining pattern of the test data is transferred to that of the training data through a stain transfer algorithm, so that the test data can be evaluated with the original model after the transfer [16,28]. This method does not require changing the originally trained model, so it does not degrade model performance. Moreover, stain transfer algorithms are usually based on unpaired generative adversarial networks, which need no additional labeling and are very easy to deploy. This approach has therefore attracted wide attention from researchers.
- (1) The resolution of a whole slide image (WSI) is very high, and considerable color differences are observed among different tissue structures within a single WSI. If the stain transfer algorithm is trained at high resolution, the resulting model may be unstable or produce low-contrast outputs. As shown in Figure 1, when patch images are cropped at high resolution, the color differences among patches are large; when an unpaired generative algorithm is used for stain transfer, a patch from the test data may be matched to any of the patches shown in Figure 1.
- (2) Diagnostic models for benign and malignant digital pathological images depend on the high resolution of WSIs. If the stain transfer algorithm is instead trained on low-resolution WSIs, patches of the same pixel size cover a larger extent of cell tissue, which alleviates the within-WSI color-difference problem, but the low resolution degrades the performance of the diagnostic model.
- (3) The regularity of nuclear and cell membrane edges is very important for diagnosing benign and malignant cells, so the stain transfer algorithm should keep the edges of the nucleus and cell membrane as clear as possible while still transferring the staining correctly.
- (1) We propose a stain transfer network based on an unpaired generative adversarial network to solve the generalization problem encountered in multi-center testing of benign and malignant diagnostic models. The network requires no additional data labeling and is easy to deploy.
- (2) We propose a new pair-wise training paradigm for our generative network that learns the correct stain transfer model at high resolution, resolving the contradiction between the high resolution of a WSI and its large internal color differences.
- (3) We introduce a gradient-guided loss function to train the generative network, which keeps the generated images sharp at the edges of the nucleus and cell membrane and thereby helps the performance of the benign and malignant diagnostic model.
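The exact form of the gradient-guided loss is not reproduced in this excerpt; the sketch below is one plausible formulation, assuming an L1 penalty between the Sobel gradient maps of the source patch and the generated (stain-transferred) patch. Function names are illustrative, not the authors'.

```python
import numpy as np
from scipy import ndimage


def sobel_magnitude(img: np.ndarray) -> np.ndarray:
    """Gradient magnitude of a grayscale image via the Sobel operator."""
    gx = ndimage.sobel(img.astype(float), axis=1)  # horizontal derivative
    gy = ndimage.sobel(img.astype(float), axis=0)  # vertical derivative
    return np.hypot(gx, gy)


def gradient_guided_loss(source: np.ndarray, generated: np.ndarray) -> float:
    """L1 distance between the gradient maps of the source and the
    stain-transferred patch. Minimizing this term pushes the generator to
    keep nuclear and membrane edges as sharp as in the input."""
    return float(np.mean(np.abs(sobel_magnitude(source)
                                - sobel_magnitude(generated))))
```

In practice this term would be added (with a weighting factor) to the adversarial and cycle-consistency losses of the generator; the weighting used in the paper is not stated in this excerpt.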
2. Materials and Methods
2.1. Dataset and Preprocessing
2.2. A High-Resolution Staining Style Transfer Model Based on Gradient Guidance
- (1) Preprocessing. To preserve the resolution of the WSI after stain transfer, we directly crop the WSI into patch images with a sliding window at the resolution to be preserved; in our experiments this was a magnification of 20×, which is routinely used in benign and malignant diagnostic models.
- (2) Gradient density calculation and classification. After cropping into patch images, we apply the Sobel operator to each patch to detect edges and compute the proportion of edge pixels among all pixels of the patch, which we call the gradient density. According to gradient density, patches are divided into three categories: gradient sparse (<10%), gradient medium (10–50%), and gradient dense (>50%).
- (3) Pair-constrained training of the generative adversarial network. Simply put, when selecting input images for the network, they must be paired according to gradient density: only images belonging to the same gradient-density class can be paired as network input during training. The rationale is that patches with the same gradient density tend to share the same tissue structure, which prevents the model from learning the staining style of other tissue structures during stain transfer.
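The three steps above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the excerpt does not specify how the Sobel response is binarized into edge pixels, so the `edge_threshold` on the normalized gradient magnitude is an assumption, and all function names are hypothetical.

```python
import random
from collections import defaultdict

import numpy as np
from scipy import ndimage


def crop_patches(wsi: np.ndarray, size: int, stride: int):
    """Step (1): slide a window over a grayscale WSI array, yielding
    square patches at the resolution to be preserved."""
    h, w = wsi.shape[:2]
    for y in range(0, h - size + 1, stride):
        for x in range(0, w - size + 1, stride):
            yield wsi[y:y + size, x:x + size]


def gradient_density(patch: np.ndarray, edge_threshold: float = 0.1) -> float:
    """Step (2): fraction of pixels lying on a Sobel edge. Thresholding
    the normalized gradient magnitude at `edge_threshold` is an assumed
    binarization rule."""
    gx = ndimage.sobel(patch.astype(float), axis=1)
    gy = ndimage.sobel(patch.astype(float), axis=0)
    mag = np.hypot(gx, gy)
    if mag.max() > 0:
        mag /= mag.max()
    return float(np.mean(mag > edge_threshold))


def density_class(d: float) -> str:
    """Bin a patch by gradient density using the thresholds in the text:
    sparse (<10%), medium (10-50%), dense (>50%)."""
    if d < 0.10:
        return "sparse"
    if d <= 0.50:
        return "medium"
    return "dense"


def pair_by_density(patches_a, patches_b, rng=random):
    """Step (3): pair constraint -- a patch from domain A is only paired
    with a randomly chosen domain-B patch of the same density class."""
    groups_b = defaultdict(list)
    for p in patches_b:
        groups_b[density_class(gradient_density(p))].append(p)
    pairs = []
    for p in patches_a:
        cls = density_class(gradient_density(p))
        if groups_b[cls]:
            pairs.append((p, rng.choice(groups_b[cls])))
    return pairs
```

During training, each sampled pair would then be fed to the unpaired adversarial network as one input combination; restricting sampling to within-class pairs is what keeps staining styles of unrelated tissue structures from leaking across.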
2.2.1. Unpaired Adversarial Generative Networks
2.2.2. High-Resolution Image Conversion
2.2.3. Generator Network Architecture
2.2.4. Discriminator Network Architecture
2.3. Paired Training Strategy
2.4. Gradient-Guided Loss Function
2.5. Evaluation Metrics
3. Results
3.1. Implementation Details
3.2. Visual Evaluation of the Stain Transfer Effect
3.3. Evaluation of Generalization Performance
- (1)
- (2) After stain transfer, the performance of the models improved, and HDGAN improved performance to a greater extent than CycleGAN.
- (3) After stain transfer, a performance gap between MIL models emerged. In Table 3, after HDGAN stain transfer, the AUC of IMIL increased to 0.8313 while that of MI-Net increased to 0.7233, an AUC gap of 0.1080.
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Sung, H.; Ferlay, J.; Siegel, R.L.; Laversanne, M.; Soerjomataram, I.; Jemal, A.; Bray, F. Global Cancer Statistics 2020: GLOBOCAN Estimates of Incidence and Mortality Worldwide for 36 Cancers in 185 Countries. CA Cancer J. Clin. 2021, 71, 209–249. [Google Scholar] [CrossRef]
- Viray, H.; Li, K.; Long, T.A.; Vasalos, P.; Bridge, J.A.; Jennings, L.J.; Halling, K.C.; Hameed, M.; Rimm, D.L. A prospective, multi-institutional diagnostic trial to determine pathologist accuracy in estimation of percentage of malignant cells. Arch. Pathol. Lab. Med. 2013, 137, 1545–1549. [Google Scholar] [CrossRef]
- Smits, A.J.; Kummer, J.A.; de Bruin, P.C.; Bol, M.; van den Tweel, J.G.; Seldenrijk, K.A.; Willems, S.M.; Offerhaus, G.J.; de Weger, R.A.; van Diest, P.J.; et al. The estimation of tumor cell percentage for molecular testing by pathologists is not accurate. Mod. Pathol. 2014, 27, 168–174. [Google Scholar] [CrossRef]
- Ilse, M.; Tomczak, J.; Welling, M. Attention-based deep multiple instance learning. In Proceedings of the International Conference on Machine Learning, PMLR, Stockholm, Sweden, 10–15 July 2018; pp. 2127–2136. [Google Scholar]
- Lu, M.Y.; Williamson, D.F.K.; Chen, T.J.; Chen, R.J.; Barbieri, M.; Mahmood, F. Data-efficient and weakly supervised computational pathology on whole-slide images. Nat. Biomed. Eng. 2021, 5, 555–570. [Google Scholar] [CrossRef] [PubMed]
- Yao, J.; Zhu, X.; Jonnagaddala, J.; Hawkins, N.; Huang, J. Whole slide images based cancer survival prediction using attention guided deep multiple instance learning networks. Med. Image Anal. 2020, 65, 101789. [Google Scholar] [CrossRef]
- Coudray, N.; Ocampo, P.S.; Sakellaropoulos, T.; Narula, N.; Snuderl, M.; Fenyö, D.; Moreira, A.L.; Razavian, N.; Tsirigos, A. Classification and mutation prediction from non–small cell lung cancer histopathology images using deep learning. Nat. Med. 2018, 24, 1559–1567. [Google Scholar] [CrossRef] [PubMed]
- Wei, J.W.; Tafe, L.J.; Linnik, Y.A.; Vaickus, L.J.; Tomita, N.; Hassanpour, S. Pathologist-level classification of histologic patterns on resected lung adenocarcinoma slides with deep neural networks. Sci. Rep. 2019, 9, 3358. [Google Scholar] [CrossRef] [PubMed]
- Topol, E.J. High-performance medicine: The convergence of human and artificial intelligence. Nat. Med. 2019, 25, 44–56. [Google Scholar] [CrossRef]
- Couture, H.D.; Williams, L.A.; Joseph, G.; Nyante, S.J.; Butler, E.N.; Marron, J.S.; Perou, C.M.; Troester, M.A.; Niethammer, M. Image analysis with deep learning to predict breast cancer grade, er status, histologic subtype, and intrinsic subtype. NPJ Breast Cancer 2018, 4, 30. [Google Scholar] [CrossRef] [PubMed]
- Gurcan, M.N.; Boucheron, L.E.; Can, A.; Madabhushi, A.; Yener, B. Histopathological image analysis: A review. IEEE Rev. Biomed. Eng. 2009, 2, 147–171. [Google Scholar] [CrossRef] [PubMed]
- Elmore, J.G.; Longton, G.M.; Carney, P.A.; Geller, B.M.; Onega, T.; Tosteson, A.N.A.; Nelson, H.D.; Pepe, M.S.; Allison, K.H.; Schnitt, S.J. Diagnostic concordance among pathologists interpreting breast biopsy specimens. JAMA 2015, 313, 1122–1132. [Google Scholar] [CrossRef] [PubMed]
- Yu, K.H.; Zhang, C.; Berry, G.J.; Altman, R.B.; Ré, C.; Rubin, D.L.; Snyder, M. Predicting non-small cell lung cancer prognosis by fully automated microscopic pathology image features. Nat. Commun. 2016, 7, 12474. [Google Scholar] [CrossRef] [PubMed]
- Litjens, G.; Sánchez, C.I.; Timofeeva, N.; Hermsen, M.; Nagtegaal, I.; Kovacs, I.; Christina, H.; Bult, P.; Van Ginneken, B.; Jeroen, V. Deep learning as a tool for increased accuracy and efficiency of histopathological diagnosis. Sci. Rep. 2016, 6, 26286. [Google Scholar] [CrossRef] [PubMed]
- Steiner, D.F.; MacDonald, R.; Liu, Y.; Truszkowski, P.; Hipp, J.D.; Gammage, C.; Thng, F.; Peng, L.; Stumpe, M.C. Impact of deep learning assistance on the histopathologic review of lymph nodes for metastatic breast cancer. Am. J. Surg. Pathol. 2018, 42, 1636. [Google Scholar] [CrossRef] [PubMed]
- Shaban, M.T.; Baur, C.; Navab, N.; Albarqouni, S. StainGAN: Stain style transfer for digital histological images. In Proceedings of the 2019 IEEE 16th International Symposium on Biomedical Imaging (ISBI 2019), Venice, Italy, 8–11 April 2019; IEEE: New York, NY, USA, 2019; pp. 953–956. [Google Scholar]
- Ciompi, F.; Geessink, O.; Bejnordi, B.E.; De Souza, G.S.; Baidoshvili, A.; Litjens, G.; Van Ginneken, B.; Nagtegaal, I.; Jeroen, V.D.L. The importance of stain normalization in colorectal tissue classification with convolutional networks. In Proceedings of the 2017 IEEE 14th International Symposium on Biomedical Imaging (ISBI 2017), Melbourne, Australia, 18–21 April 2017; IEEE: New York, NY, USA, 2017; pp. 160–163. [Google Scholar]
- Salvi, M.; Michielli, N.; Molinari, F. Stain color adaptive normalization (scan) algorithm: Separation and standardization of histological stains in digital pathology. Comput. Methods Programs Biomed. 2020, 193, 105506. [Google Scholar] [CrossRef] [PubMed]
- Zheng, Y.; Jiang, Z.; Zhang, H.; Xie, F.; Shi, J.; Xue, C. Adaptive color deconvolution for histological wsi normalization. Comput. Methods Programs Biomed. 2019, 170, 107–120. [Google Scholar] [CrossRef]
- Coltuc, D.; Bolon, P.; Chassery, J.M. Exact histogram specification. IEEE Trans. Image Process. 2006, 15, 1143–1152. [Google Scholar] [CrossRef] [PubMed]
- Reinhard, E.; Ashikhmin, M.; Gooch, B.; Shirley, P. Color transfer between images. IEEE Comput. Graph. Appl. 2001, 21, 34–41. [Google Scholar] [CrossRef]
- Roy, S.; Jain, A.K.; Lal, S.; Kini, J. A study about color normalization methods for histopathology images. Micron 2018, 114, 42–61. [Google Scholar]
- Macenko, M.; Niethammer, M.; Marron, J.S.; Borland, D.; Woosley, J.T.; Guan, X.; Schmitt, C.; Thomas, N.E. A method for normalizing histology slides for quantitative analysis. In Proceedings of the 2009 IEEE International Symposium on Biomedical Imaging: From Nano to Macro, Boston, MA, USA, 28 June–1 July 2009; IEEE: New York, NY, USA, 2009; pp. 1107–1110. [Google Scholar]
- Khan, A.M.; Rajpoot, N.; Treanor, D.; Magee, D. A nonlinear mapping approach to stain normalization in digital histopathology images using image-specific color deconvolution. IEEE Trans. Biomed. Eng. 2014, 61, 1729–1738. [Google Scholar] [CrossRef] [PubMed]
- Li, X.; Plataniotis, K.N. A complete color normalization approach to histopathology images using color cues computed from saturation-weighted statistics. IEEE Trans. Biomed. Eng. 2015, 62, 1862–1873. [Google Scholar] [CrossRef] [PubMed]
- Vahadane, A.; Peng, T.; Sethi, A.; Albarqouni, S.; Wang, L.; Baust, M.; Steiger, K.; Schlitter, A.M.; Esposito, I.; Navab, N. Structure-preserving color normalization and sparse stain separation for histological images. IEEE Trans. Med. Imaging 2016, 35, 1962–1971. [Google Scholar] [CrossRef] [PubMed]
- Janowczyk, A.; Basavanhally, A.; Madabhushi, A. Stain normalization using sparse autoencoders (stanosa): Application to digital pathology. Comput. Med. Imaging Graph. 2017, 57, 50–61. [Google Scholar] [CrossRef]
- BenTaieb, A.; Hamarneh, G. Adversarial stain transfer for histopathology image analysis. IEEE Trans. Med. Imaging 2017, 37, 792–802. [Google Scholar] [CrossRef]
- Zhu, J.Y.; Park, T.; Isola, P.; Efros, A.A. Unpaired image-to-image translation using cycle-consistent adversarial networks. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 2223–2232. [Google Scholar]
- Tellez, D.; Balkenhol, M.; Otte-Höller, I.; Loo, R.V.D.; Vogels, R.; Bult, P.; Wauters, C.; Vreuls, W.; Mol, S.; Karssemeijer, N. Whole-slide mitosis detection in h&e breast histology using phh3 as a reference to train distilled stain-invariant convolutional networks. IEEE Trans. Med. Imaging 2018, 37, 2126–2136. [Google Scholar]
- Bel, T.D.; Hermsen, M.; Kers, J.; Laak, J.V.D.; Litjens, G. Stain-transforming cycle-consistent generative adversarial networks for improved segmentation of renal histopathology. In Proceedings of the International Conference on Medical Imaging with Deep Learning–Full Paper Track, Amsterdam, The Netherlands, 4–6 July 2018. [Google Scholar]
- Chong, X.; Madeti, Y.; Cai, J.; Li, W.; Cong, L.; Lu, J.; Mo, L.; Liu, H.; He, S.; Yu, C.; et al. Recent developments in immunotherapy for gastrointestinal tract cancers. J. Hematol. Oncol. 2024, 17, 65. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
- Lyon, H.O.; Leenheer, A.P.D.; Horobin, R.W.; Lambert, W.E.; Schulte, E.K.W.; Liedekerke, B.V.; Wittekind, D.H. Standardization of reagents and methods used in cytological and histological practice with emphasis on dyes, stains and chromogenic reagents. Histochem. J. 1994, 26, 533–544. [Google Scholar] [CrossRef] [PubMed]
- Madabhushi, A.; Lee, G. Image analysis and machine learning in digital pathology: Challenges and opportunities. Med. Image Anal. 2016, 33, 170–175. [Google Scholar] [CrossRef] [PubMed]
- Albertina, B.; Watson, M.; Holback, C.; Jarosz, R.; Kirk, S.; Lee, Y.; Rieger-Christ, K.; Lemmerman, J. Radiology data from the cancer genome atlas lung adenocarcinoma [tcga-luad] collection. Cancer Imaging Arch. 2016. [Google Scholar] [CrossRef]
- Kirk, S.; Lee, Y.; Kumar, P.; Filippini, J.; Albertina, B.; Watson, M.; Rieger-Christ, K.; Lemmerman, J. Radiology data from the cancer genome atlas lung squamous cell carcinoma [tcga-lusc] collection. Cancer Imaging Arch. 2016. [Google Scholar] [CrossRef]
- He, Y.; Liu, Z.; Qi, M.; Ding, S.; Zhang, P.; Song, F.; Ma, C.; Wu, H.; Cai, R.; Feng, Y.; et al. PST-Diff: Achieving High-Consistency Stain Transfer by Diffusion Models with Pathological and Structural Constraints. IEEE Trans. Med. Imaging 2024, 43, 3634–3647. [Google Scholar] [CrossRef] [PubMed]
- Yan, R.; He, Q.; Liu, Y.; Ye, P.; Zhu, L.; Shi, S.; Gou, J.; He, Y.; Guan, T.; Zhou, G. Unpaired virtual histological staining using prior-guided generative adversarial networks. Comput. Med. Imaging Graph. 2023, 105, 102185. [Google Scholar] [CrossRef] [PubMed]
- Shin, S.J.; You, S.C.; Jeon, H.; Jung, J.W.; Roh, J. Style transfer strategy for developing a generalizable deep learning application in digital pathology. Comput. Methods Programs Biomed. 2021, 198, 105815. [Google Scholar] [CrossRef] [PubMed]
- Ronneberger, O.; Fischer, P.; Brox, T. U-net: Convolutional networks for biomedical image segmentation. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Munich, Germany, 5–9 October 2015; Springer: Berlin/Heidelberg, Germany, 2015; pp. 234–241. [Google Scholar]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 26 June–1 July 2016; pp. 770–778. [Google Scholar]
- Jiang, G.; Wei, J.; Xu, Y.; He, Z.; Zeng, H.; Wu, J.; Qin, G.; Chen, W.; Lu, Y. Synthesis of mammogram from digital breast tomosynthesis using deep convolutional neural network with gradient guided cgans. IEEE Trans. Med. Imaging 2021, 40, 2080–2091. [Google Scholar] [CrossRef]
- Wang, X.; Yan, Y.; Tang, P.; Bai, X.; Liu, W. Revisiting multiple instance neural networks. Pattern Recognit. 2018, 74, 15–24. [Google Scholar] [CrossRef]
- Campanella, G.; Hanna, M.G.; Geneslaw, L.; Miraflor, A.; Fuchs, T.J. Clinical-grade computational pathology using weakly supervised deep learning on whole slide images. Nat. Med. 2019, 25, 1301–1309. [Google Scholar] [CrossRef]
- Zhou, Y.; Lu, Y. Deep Hierarchical Multiple Instance Learning for Whole Slide Image Classification. In Proceedings of the 2022 IEEE 19th International Symposium on Biomedical Imaging (ISBI), Kolkata, India, 28–31 March 2022; pp. 1–4. [Google Scholar]
- Zhou, Y.; Lu, Y. Multiple Instance Learning with Task-Specific Multi-Level Features for Weakly Annotated Histopathological Image Classification. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Singapore, 22–27 May 2022; pp. 1366–1370. [Google Scholar]
- Zhou, Y.; Wei, J.; Helvie, M.A.; Chan, H.P.; Zhou, C.; Hadjiiski, L.; Lu, Y. Generating high resolution digital mammogram from digitized film mammogram with conditional generative adversarial network. In Proceedings of the Medical Imaging 2020: Computer-Aided Diagnosis, Houston, TX, USA, 16–19 February 2020; Volume 11314, pp. 508–513. [Google Scholar]
- Chen, T.; Kornblith, S.; Norouzi, M.; Hinton, G. A simple framework for contrastive learning of visual representations. In Proceedings of the International Conference on Machine Learning, PMLR, Virtual, 13–18 July 2020; pp. 1597–1607. [Google Scholar]
- Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980. [Google Scholar]
Table 1. Number of WSIs in each dataset split.

Dataset | Train Set for Stain Transfer Model | Train Set for MIL Model | Independent Test Set
---|---|---|---
TCGA-Lung | 100 | 740 | 210
Inhouse-Lung | 100 | 700 | 200
Table 2. Generalization performance of MIL models on the TCGA-Lung independent test set.

Method | MIL Model | AUC | p-Value | Accuracy | Precision | Recall | F1-Score
---|---|---|---|---|---|---|---
Direct test | MI-Net [43] | 0.7012 | - | 0.7000 | 0.6750 | 0.7714 | 0.7200
 | MIL-RNN [44] | 0.6912 | - | 0.6810 | 0.6532 | 0.7714 | 0.7074
 | Att-MIL [4] | 0.7018 | - | 0.7000 | 0.6694 | 0.7905 | 0.7249
 | CLAM [5] | 0.7118 | - | 0.7095 | 0.6803 | 0.7905 | 0.7313
 | DHMIL [45] | 0.7018 | - | 0.7095 | 0.6803 | 0.7905 | 0.7313
 | TSML-MIL [46] | 0.7289 | - | 0.7095 | 0.6803 | 0.7905 | 0.7313
 | IMIL [47] | 0.7312 | - | 0.7095 | 0.6803 | 0.7905 | 0.7313
CycleGAN [28] used for stain transfer before testing | MI-Net [43] | 0.8502 | <0.05 | 0.8381 | 0.8901 | 0.7714 | 0.8265
 | MIL-RNN [44] | 0.8622 | <0.05 | 0.8476 | 0.9101 | 0.7714 | 0.8351
 | Att-MIL [4] | 0.8811 | <0.05 | 0.8619 | 0.9419 | 0.7714 | 0.8482
 | CLAM [5] | 0.8817 | <0.05 | 0.8714 | 0.9643 | 0.7714 | 0.8571
 | DHMIL [45] | 0.8835 | <0.05 | 0.8714 | 0.9643 | 0.7714 | 0.8571
 | TSML-MIL [46] | 0.8856 | <0.05 | 0.8714 | 0.9535 | 0.7810 | 0.8586
 | IMIL [47] | 0.8856 | <0.05 | 0.8810 | 0.9762 | 0.7810 | 0.8677
HDGAN used for stain transfer before testing | MI-Net [43] | 0.8634 | <0.05 | 0.8524 | 0.8627 | 0.8381 | 0.8502
 | MIL-RNN [44] | 0.8818 | <0.05 | 0.8762 | 0.8990 | 0.8476 | 0.8725
 | Att-MIL [4] | 0.9011 | <0.05 | 0.8905 | 0.9184 | 0.8571 | 0.8867
 | CLAM [5] | 0.9011 | <0.05 | 0.8905 | 0.9184 | 0.8571 | 0.8867
 | DHMIL [45] | 0.9220 | <0.05 | 0.9000 | 0.9286 | 0.8667 | 0.8966
 | TSML-MIL [46] | 0.9223 | <0.05 | 0.9000 | 0.9286 | 0.8667 | 0.8966
 | IMIL [47] | 0.9243 | <0.05 | 0.9048 | 0.9381 | 0.8667 | 0.9010
Table 3. Generalization performance of MIL models on the Inhouse-Lung independent test set.

Method | MIL Model | AUC | p-Value | Accuracy | Precision | Recall | F1-Score
---|---|---|---|---|---|---|---
Direct test | MI-Net [43] | 0.5288 | - | 0.6150 | 0.6646 | 0.8231 | 0.7354
 | MIL-RNN [44] | 0.5128 | - | 0.6100 | 0.6711 | 0.7846 | 0.7234
 | Att-MIL [4] | 0.5671 | - | 0.6250 | 0.6846 | 0.7846 | 0.7312
 | CLAM [5] | 0.5510 | - | 0.6300 | 0.6842 | 0.8000 | 0.7376
 | DHMIL [45] | 0.5423 | - | 0.6300 | 0.6842 | 0.8000 | 0.7376
 | TSML-MIL [46] | 0.5647 | - | 0.6300 | 0.6842 | 0.8000 | 0.7376
 | IMIL [47] | 0.5832 | - | 0.6300 | 0.6842 | 0.8000 | 0.7376
CycleGAN [28] used for stain transfer before testing | MI-Net [43] | 0.7189 | <0.05 | 0.7500 | 0.7740 | 0.8692 | 0.8188
 | MIL-RNN [44] | 0.7191 | <0.05 | 0.7400 | 0.7671 | 0.8615 | 0.8116
 | Att-MIL [4] | 0.7634 | <0.05 | 0.8000 | 0.8358 | 0.8615 | 0.8485
 | CLAM [5] | 0.7658 | <0.05 | 0.8000 | 0.8358 | 0.8615 | 0.8485
 | DHMIL [45] | 0.7823 | <0.05 | 0.8000 | 0.8309 | 0.8692 | 0.8496
 | TSML-MIL [46] | 0.7923 | <0.05 | 0.8200 | 0.8561 | 0.8692 | 0.8626
 | IMIL [47] | 0.8012 | <0.05 | 0.8150 | 0.8496 | 0.8692 | 0.8593
HDGAN used for stain transfer before testing | MI-Net [43] | 0.7233 | <0.05 | 0.7400 | 0.7635 | 0.8692 | 0.8129
 | MIL-RNN [44] | 0.7233 | <0.05 | 0.7450 | 0.7651 | 0.8769 | 0.8172
 | Att-MIL [4] | 0.7787 | <0.05 | 0.8100 | 0.8382 | 0.8769 | 0.8571
 | CLAM [5] | 0.7802 | <0.05 | 0.8200 | 0.8507 | 0.8769 | 0.8636
 | DHMIL [45] | 0.8011 | <0.05 | 0.8200 | 0.8507 | 0.8769 | 0.8636
 | TSML-MIL [46] | 0.8281 | <0.05 | 0.8400 | 0.8769 | 0.8769 | 0.8769
 | IMIL [47] | 0.8313 | <0.05 | 0.8400 | 0.8769 | 0.8769 | 0.8769
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Tang, Y.; Zhou, Y.; Zhang, S.; Lu, Y. A High-Resolution Digital Pathological Image Staining Style Transfer Model Based on Gradient Guidance. Bioengineering 2025, 12, 187. https://doi.org/10.3390/bioengineering12020187