**5. Current Limits and Perspectives of Biomarkers in Renal Transplant**

Advances in high-throughput technologies have provided an avalanche of new potential biomarkers over the last decade. In general, however, their application in clinical practice is currently restrained by several drawbacks. Most available biomarkers do not meet the ideal requirements outlined in Table 11 and certainly require further validation through multicenter studies, as the single-center discovery step often inflates their apparent value [187].

Most importantly, their role and cost-effectiveness should be assessed in prospective randomized trials designed to compare biomarker-guided management with standard KTx management based on traditional diagnostic tools.

Despite these limits, biomarkers represent the cornerstone of precision medicine, which aims to integrate traditional clinical information and tailor medical care so as to select the best treatment for each individual patient [5]. This new frontier will likely profoundly change the way we monitor KTx and manage its complications.

Renal biopsy, the traditional gold standard for assessing graft dysfunction, is usually triggered by a change in serum creatinine and/or proteinuria and has limited diagnostic power for initial injury, when histological changes are minimal or equivocal [3]. By contrast, an ideal biomarker (or set of biomarkers) should lead to an earlier and more objective diagnosis (Table 11), making it possible to treat initial histological lesions pre-emptively, long before they become irreversible, or even before they become visible with traditional tools, by detecting patterns of molecular alteration that predate histological injury ("molecular rejection"). Biomarkers could decrease the need for renal biopsy to detect subclinical disease (e.g., protocol biopsies) and even substitute for it when it is contraindicated. Furthermore, while current new potential biomarkers in KTx mainly have a diagnostic/prognostic meaning, the area of monitoring, pharmacodynamic/response, and safety biomarkers (Table 1) remains substantially unexplored in this setting and could help improve the long-term management of allograft dysfunction (e.g., follow-up of patients after BPAR, with repeated, non-invasive monitoring biomarkers to rule out persistent subclinical rejection, or assessment of etiology and degree of activity/chronicity in CAD).


**Table 11.** Features of an ideal biomarker for kidney transplant (KTx) [1,2,187].

Particularly interesting perspectives are immunological risk stratification and the identification of low-risk, or even tolerant, patients.

Peripheral blood gene expression tests such as TruGraf [91] or kSORT [57] are already commercially available and appear accurate in identifying a state of "immunological quiescence" in stable recipients; thanks to their high negative predictive value (NPV), serial monitoring with these tests could rule out ongoing subclinical rejection as an alternative to surveillance biopsies and guide immunosuppression minimization in fragile patients at low immunological risk [188].

A further step forward would be to identify biomarkers of operational tolerance, a rare condition characterized by maintenance of stable renal function without any immunosuppressive therapy.

Tolerant patients seem to be characterized by increased expression of B cell-associated genes in blood and urine and by a peculiar B cell repertoire enriched in naive and transitional B cells. Of interest, this pattern appears to be associated with better long-term graft function [189], and potential biomarkers of this process are beginning to emerge. For example, TCL1A, an oncogene expressed in immature naive and transitional B cells that promotes their survival, has been associated with the immunosuppressive properties of this lymphocyte sub-population and seems to be upregulated in stable, rejection-free KTx recipients [190].

Of interest, Newell et al. identified a three-gene B-cell signature that correlated with increased expression of CD20 mRNA in the urinary sediment of tolerant patients; urinary levels of FoxP3, CD20, CD3, and perforin mRNAs were all higher in tolerant patients than in healthy controls, whereas only CD20 was higher than in stable KTx recipients [191]. Danger et al. showed that a composite score based on a 20-gene signature in peripheral blood cells could accurately discriminate operationally tolerant recipients from stable ones, independent of immunosuppressive therapy [192]. All these approaches still need validation, but they may pave the way for the identification of tolerance biomarkers, with important implications for the management of immunosuppressive therapy [193]. The state of the art of this family of biomarkers was recently analyzed in several reviews [2,194,195] and is beyond the scope of this work.

At the other end of the spectrum, biomarkers could preferentially be employed to monitor patients at high immunological risk (e.g., sensitized, DSA-positive recipients). Testing biomarkers in this subset increases the positive predictive value (PPV) owing to the higher a priori risk of AR. Combining different biomarkers can also increase diagnostic accuracy; for example, the association of kSORT with IFNγ ELISPOT improves predictive power for subclinical TCMR and ABMR [74].
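The dependence of predictive values on pre-test probability can be made concrete with a short Bayes-rule calculation (a generic sketch; the sensitivity, specificity, and prevalence figures below are hypothetical and are not taken from any cited assay):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Compute PPV and NPV from test characteristics and pre-test probability."""
    tp = sensitivity * prevalence              # true positives (per unit cohort)
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    fn = (1 - sensitivity) * prevalence        # false negatives
    tn = specificity * (1 - prevalence)        # true negatives
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Hypothetical assay: 85% sensitivity, 90% specificity.
# Low-risk cohort (5% prevalence of rejection) vs sensitized cohort (30%).
for prev in (0.05, 0.30):
    ppv, npv = predictive_values(0.85, 0.90, prev)
    print(f"prevalence {prev:.0%}: PPV {ppv:.2f}, NPV {npv:.2f}")
# prevalence 5%:  PPV 0.31, NPV 0.99  -> good for ruling rejection out
# prevalence 30%: PPV 0.78, NPV 0.93  -> far more useful for ruling it in
```

The same test characteristics thus yield a much higher PPV in a sensitized, high-prevalence cohort, which is the arithmetic behind reserving confirmatory biomarkers for high-risk recipients and rule-out (high-NPV) monitoring for low-risk ones.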

Another intriguing perspective is the application of artificial intelligence (AI) models, which allow computational analysis and interpretation of large-scale molecular data by exploiting machine learning algorithms and neural networks [196,197]. For example, classifiers such as artificial neural networks, support vector machines, and Bayesian inference have already been employed in pilot studies to screen KTx recipients requiring renal biopsy [198], and AI has proved useful in improving the estimation of the tacrolimus (TAC) area under the concentration-time curve (AUC) [199].
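As a rough illustration of what such a screening classifier does, the sketch below trains a minimal logistic-regression model on synthetic data (two hypothetical biomarker levels per patient; no real assay, cohort, or published model is implied):

```python
import math
import random

def train_logistic(X, y, lr=0.1, epochs=500):
    """Minimal logistic regression fitted by stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability
            err = p - yi                      # gradient of log-loss w.r.t. z
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Synthetic patients: label 1 = "refer for biopsy", 0 = "stable".
random.seed(0)
X = [[random.gauss(2.0, 0.5), random.gauss(2.0, 0.5)] for _ in range(50)] + \
    [[random.gauss(4.0, 0.5), random.gauss(4.0, 0.5)] for _ in range(50)]
y = [0] * 50 + [1] * 50
w, b = train_logistic(X, y)
accuracy = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(y)
print(f"training accuracy: {accuracy:.2f}")
```

The published pilot studies use richer classifiers (neural networks, SVMs, Bayesian models) and clinical features, but the workflow is the same: learn a decision boundary from labeled patients, then triage new recipients toward or away from biopsy.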

"Molecular microscope" is another important example application of AI to renal tissue transcriptomic analysis [93,94].

In another recent work, an unsupervised learning method integrating a wide range of parameters (clinical, functional, immunologic, and histologic) was applied to a large cohort of KTx recipients and identified five transplant glomerulopathy archetypes, each associated with a different 5-year allograft survival (ranging from 88% to 22%) [200].
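The archetype idea can be caricatured with a tiny k-means sketch on synthetic two-feature "patients" (the features, cluster structure, and counts are invented for illustration and bear no relation to the parameters or archetypes of [200]):

```python
import random

def kmeans(points, centroids, iters=20):
    """Minimal k-means: alternate nearest-centroid assignment and mean update."""
    for _ in range(iters):
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[i])))
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster (keep it if empty).
        centroids = [tuple(sum(dim) / len(c) for dim in zip(*c)) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Three well-separated synthetic patient groups of 30, each a (feature_1, feature_2) pair.
rng = random.Random(1)
points = [(rng.gauss(mx, 0.3), rng.gauss(my, 0.3))
          for mx, my in [(1, 1), (4, 1), (1, 4)] for _ in range(30)]
# Seed one starting centroid inside each generated group to keep the sketch stable.
centroids, clusters = kmeans(points, [points[0], points[30], points[60]])
print([len(c) for c in clusters])  # three recovered "archetypes" of 30 patients each
```

The cited study works in a far higher-dimensional space and validates the archetypes against graft survival, but the core operation is the same: group patients by similarity across many features without pre-assigned diagnostic labels.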

These studies suggest that progress in AI can contribute significantly to a completely new, more accurate disease nosology, integrating complex sets of biomarkers of different natures (from clinical data to molecular features) for a finer characterization of traditional entities.
