**CD45RC Expression of Circulating CD8<sup>+</sup> T Cells Predicts Acute Allograft Rejection: A Cohort Study of 128 Kidney Transplant Patients**

**Marie Lemerle 1, Anne-Sophie Garnier 1, Martin Planchais 1, Benoit Brilland 1, Yves Delneste 2,3, Jean-François Subra 1,2, Odile Blanchet 4, Simon Blanchard 2, Anne Croue 5, Agnès Duveau 1 and Jean-François Augusto 1,2,\***


Received: 22 June 2019; Accepted: 29 July 2019; Published: 1 August 2019

**Abstract:** Predictive biomarkers of acute rejection (AR) are lacking. Pre-transplant expression of CD45RC on blood CD8<sup>+</sup> T cells has been shown to predict AR in kidney transplant (KT) patients. The objective of the present study was to study CD45RC expression in a large cohort of KT recipients exposed to modern immunosuppressive regimens. CD45RC expression on T cells was analyzed in 128 KT patients, of whom 31 developed AR, 24 of these being T-cell mediated (TCMR). Pre-transplant CD4<sup>+</sup> and CD8<sup>+</sup> CD45RChigh T cell proportions were significantly higher in patients with AR. The frequency of CD45RChigh T cells was significantly associated with age at transplantation but did not differ significantly according to gender, history of transplantation, pre-transplant immunization, or de novo donor-specific anti-Human Leucocyte Antigen (HLA) antibody. AR-free survival was significantly better in patients with CD8<sup>+</sup> CD45RChigh T cells below 58.4% (*p* = 0.0005), but did not differ according to CD4<sup>+</sup> CD45RChigh T cell frequency (*p* = 0.073). In multivariate analysis, CD8<sup>+</sup> CD45RChigh T cells above 58.4% increased the risk of AR 4-fold (HR 3.96, *p* = 0.003). Thus, pre-transplant CD45RC expression on CD8<sup>+</sup> T cells predicted AR, mainly TCMR, in KT patients under modern immunosuppressive therapies. We suggest that CD45RC expression should be evaluated in a prospective study to validate its usefulness for quantifying the pre-transplant risk of AR.

**Keywords:** kidney transplantation; acute rejection; lymphocyte; CD45RC

#### **1. Introduction**

Significant progress has been made over the past few years in the immunological and histological fields, allowing for better differentiation and refinement of the diagnosis and prognosis of T cell-mediated rejection (TCMR) and of antibody-mediated rejection (ABMR) in kidney transplant patients [1]. While both rejection types may develop concomitantly, TCMR mainly occurs within the first year post-transplant, while ABMR usually develops later in the course and is associated with the presence of preformed or de novo donor-specific anti-Human Leucocyte Antigen (HLA) antibodies (DSA) [1,2]. Although modern immunosuppressive regimens efficiently prevent allograft rejection in most patients, acute rejection (AR) episodes still occur in some patients and are associated with premature graft loss and morbidity [1–3].

Several risk factors of AR have been identified in previous studies including young age, female gender, black race, and immunological characteristics (HLA mismatch, pre-transplant or de novo DSA) [4]. However, despite being indicative at the population level, when considered at the individual level, most of these factors do not allow for the accurate stratification of AR risk, especially in "low immunological risk patients," which represents most kidney transplant candidates. The identification of patients with higher versus lower AR risk among low immunological risk patients would theoretically allow one to tailor the immunosuppressive regimen according to the AR risk and thus to decrease post-transplant morbidity [5,6].

The risk of allograft rejection relies on graft-, environmental-, and host-related factors [1]. However, the molecular mechanisms underlying the development of alloreactivity are far from being fully understood [7]. An illustrative example of the inter-individual variability of AR risk is represented by operationally tolerant patients, defined as solid allograft recipients who do not develop allograft rejection despite immunosuppressive treatment discontinuation [7]. Thanks to intrinsic immunological factors, and probably also acquired factors, these patients do not mount an efficient alloreactive response.

The identification of biomarkers reflecting the level of tolerance emerges as a major goal in solid organ transplantation. This would allow one to tailor the immunosuppressive regimens, especially in low immunological risk kidney transplant candidates. Given that CD4<sup>+</sup> and CD8<sup>+</sup> T cell subsets have an essential role in the development of alloimmune response, defining T cell subpopulations with higher and lower alloimmune properties may constitute an interesting approach.

CD45 is a transmembrane protein tyrosine phosphatase heavily expressed on T cells and critical for signal transduction through its regulation of Src-family kinases [8,9]. Four CD45 isoforms (RO, RA, RB, RC), resulting from the alternative splicing of three exons, are expressed in humans [9]. The CD45RC isoform is highly expressed on human naive T cells, with a bimodal pattern on CD4<sup>+</sup> T cells (high and low expression) and a trimodal pattern on CD8<sup>+</sup> T cells (low, intermediate, and high expression) [10,11]. These patterns of expression define CD45RC T cell subsets with different cytokine profiles. Interestingly, the expression of CD45RC on T cells is highly variable between individuals and is genetically determined [10–12].

We demonstrated in a previous work that the level of CD45RC expression at the surface of blood CD8<sup>+</sup> T cells before kidney transplantation was associated with the risk of AR after transplantation [10,13]. This study was conducted on a cohort of 89 kidney transplant recipients transplanted between 1999 and 2004, and we observed that a pre-transplant proportion of CD8<sup>+</sup> CD45RChigh T cells above 54.7% conferred a 6-fold increased risk of developing AR after 4.8 years of follow-up [10]. The aim of the present study was thus to confirm this observation in a prospective cohort of kidney transplant patients treated with current immunosuppressive regimens.

#### **2. Material and Methods**

#### *2.1. Study Design and Aim*

This is a monocentric cohort study that included patients transplanted at the University Hospital of Angers between 2007 and 2015. During the study period, after giving their written consent, patients were offered the chance to participate in a biocollection ("Collection Néphrologie et voies urinaires"). Samples were collected before kidney transplantation and stored at the dedicated department ("Centre de Ressources Biologiques BB-0033-00038"). All patients who gave their written consent to the study were included. The primary aim of the study was to analyze the value of CD45RC expression on T cells for AR prediction. The study was approved by the Medical Ethics Committee of Angers University Hospital (2009/10).

#### *2.2. Immunosuppressive Regimens*

The immunosuppressive treatment was not imposed by the study and was based on the assessment of immunological risk according to clinical practice in our department, as detailed hereafter. Low immunological risk patients, defined as first-time kidney transplant recipients with PRA < 20%, received two injections of Basiliximab (Simulect; Novartis Pharma, Basel, Switzerland), while higher immunological risk recipients (previous transplantation, PRA > 20%) were more likely to receive antithymocyte globulins (ATG; Thymoglobuline; Genzyme, Lyon, France) during the first 3 to 7 days post-transplant. ATG was also used for induction in donors with cardiac arrest before brain death, in non-heart-beating donors, and when delayed graft function was anticipated by the clinician. Moreover, between 2010 and 2013, no induction therapy was given to patients aged over 70 years. All patients received a single methylprednisolone bolus of 500 mg followed by prednisone (1 mg/kg/day) with progressive tapering and discontinuation at the end of month 5 post-transplant, unless AR occurred. The maintenance immunosuppressive regimen relied mainly on mycophenolate mofetil or mycophenolic acid and tacrolimus.

#### *2.3. Data Collection and Definitions*

Characteristics of the study population were collected prospectively via the systematic screening of patients' medical records. All clinical events and biological data were recorded until the last follow-up, including anthropometric data, the nature of the original kidney disease, and graft donor characteristics. Diagnosis of acute rejection (AR) episodes was based on conventional clinical and laboratory criteria and confirmed by histological examination of a graft biopsy (according to the most recent Banff classification) [14]. AR diagnosis was based on clinical and laboratory criteria alone (clinically diagnosed AR) when the graft biopsy was non-contributive or contra-indicated.

#### *2.4. Sample Collection*

Peripheral blood mononuclear cells (PBMC) of kidney transplant candidates were prospectively harvested before transplantation and stored in liquid nitrogen. Patients with samples showing PBMC viability below 80% were excluded from the analysis.

After giving their written consent, fresh samples from end-stage renal disease (ESRD) patients and healthy donors (HD) were used to monitor the proliferative capacities of CD45RC T cells.

#### *2.5. Antibodies and Flow Cytometry Analysis*

The following conjugated antibodies were used to characterize CD45RC T cell subpopulations: CD3-VioGreen (REA613), CD4-PerCP-Vio700 (REA623), and CD8-PE-Vio770 (REA734) from Miltenyi Biotec, Bergisch-Gladbach, Germany; CD45RA-APC (HI100) from BD Biosciences, San Jose, CA, USA; and CD45RC-FITC (MT2) from IQ Products, Houston, TX, USA. Cell viability was systematically assessed (LIVE/DEAD Fixable Near-IR Dead Cell Stain kit; Fisher Scientific, Pittsburgh, PA, USA). Briefly, 10<sup>6</sup> cells were incubated with the viability dye according to the manufacturer's recommendations before incubation with the antibodies. Data were collected using a FACS-Canto II cytometer (BD Biosciences) and analyzed using FlowJo software (Ashland, OR, USA). The expression of CD45RC is bimodal on CD4<sup>+</sup> T cells, with some cells expressing low levels of CD45RC (CD45RClow) and others expressing high levels (CD45RChigh). On CD8<sup>+</sup> T cells, expression of CD45RC is trimodal, with a first fraction of cells expressing low levels (CD45RClow), a second fraction expressing intermediate levels (CD45RCInt), and a last fraction expressing high levels of CD45RC (CD45RChigh). Figure S1 illustrates the gating strategy.

#### *2.6. CD45RC*<sup>+</sup> *T Cell Purification and T Cell Proliferation Analyses*

CD45RC T cells were sorted from freshly isolated PBMC of end-stage renal disease (ESRD) patients and age- and sex-matched healthy donors (HD) using a FACS-Aria cytometer (BD Biosciences, San Jose, CA, USA). Briefly, after gradient centrifugation, 2 × 10<sup>7</sup> PBMCs were stained using a CellTrace Violet proliferation kit (Thermo Fisher Scientific, San Jose, CA, USA) for proliferation assessment and then stained using CD4-BV421 (L3T4, BD Biosciences), CD8-PE-Vio770 (REA734, Miltenyi Biotec), and CD45RC-FITC (MT2, IQ Products). CD45RChigh and CD45RClow subpopulations were sorted among CD4<sup>+</sup> and CD8<sup>+</sup> T cells. Purity was routinely above 95%. Then, 5 × 10<sup>4</sup> T cells were cultured at 37 °C in RPMI 1640 medium (containing 8% fetal calf serum) in 96-well round-bottomed microplates (Becton Dickinson, Franklin Lakes, NJ, USA), with or without 1 μg/mL plate-bound anti-CD3 (Beckman-Coulter, Brea, CA, USA) and 0.5 μg/mL soluble anti-CD28 (Beckman-Coulter). After 72 h of culture, cells were harvested and proliferation was assessed using flow cytometry (FACS-Canto II; BD Biosciences).

#### *2.7. Statistical Analysis*

Data were expressed as medians with minimum to maximum values for continuous variables and absolute counts with percentages for categorical variables. Categorical and continuous data were analyzed with the χ<sup>2</sup> or Fisher's exact test and the Mann–Whitney U test, respectively. The Wilcoxon matched-pairs rank test was used to compare the proliferative capacities of T cells. The predictive value of CD45RC subset frequency for the first AR episode was analyzed using receiver operating characteristic (ROC) curves. Subsequently, cut-off values were determined using the Youden index. The Kaplan–Meier method was used to analyze AR-free survival according to the predetermined cut-off values of CD45RC subset frequencies. A log-rank test was used to compare survival curves. Correlations were analyzed using Spearman's rank correlation test. Multivariate Cox models were used to analyze the association between CD45RC subset frequencies and AR. Results are reported as hazard ratios (HR) with 95% CIs. All *p*-values were two-sided, and a *p*-value lower than 0.05 was considered statistically significant. Statistical analysis was performed using GraphPad Prism® version 7 (San Diego, CA, USA) and SPSS® software version 22.0 (IBM, Armonk, NY, USA).
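The cut-off determination described above (ROC analysis followed by the Youden index) can be sketched in a few lines of code. This is an illustrative sketch only: the biomarker values and outcome labels below are hypothetical, not the study's patient-level data.

```python
# Illustrative sketch of cut-off selection with the Youden index
# (J = sensitivity + specificity - 1). Hypothetical data, not study values.

def youden_cutoff(values, labels):
    """Return the threshold maximizing J = sensitivity + specificity - 1.

    values: biomarker measurements (e.g., % CD8+ CD45RChigh T cells)
    labels: 1 if the patient later experienced AR, 0 otherwise
    """
    positives = sum(labels)
    negatives = len(labels) - positives
    best_j, best_cut = -1.0, None
    for cut in sorted(set(values)):
        # Classify "test positive" as value >= candidate threshold
        tp = sum(1 for v, y in zip(values, labels) if v >= cut and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < cut and y == 0)
        sens = tp / positives
        spec = tn / negatives
        j = sens + spec - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Hypothetical cohort: AR patients tend to have higher frequencies.
values = [30, 35, 40, 45, 50, 55, 60, 62, 65, 70, 75, 80]
labels = [0,  0,  0,  0,  0,  0,  1,  1,  0,  1,  1,  1]
cut, j = youden_cutoff(values, labels)  # cut = 60, J = 6/7
```

In practice, the ROC curve and threshold would come from a statistics package rather than hand-rolled code; the sketch only makes explicit how the Youden index trades off sensitivity against specificity at each candidate cut-off.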

#### **3. Results**

#### *3.1. Characteristics of the Population*

Between January 1, 2007, and December 31, 2015, 396 patients underwent kidney transplantation at Angers University Hospital. Among them, 292 patients had blood samples collected and stored in a biocollection before transplantation, and 140 gave their written consent to participate in the present study. Among these 140 patients, samples from 12 patients were excluded because of technical errors (*n* = 6) or poor blood cell viability (*n* = 6). Thus, 128 patients were finally included and analyzed (Figure 1, flowchart).

**Figure 1.** Flowchart of the study.

The population was predominantly composed of males, with a median age of 50.2 years. The main cause of ESRD was autosomal dominant polycystic kidney disease, and patients were first-time transplant recipients in 90% of cases. Based on PRA, 69.5% were non-sensitized before transplantation, while 6.25% of patients had a PRA > 20%. Basiliximab was the predominant induction therapy, used in 56.3% of patients, and most patients received tacrolimus with mycophenolate mofetil as a maintenance regimen. These data are detailed in Table 1.

**Table 1.** Baseline characteristics of the population. Results are presented as a median with minimum to maximum value ranges for continuous variables and absolute count and percentage for categorical variables.


ADPKD, autosomal dominant polycystic kidney disease; BMI, body mass index; GN, glomerulonephritis; HLA, human leukocyte antigens; MMF, mycophenolate mofetil; MPA, mycophenolic acid; PRA, Panel Reactive Antibody; Tac, tacrolimus; TIN, tubulo-interstitial nephropathy.

#### *3.2. Acute Rejection Episodes*

The mean follow-up of the cohort was 3.82 ± 2.22 years. During follow-up, AR occurred in 31 patients (24.2%) at a mean delay of 0.73 ± 1.24 years post-transplant. When considering only the first AR episode, 28 were histologically proven and 3 were diagnosed based on clinical and biological criteria. Among the histologically proven AR cases, 24 were TCMR, of which 6 were borderline. The four other AR episodes were ABMR in one case and mixed AR (TCMR and ABMR) in the three other cases. At one year post-transplant, mean serum creatinine was 141.4 ± 75.2 μmol/L and mean glomerular filtration rate (GFR) was 53.2 ± 21.8 mL/min/1.73 m<sup>2</sup>. DSA developed in 15 patients (11.7%) during follow-up. These data are reported in Table 2.


**Table 2.** Acute rejection episodes. Results are presented as a median with minimum to maximum value ranges for continuous variables and absolute count and percentage for categorical variables.

Patients who experienced AR had more frequently received Basiliximab as induction therapy, whereas patients who did not experience AR had more frequently received ATG (*p* = 0.035). Baseline characteristics, including age and pre-transplant immunization, were not significantly different between groups. These data are reported in Table 3. When borderline ARs were excluded, no significant differences were observed between patients with and without AR (Table S1).

<sup>\*</sup> In patients followed at the indicated time; AR, acute rejection; AMR, antibody-mediated rejection; DSA, donor-specific antibodies; GFR, glomerular filtration rate; TCMR, T-cell-mediated rejection.



#### *3.3. Proliferative Capacities of CD45RC T Cells*

The proliferative properties of CD45RC T cells have previously been studied only in HD [10,11]. Thus, we analyzed the proliferative properties of these subpopulations in ESRD patients as compared to age- and sex-matched HD. As shown in Figure 2, the proliferative properties of CD45RC T cells were not different between ESRD patients and HD, suggesting that their immune function is maintained in ESRD.

**Figure 2.** Analysis of proliferative capacities of CD45RClow and CD45RChigh T cells in ESRD patients and HD. After 72 h, the proliferation of activated CD4<sup>+</sup>CD45RClow (**A**), CD4<sup>+</sup>CD45RChigh (**B**), CD8<sup>+</sup>CD45RClow (**C**), and CD8<sup>+</sup>CD45RChigh (**D**) T cell subsets of ESRD patients (black bars) and HD (white bars) was analyzed. The experiment reports results from four ESRD patients and four age- and sex-matched HD. Error bars show the median with a 95% CI. Comparisons were done using the Wilcoxon matched-pairs rank test. ns, non-significant; CI, confidence interval; ESRD, end-stage renal disease; HD, healthy donors.

#### *3.4. Association between CD45RC Expression on T Cells and Acute Rejection*

We first analyzed the association of CD45RC expression on T cells with patients' characteristics. As shown in Figure 3, the levels of CD45RC expression on CD4<sup>+</sup> and CD8<sup>+</sup> T cells were not significantly different according to gender, the number of previous transplantations, the level of pre-transplant immunization, or de novo DSA development. As previously reported in healthy subjects [10,11], CD45RC expression on CD4<sup>+</sup> and CD8<sup>+</sup> T cells correlated with age (*p* = 0.047 and *p* = 0.0002, respectively).

**Figure 3.** Proportion of CD45RChigh and CD45RClow CD4<sup>+</sup> and CD8<sup>+</sup> T cells according to gender (**A**), previous transplantation (**B**), pre-transplant PRA (**C**), de novo DSA occurrence (**D**), and age at transplantation (**E**). For A–D, comparisons were done using the Mann–Whitney U test and error bars show the median with a 95% CI. For E, correlation analysis was done using the Spearman test. CI, confidence interval; PRA, Panel Reactive Antibody.

We next analyzed the frequency of CD45RC T cell subsets according to the occurrence of AR (Table 4). In line with our previous observations [10], patients who experienced AR had a higher proportion of CD4<sup>+</sup> and CD8<sup>+</sup> CD45RChigh T cells than patients who did not develop AR. The difference between groups remained significant whether all AR episodes were considered, the analysis was restricted to biopsy-proven ARs, or borderline AR episodes were excluded. Moreover, the absolute numbers of both CD4<sup>+</sup> and CD8<sup>+</sup> CD45RChigh cells were significantly greater in patients who experienced AR (*p* = 0.0101 and 0.0073, respectively; Figure S2).


**Table 4.** Frequency of CD4<sup>+</sup> and CD8<sup>+</sup> CD45RC subsets according to AR occurrence. Comparisons were done using the Mann–Whitney U test. Significant *p*-values appear in bold.

\* Patients with clinically diagnosed AR without biopsy were excluded. \*\* Patients with biopsy-proven AR, excluding patients with borderline AR. CD45RC subsets were determined as specified in the Materials and Methods section. High, high expression; Low, low expression; Int, intermediate expression.

#### *3.5. Value of CD45RC Expression on T Cell for Acute Rejection Prediction*

We next determined the best thresholds of CD4<sup>+</sup> and CD8<sup>+</sup> CD45RChigh T cell frequencies for AR prediction. Using ROC curve analysis, we determined 45.4% and 58.4% as the best thresholds of CD4<sup>+</sup> and CD8<sup>+</sup> CD45RChigh frequencies, respectively (Figure 4A). Using these thresholds, we observed that AR-free survival was significantly greater in patients with a CD8<sup>+</sup> CD45RChigh T cell frequency below 58.4% (*p* = 0.0005), while AR-free survival was not significantly different according to CD4<sup>+</sup> CD45RChigh T cell frequency (*p* = 0.073) (Figure 4B, upper and lower panels). Thus, CD8<sup>+</sup> CD45RChigh T cell frequency better differentiates patients at risk of AR than CD4<sup>+</sup> CD45RChigh T cell frequency. The sensitivity, specificity, positive predictive value, and negative predictive value of a CD8<sup>+</sup> CD45RChigh T cell frequency above 58.4% for AR were 80.6%, 55.7%, 36.8%, and 90%, respectively. These results are illustrated in Figure 4C. We did not observe any correlation between TCMR severity and the proportion of CD8<sup>+</sup> CD45RChigh T cells (Figure S3).
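As a consistency check, the four reported predictive values can be reproduced from a single 2 × 2 table. With 31 AR and 97 non-AR patients, the cell counts below are a reconstruction inferred from the reported percentages (they are not taken directly from the paper's tables), and they yield exactly the stated figures:

```python
# Reconstructed 2x2 table for the CD8+ CD45RChigh > 58.4% cut-off.
# Counts inferred from the reported percentages (31 AR, 97 non-AR patients);
# a hypothetical reconstruction, not the paper's published table.
tp, fn = 25, 6    # AR patients above / below the cut-off
fp, tn = 43, 54   # non-AR patients above / below the cut-off

sensitivity = tp / (tp + fn)   # 25/31 -> 80.6%
specificity = tn / (tn + fp)   # 54/97 -> 55.7%
ppv = tp / (tp + fp)           # 25/68 -> 36.8%
npv = tn / (tn + fn)           # 54/60 -> 90.0%
```

The high NPV (90%) relative to the modest PPV (36.8%) reflects the roughly 24% AR prevalence in the cohort: a frequency below the cut-off argues fairly strongly against subsequent AR, whereas a frequency above it identifies an enriched, but far from certain, risk group.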

We finally analyzed the risk factors of AR in a Cox model analysis (Table 5). We successively considered all ARs (including those clinically diagnosed), ARs excluding borderline cases, and biopsy-proven ARs. A CD8<sup>+</sup> CD45RChigh T cell frequency above 58.4% was significantly associated with AR after adjustment for the type of induction. The CD8<sup>+</sup> CD45RChigh T cell frequency and ATG as induction treatment were both associated with AR (all, including borderline) and with biopsy-proven AR. When borderline ARs were excluded, ATG was no longer significantly associated with AR occurrence.

**Figure 4.** Predictive value of CD4<sup>+</sup> and CD8<sup>+</sup> CD45RC T cell subsets. (**A**) ROC curve analysis of CD4<sup>+</sup> and CD8<sup>+</sup> CD45RChigh T cell subsets for AR prediction. (**B**) Rejection-free survival of patients according to CD8<sup>+</sup> (upper panel) and CD4<sup>+</sup> (lower panel) CD45RChigh T cell proportions. Comparison between survival curves was done using a log-rank test. (**C**) Predictive values of a CD8<sup>+</sup> CD45RChigh T cell frequency above 58.4% for AR prediction. AUC, area under the curve.

**Table 5.** Multivariate Cox analysis of factors associated with acute rejection occurrence. Significant *p*-values appear in bold.


\* Patients with biopsy-proven AR, excluding patients with borderline AR. \*\* Patients with clinically diagnosed AR without biopsy were excluded.

#### **4. Discussion**

In the present work, we confirmed that the CD8<sup>+</sup> CD45RChigh T cell subset was associated with an increased risk of AR. We determined that patients with a CD8<sup>+</sup> CD45RChigh T cell frequency above 58.4% had a 4-fold increased risk of AR. These results are in line with our previous work [10,13] and confirm the value of CD45RC expression on T cells for assessing the immunological risk of candidates for kidney transplantation.

As compared to our previous work [10,13], patients in the present study were treated with current immunosuppressive strategies. Indeed, most patients received Basiliximab as an induction regimen and tacrolimus as maintenance, while ATG followed by tacrolimus monotherapy after 6 months post-transplant was the most commonly used regimen in our previous study. Another important difference is that patients considered at high immunological risk were included in the present study. Although pre-transplant sensitized patients represented a minority of the study population, we suggest that CD45RC expression may also help determine the AR risk in this specific population. Interestingly, CD45RC expression was not significantly different between patients with or without previous kidney transplantation, or between those sensitized or not before kidney transplantation, which suggests that previous exposure to immunosuppressive drugs may not affect CD45RC expression on T cells. As previously reported, CD45RC expression was negatively associated with age, and the correlation was stronger for the CD8<sup>+</sup> T cell subset [10,11]. A genetic control of CD45RC expression on T cells has been demonstrated in rats [12,15]; however, its regulation in humans remains largely unknown.

The AR incidence, including borderline AR, in our study was 24.2% after a mean follow-up of almost 4 years. As expected, most AR episodes were TCMR, with ABMR being implicated in only four AR episodes. This observation is in line with recent reports [3,16] showing the predominant occurrence of TCMR as compared to ABMR in the first few years after transplantation. Most AR episodes occurred within the first few years following kidney transplantation, which is also in line with published data [17,18]. Thus, CD45RC expression mainly allows one to assess the risk of TCMR, which represented the majority of ARs in the cohort. Interestingly, we did not observe any difference in CD45RChigh T cell subset frequency according to subsequent DSA development, which suggests that CD45RC status does not allow one to assess the risk of ABMR. TCMR still represents a significant cause of morbidity in the era of ABMR, the latter being the main cause of late graft loss [19]. Moreover, several recent reports have suggested that the occurrence of TCMR is independently linked to subsequent interstitial fibrosis development [20] and probably to an increased risk of ABMR development through the exposure of HLA antigens within the kidney [18,21,22]. Thus, these data suggest that decreasing the rate of TCMR, including sub-clinical TCMR, could improve the long-term graft outcome and decrease the risk of ABMR. This reinforces the need to better delineate patients at high TCMR risk in order to prevent it by adjusting the immunosuppressive regimen.

Interestingly, not only the CD45RChigh proportion but also the absolute count of CD45RChigh CD4<sup>+</sup> and especially CD8<sup>+</sup> T cells was associated with AR occurrence. In vitro, CD45RChigh CD8<sup>+</sup> T cells from HD mainly produced interferon gamma, a key cytokine in TCMR, and low levels of regulatory cytokines [10]. Whether the ESRD milieu affects the cytokine profile of CD45RC T cell subsets remains to be fully analyzed. However, we show here that CD45RC T cell subsets from ESRD patients had proliferative capacities similar to those of HD. This suggests that the immune functions of CD45RC T cell subsets are preserved in a uremic context.

In the present study, we determined that 58.4% was the best threshold of CD8<sup>+</sup> CD45RChigh T cells for AR prediction. In our previous work, the best threshold was determined at 54.7% [10]. Notably, moving the cut-off to this earlier value would result in minimal changes in the predictive value of CD8<sup>+</sup> CD45RChigh T cells in the present study. The very close cut-off values determined in the two works, conducted in two different populations with different immunosuppressive regimens, reinforce the strength of this biomarker.

The present study has several limitations. First, only 32% of patients transplanted during the period were included, and we cannot exclude a selection bias. Second, AR diagnosis was based on for-cause biopsies and not protocol biopsies; thus, the value of CD45RC expression for subclinical rejection remains to be investigated. Moreover, given the relatively short follow-up, we were not able to analyze the relationship between pre-transplant CD45RC expression and the risk of ABMR, which occurred in only four patients in the present study.

In conclusion, the present study confirms and extends the data on CD45RC, which has emerged as a promising biomarker for assessing the risk of AR before transplantation. This work confirmed that a pre-transplant proportion of CD8<sup>+</sup> CD45RChigh T cells above 58.4% is associated with a 4-fold increased risk of AR, mainly TCMR. Thus, CD45RC could help define the immunological risk and calibrate the immunosuppressive regimen before kidney transplantation. However, the value of CD45RC expression on T cells should be evaluated in a multicenter prospective cohort study with protocol biopsies to confirm its usefulness and to study its predictive value for subclinical AR.

**Supplementary Materials:** The following are available online at http://www.mdpi.com/2077-0383/8/8/1147/s1. Figure S1: Representative experiment showing the cytometry gating strategy. Figure S2: Absolute number of CD4<sup>+</sup> and CD8<sup>+</sup> CD45RChigh T cells according to AR occurrence. Figure S3: Correlation between CD8<sup>+</sup> CD45RChigh T cell proportion and the severity of TCMR; the correlation was analyzed using Spearman's test. Table S1: Univariate analysis of factors associated with acute rejection occurrence; patients with borderline AR were excluded from the analysis.

**Author Contributions:** Conceptualization, J.-F.A., A.D., and J.-F.S.; Methodology, J.-F.A., A.D., and J.-F.S.; Validation, J.-F.A.; Formal Analysis, M.L., S.B., A.-S.G., and B.B.; Investigation, M.L., A.-S.G., and J.-F.A.; Resources, O.B.; Data Curation, O.B.; Writing—Original Draft Preparation, M.L., M.P., and A.-S.G.; Writing—Review and Editing, J.-F.A., J.-F.S., O.B., Y.D., A.D., and A.C.; Supervision, J.-F.A. and J.-F.S.; Project Administration, J.-F.A.

**Acknowledgments:** We acknowledge Ms Jolivot Denise and the members of "La Maison de la Recherche" of the University Hospital of Angers for the help they provided in this project.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **Abbreviations**


#### **References**


© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

*Article*

### **Exposure to Hyperchloremia Is Associated with Poor Early Recovery of Kidney Graft Function after Living-Donor Kidney Transplantation: A Propensity Score-Matching Analysis**

**Jin Go 1, Sun-Cheol Park 2, Sang-Seob Yun 2, Jiyeon Ku 3, Jaesik Park 4, Jung-Woo Shim 4, Hyung Mook Lee 4, Yong-Suk Kim 4, Young Eun Moon 4, Sang Hyun Hong <sup>4</sup> and Min Suk Chae 4,\***


Received: 8 June 2019; Accepted: 1 July 2019; Published: 2 July 2019

**Abstract:** The effects of hyperchloremia on kidney grafts have not been investigated in patients undergoing living-donor kidney transplantation (LDKT). In this study, data from 200 adult patients undergoing elective LDKT between January 2016 and December 2017 were analyzed after propensity score (PS) matching. The patients were allocated to hyperchloremia and non-hyperchloremia groups according to the occurrence of hyperchloremia (i.e., ≥110 mEq/L) immediately after surgery. Poor early graft recovery was defined as an estimated glomerular filtration rate (eGFR) < 60 mL/min/1.73 m<sup>2</sup> during the first 48 hours after surgery. After PS matching, no significant differences in perioperative recipient or donor graft parameters were observed between groups. Although the total amount of crystalloid fluid infused during surgery did not differ between groups, the proportions of the main crystalloid fluid types used (i.e., 0.9% normal saline vs. Plasma Solution-A) did. The eGFR increased gradually through postoperative day (POD) 2 in both groups. However, the proportion of patients with eGFR > 60 mL/min/1.73 m<sup>2</sup> on POD 2 was higher in the non-hyperchloremia group than in the hyperchloremia group. In this PS-adjusted analysis, hyperchloremia was significantly associated with poor graft recovery on POD 2. In conclusion, exposure to hyperchloremia may have a negative impact on early graft recovery in LDKT.

**Keywords:** hyperchloremia; kidney graft dysfunction; living donor kidney transplantation

#### **1. Introduction**

Acute kidney injury (AKI) is a common complication that increases the risk of poor graft outcome in patients undergoing kidney transplantation (KT). Kidney grafts seem to be vulnerable to poor early function recovery because of various types of acute perioperative damage, such as ischemia-reperfusion injury, immunological insult, medication toxicity, and surgical stress [1]. In living-donor kidney transplantation (LDKT), better graft quality and patient condition result in better early graft function compared with deceased-donor KT. However, the reported incidence of poor early graft function after LDKT is 10–20%, and this early graft dysfunction is closely related to an increased risk of long-term graft failure [2,3].

Chloride is a major anion in extracellular fluid and an essential element for plasma tonicity. Chloride plays a role in maintaining aspects of homeostasis, such as acid–base balance, muscular activation, and immunological response [4]. Hyperchloremia is a potential risk factor for AKI in critically ill patients admitted to the intensive care unit (ICU) [5–8]. In patients undergoing noncardiac surgery, the perioperative development of hyperchloremia was reported to be independently associated with morbidity and mortality risks [9]. These harmful effects of hyperchloremia have largely been investigated in the context of chloride-rich solution resuscitation, which has been found to have a detrimental impact on kidney function [10,11]. However, the relationship between hyperchloremia and AKI has been investigated only in native (i.e., not transplanted) kidneys, in the context of clinical conditions such as sepsis, subarachnoid hemorrhage, and abdominal surgery [5–8,12].

To date, the effects of hyperchloremia on kidney grafts have not been investigated in patients undergoing LDKT. We investigated the association between hyperchloremia and early graft function recovery. In addition, postoperative outcomes, such as the requirement for renal replacement therapy (RRT), graft rejection, and mortality, were investigated according to the occurrence of hyperchloremia.

#### **2. Patients and Methods**

#### *2.1. Ethical Considerations*

This study was approved by the Institutional Review Board and Ethics Committee of Seoul St. Mary's Hospital (KC19RESI0088) and was performed in accordance with the Declaration of Helsinki. The requirement for informed consent was waived due to the retrospective study design.

#### *2.2. Study Population*

The study population consisted of 330 adult patients (i.e., age ≥ 19 years) who underwent elective LDKT at Seoul St. Mary's Hospital between January 2016 and December 2017. Pediatric patients (i.e., age < 19 years), those undergoing deceased-donor or ABO-incompatible KT, patients undergoing multiorgan transplantation including the kidney, and those undergoing re-transplantation were excluded because these patients require various complex immunosuppression regimens or surgical techniques [13–17]. Patients with defective or missing recipient or donor graft data were also excluded. Based on these exclusion criteria, 29 patients were not included. In total, 301 patients were initially enrolled and their data were included in the propensity score (PS)-matching analysis; data from 200 matched patients were included in the final analysis.

#### *2.3. Living Donor Kidney Transplantation*

The surgical technique for LDKT involved an initial hockey-stick (i.e., pararectal inverted J-shaped curvilinear) incision and exposure of the right pelvic fossa. After back-table preparation of the graft, end-to-side anastomoses between the recipient external iliac artery/vein and the graft renal artery/vein were performed using Prolene 6.0 non-absorbable monofilament suture (Ethicon, Somerville, NJ, USA). Subsequently, ureteroneocystostomy was performed with insertion of a double-J stent (INLAY ureteral stent; Bard Medical, Covington, GA, USA) using the Lich–Grégoir technique [18,19]. After careful hemostasis and re-assessment of the vascular anastomosis and renal pedicle area, closed drains were placed and the wound was closed.

Balanced anesthesia was induced using 1–2 mg/kg propofol (Fresenius Kabi, Bad Homburg, Germany) and 0.6 mg/kg rocuronium (Merck Sharp & Dohme Corp., Kenilworth, NJ, USA), and maintained using 2.0–6.0% desflurane (Baxter, Deerfield, IL, USA) with medical air/oxygen and continuous remifentanil (Hanlim Pharmaceutical Co., Ltd., Seoul, Republic of Korea) infusion at a rate of 0.1–0.5 μg/kg/min, as appropriate. An appropriate hypnotic depth, between 40 and 50, was maintained using a Bispectral Index™ instrument (Medtronic, Minneapolis, MN, USA). Central venous pressure (CVP) was monitored using a central venous catheter (Arrow, Morrisville, NC, USA) inserted before surgery. The hemodynamic status was adjusted to a mean arterial pressure of ≥65 mmHg with infusion of dopamine (Reyon Pharm. Co., Ltd., Seoul, Republic of Korea) at a rate of 5–10 μg/kg/min. Mannitol (Daihan Pharm. Co., Ltd., Seoul, Republic of Korea) was used at doses of 20–50 g to promote urine flow [20]. However, to avoid arterial injury, we did not routinely cannulate or puncture a radial artery for continuous blood pressure monitoring or arterial blood gas analysis (ABGA). ABGA was performed only when pulse oximetry showed an oxygen saturation below 90%.

For intraoperative fluid therapy, an isotonic crystalloid fluid (0.9% normal saline (Daihan Pharm Co., Ltd.) or Plasma Solution-A (CJ Healthcare, Seoul, Republic of Korea)) was selected at the discretion of the attending anesthesiologist. The 0.9% normal saline (i.e., chloride-liberal fluid) contained sodium chloride (9 g/L), providing 154 mEq/L sodium and 154 mEq/L chloride. Plasma Solution-A (i.e., chloride-restrictive fluid) contained magnesium chloride (0.3 g/L), potassium chloride (0.37 g/L), sodium acetate (3.68 g/L), sodium chloride (5.26 g/L), and sodium gluconate (5.02 g/L), providing 140 mEq/L sodium, 98 mEq/L chloride, 5 mEq/L potassium, and 3 mEq/L magnesium. Baseline isotonic crystalloid infusion was based on the estimated fluid maintenance requirements, calculated from the patient's weight and anticipated tissue trauma [21]. Additional fluid boluses were administered to reach a target CVP of 10–15 mmHg or hydration volume of 50–100 mL/kg, to ensure sufficient flow for the maintenance of adequate kidney graft perfusion and to replace the urine output after graft reperfusion [22].
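As a rough illustration of why the fluid choice matters, the chloride delivered by a given mix of these two crystalloids can be estimated from the concentrations above. This is a hypothetical sketch only; the study itself could not recover the infused chloride amounts from patient records and analyzed serum chloride concentrations instead.

```python
# Chloride concentrations (mEq/L) of the two study fluids, taken from the text.
CHLORIDE_MEQ_PER_L = {
    "0.9% normal saline": 154,   # chloride-liberal
    "Plasma Solution-A": 98,     # chloride-restrictive
}

def chloride_load_meq(volumes_ml):
    """Estimate the total chloride load (mEq) from a dict of fluid -> mL infused.

    Illustrative only: actual intraoperative chloride loads were not recorded
    in this study.
    """
    return sum(CHLORIDE_MEQ_PER_L[fluid] * ml / 1000.0
               for fluid, ml in volumes_ml.items())
```

For example, 1 L of 0.9% normal saline delivers 154 mEq of chloride, whereas 1 L of Plasma Solution-A delivers 98 mEq, so equal total volumes can carry substantially different chloride loads.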

In the immunosuppressive regimen (Table 1), induction was based on an interleukin-2 receptor antagonist (i.e., basiliximab) and T lymphocyte-depleting rabbit-derived anti-thymocyte globulin (i.e., thymoglobulin), and maintenance therapy was based on a calcineurin inhibitor (i.e., tacrolimus), mycophenolate mofetil, and steroids. Steroid pulse therapy and/or thymoglobulin rescue therapy were applied in cases of graft rejection.


**Table 1.** Immunosuppressive regimen in living donor kidney transplantation.

Abbreviations: POD, postoperative day; MMF, mycophenolate mofetil; MYF, mycophenolate sodium; IV, intravenous; p.o., per os (orally); q.d., quaque die (once a day); bid, bis in die (twice a day); qid, quater in die (four times a day).

#### *2.4. Measurement of Serum Chloride Levels*

The baseline chloride level was estimated between the day after dialysis and the preoperative period, and serial chloride levels were measured 1 day before surgery, immediately after surgery, and on postoperative days (PODs) 1 and 2. The chloride level was analyzed by indirect potentiometry (Clinical Analyzer 7600; Hitachi, Tokyo, Japan).

The patients were divided according to the presence or absence of hyperchloremia (defined as ≥110 mEq/L [6,23]) immediately after surgery into the hyperchloremia group and the non-hyperchloremia group, respectively.
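The grouping rule above amounts to a simple threshold test on the immediate postoperative chloride value (a minimal sketch; the function and constant names are illustrative):

```python
HYPERCHLOREMIA_THRESHOLD_MEQ_L = 110  # definition used in the study, per refs [6,23]

def chloremia_group(postop_chloride_meq_l):
    """Assign a patient to a study group from the chloride level (mEq/L)
    measured immediately after surgery."""
    if postop_chloride_meq_l >= HYPERCHLOREMIA_THRESHOLD_MEQ_L:
        return "hyperchloremia"
    return "non-hyperchloremia"
```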

#### *2.5. Definition of Poor Early Recovery of Kidney Graft Function*

Kidney graft function was quantified based on the estimated glomerular filtration rate (eGFR), calculated using the Modification of Diet in Renal Disease formula: eGFR = 175 × (standardized serum creatinine)<sup>−1.154</sup> × (age)<sup>−0.203</sup> × 1.212 (if black) × 0.742 (if female) [24]. The baseline eGFR was estimated between the day after dialysis and the preoperative period, and serial eGFRs were measured immediately after surgery and on PODs 1 and 2. Based on the eGFR, the degree of graft function was classified as chronic kidney disease (CKD) stage I (i.e., normal kidney function, eGFR ≥ 90 mL/min/1.73 m<sup>2</sup>), stage II (i.e., mild loss of kidney function, eGFR 60–89 mL/min/1.73 m<sup>2</sup>), stage IIIa (i.e., mild to moderate loss of kidney function, eGFR 45–59 mL/min/1.73 m<sup>2</sup>), stage IIIb (i.e., moderate to severe loss of kidney function, eGFR 30–44 mL/min/1.73 m<sup>2</sup>), stage IV (i.e., severe loss of kidney function, eGFR 15–29 mL/min/1.73 m<sup>2</sup>), or stage V (i.e., kidney failure, eGFR < 15 mL/min/1.73 m<sup>2</sup>) [25].

In the present study, poor early recovery of kidney graft function, defined as eGFR < 60 mL/min/1.73 m<sup>2</sup> during the first 48 hours after surgery [26], was the primary outcome.
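A minimal sketch of the MDRD calculation and the eGFR-based staging described above (function names are illustrative; creatinine is in mg/dL, eGFR in mL/min/1.73 m<sup>2</sup>):

```python
def mdrd_egfr(creatinine_mg_dl, age_years, black=False, female=False):
    """4-variable MDRD study equation (IDMS-standardized creatinine) [24]."""
    egfr = 175.0 * creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if black:
        egfr *= 1.212
    if female:
        egfr *= 0.742
    return egfr

def ckd_stage(egfr):
    """Map an eGFR value to the CKD stage labels used in the text [25]."""
    if egfr >= 90:
        return "I"
    if egfr >= 60:
        return "II"
    if egfr >= 45:
        return "IIIa"
    if egfr >= 30:
        return "IIIb"
    if egfr >= 15:
        return "IV"
    return "V"

def poor_early_recovery(egfr):
    """Primary outcome: eGFR < 60 mL/min/1.73 m^2 within 48 h after surgery [26]."""
    return egfr < 60
```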

#### *2.6. Clinical Variables*

Preoperative and intraoperative recipient and donor graft factors in the non-hyperchloremia and hyperchloremia groups were assessed by PS matching analysis. Preoperative recipient factors included age, sex, body mass index (BMI), comorbidities (i.e., diabetes mellitus (DM) and hypertension), dialysis history and duration, and laboratory variables (i.e., white blood cell count, platelet count, and hemoglobin, sodium, chloride, potassium, albumin, creatinine, and glucose concentrations). Intraoperative factors included the operation time, averages of vital signs (i.e., systolic and diastolic blood pressures, heart rate, CVP, and body temperature), and total amounts of crystalloid infusion, urine output, and blood loss. Donor graft factors included age, sex, BMI, graft weight, total graft ischemic time, and human leukocyte antigen level.

Postoperative clinical factors included RRT requirement, biopsy-proven graft rejection [27], and patient mortality during the follow-up period.

#### *2.7. Statistical Analysis*

The normality of continuous data was assessed using the Shapiro-Wilk test. Continuous data are expressed as medians and interquartile ranges, and categorical data are expressed as numbers and proportions. PS matching analysis was applied to reduce the impact of potential confounding factors on intergroup differences based on hyperchloremia [28,29]. PSs were derived to match patients at a 1:1 ratio using greedy matching algorithms without replacement. Perioperative recipient and donor graft factors were compared using the Mann–Whitney *U* test and χ<sup>2</sup> test or Fisher's exact test, as appropriate. Wilcoxon's signed-rank sum test and McNemar's test were used for analysis of the pair-matched data. Postoperative changes in the proportions of patients with kidney graft function classified as eGFR ≥ 60 vs. 30–59 vs. < 30 mL/min/1.73 m<sup>2</sup> were analyzed using Cochran's *Q* test with the McNemar post hoc test. The association of hyperchloremia with poor early recovery of kidney graft function was evaluated by multivariable logistic regression analysis with PS adjustment. The values are presented as odds ratios with 95% confidence intervals. All tests were two sided, and *p* < 0.05 was taken to indicate statistical significance. All statistical analyses were performed using R software version 2.10.1 (R Foundation for Statistical Computing, Vienna, Austria) and SPSS for Windows (ver. 24.0; SPSS Inc., Chicago, IL, USA).
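Greedy 1:1 nearest-neighbor matching on propensity scores without replacement, as described above, can be sketched as follows. This is a simplified illustration, not the authors' actual implementation; the R/SPSS routines used likely differ in detail (e.g., processing order and tie handling).

```python
def greedy_ps_match(ps_treated, ps_control, caliper=None):
    """Greedy 1:1 nearest-neighbor propensity score matching, no replacement.

    Each treated subject (processed in score order) takes the closest
    still-unmatched control; with a caliper, pairs farther apart than the
    caliper are discarded. Returns (treated_index, control_index) pairs.
    """
    available = list(enumerate(ps_control))
    pairs = []
    for t_idx, t_ps in sorted(enumerate(ps_treated), key=lambda x: x[1]):
        if not available:
            break
        # Closest remaining control by absolute score distance.
        c_idx, c_ps = min(available, key=lambda c: abs(c[1] - t_ps))
        if caliper is None or abs(c_ps - t_ps) <= caliper:
            pairs.append((t_idx, c_idx))
            available.remove((c_idx, c_ps))  # matched without replacement
    return pairs
```

Because matching is without replacement, each control is used at most once, which is why the 301 enrolled patients reduce to 200 matched patients in the final analysis.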

#### **3. Results**

#### *3.1. Demographic Characteristics of Patients Undergoing LDKT*

The total study population of 301 patients comprised 188 (62.5%) males and 113 (37.5%) females with an average age of 49 ± 11 years and average BMI of 23.1 ± 3.5 kg/m<sup>2</sup>. The incidences of DM and hypertension were 28.6% (*n* = 86) and 58.1% (*n* = 175), respectively. Dialysis was performed in 220 (73.1%) patients for an average duration of 30 ± 53 months. The average eGFR was 7.6 ± 3.5 mL/min/1.73 m<sup>2</sup>. A total of 295 (98.0%) patients had CKD stage V (i.e., eGFR < 15 mL/min/1.73 m<sup>2</sup>), five (1.7%) patients had CKD stage IV (i.e., eGFR 15–29 mL/min/1.73 m<sup>2</sup>), and one (0.3%) patient had CKD stage IIIb (i.e., eGFR 30–44 mL/min/1.73 m<sup>2</sup>).

#### *3.2. Comparison of Perioperative Factors before and after PS Matching*

Before PS matching, there were significant differences between groups in preoperative findings (i.e., dialysis duration, sodium and glucose levels), intraoperative findings (i.e., average diastolic blood pressure and total amount of hemorrhage), and donor graft parameters (i.e., total graft ischemic time; Table 2). After PS matching, there were no significant differences in perioperative recipient or donor graft parameters between groups.


**Table 2.** Comparison of perioperative recipient and donor graft factors before and after propensity score matching analysis.



Abbreviations: WBC, white blood cell; SBP, systolic blood pressure; DBP, diastolic blood pressure; CVP, central venous pressure; PRA, panel reactive antibody; DSA, donor-specific antibody; FCXM, flow cytometric cross-match. Note: Values are expressed as the median (interquartile range) and number (proportion).

#### *3.3. Comparison of Main Crystalloid Fluid Infusion during Surgery and Electrolyte Values Immediately after Surgery in PS-Matched Patients*

The total amount of crystalloid fluid infused during surgery did not differ between groups. However, the proportions of main crystalloid fluid type used (i.e., 0.9% normal saline vs. Plasma Solution-A) differed between the groups (Table 3). With regard to electrolyte values immediately after surgery, serum sodium and potassium levels were similar in the two groups, but the serum chloride level was higher and the change in chloride level was greater in the hyperchloremia group.

**Table 3.** Comparison of main crystalloid fluid during surgery and electrolyte values immediately after surgery between propensity score-matched non-hyperchloremia and hyperchloremia groups.


‡ Δ[Cl−] is defined as the difference between the serum chloride level on the preoperative day and that measured immediately after surgery. Note: Values are expressed as the median and interquartile range.

#### *3.4. Serial Changes in eGFR until POD 2 in PS-Matched Patients*

The eGFR, as an indicator of kidney graft function, increased gradually until POD 2 in both groups (Table 4). However, the proportion of patients with eGFR > 60 mL/min/1.73 m<sup>2</sup> increased significantly from immediately after surgery to PODs 1 and 2 in the non-hyperchloremia group (Table 5). On POD 2, the proportion of patients with eGFR > 60 mL/min/1.73 m<sup>2</sup> was larger and the proportion of patients with eGFR < 30 mL/min/1.73 m<sup>2</sup> was smaller in the non-hyperchloremia group than in the hyperchloremia group.

**Table 4.** Serial changes in estimated glomerular filtration rate during postoperative day 2 between propensity score-matched non-hyperchloremia and hyperchloremia groups.


Abbreviation: eGFR, estimated glomerular filtration rate; †††*p* < 0.001 compared to the level immediately after surgery in each group; §§§*p* < 0.001 compared to the level on postoperative day 1 in each group. Note: Values are expressed as the median and interquartile range.


**Table 5.** Comparison of kidney graft function according to estimated glomerular filtration rate during postoperative day 2 between propensity score-matched non-hyperchloremia and hyperchloremia groups.

Abbreviation: eGFR, estimated glomerular filtration rate; †††*p* < 0.001 compared to the level immediately after surgery in each group, §§*p* < 0.01 compared to the level on postoperative day 1 in each group, §§§*p* < 0.001 compared to the level on postoperative day 1 in each group. Note: Values are expressed as number and proportion.

#### *3.5. Association of Hyperchloremia with Kidney Graft Function (i.e., eGFR < 60 mL/min/1.73 m<sup>2</sup>) on POD 2*

Hyperchloremia was associated with poor graft recovery on POD 2 in the whole study population and in PS-matched patients (Table 6). After PS adjustment, hyperchloremia remained an independent factor related to poor graft recovery.

**Table 6.** Association of hyperchloremia with poor early graft function (eGFR < 60 ml/min/1.73 m2) on postoperative day 2 in living donor kidney transplantation.


Abbreviation: CI, confidence interval; PS, propensity score.

#### *3.6. Postoperative Clinical Outcomes in PS-Matched Patients*

In the hyperchloremia group, five (5.0%) patients required RRT due to poor graft function, five (5.0%) patients suffered graft rejection, and two (2.0%) patients died. In the non-hyperchloremia group, two (2.0%) patients required RRT, one (1.0%) patient suffered graft rejection, and one (1.0%) patient died. These postoperative outcomes did not differ significantly between groups.

#### **4. Discussion**

The main finding of our study was that hyperchloremia was associated with poor early recovery of graft function after LDKT in an analysis adjusted for clinical factors related to kidney graft function by PS matching. Patients without hyperchloremia showed appropriate graft function recovery during POD 2, as indicated by the increase in the proportion of patients with eGFR > 60 mL/min/1.73 m<sup>2</sup> and the decrease in the proportion of patients with eGFR < 30 mL/min/1.73 m<sup>2</sup>. The proportion of patients with adequate graft function (i.e., eGFR > 60 mL/min/1.73 m<sup>2</sup>) on POD 2 was larger in the non-hyperchloremia group than in the hyperchloremia group.

Many studies have examined the impact of hyperchloremia on AKI in critically ill patients, but debate persists regarding the relation between hyperchloremia and the occurrence of AKI [5,6,23,30,31]. A study of severely septic patients in the ICU showed that hyperchloremia was common in these patients due to aggressive fluid resuscitation, but that neither hyperchloremia nor an increase in the serum chloride level was associated with an increased risk of AKI development within the first three days of ICU admission [23]. In patients undergoing craniotomy for intracranial hemorrhage, hyperchloremia due to infusion of 0.9% NaCl solution was related to a decrease in the water shift across the blood–brain barrier, leading to metabolic acidosis, but did not directly cause AKI [30,32]. However, in another study of patients undergoing craniotomy for primary brain tumor resection, the occurrence of hyperchloremia within 3 PODs appeared to impair early kidney function [33]. In a study of patients with subarachnoid hemorrhage in the neurocritical care unit, those who developed AKI had a higher average serum chloride level than those without AKI, despite similar chloride loading [5]. Suetrong et al. [6] suggested that the maximum serum chloride level (i.e., ≥110 mmol/L) and an increase of ≥5 mmol/L within 48 hours after ICU admission due to severe sepsis or septic shock were the predominant factors associated with the development of AKI. Another study, conducted by Marouli et al. [12], suggested that a high intraoperative chloride load (i.e., >500 mEq) played a significant role in the development of AKI within 48 hours after major abdominal surgery. In a prospective ICU study, a lesser intravenous chloride supply was associated with a significant reduction in the worst stages of AKI and in the requirement for RRT. With regard to fluid management, the infusion of a chloride-restrictive fluid (i.e., Plasma-Lyte 148), rather than a chloride-liberal fluid (i.e., 0.9% normal saline), may effectively attenuate the increase in serum creatinine level from baseline to peak during the ICU stay [34].

Patients with chronic kidney dysfunction and those undergoing RRT have been routinely excluded from many studies of AKI because the initial kidney status is one of the most critical factors contributing to the development of AKI after surgery or ICU admission [35]. In contrast, our study included patients with end-stage renal disease, most of whom were receiving dialysis, with kidney grafts rendered vulnerable by ischemic injury [1]. The impact of hyperchloremia on early graft function recovery has not been fully investigated in such LDKT settings. As various preoperative and intraoperative risk factors may be related to postoperative graft function, these clinical risk factors were matched between patients with and without hyperchloremia using a PS-based method to reduce selection bias [29,36]. Therefore, the results of the present study suggest that hyperchloremia is an independent risk factor for inadequate graft recovery after LDKT. The intraoperative infusion of a large amount of 0.9% normal saline may be related to an increase in the chloride load and consequent effects on kidney grafts because 0.9% saline contains approximately 50% more chloride than serum (154 vs. 100 mEq/L) [37]. Postoperatively, however, the sodium and potassium levels did not differ between our study groups, suggesting that they are not related to an increased risk of AKI development, consistent with previous findings [5,6]. In animal experiments [38,39], a potential explanation for chloride-induced kidney injury was suggested to be the dysregulation of tubuloglomerular feedback caused by chloride reaching the macula densa, which caused renal afferent arteriole vasoconstriction related to decreased renal cortical tissue perfusion and the development of tissue ischemia. A second explanation was proposed to involve renal interstitial edema related to fluid overload resulting in intracapsular hypertension or vasomotor nephropathy.
Although the specific pathophysiology of the relationship between hyperchloremia and graft dysfunction in patients undergoing LDKT remains unclear, exposure of the kidney graft to high serum chloride levels may have a negative effect on functional recovery and prolong the recovery period.

This study had some limitations. First, although confounding factors were adjusted between patients with and without hyperchloremia by PS matching, hidden biases attributable to unknown factors could not be completely excluded. Second, we were not able to determine the amounts of chloride infused from the patients' records, so the analysis was based on serum chloride concentrations. Third, our observations did not elucidate the pathophysiology underlying the relationship between hyperchloremia and kidney graft function recovery. Fourth, to preserve vascular access for possible RRT in case of graft failure (e.g., arteriovenous fistula creation [40]), we did not routinely perform arterial procedures such as ABGA; therefore, we were unable to determine the effects of metabolic acidosis on early graft recovery. Finally, the power to identify associations of hyperchloremia with the requirement for RRT, rejection, and mortality was limited because the sample was small.

#### **5. Conclusions**

Exposure to hyperchloremia may have a negative effect on the early recovery of kidney grafts injured by ischemia in LDKT. The infusion of large amounts of chloride-rich fluid seems to be a major factor contributing to increased serum chloride levels, and this chloride load may play a role in prolonging kidney graft recovery in the early postoperative period. Previous studies have shown that chloride loads cause renal vasoconstriction and interstitial edema, thereby decreasing renal blood flow and perfusion, and consequently reducing the GFR and urine output [41,42]. However, the potential pathophysiological relationships in the KT-specific setting have not been clarified. Therefore, further studies are required to determine the association between the chloride load and transplanted kidney graft functional recovery.

**Author Contributions:** Conceptualization, M.S.C.; methodology, M.S.C.; software, M.S.C.; validation, M.S.C.; formal analysis, J.G., S.-C.P., S.-S.Y., J.P., J.-W.S., H.M.L., Y.-S.K., Y.E.M., S.H.H., and M.S.C.; investigation, J.G., S.P., S.-S.Y., J.P., J.-W.S., H.M.L., Y.-S.K., Y.E.M., S.H.H., and M.S.C.; resources, J.G., J.K., and M.S.C.; data curation, J.G. and J.K.; writing—original draft preparation, J.G. and M.S.C.; writing—review and editing, M.S.C.; visualization, M.S.C.; supervision, M.S.C.

**Conflicts of Interest:** No author has any conflict of interest regarding the publication of this article.

#### **Abbreviations**


#### **References**


© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

### *Article* **Phenotypic and Genotypic Characterization of** *Escherichia coli* **Causing Urinary Tract Infections in Kidney-Transplanted Patients**

**Jonas Abo Basha 1, Matthias Kiel 2, Dennis Görlich 3, Katharina Schütte-Nütgen 1, Anika Witten 4, Hermann Pavenstädt 1, Barbara C. Kahl 5, Ulrich Dobrindt 2,**†**, \* and Stefan Reuter 1,**†**, \***


Received: 3 June 2019; Accepted: 5 July 2019; Published: 7 July 2019

**Abstract:** Urinary tract infection (UTI), frequently caused by uropathogenic *Escherichia coli* (UPEC), is the most common infection after kidney transplantation (KTx). Untreated, it can lead to urosepsis and impairment of graft function. We questioned whether the UPEC isolated from KTx patients differed from the UPEC of non-KTx patients. Therefore, we determined the genome sequences of 182 UPEC isolates from KTx and control patients in a large German university clinic and pheno- and genotypically compared these two isolate groups. Resistance to β-lactams, trimethoprim, or trimethoprim/sulfamethoxazole was significantly higher among UPEC from KTx than from control patients, whereas both isolate groups were highly susceptible to fosfomycin. Accordingly, the gene content conferring resistance to β-lactams or trimethoprim, but also to aminoglycosides, was significantly higher in KTx than in control UPEC isolates. *E. coli* isolates from KTx patients more frequently presented with uncommon UPEC phylogroups expressing higher numbers of plasmid replicons but, interestingly, fewer UPEC virulence-associated genes than the control group. We conclude that there is no defining subset of virulence traits for UPEC from KTx patients. The clinical history and immunocompromised status of KTx patients enable *E. coli* strains with low uropathogenic potential, but with increased antibiotic resistance, to cause UTIs.

**Keywords:** Uropathogenic E. coli; UPEC; phylogeny; genomics; antibiotic resistance; virulence traits; kidney transplantation

#### **1. Introduction**

Kidney transplantation (KTx) is the best treatment for patients with end-stage renal disease (ESRD). Despite routine screening, treatment and prophylaxis, the prevalence of urinary tract infections (UTIs) in KTx patients is high; more than every third patient is affected [1]. While lower UTI/cystitis usually does not limit graft prognosis, pyelonephritis and urosepsis do, not only in the acute setting but also in the long run, impairing graft function and prognosis [2,3]. Older age, duration of catheterization, acute rejection episodes and deceased-donor KTx are known risk factors for UTI after KTx [1]. In addition, female patients are at higher risk for UTIs. Recently, we identified a male donor kidney as a new risk factor [4]. Further potential predisposing host factors include, e.g., excessive immunosuppression, diabetes mellitus, and instrumentation of the urinary tract [5]. Uropathogenic *Escherichia coli* (UPEC) are the most common clinical isolates in UTI patients after KTx [6]. UPEC encode virulence factors including adhesins, toxins, capsules, and factors involved in biofilm formation [7]. Resistance to certain antibiotics is common, as patients are frequently treated with antimicrobial compounds, e.g., for the prophylaxis of *Pneumocystis* pneumonia or UTIs, or in perioperative/periprocedural settings.

As clinical variables and features of patients and UTIs are significantly different from non-transplanted patients, we hypothesized that UPEC isolates from both groups are also different. Therefore, we sought to characterize phylogeny, virulence traits, and antibiotic susceptibility of UPEC from KTx patients and non-transplanted patients to ultimately assist transplant physicians in the management of UTI in KTx patients.

#### **2. Experimental Section**

Detailed methods are provided in the Supplemental Information.

#### *2.1. Study Population*

We prospectively collected and analyzed 182 UPEC isolates from 167 patients who were diagnosed with a UPEC-associated UTI between July and December 2016 at the Department of Nephrology or the emergency room of the University Clinic of Muenster. We included 62 KTx patients (71 isolates) and 105 non-transplanted controls (111 isolates). Urine samples were taken in case of clinically relevant UTI symptoms or pyuria and included if ≥10<sup>5</sup> colony-forming units (CFU)/mL urine were detected. Additional blood or respiratory tract samples were collected if required. More than one isolate per patient was included if genotypically different strains were identified from the samples.

#### *2.2. Patients' Characteristics*

Patients' and donors' characteristics were collected from the patients' files. All non-KTx patients treated at the Department of Nephrology or the emergency room during the study period served as controls. The study was approved by the local ethics committee (Ethikkommission der Ärztekammer Westfalen-Lippe und der Medizinischen Fakultät der Westfälischen Wilhelms-Universität, No. 2014-381-f-N).

#### *2.3. Bacterial Strains and Culture Conditions*

We analyzed *E. coli* samples from urine (*n* = 164), blood (*n* = 8), and respiratory tract (*n* = 10).

#### *2.4. Phenotypic Tests*

Hemolytic capacity, bacteriocin production, expression of type 1 fimbriae and biofilm formation were analyzed.

#### *2.5. Genome Sequencing*

Genome sequencing was carried out as described in the Supplemental Information.

#### *2.6. Draft Genome Comparison and Typing*

Multi-Locus Sequence Typing (MLST) was conducted on all UPEC isolates and UPEC virulence-associated genes were detected.

#### *2.7. Statistical Analysis*

The data were statistically analyzed using IBM SPSS Statistics 24 for Windows (IBM Corporation, Somers, NY, USA). All baseline variables were described using standard univariate analysis, while bivariate analyses, such as Fisher's exact test and the t-test, were performed to compare the results of the two groups. The *p*-values were interpreted as exploratory, not confirmatory. Results with *p*-values ≤ 0.05 were considered statistically significant.

#### **3. Results**

#### *3.1. Patients' Characteristics*

The mean age at the time of infection was comparable between the KTx recipients (79% females) and the control group (70.5% females) (56.7 ± 15.1 vs. 53.45 ± 21.82 years, respectively). The KTx recipients had a higher BMI (26.19 ± 4.58 vs. 24.62 ± 4.41 kg/m<sup>2</sup>, *p* = 0.0251) and showed a higher frequency of hypertension than control patients (80.6% vs. 35.3%), while the frequency of diabetes did not differ between the groups (Table 1). Causes of ESRD are given in Supplemental Table S1. The glomerular filtration rate at the time of UTI was 58.4 ± 32.48 mL/min in KTx patients and 70.21 ± 35.67 mL/min in the control group. Of note, kidney function in KTx patients is based on one working kidney (the transplant) only. More than every fourth recipient and more than every fifth control patient experienced acute renal injury at the time of infection. Lower UTI was observed in ~80% of both KTx patients and controls. In total, 24.5% of KTx patients and 39.3% of controls had been hospitalized for UTI (Table 1). Interestingly, within three months before the UTI diagnosis, KTx patients differed significantly from control patients by being more often hospitalized, more often having a ureteral stent or urinary catheter, and by antibiotic treatment with cephalosporins, TMP/SMX and fosfomycin, mainly for treatment of UTIs and for prophylaxis, e.g., of UTIs and/or *Pneumocystis jirovecii* infection (Table 1).


**Table 1.** Clinical data.


**Table 1.** *Cont.*

σ = standard deviation, BMI = body mass index, eGFR = estimated glomerular filtration rate, KTx = kidney transplantation; *p*-values were obtained using the t-test.

#### *3.2. Epidemiological Classification of Urine Isolates from KTx and Control Patients*

The 182 *E. coli* urine isolates were allocated to phylogenetic lineages and to sequence types (STs) as published by Clermont et al. [8] (Figure 1). 60.5% of all KTx isolates were allocated to phylogroups B2 and A, to which the majority of UPEC primarily belong (Table 2).


**Table 2.** Distribution of phylogroups in relation to KTx and control UPEC isolates.

No. = number, KTx = strains of kidney transplanted patients, *p*-values were obtained using Fisher's exact test.

In total, 78 different STs were identified. The top-ranked STs, representing 93 isolates (51.1% of all isolates), are shown in Supplementary Table S2. Isolates from the KTx and control groups could not be distinguished based on STs. ST10 and ST69 were the most prevalent STs in KTx isolates, whereas ST73 and ST131 were most frequently detected in the control group (Supplementary Table S2). Isolates from KTx patients with a BMI > 25 kg/m<sup>2</sup> showed higher numbers of ST10 and ST131 and a lower number of ST23 compared to control patients' isolates. Age did not affect the ST distribution. The prevalence of ST127 and ST73 isolates was markedly higher in the control group, supporting our finding that phylogroup B2 strains are overrepresented in this group.

The O serogroups were highly variable among the urine isolates. The O antigen could be typed for 161 isolates, leading to the identification of 46 different O antigen types; Table 3 shows the most frequently predicted O serogroups, covering 98 isolates (53.8% of all isolates). O8 and O89 (9.9% each) were most prevalent in KTx isolates, whereas O6 (15.2%) and O8 (9.8%) were the most prevalent O serotypes in the control group. Typical UPEC O serotypes, i.e., O6, were less prevalent among the KTx isolates. In contrast, KTx isolates more frequently belonged to the O89 serotype than strains of the control group.


**Table 3.** Distribution of O serotypes in relation to KTx and control UPEC isolates.

No. = number, KTx = strains of kidney transplanted patients, nd = not determined; *p*-values were obtained using Fisher's exact test.

**Figure 1.** Phylogenetic diversity of the UPEC isolated from KTx and control patients. The neighbor-joining tree is based on the MLST of seven housekeeping genes performed with Ridom SeqSphere+ (https://www.ridom.de/seqsphere/) and was created with iTOL (https://itol.embl.de/). The sequence type (ST), the corresponding phylogroup and the isolate group are indicated for each UPEC isolate. The allocation to the two isolate groups (KTx or control) is indicated by different colors.

#### *3.3. Prevalence of UPEC Virulence-Associated Genes*

The binary matrix with color gradation of the presence/absence ratios of the virulence factors (VFs) and their functional categories is shown in Figure 2. Of the 1154 VFs tested, 889 were found in at least one isolate. In general, both isolate groups exhibited a similar VF pattern. The overall VF content was slightly higher in the control than in the KTx patient isolates (Figure 2). A total of 113 VFs showed a tendency to be overrepresented in isolates of either the KTx or the control group. Only three VFs (FimB, FocB, SfaB), representing type 1 or S/F1C fimbriae, remained significantly associated with control group isolates after Bonferroni correction (Supplementary Table S3; Supplementary Figure S2). VFs with a tendency towards overrepresentation in KTx patient isolates included bacteriocins, adhesins (e.g., Ybg, Yfc, Yhc), an ABC transporter (ets), the group 4 capsule (GfcABCDE-Etp), type 3 secretion system components, and three autotransporters. VFs enriched in the isolates of control UTI patients comprised VFs described for UPEC and phylogroup B2 isolates, including S-/F1C fimbriae, other chaperone-usher family adhesins, the group II capsule, the polyketide colibactin and the CjrABC-SenB gene products. The VF binary matrix was used to examine the grouping of isolates according to VF presence/absence, phylogeny and isolate group by principal coordinates analysis (PCoA). We could not observe a correlation between the source of isolation (KTx vs. control group) and the VF content or phylogroup. Nevertheless, phylogroup B2 strains clustered separately from isolates of other phylogroups.
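The per-VF screen described above (Fisher's exact test on presence/absence counts in the two isolate groups, followed by a Bonferroni correction over all tested VFs) can be sketched as follows. This is a minimal illustration, not the authors' analysis code; the example counts and the number of tests are hypothetical placeholders.

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]
    (e.g., VF present/absent in KTx vs. control isolates). Sums the
    hypergeometric probabilities of all tables no more probable than
    the observed one (the usual two-sided definition)."""
    n, r1, c1 = a + b + c + d, a + b, a + c
    def p_table(x):  # probability of a table with x in the top-left cell
        return comb(r1, x) * comb(n - r1, c1 - x) / comb(n, c1)
    p_obs = p_table(a)
    lo, hi = max(0, c1 - (n - r1)), min(r1, c1)
    return sum(p for p in (p_table(x) for x in range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-9))

def bonferroni(p, n_tests):
    """Family-wise correction: scale the raw p-value by the number of VFs tested."""
    return min(1.0, p * n_tests)

# Hypothetical example: a VF present in 10/60 KTx vs. 40/122 control isolates,
# screened alongside the other VFs (1154 tests in total).
p_raw = fisher_exact_two_sided(10, 50, 40, 82)
p_adj = bonferroni(p_raw, 1154)
```

The stringency of the Bonferroni step explains why only three of the 113 candidate VFs remained significant: a raw *p*-value must fall below roughly 0.05/1154 to survive the correction.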

#### *3.4. Antibiotic Susceptibility*

Resistance genes (RGs) were frequently detected in the isolates, with the exception of determinants for fosfomycin (0.5% of all isolates) and fluoroquinolone (5% of all isolates) resistance. UPEC isolates of KTx patients carried more RGs than control UPEC strains (Table 4, Supplementary Table S4). Higher prevalences in KTx patient isolates were observed for trimethoprim, β-lactam, and aminoglycoside resistance determinants. Some resistance genes with high prevalence in the KTx group, such as *bla*TEM-1B, *sul*2, *strA* and *strB*, were less frequently observed in control strains (Supplementary Table S4). Clinical susceptibility tests showed that UPEC strains of the KTx and control groups differed significantly in their resistance phenotypes (Table 5).


**Table 4.** Distribution of resistance genes (RGs) present in relation to KTx and control UPEC isolates.

No. = number, KTx = strains of kidney transplanted patients, RG = resistance gene, nd = not determined, *p*-values were obtained using Fisher's exact test.

**Figure 2.** Heatmap displaying the distribution of virulence factors present in KTx and control UPEC isolates. Each row represents one virulence factor (VF). The color code indicates the prevalence of the corresponding VF (calculated as a percentage) in the KTx and control group, respectively. Vertical black and grey bars indicate the functional VF group: CU-fimbriae = chaperone-usher fimbriae, T2SS = type 2 secretion system, T3SS = type 3 secretion system, T5SS = type 5 secretion system, T6SS = type 6 secretion system.


**Table 5.** Susceptibility patterns of KTx and control UPEC isolates.

No. = number, KTx = strains of kidney transplanted patients, AMP = ampicillin, AMX = amoxicillin, SAM = ampicillin/sulbactam, CFX = cefuroxime, TMP = trimethoprim, SMX = sulfamethoxazole, CIP = ciprofloxacin, LVX = levofloxacin, GEN = gentamicin; *p*-values were obtained using Fisher's exact test.

#### *3.5. Plasmid Types*

Virulence and resistance determinants are frequently located on plasmids. We identified 25 different plasmid replicon types in 149 isolates. UPEC isolates of the control group were more often plasmid-free than isolates from the KTx group. IncFIB, IncFII, Col156 and IncI1 were the most frequently identified replicon types among all strains. IncFIB, IncFII, IncFIC, IncI1 and IncQ1 replicons were more prevalent in the KTx than in the control group, whereas the Col156 replicon was more frequently found in control isolates. The seven most prevalent plasmid types are summarized in Table 6.

**Table 6.** Distribution of the most frequent plasmid replicon types in relation to KTx and control UPEC isolates.


No. = number, KTx = strains of kidney transplanted patients, *p*-values were obtained using Fisher's exact test.

#### *3.6. Phenotypic Assays*

Almost all analyzed UPEC isolates (97.8%) carried the type 1 fimbrial adhesin gene *fimH*, and 156 strains (85.7%) functionally expressed type 1 fimbriae (KTx: 84.5% vs. controls: 86.5%). α-Hemolysin was produced by 18.1% of isolates, without any significant difference between the groups. UPEC from KTx patients (63.4%) more often expressed bacteriocins killing *E. coli* DH5α than control UPEC (49.5%) (Figure 3). Cellulose and curli expression are important for biofilm formation of *E. coli* (Figure 4, Supplementary Tables S5 and S6). Expression of the corresponding rdar/ras morphotype was more prevalent at lower growth temperatures and decreased with increasing temperature. Rdar/ras morphotype expression was more frequently observed in the UPEC of KTx patients. Expression of the "saw" morphotype increased with incubation temperature and occurred more often in the control group. The rare mucoid morphotype appeared more often in the KTx isolates.

**Figure 3.** Phenotypic characteristics of the KTx and control UPEC isolates. The percentage of strains testing phenotypically positive for the expression of α-hemolysin, type 1 fimbriae and bacteriocins killing *E. coli* DH5α is compared between the KTx and control groups.

**Figure 4.** Expression of biofilm morphotypes in KTx and control UPEC isolates. The percentage of strains expressing different biofilm morphotypes at different growth temperatures is compared between the KTx and control groups.

#### **4. Discussion**

UTI results from complex pathogen–host interactions and is affected by host risk factors, bacterial traits and the host immune response [9]. Risk factors for uncomplicated and complicated UTI in adults have been defined before [10]. Genetic factors involved in host susceptibility to acute pyelonephritis have been described including polymorphisms that reduce expression of the interferon regulatory factor 3 (IRF3) or CXCR1 coding for the IL-8 receptor [11].

Our main objectives were to evaluate whether KTx-related factors, such as antibiotic therapy, immunosuppression or host characteristics, have an impact on UPEC, the major pathogen causing UTI after KTx [1]. Herein, we present for the first time the genotypic and phenotypic characteristics of UPEC isolated from KTx patients.

Prior use of antibiotics increases the risk of UTI due to (multi-)resistant uropathogens, as frequent antibiotic prescription leads to the selection of more resistant bacteria; this is also highlighted by our data showing increased antibiotic resistance in the KTx group isolates (Tables 4 and 5). Antibiotic susceptibility rates are significantly lower in hospital- than in community-acquired UTIs [12]. Immunocompromised patients often display co-morbidities and are more often hospitalized than immunocompetent patients; accordingly, the former are more likely to develop UTI caused by multi-drug resistant pathogens [13]. In line with this, comparing the KTx patients with controls three months prior to the UTI diagnosis, there were significant differences in clinical parameters such as hospitalization rate or application of devices (ureteral stents, urinary catheters), all of which seem to facilitate UTIs in KTx patients. In the transplant setting, invasive procedures such as instrumentation of the urinary tract, as well as antimicrobial prophylaxis against opportunistic microorganisms and treatment of infections, are common and more frequently performed than in controls (Table 1). These interventions affect the microbiome composition, fostering urinary dysbiosis in KTx patients [14]. *E. coli* was found seven times more often in KTx recipients than in controls. Rani and colleagues concluded that UTIs follow the disruption of local microbial homeostasis and arise from bacteria that are part of the colonizing microbiome [14]. Patients suffering from recurrent UTI or frequent symptomatic episodes are at a higher risk of being infected with resistant uropathogenic strains [15–18]. Interestingly, we more frequently observed treated UTIs in KTx patients within the three-month period before sample acquisition.
Thus, factors promoting the selection of virulent UPEC should be minimized and therapeutic or probiotic modification of the colonizing microbiome may mitigate frequency and/or severity of UTIs. Against this background, depletion of intestinal reservoirs of UPEC [19] or fecal microbiota transplantation [20] are promising approaches to tackling such infections.

In particular, treatment of asymptomatic bacteriuria led to significant antibiotic exposure in KTx patients. Recent studies advise against treating this condition, which seems to be a step in the right direction [21].

In Germany, the resistance rate of UPEC in uncomplicated UTI was 25.9% for TMP/SMX, 34.9% for ampicillin, 3.7% for ciprofloxacin, and 0.8% for fosfomycin [22]. International data show higher resistance rates for all antibiotic classes in UPEC from KTx recipients [6,23,24]. Similarly, we found relevant resistance rates among our KTx patients, as they had more frequently received antibiotic treatments, e.g., with TMP/SMX, and their UPEC carried more trimethoprim, sulfonamide, and β-lactam resistance genes. In particular, the carriage of *bla*TEM-1B, *strA* and *strB* genes was significantly higher than in control strains (Table 4, Supplementary Table S4). Accordingly, the high resistance rates to these antibiotics are clinically important (Table 5). Especially for TMP/SMX, one should balance the high resistance rate against the benefit of *P. jirovecii* prophylaxis [25]. In line with international findings, we only identified a few fosfomycin resistance determinants, resulting in reasonable susceptibility rates of UPEC [26]. In this regard, our data are relevant for clinical decision-making.

Several reports have focused on "common" O serogroups linked with UTIs [27], but data regarding O serogroups of *E. coli* causing UTIs in KTx patients are limited. In an analysis of 40 urine isolates from KTx patients, *E. coli* ST131 (O25:H4) was identified as the most prevalent clone [28]. We detected most of the previously reported "common" UPEC O serogroups, including O25, in KTx isolates, with the highest prevalence for serogroups O8 and O89 [28]. In general, distinct O serogroups could not be correlated with KTx isolates (Table 3). In contrast to Rice et al., who proposed a unique pattern of UPEC O serogroups in their US patient cohort with acute kidney injury (AKI) at the time of UTI, we did not find such an association [28]. Our AKI rate was lower and similar between KTx patients and controls. The fact that the renal function of KTx patients relies on a single working kidney (the transplant) might confound this analysis, because two kidneys might better compensate for impairment, especially when one kidney is infected (pyelonephritis). Also, the definition of AKI varied between studies. Interestingly, the hospitalization rate was lower in KTx patients than in controls, although the types of UTI were comparable between the groups.

The majority of *E. coli* strains with an increased potential to cause UTIs belong to phylogroups B2 and D, whereas isolates of other phylogenetic lineages display a reduced extraintestinal virulence potential. Phylogroups A and B1 often comprise commensals or diarrheagenic pathogens [29]. In our study, the B2 strains were overrepresented among UPEC isolates of the control group, while phylogroup A strains occurred more frequently in KTx patients. Similarly, Tashk et al. observed that *E. coli* urine isolates from KTx patients more frequently belonged to phylogroups which are uncommon for UPEC, such as phylogroups A, B1, and F, while typical UPEC phylogroups (B2 and D) were overrepresented among the fecal strains of these patients [30].

Recent analyses of UPEC phylogeny describe ST14, ST69, ST73, ST95 and ST131 as the most prevalent clones causing UTI [31]. ST131 is a pandemic clone responsible for a high incidence of extraintestinal infections worldwide [32]. Among our isolates, ST131 was one of the three most prevalent STs. In congruence with the results obtained for the phylogroup distribution, seven out of the ten most prevalent STs in our cohort represent phylogenetic lineages B2 and D (Supplementary Table S2). ST127 was significantly associated with UPEC isolates of controls, whereas no ST was significantly associated with KTx UPEC.

Only a few selected UPEC virulence traits have previously been studied in isolates of KTx patients (*n* = 36) and of non-transplanted patients (*n* = 27), without differences between the groups [33]. This result was confirmed by our genome-wide approach. Although 113 virulence-associated gene products showed a tendency to be overrepresented in either the KTx or the control group, only three VFs (FimB, SfaB, FocB) proved to be significantly associated with one group, i.e., the control group. S/F1C fimbriae, as well as the other VFs, are important virulence-associated factors of *E. coli* strains of phylogroups B2 and D [34]. Additionally, the presence of the phylogroup B2-enriched *cjrABC-senB* gene products correlates with increased urovirulence [35,36]. Our PCoA underlines that the prevalence of VFs does not strictly correlate with the isolate groups, but rather with their phylogenetic background (Supplementary Figure S3). As the phylogenetic background plays a critical role in the overall genome content, the VF distribution correlates with phylogeny [37,38]. Typical phylogroup B2 VFs were more frequently found in controls, while in the KTx group other VFs, including the phylogroup A-associated Yhc fimbriae, were overrepresented.

Our analysis of the plasmid replicons present in both isolate groups partly corresponds with recent studies [39]. Plasmids are common in UPEC, with IncFIB and IncFII being the most common replicon types, which is in congruence with our data. We also identified a large number of colicin plasmids. Col-type plasmids frequently harbor resistance as well as UPEC virulence-associated genes. Interestingly, isolates from the control group carried almost twice as many colicin plasmids as isolates from KTx patients.

Phenotypic characteristics such as α-hemolysin, bacteriocin, and type 1 fimbriae expression or biofilm formation have not been studied in UPEC from KTx patients so far. In this isolate group, bacteriocin expression was significantly higher than in the UPEC of controls. The rdar/ras morphotype was the most prevalent biofilm-related property in both isolate groups, with consistently slightly higher expression in KTx patients' isolates. This slightly increased ability of the KTx isolates to express the rdar biofilm morphotype is an aspect that needs further investigation.

In summary, we show significant differences in the phylogenetic background of UPEC isolates from KTx patients compared to non-transplanted patients. As the prevalence of individual VFs correlates with phylogroup allocation, this explains the observed differences in the VF distribution between the two isolate groups. Regarding their phylogenetic background, plasmid content and overall VF distribution, *E. coli* isolates from UTIs in non-KTx patients largely share these traits with the typical UPEC described for uncomplicated UTI cases. In contrast, isolates from KTx patients less frequently included typical UPEC clones and carried UPEC virulence-associated genes at a lower frequency; these strains rather exhibited a weak uropathogenic potential. Our findings suggest that in immunocompromised KTx patients, even *E. coli* strains lacking typical UPEC VFs can cause UTIs, whereas the establishment of UTI in non-transplanted patients requires an increased uropathogenic potential. Clinically important is our finding that UPEC from KTx patients show higher resistance, both phenotypically and genotypically, to several groups of commonly used antibiotics. Our study adds important aspects to the concept that the establishment of UTI results from multiple independent factors, including the pathogenic potential of the invading pathogen, its interaction with the host, and the host's clinical history and susceptibility.

**Supplementary Materials:** The following are available online at http://www.mdpi.com/2077-0383/8/7/988/s1. Figure S1: Representative biofilm morphotypes expressed by UPEC isolates on Congo Red agar plates, Figure S2: Manhattan Plot of Fisher's exact test *p*-values for the prevalence of individual virulence factors (VF) in an isolate group (KTx or control). The red line separates VFs with *p*-values < 0.05 from VFs without significant association with an isolate group. The green line separates VFs with significant association (*p* < 0.05) after the Bonferroni correction, Figure S3: Principal Coordinates Analysis (PCoA) to examine the grouping of UPEC isolates according to the presence/absence of virulence-associated genes, their phylogenetic group and their isolate group (KTx vs. control group). The axes are scaled with eigenvalue scaling using the square root of the eigenvalue and indicate the percentage of variation explained in the PCoA. KTx strains are marked with (+) and control strains are marked with (•). The phylogroup of each strain is color coded (phylogroup A = blue, phylogroup B1 = light green, phylogroup B2 = dark green, phylogroup C = brown, phylogroup D = pink, phylogroup E = red, phylogroup F = orange, clade V = black), Table S1: Additional KTx recipient and donor data, Table S2: Distribution of sequence types (STs) in relation to KTx and control UPEC isolates, Table S3: List of VFs significantly associated (Fisher's exact test) with either KTx or control UPEC isolates, Table S4: Resistance genes (RGs) with the largest differences between KTx and control isolates, Table S5: Distribution of the rdar/ras, saw and mucoid morphotypes expressed by KTx and control UPEC isolates, Table S6: Calcofluor binding of KTx and control UPEC isolates.

**Author Contributions:** S.R., B.K. and U.D. conceived and designed the study. B.K. acquired the samples. J.A.B. conducted the experiments. A.W. analyzed the data, S.R., K.S.-N. and H.P. provided clinical data. J.A.B., M.K., B.K., D.G., S.R. and U.D. analyzed and interpreted the data. J.A.B., S.R. and U.D. wrote the main manuscript text. All authors reviewed the manuscript.

**Funding:** J.A.B. was supported by the Medical Faculty of the University of Münster (Doctoral program "Medizinerkolleg MedK"). H.P., M.K., and U.D. have received funding from the German Research Foundation (grant number SFB1009, TP B05 and B10). Support from the Münster Graduate School of Evolution (MGSE) to M.K. is gratefully acknowledged. We also acknowledge support by the Open Access Publication Fund of the University of Münster.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **References**


© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

### *Article* **Comparison of Glucose Tolerance between Kidney Transplant Recipients and Healthy Controls**

**Hisao Shimada 1, Junji Uchida 1,\*, Shunji Nishide 1, Kazuya Kabei 1, Akihiro Kosoku 1, Keiko Maeda 2, Tomoaki Iwai 1, Toshihide Naganuma 1, Yoshiaki Takemoto <sup>1</sup> and Tatsuya Nakatani <sup>1</sup>**


Received: 24 May 2019; Accepted: 24 June 2019; Published: 27 June 2019

**Abstract:** Post-transplant hyperglycemia and new-onset diabetes mellitus after transplantation (NODAT) are common and important metabolic complications. Decreased insulin secretion and increased insulin resistance are important to the pathophysiologic mechanism behind NODAT. However, the progression of glucose intolerance diagnosed late after kidney transplantation remains unclear. Ninety-four kidney transplant recipients and 134 kidney transplant donors, serving as healthy controls, treated at our institution were enrolled in this study. The 75 g-oral glucose tolerance test (OGTT) was performed in the recipients, and the healthy controls received an OGTT before donor nephrectomy. We assessed the prevalence of glucose intolerance including impaired fasting glucose and/or impaired glucose tolerance, as well as insulin secretion and insulin resistance using the homeostasis model assessment, and compared the results between the two groups. Multivariate analysis after adjustment for age, gender, body mass index, estimated glomerular filtration rate, and systolic blood pressure showed that the prevalence of glucose intolerance, insulin resistance, insulin secretion, and 2-h plasma glucose levels were significantly higher in the kidney transplant recipients compared to the healthy controls. The elevation of insulin secretion in kidney transplant recipients may be a compensation for increased insulin resistance. Impaired compensatory pancreatic β cell function may lead to glucose intolerance and NODAT in the future.

**Keywords:** kidney transplantation; glucose intolerance; insulin secretion; insulin resistance; oral glucose tolerance test; healthy subject

#### **1. Introduction**

Post-transplant hyperglycemia and new-onset diabetes mellitus after transplantation (NODAT) are common and significant metabolic complications in kidney transplant recipients (KTRs) which can lead to increased mortality and cardiovascular morbidity [1–3]. Similar to type 2 diabetes mellitus (DM), decreased insulin secretion and increased insulin resistance are important to the pathophysiologic mechanism behind NODAT [4]. Previous reports have shown that impaired insulin secretion is a more dominant mechanism for the development of NODAT compared to type 2 DM [5]. However, the exact mechanism of glucose intolerance diagnosed late after kidney transplantation remains unknown [6], although immunosuppressive agents such as calcineurin inhibitors, steroids, and mammalian target of rapamycin inhibitors are thought to cause glucose intolerance following kidney transplantation [1–3]. The increased prevalence of cardiovascular events in transplant recipients is an issue that remains to be solved. Abnormal glucose homeostasis is considered to be one of the established risk factors for the development of cardiovascular events following kidney transplantation [7], but there have been few reports comparing glucose tolerance between KTRs and healthy subjects.

The oral glucose tolerance test (OGTT) has many advantages over fasting plasma glucose for diagnosing glucose intolerance, as it not only accurately identifies persons with DM but also identifies those with impaired fasting glucose (IFG) and impaired glucose tolerance (IGT). Abnormal glucose tolerance determined by the OGTT is a risk factor for the future development of type 2 DM in general populations [8,9], and the OGTT has also been established as a sensitive tool to detect NODAT and glucose intolerance in KTRs [10,11]. The homeostasis model assessment (HOMA) model is a well-known method used for the quantitative verification of insulin resistance and insulin secretion. In this model, fasting glucose levels and fasting insulin levels are mainly defined by feedback between glucose release from the liver and insulin secretion from pancreatic β cells [12]. The aim of this study was to compare glucose tolerance between KTRs and healthy subjects using the OGTT, and we assessed the prevalence of glucose intolerance including IFG and/or IGT as well as insulin secretion and insulin resistance using the HOMA. A computer-based homeostasis model assessment has also been reported for the evaluation of insulin sensitivity and pancreatic β cell function [13]. However, we used the homeostasis model assessment of insulin resistance (HOMA-R) and the homeostasis model assessment of β cell function (HOMA-β) in this study, because we had previously reported on glucose intolerance in kidney transplant recipients using these methods [11].

#### **2. Patients and Methods**

#### *2.1. Study Design and Participants*

This study was a single-center, cross-sectional, observational investigation conducted at Osaka City University Graduate School of Medicine. Ninety-four out of 101 KTRs who underwent a transplant at our institution and consented to participate in this study were enrolled (KTR group) (Figure S1). To compare glucose tolerance between KTRs and healthy subjects, 134 kidney transplant donors who underwent a nephrectomy at our institution between 2006 and 2017 were enrolled in this study as the healthy controls (HC group). A 75 g-OGTT was performed in the KTR group from October 2010 to April 2012, and the HC group received a 75 g-OGTT before donor nephrectomy. For the KTR group, the inclusion criteria were as follows: (1) stable calcineurin inhibitor (CNI) levels for the last 6 months, (2) no prior evidence of DM, (3) at least a year after transplantation, (4) serum creatinine below 2.0 mg/dL, and (5) stable renal function for the last 6 months. The following recipients were excluded from this study: (1) patients with acute infection, liver dysfunction, abnormal thyroid tests, history of gastrectomy, or chronic pancreatitis and (2) patients who had histories of hepatitis B virus and/or hepatitis C virus infection irrespective of being treated or untreated. This study was approved by the Ethics Committee of Osaka City University Graduate School of Medicine (No. 4120). We provided patients with information explaining the proposed research plan (the purpose, required individual data, and duration of research) by means of an information website of our hospital and gave them the opportunity to opt out, and all the procedures were in accordance with the Helsinki Declaration of 2000 and the Declaration of Istanbul 2008.

#### *2.2. Immunosuppressive Regimen*

Standard immunosuppressive regimen for kidney transplantation consisted of basiliximab (BAS), methylprednisolone (MP), CNI (cyclosporine or tacrolimus), and antimetabolites (mycophenolate mofetil or mizoribine or azathioprine). CNI and antimetabolites were initiated 3 days before transplantation. BAS has been administered in all recipients since 2002 at a dose of 20 mg/day at the time of transplantation and 4 days after transplantation. MP was intravenously administered at a dose of 500 mg at the time of transplantation and orally administered at 40 mg/day 1 to 7 days after transplantation, the dose of which was decreased to 24, 12, 8, and 4 mg/day every week. For treatment of acute cellular rejection events, MP was intravenously administered at a dose of 500 mg/day for 3 days alone or in combination with deoxyspergualin (5 mg/kg/day: 5–7 days).

#### *2.3. Data Collection*

Patient characteristics (age, gender) and clinical data [estimated glomerular filtration rate (eGFR), fasting plasma glucose (FPG), fasting immunoreactive insulin, hemoglobin A1c, triglyceride, total cholesterol, high-density lipoprotein (HDL)-cholesterol, low-density lipoprotein (LDL)-cholesterol] were collected from electronic medical records for all subjects enrolled in this study. eGFR was estimated by the modified Modification of Diet in Renal Disease equation using the Japanese coefficient [14]. Blood samples were obtained after overnight fasting. Body mass index (BMI) was calculated as body weight in kilograms divided by the square of body height in meters (kg/m<sup>2</sup>). Blood pressure was reported as the average of five automated measurements taken at 3-min intervals. Clinical parameters of the KTR group such as type of CNI, donor type, and post-transplant duration were collected. Dyslipidemia was defined as oral administration of lipid-lowering drugs such as statins or polyunsaturated fatty acids, triglycerides over 150 mg/dL, HDL cholesterol below 40 mg/dL, or LDL cholesterol over 140 mg/dL. Hypertension was defined as administration of antihypertensive drugs, systolic blood pressure over 140 mmHg, or diastolic blood pressure over 90 mmHg. The administration of angiotensin converting enzyme inhibitors and/or angiotensin II receptor blockers, β-blockers, calcium channel blockers, loop diuretics, and thiazide diuretics was evaluated in all subjects.

#### *2.4. Glucose Intolerance, Insulin Resistance, and* β *Cell Function*

All patients underwent an OGTT in the morning after overnight fasting. Blood samples were drawn for determining plasma glucose and insulin before glucose loading and at 30 and 120 min after glucose loading. According to the World Health Organization [15], normal glucose tolerance (NGT) was defined as FPG and 2-h plasma glucose of <110 mg/dL and <140 mg/dL, IFG as 110–126 mg/dL and <140 mg/dL, IGT as <110 mg/dL and 140–200 mg/dL, IFG/IGT as 110–126 mg/dL and 140–200 mg/dL, and DM as ≥126 mg/dL and/or ≥200 mg/dL, respectively. Glucose intolerance consisted of IFG, IGT, IFG/IGT, and DM. Insulin resistance was estimated using the HOMA of insulin resistance (HOMA-R) according to the formula HOMA-R = fasting insulin (mU/L) × FPG (mg/dL)/405. For the assessment of pancreatic β cell function, we used the HOMA of β cell function (HOMA-β) according to the formula HOMA-β = 360 × fasting insulin (mU/L)/(FPG (mg/dL) − 63), and the insulinogenic index = (insulin 30 min − fasting insulin (mU/L))/(plasma glucose 30 min − FPG (mg/dL)) [12].
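The formulas and cut-offs above translate directly into code. The following is a minimal sketch, not the authors' implementation; the cut-offs for isolated IGT (FPG < 110 mg/dL with a 2-h value of 140–200 mg/dL) are inferred from the same WHO scheme, and all inputs use the units stated in the text (glucose in mg/dL, insulin in mU/L):

```python
def homa_r(fpg, fasting_insulin):
    """HOMA of insulin resistance: fasting insulin (mU/L) x FPG (mg/dL) / 405."""
    return fasting_insulin * fpg / 405.0

def homa_beta(fpg, fasting_insulin):
    """HOMA of beta cell function: 360 x fasting insulin / (FPG - 63)."""
    return 360.0 * fasting_insulin / (fpg - 63.0)

def insulinogenic_index(fpg, fasting_insulin, glucose_30, insulin_30):
    """(30-min insulin - fasting insulin) / (30-min glucose - FPG)."""
    return (insulin_30 - fasting_insulin) / (glucose_30 - fpg)

def ogtt_category(fpg, pg_2h):
    """WHO-based OGTT classification as used in the text.
    Everything except NGT counts as glucose intolerance."""
    if fpg >= 126 or pg_2h >= 200:
        return "DM"
    if fpg >= 110:
        return "IFG/IGT" if pg_2h >= 140 else "IFG"
    return "IGT" if pg_2h >= 140 else "NGT"
```

For example, a subject with FPG 90 mg/dL and fasting insulin 9 mU/L has HOMA-R = 2.0 and HOMA-β = 120.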

#### *2.5. Statistical Analysis*

Categorical variables were expressed as count and percentage, and continuous variables as mean ± standard deviation, median and interquartile range, or range. Differences between the groups were examined by Student's t-test or the Mann-Whitney U-test. Categorical variables were compared using chi-squared analysis. Logistic regression analysis was used to test the association of group (KTR versus HC) with glucose intolerance (versus normal glucose tolerance). We also performed logistic regression analyses adjusted for multiple models (Model 1: adjusted for age, gender, and BMI; Model 2: adjusted for Model 1 plus eGFR; Model 3: adjusted for Model 2 plus SBP). Linear regression analysis was used to test whether group (KTR versus HC) was related to various dependent variables, with adjustment for the same three models. All statistical analyses were performed with SPSS version 22.0 for Windows (IBM Japan, Tokyo, Japan). A *p*-value of less than 0.05 was considered statistically significant.

#### **3. Results**

#### *3.1. Study Participants*

The comparison of characteristics between the KTR and HC groups is presented in Table 1. Patients in the KTR group were significantly younger than those in the HC group, while BMI was significantly higher in the HC group. Hemoglobin A1c levels and eGFR were lower in the KTR group than in the HC group. The prevalence of hypertension was significantly higher in the KTR group. The clinical parameters related to kidney transplantation in the KTR group are shown in Supplementary Table S1.



**Table 1.** Comparison of clinical characteristics between kidney transplant recipients and healthy controls.

KTR, kidney transplant recipients; HC, healthy controls; BMI, body mass index; eGFR, estimated glomerular filtration rate; HDL cholesterol, high-density lipoprotein cholesterol; LDL cholesterol, low-density lipoprotein cholesterol; ARB, angiotensin II receptor blocker; ACEi, angiotensin converting enzyme inhibitor; CNI, calcineurin inhibitor.

#### *3.2. Insulin Secretion and Resistance in Subjects with NGT and Glucose Intolerance*

In the KTR group, NGT was detected in 74 (78.7%) patients, while 20 (21.3%) had glucose intolerance, including two with IFG, 10 with IGT, two with IFG/IGT, and six with DM. In the HC group, NGT was detected in 107 (79.9%) subjects, while 27 (20.1%) had glucose intolerance, including four with IFG, 16 with IGT, three with IFG/IGT, and four with DM. There was no significant difference in the prevalence of glucose intolerance between the two groups. HOMA-R in the KTR group tended to be higher than that in the HC group (*p* = 0.051), whereas HOMA-β in the KTR group was significantly higher than that in the HC group. There were no significant differences in the insulinogenic index or in FPG and 2-h plasma glucose levels between the two groups (Table 2).


**Table 2.** Glucose intolerance between kidney transplant recipients and healthy controls.

KTR, kidney transplant recipients; HC, healthy controls; IRI, immunoreactive insulin; HOMA-R, homeostasis model assessment of insulin resistance; HOMA-β, homeostasis model assessment of β cell function. IQR, interquartile range.

#### *3.3. Comparison of Prevalence of Glucose Intolerance by Multivariate Logistic Regression Analysis*

Multivariate logistic regression analysis was performed to compare the prevalence of glucose intolerance between the two groups (Table 3). There was no significant association in the unadjusted Model and Model 1 (adjusted for age, gender, and BMI). In Model 2 (adjusted for Model 1 and eGFR), there was a statistically significant association between glucose intolerance and group (KTR group versus HC group) (OR = 3.544, 95% CI = 1.143–10.986, *p* = 0.028). In Model 3 (adjusted for Model 2 and SBP), there was a statistically significant association between glucose intolerance and group (KTR group versus HC group) (OR = 3.794, 95% CI = 1.200–11.996, *p* = 0.023).

**Table 3.** Multiple logistic regression analysis for prevalence of glucose intolerance (glucose intolerance versus normal glucose tolerance) between kidney transplant recipients and healthy controls.


KTR, kidney transplant recipients; HC, healthy controls; BMI, body mass index; eGFR, estimated glomerular filtration rate; SBP, systolic blood pressure; OR, odds ratio; 95% CI, 95% confidence interval.

#### *3.4. Comparison of FPG and 2 h Plasma Glucose Levels by Multivariate Linear Regression Analysis*

The results of linear regression analysis of FPG and 2 h plasma glucose levels are shown in Table 4. In all models, there was no significant association between FPG and group (KTR group versus HC group). However, in Model 1, Model 2, and Model 3, there was a statistically significant association between 2 h plasma glucose levels and group (KTR group versus HC group) (Model 1: B = 10.713, S.E. = 4.861, *p* = 0.029; Model 2: B = 15.079, S.E. = 7.311, *p* = 0.040; Model 3: B = 15.091, S.E. = 7.329, *p* = 0.041).

**Table 4.** Correlation between fasting plasma glucose and 2 h plasma glucose with presence of kidney transplantation in adjusted linear regression analysis.


FPG, fasting plasma glucose; 2-hPG, 2-h plasma glucose; KTR, kidney transplant recipients; HC, healthy controls; BMI, body mass index; eGFR, estimated glomerular filtration rate; SBP, systolic blood pressure; B, coefficient estimate; S.E., standard error.

#### *3.5. Comparison of HOMA-R and HOMA-*β *by Multivariate Linear Regression Analysis*

Table 5 shows the results of linear regression analysis of HOMA-R and HOMA-β. In Model 1, Model 2, and Model 3, there was a statistically significant association between HOMA-R and group (KTR group versus HC group) (Model 1: B = 0.516, S.E. = 0.170, *p* = 0.003; Model 2: B = 0.615, S.E. = 0.256, *p* = 0.017; Model 3: B = 0.616, S.E. = 0.256, *p* = 0.017). In all models, there was a statistically significant association between HOMA-β and group (KTR group versus HC group) (unadjusted Model: B = 15.850, S.E. = 6.341; *p* = 0.013; Model 1: B = 24.581, S.E. = 6.417, *p* < 0.001; Model 2: B = 28.699, S.E. = 9.658, *p* = 0.003; Model 3: B = 28.715, S.E. = 9.689, *p* = 0.003).

**Table 5.** Correlation between HOMA-R and HOMA-β with presence of kidney transplantation in adjusted linear regression analysis.


HOMA-R, homeostasis model assessment of insulin resistance; HOMA-β, homeostasis model assessment of β cell function; KTR, kidney transplant recipients; HC, healthy controls; BMI, body mass index; eGFR, estimated glomerular filtration rate; SBP, systolic blood pressure; B, coefficient estimate; S.E., standard error.

#### **4. Discussion**

In this study, multivariate regression analysis revealed that the prevalence of glucose intolerance in the KTR group was significantly higher than in the HC group. Moreover, insulin resistance in the KTR group was significantly higher than that in the HC group, and insulin secretion in the KTR group was also higher than that in the HC group. The elevation of insulin secretion may be compensatory for the increase of insulin resistance in the KTR group. To our knowledge, this is the first demonstration comparing glucose tolerance between KTRs and healthy subjects.

The pathophysiology of NODAT is similar to that of type 2 DM, but with important differences. Previous reports have shown that pancreatic β cell dysfunction is a more prominent primary defect in NODAT than in type 2 DM [5]. However, the mechanism of glucose intolerance diagnosed late after kidney transplantation is not clear [6]. Because insulin resistance is prolonged and elevated by immunosuppressive agents such as steroids, CNIs, and mTOR inhibitors administered over the long term, sustained compensatory insulin secretion by pancreatic β cells may be required to prevent impaired glucose metabolism in KTRs. Additional risk factors for DM, such as weight gain after transplantation or aging, may then lead to failure of pancreatic β cell function in these patients. As a consequence, glucose intolerance and DM in KTRs may occur at a late post-transplant stage despite a concomitant decrease in steroid use and CNI blood levels.

It has been established that IGT and IFG are risk factors for the development of type 2 DM in general populations [6,7]. In our study, multivariate logistic regression analysis revealed that the prevalence of glucose intolerance based on the OGTT (IFG, IGT, and DM) was higher in the KTR group than in the HC group. KTRs may therefore have a higher risk of new-onset DM than healthy subjects. One study reported that the occurrence of acute rejection and of NODAT within the first post-transplantation year had a similar impact on long-term transplant survival; moreover, NODAT seems to be associated with death with a functioning graft [16]. As pre-stages of NODAT, IFG and IGT have been introduced as gluco-metabolic targets in an effort to reduce the risk of chronic transplant-associated morbidity and mortality through proper management during the pre- and post-transplant stages [17]. Assessment of glucose intolerance based on the OGTT may therefore be important for achieving excellent transplant outcomes in KTRs.

KTRs have an increased risk of premature death, with cardiovascular disease, malignancy, and infectious disease being the predominant causes of mortality [18]. Immunosuppressive therapy may potentiate these risk factors; however, it cannot fully explain the increased long-term mortality after kidney transplantation. Hyperglycemia is reported to be a risk marker for cardiovascular disease and cancer among healthy subjects without DM [19,20]. In a previous study, 2-h post-load glucose concentrations indicated a risk of all-cause and cardiovascular morbidity in a general population without known DM [21]. Multivariate regression analysis in our study identified that 2-h plasma glucose levels were higher in the KTR group than in the HC group. Increased 2-h plasma glucose levels in KTRs may elevate the risk of all-cause and cardiovascular morbidity, and KTRs may tend to have subclinical hyperglycemia.

This study has several important limitations. Because it was a cross-sectional, observational study, the risk for glucose intolerance was not completely clarified. In addition, data on NODAT risk factors, such as family history of DM and hypomagnesemia, were not available [22,23]. Nevertheless, this may be the first demonstration comparing glucose tolerance between KTRs and healthy subjects, and our study may be helpful for understanding the status of glucose tolerance in KTRs receiving immunosuppressive therapy. Prospective cohort studies involving a much larger population are needed to identify the mechanism of glucose intolerance in KTRs.

#### **5. Conclusions**

In conclusion, insulin resistance as well as insulin secretion were significantly higher in the KTR group than in the HC group. Moreover, the prevalence of glucose intolerance and 2-h plasma glucose levels were significantly higher in the KTR group. The elevation of insulin secretion may compensate for the increased insulin resistance in KTRs. In these recipients, failure of this compensatory pancreatic β cell function may lead to NODAT and glucose intolerance late after kidney transplantation.

**Supplementary Materials:** The following are available online at http://www.mdpi.com/2077-0383/8/7/920/s1, Figure S1: Flow chart of patients' enrollment, Table S1: Clinical parameters of kidney transplant recipients.

**Author Contributions:** Conceptualization, J.U., Y.T. and K.K.; methodology, J.U., S.N. and T.N.; software, H.S.; validation, H.S., K.M. and J.U.; formal analysis, H.S.; investigation, H.S. and T.I.; resources, J.U.; data curation, H.S.; writing—original draft preparation, H.S. and A.K.; writing—review and editing, J.U.; visualization, H.S.; supervision, T.N.; project administration, J.U.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **References**


© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

*Article*

### **The Effect of Donors' Demographic Characteristics in Renal Function Post-Living Kidney Donation. Analysis of a UK Single Centre Cohort**

**Maria Irene Bellini 1,\*, Sotiris Charalampidis 1, Ioannis Stratigos 2, Frank J.M.F. Dor 1,3 and Vassilios Papalois 1,3**


Received: 23 May 2019; Accepted: 18 June 2019; Published: 20 June 2019

**Abstract:** Introduction: There is a great need to increase the organ donor pool, particularly for living donors. This study analyses differences in post-living donation kidney function according to the pre-donation characteristics of age, genetic relationship with the recipient, sex, ethnicity, and Body Mass Index (BMI). Methods: Retrospective single centre analysis of the trajectory of estimated Glomerular Filtration Rate (eGFR) post-living kidney donation, as a measure of kidney function. Mean eGFR of the different groups was compared at 6 months and during the 60 months follow up. Results: Mean age was 46 ± 13 years, 57% were female, and 60% Caucasian. Mean BMI was 27 ± 5 kg/m<sup>2</sup>, with more than a quarter of the cohort having a BMI > 30 (26%), and the majority of the donors genetically related to their recipients (56%). The highest rate of decline in eGFR occurred at 6 months after donation, with female sex, non-Caucasian ethnicity, and age lower than 60 years being independently associated with higher recovery in kidney function (*p* < 0.05). Over the 60 months follow up, older age, genetic relationship with the recipient, and male sex were associated with a higher percentual difference in eGFR post-donation. Conclusion: In this study, with a high proportion of high-BMI living kidney donors, female sex, age lower than 60 years, and non-genetic relationship with the recipient were persistently associated with greater recovery in post-donation kidney function. Ethnicity and BMI, per se, should not be a barrier to increasing the living donor kidney pool.

**Keywords:** living donor; kidney transplantation; ethnicity; age; obesity; genetic relationship donor/recipient

#### **1. Introduction**

Living donor (LD) kidney transplantation provides the best long-term outcomes for patients with chronic kidney failure [1]. A careful selection to limit the potential risks related to living kidney donation is important, not only to safeguard these healthy individuals who should not be harmed as a result of their generous act, but also to keep expanding the organ donor pool [2]. The donor's demographic characteristics are currently a topic of interest to assess the potential risk of end stage renal disease (ESRD) among living donors.

It has been reported that a high body mass index (BMI, kg/m<sup>2</sup>) confers no higher perioperative risk for living kidney donors [3] but has a negative impact on post-donation kidney function [4]. However, there is no consensus regarding the BMI threshold for LD acceptance criteria; this is particularly important in populations where the average BMI is increasing [5,6].

Previous studies have also suggested that renal function reached at one year post-donation remains stable—at least over the next decade—but then declines with ageing [7,8]. Ibrahim et al. indicated that a younger age at the time of donation, a longer time since donation, and a higher eGFR at the time of donation were associated with a greater compensatory increase in the eGFR in the remaining kidney [9]. Conversely, Dols et al. reported that kidney donation by older donors is relatively safe over time since, in their experience, kidney function did not decline progressively [10,11]. A correlation with age and ethnicity has also been reported, with higher risk for ESRD for older Caucasians and younger Africans [12].

Regarding the donor's sex, there are reports that in males there is a more pronounced decline in the short-term renal function [13], but the absolute risk to develop ESRD is still very low: one per 2000, compared to one per 10000 in the general population (RR 8.83), according to a recent meta-analysis [14].

Living kidney donors with a first-degree genetic relationship to the recipient have an increased risk of developing ESRD [15], but a personalised estimation on the basis of donor characteristics still remains unavailable and controversial.

Risk estimation is critical for appropriate informed consent, so careful consideration of the relative risk associated with certain demographic characteristics or conditions in potential living donors is particularly important. This single centre study aims to establish the effect of the donor's sex, age, ethnicity, BMI, and genetic relationship to the recipient on the evolution of the eGFR as a marker of kidney function recovery after living kidney donation.

#### **2. Methods**

The study, performed in accordance with the Declaration of Helsinki principles, is a retrospective analysis of consecutive living kidney donors who had their operation in our centre between 2000 and 2017, with 60 months of follow up. After discharge, donors first attended our follow up clinic at 2 weeks, and then at six months and yearly (up to five years) post-donation for routine blood tests.

The data used were anonymised and extracted from an electronic database of medical records. The study fell under the category of research through the use of anonymised data of existing databases which, based on the Health Research Authority criteria [16], does not require proportional or full ethics review and approval.

Obesity was defined according to the World Health Organization (WHO) classification as BMI ≥ 30 kg/m<sup>2</sup>, normal weight as BMI ≤ 25 kg/m<sup>2</sup>, and overweight as 25 kg/m<sup>2</sup> < BMI < 30 kg/m<sup>2</sup>. Donors were also stratified according to sex, ethnicity, and age below or above 60 years, the cut-off between standard and extended criteria used in deceased donor organ donation [17]. Mean eGFR, calculated with the CKD-EPI equation as recommended for assessing the eligibility of potential living donors [18], was compared between groups. The decline in kidney function was analysed between donation and different points of the 60 months follow up. It was expressed as the percentual difference in eGFR (Δ eGFR) between mean eGFR at a given point of follow up and the eGFR at the time of donation.
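As a sketch, the stratification and the Δ eGFR calculation might look as follows in Python. This is illustrative only: the Δ eGFR sign convention is our reading of the text (a larger positive value means a larger shortfall from the donation eGFR), and the CKD-EPI coefficients are those commonly published for the 2009 creatinine equation, not taken from this article:

```python
def bmi_class(bmi: float) -> str:
    """WHO classes as used in the study: <= 25 normal, (25, 30) overweight, >= 30 obese."""
    if bmi >= 30:
        return "obese"
    if bmi <= 25:
        return "normal weight"
    return "overweight"

def delta_egfr(egfr_at_donation: float, egfr_followup: float) -> float:
    """Fractional shortfall of follow-up eGFR relative to the eGFR at donation."""
    return (egfr_at_donation - egfr_followup) / egfr_at_donation

def ckd_epi_2009(scr_mg_dl: float, age: int, female: bool, black: bool = False) -> float:
    """CKD-EPI 2009 creatinine equation, in mL/min/1.73 m2 (commonly published form)."""
    kappa, alpha = (0.7, -0.329) if female else (0.9, -0.411)
    ratio = scr_mg_dl / kappa
    egfr = 141 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr
```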

Continuous variables were presented as mean ± standard deviation and compared using one-way ANOVA at 6, 12, 24, 36, 48, and 60 months of follow up. The confidence interval was set to 95%, and *p* was considered significant at less than 0.05. A general linear model with repeated measures ANOVA of eGFR during the 60 months follow up was built to observe the percentual difference in eGFR recovery and to assess the independent effect of donors' characteristics.

Statistical analysis was performed using SPSS (IBM SPSS Statistics for Windows, Version 20.0; IBM Corp, Armonk, NY, USA).

#### **3. Results**

A total of 889 consecutive living kidney donors were analysed (Table 1). Mean follow up was 44.1 ± 31.3 months. A full dataset was available for 700 donors (79%) at 6 months, 635 (71%) at 12 months, 569 (64%) at 24 months, 489 (55%) at 36 months, 408 (46%) at 48 months, and 348 (39%) at 60 months. The mean age at donation was 46 ± 13 years, with a mean BMI of 27 ± 5 kg/m<sup>2</sup>. More than a quarter of the total LD cohort had a BMI > 30 (26%). Females represented 57% of the cohort, and the prevalent ethnicity was Caucasian (60%). The majority of the donors were genetically related to the recipient (56%).

**Table 1.** Baseline demographic characteristics and eGFR reported in mL/min/1.73 m<sup>2</sup> at donation. Statistically significantly higher values are highlighted in bold. LRD: Living Related Donor; LURD: Living Unrelated Donor.


Mean eGFR at donation was confirmed to be statistically significantly related to sex, age, and ethnicity (*p* < 0.001), but not to the BMI or the genetic relationship with the recipient (Table 1).

More specifically, females, donors aged > 60 years, and Caucasians had lower eGFR at donation. Mean eGFR was compared between the different groups during follow up to evaluate the evolution of kidney function recovery after donation, expressed as Δ eGFR between mean eGFR at a given point of follow up and the eGFR at the time of donation. The lowest mean eGFR was observed at 6 months after donation (*p* < 0.001), followed by a progressive decrease of Δ eGFR over the 60 months of follow up, mirroring a progressive recovery in kidney function after donation. Δ eGFR for the whole cohort was 0.27 ± 16 at 6 months, 0.26 ± 15 at 12 months, 0.24 ± 15 at 24 months, 0.22 ± 16 at 36 months, 0.2 ± 17 at 48 months, and 0.18 ± 17 at 60 months (*p* < 0.001). See Supplementary Table S1 for differences in Δ eGFR within the different groups.

At 6 months of follow up, the nadir of the kidney function recovery, age older than 60 years, male sex, and Caucasian ethnicity were independent predictors for lower recovery in kidney function (*p* < 0.05). No effect was observed for different classes of BMI and genetic relationship with the recipient.

Figures 1–5 represent mean eGFR and the percentual difference from the donation eGFR during the 60 months follow up. The generalised repeated measures of ANOVA eGFR and Δ eGFR are stratified according to donor age, relationship with the recipient, sex, BMI, and ethnicity.

Mean eGFR post-donation was confirmed as persistently higher in the younger cohort (*p* < 0.001, Figure 1A). The percentual difference in eGFR during the 60 months follow up was also significantly different, with lower recovery from the original kidney function for donors aged > 60 years (*p* = 0.037, Figure 1B).

**Figure 1.** General Linear Model of Repeated Measures ANOVA of mean eGFR (Figure 1A) and mean Δ eGFR (Figure 1B) during the 60 months follow up. There is a decline in eGFR post-donation, with a statistically significant correlation with age > 60 years (*p* < 0.001). The percentual difference in eGFR is also statistically different, with lower recovery for age > 60 years (*p* = 0.037).

Figure 2A shows that although the genetic relationship with the recipient did not statistically affect mean eGFR during the follow up (*p* = 0.168), there was a difference in the kidney function recovery after donation (Figure 2B), in favor of living unrelated donors (*p* = 0.007).

**Figure 2.** General Linear Model of Repeated Measures ANOVA of mean eGFR (Figure 2A) and mean Δ eGFR (Figure 2B) during the 60 months follow up. There is no difference in mean eGFR post-donation according to the genetic relationship with the recipient (*p* = 0.168), while the percentual recovery in eGFR is statistically different, being higher for live unrelated donor (*p* = 0.007). LRD: Living Related Donor; LURD: Living Unrelated Donor.

Males had higher mean eGFR at donation (*p* < 0.001), but there was no significant difference in mean eGFR after donation (*p* = 0.3, Figure 3A). The recovery of kidney function was persistently higher in females (*p* < 0.001, Figure 3B). This can also be noted in Supplementary Table S1, where only at donation time do male donors have a higher mean eGFR: 95 ± 32 mL/min/1.73 m<sup>2</sup> versus 87 ± 22 mL/min/1.73 m<sup>2</sup> for female donors (*p* < 0.001), and where the Δ eGFR is persistently higher for men during follow up (*p* < 0.001).

**Figure 3.** General Linear Model of Repeated Measures ANOVA of mean eGFR (Figure 3A) and mean Δ eGFR (Figure 3B) during the 60 months follow up. Mean eGFR at donation is lower in females (*p* < 0.001), but during the 60 months follow up the difference in mean eGFR is no longer significant (*p* = 0.3), because the mean percentual difference in eGFR is higher in males (*p* < 0.001).

Figure 4A,B represent kidney function according to different BMI classes; there was no significant difference pre- or post-donation, either in mean eGFR (*p* = 0.53) or in the percentual difference in kidney function recovery (*p* = 0.79) observed during the study period.

**Figure 4.** General Linear Model of Repeated Measures ANOVA of mean eGFR (Figure 4A) and mean Δ eGFR (Figure 4B) during the 60 months follow up. The mean eGFR post-donation does not differ according to BMI (*p* = 0.53), with no difference also in kidney function recovery after live donation (*p* = 0.79).

Finally, Caucasian ethnicity was associated with a lower mean eGFR pre- and post-donation (*p* < 0.001, Table 1 and Figure 5A). There was also an ethnicity-related effect on early kidney function recovery at 6 months after donation (*p* = 0.005), in favor of Africans and Asians (see also Supplementary Table S1). However, this ethnicity-related effect on kidney function recovery was not maintained over the 60 months follow up, with no significantly different mean Δ eGFR in the general linear model (*p* = 0.38, Figure 5B).

**Figure 5.** General Linear Model of Repeated Measures ANOVA of mean eGFR (Figure 5A) and mean Δ eGFR (Figure 5B) during the 60 months follow up. The mean eGFR post-donation was confirmed to be lower for Caucasian ethnicity (*p* < 0.001), as was the recovery in kidney function at 6 months (*p* = 0.035). There is no statistically significant difference in the general linear model for the Δ eGFR at 60 months follow up (*p* = 0.38).

In a stepwise logistic regression, only male sex and older age were confirmed to affect Δ eGFR in the long term.

#### **4. Discussion**

The present study investigated the kidney function recovery post-living donation in our centre at 6 months post-donation, and during 60 months follow up. Donors' demographic characteristics of age, genetic relationship to the recipient, sex, BMI, and ethnicity were analysed, aiming to assess long term risk, offer comprehensive information to potential donors, and further encourage living kidney donation [19].

Sixty percent of the living kidney donors in our study were Caucasian, the absolute majority of the cohort, but a lower percentage than in the rest of the UK, where 88% of the entire living donor pool is Caucasian [19]. The lack of an ethnicity-related effect on the recovery of kidney function could encourage potential living donors from African and Asian communities to proceed with donation; this is particularly important considering that these minority groups are more likely to deny consent for deceased organ donation, while at the same time facing prolonged waiting times due to difficulties in the matching process, independently of the allocation policy [18,19]. In our centre, practicing in a multi-ethnic country like the UK, we focus on educational programmes directed at ethnic minorities to assure potential donors of the long-term safety of living kidney donation; the results of this study, and similar findings in other cohorts [20,21], are informing the content of those programmes.

In line with other studies, we demonstrated that the lowest eGFR occurs within 6 months of follow up [20] and is statistically significantly related to age older than 60 years [21] and male sex [15]. It is interesting to note that mean eGFR is higher pre-donation for males, yet after donation there is no difference during the 60 months follow up. This reflects the persistently higher Δ eGFR in males versus females, expressing a smaller recovery of kidney function after donation. The generally longer life expectancy of women could also be a contributing factor [22].

Previous studies have demonstrated the overall safety of living kidney donation, even at older ages [10]. In our study, 15% of LDs were > 60 years old. Our data showed that the recovery of kidney function following living kidney donation was lower for the elderly, most probably reflecting the natural biological process [23]. However, the final mean eGFR for donors > 60 years old was 54 ± 11 mL/min/1.73 m<sup>2</sup>, a very satisfactory outcome after 60 months of follow up (Figure 1A,B). Therefore, our centre policy is that living donation is not discouraged on the basis of age only.

Whether or not obesity plays a role in the deterioration of kidney function following living kidney donation remains a hotly debated issue. Despite the recent report from Locke et al. of a 1.9-fold higher risk for ESRD when compared to normal-BMI donors [4], the adjusted risk of ESRD associated with obesity is only 1.16 in living donors with obesity [24]. In our cohort, 26% of living donors had a BMI > 30, but there was no increased risk in terms of mean eGFR and Δ eGFR, or recovery in kidney function, during the 60 months follow up (Figure 4A,B). As with age, our centre does not assess LD risk on the basis of BMI alone: potential candidates are screened through multi-disciplinary input and discussion, with a multifactorial analysis tailored on a case-by-case basis to consider the overall medical condition rather than a single factor [25]. We investigate past medical history, previous hospitalisations, and regular medications. Hypertension controlled with up to two medications is not a contraindication according to the British Transplantation Society guidelines [26]; moreover, there is evidence that this does not affect long-term overall or cardiovascular mortality [27]. Proteinuria, diabetes, angina/heart disease, stroke, severe pulmonary disease, kidney stones (or other kidney disease), blood disorders, sickle cell disease, and active cancer are contraindications to donation. No BMI or age cut-off is considered a standalone criterion. All potential donors undergo imaging of their kidneys, and the decision to proceed is made if eGFR > 80 mL/min/1.73 m<sup>2</sup>.
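As a rough illustration of the screening logic just described (names and structure are ours; this is a sketch of the stated criteria, not a clinical tool), the policy could be encoded as:

```python
# Absolute contraindications listed in the centre policy described above.
ABSOLUTE_CONTRAINDICATIONS = {
    "proteinuria", "diabetes", "angina/heart disease", "stroke",
    "severe pulmonary disease", "kidney disease", "blood disorder",
    "sickle cell disease", "active cancer",
}

def may_proceed(egfr: float, conditions: set, n_bp_medications: int = 0) -> bool:
    """Apply the stated criteria: no absolute contraindication, hypertension
    controlled with at most two drugs, and eGFR > 80 mL/min/1.73 m2.
    BMI and age are deliberately not standalone exclusion criteria."""
    if conditions & ABSOLUTE_CONTRAINDICATIONS:
        return False
    if n_bp_medications > 2:
        return False
    return egfr > 80
```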

Finally, we found a statistically significantly higher Δ eGFR after donation for genetically related donors (LRDs), as shown in Figure 2B. The concern that biologically related living donors, especially first-degree relatives, may face an increased risk of adverse renal outcomes after living kidney donation has long been a topic of discussion in the transplant community [28,29]. Increased risks of renal failure in close biological relatives of ESRD patients have been observed in population-based and case-control studies among non-donors, regardless of whether the recipient's ESRD has a known hereditary cause [30,31]. In a study of Caucasian donors in Norway, the nine donors who developed ESRD were all biologically related to their recipients, and renal failure was mainly due to immunological diseases [21]. We believe that, although there is still controversy in the literature around whether LRDs are at increased risk compared to Living Unrelated Donors (LURDs) [32,33], the difference in Δ eGFR after donation demonstrated in our study gives the transplant team a responsibility to inform potential donors genetically related to the recipient about the potential increased risk of ESRD. A possible explanation for this finding, along with the latency of some genetic renal disorders running in the family, could be the presence of emotional factors [34] that push a biologically related donor towards the decision to donate more strongly than an unrelated donor. It is also possible that the background risk of ESRD depends on several conditions, including family habits or life in a particular socio-economic area, which together may affect the eGFR trajectory more significantly than in unrelated donors.

The retrospective nature of this study limited our analysis to those patients for whom we had complete data during the follow-up period. The difficulty of lifelong follow-up in living kidney donors is possibly related to the fact that they are healthy individuals and therefore reluctant to attend for tests after the first year post-donation. Another possible explanation is that, as the years go by, donors move around the country and there is no continuity in their follow-up.

#### **5. Conclusions**

The present study demonstrated that the highest rate of decline in eGFR after living kidney donation occurred at 6 months, with female sex, non-Caucasian ethnicity, and age below 60 years independently associated with greater recovery of kidney function. At 60 months of follow-up, older age, genetic relationship with the recipient, and male sex were associated with lesser recovery of eGFR, with male sex and older age confirmed on logistic regression analysis. BMI was not related to kidney function before or after living kidney donation.

**Supplementary Materials:** The following are available online at http://www.mdpi.com/2077-0383/8/6/883/s1, Table S1: Mean eGFR during follow up and percentage difference between mean eGFR during follow up and eGFR at donation (Δ eGFR).

**Author Contributions:** Conceptualisation, M.I.B. and V.P.; Data curation, S.C. and I.S.; Formal analysis, M.I.B.; Investigation, M.I.B., S.C., F.J.M.F.D., and V.P.; Methodology, M.I.B., F.J.M.F.D., and V.P.; Project administration, M.I.B., S.C., I.S., and V.P.; Writing – original draft, M.I.B., F.J.M.F.D., and V.P.

**Conflicts of Interest:** The authors declare no conflict of interest.

**Meeting Presentation:** This work has been presented at the Association of Surgeons of Great Britain and Ireland meeting held in Telford, UK on the 7th–9th May 2019.

#### **References**


© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

#### *Article*

### **Outcomes of Kidney Transplant Patients with Atypical Hemolytic Uremic Syndrome Treated with Eculizumab: A Systematic Review and Meta-Analysis**

**Maria L. Gonzalez Suarez 1,\*, Charat Thongprayoon 2, Michael A. Mao 3, Napat Leeaphorn 4, Tarun Bathini <sup>5</sup> and Wisit Cheungpasitporn <sup>1</sup>**


Received: 19 May 2019; Accepted: 26 June 2019; Published: 27 June 2019

**Abstract:** Background: Kidney transplantation in patients with atypical hemolytic uremic syndrome (aHUS) is frequently complicated by recurrence, resulting in thrombotic microangiopathy in the renal allograft and graft loss. We aimed to assess the use of eculizumab in the prevention and treatment of aHUS recurrence after kidney transplantation. Methods: Databases (MEDLINE, EMBASE, and the Cochrane Database) were searched through February 2019. Studies that reported outcomes of adult kidney transplant recipients with aHUS treated with eculizumab were included. Estimated incidence rates from the individual studies were extracted and combined using the random-effects, generic inverse variance method of DerSimonian and Laird. The protocol for this systematic review was registered with PROSPERO (International Prospective Register of Systematic Reviews; no. CRD42018089438). Results: Eighteen studies (13 cohort studies and five case series) consisting of 380 adult kidney transplant patients with aHUS who received eculizumab for prevention or treatment of post-transplant aHUS recurrence were included in the analysis. Among patients who received prophylactic eculizumab, the pooled estimated incidence rates of recurrent thrombotic microangiopathy (TMA) after transplantation and of allograft loss due to TMA were 6.3% (95%CI: 2.8–13.4%, *I* <sup>2</sup> = 0%) and 5.5% (95%CI: 2.9–10.0%, *I* <sup>2</sup> = 0%), respectively. Among those who received eculizumab for treatment of post-transplant aHUS recurrence, the pooled estimated rate of allograft loss due to TMA was 22.5% (95%CI: 13.6–34.8%, *I* <sup>2</sup> = 6%). When the meta-analysis was restricted to cohort studies with data on genetic mutations associated with aHUS, the pooled estimated incidence of allograft loss due to TMA was 22.6% (95%CI: 13.2–36.0%, *I* <sup>2</sup> = 10%). We found no significant publication bias as assessed by funnel plots and Egger's regression asymmetry test (*p* > 0.05 for all analyses).
Conclusions: This study summarizes the outcomes observed with use of eculizumab for prevention and treatment of aHUS recurrence in kidney transplantation. Our results suggest a possible role for anti-C5 antibody therapy in the prevention and management of recurrent aHUS.

**Keywords:** atypical hemolytic uremic syndrome; eculizumab; kidney transplantation; renal transplantation; meta-analysis

#### **1. Introduction**

Atypical hemolytic uremic syndrome (aHUS) is a microvascular occlusive disorder characterized by hemolytic anemia, thrombocytopenia, and acute kidney injury that is not associated with Shiga toxin-producing *Escherichia coli* (STEC) or ADAMTS13 deficiency. Instead, it is typically associated with dysregulation of the alternative complement pathway [1,2]. Thrombotic microangiopathy (TMA) is the pathological lesion seen with aHUS and represents a response to endothelial injury [3]. About 10% of pediatric hemolytic uremic syndrome (HUS) cases and the majority of adult cases are due to atypical HUS [4].

Kidney transplantation in patients with aHUS has been linked to poor outcomes due to high recurrence rate and graft failure. Approximately 50% of patients with aHUS develop end stage renal disease (ESRD) with a high risk of recurrence after kidney transplantation [5]. Roughly 60–70% of aHUS patients have mutations in factors of the complement system or antibodies directed against complement factor H (*CFH*) [6]. In some cases, aHUS recurrence is noted early after transplantation, while other cases may be at lower risk of recurrence [7]. The risk of aHUS recurrence after transplant is higher in patients with gene mutations that encode circulating complement factors (*C3, CFH,* complement factor *I* (*CFI*)) when compared to patients with gene mutations that encode solid phase proteins such as CD46 [6,8,9]. Patients with no prior history of aHUS may also present with de novo aHUS after kidney transplantation.

Therapies described for the management of aHUS in the post-transplant period include plasma exchange (PLEX); rituximab (used for anti-factor H autoantibodies, as it helps maintain low antibody levels and thereby prevents aHUS recurrence after transplant) [10]; simultaneous liver–kidney transplantation for *CFH* mutations; and eculizumab (a humanized monoclonal antibody directed against complement protein C5 that thereby inhibits terminal complement activation) [11–13]. Prior to the use of eculizumab, patients with mutations in *CFH*, *CFH*-*CFHR1*/*3*, *CFI*, *C3*, and *CFB* had a 50% risk of progression to ESRD or death at onset of recurrent aHUS during the first year, and this risk increased to 75% after 3–5 years [14].

The KDIGO workgroup recommends the prophylactic use of eculizumab in kidney transplant patients at high risk of recurrence based on their genetic mutations [14]. Whether there is an advantage of preemptive use of eculizumab in all patients with a known pretransplant history of aHUS is currently unclear. In addition, eculizumab use is associated with an increased risk of infection due to terminal complement blockade such as meningococcal infections [15,16]. In this study, we aimed to assess the use of eculizumab in the prevention and treatment of aHUS recurrence after kidney transplantation.

#### **2. Methods**

#### *2.1. Search Strategy and Literature Review*

The protocol for the systematic review has been registered in PROSPERO (registration number: CRD42018089438; http://www.crd.york.ac.uk/PROSPERO). A systematic literature review of EMBASE (1988 to February 2019), MEDLINE (1946 to February 2019), and the Cochrane Database of Systematic Reviews (CDSR) (database inception to February 2019) was performed to assess the use of eculizumab in the prevention and treatment of aHUS recurrence after kidney transplantation. The systematic literature search was undertaken independently by two investigators (M.G.S. and C.T.) using a search approach that incorporated the terms "kidney" OR "renal" AND "transplant" OR "transplantation" AND "eculizumab". The search strategy is provided in online Supplementary Data 1. No language restriction was applied. A manual search of the references of the included articles for potentially pertinent studies was also performed. This study was conducted in accordance with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) statement [17].

#### *2.2. Selection Criteria*

Eligible studies had to (1) be clinical trials or observational studies (cohort, case-series, or cross-sectional studies) that reported use of eculizumab in the prevention and treatment of aHUS recurrence after kidney transplantation; (2) include adult (age ≥ 18 years) kidney transplant recipients; and (3) provide data on the outcomes of interest, including rates of aHUS recurrence and allograft loss among patients who received prophylactic eculizumab and rates of allograft loss among patients who received eculizumab for treatment of post-transplant aHUS recurrence. The eculizumab treatment group included post-transplant patients with de novo or recurrent aHUS. We excluded case reports and studies with single cases treated with eculizumab. Retrieved studies were independently reviewed for eligibility by two authors (M.L.G.S. and C.T.). Discrepancies were discussed and resolved by a third author (W.C.) and common consensus. Inclusion was not restricted by study size. The Newcastle-Ottawa quality assessment scale [18] was used to appraise the quality of observational studies, and the Cochrane risk-of-bias tool was used for clinical trials [19]. Detailed evaluation of each study is presented in online Supplementary Tables S1 and S2.

#### *2.3. Data Abstraction*

A structured information collecting form was used to obtain the following information from each study including title, name of the first author, publication year, country where the study was conducted, demographic data of kidney transplant patients, history of previous kidney transplantation, type of donor, genetic mutations associated with aHUS, eculizumab regimen, use of PLEX, and outcomes following kidney transplantation (rates of aHUS recurrence and allograft loss among patients who received prophylactic eculizumab and rates of allograft loss among patients who received eculizumab treatment for post-transplant aHUS recurrence).

#### *2.4. Statistical Analysis*

Analyses were conducted using Comprehensive Meta-Analysis software (version 3.3; Biostat Inc., Englewood, NJ, USA). We estimated the pooled incidence rates of recurrent TMA, all-cause allograft loss, and allograft loss due to TMA by the generic inverse variance approach of DerSimonian and Laird, which assigns each study a weight based on its variance [20]. Because of the possibility of between-study variance, we used a random-effects model rather than a fixed-effect model. Sensitivity analysis was performed with restriction to cohort studies with data on genetic mutations associated with aHUS. We used Cochran's Q test and the *I* <sup>2</sup> statistic to assess between-study heterogeneity. An *I* <sup>2</sup> value of 0% to 25% represents insignificant heterogeneity, 26% to 50% low heterogeneity, 51% to 75% moderate heterogeneity, and 76% to 100% high heterogeneity [21]. The presence of publication bias was assessed by both subjective inspection of funnel plots and the Egger test [22]. The raw data for this systematic review are publicly available through the Open Science Framework (URL: osf.io/2pz4k).
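The pooling procedure described above can be sketched in a few lines. The function below is an illustrative, simplified implementation of the DerSimonian and Laird random-effects estimator together with Cochran's Q and the *I* <sup>2</sup> statistic; it is not the Comprehensive Meta-Analysis software used in the study, and it assumes each study's effect estimate (e.g. a transformed incidence proportion) and its within-study variance are already available.

```python
def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling (illustrative sketch).

    effects:   per-study effect estimates (e.g. transformed proportions)
    variances: their within-study variances
    Returns (pooled_effect, pooled_variance, I_squared_percent).
    """
    k = len(effects)
    w = [1.0 / v for v in variances]                       # fixed-effect (inverse-variance) weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q statistic for between-study heterogeneity
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = k - 1
    # DerSimonian-Laird moment estimate of between-study variance tau^2
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights incorporate tau^2
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    var_pooled = 1.0 / sum(w_re)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0    # I^2 in percent
    return pooled, var_pooled, i2
```

When the studies agree exactly, Q = 0, τ² = 0, and *I* <sup>2</sup> = 0%, so the random-effects result collapses to the fixed-effect result, which matches the 0% heterogeneity reported for the prophylaxis analyses.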

#### **3. Results**

A total of 1096 potentially eligible articles were identified using our search strategy. After exclusion of 888 articles based on title and abstract (clearly not fulfilling inclusion criteria with respect to article type, study design, population, or outcomes of interest) and of 172 duplicate articles, 36 full-length articles were reviewed. Fifteen of these were subsequently excluded on full-length review because they were also duplicates [23–28], did not fulfill inclusion criteria [29–33], did not report the outcomes of interest, or their data could not be abstracted [34–37]. An additional three articles were excluded because they were case reports or reported single cases treated with eculizumab [8,38,39]. Ultimately, 18 studies (13 cohort studies and five case series) [5,9,15,40–54] consisting of 380 patients with a history of aHUS who received eculizumab for prevention or treatment of post-transplant recurrent aHUS were included in the final analysis. The literature review and selection process are shown in Figure 1. The characteristics of the included studies are presented in Tables 1 and 2.



donor; LR: Living related donor; AMR: Antibody mediated rejection; CMR: Chronic antibody mediated rejection; PLEX: Plasma exchange; TMA: Thrombotic microangiopathy.



*J. Clin. Med.* **2019**, *8*, 919

Ecu: Eculizumab; cont.: Continued; D: Day; DD: Deceased donor; LD: Living donor; LU: Living unrelated donor; AMR: Antibody mediated rejection; ACR: Acute cellular rejection.

**Figure 1.** Preferred Reporting Items for Systematic Reviews and Meta-Analysis (PRISMA) flow diagram.

The baseline characteristics of kidney transplant patients and data on genetic mutations associated with aHUS are summarized in Table 3. There were 192 patients in the treatment group and 188 patients in the prophylaxis group, with median ages of 38 and 32.3 years, respectively. Females predominated in both groups (59.4% and 84.4%, respectively). The most commonly reported genetic mutation in our study population was *CFH*, followed by *CFI* and *C3* in both groups.


**Table 3.** Baseline characteristics of kidney transplant patients included in the meta-analysis \*.

\* Percentages are calculated from total available data. *n* = number.

A history of aHUS prior to kidney transplantation was known in all patients included in this systematic review, except for de novo cases. The triggers for post-transplant aHUS were not reported in the majority of patients. The most commonly reported potential triggers were viral infections, urosepsis, *C. difficile* infection, tacrolimus use, and everolimus use. De novo aHUS was reported in 56 patients [15,42,47,49]. Tacrolimus was identified as a possible trigger in one of the de novo cases, in which aHUS presented 16 years after transplant without any identified genetic mutation [49], while it was suspected in four other cases that had antibody-mediated rejection and whose immunosuppression was switched from tacrolimus to sirolimus [42,49]. Urosepsis was reported as a trigger in one de novo case [47].

#### *3.1. Use of Eculizumab in the Prevention of aHUS Recurrence after Kidney Transplantation*

Data on the initiation of prophylactic eculizumab therapy are summarized in Table 3. Overall, among patients who received prophylactic eculizumab, the pooled estimated rates of aHUS recurrence and allograft loss due to TMA were 6.3% (95%CI: 2.8–13.4%, *I* <sup>2</sup> = 0%, Figure 2A) and 5.5% (95%CI: 2.9–10.0%, *I* <sup>2</sup> = 0%, Figure 2B), respectively.

**Figure 2.** Incidence of (**A**) aHUS recurrence (recurrent TMA) and (**B**) allograft loss due to TMA after kidney transplantation with prophylactic eculizumab.

Sensitivity analysis excluding studies with potentially duplicated patients showed pooled estimated rates of aHUS recurrence and allograft loss due to TMA of 5.5% (95%CI: 1.9–14.6%, *I* <sup>2</sup> = 0%, Figure S1) and 5.3% (95%CI: 2.4–11.3%, *I* <sup>2</sup> = 0%, Figure S2), respectively. When the analysis was limited to studies with a mean follow-up time >12 months (range, 21 to 35 months), we found pooled estimated rates of aHUS recurrence and allograft loss due to TMA of 4.6% (95%CI: 1.5–13.3%, *I* <sup>2</sup> = 0%) and 4.8% (95%CI: 2.1–10.7%, *I* <sup>2</sup> = 0%), respectively.

#### *3.2. Use of Therapeutic Eculizumab for aHUS Recurrence after Kidney Transplantation*

Data on the initiation of eculizumab treatment for post-transplant aHUS recurrence are summarized in Table 3. Among those who received eculizumab for treatment of post-transplant aHUS recurrence, the pooled estimated rate of allograft loss due to all causes was 24.4% (95%CI: 15.9–35.6%, *I* <sup>2</sup> = 23%, Figure 3A). The pooled estimated rate of allograft loss due to TMA was 22.5% (95%CI: 13.6–34.8%, *I* <sup>2</sup> = 6%, Figure 3B). Meta-regression analysis demonstrated no significant correlation between concomitant use of PLEX and rates of allograft loss due to all causes or due to TMA (*p* = 0.61 and 0.18, respectively).


**Figure 3.** Incidence of (**A**) allograft loss due to all causes and (**B**) allograft loss due to TMA after kidney transplantation with therapeutic eculizumab.

Sensitivity analysis excluding studies that potentially included duplicate patients showed a pooled estimated rate of allograft loss due to all causes of 26.7% (95%CI: 13.0–46.9%, *I* <sup>2</sup> = 33%, Figure S3) and a pooled estimated rate of allograft loss due to TMA of 20.0% (95%CI: 9.1–38.4%, *I* <sup>2</sup> = 29%, Figure S4). A further sensitivity analysis was restricted to cohort studies with data on genetic mutations associated with aHUS; the pooled estimated incidence of allograft loss due to TMA in these studies was 22.6% (95%CI: 13.2–36.0%, *I* <sup>2</sup> = 10%, Figure 4).


**Figure 4.** Forest plot evaluating the incidence of allograft loss due to TMA in cohort studies with data on genetic mutations.

Subgroup analysis stratified by mean follow-up time was performed. The pooled estimated rate of allograft loss due to all causes was 25.5% (95%CI: 14.8–40.3%, *I* <sup>2</sup> = 34%) at a mean follow-up of 12 to 24 months and 26.0% (95%CI: 3.9–75.1%, *I* <sup>2</sup> = 13%) at a mean follow-up of 60 to 72 months. The pooled estimated rate of allograft loss due to TMA was 22.6% (95%CI: 12.1–38.1%, *I* <sup>2</sup> = 8%) at a mean follow-up of 12 to 24 months and 26.0% (95%CI: 3.9–75.1%, *I* <sup>2</sup> = 13%) at a mean follow-up of 60 to 72 months.

#### *3.3. Evaluation for Publication Bias*

We found no significant publication bias as assessed by the funnel plots (Figure 5) and Egger's regression asymmetry test for the rates of aHUS recurrence and allograft loss due to TMA in patients treated with prophylactic eculizumab therapy (*p* = 0.48 and 0.28, respectively), nor for rates of allograft loss due to all causes and allograft loss due to TMA in patients treated with therapeutic eculizumab for aHUS recurrence after kidney transplantation (*p* = 0.78 and 0.20, respectively).
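Egger's regression asymmetry test has a simple form: regress each study's standardized effect on its precision and examine the intercept. The sketch below is a minimal, hypothetical illustration of that idea, not the Comprehensive Meta-Analysis implementation used for the *p*-values reported here; the significance test on the intercept is omitted, as it additionally requires the intercept's standard error and a *t*-distribution.

```python
def egger_intercept(effects, std_errors):
    """Egger's regression asymmetry test (intercept and slope only).

    Regresses the standardized effect (effect / SE) on precision (1 / SE);
    an intercept far from zero suggests funnel-plot asymmetry, i.e.
    possible publication bias. Significance testing is omitted here.
    """
    y = [e / s for e, s in zip(effects, std_errors)]   # standardized effects
    x = [1.0 / s for s in std_errors]                  # precisions
    n = len(x)
    xm, ym = sum(x) / n, sum(y) / n
    sxx = sum((xi - xm) ** 2 for xi in x)              # ordinary least squares
    sxy = sum((xi - xm) * (yi - ym) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = ym - slope * xm
    return intercept, slope
```

For a perfectly symmetric set of studies sharing one true effect, the points fall on a line through the origin and the intercept is zero; departures from zero grow as small, imprecise studies report systematically larger effects.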

**Figure 5.** Funnel plots evaluating for publication bias for (**A**) the incidence of aHUS recurrence after kidney transplantation among patients who received prophylactic eculizumab; (**B**) the incidence of allograft loss due to TMA after kidney transplantation among patients who received prophylactic eculizumab; (**C**) the incidence of allograft loss due to all causes after kidney transplantation among patients with therapeutic eculizumab for aHUS recurrence; (**D**) the incidence of allograft loss due to TMA after kidney transplantation among patients with therapeutic eculizumab for aHUS recurrence.

#### **4. Discussion**

Atypical HUS recurrence after transplant is diagnosed by the presence of laboratory abnormalities such as renal failure, microangiopathic hemolytic anemia, thrombocytopenia, and microvascular occlusion [1,48]. In many cases, aHUS recurrence is confirmed with a kidney biopsy, which shows the usual findings of glomerular intracapillary thrombosis, congestion, endothelial swelling, and thickening of the capillary wall [4].

Our systematic review showed that mutations in the gene encoding *CFH*, followed by mutations in *CFI*, were the most commonly reported in patients with aHUS recurrence after transplant. This observation, however, is limited to the studies included in our analysis, which comprised only kidney transplant patients treated with eculizumab and not patients treated with alternative therapies. In our systematic review, mutations in *CFH* and *CFH*/*CFHR1* were more commonly associated with graft loss in patients who had received eculizumab therapy. The current literature describes the presence of complement factor genetic mutations in approximately 30% of patients who present with de novo HUS after kidney transplantation [6]. However, no mutations were identified in four patients who presented with de novo aHUS and had graft loss [47,49].

We found that in most pretransplant patients with a history of aHUS who received eculizumab prophylaxis, therapy was started on the day of kidney transplant surgery (data abstracted from Mallet et al. [47] and Manani et al. [49] were not utilized for this subgroup analysis, as only one case in each publication was prophylactically managed with eculizumab). Less than 6% of the patients who received eculizumab prophylaxis presented with aHUS recurrence, and recurrence was more likely to occur when eculizumab was discontinued. Yelken et al. reported their clinical experience in a retrospective cohort of seven patients with a prior history of aHUS who received eculizumab prophylaxis and underwent kidney transplantation; none of them presented with aHUS recurrence or graft loss during the 28-month follow-up period [52]. More recently, a global registry for aHUS enrolled 1549 patients over a 5-year period, of whom 188 underwent kidney transplantation. Of those, 88 patients received eculizumab before or during transplant surgery. This group showed significantly better graft function than patients who received eculizumab for aHUS recurrence or de novo aHUS in the post-transplant period [15].

The time to recurrence of aHUS after transplant in the eculizumab treatment group varied, with recurrence seen as early as 3 days and as late as 6 years (median of 2 months). Our study shows that allograft outcomes of patients treated after early recurrence (3 days to 3 months post-transplant) [25,28,42,44,49] were similar to those of patients treated for late recurrence (29–96 months post-transplant) [15,47,50]. The effect of timing of therapeutic eculizumab is also unclear; some studies found that early eculizumab therapy for aHUS recurrence failed to show better allograft survival [9,47,48]. Future prospective randomized controlled trials would help elucidate whether early initiation of therapy has a beneficial effect on allograft survival.

In many cases, management of aHUS recurrence in kidney transplantation includes the use of PLEX. Le Quintrec et al. previously reported a trend towards decreased aHUS recurrence in patients treated preemptively with PLEX [44]. Zuber et al.'s retrospective cohort subsequently compared PLEX versus eculizumab in both therapeutic and prophylactic modalities; they found that eculizumab was superior to PLEX in preventing and treating aHUS recurrence after kidney transplantation [51]. Other studies have shown no difference in graft survival regardless of whether patients received PLEX [44,51]. More recently, Favi et al. reported a case-control study in which patients with and without PLEX were compared with eculizumab prophylaxis alone. This is the only study we encountered in which a comparison group was used. Patients who did not receive eculizumab (whether or not they received prophylaxis with PLEX) were more likely to experience recurrence, graft loss, and allograft rejection than patients who received eculizumab [54]. Our systematic review showed that 36.2% of individuals required eculizumab after PLEX therapy, while 23.1% received concomitant PLEX in the prophylaxis group; no data are available regarding the use of PLEX in the remaining cases. Patients who received PLEX plus eculizumab had no significant difference in allograft survival and aHUS recurrence compared to those without PLEX (*p* = 0.65). However, it remains difficult to draw conclusions about the effects of concomitant PLEX and eculizumab use on post-transplant aHUS recurrence and associated graft loss, given the small number of patients analyzed and the possibility that these patients had more severe risk markers that prompted dual therapy. PLEX is still considered an option in the management of aHUS recurrence [15].

The duration of eculizumab therapy remains controversial and is so far based on expert opinion, as there is no strong evidence to support lifelong therapy. We found that most patients who were prophylactically treated had continued eculizumab up to the time of their respective studies' publication. One patient, in whom no complement mutation was identified, stopped prophylactic eculizumab after 28.7 months; no aHUS recurrence was identified after 9 months of follow-up [46]. The median duration of therapy was 18.9 months in the treatment group and 21 months in the prophylaxis group. While aHUS recurrence after eculizumab cessation was reported in 5.2% of patients, it is difficult to determine whether other recurrent aHUS cases were missed owing to the limited length of follow-up after eculizumab discontinuation.

Acute allograft rejection rates were similar in the treatment and prophylaxis groups. Despite receiving eculizumab treatment for post-transplant aHUS recurrence, 20.3% of patients experienced graft loss, compared to 7.9% of patients who received eculizumab prophylaxis. While graft loss in the treatment group was likely associated with recurrence of aHUS, in the prophylaxis group two cases of kidney allograft loss were associated with intestinal hemorrhage (one case immediately after transplant [45], the other presenting four months after transplant [47]). Whether there was an association between acute arterial allograft thrombosis and aHUS remains unclear.

We acknowledge the limitations inherent in this study. Firstly, this systematic review included only observational studies (cohorts and case series), and the majority of available studies lacked a comparison (control) group; only one of the included studies had a comparison group that did not receive eculizumab [54]. Recently, Duineveld et al. described 17 patients with aHUS who underwent living donor kidney transplantation without prophylactic eculizumab, with a median follow-up of 25 months after kidney transplantation [8]. Their institution's protocol emphasized lower target tacrolimus levels and blood pressure control in these patients. Only one patient developed aHUS recurrence, at 68 days after kidney transplantation, and was treated successfully with eculizumab. The investigators suggested that living donor kidney transplantation in aHUS without prophylactic eculizumab appears feasible [8]. Secondly, inconsistencies in the reporting of certain variables, such as type of donor, history of previous transplantation, genetic mutations, and immunosuppression regimen, make it difficult to draw firm associations between eculizumab prophylaxis or treatment and renal graft outcomes; caution should be exercised in interpreting these data. Lastly, this analysis was conducted on a highly selected study population. Only studies in which eculizumab was used were analyzed, and individuals who received an alternative treatment course or did not receive eculizumab were excluded. Therefore, the frequency of mutations might not be representative. Although funnel plots and Egger's test of event rates demonstrated no statistical significance, a low risk of publication bias cannot be inferred given the lack of comparison groups. Thus, future studies assessing responsiveness to eculizumab according to aHUS genetic mutation risk categories (low vs. moderate vs. high risk of recurrence) [6] are needed.

In summary, our study describes the outcomes observed with the use of eculizumab for prevention and treatment of aHUS recurrence in kidney transplantation. Our results suggest a possible advantageous role for anti-C5 antibody therapy in the prevention and management of recurrent aHUS. Future prospective studies and clinical trials are needed to evaluate the efficacy of eculizumab, the timing of initiation, and the duration of prophylaxis and treatment according to genetic mutations.

**Supplementary Materials:** The supplementary materials are available online at http://www.mdpi.com/2077-0383/ 8/7/919/s1.

**Author Contributions:** Conceptualization, M.L.G.S., C.T. and W.C.; Data curation, M.L.G.S., C.T. and W.C.; Formal analysis, W.C.; Investigation, M.L.G.S. and C.T.; Methodology, M.L.G.S. and W.C.; Project administration, N.L. and T.B.; Resources, M.A.M., N.L. and T.B.; Software, T.B. and W.C.; Supervision, M.A.M., N.L. and W.C.; Validation, M.L.G.S., C.T. and W.C.; Visualization, M.A.M. and W.C.; Writing, original draft, M.L.G.S.; Writing, review & editing, M.L.G.S., C.T., M.A.M. and W.C.

**Acknowledgments:** All authors had access to the data and played essential roles in writing of the manuscript.

**Conflicts of Interest:** The authors deny any conflicts of interest.

#### **References**


© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

*Review*

### **Timing of Ureteric Stent Removal and Occurrence of Urological Complications after Kidney Transplantation: A Systematic Review and Meta-Analysis**

### **Isis J. Visser 1, Jasper P. T. van der Staaij 1, Anand Muthusamy 1,2, Michelle Willicombe 1, Jeffrey A. Lafranca <sup>1</sup> and Frank J. M. F. Dor 1,2,\***


Received: 23 April 2019; Accepted: 13 May 2019; Published: 16 May 2019

**Abstract:** Implanting a ureteric stent during ureteroneocystostomy reduces the risk of leakage and ureteral stenosis after kidney transplantation (KTx), but it may also predispose to urinary tract infections (UTIs). The aim of this study is to determine the optimal timing for ureteric stent removal after KTx. Searches were performed in EMBASE, MEDLINE Ovid, Cochrane CENTRAL, Web of Science, and Google Scholar (until November 2017). For this systematic review, all aspects of the Cochrane Handbook for Systematic Reviews of Interventions were followed, and it was written based on the PRISMA statement. Articles discussing JJ-stents (double-J stents) and their time of removal in relation to outcomes, UTIs, urinary leakage, ureteral stenosis, or reintervention were included. One thousand and forty-three articles were identified, of which fourteen (three randomised controlled trials, nine retrospective cohort studies, and two prospective cohort studies) were included, describing a total of *n* = 3612 patients. Meta-analysis using random-effects models showed a significant reduction of UTIs when stents were removed earlier than three weeks (OR 0.49, CI 95%, 0.33 to 0.75, *p* = 0.0009). Regarding the incidence of urinary leakage, there was no significant difference between early (<3 weeks) and late stent removal (>3 weeks) (OR 0.60, CI 95%, 0.29 to 1.23, *p* = 0.16). Based on our results, earlier stent removal (<3 weeks) was associated with a decreased incidence of UTIs and did not show a higher incidence of urinary leakage compared to later removal (>3 weeks). We recommend that routine removal of ureteric stents implanted during KTx be performed around three weeks post-operatively.

**Keywords:** kidney transplantation; urological complications; ureteric stent; urinary tract infection; timing of removal

#### **1. Introduction**

Kidney transplantation (KTx) is considered the best option for end-stage renal disease (ESRD) management [1,2]. Kidney transplantation increases the life expectancy and quality of life of ESRD patients significantly compared to dialysis [3]. However, KTx is not without peri-operative complication risks; urinary tract infections (UTIs), urinary leakage, and ureteral stenosis are the most frequently seen urological complications. These complications are likely to compromise graft function [4–7]. In order to minimize leakage and stenosis, in general, a stent is inserted in the ureter during implantation. Ureteric stents decrease the risk of these urological complications by five- to ten-fold [6,8,9]. Most centres use a variation of a JJ-stent (double-J stent), with the typical pigtail end preventing stent migration by positioning one end in the pyelum. The other end remains in the bladder after the ureteroneocystostomy (UNC) is created. The stent can easily be removed by flexible cystoscopy later. Various randomised controlled trials demonstrate that JJ-stenting reduces urinary leakage and ureteral stenosis [5,10–12]. Meta-analyses by Wilson and Mangus [6,9] also confirmed these results. Although the use of a JJ-stent reduces the risk of urinary leakage and ureteral stenosis, it may also predispose to UTIs [12–16]. In the existing literature, there is no consensus about the preferred time of stent removal. The European Association of Urology states in its renal transplantation guideline that stent retention for longer than 30 days is associated with an increased risk of UTIs (6% versus 40%) [12,17]. Therefore, the guideline advises stents to be removed earlier than six weeks post-transplant (which is the protocol in most transplant centres) rather than later. Accordingly, in the last two years, studies have started to investigate different timings of stent removal within one month post-transplant [8,18–23]. Recently, Thompson et al. [24] published a review regarding this topic; however, the authors included fewer studies and did not provide a robust conclusion about the timing of stent removal, as their focus was more on the difference between per-urethral and bladder indwelling stents.

The aim of this systematic review is to give a comprehensive overview of currently available literature and to investigate if meta-analysis can elucidate whether a more definite timing for stent removal can be determined.

#### **2. Methods**

All aspects of the Cochrane Handbook for Interventional Systematic Reviews were followed, and the study was written according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement [25,26]. Details of the protocol for this systematic review were registered on PROSPERO (ID: CRD42018079867) and can be accessed online [27].

#### *2.1. Literature Search Strategy*

A literature search for all articles regarding JJ-stenting after KTx was performed in EMBASE, MEDLINE Ovid, CENTRAL (the Cochrane Library 2017, issue 11), Web of Science, and Google Scholar. The search was performed for articles published up to November 2017 relevant to outcomes of ureteric stent placement and timings of removal. The search strings for each respective database are attached in Appendix A. Reference lists of the identified relevant articles were manually scrutinized to ensure that no articles were missed.

#### *2.2. Literature Screening*

Study selection was performed independently by two authors (J.P.T.v.d.S. and I.J.V.). Study inclusion was carried out in two phases: after an initial title and abstract selection, full articles of the abstracts regarded as potentially eligible were retrieved and underwent complete review and assessment until a final inclusion was made. When a discrepancy in inclusion between the two authors occurred, articles were discussed with two senior authors (J.A.L., F.J.M.F.D.) in order to reach a consensus.

#### *2.3. Data Extraction*

Studies were assessed for timing of stent removal and the incidence of UTIs, urinary leakage, ureteral stenosis, and reintervention. Other parameters that were assessed were donor type, mean recipient age, the type of stent, technique of UNC, technique of stent removal, immunosuppressive therapy and antibiotic prophylaxis regimen. When the type of stent was not specified in a particular article, we reached out to the authors. If the authors did not respond, we noted the stent type as "unspecified" but did not mark it as an exclusion criterion.

Studies were only included if they indicated time of stent removal and at least one of the following outcome parameters: UTI, urinary leakage, and/or ureteral stenosis. If a study stated an outcome as "major urological complication" (MUC) and we were not able to define whether this included stenosis or leakage, we analysed this outcome parameter separately as MUC.

To define UTI, we used the Guideline for Urological Infections of The European Association of Urology [28]. It states that a positive urine culture with a bacterial colony count of more than 10<sup>5</sup> colony-forming units per mL urine is defined as a UTI. However, if patients had lower counts of colony-forming units per mL urine but were reported to have symptoms of a UTI, we chose not to exclude them. When authors of the articles did not define a UTI, we assumed that the official definition was used.
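Purely as an illustration, the inclusion rule above can be written as a small decision function; the name `counts_as_uti` and the exact handling of symptomatic patients with lower counts are our own sketch of the review's stated criteria, not code from the guideline.

```python
def counts_as_uti(cfu_per_ml: float, symptomatic: bool = False) -> bool:
    """Sketch of the UTI definition used in this review:
    a positive culture with more than 10^5 colony-forming units per mL urine,
    or a lower count in a patient reported to have UTI symptoms."""
    if cfu_per_ml > 1e5:  # guideline threshold for a positive culture
        return True
    # Symptomatic patients with lower counts were not excluded (assumption:
    # any positive culture with symptoms was counted).
    return symptomatic and cfu_per_ml > 0
```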

We did not find an official guideline defining urinary leakage and ureteral stenosis. However, Dominguez et al. [10] stated the following definitions for urinary leakage and ureteral stenosis: leakage is defined as drainage or accumulation of perirenal fluid with characteristics of urine and ureteral stenosis is defined as impairment of adequate kidney drainage demonstrated at ultrasound (US) or intravenous pyelogram. When authors of the articles did not define urinary leakage and/or ureteral stenosis, we assumed the abovementioned definitions.

#### *2.4. Critical Appraisal*

The level of evidence of each selected paper was established using the GRADE tool [29]. The GRADE approach defines the quality of a body of evidence by consideration of within study risk of bias (methodological quality), directness of evidence, heterogeneity, precision of effect estimates, and risk of publication bias.

#### *2.5. Statistical Analysis*

Articles were assessed as to whether they were suitable for quantitative analysis. If articles compared two or more groups with different timings of stent removal, and if those groups could be divided into early and late stent removal with a cut-off at three weeks, they were included in the meta-analysis.

Review Manager Software (RevMan 5.3; The Nordic Cochrane Centre, Copenhagen, Denmark) was used for meta-analysis [30]. Each study was weighted by sample size. Heterogeneity of time of stent removal effects between studies was tested using the Q (heterogeneity χ2) and the *I* <sup>2</sup> statistics. A random effects model was used for calculating the summary estimates (odds ratio (OR)) and 95% confidence intervals (CIs) to account for possible heterogeneity. Overall, the effects were determined using a Z-test. In addition, sensitivity analyses were performed to examine whether removing a particular study would significantly change the results and were presented in funnel plots.
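As a rough illustration of the pooling described above, the sketch below computes a random-effects pooled odds ratio with Cochran's Q and *I*<sup>2</sup>, assuming the DerSimonian–Laird estimator of between-study variance (a common default for this kind of analysis). This is our own minimal reimplementation with invented example data, not the RevMan analysis or the study data, and the function name is ours.

```python
import math

def pooled_or_random_effects(studies):
    """DerSimonian-Laird random-effects pooling of odds ratios.

    studies: list of 2x2 tables (a, b, c, d) where
      a = events in the early-removal group, b = non-events in that group,
      c = events in the late-removal group,  d = non-events in that group.
    Returns (pooled OR, 95% CI lower, 95% CI upper, I^2 in %).
    """
    # Per-study log odds ratios and variances
    # (0.5 continuity correction applied throughout, a simplification).
    y, v = [], []
    for a, b, c, d in studies:
        a, b, c, d = (x + 0.5 for x in (a, b, c, d))
        y.append(math.log((a * d) / (b * c)))
        v.append(1 / a + 1 / b + 1 / c + 1 / d)

    # Fixed-effect (inverse-variance) weights and Cochran's Q statistic.
    w = [1 / vi for vi in v]
    sw = sum(w)
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sw
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
    df = len(studies) - 1

    # Between-study variance tau^2 (DerSimonian-Laird estimator).
    c_term = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - df) / c_term)

    # Random-effects weights, pooled estimate and 95% CI.
    w_re = [1 / (vi + tau2) for vi in v]
    sw_re = sum(w_re)
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sw_re
    se = math.sqrt(1 / sw_re)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return (math.exp(y_re), math.exp(y_re - 1.96 * se),
            math.exp(y_re + 1.96 * se), i2)
```

With a handful of invented 2x2 tables in which the early-removal arm has fewer events, the function returns a pooled OR below 1 together with its confidence interval and the heterogeneity statistic, mirroring the form of the results reported in this review.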

#### **3. Results**

Of the 1043 articles identified from the initial literature search, fourteen articles were within the scope of this systematic review; three randomised controlled trials (RCTs) [20,21,31], nine retrospective cohort studies [8,13,19,22,23,32–35] and two prospective studies [36,37]. A total of 3216 patients were included, of which 2406 patients (74.8%) underwent living donor KTx (two studies did not record if they used living or deceased donors [34,35]). Table 1 presents an overview of the included studies. In Figure 1, the PRISMA flowchart diagram for systematic reviews is presented. The quality assessment of the included studies is depicted in a Summary of Findings table in Figure 2.

We could not specify the type of stent in five studies. The authors of these five papers were contacted, but unfortunately, we received no response [19,34–37]. All studies reported the age of the recipients, except for the preliminary results in the three included abstracts [34–36]. Only four studies reported that they included both adults and children [19,21,23,37]. All articles described the incidence of UTIs, and nine articles also reported urinary leakage and/or ureteral stenosis [19–21,31–33,35–37]. Table 2 presents the incidence of the different outcome parameters for each study. Regarding the UNC technique used, seven out of the fourteen articles described the use of the Lich–Gregoir technique [13,20–23,31,32]. Seven articles did not specify which surgical technique was used [8,19,33–37]. Regarding stent removal, ten articles used a cystoscopy for removal [8,20–23,32–36]. In four studies, the stent was removed along with the urinary catheter seven days after transplantation [21,34–36]. Two studies removed the stent on the seventh day post-transplant by pulling on strings that were placed around the stent at the time of transplantation [8,20]. Four studies did not describe the technique of stent removal [13,31,37,38]. Ten studies reported the use of antibiotic prophylaxis to protect against UTI after KTx, of which seven studies also reported the duration of the antibiotic prophylaxis regimen [13,20,21,23,31–33]. Six studies specified the use of co-trimoxazole as prophylaxis [13,21–23,31,32]. Four studies did not record the use of prophylactic antibiotics at all [8,36–38]. Seven studies reported which immunosuppressive regimen KTx recipients received [8,20,22,23,31–33]. An overview of all the characteristics of the included studies can be found in Table A1 in Appendix A. Four studies could not be included in the meta-analysis because their timing of stent removal was much earlier (e.g., earlier and later than four days) or much later (e.g., earlier and later than seven weeks) than the cut-off value of three weeks [18,39–41].


**Table 1.** Overview of the included studies.

RCT: Randomised controlled trial. (1) Abstract of the 17th Congress of the European Society for Organ Transplantation; (2) Abstract of the World Transplant Congress 2014; and (3) Abstract of the 16th Congress of the European Society for Organ Transplantation.

**Figure 1.** Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flowchart.


**Figure 2.** Quality assessment of the included studies.


**Table 2.** Overview of the measured outcome parameters.

\* Defined as urinary tract infection (UTI).

#### *3.1. Urinary Tract Infection*

Fourteen studies described the incidence of UTIs with different timings of stent removal and were included in the meta-analysis (a total of 3216 patients). There was a significant difference between the groups in the risk of developing UTIs, favouring early stent removal (OR 0.49, CI 95%, 0.33 to 0.75, *p* = 0.0009) (Figure 3). Sensitivity analysis showed no change in significance (Figure A1, Appendix A).


**Figure 3.** Forest plot of urinary tract infection for early (<3 weeks) versus late (>3 weeks) stent removal.

#### *3.2. Urinary Leakage*

Eight studies described the incidence of urinary leakage: three RCTs [20,21,31], one prospective study [37] and four retrospective studies [32,33,35,38]. One of these studies described zero events of urinary leakage; therefore, seven studies remained for meta-analysis, with a total of 1505 patients [21,31–33,35,37,38]. After pooling the data, there was no significant difference between groups in the risk of developing urinary leakage (OR 0.60, CI 95%, 0.29 to 1.23, *p* = 0.16) (Figure 4). Sensitivity analysis showed no change in significance (Figure A2, Appendix A).


**Figure 4.** Forest plot urinary leakage for early (<3 weeks) versus late (>3 weeks) stent removal.

#### *3.3. Ureteral Stenosis*

Five studies described the incidence of ureteral stenosis [20,21,32,33,36]. Three of these five studies reported zero incidents of ureteral stenosis in both groups [20,32,33]. Patel et al. [21] described one case of ureteral stenosis in both the early and the late stent removal group (1.2% and 0.8%, respectively). Gunawansa et al. [36] reported two cases of ureteral stenosis in the late stent removal group (1.1%). No meta-analysis was performed given the low incidence of ureteral stenosis.

Dadkhah et al. [37] and Asgari et al. [19] recorded the incidence of hydronephrosis; however, they did not describe the cause of the hydronephrosis. Dadkhah et al. [37] reported eleven cases in the early stent removal group (3.4%) versus three (2.8%) in the late group of stent removal (*p* = 0.122). Asgari et al. [19] reported, respectively, seven (11.5%) and four (13.3%) cases in the early and late group of stent removal (*p* = 0.71).

Some studies only reported MUC without defining whether this was urinary leakage or ureteral stenosis [13,23,34]. We decided to perform an additional meta-analysis of MUC. We included data from those studies and combined ureteral stenosis and urinary leakage in a single MUC category. After pooling the data, there was no significant difference between groups in the risk of developing major urological complications (OR 1.01, CI 95%, 0.45 to 2.27, *p* = 0.98) (Figure A3, Appendix A). However, we think that ureteral stenosis and urinary leakage are fundamentally different because these complications have a different pathophysiology, so we should be careful with interpretation of these combined outcome parameters.

#### *3.4. Reintervention*

Yuksel et al. [23] described the incidence of surgical reintervention because of urological complications after renal transplantation at four different timings of stent removal. There was a clear difference between early (less than three weeks) and late (more than three weeks) stent removal (6.3% versus 1.3%). Patel et al. [21] reported three cases (3.7%) of major urological complications that required surgical revision in the early (five days) versus one case (0.8%) in the late (28 days) stent removal group. Indu et al. [31] reported one case (2.0%) of urinary leakage that required percutaneous nephrostomy in the early stent removal group. Huang et al. [33] reported two cases (1.1%) of urinary leakage that required surgical revision in both the early and late stent removal groups.

Verma et al. [32] reported zero surgical reintervention after major urological complications in both early and late stent removal group (two and four weeks, respectively).

Soldano et al. [35] and Liu et al. [20] investigated surgical reimplantation of the JJ-stent; Soldano et al. [35] reported one case (2.1%) of surgical reimplantation of the stent in the late stent removal group (six weeks), whereas Liu et al. did not report any reimplantation in both the early and late stent removal groups (one and four weeks, respectively).

#### **4. Discussion**

There is good evidence that stenting the UNC at the time of KTx is beneficial to reduce major urological complications. Intuitively, transplant professionals feel that ureteric stents should not be in situ for too long to reduce the incidence of infectious complications. However, the optimal timing of stent removal remains unclear. Previous studies show a wide range in the timing of stent removal after KTx (five days until 60 days) [42–45]. It is already known that the incidence of UTIs is higher when a stent is removed later than five weeks (24.6% to 44%) [13,21,22,34,35]. A UTI after KTx is associated with graft loss, higher morbidity rates, increased risk of rejection and increased hospitalisation rates [6,7,46–48]. For this reason, studies have been performed to investigate whether earlier stent removal (e.g., around three weeks) would reduce the incidence of UTIs [8,13,19,20,22,23,31,33,36,37]. We decided to perform a meta-analysis to further investigate if we could define a more optimal timing of stent removal.

Based on the results, we demonstrated that earlier (<three weeks) stent removal shows a significantly lower incidence of UTIs compared to later removal (>three weeks). Furthermore, earlier stent removal does not appear to lead to a higher incidence of urinary leakage. Regarding ureteral stenosis and reintervention, no firm statements can be made, since we were not able to meta-analyse the results. However, overall, the incidence of ureteral stenosis and reintervention is clearly very low in kidney transplant recipients (~1% and 3%, respectively).

#### *4.1. Difficulty in Anastomosis*

We realize that the characteristics of the study populations of both donors and recipients vary and that these are factors that can influence the difficulty of the ureteral anastomosis and, therefore, the outcome after KTx. Unfortunately, only a few studies describe the donor and recipient characteristics in detail; the type of donor (living/deceased), type of deceased donor (donation after brainstem/circulatory death), pre-emptive status of the recipients, and duration of dialysis are often not given.

Almost every study recorded whether they included a living and/or deceased donor. The majority of the included studies only involved living donor KTx [19,20,22,23,31,32,36,37]; Huang et al. [33] only included deceased donor kidneys. Three studies included both deceased and living donors [8,13,21]. Two studies did not record whether they included living and/or deceased donors [34,35].

Furthermore, Patel et al. [21] reported that the majority of the deceased donors were donors after brainstem death: 81.2% in the early stent group and 64.4% in the late stent group. They recorded that in both the early and late stent removal groups, 72% of the transplant patients were dialysis dependent before KTx. In the study by Huang et al. [33], all patients were receiving dialysis before KTx; in both groups, around 95% of the transplant patients underwent haemodialysis and 5% peritoneal dialysis. In the early and late stent removal groups, patients had been on dialysis for 25.7 and 24.8 months, respectively, prior to transplantation. Coskun et al. [13] only reported that the duration of dialysis prior to transplantation varied between 1 and 168 months.

#### *4.2. Urinary Tract Infection*

The incidence of UTIs varied widely between the included studies: 0 to 53% [8,13,19–23,31–37]. Most transplant centres used a similar triple-regimen immunosuppressive therapy, consisting of calcineurin inhibitors (tacrolimus or cyclosporine), mycophenolate mofetil and corticosteroids. For all details of the immunosuppressive therapy, see Table A1 in Appendix A. We noticed that some studies described neither the immunosuppressive drugs nor the prophylactic antibiotics used. We assumed that in these cases, a calcineurin inhibitor-based triple-immunosuppressive regimen was used and antibiotic prophylaxis was prescribed. Yuksel et al. [23] attribute their low incidence of UTIs to their strict regime of antibiotic prophylaxis. In addition, Sarier et al. [22] and Wilson et al. [6] stress the importance of prophylactic co-trimoxazole to protect against UTIs after transplantation. Furthermore, they state that previous in vivo and in vitro studies have demonstrated that fluoroquinolones, aminoglycosides and beta-lactam antibiotics may be effective in preventing biofilm formation, a major problem in bacterial stent colonization [22].

Coskun et al. [13] state that UTIs should rather be treated with earlier stent removal as opposed to the prescription of antibiotics. We agree with this statement and would advise transplant centres to remove the (potential) source of UTIs earlier rather than later as the best possible way to prevent UTIs.

Some studies investigated very early stent removal, around one week post-transplantation, with promising results, specifically (and perhaps only) regarding the incidence of UTIs [20,21,23,31,34,36].

Dadkhah et al. [37] showed a remarkably high incidence of UTIs in the early stent removal group (ten days), which was two times higher than for late stent removal (30 days). Surprisingly, they concluded that removal of the ureteral stent shortly after KTx has a statistically negligible impact on the rate of UTIs. We decided to include their controversial incidence of UTIs in our meta-analysis. However, one should keep in mind that this study lacked detail, as the technique of UNC, the length of follow-up, the immunosuppressive regimen, the use of antibiotic prophylaxis and the technique of stent removal were not described.

#### *4.3. Ureteral Leakage and Ureteral Stenosis*

The included studies, with widely varying timings of stent removal, show that ureteral leakage and stenosis are complications with low incidences (0–3%). Three studies described a remarkably high incidence of urinary leakage [32,35,38]. Asgari et al. [38] reported 6.6% urinary leakage in the early stent removal group (ten days) and 13.3% in the late stent removal group (30 days). Soldano et al. [35] reported 6.3% urinary leakage in the late stent removal group (at six weeks). Verma et al. [32] reported 5.8% and 10% urinary leakage in the early (two weeks) and late (four weeks) stent removal groups. Because these remarkably high incidences were derived from (pilot) studies with small patient populations (each group containing around 50 patients), these data have to be interpreted very carefully.

Overall, the data imply that both leakage and stenosis can be successfully prevented with insertion of a stent, but that the duration of the stent being in situ does not have a great influence on the incidence of these urological complications [5,6,10–12]. Huang et al. [33] support this by concluding that stent removal at three weeks is as effective in preventing urological complications as removal at six weeks (with similar prophylactic antibiotics and immunosuppressive therapies). Furthermore, Patel et al. [21] demonstrate that very early stent removal (less than five days) results in a lower incidence of leakage and stenosis than in the un-stented population. Therefore, even when the stent is in situ for only a brief period, it already shows benefit in preventing leakage and stenosis. The first two weeks after KTx are believed to have the highest incidence of urinary leakage and ureteral stenosis [8,13,23,32]. Yuksel et al. [23] conclude that stent removal earlier than fourteen days shows a significant increase in recurrent surgical UNC intervention. In addition, Coskun et al. [13] conclude that stent removal at two weeks results in acceptable mucosal healing of the anastomosis to prevent urological complications. In order to keep the incidence of urinary leakage and ureteral stenosis as low as possible, we recommend that stents should not be removed earlier than two weeks.

#### *4.4. Additional Advantages*

In addition to a lower incidence of UTIs, early stent removal has other advantages. Instead of a cystoscopy, stents can be removed less invasively together with the removal of the urinary catheter if tied to it at the time of transplantation. This procedure is considered far more comfortable for the patient [8,20]. Additionally, early stent removal provides the opportunity to remove the stent during the same admission, which leads to a reduction in costs and fewer forgotten stents [5,21,32,33].

#### *4.5. Limitations*

A meta-analysis can only be as good as the quality of the included studies. Unfortunately, most of the included studies are retrospective cohort studies; only three RCTs were included [20,21,31]. We mention this limitation to alert the reader to interpret the data carefully. In the forest plot analysing urinary tract infections for early (<3 weeks) versus late (>3 weeks) stent removal (Figure 3), one can appreciate the relatively high heterogeneity (*I* <sup>2</sup> = 61%). The cause of this heterogeneity lies in the differences in study design; for example, not defining which type of stent was used, the technique of stent removal, the technique of ureteroneocystostomy, the use of prophylactic antibiotics or the choice of immunosuppressive therapy. The latter two aspects are particularly important factors influencing UTI rates. Furthermore, urinary leakage and ureteral stenosis were not always defined, or no uniform definition was adhered to. This leads to assumptions, which may cause inadequate comparison. Studies often do not report at what time during follow-up, especially before or after stent removal, specific complications occurred.

Recently, Thompson et al. [24] also investigated the benefits of early stent removal. They concluded that early stent removal reduces the incidence of UTIs, while it was uncertain whether there is a higher risk of MUC. Furthermore, the authors used the incidence of MUC as a primary outcome and UTIs as a secondary outcome. In our study, UTIs and MUC were chosen as the primary and secondary outcomes, respectively, because our main focus was to investigate whether early stent removal reduced the disadvantages of stenting (incidence of UTIs) without compromising its beneficial effects (preventing MUC).

Thompson et al. [24] classified early and late stent removal in a different manner; although they mention that early stent removal was defined as removal before fifteen days, they did not consistently apply this definition in their analyses. The authors copied the "early" and "late" groups from the included studies. As a consequence, no common cut-off value was analysed in their study and meta-analysis.

Furthermore, the focus of their analysis was more on the specific type of stents used (bladder indwelling stent and per-urethral stents). In our opinion, we should first focus on the relation between duration of stenting and the incidence of UTIs rather than the influence of the type of stent.

#### **5. Conclusions**

The results of this systematic review clearly point towards earlier stent removal, around three weeks, as opposed to the six weeks currently used in most transplant centres. Earlier stent removal (at or below three weeks) results in fewer UTIs without negatively affecting the anastomosis between ureter and bladder. We recommend at this stage that ureteric stents should not be removed earlier than two weeks. We would recommend initiating an RCT randomising between very early stent removal at one week and stent removal at three weeks. Another option would be a three-armed RCT, adding an additional group of stent removal at six weeks.

**Author Contributions:** The study was conceptualized by F.J.M.F.D. and J.A.L., and the methodology was developed by I.J.V., J.P.T.v.d.S., J.A.L. and F.J.M.F.D. Formal analyses were done by I.J.V., J.P.T.v.d.S. and J.A.L. The original draft was written by I.J.V., J.P.T.v.d.S. and J.A.L. The manuscript was further reviewed and edited by I.J.V., J.P.T.v.d.S., A.M., M.W., J.A.L. and F.J.M.F.D. The project was supervised by J.A.L. and F.J.M.F.D.

**Acknowledgments:** We thank Wichor Bramer for his expert assistance with the systematic literature search.

**Conflicts of Interest:** The authors declare no conflict of interest.

*J. Clin. Med.* **2019**, *8*, 689

#### **Appendix A**

**Figure A1.** Funnel plot of urinary tract infection for early (<3 weeks) versus late (>3 weeks) stent removal.

**Figure A2.** Funnel plot of urinary leakage for early (<3 weeks) versus late (>3 weeks) stent removal.


**Figure A3.** Forest plot of major urological complications for early (<3 weeks) versus late (>3 weeks) stent removal.


**Table A1.** Overview of the study characteristics.

Tac = Tacrolimus, Cyc = cyclosporine, Pred = prednisone, ATG = antithymocyte globulin, Bas = Basiliximab, Dac = Daclizumab, Aza = Azathioprine. (1) 20–30, 30–40, 40–50, 50–60 days; (2) including the non-stented population (*n* = 99). \* Recorded that they both included children and adults; ˆ Induction therapy; " Specific type of prophylactic antibiotics.


#### Search strings


#### **Embase.com 944**

('kidney graft'/exp OR 'kidney transplantation'/exp OR (((kidney\* OR renal\*) NEAR/3 (graft\* OR transplant\* OR allograft\* OR allotransplant\* OR recipient\*))):ab,ti) AND ('ureter stent'/exp OR (ureter/de AND stent/de) OR (((ureter\* OR nephroureter\* OR double-j OR jj OR j-j OR pigtail\* OR urinar\*) NEAR/3 stent\*)):ab,ti) NOT ([animals]/lim NOT [humans]/lim)

#### **Medline Ovid 443**

(kidney transplantation/ OR kidney/tr OR (((kidney\* OR renal\*) ADJ3 (graft\* OR transplant\* OR allograft\* OR allotransplant\* OR recipient\*))).ab,ti.) AND ((ureter/ AND stents/) OR (((ureter\* OR nephroureter\* OR double-j OR jj OR j-j OR pigtail\* OR urinar\*) ADJ3 stent\*)).ab,ti.) NOT (exp animals/ NOT humans/)

#### **Cochrane CENTRAL 43**

((((kidney\* OR renal\*) NEAR/3 (graft\* OR transplant\* OR allograft\* OR allotransplant\* OR recipient\*))):ab,ti) AND ((((ureter\* OR nephroureter\* OR double-j OR jj OR j-j OR pigtail\* OR urinar\*) NEAR/3 stent\*)):ab,ti)

#### **Web of Science 268**

TS=(((((kidney\* OR renal\*) NEAR/2 (graft\* OR transplant\* OR allograft\* OR allotransplant\* OR recipient\*)))) AND ((((ureter\* OR nephroureter\* OR double-j OR jj OR j-j OR pigtail\* OR urinar\*) NEAR/2 stent\*))) NOT ((animal\* OR rat OR rats OR mouse OR mice OR murine) NOT (human\* OR patient\*))) AND DT=(article) AND LA=(English)

#### **Google Scholar**

"kidney|renal graft|transplantation|allograft|allotransplant|recipients"

"ureter|ureteral|nephroureteral|jj|pigtail|urinary stent|stents"|"j|double j stent|stents"

#### **References**


© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

### *Article* **Incidence and Mortality of Renal Cell Carcinoma after Kidney Transplantation: A Meta-Analysis**

**Api Chewcharat 1,2, Charat Thongprayoon 3,\*, Tarun Bathini 4, Narothama Reddy Aeddula 5, Boonphiphop Boonpheng 6, Wisit Kaewput 7, Kanramon Watthanasuntorn 8, Ploypin Lertjitbanjong 8, Konika Sharma 8, Aldo Torres-Ortiz 9, Napat Leeaphorn 10, Michael A. Mao 11, Nadeen J. Khoury <sup>12</sup> and Wisit Cheungpasitporn <sup>9</sup>**


Received: 31 March 2019; Accepted: 15 April 2019; Published: 17 April 2019

**Abstract:** Background: The incidence and mortality of renal cell carcinoma (RCC) after kidney transplantation (KTx) remain unclear. This study's aims were (1) to investigate the pooled incidence/incidence trends, and (2) to assess the mortality/mortality trends in KTx patients with RCC. Methods: A literature search was conducted using the MEDLINE, EMBASE and Cochrane databases from inception through October 2018. Studies that reported the incidence or mortality of RCC among kidney transplant recipients were included. The pooled incidence and 95% CI were calculated using a random-effect model. The protocol for this meta-analysis is registered with PROSPERO; no. CRD42018108994. Results: A total of 22 observational studies with a total of 320,190 KTx patients were enrolled. Overall, the pooled estimated incidence of RCC after KTx was 0.7% (95% CI: 0.5–0.8%, *I* <sup>2</sup> = 93%). While the pooled estimated incidence of de novo RCC in the native kidney was 0.7% (95% CI: 0.6–0.9%, *I* <sup>2</sup> = 88%), the pooled estimated incidence of RCC in the allograft kidney was 0.2% (95% CI: 0.1–0.4%, *I* <sup>2</sup> = 64%). The pooled estimated mortality rate in KTx recipients with RCC was 15.0% (95% CI: 7.4–28.1%, *I* <sup>2</sup> = 80%) at a mean follow-up time of 42 months after RCC diagnosis. While meta-regression analysis showed a significant negative correlation between year of study and incidence of de novo RCC post-KTx (slope = −0.05, *p* = 0.01), there was no significant correlation between the year of study and mortality of patients with RCC (*p* = 0.50). Egger's regression asymmetry test was performed and showed no publication bias in any analysis. Conclusions: The overall estimated incidence of RCC after KTx was 0.7%. Although there has been a potential decrease in the incidence of RCC post-KTx, mortality in KTx patients with RCC has not decreased over time.

**Keywords:** malignancy; post-transplant malignancy; renal cell carcinoma; meta-analysis; kidney transplantation; transplantation; systematic reviews

#### **1. Introduction**

Kidney transplantation (KTx) is the renal replacement therapy of choice for the majority of patients with end-stage renal disease (ESRD) and it significantly improves survival and quality of life [1,2]. The long-term mortality rate is 48% to 82% lower in KTx recipients when compared to ESRD patients on the transplant waitlist [2,3]. However, due to immunosuppression, KTx patients are at a two-fold increased risk of developing malignancy in comparison to the general population [4–6]. Malignancies are among the top three leading causes of death in KTx recipients, following infection and cardiovascular complications [6].

Studies have demonstrated a higher incidence of renal cell carcinoma (RCC) among ESRD patients (0.3%) than its reported incidence in the general population (approximately 0.005%) [7,8]. Thus, the Clinical Practice Guidelines Committee of the American Society of Transplantation (AST) [9] has recommended RCC screening for high-risk candidates, such as ESRD patients on dialysis for more than 3 years [10]. Despite screening for RCC among KTx candidates, de novo RCC has been reported among KTx patients in both native kidneys [11–18], and transplanted kidneys [17,19,20]. However, the incidence and incidence trends of RCC among KTx patients remain unclear [11–42].

Thus, we performed a systematic review to (1) investigate the pooled incidence/incidence trends, and (2) assess the mortality/mortality trends in KTx patients with RCC.

#### **2. Methods**

#### *2.1. Search Strategy and Literature Review*

The protocol for this systematic review is registered with PROSPERO (International Prospective Register of Systematic Reviews; no. CRD42018108994). A systematic literature search of MEDLINE (1946 to October 2018), EMBASE (1988 to October 2018), and the Cochrane Database of Systematic Reviews (database inception to October 2018) was performed to (1) investigate the pooled incidence/incidence trends and (2) assess the mortality/mortality trends in KTx patients with RCC. The systematic literature review was conducted independently by two investigators (C.T. and W.C.) using a search strategy that combined the terms ("kidney cancer" OR "renal cell carcinoma") AND ("kidney transplantation" OR "renal transplantation"), which is provided in the online Supplementary Data S1. The database searches were limited to English-language articles only. A manual search for conceivably related studies using the references of the included articles was also performed. This study was conducted using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement [43].

#### *2.2. Selection Criteria*

Eligible studies had to be clinical trials or observational studies (cohort, case-control, or cross-sectional studies) that reported the incidence or mortality of RCC among adult KTx recipients (age ≥ 18 years). Retrieved articles were individually reviewed for eligibility by two investigators (A.C. and C.T.). Discrepancies were addressed and resolved by mutual consensus. Inclusion was not limited by study size.

#### *2.3. Data Abstraction*

A structured data collecting form was used to obtain the following information from each study: title, name of the first author, year of the study, publication year, country where the study was conducted, RCC definition, incidence of RCC, and mortality in KTx patients with RCC.

#### *2.4. Statistical Analysis*

Analyses were performed utilizing the Comprehensive Meta-Analysis 3.3 software (Biostat Inc., Englewood, NJ, USA). Adjusted point estimates from each study were consolidated by the generic inverse variance approach of DerSimonian and Laird, which assigns the weight of each study based on its variance [44]. Given the possibility of between-study variance, we used a random-effect model rather than a fixed-effect model. Forest plots were constructed to visually evaluate the incidence and mortality of RCC among adult KTx recipients. Cochran's Q test and the *I* <sup>2</sup> statistic were applied to determine the between-study heterogeneity. A value of *I* <sup>2</sup> of 0–25% represents insignificant heterogeneity, 26–50% low heterogeneity, 51–75% moderate heterogeneity and 76–100% high heterogeneity [45]. The presence of publication bias was assessed using the Egger test [46]. Funnel plots were created to evaluate the presence or absence of publication bias.
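The DerSimonian–Laird pooling of proportions described above can be sketched in a few lines. The following is an illustrative re-implementation only — the analysis itself was run in Comprehensive Meta-Analysis 3.3 — pooling logit-transformed event rates and computing Cochran's Q, the between-study variance τ², and *I* <sup>2</sup>; the study counts in any call are hypothetical:

```python
import math

def pool_random_effects(events, totals):
    """DerSimonian-Laird random-effects pooling of proportions on the logit scale.

    Illustrative sketch only; `events`/`totals` are hypothetical per-study counts.
    Returns (pooled proportion, (95% CI low, high), I^2 in percent).
    """
    # Logit-transform each study's event rate; approximate variance = 1/e + 1/(n - e)
    y = [math.log(e / (n - e)) for e, n in zip(events, totals)]
    v = [1.0 / e + 1.0 / (n - e) for e, n in zip(events, totals)]

    # Fixed-effect (inverse-variance) estimate, needed for Cochran's Q
    w = [1.0 / vi for vi in v]
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)

    # Cochran's Q and the DerSimonian-Laird between-study variance tau^2
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
    df = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)

    # Random-effects weights, pooled logit, and 95% CI; back-transform to proportions
    w_re = [1.0 / (vi + tau2) for vi in v]
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    inv_logit = lambda x: 1.0 / (1.0 + math.exp(-x))
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return (inv_logit(y_re),
            (inv_logit(y_re - 1.96 * se), inv_logit(y_re + 1.96 * se)),
            i2)
```

Pooling on the logit scale keeps the back-transformed estimate and its confidence bounds inside (0, 1), which matters for rare events such as allograft RCC.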

#### **3. Results**

A total of 7815 potentially eligible articles were identified using our search strategy. After the exclusion of 7629 articles based on their title and abstract for clearly not fulfilling the inclusion criteria (type of article, patient population, study design, or outcome of interest) and of 81 duplicates, 105 articles remained for full-length review. Fifty-nine of these were excluded from the full-length review as they did not report the outcome of interest, 21 were case reports, and three were not in English. Thus, 22 cohort studies [11–20,23,28–36,38,39] with a total of 320,190 KTx patients were enrolled. The literature retrieval, review, and selection process is outlined in Figure 1. The characteristics of the included studies are presented in Table 1.

**Figure 1.** Outline of our search methodology.





#### *J. Clin. Med.* **2019**, *8*, 530

**Table 1.** Characteristics of the included studies.

#### *3.1. Incidence of RCC after KTx*

Eighteen studies provided data on the incidence of RCC after KTx [11–20,28,29,31,32,34,35,38,39]. Overall, the pooled estimated incidence of RCC after KTx was 0.7% (95% CI: 0.5–0.8%, *I* <sup>2</sup> = 93%, Figure 2). While the pooled estimated incidence of de novo RCC in the native kidney was 0.7% (95% CI: 0.6–0.9%, *I* <sup>2</sup> = 88%, Figure 3A), the pooled estimated incidence of RCC in the allograft kidney was 0.2% (95% CI: 0.1–0.4%, *I* <sup>2</sup> = 64%, Figure 3B).

**Figure 2.** Forest plots of the included studies [11–20,28,29,31,32,34,35,38,39] assessing incidence rates of RCC after KTx. A diamond data marker represents the overall rate from each included study (square data marker) and 95% confidence interval.

**Figure 3.** Forest plots of the included studies [11–18,28,29,31,32,34,35,38,39] assessing incidence rates of (**A**) de novo RCC in the native kidney and (**B**) RCC in the allograft kidney [17,19,20,38,39]. A diamond data marker represents the overall rate from each included study (square data marker) and 95% confidence interval.

Meta-regression showed a significant negative correlation between year of study and incidence of de novo RCC post-KTx (slope = −0.05, *p* = 0.01, Figure 4).

**Regression of Logit event rate on Year**

**Figure 4.** Meta-regression analyses showed a significant negative correlation between the year of study and incidence of de novo RCC post-KTx (slope = −0.05, *p* = 0.01). The solid line represents the weighted regression line based on variance-weighted least squares. The inner and outer lines show the 95% confidence interval and prediction interval around the regression line. The circles indicate the log event rates in each study.

#### *3.2. Mortality Rate in KTx Recipients with RCC*

Eleven studies provided data on the mortality rate in KTx recipients with RCC [13,14,16,17,19,20,23,30,33,36,39]. Overall, the pooled estimated mortality rate in KTx recipients with RCC was 15.0% (95% CI: 7.4–28.1%, *I* <sup>2</sup> = 80%, Figure 5) at a mean follow-up time of 42 months after RCC diagnosis. The data on the incidence and mortality of recurrent RCC among KTx recipients with a previous history of RCC prior to KTx were limited. A prior study demonstrated an incidence of recurrent RCC after KTx of 9.1% with an associated 5-year survival of 41.7% [23]. Sensitivity analysis, excluding the study of recurrent RCC among KTx recipients with a previous history of RCC prior to KTx [23], demonstrated a pooled estimated mortality rate of 11.5% in KTx recipients with RCC (95% CI: 6.4–19.8%, *I* <sup>2</sup> = 67%).

**Figure 5.** Forest plots of the included studies [13,14,16,17,19,20,23,30,33,36,39] assessing mortality rate in KTx recipients with RCC. A diamond data marker represents the overall rate from each included study (square data marker) and 95% confidence interval.

Meta-regression showed no significant correlation between the year of study and mortality of patients with RCC (*p* = 0.50, Figure 6). When meta-regression was performed excluding the study of recurrent RCC among KTx recipients with a previous history of RCC prior to KTx [23], there was still no significant correlation between the year of study and mortality of patients with RCC (*p* = 0.56, Figure 7).

#### **Regression of Logit event rate on Year**

**Figure 6.** Meta-regression analyses showed no significant correlations between the year of study and mortality of patients with RCC (*p* = 0.50). The solid line represents the weighted regression line based on variance-weighted least-squares. The inner and outer lines show the 95% confidence interval and prediction interval around the regression line. The circles indicate the log event rates in each study.

#### **Regression of Logit event rate on Year**

**Figure 7.** Meta-regression analyses, excluding the study of recurrent RCC among KTx recipients with a previous history of RCC prior to KTX, showed no significant correlations between the year of study and mortality of patients with RCC (*p* = 0.56). The solid line represents the weighted regression line based on variance-weighted least-squares. The inner and outer lines show the 95% confidence interval and prediction interval around the regression line. The circles indicate log event rates in each study.

#### *3.3. Evaluation for Publication Bias*

Funnel plots (Supplementary Figures S1 and S2) and Egger's regression asymmetry tests were performed to evaluate publication bias in the analysis evaluating the incidence and mortality of KTx recipients with RCC. There was no significant publication bias, with *p*-values of 0.58 and 0.54, respectively.
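Egger's test regresses the standardized effect size on precision and asks whether the intercept differs from zero; a non-zero intercept indicates small-study asymmetry. A minimal sketch of the intercept computation (the *p*-value, from a *t* distribution with n − 2 degrees of freedom, is omitted; the inputs in any call are hypothetical):

```python
def egger_intercept(effects, ses):
    """Egger's regression asymmetry statistic (intercept only).

    Regress the standardized effect (effect/SE) on precision (1/SE) by
    ordinary least squares and return the intercept. Illustrative sketch;
    significance testing is left out.
    """
    x = [1.0 / s for s in ses]                 # precision
    z = [e / s for e, s in zip(effects, ses)]  # standardized effect
    n = len(x)
    xbar, zbar = sum(x) / n, sum(z) / n
    slope = (sum((xi - xbar) * (zi - zbar) for xi, zi in zip(x, z))
             / sum((xi - xbar) ** 2 for xi in x))
    return zbar - slope * xbar                 # intercept of z = a + b*x
```

In a symmetric funnel the standardized effect scales with precision and the fitted line passes near the origin, so an intercept close to zero (as in the analyses above) is consistent with no publication bias.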

#### **4. Discussion**

In this systematic review, we found that RCC after KTx occurs with an incidence of 0.7%. RCC can occur in the native kidney with an incidence of 0.7% or in the allograft kidney with an incidence of 0.2%. Our findings also showed a statistically significant negative correlation between the incidence of RCC after KTx and study year, representing a potential decrease in the RCC incidence among KTx patients. However, mortality in KTx patients with RCC has not decreased over time.

Post-KTx malignancy is a common cause of death [5,6,47–51] and RCC is the most common solid-organ malignancy in this population [52,53]. Due to the increased risk of RCC among ESRD patients [7,8], the Clinical Practice Guidelines Committee of the AST has suggested RCC screening in ESRD patients on dialysis for longer than 3 years [9,10]. In addition, it is suggested that most KTx candidates with a history of RCC should wait at least 2 years from successful cancer treatment to KTx (unless candidates have only small localized incidental tumors, which may not require any waiting period) [54,55]. Candidates with large, invasive or symptomatic RCC may require a longer waiting period of 5 years [54,55]. Despite RCC screening prior to KTx, the findings from our study suggest that RCC can still occur post-KTx at a higher incidence (0.7%) than its reported incidence among ESRD patients (0.3%) [8]. In addition, studies have demonstrated that KTx recipients have a five- to ten-fold increased relative risk of RCC compared with an age-matched general population, and that the majority of these tumors arise in the setting of acquired cystic kidney disease (ACKD), which develops with chronic renal failure [5,8,35,56–64]. Although RCC occurrence is more frequent in the native kidneys of KTx recipients, RCC can also occur in the renal allograft (incidence of 0.2%) [17,19,20].

While the exact etiology of the increased risk of RCC in KTx remains unclear, it is likely linked to the immunosuppressed state [4]. Reported risk factors for post-KTx RCC include older age, male sex, African descent, excess body weight, smoking, hypertension, history of acquired cystic kidney disease (ACKD), previous RCC prior to KTx, and longer pre-transplant dialysis duration [3,6,18,23,29,31,34,35,65–67]. Studies have shown that the cause of ESRD before KTx may also affect the incidence of post-KTx RCC [14,32,35]. While KTx recipients with ESRD due to glomerulonephritis, hypertensive nephrosclerosis, and vascular diseases have been shown to have a higher incidence of post-KTx RCC, recipients with ESRD due to diabetic nephropathy carry a lower risk of post-KTx RCC [14,32,35,68]. KTx recipients are usually under intensified medical surveillance, and the higher incidence of RCC among KTx recipients compared to the general population and ESRD patients might be due to detection bias. On the other hand, the lack of consensual RCC screening among KTx recipients may also have led to an underestimate of the exact incidence in the KTx patient population.

Currently, there are no universal recommendations for RCC screening among KTx patients [3,22,69–72]. While the European Renal Best Practice (ERBP) guidelines recommend native kidney ultrasound as RCC screening in kidney transplant recipients, and the European Association of Urology (EAU) recommends an annual ultrasound of native kidneys and allografts for anyone with ACKD, previous RCC, or von Hippel–Lindau disease [3,71,72], the Kidney Disease Improving Global Outcomes (KDIGO) and AST guidelines for post-KTx care currently do not suggest universal screening for RCC among KTx recipients [22,69,70]. Thus, RCC screening approaches for KTx recipients vary between transplant centers.

Many cases of RCC have been discovered during investigations for post-transplant erythrocytosis, elevated serum creatinine, hematuria, urinary infection, or incidentally from imaging for other indications [33,73–75]. The majority of studies with available data on surveillance programs performed screening for RCC post-KTx annually by ultrasonography of native and allograft kidneys; KTx recipients with ACKD, acquired multicystic dysplasia, or a prior history of RCC required more frequent screening, every 6 months [16,17,19,20,28,36,76]. Given that the risk is greatest in the first year post-KTx and the majority of RCCs occur in the first 5 years after KTx [15,29,31,65,77], previous reports suggest that KTx recipients should routinely undergo ultrasonography to screen for RCC in the native kidneys during the first 30 days post-KTx and every 5 years afterwards in the absence of renal cysts, or every 2 years in the presence of renal cysts [65,77–79]. Our study's findings suggest the need for future studies to identify a cost-effective surveillance strategy for RCC among KTx recipients. This strategy would need to take into consideration both native and allograft kidneys, and differentiate KTx recipients with non-simple renal cysts [3,80].

Several limitations of our systematic review are worth mentioning. First, there are statistical heterogeneities in our meta-analysis. Potential sources for heterogeneities were the variations in the renal transplant recipient screening methods, patient characteristics, and differences in the immunosuppressive regimens used at various transplant centers, which may have affected the incidence of RCC and mortality rate in this population. Second, there is a lack of data from included studies on immunosuppressive regimens [81–85]. Mammalian target of rapamycin (mTOR) inhibitors have shown antineoplastic activities [86]. Although the effects of mTOR among KTx recipients have been shown mostly for non-melanoma skin cancer [87–89], future studies evaluating the effects of different immunosuppressive regimens on mortality in KTx patients with RCC are needed. Lastly, this is a meta-analysis of cohort studies and the data from population-based studies were limited. Thus, large population-based studies evaluating the incidence of RCC in KTx patients are required in the future.

In summary, the overall estimated incidence of RCC after KTx was 0.7%, with an associated high mortality rate in KTx recipients of 15.0%. Despite potential improvements in the post-KTx RCC incidence, the mortality in KTx patients with RCC has remained unchanged over time.

**Supplementary Materials:** The following are available online at http://www.mdpi.com/2077-0383/8/4/530/s1, Data S1: Search terms for systematic review; Figure S1: Funnel plot evaluating for publication bias evaluating incidence of KTx recipients with RCC; Figure S2: Funnel plot evaluating for publication bias evaluating mortality of KTx recipients with RCC.

**Author Contributions:** Conceptualization, A.C., C.T., W.K., M.A.M., N.J.K. and W.C.; Data curation, A.C. and C.T.; Formal analysis, C.T. and W.C.; Investigation, A.C., C.T. and W.C.; Methodology, A.C., T.B., B.B., W.K., P.L., N.L. and W.C.; Project administration, K.W. and K.S.; Resources, K.W. and K.S.; Software, K.W.; Supervision, T.B., N.R.A., B.B., W.K., A.T.-O., N.L., M.A.M., N.J.K. and W.C.; Validation, A.C., C.T., P.L. and W.C.; Visualization, T.B. and K.S.; Writing—original draft, A.C. and N.R.A.; Writing—review and editing, C.T., T.B., N.R.A., B.B., W.K., K.W., P.L., K.S., A.T.-O., N.L., M.A.M., N.J.K. and W.C.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **References**


© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

#### *Article*

### **Subarachnoid Hemorrhage in Hospitalized Renal Transplant Recipients with Autosomal Dominant Polycystic Kidney Disease: A Nationwide Analysis**

**Wisit Cheungpasitporn 1,\*, Charat Thongprayoon 2, Patompong Ungprasert 3, Karn Wijarnpreecha 4, Wisit Kaewput 5, Napat Leeaphorn 6, Tarun Bathini 7, Fouad T. Chebib <sup>2</sup> and Paul T. Kröner <sup>4</sup>**


Received: 21 March 2019; Accepted: 15 April 2019; Published: 17 April 2019

**Abstract:** Background: This study aimed to evaluate the hospitalization rates for subarachnoid hemorrhage (SAH) among renal transplant patients with autosomal dominant polycystic kidney disease (ADPKD) and its outcomes, when compared to non-ADPKD renal transplant patients. Methods: The 2005–2014 National Inpatient Sample databases were used to identify all hospitalized renal transplant patients. The inpatient prevalence of SAH as a discharge diagnosis between ADPKD and non-ADPKD renal transplant patients was compared. Among SAH patients, the in-hospital mortality, use of aneurysm clipping, hospital length of stay, total hospitalization cost and charges between ADPKD and non-ADPKD patients were compared, adjusting for potential confounders. Results: The inpatient prevalence of SAH in ADPKD was 3.8/1000 admissions, compared to 0.9/1000 admissions in non-ADPKD patients (*p* < 0.01). Of 833 renal transplant patients with a diagnosis of SAH, 30 had ADPKD. Five (17%) ADPKD renal transplant patients with SAH died in hospital compared to 188 (23.4%) non-ADPKD renal transplant patients (*p* = 0.70). In adjusted analysis, there was no statistically significant difference in mortality, use of aneurysm clipping, hospital length of stay, or total hospitalization costs and charges between ADPKD and non-ADPKD patients with SAH. Conclusion: Renal transplant patients with ADPKD had a 4-fold higher inpatient prevalence of SAH than those without ADPKD. Further studies are needed to compare the incidence of overall admissions in ADPKD and non-ADPKD patients. When renal transplant patients developed SAH, inpatient mortality rates were high regardless of ADPKD status. The outcomes, as well as resource utilization, were comparable between the two groups.

**Keywords:** autosomal dominant polycystic kidney disease; epidemiology; hospitalization; kidney transplantation; subarachnoid hemorrhage

#### **1. Introduction**

Subarachnoid hemorrhage (SAH) is a major clinical problem worldwide, associated with a poor prognosis, long-term morbidity, and extremely high mortality [1–3]. In the United States, over 30,000 cases of SAH occur annually [4,5]. Although the global incidence of SAH has decreased each year from 1960 through 2017 [6], approximately 30% to 50% of patients who develop SAH die within 60 days. Furthermore, over 30% of survivors subsequently suffer neurologic deficits post-SAH [1–4,7–9]. Major causes of SAH include ruptured intracranial aneurysm, cerebral arteriovenous malformation, and traumatic brain injury [6,10].

Among patients with autosomal dominant polycystic kidney disease (ADPKD), a disorder that affects the kidneys and other organs caused by mutations in PKD1 and PKD2, a wide spectrum of vascular abnormalities have been described including intracranial aneurysms (and dolichoectasias), thoracic aorta and cervicocephalic artery dissections, and coronary artery aneurysms [11,12]. Compared to the general population, the prevalence of intracranial aneurysms in ADPKD patients is approximately five times higher and is estimated at 4% to 22.5% [13–19]. Among ADPKD patients with a family history of SAH/intracranial aneurysms, the frequency is three to five times higher than the general population [12]. Thus, screening is recommended for intracranial aneurysms among ADPKD patients with (1) family or past medical history of intracranial aneurysm presence or rupture; (2) symptoms suggesting intracranial aneurysm; (3) occupations in which loss of consciousness may be fatal; (4) upcoming major elective surgery; and (5) patient subjective concern for possible intracranial aneurysm presence [20–22].

Despite the recommendation for intracranial aneurysm screening prior to major elective surgery, such screening for ADPKD patients is not consistently performed among all transplant centers in the USA during pre-transplant assessment for renal transplantation [23,24]. In addition, previously published studies on the increased risk of SAH after renal transplantation in patients with ADPKD have not yielded relevant information, or have been underpowered [25–31].

Thus, we conducted this study using a nationwide inpatient USA database to evaluate the hospitalization rates for SAH among renal transplant patients with ADPKD and its outcomes, when compared to non-ADPKD renal transplant patients.

#### **2. Methods**

#### *2.1. Data Source*

The 2005–2014 National Inpatient Sample (NIS) databases were used to conduct this retrospective cohort study. The NIS is the largest inpatient all-payer database that is publicly available in the US. This database was developed by the Agency for Healthcare Research and Quality (AHRQ) as part of its Healthcare Cost and Utilization Project (HCUP). The dataset for the studied years contains more than 78 million hospitalizations, a 20% stratified sample of over 4000 non-federal acute care hospitals in 44 states of the United States, and is representative of 95% of hospitalizations nationwide. This dataset includes codes for the principal diagnosis, secondary diagnoses, and procedures performed during the hospitalization.

#### *2.2. Study Population*

Initially, renal transplant patients were identified using the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) code V42.0. ADPKD status in this cohort was identified using the ICD-9-CM code 753.13. The associated diagnosis of SAH was identified using the ICD-9-CM code 430. Patients undergoing elective hospital admission or patients undergoing renal transplantation during the same admission were excluded from the analysis.

#### *2.3. Variable Definition*

Patient characteristics included age, gender, ethnicity, median income in patients' zip code, family history of stroke, and insurance type. Hospital characteristics included hospital region, teaching status, number of hospital beds, urban location, and weekend admission. The Healthcare Cost and Utilization Project (HCUP) divides the US into four census regions based on geographical location: Northeast, Midwest, South and West. The vital status at the end of hospitalization, length of hospital stay (LOS), and total hospitalization charges were abstracted from the database. To account for patient comorbidities, the Deyo adaptation of the Charlson Comorbidity Index was used, which was appropriate for the large database analysis [32].

#### *2.4. Outcomes*

The outcome for primary analysis was to determine the inpatient prevalence of SAH as a discharge diagnosis in renal transplant patients with ADPKD, compared to renal transplant patients without ADPKD. The outcomes for secondary analysis were to compare in-hospital mortality, the use of aneurysm clipping, length of hospital stay, and expenditures between ADPKD and non-ADPKD patients with SAH. Expenditures included total hospitalization charges and hospitalization costs. Total hospitalization charges represented the amount of financial resources that each hospital billed for providing its service for each patient, whereas hospitalization costs represented the amount of money spent by each hospital in providing the patient care. Hospitalization costs were calculated by multiplying the cost-to-charge ratio for the respective hospital with the total hospitalization charges. Cost-to-charge ratios are provided by HCUP for each hospitalization in the database to enable this calculation. Since this study used datasets for 10 different calendar years, costs and charges were adjusted for inflation using the consumer price index and converted to 2014 \$USD equivalents.
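The conversion from billed charges to inflation-adjusted costs reduces to two multiplications. A minimal sketch, in which the cost-to-charge ratio and CPI values are illustrative placeholders rather than the actual HCUP and Bureau of Labor Statistics figures:

```python
def inpatient_cost_2014_usd(total_charges, cost_to_charge_ratio, cpi_year, cpi_2014):
    """Convert billed charges to an estimated hospital cost in 2014 USD.

    Sketch of the adjustment described in the text; all parameter values
    passed in are hypothetical, not the published HCUP/CPI figures.
    """
    cost = total_charges * cost_to_charge_ratio  # charges -> cost via hospital CCR
    return cost * (cpi_2014 / cpi_year)          # inflate to 2014 dollars
```

For example, \$100,000 in charges at a hospital with a cost-to-charge ratio of 0.30, admitted in a year with an (illustrative) CPI of 200.0 against a 2014 CPI of 236.7, corresponds to a cost of \$35,505 in 2014 dollars.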

#### *2.5. Statistical Analysis*

Discharge-level weights in the HCUP nationwide databases were used to estimate the total number of renal transplant patients with an associated diagnosis of SAH. Descriptive statistics were used to summarize patient characteristics. Fisher's exact test was used to compare proportions, and Student's *t*-test was used to compare means. A hybrid multivariate logistic regression model was built by first conducting a univariate regression analysis on variables identified from other studies as being relevant to the outcome. If these variables impacted the outcome in any direction with a *p*-value of <0.01, they were included in the multivariate logistic regression model. In the multivariate logistic regression, odds ratios and means were adjusted for age, gender, insurance type, family history of stroke, the median income in patients' zip code, hospital region, urban location, number of hospital beds, and teaching status. All statistical analyses were performed using STATA, Version 13 (StataCorp LP, College Station, TX, USA).
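The hybrid model-building strategy — screen covariates univariately, then fit one multivariate model on those passing the threshold — can be sketched as follows. This is an illustrative Python re-implementation, not the STATA code used in the study; the logistic fit is a bare-bones gradient ascent, and the univariate *p*-values are assumed to be supplied by the caller from prior single-covariate fits:

```python
import math

def fit_logistic(X, y, lr=0.1, iters=2000):
    """Plain gradient-ascent logistic regression; returns [intercept, coefs...]."""
    n, p = len(X), len(X[0])
    beta = [0.0] * (p + 1)
    for _ in range(iters):
        grad = [0.0] * (p + 1)
        for xi, yi in zip(X, y):
            z = beta[0] + sum(b * x for b, x in zip(beta[1:], xi))
            mu = 1.0 / (1.0 + math.exp(-z))   # predicted probability
            grad[0] += yi - mu                 # score for the intercept
            for j, x in enumerate(xi):
                grad[j + 1] += (yi - mu) * x   # score for each coefficient
        beta = [b + lr * g / n for b, g in zip(beta, grad)]
    return beta

def hybrid_model(X, y, p_values, alpha=0.01):
    """Hybrid strategy from the text: keep only covariates whose univariate
    p-value is below alpha, then fit one multivariate model on those.
    `p_values` (one per column of X) come from hypothetical prior fits."""
    keep = [j for j, p in enumerate(p_values) if p < alpha]
    X_sel = [[row[j] for j in keep] for row in X]
    return keep, fit_logistic(X_sel, y)
```

With a covariate whose univariate *p*-value is 0.001 and a noise covariate at 0.5, only the first survives screening, and its multivariate coefficient carries the expected sign.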

#### **3. Results**

#### *3.1. Inpatient Prevalence of SAH as a Discharge Diagnosis*

Out of 382,516,561 patients admitted to hospitals during the study period, 918,478 were identified as having had a history of renal transplant. ADPKD patients who underwent renal transplant had a higher inpatient prevalence of SAH as a discharge diagnosis than non-ADPKD renal transplant patients (3.8 vs. 0.9 cases per 1000 discharges; *p* < 0.01).

#### *3.2. Patient and Hospital Characteristics in SAH Patients*

In total, 833 patients had an associated diagnosis of SAH, including 30 ADPKD renal transplant patients and 803 non-ADPKD renal transplant patients. The ADPKD renal transplant patients were older, had a higher comorbidity burden, and were more likely to be admitted to teaching hospitals than non-ADPKD renal transplant patients. There was no significant difference in terms of weekend admission, the median income in patients' zip code, insurance type, hospital region, urban location, and number of hospital beds between the two cohorts (Table 1).


**Table 1.** Patient and hospital characteristics.

ADPKD, autosomal dominant polycystic kidney disease.

#### *3.3. Mortality*

A total of 5 (17%) ADPKD renal transplant patients with SAH died in hospital compared to 188 (23.4%) non-ADPKD renal transplant patients (*p* = 0.70). Similarly, in adjusted analysis, there was no difference in in-hospital mortality between the two groups, with an adjusted odds ratio (aOR) of 0.87 (95% confidence interval (CI) 0.12–6.23; *p* = 0.89) (Table 2).


**Table 2.** Outcomes and procedures of ADPKD and non-ADPKD patients with SAH.

#### *3.4. Use of Aneurysm Clipping*

Ten (33.0%) ADPKD renal transplant patients with SAH underwent aneurysm clipping, compared to 137 (17.1%) non-ADPKD renal transplant patients (*p* = 0.32). On adjusted analysis, the use of aneurysm clipping in ADPKD renal transplant patients was not significantly higher than in non-ADPKD renal transplant patients (aOR: 2.02; 95% CI 0.28–14.81; *p* = 0.49) (Table 2).

#### *3.5. Hospital Length of Stay*

The mean LOS in ADPKD renal transplant patients with SAH was 9.8 days, compared to 8.9 days in the non-ADPKD renal transplant SAH patients. Although, on adjusted analysis, the mean LOS in ADPKD renal transplant patients with SAH was 3.0 days shorter than in non-ADPKD renal transplant patients with SAH, this difference was not statistically significant (95% CI: −10.1–4.1, *p* = 0.41) (Table 3).

**Table 3.** Hospital length of stay and expenditure differences between ADPKD and non-ADPKD patients with SAH.


#### *3.6. Total Hospitalization Costs and Charges*

The mean hospital cost for ADPKD renal transplant patients with SAH was \$ 30,519, while the mean hospital cost for non-ADPKD renal transplant patients with SAH was \$ 33,526 (*p* = 0.04). Although, on adjusted analysis, the ADPKD renal transplant patients with SAH had a mean hospital cost \$ 1086 lower than the non-ADPKD renal transplant patients, the difference was not statistically significant (95% CI: −22,548–20,376, *p* = 0.92) (Table 3).

The mean total hospitalization charge for ADPKD renal transplant patients with SAH was \$ 85,682, while that for non-ADPKD renal transplant patients with SAH was \$ 112,514. Although, on adjusted analysis, ADPKD renal transplant patients with SAH had a mean hospitalization charge \$ 14,944 lower than the non-ADPKD renal transplant patients, this was not statistically significant (95% CI: −73,293–43,404, *p* = 0.62) (Table 3).

#### **4. Discussion**

In this study utilizing the USA Nationwide Inpatient Sample database, we demonstrated that renal transplant patients with ADPKD had a 4-fold higher inpatient prevalence of SAH than those without ADPKD. When renal transplant patients developed SAH, the inpatient mortality rate was high (around 20 to 30%), regardless of ADPKD status. In addition, the use of aneurysm clipping for SAH and hospital LOS were comparable among renal transplant patients with and without ADPKD.

In the previous analysis using USA Renal Data System registry data, Lentine et al. reported a decreased risk of SAH in renal transplant patients when compared to end-stage renal disease (ESRD) patients on the transplant waiting list [33]. Additionally, among ESRD patients on dialysis, it has been shown that ADPKD is associated with an increased risk of SAH [34,35]. Despite an overall reduction in SAH risk after renal transplantation [33], our study demonstrates a higher relative frequency of SAH among ADPKD patients compared to those without ADPKD and aims to raise awareness that SAH remains an important concern in the post-transplantation population.

The risk of intracranial aneurysms among ADPKD patients increases with age [36,37], and the prevalence is substantially increased after 45 years of age, especially among Caucasian ADPKD patients [38]. The average age of patients with ADPKD and ESRD is greater than 45 years [34,35,38]. However, intracranial aneurysm screening during pre-transplant evaluation for renal transplantation is required by some, but not all, transplant centers in the USA, despite the recommendation for intracranial aneurysm screening prior to major elective surgery [23,24]. Furthermore, a recent study by Flahault et al., including 495 ADPKD patients, suggested that systematic screening was cost-effective and provided a gain of 0.68 quality-adjusted life years compared to targeted screening in only those with a familial history of intracranial aneurysms [21]. The investigators concluded that intracranial aneurysm screening could be proposed to all ADPKD patients regardless of family history of stroke [21].

In our study of renal transplant patients, after adjusting for potential confounders including family history of stroke, the relative frequency of SAH among ADPKD patients remained significantly higher than among those without ADPKD. This suggests that SAH may develop in renal transplant patients with ADPKD regardless of family history of stroke. When patients develop SAH, the mortality rate is high regardless of renal transplant status [27,33,39–41]. In our study, nearly 25% of renal transplant patients with SAH died during hospital admission. After adjusting for potential confounders, we found no differences in in-hospital mortality, rate of aneurysm clipping, hospital LOS, hospital costs, or total hospitalization charges between ADPKD and non-ADPKD renal transplant patients with SAH.

Several limitations of this study must be acknowledged. Firstly, although the utilization of the NIS database enables an assessment of inpatient relative frequency and burden of SAH in renal transplant patients in the USA, potential inaccuracies in ICD-9-CM coding are significant limitations to our study. Secondly, the data relating to types of mutation among ADPKD patients were limited in this study. Since patients with *PKD2*-associated ADPKD usually develop ESRD at an older age compared to renal-transplanted ADPKD patients (mean age: 79.7 vs. 58.9 years) [22,42,43], it is likely that our study represents the outcomes of SAH among renal transplant patients with *PKD1*-associated ADPKD. Thirdly, this is an analysis of an inpatient USA database and, thus, it does not consider the broader USA outpatient renal transplant population nor the renal transplant population in other countries. Because of the nature of the NIS database, our study included only inpatient renal transplant patients and might be subject to selection bias. Without knowing the total number of all ADPKD and non-ADPKD renal transplant patients, the overall admission rates for any reasons cannot be calculated. Fourthly, although the findings of our study highlight the burden of SAH in ADPKD renal transplant patients, it cannot be concluded that screening all ADPKD patients prior to renal transplantation is cost-effective. Nevertheless, physicians should take into account the risk of SAH among ADPKD renal transplant patients. Furthermore, the practice of intracranial aneurysm screening in ADPKD patients might vary between hospitals. However, information regarding the practice of intracranial aneurysm screening was not available in our study. Lastly, given the administrative nature of the dataset, it was not possible to investigate the effects of medication, such as immunosuppressants, on economic burden and mortality among ADPKD and non-ADPKD renal transplant patients with SAH.

#### **5. Conclusions**

In conclusion, in this study using the USA Nationwide Inpatient Sample database, we demonstrate a higher inpatient relative frequency of SAH among ADPKD renal transplant patients when compared with non-ADPKD renal transplant patients. In-hospital mortality due to SAH among renal transplant patients was high, regardless of ADPKD status. The use of aneurysm clipping for SAH and hospital length of stay were comparable between ADPKD renal transplant patients and those without ADPKD.

**Author Contributions:** Conceptualization, W.C., N.L., F.T.C. and P.T.K.; Data curation, P.T.K.; Investigation, W.C., C.T., W.K. and P.T.K.; Methodology, W.C., C.T., P.U., K.W., T.B., F.T.C. and P.T.K.; Project administration, W.C. and P.U.; Resources, P.T.K.; Software, P.T.K.; Supervision, N.L., F.T.C. and P.T.K.; Validation, W.C. and P.T.K.; Visualization, N.L.; Writing—original draft, W.C. and C.T.; Writing—review & editing, P.U., K.W., W.K., N.L., T.B., F.T.C. and P.T.K.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **References**


© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

#### *Article*

### **Epidemiology, Risk Factors, and Outcomes of Opportunistic Infections after Kidney Allograft Transplantation in the Era of Modern Immunosuppression: A Monocentric Cohort Study**

**Philippe Attias 1,**†**, Giovanna Melica 2,**†**, David Boutboul 3,4, Nathalie De Castro 5, Vincent Audard 1,6, Thomas Stehlé 1,6, Géraldine Gaube 2, Slim Fourati 7,8, Françoise Botterel 7,8,9, Vincent Fihman 7,8,9, Etienne Audureau 10, Philippe Grimbert 1,6,11 and Marie Matignon 1,6,\***


Received: 13 March 2019; Accepted: 22 April 2019; Published: 30 April 2019

**Abstract:** The epidemiology of opportunistic infections (OI) after kidney allograft transplantation in the modern era of immunosuppression and of OI prevention strategies is poorly described. We retrospectively analyzed a single-center cohort of adult kidney allograft recipients transplanted between January 2008 and December 2013. The control group included all kidney recipients transplanted in the same period but with no OI. We analyzed 538 kidney transplantations (538 patients). OIs occurred in 72 patients (15%), accounting for 80 episodes. OI occurred 12.8 (6.0–31.2) months after transplantation. Viruses were the leading cause (*n* = 54 (10%)), followed by fungal (*n* = 15 (3%)), parasitic (*n* = 6 (1%)), and bacterial (*n* = 5 (0.9%)) infections. Independent risk factors for OI were extended criteria donor (2.53 (1.48–4.31), *p* = 0.0007) and BK viremia (6.38 (3.62–11.23), *p* < 0.0001). A high blood lymphocyte count at the time of transplantation was an independent protective factor (0.60 (0.38–0.94), *p* = 0.026). OI was an independent risk factor for allograft loss (2.53 (1.29–4.95), *p* = 0.007) but not for patient survival. Post-kidney-transplantation OIs were mostly viral and occurred beyond one year after transplantation. Pre-transplantation lymphopenia and extended criteria donor are independent risk factors for OI, unlike induction therapy, hence the need to adjust immunosuppressive regimens for such transplant candidates.

**Keywords:** kidney transplantation; opportunistic infection; allograft survival; BK virus nephropathy

#### **1. Introduction**

Kidney allograft recipients are exposed to a broad range of infectious pathogens that give rise to infections with unusual and more severe presentations [1]. Opportunistic infections (OIs) include infections caused by uncommon pathogens and those caused by common pathogens but with unusual and more severe forms [2]. The reported incidence of OIs varies from 10% to 25% [3,4]. Currently, prevention strategies against cytomegalovirus (CMV), herpes simplex viruses (HSV), and *Pneumocystis* spp. are recommended and result in a significant reduction of post-transplantation OIs [5] and a 50% decrease in the risk of death from infectious causes. However, infections remain the most common cause of non-cardiovascular death (15–20%) [5,6].

After solid-organ transplantation (SOT), OIs flourish in the first 12 months, boosted by the immunosuppressed status [2]: fewer than 20% of SOT recipients receive no induction therapy, and up to 60% of kidney transplant recipients receive a T-cell-depleting agent [7,8]. Anti-thymocyte globulin induces rapid, profound, and long-lasting depletion of T-lymphocytes in peripheral blood and lymphoid organs, and apparently does not spare B-cell and NK-cell populations [9,10]. Thanks to such therapies, patient and kidney allograft survival after kidney transplantation have markedly improved and acute allograft rejection has decreased [11–13]. On the other hand, one could argue that the long duration of immunosuppression might be the culprit for the increased incidence of OIs.

The epidemiology of OIs after SOT was previously described in two large cohorts of transplant recipients. The first, conducted 10 years ago, included SOT recipients treated with alemtuzumab [4] and showed that receiving a lung or intestinal transplant was an independent risk factor for OI [4]. The second, published in the era of modern immunosuppression and after the wide adoption of prevention strategies, included abdominal SOT recipients (kidney, pancreas, and liver), hence heterogeneous patient profiles and immunosuppressive regimens [3]. The authors highlighted the delayed onset of OIs, most of which occurred after six months, without any impact on recipient survival or graft function [3]. A recent pediatric cohort of kidney allograft recipients confirmed the absence of impact of viral OIs (CMV, Epstein-Barr virus (EBV), and BK virus (BKV)) on kidney allograft survival [14]. In other studies of kidney allograft recipients, only selected OIs, secondary to specific pathogens (*Nocardia*, *Aspergillus*, *Cryptococcus neoformans*), have been reported [15–17].

Given the lack of clinical and epidemiological data on OIs after kidney allograft transplantation, we conducted a large monocentric cohort study on all kidney allograft recipients in our center to analyze the epidemiology of OIs and their impact on kidney recipient survival and allograft function.

#### **2. Materials and Methods**

#### *2.1. Study Design and Patients*

We conducted a single-center retrospective cohort study enrolling all adult kidney allograft recipients transplanted between January 2008 and December 2013. We excluded cases with primary allograft non-function occurring within seven days after transplantation. Expanded criteria donor (ECD) was defined as a donor older than 60 years, or between 50 and 60 years with two of the three following criteria: (i) hypertension; (ii) pre-retrieval serum creatinine > 1.50 mg/dL; and (iii) cerebrovascular cause of brain death [18]. Glomerular filtration rate was estimated (eGFR) using the MDRD formula [19]. Acute rejection episodes were classified according to the updated Banff classification [20]. Allograft loss was considered if eGFR fell below 15 mL/min/1.73 m². All recipients were followed for at least one year after transplantation unless death or graft loss occurred earlier.
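The ECD definition and the eGFR-based allograft-loss criterion above can be sketched as two small helpers. This is an illustrative sketch, not code from the study: the function and parameter names are our own, and the eGFR function uses the standard 4-variable MDRD equation cited by the authors [19].

```python
def is_expanded_criteria_donor(age, hypertension, creatinine_mg_dl,
                               cerebrovascular_death):
    """ECD: donor older than 60 years, or between 50 and 60 years with
    at least two of: hypertension, pre-retrieval serum creatinine
    > 1.50 mg/dL, cerebrovascular cause of brain death [18]."""
    if age > 60:
        return True
    if 50 <= age <= 60:
        criteria = [hypertension, creatinine_mg_dl > 1.50, cerebrovascular_death]
        return sum(criteria) >= 2
    return False


def egfr_mdrd(creatinine_mg_dl, age, female, black):
    """4-variable MDRD estimate of GFR in mL/min/1.73 m^2 [19]."""
    egfr = 175.0 * creatinine_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr
```

For example, a 55-year-old hypertensive donor with pre-retrieval creatinine of 1.6 mg/dL meets the ECD definition, and allograft loss would be recorded once the MDRD estimate falls below 15 mL/min/1.73 m².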

#### *2.2. Infectious Prophylaxis*

CMV prophylaxis followed international recommendations [21]: oral valganciclovir was administered to high-risk (D+/R-) and intermediate-risk (R+ treated with thymoglobulin) patients, for 6 months in high-risk and 3 months in intermediate-risk patients.

Patients with a past history of tuberculosis were treated with isoniazid for three months after transplantation. *Pneumocystis jirovecii* prophylaxis consisted of trimethoprim-sulfamethoxazole (400 mg) or aerosolized pentamidine (Pentacarinat) for 12 months after transplantation, and was continued as long as the CD4 count remained below 200/μL.
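The center's CMV prophylaxis rules described above reduce to a small decision function. This is an illustrative sketch with hypothetical names, not code used by the study; it only encodes the two risk groups the text defines.

```python
def cmv_prophylaxis_months(donor_seropositive, recipient_seropositive,
                           thymoglobulin_induction):
    """Valganciclovir duration under the center's protocol [21]:
    6 months for high-risk (D+/R-) recipients, 3 months for
    intermediate-risk (R+ treated with thymoglobulin) recipients,
    and none otherwise under this rule."""
    if donor_seropositive and not recipient_seropositive:
        return 6   # high risk: D+/R-
    if recipient_seropositive and thymoglobulin_induction:
        return 3   # intermediate risk: R+ with thymoglobulin induction
    return 0       # no prophylaxis indicated by this rule
```

For instance, a seronegative recipient of a seropositive donor kidney receives 6 months of valganciclovir regardless of induction agent.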

#### *2.3. Opportunistic Infections*

OIs were defined according to the current literature [1] and international guidelines [22,23]. All episodes were retrospectively validated in a blinded fashion by an infectious disease specialist from the study group, who reviewed anonymized medical reports and the final clinical and biological conclusions for each infection occurring in the kidney transplant recipients included in the study. The following OIs were considered:



We included BKV infection because BK virus, although highly seroprevalent in humans, appears to cause clinical disease only in immunocompromised patients and almost exclusively after kidney transplantation, producing a tubulointerstitial nephritis (BKV-induced nephropathy) whose severity is directly related to plasma viral load [24]. In our center, during the first year after kidney transplantation, BK viruria was tested at 1, 2, 3, 6, 9, and 12 months. BK viremia was checked once BK viruria was positive. If BK viruria was positive (with or without BK viremia), blood testing was repeated every two weeks.
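The center's BK monitoring algorithm above can be expressed as a tiny state function. This is a didactic sketch with names of our own choosing, not code from the study; it mirrors the schedule and escalation steps exactly as the text states them.

```python
# Scheduled BK viruria checks during the first post-transplant year (months)
BK_VIRURIA_SCHEDULE = (1, 2, 3, 6, 9, 12)

def next_bk_step(viruria_positive, viremia_checked):
    """Next monitoring action: viremia is checked once viruria turns
    positive, and positive viruria (with or without viremia) triggers
    blood testing every two weeks."""
    if not viruria_positive:
        return "continue scheduled viruria testing"
    if not viremia_checked:
        return "check BK viremia"
    return "repeat blood test every two weeks"
```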

We also considered Kaposi sarcoma, as one of its four types is organ-transplant-associated and usually regresses with reduction of immunosuppression [25].



#### *2.4. Endpoints*

Clinical endpoints were an OI episode, death, and allograft loss. Recipients with at least one OI episode were compared with the control group, which included all other kidney allograft recipients transplanted during the same period.

#### *2.5. Statistical Analysis*

Continuous variables are presented as mean (± standard deviation (SD)) or median (interquartile range (IQR)). Categorical variables are presented as counts (%). Baseline donor, recipient, and kidney transplant characteristics were compared between OI and control groups using Student's *t*-test or the Wilcoxon test for continuous variables, and the chi-squared or Fisher's exact test for categorical variables, as appropriate. Time-to-event survival analyses were conducted to determine predictors of OI occurrence, patient overall survival, and allograft survival. Survival curves were plotted using the Kaplan–Meier method, with log-rank tests to assess the significance of group comparisons. Time-varying Cox proportional hazards models were built for each endpoint, and hazard ratios (HR) with their 95% confidence intervals (95% CI) were calculated. Factors yielding *p* < 0.2 in the univariate analyses were then considered in the multivariable models, using a stepwise backward approach that sequentially removed variables not significant at *p* < 0.1 until the final model was reached. Variables with available repeated data over time were entered both as time-fixed (value at the time of transplantation) and as time-varying (all available time points) variables in the Cox model. No imputation of missing data was done. Competing-risk survival analysis (e.g., the Fine–Gray method) cannot be directly applied to time-varying variables, so only results from Cox models are reported for allograft survival. All tests were two-tailed, and significance was set at *p* < 0.05. The analysis was performed using Stata SE v15.1 (College Station, TX, USA).
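As a concrete illustration of the survival machinery used here, the Kaplan–Meier product-limit estimate can be written in a few lines. This is a didactic sketch (the study itself used Stata, not Python), handling right-censoring as in the standard definition: the survival probability is only updated at times where events occur, while censored subjects simply leave the risk set.

```python
def kaplan_meier(times, events):
    """Product-limit estimate of S(t).

    times  -- follow-up durations (e.g., months to OI or censoring)
    events -- 1 if the event occurred at that time, 0 if censored
    Returns a list of (event_time, survival_probability) pairs.
    """
    pairs = sorted(zip(times, events))
    at_risk = len(pairs)
    surv = 1.0
    curve = []
    i = 0
    while i < len(pairs):
        t = pairs[i][0]
        deaths = removed = 0
        # Group all subjects sharing this follow-up time
        while i < len(pairs) and pairs[i][0] == t:
            deaths += pairs[i][1]
            removed += 1
            i += 1
        if deaths:  # survival only drops at event times
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        at_risk -= removed  # events and censored both leave the risk set
    return curve
```

For example, `kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0])` drops the estimate to 0.75 at t = 1 and to 0.375 at t = 3; a log-rank test then compares such curves between the OI and control groups.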

#### **3. Results**

#### *3.1. Whole Cohort*

A flow chart of the study population is presented in Figure 1. Between January 2008 and December 2013, 557 kidney transplantations were performed in 557 patients, of whom 19 showed early primary allograft non-function. Thus, 538 transplantations in 538 patients were included. Mean age was 52 ± 14 years. Mean follow-up was 55 ± 24 months. At the end of the follow-up period, patient survival was 88% with 65 deaths, allograft survival was 87% with 72 allograft losses, and mean eGFR was 48 ± 20 mL/min/1.73 m². Tables 1 and 2 describe the whole cohort.

**Figure 1.** Flow chart of the study population. Between January 2008 and December 2013, 557 kidney transplantations were performed in *n* = 557 patients. Nineteen patients were excluded because of primary allograft non-function within the first week after transplantation. The final cohort included 538 transplantations in 538 patients.





#### *3.2. Opportunistic Infections*

Eighty OI episodes were reported in 72 patients (15%). The median time to post-transplantation OI was 12.8 (6.0–31.2) months, and in 39 patients (48.8%), OI occurred within the first post-transplantation year.

Viruses were the leading cause of OI episodes, *n* = 54 (68%), representing 10% of the whole cohort. Median time to viral OI onset was 14 (7–31) months after transplantation. Of those viral OIs, we recorded 21 (39%) shingles (4%-whole cohort), 18 (33%) BKV nephropathy (BKVN) (3%-whole cohort), 6 (11%) Kaposi sarcoma (1%-whole cohort), 3 (6%) CMV disease (0.5%-whole cohort), 3 (6%) norovirus gastroenteritis (0.5%-whole cohort), and 1 (2%) of each of the following: JC virus causing progressive multifocal leukoencephalopathy (PML) (0.2%-whole cohort), VZV retinitis (0.2%-whole cohort), and HSV-1 esophagitis (0.2%-whole cohort).

Fungal infections were the second most common OIs, registered in 15 patients (19%) (3%-whole cohort), in the first 6 (2–25) months after transplantation, significantly earlier than viral infections (*p* = 0.04). We counted five (33%) cases of invasive candidiasis (0.9%-whole cohort), four (27%) of invasive aspergillosis (IA) (0.7%-whole cohort), three (20%) of cryptococcosis (0.5%-whole cohort), two (13%) of *Pneumocystis* pneumonia (PCP) (0.3%-whole cohort), and one (7%) disseminated *Trichophyton rubrum* infection (0.2%-whole cohort).

Among the six (7%) parasitic infections (1%-whole cohort) occurring 16 (5–23) months after transplantation, four were cryptosporidiosis (0.7%-whole cohort) and two microsporidiosis with gastrointestinal involvement (0.3%-whole cohort). Finally, five (6%) bacterial infections (0.9%-whole cohort) were described, of which two (40%) were tuberculosis (0.3%-whole cohort), two (40%) were nocardiosis (0.3%-whole cohort), and one (20%) was disseminated atypical mycobacteria infection (0.15%-whole cohort). Time to post-transplantation infection was 11 (9–34) months. Seven (10%) recipients had more than one post-transplantation OI episode.

The comparison between the OI and control groups is shown in Tables 1 and 2. Donors were significantly older in the OI group than in the control group (*p* = 0.02), with a similar trend for recipients (*p* = 0.056). At the time of transplantation, blood lymphocyte count was significantly lower in the OI group (*p* = 0.04). Numbers and percentages of CD4 and CD8 T-cells were similar in both groups, as were the immunosuppressive treatments after transplantation (induction and maintenance).

The estimated GFR in the OI group was significantly lower than in the control group at every time point considered (i.e., at 12 months and at the last available follow-up). Acute rejection incidence and CMV viremia were similar in both groups. At the end of follow-up, event rates, allograft loss, and time to death after transplantation were similar in both groups.

In time-to-event analysis, the univariate risk factors for OIs after kidney transplantation (Table 3) were older recipient age (HR 1.02 (1–1.04), *p* = 0.03), older donor age (1.02 (1.01–1.04), *p* = 0.02), and ECD (2.76 (1.68–4.54), *p* < 0.0001). A higher CD4+ T-cell count during follow-up and a higher blood lymphocyte count at the time of transplantation were protective factors against OI (0.31 (0.11–0.83) and 0.61 (0.40–0.95), respectively). At the time of transplantation, blood lymphocyte count was significantly lower in patients with OI (median 1.2 (IQR 0.9–1.6) vs. 1.3 (1.0–1.8) in controls; *p* = 0.04; Table 1), while CD4/CD8 numbers (%) were similar in both groups (Table 1) and in time-to-event analysis (Table 3). Induction and maintenance immunosuppressive regimens, acute rejection episodes, and CMV viremia were not OI risk factors.

Independent risk factors for OI in the multivariable analysis were ECD (2.53 (1.48–4.31), *p* = 0.0007) and BK viremia (6.38 (3.62–11.23), *p* < 0.0001). A high blood lymphocyte count at the time of transplantation was an independent protective factor (0.60 (0.38–0.94), *p* = 0.026). The multivariable analysis conducted only on patients with available pre-transplantation CD4 T-cell counts (*n* = 456) showed that ECD (2.92 (1.62–5.27), *p* = 0.0004) and BK viremia (5.11 (2.72–9.57), *p* < 0.0001) were independent risk factors for OI. In contrast, a higher CD4 T-cell percentage during follow-up (time-varying variable) (0.98 (0.96–0.99), *p* = 0.015) and, to a lesser extent, a higher lymphocyte count at the time of transplantation (0.68 (0.44–1.07), *p* = 0.09) were independent protective factors.



*J. Clin. Med.* **2019**, *8*, 594



#### *3.3. Patients and Allograft Survival*

In the OI group, patient survival was significantly lower than in the control group (Figure 2a, *p* = 0.009). After an OI episode, 10 patients (14%) died, of whom three (30%) died of the OI itself (one PML, one PCP, and one IA). Other causes of death were cardiovascular disease (*n* = 3), hemorrhagic shock (*n* = 1), trauma (*n* = 1), bacterial infection (*n* = 1), and neoplasia (*n* = 1). OI was not an independent risk factor for death in the multivariable analysis (Table 4): it lost statistical significance after adjustment for recipient age at transplantation, CD8+ T-cells (/1000) during follow-up, neutrophils (/1000) during follow-up, HCV+ status, prior kidney transplantation, and diabetes. Consequently, and in accordance with our statistical analysis strategy (Section 2), OI was left out of Table 4, which shows only the results of the final multivariable model after the stepwise backward approach was applied.

**Figure 2.** Patient, allograft, and event-free survival in both groups (Kaplan–Meier survival analysis): (**a**) in OI group, patient survival was significantly lower than in control group (*p* = 0.009); (**b**) allograft survival was significantly lower in OI group (*p* = 0.0002); and (**c**) allograft survival without BK virus nephropathy was not significantly lower in OI group (*p* = 0.87).


**Table 4.** Patient and allograft survival independent risk factors (time varying Cox model).

Allograft survival was significantly lower in the OI group (Figure 2b, *p* = 0.0002). After an OI episode, allograft loss occurred in 13 (18%) patients, around 31 (5–63) months after transplantation. Causes of allograft loss were BKVN (five, 38%), chronic allograft dysfunction (five, 38%), refractory acute rejection (two, 15%), and unknown (one, 8%). An OI episode was an independent risk factor for allograft loss, with HR = 2.53 (1.29–4.95) (*p* = 0.007) (Table 4).

#### *3.4. Analysis Excluding BKVN*

As BKVN is well known to cause a chronic destructive infection [24], we performed another analysis excluding BKVN events (Table 3). ECD and a low blood lymphocyte count at the time of transplantation remained the two independent risk factors for an OI episode (4.09 (2.06–8.09), *p* < 0.0001 and 0.64 (0.38–1.06), *p* = 0.08, respectively). OI was not found to be a risk factor for allograft loss (*p* = 0.87; Figure 2c and Table S1).

#### **4. Discussion**

We present here the results of a monocentric cohort analysis conducted on more than 500 kidney allograft recipients. We showed that, in the era of modern immunosuppression and the wide use of infectious disease prophylactic strategies, OIs occurred more than one year after transplantation and that pre-transplantation lymphopenia was an independent risk factor for OI episode, which was not the case for induction therapy. Moreover, OIs were an independent risk factor for allograft loss but had no effect on patient survival.

Although OIs are well defined in the setting of HIV [23], no classification of post-SOT OIs is currently available [2]. We therefore carefully applied the current OI definitions to the post-SOT setting, taking into account the standardized immunosuppressive regimen and the type of SOT. On this point, former studies of allograft recipients were quite heterogeneous regarding the infections considered and the type of SOT [3,4,14]. To our knowledge, no study evaluating the risk factors for OIs versus more severe common infections in transplanted patients has been published; therefore, no conclusion regarding pathophysiology and risk factors is available. In our cohort, we used the HIV classification to define OI, updated with BKVN, an immunosuppression-induced infection after kidney transplantation [23,24]. This selection process allowed us to provide reliable data on the incidence and spectrum of OI after kidney transplantation and could be routinely used by clinicians to tailor prevention strategies to the patient's condition.

The OI proportion in our cohort was significantly lower than the most recently published incidence rate of around 25% [3]. Several explanations may account for this low incidence. First, the post-transplantation CMV, PCP, and bacterial prophylaxis strategies used in our center comply with international recommendations (e.g., trimethoprim-sulfamethoxazole for *Nocardia*) [5,21]. Second, immunocompromised recipients were exposed to lower calcineurin inhibitor (CNI) levels, a strategy previously described to significantly decrease OI incidence [26]. Lastly, solid-organ failure before transplantation induces variable degrees of immune suppression: for instance, liver cirrhosis is associated with dysfunction of the defensive mechanisms against infections and a higher incidence of sepsis [27], unlike end-stage renal failure [28]. Accordingly, the risk of fungal infection is lower after kidney transplantation than in other SOT populations [29].

Thus, we updated the description of post-kidney-transplantation OIs to align it with the new immunosuppressive therapy strategies. In our cohort, the incidence of CMV disease was significantly lower than previously described, probably because of the application of regularly updated prevention recommendations [2]. However, viral infections remained the first cause of OIs, mainly cutaneous shingles and BKVN. No prevention strategy is currently recommended for shingles. BKVN is clearly problematic after kidney transplantation since it thrives on immunosuppression, has a great impact on kidney allograft survival, and has no curative treatment [24]. IA incidence was also lower in our cohort [29], whereas the incidence of other OIs was in the previously described range for kidney transplantation [16].

Interestingly, time to OI onset was long, more than one year after transplantation, whereas the latest review reported a peak of OI at 6–12 months after transplantation [2]. Again, prevention strategies probably postpone the onset of post-transplantation infections. However, post-transplantation fungal infections developed significantly earlier, as in former studies, confirming that these infections flourish at the peak of immunosuppression [29]. Apart from PCP, no prevention strategy is currently recommended for these infections.

Thereafter, we aimed to identify independent risk factors for post-kidney-transplantation OI. We found that ECD and a low pre-transplantation lymphocyte count were independent risk factors, while the type of induction immunosuppressive treatment and recipient age were not. In kidney allograft recipients, older donor age, irrespective of recipient age, increases the rate of acute allograft rejection and infections [30,31]; the underlying immune system seems to matter more than immunosuppressive therapy. Aged transplanted mice can have an impaired anti-infectious response with accumulation of memory CD4+ T-cells and a reduced Th1 anti-donor immune response [32,33]. These immunological effects could significantly decrease the anti-infectious response of recipients transplanted from an ECD. A high CD4+ T-cell count was a significant protective factor, whereas CD8+ T-cell count had no effect, and CD4/CD8 numbers (%) at the time of transplantation were similar in both groups. The total lymphocyte count had a higher predictive value for OI than the separate CD4/CD8 levels. However, the study population for the CD4/CD8 analyses was slightly smaller due to missing information on these variables, possibly resulting in a moderate loss of statistical power. A high rate of late-stage differentiated CD28+CD57+CD4+ T-cells at the time of transplantation is independently associated with a decreased risk of OI [28]. The role of naive CD4+ T-cells remains to be determined, since this phenotype has been associated with a high risk of infection in patients with common variable immunodeficiency [34]. Surprisingly, immunosuppressive induction with depleting monoclonal agents was not associated with OI incidence. Comparisons of infection risk between depleting and non-depleting therapies have yielded conflicting data, although the most recent work showed that thymoglobulin was not associated with a higher infection risk [35–37]. Almost all of our patients were treated with induction therapy. Omitting induction therapy in immunocompromised kidney allograft recipients could be an option [38]; whether the absence of induction is associated with a significantly lower incidence of OI needs to be elucidated.

How lymphopenia before transplantation could influence OIs occurring more than one year after transplantation remains unknown. Again, the wide use of prophylaxis (trimethoprim-sulfamethoxazole and valganciclovir) prevents early infections (mostly PCP, *Nocardia*, and CMV disease). Considering late infections, we believe that lymphopenia before transplantation could reflect the cumulative effect of immunosuppressive therapies in older patients.

Our data confirm that OI is not an independent risk factor for death [3,4]. In a recent large Finnish cohort, OIs rarely caused death after kidney transplantation; the most common cause of infection-related mortality was common bacterial infections, e.g., septicemia and pneumonia [6]. The lack of an OI-related effect on mortality, compared with the role of common bacterial infections, calls for deeper analyses of the causes of and risk factors for common infections, which should enable us to adjust prevention strategies to different contexts. Additionally, recent data suggest that infections could be the first cause of death after transplantation [39].

Finally, in our cohort, OI was an independent risk factor for allograft loss only when BKVN episodes were included. The negative impact of BKVN on kidney allograft survival is well documented [24]. Indeed, when we excluded BKVN from the OI episodes, we found no long-term impact of OI on kidney allograft survival [3]. To decrease BKVN, only mTOR-inhibitor-based immunosuppressive combinations have shown a significant effect and should thus be considered in all patients with standard immunological risk [40].

Our study has limitations. The first is its retrospective, single-center design; these results must be confirmed in a prospective multicentric cohort. However, a single-center study implies a single, uniform approach to immunosuppression management after transplantation. The second is that we provided an overview of OIs without considering the specific prognosis of each infection.

In conclusion, our study showed that, in the era of modern immunosuppression and the wide use of infection prophylactic regimens, OIs occurred later, more than one year after kidney transplantation, and were mainly viral. Pre-transplantation lymphopenia and ECD were the two independent risk factors for OI, hence the need for customized immunosuppressive regimens in such transplant candidates. BKVN incidence remained high, with a clear negative impact on allograft survival. In low-risk recipients, mTOR-based immunosuppressive therapy is the only prophylaxis shown to prevent BKVN and should be considered more widely. Two further issues need to be studied: the specific role of pre-transplantation leucocyte subpopulations, especially naive T-cells, and the difference between OIs and the common infections that have been described as the main cause of patient death after kidney transplantation.

**Supplementary Materials:** The following are available online at http://www.mdpi.com/2077-0383/8/5/594/s1, Table S1: Allograft survival risk factors univariable analysis.

**Author Contributions:** Conceptualization, P.A., G.M., D.B., N.D.C., and M.M.; Methodology, G.M., E.A., and M.M.; Software, E.A.; Validation, all authors; Formal Analysis, P.A., G.M., D.B., E.A., M.M., and P.G.; Investigation, P.A., G.M., and M.M.; Resources, M.M.; Data Curation, V.A., T.S., S.F., G.G., F.B., and V.F.; Writing—Original Draft Preparation, P.A., G.M., and M.M.; Writing—Review and Editing: P.A., G.M., D.B., V.F., S.F., M.M., and P.G.; Visualization, G.M.; and Supervision, M.M.

**Conflicts of Interest:** The authors declare no conflict of interest. The results presented in this paper have not been published previously in whole or part, except in abstract format.

#### **Abbreviations**


#### **References**



#### *Article*

### **Higher Incidence of BK Virus Nephropathy in Pediatric Kidney Allograft Recipients with Alport Syndrome**

**Young Hoon Cho 1, Hye Sun Hyun 1,2, Eujin Park 1,3, Kyung Chul Moon 4, Sang-Il Min 5, Jongwon Ha 5,6, Il-Soo Ha 1, Hae Il Cheong 1, Yo Han Ahn 1,7,\*,† and Hee Gyung Kang 1,\*,†**


Received: 7 March 2019; Accepted: 8 April 2019; Published: 11 April 2019

**Abstract:** A retrospective review was performed to assess the risk factors and outcomes of BK virus infection and nephropathy (BKVN), an early complication in pediatric kidney allograft recipients. The study investigated the incidence, risk factors, and clinical outcomes of BK viremia and BKVN in a Korean population of pediatric patients who received renal transplantation between 2001 and 2015 at the Seoul National University Hospital. BKVN was defined as biopsy-proven BKVN or a plasma BK viral load >10,000 copies/mL for >3 weeks. BK viremia was defined as a BK viral load >100 copies/mL in blood. Among 168 patients assessed for BK virus status, 30 patients (17.9%) tested positive for BK viremia at a median of 12.6 months after transplantation. BKVN was diagnosed in six patients (3.6%) at a median of 13.4 months after transplantation. Three of the six BKVN patients had Alport syndrome (*p* = 0.003), despite this disease comprising only 6% of the study population. Every patient with both BK viremia and Alport syndrome developed BKVN, while only 11.1% of patients with BK viremia progressed to BKVN in the absence of Alport syndrome. Multivariate analysis revealed that Alport syndrome was associated with BKVN development (hazard ratio 13.2, *p* = 0.002). BKVN treatment included the reduction of immunosuppression, leflunomide, and intravenous immunoglobulin. No allografts were lost in the two years following the diagnosis of BKVN. In summary, the incidence of BKVN in pediatric kidney allograft recipients was similar to findings in previous reports, but was higher in patients with underlying Alport syndrome.

**Keywords:** BK virus; BK virus nephropathy; kidney allograft; transplantation; Alport syndrome; children

#### **1. Introduction**

BK virus (BKV) is a polyomavirus that resides in the urogenital tract as a latent infection [1]. The seroprevalence of BKV in the first decade of life is 90% or higher [1,2], implying that most primary infections occur during childhood. In immunocompromised patients, reactivation of a latent infection is frequently observed [1,3] as BK virus nephropathy (BKVN) or hemorrhagic cystitis [1,4]. While hemorrhagic cystitis frequently develops in patients undergoing hematologic stem cell transplantation, BKVN is essentially a complication of kidney transplantation [2,5]. BKVN has recently gained clinical significance with the introduction of potent immunosuppressive agents. The prevalence of BKVN in adult kidney allograft recipients is 1% to 10% [2,5,6] and has been reported to be 2% to 8% in pediatric renal transplant recipients [7,8]. BKVN is considered an early complication of kidney transplantation that most often occurs within the first year after transplantation; 95% of BKVN develops within two years of transplantation according to the organization Kidney Disease: Improving Global Outcomes (KDIGO), but it may develop as late as the fifth year [9–11]. Importantly, after BKVN is diagnosed, over 15% of patients are expected to lose their allograft kidney within one year [9].

Risk factors for BKVN in kidney allograft recipients include older age, male sex, ethnicity (non-African American), increased number of human leukocyte antigen (HLA) mismatches, prolonged cold-ischemia time, ureteral stent placement, immunosuppression induction with anti-thymocyte globulin, tacrolimus- and/or mycophenolate mofetil-based maintenance immunosuppression, and prior rejection history [3,12]. The most important risk factor for BKVN is considered to be the degree of immunosuppression [12]. However, risk factors for BKVN in children have not yet been studied sufficiently. In a retrospective cohort study of children, a seronegative status for BKV in recipients was associated with BKVN [13].

To gain a better understanding of BKVN in pediatric kidney transplant recipients, this study assessed the clinical characteristics and risk factors of BK virus infection and BKVN in pediatric kidney allograft recipients.

#### **2. Methods**

#### *2.1. Study Population and Ethics*

We retrospectively reviewed the medical records of all pediatric kidney allograft recipients who underwent transplantation at the Seoul National University Hospital between January 2001 and July 2015. Clinical findings until July 2018 were assessed. This study was approved by the Institutional Review Board of the Seoul National University Hospital (IRB no. 1808-156-967) and was conducted in accordance with the Declaration of Helsinki.

#### *2.2. Immunosuppression*

The immunosuppression protocol of the Seoul National University Hospital consisted of steroids, tacrolimus, and mycophenolate mofetil. Methylprednisolone was administered as a 10 mg/kg intravenous bolus dose at the time of surgery and was tapered gradually to a maintenance dose of prednisolone 0.3 mg/kg by one month after transplantation. In patients with a low risk of rejection, prednisolone was discontinued by one year after transplantation. The tacrolimus target trough level was 8–12 ng/mL for up to three months, 6–8 ng/mL until six months, and 4–6 ng/mL thereafter. From 2001 to 2008, basiliximab was used as an induction therapy for high-risk patients with a transplanted kidney from a deceased donor or a high number of HLA mismatches. After 2008, all patients received basiliximab induction therapy.

#### *2.3. BK Viremia and BKVN*

BK virus DNA in the plasma of patients was tested by polymerase chain reaction (PCR) targeting the large T antigen of BKV (BKV ELITe kit, ELITechGroup, Puteaux, France). BK viral load quantification became available as of June 2008, and thereafter a BK viral load of >100 copies/mL was considered to indicate BK viremia. In principle, since 2008, BK viremia was screened for every month for three months after transplantation, then every three months for one year, and then every year up to five years. BK virus was additionally screened for in patients with an unexplained acute rise in serum creatinine or in patients receiving acute rejection treatment. Upon detection, BK viremia was followed every month. A BK virus PCR load persisting above 10,000 copies/mL for >3 weeks was categorized as presumptive BKVN, as described previously [6]. Presumptive BKVN and pathologically proven BKVN were collectively categorized as BKVN to assess the risk factors for BKVN in this study. Histologic grading of BKVN was classified according to the criteria of the University of Maryland, USA [14].
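The diagnostic thresholds above amount to a small classifier. The following is an illustrative sketch, not part of the study's workflow; the function name and input format are hypothetical:

```python
def classify_bk_status(viral_loads):
    """Classify BK virus status from serial plasma PCR results.

    viral_loads: list of (weeks_since_first_test, copies_per_mL) tuples.
    Thresholds follow the definitions in the text: >100 copies/mL in
    blood indicates BK viremia; a load persisting above 10,000 copies/mL
    for more than 3 weeks is categorized as presumptive BKVN.
    """
    if not any(load > 100 for _, load in viral_loads):
        return "no viremia"
    # Weeks at which the load exceeded the presumptive-BKVN threshold.
    high_weeks = [week for week, load in viral_loads if load > 10_000]
    if high_weeks and max(high_weeks) - min(high_weeks) > 3:
        return "presumptive BKVN"
    return "BK viremia"
```

Under this reading, a load above 10,000 copies/mL at, say, weeks 0 and 4 would be flagged as presumptive BKVN, while a transient high load at a single visit would remain classified as BK viremia.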

#### *2.4. Statistical Analysis*

SPSS version 23.0 (SPSS, Armonk, NY, USA) was used for data analysis. Categorical variables were analyzed using the Pearson chi-square test or Fisher's exact test, and continuous variables were compared using the *t*-test or the Mann–Whitney U test. All values are reported as medians (ranges). To assess risk factors for BKVN, univariate analysis was performed using the Kaplan–Meier method with the log-rank test, and multivariate analysis was performed using the Cox proportional hazards model. Factors with *p* < 0.25 in the univariate analysis were included in the multivariate analysis. A value of *p* < 0.05 was considered statistically significant.
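The Kaplan–Meier estimate underlying the univariate analysis is standard; as a minimal stdlib-only sketch (not the authors' SPSS analysis), the product-limit estimator can be written as:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times: follow-up time for each patient (e.g., months to BKVN or censoring).
    events: 1 if the event occurred at that time, 0 if censored.
    Returns a list of (time, survival probability) steps.
    """
    n_at_risk = len(times)
    survival, curve = 1.0, []
    for t in sorted(set(times)):
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if d:
            # Product-limit step: S(t) = S(t-) * (1 - d / n_at_risk).
            survival *= 1 - d / n_at_risk
            curve.append((t, survival))
        # Everyone whose follow-up ends at t leaves the risk set.
        n_at_risk -= sum(1 for ti in times if ti == t)
    return curve
```

For example, with follow-up times `[1, 2, 3, 4]` and event indicators `[1, 1, 0, 1]`, the curve steps down to 0.75 at time 1 and to 0.5 at time 2, with the censored patient at time 3 shrinking the risk set but producing no step.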

#### **3. Results**

A total of 195 patients younger than 20 years underwent allograft kidney transplantation between January 2001 and July 2015, and of these, 168 were tested for BK virus by PCR more than once (Table 1). BK viremia was detected in 30 patients (17.9%, 30/168) at 12.6 months (0.4–73.1 months) after transplantation (Figure 1). BKVN was diagnosed in six patients (3.6%, 6/168; BKVN group) at 13.4 months (3.9–60.0 months) after transplantation, with a BK viral titer of 46,508 copies/mL (16,924–289,699 copies/mL) at first detection at 10.9 months (1.5–60.0 months) and a peak of 306,277 copies/mL (41,914–10,165,852 copies/mL) at 14.2 months (4.3–60.3 months). The BK viral titer in the 24 patients with BK viremia but not BKVN (only-BK viremia group) was 437 copies/mL (119–6794 copies/mL) at first detection at 12.6 months (0.4–73.1 months) after kidney transplantation, and increased to 561 copies/mL (123–42,288 copies/mL) at 15.3 months (0.4–73.7 months).

**Figure 1.** Onset of BK viremia after kidney transplantation. Values are represented as number of patients (% of total subject population).


**Table 1.** Baseline characteristics of patients with or without BK virus nephropathy.

<sup>1</sup> Cardiomyopathy, myocarditis, congenital heart defect, and myocardial infarction. <sup>2</sup> Developmental delay, congenital malformations of the nervous system, and epilepsy. <sup>3</sup> Hepatitis, fatty liver, congenital hepatic fibrosis, and liver cirrhosis. Values are expressed as numbers (%) and median (range). Abbreviations: BKVN: BK virus nephropathy; CAKUT: congenital anomalies of the kidney and the urinary tract; HLA: human leukocyte antigen; MMF: mycophenolate mofetil; Tac: tacrolimus; BSX: basiliximab; ATG: anti-thymocyte globulin; CMV: cytomegalovirus; EBV: Epstein–Barr virus; PTLD: post-transplant lymphoproliferative disease.

#### *3.1. Risk Factors for BKVN and BK Viremia*

To assess the risk factors for BKVN, the BKVN group (*n* = 6) and the remaining patients, including the BK viremia group and those who had not shown BK viremia (*n* = 162, non-BKVN group, Table 1), were compared. There were no statistically significant differences in sex, age at transplantation, primary kidney disease, donor source, HLA mismatch, induction with polyclonal or monoclonal antibody, prior acute rejection, or Epstein–Barr virus or cytomegalovirus infection. None of the BKVN patients had a ureteral stent placed after kidney transplantation. Interestingly, as the primary disease of the native kidneys, Alport syndrome was significantly more common in the BKVN group than in the non-BKVN group (50% vs. 4.3%, *p* = 0.003). Multivariate Cox proportional hazards analysis also showed that Alport syndrome was a significant risk factor for BKVN (Table 2).
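As an arithmetic check on the Alport comparison (a stdlib-only sketch; the authors used SPSS), the one-sided Fisher exact p-value for the 2×2 table implied by the text (3 of 10 Alport patients versus 3 of 158 non-Alport patients developing BKVN) follows from the hypergeometric tail:

```python
from math import comb

def hypergeom_upper_tail(k, K, n, N):
    """P(X >= k) when drawing n subjects from a population of N that
    contains K 'successes'; equivalent to a one-sided Fisher exact test."""
    return sum(
        comb(K, x) * comb(N - K, n - x) for x in range(k, min(K, n) + 1)
    ) / comb(N, n)

# 168 recipients, 10 with Alport syndrome, 6 BKVN cases, 3 of them with Alport.
p = hypergeom_upper_tail(k=3, K=10, n=6, N=168)  # ~0.0028, in line with p = 0.003
```

The tail probability comes out near 0.0028, consistent with the reported *p* = 0.003.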


**Table 2.** Risk factors for BK virus nephropathy.

<sup>1</sup> Factors with a value of *p* < 0.25 in the univariate analysis were included in the multivariate analysis. Abbreviations: CI: confidence interval; NS: not significant; HLA: human leukocyte antigen.

Comparison of the BK viremia and non-BK viremia groups revealed that basiliximab induction therapy and transplantation after 2008 were significantly more frequent in the BK viremia group (Table S1). Multivariable Cox hazards regression analysis showed that induction with basiliximab was a significant risk factor for the development of BK viremia (Table S2).

Comparison between the BKVN group and the non-BKVN BK viremia group (BK viremia-only group) revealed no significant differences between the two groups, except for BK viral load and Alport syndrome (Table S3).

#### *3.2. Clinical Course of BKVN*

Among the six patients in the BKVN group, four had pathologically proven BKVN and the remaining two presented with presumptive BKVN. BKVN was managed with the reduction of immunosuppressive medications, intravenous immunoglobulin, leflunomide, ciprofloxacin, or cidofovir (Table 3). Three of the six patients had a pathological diagnosis of acute rejection along with BKVN on allograft kidney biopsy and were also treated with intravenous methylprednisolone.

After 2.2–8.3 years of follow-up, no patients experienced graft loss, but impairment of renal function was evident (median estimated glomerular filtration rate 39.9 (range, 20.7–56.9) mL/min/1.73 m<sup>2</sup>). BK viremia was cleared in only three patients, over a median of 20.5 months (range, 16.6–89.9 months) after the first detection of viremia.


**Table 3.** The clinical course of patients with BKVN.

Abbreviations: ATN: acute tubular necrosis; VUR: vesicoureteral reflux; FSGS: focal segmental glomerulosclerosis; D: deceased; L: living; ND: not done; IS: immunosuppressant; IVIG: intravenous immunoglobulin; mo: months; yr: year.

#### *3.3. Alport Syndrome and BKVN*

Although Alport syndrome comprised only 6% of the study population (10 of 168 patients), as the primary disease of the native kidneys it accounted for 50% of BKVN patients. The prevalence of BK viremia was 30% (3 of 10 patients) in Alport syndrome, and all BK viremia in patients with Alport syndrome progressed to BKVN, whereas BK viremia was found in 17.1% of the remaining patients (27 of 158 patients with a primary disease other than Alport syndrome), of whom only 11.1% progressed to BKVN. In addition, BK viremia was detected relatively late in patients with Alport syndrome, at 16, 24, and 60 months after transplantation (Table 3), and the initial viral load was higher, with a median of 68,919 copies/mL (versus 4773 copies/mL in the other patients with BK viremia, *p* = 0.001). BK viremia did not resolve in these patients, despite treatment for more than 2.2 years. BKVN-free survival curves of individuals with Alport syndrome and the other patients also differed significantly (Kaplan–Meier analysis, log-rank test *p* < 0.001, Figure 2).

**Figure 2.** Kaplan–Meier curves indicating progression to BK virus nephropathy after renal transplantation.

#### **4. Discussion**

In our study, the prevalence of BK viremia was 17.9% in pediatric kidney allograft recipients, which was similar to that shown in previous studies in adults (11% to 25%) [4,11,15] and pediatric kidney recipients (21%) [16]. The prevalence of BKVN in our pediatric kidney transplant recipients was 3.6%, which was similar to the 4.6% reported by the North American Pediatric Renal Trials and Collaborative Studies registry [8]. BK viremia was observed immediately following kidney transplantation (0.4 months) in a minority of patients, but most cases occurred some time after the surgery. This suggests that most patients may have contracted BKV from their peers, as primary infections with BK virus usually occur in childhood [17]. However, the pre-donation BKV status of donors had not been assessed at Seoul National University Hospital until recently, so data were unavailable to help identify the source of the BK virus infection. Interestingly, all patients with BKVN received their allograft kidney after 2009 and were older than seven years; however, neither of these parameters was statistically significant, likely due to the small size of the study population. Nevertheless, regarding why BKVN occurred only after 2009, we speculate that: (1) BK monitoring was more aggressive after 2009, once BK viremia quantitation protocols became available, and (2) immunosuppression became more potent with the growing awareness of antibody-mediated rejection associated with insufficient immunosuppression. BKVN in individuals older than seven years may reflect the timing of primary infection or a selection bias resulting from the small pool of younger recipients.

In this study, risk factors previously identified in the adult population, such as older age and type of immunosuppression, were not identified as risk factors for BK viremia and BKVN. This was likely due to the relatively homogeneous study population, as the Korean pediatric patients evaluated mostly underwent induction treatment with monoclonal antibodies and maintenance treatment with tacrolimus and mycophenolate mofetil. Male sex and an increased number of HLA mismatches were not significant factors in this population, although there were no BKVN patients with zero HLA mismatches. Data on ischemia time were not available in this study, but the distribution of donor types did not differ between the groups, implying that ischemia time was not associated with BKVN in this population. Ureteral stent placement was previously indicated as a significant risk factor for BKVN [12], but in this study, none of the BKVN patients underwent ureteral stent placement. Acute rejection was more common in the BK viremia group and in the BKVN group than in the BK viremia-free group, but these differences were not statistically significant either. While previous studies showed that the primary cause of end-stage renal disease was not a risk factor for BKVN in children [8,13,18], in this study population, Alport syndrome was a risk factor for BKVN. Alport syndrome is caused by a genetic defect in type IV collagen, a component of basement membranes. It is a rare hereditary disorder with an incidence of 1 in 50,000 persons [19], but compared to the adult population it is a relatively common cause of end-stage renal disease in children. Although Alport syndrome has not previously been reported as a risk factor for BKVN, a study of kidney transplantations in Australia and New Zealand from 1965 to 2010 reported that, among 243 patients, BKVN (*n* = 6) was found only in the group with Alport syndrome and not in the group without Alport syndrome [20].

Thus, the question arises: how can the finding that Alport syndrome is a risk factor for BKVN be explained? Interestingly, BKVN developed in every patient with BK viremia if the patient had Alport syndrome, while only one-tenth of patients with BK viremia progressed to BKVN (3 of 27) when Alport syndrome patients were excluded. Hence, it is speculated that tubular cells in Alport syndrome patients might be more prone to BK virus propagation, possibly because of the defective distal tubular basement membrane [21]. The gradual repopulation of the allograft kidney by recipient cells has been documented previously [22]; therefore, the vulnerability of individuals with Alport syndrome might emerge gradually, which might explain the delayed occurrence of BKVN. Alternatively, BK viremia screening might not be performed sufficiently early in these patients, resulting in the propagation of BKV irrespective of immunosuppression, and later the development of BKVN. Nonetheless, further study with larger populations is necessary to validate the notion that Alport syndrome is a risk factor for BKVN.

Screening for BK viremia is recommended for the prevention and early detection of BKVN [11]. Guidelines recommend screening for BK viremia regularly (every month for three to six months after transplantation, then every three months for one or two years, and then every year up to five years), as well as upon creatinine elevation and after acute rejection treatment [10]. Considering that Alport syndrome was a significant risk factor for BKVN in this study, more meticulous screening of Alport syndrome patients, continued for a longer period after kidney transplantation, is recommended [23]. When BK viremia is detected on screening, it is recommended that immunosuppression be reduced [24,25]. In this study, with reduced immunosuppression, most cases of BK viremia resolved and did not progress to BKVN. Once BKVN is diagnosed, a few treatment options are available in addition to the reduction of immunosuppression [2,6,10], with intravenous immunoglobulin, fluoroquinolones, cidofovir, and leflunomide reported as partially effective [26]. Most of our BKVN patients were treated with these methods, and no allografts were lost. Nevertheless, neither the allograft renal function of BKVN patients nor the eradication of BK viremia was satisfactory, as both persisted in the three patients with Alport syndrome.
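The guideline schedule described above can be written out as explicit time points. This is an illustrative helper; the month values are one reading of the three-month/quarterly/yearly schedule in the text, not a clinical protocol:

```python
def screening_months(total_years=5):
    """Screening visits (months post-transplant): monthly for the first
    three months, every three months through the first year, then
    yearly up to the given number of years."""
    monthly = [1, 2, 3]
    quarterly = [6, 9, 12]
    yearly = [12 * year for year in range(2, total_years + 1)]
    return monthly + quarterly + yearly
```

With the default five-year horizon this yields ten screening visits, the last at 60 months.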

A shortcoming of this study is that it was a retrospective study of a small patient population. Immunosuppression protocols changed over time, and this factor could not be well controlled. In addition, not all cases of BKVN were confirmed by allograft kidney biopsy. Nonetheless, considering the pediatric focus of the study, the population size was appreciable, and relatively good outcomes of BKVN were evident with aggressive treatment.

#### **5. Conclusions**

The incidence of BKVN was 3.6% in pediatric kidney allograft recipients at the Seoul National University Hospital, and BKVN was associated with the underlying disease Alport syndrome. With aggressive treatment, no BKVN case resulted in the loss of an allograft kidney over more than two years of follow-up. Further study with a larger population is necessary to validate the notion that Alport syndrome is a risk factor for BKVN.

**Supplementary Materials:** The following are available online at http://www.mdpi.com/2077-0383/8/4/491/s1. Table S1: Baseline characteristics of patients with and without BK viremia, Table S2: Risk factors for BK viremia, Table S3: Comparison between the BKVN group and only-BK viremia groups.

**Author Contributions:** Conceptualization, Y.H.C., H.S.H., E.P., K.C.M., S.-I.M., J.H., I.-S.H., H.I.C., Y.H.A., and H.G.K.; Methodology, Y.H.C., I.-S.H., H.I.C., Y.H.A., and H.G.K.; Formal analysis, Y.H.C. and Y.H.A.; Investigation, Y.H.C., H.S.H., and E.P.; Data curation, H.S.H, E.P., K.C.M., S.-I.M.; Writing—original draft preparation, Y.H.C. and H.G.K.; Writing—review and editing, Y.H.A. and H.G.K.; Visualization, Y.H.C. and Y.H.A.; Supervision, J.H., I.-S.H., H.I.C., Y.H.A., and H.G.K.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **References**



#### *Article*

### **Plasma Malondialdehyde and Risk of New-Onset Diabetes after Transplantation in Renal Transplant Recipients: A Prospective Cohort Study**

**Manuela Yepes-Calderón 1, Camilo G. Sotomayor 1,\*, António W. Gomes-Neto 1, Rijk O.B. Gans 2, Stefan P. Berger 1, Gerald Rimbach 3, Tuba Esatbeyoglu 4, Ramón Rodrigo 5, Johanna M. Geleijnse 6, Gerjan J. Navis <sup>1</sup> and Stephan J.L. Bakker <sup>1</sup>**


Received: 17 February 2019; Accepted: 30 March 2019; Published: 4 April 2019

**Abstract:** New-onset diabetes after transplantation (NODAT) is a frequent complication in renal transplant recipients (RTR). Although oxidative stress has been associated with diabetes mellitus, data regarding NODAT are limited. We aimed to prospectively investigate the long-term association between the oxidative stress biomarker malondialdehyde (measured by high-performance liquid chromatography) and NODAT in an extensively phenotyped cohort of non-diabetic RTR with a functioning graft for ≥1 year. We included 516 RTR (51 ± 13 years old, 57% male). Median plasma malondialdehyde (MDA) was 2.55 (IQR, 1.92–3.66) μmol/L. During a median follow-up of 5.3 (IQR, 4.6–6.0) years, 56 (11%) RTR developed NODAT. In Cox proportional-hazards regression analyses, MDA was inversely associated with NODAT, independent of immunosuppressive therapy, transplant-specific covariates, lifestyle, inflammation, and metabolism parameters (HR, 0.55; 95% CI, 0.36–0.83 per 1-SD increase; *p* < 0.01). Dietary antioxidant intake (e.g., vitamin E, α-lipoic acid, and linoleic acid) modified the association between MDA and NODAT, with particularly strong inverse associations within the subgroup of RTR with relatively higher dietary antioxidant intake. In conclusion, plasma MDA concentration is inversely and independently associated with long-term risk of NODAT in RTR. Our findings support a potentially underrecognized role of oxidative stress in post-transplantation glucose homeostasis.

**Keywords:** malondialdehyde; oxidative stress; new-onset diabetes; renal transplantation

#### **1. Introduction**

New-onset diabetes after transplantation (NODAT) is a major metabolic complication of solid organ transplantation, with a reported incidence of up to 50% [1]. The consequences of NODAT are detrimental for renal transplant recipients (RTR), as it is associated with reduced recipient survival, an increased rate of cardiovascular events, and impaired graft survival in the long term [2,3]. In the era of high-dose steroid regimens, the twelve-month cumulative incidence of NODAT was significantly higher, and the main risk factor identified for the occurrence of NODAT was immunosuppressive therapy [3]. However, with newer cyclosporine-based and tacrolimus-based regimens [1,4], it is well documented that the largest number of incident NODAT cases occurs after the first year of transplantation [4,5], and other factors potentially involved in the long-term pathogenesis of the disease remain to be elucidated. To improve the outcomes of RTR, it is of great interest to know which factors contribute to this long-term NODAT development and maintenance [2].

Oxidative stress (OS) is a factor that has been linked in different studies with both the physiological response to insulin and the pathophysiological mechanisms of, e.g., diabetes mellitus; it is also known to be enhanced in RTR compared to the general population [6]. Higher levels of OS biomarkers, e.g., malondialdehyde (MDA) [7], have been found in patients with established diabetes mellitus compared to healthy controls [8], and in patients with diabetes-associated complications compared to patients with uncomplicated diabetes [9]. However, a developing body of evidence has linked oxidative species with insulin signaling [10–13], and it has been postulated that diabetes mellitus is, to a considerable extent, caused by a failure of the organism to create enough oxidative redox potential [14]. There are, nevertheless, only limited data relating OS to insulin resistance in prediabetic states [15]. Furthermore, to the best of our knowledge, no longitudinal studies have examined the association between OS biomarkers and the long-term incidence of diabetes, which makes it difficult to foresee whether OS biomarkers are prospectively associated with positive or negative glucose metabolism outcomes.

In the post-transplantation setting, less evidence is available regarding the role of OS in glucose homeostasis. Indeed, the long-term prospective association between systemic OS and the development of NODAT has not been explored. The primary objective of the present study was to test the hypothesis that post-transplantation OS is associated with the development of NODAT. Furthermore, considering evidence of an effect of dietary antioxidant intake on the development of type 2 diabetes and NODAT [16,17], we aimed to assess whether the potential association of MDA with NODAT is modified by regular dietary intake of antioxidant fatty acids. Finally, we investigated whether OS is associated with the secondary end-points of long-term all-cause mortality, cardiovascular mortality, and graft failure.

#### **2. Materials and Methods**

#### *2.1. Study Design and Patient Population*

In this prospective cohort study, all adult RTR with a functioning graft for at least one year who visited the outpatient clinic at the University Medical Center Groningen (The Netherlands) between November 2008 and May 2011 were considered eligible to participate. Baseline data were obtained at least one year after transplantation, at a median of five years. We excluded RTR with diabetes mellitus at baseline or before transplantation (defined as fasting plasma glucose ≥126 mg/dL (7.0 mmol/L) and/or use of glucose-lowering drugs) (*n* = 173), patients who underwent combined pancreas-kidney transplantation (*n* = 5), and patients whose baseline plasma MDA concentration was missing (*n* = 12), resulting in 516 RTR eligible for statistical analyses. Patients were followed up until 1 April 2014. Collection of these data was ensured by the continuous surveillance system of the outpatient clinic of our university hospital and close collaboration with affiliated hospitals. Follow-up was performed according to the guidelines of the American Society of Transplantation [18].

The primary end-point of the current study was the long-term development of NODAT. Secondary end-points were all-cause mortality, cardiovascular mortality, and graft failure. No participants were lost to follow-up. The current study was approved by the institutional review board (METc 2008/186) and adhered to the Declarations of Helsinki and Istanbul.

#### *2.2. Data Collection*

Baseline data were collected during a visit to the outpatient clinic, following a detailed protocol described elsewhere [19]. Anthropometric measurements were taken while participants wore indoor clothing without shoes. Systolic blood pressure (SBP) and diastolic blood pressure (DBP) were measured using a semiautomatic device (Dinamap1846; Critikon, Tampa, FL, USA) every minute for 15 min, following a strict protocol as described before [20].

Three questionnaires were administered to the patients: first, the Short QUestionnaire to ASsess Health-enhancing physical activity (SQUASH) score, for information about daily physical activity [21]; second, a questionnaire regarding smoking behavior, to classify patients as current, previous, or never smokers; third, a semiquantitative self-administered food frequency questionnaire (FFQ) of 177 items, to collect information on dietary intake during the past month. The FFQ was developed at Wageningen University, was previously validated for our population, and has been updated several times [22]. The number of servings was recorded in natural units (e.g., a slice of bread) or household measures (e.g., a teaspoon). Subsequently, all dietary data were converted into total energy and nutrient intake per day using the Dutch Food Composition Table 2006 [23]. Specific nutrient intakes were adjusted for total energy intake according to the residual method [24].

Of note, except for discouraging excess sodium intake and encouraging weight loss in overweight individuals, no specific dietary counseling was included, nor was dietary recommendation regarding antioxidant fatty acids intake or supplementation advised to the study subjects. Other relevant recipient and transplant information was extracted from the Groningen Renal Transplant Database, as described in detail before [25].

#### *2.3. Measurements and Definitions*

Fasting blood samples and complete 24-hour urine collections were obtained at baseline. Serum creatinine was determined using the Jaffe reaction (MEGA AU510; Merck Diagnostica, Darmstadt, Germany); plasma glucose by the glucose oxidase method (YSI 2300 Stat Plus; Yellow Springs Instruments, Yellow Springs, OH, USA); total cholesterol by the cholesterol oxidase-phenol aminophenazone method (MEGA AU510); HDL cholesterol by the cholesterol oxidase-phenol aminophenazone method on a Technicon RA-1000 (Bayer Diagnostics, Mijdrecht, The Netherlands); and plasma triglycerides by the glycerol-3-phosphate oxidase method (YSI 2300 Stat Plus). LDL cholesterol was calculated using the Friedewald equation; estimated glomerular filtration rate (eGFR) by the serum creatinine-based Chronic Kidney Disease EPIdemiology collaboration (CKD-EPI) equation [26]; and the cumulative dose of prednisolone as the sum of the maintenance doses of prednisolone from transplantation until baseline. Plasma MDA concentration was chosen as the biomarker of OS because it has been used before in studies of glucose metabolism pathologies [8,9]; it was measured by high-performance liquid chromatography with a photodiode array detector, as described by Faizan et al., to improve on the sensitivity offered by spectrophotometric methods [27].
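Of the derived quantities listed above, the Friedewald estimate is simple enough to state explicitly. A minimal sketch, assuming all lipid values are in mmol/L (the triglyceride divisor is 5 when working in mg/dL):

```python
def friedewald_ldl(total_cholesterol, hdl, triglycerides):
    """Friedewald estimate of LDL cholesterol (all values in mmol/L):
    LDL = total cholesterol - HDL - triglycerides / 2.2.

    The equation is generally considered invalid at high triglyceride
    concentrations (above roughly 4.5 mmol/L).
    """
    if triglycerides > 4.5:
        raise ValueError("Friedewald equation not valid at this triglyceride level")
    return total_cholesterol - hdl - triglycerides / 2.2
```

For example, total cholesterol 5.0, HDL 1.2, and triglycerides 2.2 mmol/L give an estimated LDL of 2.8 mmol/L.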

NODAT was defined according to the International Expert Panel recommendations based on the 2003 American Diabetes Association criteria [28] and the HbA1c criterion proposed by the International Expert Panel of the international consensus meeting on post-transplantation diabetes mellitus [29]. The diagnosis was made upon fulfillment of one or more of the following: symptoms of diabetes (classic symptoms, including polyuria, polydipsia, and unexplained weight loss) plus a random plasma glucose concentration ≥200 mg/dL (11.1 mmol/L); fasting plasma glucose ≥126 mg/dL (7.0 mmol/L); HbA1c ≥ 6.5%; or use of glucose-lowering medication. If fasting plasma glucose was elevated, a confirmatory laboratory test was performed, after which the diagnosis of NODAT was made.
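The diagnostic rule above is a simple disjunction of criteria, which can be made explicit. This is a hypothetical helper for illustration only; in the study, an elevated fasting glucose additionally required laboratory confirmation before the diagnosis was made:

```python
def meets_nodat_criteria(fasting_glucose=None, random_glucose=None,
                         symptomatic=False, hba1c=None,
                         on_glucose_lowering_drugs=False):
    """Check the NODAT criteria listed in the text (glucose values in
    mmol/L, HbA1c in %). Any single criterion suffices."""
    return any([
        # Classic symptoms plus a random plasma glucose >= 11.1 mmol/L.
        symptomatic and random_glucose is not None and random_glucose >= 11.1,
        # Fasting plasma glucose >= 7.0 mmol/L (126 mg/dL).
        fasting_glucose is not None and fasting_glucose >= 7.0,
        # HbA1c >= 6.5%.
        hba1c is not None and hba1c >= 6.5,
        # Use of glucose-lowering medication.
        on_glucose_lowering_drugs,
    ])
```

Note that a high random glucose alone does not qualify; the rule requires accompanying symptoms, mirroring the definition in the text.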

Cardiovascular death was defined as the principal cause of death being cardiovascular in nature (International Classification of Diseases (ICD)-9 codes 410–447). The cause of death was obtained by linking the number of the death certificate to the primary cause of death as coded by a physician from the Central Bureau of Statistics according to the ICD-9 [30]. Graft failure was defined as restart of dialysis or retransplantation.

#### *2.4. Statistical Analyses*

Data analyses, computations, and graphs were performed with SPSS 22.0 software (IBM Corporation, Chicago, IL, USA), R version 3.2.3 software (The R-Foundation for Statistical Computing, Vienna, Austria), and GraphPad Prism version 7 software (GraphPad Software, San Diego, CA, USA).

For descriptive statistics, data are presented as mean ± standard deviation (SD) for normally distributed variables and as median (interquartile range (IQR)) for variables with a non-normal distribution. Categorical data are expressed as number (percentage). Crude and age-, sex-, and eGFR-adjusted linear regression analyses were performed to examine the association of baseline characteristics with circulating MDA. Residuals were checked for normality, and variables were natural log-transformed when appropriate. To identify, in an integrated manner, which baseline variables were independently associated with, and were determinants of, circulating MDA, we performed stepwise backwards multivariable linear regression analyses; the *p*-values for inclusion and exclusion in these analyses were set at 0.2 and 0.05, respectively.

NODAT development was visualized by Kaplan–Meier curves according to tertiles of plasma MDA concentration, with statistical significance among curves tested by the log-rank (Mantel–Cox) test. The prospective association of plasma MDA concentration with the different outcomes was assessed through Cox regression analyses. We first performed crude analyses, followed by additive adjustments for demographic and anthropometric factors (age, sex, and BMI) in model 1; metabolism-related variables (glucose, HbA1c, and HDL cholesterol) in model 2; lifestyle characteristics (current smoking, alcohol intake, and SQUASH score) in model 3; transplantation-related data (transplant vintage and eGFR) in model 4; immunosuppressive therapy (prednisolone dose and use of calcineurin inhibitors) in model 5; and inflammation (high-sensitivity C-reactive protein (hs-CRP)) in model 6. NODAT and graft failure were censored at the date of last follow-up or death. Models were checked for, and fulfilled, the assumptions of Cox regression analysis.
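The analyses above were run in SPSS, R, and Prism; as a self-contained illustration of the underlying Kaplan–Meier product-limit estimator, a minimal sketch in plain Python (my own, not the study's code) is:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of the survival function.

    times  : follow-up time for each subject
    events : 1 if the event (e.g., NODAT) occurred at that time,
             0 if the subject was censored
    Returns (time, survival probability) pairs at each event time.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)          # subjects still under observation
    surv = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        d = n = 0                 # events and total departures at time t
        while i < len(order) and times[order[i]] == t:
            d += events[order[i]]
            n += 1
            i += 1
        if d > 0:
            surv *= 1.0 - d / at_risk   # multiply by conditional survival
            curve.append((t, surv))
        at_risk -= n
    return curve
```

For example, `kaplan_meier([1, 2, 3, 4], [1, 0, 1, 0])` yields `[(1, 0.75), (3, 0.375)]`: the survival estimate drops only at event times, while censored subjects simply leave the risk set.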

Furthermore, we performed prespecified analyses in which we tested for potential effect-modification by dietary intake of antioxidant fatty acids, using multiplicative interaction terms over the fully adjusted model. In case of significant effect-modification, we proceeded with prospective analyses stratified by the concerned variable. Cut-off points for originally continuous variables used in the stratified analyses were chosen to yield as similar a number of events as possible in each subgroup, and thus similar statistical power for the assessment of the primary association under study (MDA concentration and NODAT) in each subgroup after stratification of the overall population. Since the number of events was reduced in each subgroup, these analyses were adjusted analogously to model 3 of the overall prospective analyses to avoid overfitting. Finally, since dietary intake of antioxidant fatty acids could also be a potential confounder, we investigated whether adjusting for this variable changed the association between MDA and NODAT.
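The event-balancing cut-off choice described above can be sketched as a small search over candidate cut-offs. This is an illustrative reconstruction under my own assumptions (the paper does not give an algorithm), picking the observed value that minimizes the difference in event counts between the two subgroups:

```python
def balanced_cutoff(values, events):
    """Choose a cut-off for a continuous variable so that the number of
    events (e.g., NODAT cases) is as similar as possible between the
    'low' (<= cut-off) and 'high' (> cut-off) subgroups.

    values : the continuous variable (e.g., vitamin E intake)
    events : 1 if the subject developed the event, else 0
    Candidate cut-offs are the observed values themselves.
    """
    best_cut, best_gap = None, None
    for cut in sorted(set(values)):
        low = sum(e for v, e in zip(values, events) if v <= cut)
        high = sum(e for v, e in zip(values, events) if v > cut)
        if high == 0:            # all events on one side; stop searching
            break
        gap = abs(low - high)
        if best_gap is None or gap < best_gap:
            best_cut, best_gap = cut, gap
    return best_cut
```

For example, with `values=[1, 2, 3, 4]` and one event per subject, the chosen cut-off is 2, splitting the events evenly (two per subgroup).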

For all statistical analyses, a statistical significance level of *p* ≤ 0.05 (two-tailed) was used, except for the effect-modification analyses where the significance level was *p* ≤ 0.1 (two-tailed) [31].

#### **3. Results**

#### *3.1. Baseline Characteristics*

In total, 516 RTR (57% men) were included in the analyses, with a mean ± SD age of 51 ± 13 years. Patients were included at a median of 5.2 (IQR 2.0–12.2) years after transplantation. The median plasma MDA concentration was 2.55 (IQR 1.92–3.66) μmol/L. Baseline characteristics of the overall RTR population are shown in Table 1. In crude linear regression analyses, glucose concentration had a significant direct association with plasma MDA concentration (*β* = 0.10, *p* = 0.02), which was not modified after adjustment for age, sex, and eGFR. Other variables significantly associated with plasma MDA concentration after adjustment were eGFR (*β* = 0.10, *p* = 0.03) and leucocyte concentration (*β* = 0.10, *p* = 0.03). A final reduced model of baseline variables obtained through backwards linear regression analyses (*α* = 0.05) included glucose concentration (*β* = 0.11, *p* = 0.02), eGFR (*β* = 0.08, *p* = 0.09), leucocyte concentration (*β* = 0.10, *p* = 0.03), HDL concentration (*β* = 0.10, *p* = 0.04), and alcohol intake (*β* = −0.09, *p* = 0.07) (Table 1).

**Table 1.** Baseline characteristics of the study population and its association with circulating malondialdehyde (MDA) (*n* = 516).


\* *p* value < 0.20; \*\* *p* value < 0.05. ¥ Crude linear regression analysis. † Linear regression analysis adjusted for age, sex, and eGFR. § Stepwise backwards linear regression analysis; for inclusion and exclusion in this analysis, *p*-values were set at 0.2 and 0.05, respectively. ~ Excluded from the final model. Data available in: <sup>a</sup> 499, <sup>b</sup> 514, <sup>c</sup> 495, <sup>d</sup> 455, <sup>e</sup> 515, <sup>f</sup> 398, <sup>g</sup> 484, <sup>h</sup> 468, <sup>i</sup> 490 patients. MDA, malondialdehyde; Std. β, standardized β coefficient; eGFR, estimated glomerular filtration rate; CV, cardiovascular; HbA1c, glycosylated hemoglobin; hs-CRP, high-sensitivity C-reactive protein; kcal, kilocalories; LA, linoleic acid; AA, arachidonic acid; ALA, α-linolenic acid; EPA, eicosapentaenoic acid; DHA, docosahexaenoic acid. ∧ Adjusted for total caloric intake according to the residual method.

#### *3.2. Prospective Analyses on NODAT*

During a median follow-up of 5.3 (IQR 4.6–6.0) years, NODAT developed in 56 (11%) RTR. Kaplan–Meier curves for NODAT development by tertiles of circulating MDA are shown in Figure 1. NODAT distribution differed significantly among tertiles according to the log-rank test (*p* = 0.02). Cox regression analyses showed that plasma MDA concentration was inversely associated with the risk of NODAT (HR, 0.61; 95% CI, 0.41–0.92 per 1-SD; *p* = 0.02). This association was independent of adjustment for demographic and anthropometric factors, metabolism-related variables, lifestyle factors, transplantation-related data, immunosuppressive medication, and inflammation (HR, 0.55; 95% CI, 0.36–0.83 per 1-SD; *p* < 0.01) (Table 2).

**Figure 1.** Kaplan–Meier curves for NODAT according to tertiles of plasma MDA concentration in RTR. Tertile 1: <2.15 μmol/L; Tertile 2: 2.15–3.09 μmol/L; Tertile 3: >3.09 μmol/L. The *p* value was calculated by the log-rank (Mantel–Cox) test.

**Table 2.** Plasma MDA concentration and new-onset diabetes after transplantation (NODAT) in renal transplant recipients (RTR, *n* = 516).


In total, 56 (11%) RTR developed NODAT. Model 1: crude model plus adjustment for demographic and anthropometric characteristics. Model 2: model 1 plus adjustment for metabolism-related variables. Model 3: model 2 plus adjustment for lifestyle characteristics. Model 4: model 3 plus adjustment for transplantation-related data. Model 5: model 4 plus adjustment for immunosuppressive therapy. Model 6: model 5 plus adjustment for inflammation.

#### *3.3. Secondary Analysis on MDA and NODAT*

In effect-modification analyses, we found that the association between MDA and the risk of NODAT was significantly modified by dietary intake of vitamin E, linoleic acid (LA), and α-linolenic acid (ALA) (*p* for interaction = 0.06, 0.02, and 0.02, respectively). We therefore performed stratified prospective analyses in subgroups of RTR according to vitamin E intake (≤ or >13.64 mg/day), LA intake (≤ or >14 g/day), and ALA intake (≤ or >1.24 g/day), with cut-off points chosen to yield as similar a number of events as possible in each subgroup. In each subgroup, we assessed the association of MDA with development of NODAT and found that MDA was significantly inversely associated with the risk of NODAT in RTR with vitamin E intake >13.64 mg/day (HR, 0.52; 95% CI, 0.29–0.94 per 1-SD; *p* = 0.03), LA intake >14 g/day (HR, 0.49; 95% CI, 0.28–0.86 per 1-SD; *p* = 0.01), or ALA intake >1.24 g/day (HR, 0.42; 95% CI, 0.23–0.76 per 1-SD; *p* < 0.01), but not in the subgroups with relatively low intake (Figure 2).


**Figure 2.** Stratified analysis of the association of plasma MDA concentration with NODAT. \* For the association between MDA and NODAT. HRs are reported per 1-SD increase in plasma MDA concentration. Nutrient intake was adjusted for total energy intake according to the residual method. HRs adjusted for age, sex, BMI, plasma glucose, HbA1c, smoking status, alcohol intake, and SQUASH score are shown.

Further, we performed Cox regression analyses with additional adjustment for these variables to explore whether they might also be potential confounders. The association between MDA and NODAT was not materially modified by additional adjustment for vitamin E intake (HR, 0.52; 95% CI, 0.34–0.81 per 1-SD; *p* < 0.01), ALA intake (HR, 0.55; 95% CI, 0.36–0.83 per 1-SD; *p* < 0.01), or LA intake (HR, 0.56; 95% CI, 0.37–0.84 per 1-SD; *p* < 0.01).

#### *3.4. Prospective Analysis on All-Cause Mortality, Cardiovascular Mortality, and Graft Failure*

During the same median follow-up of 5.3 (IQR 4.6–5.9) years, 86 (17%) RTR died, 29 (6%) of them from cardiovascular causes, and 57 (11%) developed graft failure. In crude Cox regression analyses, plasma MDA concentration was not significantly associated with the risk of all-cause mortality (HR, 0.96; 95% CI, 0.73–1.25 per 1-SD; *p* = 0.74), cardiovascular mortality (HR, 0.81; 95% CI, 0.58–1.13 per 1-SD; *p* = 0.21), or death-censored graft failure (HR, 0.89; 95% CI, 0.65–1.23 per 1-SD; *p* = 0.49). Further adjustments did not materially change these findings (Tables S1–S3).

#### **4. Discussion**

In a large cohort of stable RTR, we showed, first, that plasma MDA is directly associated with plasma glucose concentration and, second, that plasma MDA is inversely associated with the long-term risk of NODAT. The latter association remained present independent of potential confounders, including BMI, baseline glucose concentration, and immunosuppressive therapy. Daily dietary intakes of vitamin E, LA, and ALA were significant effect-modifiers of this association. No association was found between MDA and all-cause mortality, cardiovascular mortality, or graft failure. These findings agree with developing evidence that oxidative status plays an important role in glucose homeostasis [10–13].

Experimental work has shown that reactive oxygen species (ROS) are part of intracellular insulin signal transmission [10,11]. ROS are upregulated in response to insulin and help to further upregulate glucose-metabolism-associated pathways, e.g., those related to insulin-induced aerobic glycolysis [13]. Furthermore, ROS seem to act on enzymes essential for catalytic activity, increasing glucose uptake by skeletal muscle cells and glucose transport in adipocytes [11]. Human studies have also shown that (i) patients with severe deficiency of plasma antioxidants maintain supranormal insulin sensitivity compared to healthy subjects, even if they are obese [12], and (ii) antioxidant supplementation abrogates the increase in insulin sensitivity usually generated in patients on exercise interventions [32]. The current study, performed in a population at high risk of new-onset diabetes, provides for the first time prospective evidence in line with the aforementioned basic studies, and may further support the postulate of James Watson, according to which diabetes mellitus may be caused by an incapacity of the cell to produce an oxidative redox environment [14].

Controversy may arise from data showing higher plasma MDA concentrations in patients with diabetes mellitus than in healthy controls [8], and in patients who develop diabetes-related complications than in those without them [9]. However, it is known that ROS, as intracellular messengers, can generate opposite cellular effects. ROS can activate specific pathways whose products interfere with insulin signaling; e.g., activation of the redox-sensitive nuclear factor-kappa B (NF-κB) leads to the expression of cytokines such as tumor necrosis factor α (TNF-α) and interleukins (ILs) such as IL-1β and IL-6, all of which have a quenching effect on insulin signaling [11]. On the other hand, as mentioned before, ROS can activate signaling pathways important for fulfilling insulin functions. They are also themselves known to act as stimuli that increase cellular antioxidant capacity [33] through activation of specific response elements known as nuclear factor erythroid 2-related factor 2-antioxidant response elements (Nrf2-ARE). This induction of endogenous antioxidant mechanisms by ROS has been specifically named mitochondrial hormesis and has gained interest in recent years [34], as it is proposed that, contrary to traditional thinking, ROS are not merely deleterious but are necessary to reach oxidative balance inside the cell. Intensity, location, duration, and concentration of the oxidant stimulus seem to be crucial in defining whether ROS have a physiological or a pathological outcome; however, the specific thresholds that spawn these differential responses have not yet been determined [10,11]. This might also explain why studies of antioxidant supplementation have not shown benefit in RTR [35], and why we did not find an association between OS and all-cause mortality, cardiovascular mortality, or graft failure.

Our data also provided evidence that LA, ALA, and vitamin E intake modify the association between MDA and NODAT. Conceivable interpretations of these findings are as follows. MDA is formed after the peroxidation of double bonds of unsaturated fatty acids such as LA [7]. Foods containing important amounts of unsaturated fatty acids usually also contain substantial amounts of vitamin E, which protects them from rancidification [36]. It is possible that, in this context, a high MDA concentration is a marker of a diet rich in antioxidants and unsaturated fatty acids, which has been suggested to reduce diabetes incidence [16,37]. However, when we adjusted for intake of these nutrients to evaluate them as potential confounders, the association between MDA and NODAT remained materially unaltered. An alternative explanation might be the aforementioned Nrf2-ARE pathway. This pathway has been of particular interest in the study of antioxidant molecules as therapeutic interventions, because previous authors have proposed that such interventions could show beneficial results if they were combined with unsaturated fatty acids as precursors of oxidative stress [38]. The rationale is that, through Nrf2-ARE pathway activation, provision of pro-oxidant and antioxidant agents would trigger the antioxidant cellular defenses [33], thus preconditioning cells to new oxidative challenges [39] and ultimately allowing cells to reach hormesis. This fits with our findings that high levels of MDA, although significantly inversely associated with NODAT in the overall population, showed a stronger association in patients with relatively higher intake of antioxidants in their regular diet according to our subgroup analyses. Our findings might also support previous suggestions of a potential protective effect of antioxidant-rich diets against NODAT [17].

The present study has several strengths. To the extent of our knowledge, it comprises the largest cohort of patients at risk of new-onset diabetes after transplantation in which the relationship between oxidative stress and NODAT has been evaluated. Moreover, our extensively phenotyped cohort allowed us to control for several potential confounders, among which anthropometric measurements, smoking status, baseline glucose metabolism markers, and immunosuppressive therapy were accounted for. Furthermore, NODAT cases were diagnosed according to International Expert Panel recommendations based on American Diabetes Association criteria [28], which agrees with usual clinical practice in transplant centers. Another strength of the study is that we included only stable RTR who were 1 year post-transplantation, resulting in exclusion of transient post-transplantation hyperglycemia from the NODAT diagnosis. Hyperglycemia is extremely common in the early post-transplant period and can occur as a result of rejection therapy, infections, and other critical conditions. Therefore, the formal diagnosis of NODAT in RTR should only be made under likely maintenance immunosuppression, stable kidney function, and absence of acute infections [29]. The present study also has several limitations. It was carried out in a center with an over-representation of Caucasian patients, which calls for prudence in extrapolating our results to populations of other ethnicities. Another limitation is that we measured MDA concentration only in baseline samples. Most epidemiological studies use a single baseline measurement to predict outcomes, which adversely affects the predictive properties of variables associated with those outcomes; taking the intraindividual variability of predictive biomarkers into account strengthens predictive properties that, despite sometimes considerable day-to-day variation, were already present for single measurements [40,41]. The higher the intraindividual day-to-day variation, the greater the expected benefit of repeated measurements for the prediction of outcomes [40,41]. Next, although MDA has been the most commonly used OS biomarker in studies of glucose metabolism [9,10], further studies may want to account for other OS biomarkers to further validate our findings. Finally, the observational nature of this study makes it difficult to discern whether high levels of MDA are protective against NODAT or merely a marker of lower risk for NODAT; and, as with any observational study, residual confounding may have existed despite the substantial number of potentially confounding factors for which we adjusted, including well-identified risk factors for NODAT.

In conclusion, plasma MDA concentration is inversely and independently associated with the long-term risk of NODAT in stable RTR. This study provides, for the first time, relevant prospective data on the role of oxidative stress in glucose metabolism in a population at high risk of diabetes. These findings may further support already published basic studies and promote research to widen our knowledge of the role of oxidative stress in the pathophysiological mechanisms leading to diabetes and NODAT, which might be of use in exploring novel therapeutic approaches to prevent and treat NODAT. They also indicate that studies exploring antioxidant supplementation in RTR should explore and report long-term metabolic outcomes.

**Supplementary Materials:** The following are available online at http://www.mdpi.com/2077-0383/8/4/453/s1, Table S1: Plasma MDA concentration and all-cause mortality in RTR, Table S2: Plasma MDA concentration and cardiovascular mortality in RTR, Table S3: Plasma MDA concentration and death-censored graft failure in RTR.

**Author Contributions:** Data curation, M.Y.-C., C.G.S., A.W.G.-N., and S.J.L.B.; Formal analysis, M.Y.-C., C.G.S. and A.W.G.-N.; Funding acquisition, M.Y.-C., C.G.S., and S.J.L.B.; Investigation, R.O.B.G., S.P.B., G.R., T.E., R.R., J.M.G., G.J.N. and S.J.L.B.; Methodology, G.R., T.E., J.M.G. and G.J.N.; Project administration, R.O.B.G., S.P.B., G.J.N. and S.J.L.B.; Resources, R.O.B.G. and S.P.B.; Supervision, R.R., G.J.N., and S.J.L.B.; Writing – original draft, M.Y.-C. and C.G.S.; Writing – review & editing, M.Y.-C., C.G.S., R.R. and S.J.L.B.

**Funding:** This study was based on the TransplantLines Food and Nutrition Biobank and Cohort Study (TxL-FN), which was funded by the Top Institute Food and Nutrition of the Netherlands (grant A-1003). The study is registered at clinicaltrials.gov under number NCT02811835.

**Acknowledgments:** Plasma MDA concentration was measured by Faizan et al. at the Institute of Human Nutrition and Food Science, Christian Albrechts University of Kiel, Germany.

**Conflicts of Interest:** The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

#### **References**

1. Montori, V.M.; Basu, A.; Erwin, P.J.; Velosa, J.A.; Gabriel, S.E.; Kudva, Y.C. Posttransplantation diabetes: A systematic review of the literature. *Diabetes Care* **2002**, *25*, 583–592. [CrossRef]


*J. Clin. Med.* **2019**, *8*, 453

41. Danesh, J.; Wheeler, J.G.; Hirschfield, G.M.; Eda, S.; Eiriksdottir, G.; Rumley, A.; Lowe, G.D.; Pepys, M.B.; Gudnason, V. C-Reactive Protein and Other Circulating Markers of Inflammation in the Prediction of Coronary Heart Disease. *N. Engl. J. Med.* **2004**, *350*, 1387–1397. [CrossRef]

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

MDPI St. Alban-Anlage 66 4052 Basel Switzerland Tel. +41 61 683 77 34 Fax +41 61 302 89 18 www.mdpi.com

*Journal of Clinical Medicine* Editorial Office E-mail: jcm@mdpi.com www.mdpi.com/journal/jcm
