**Analgesic Efficacy of a Combination of Fentanyl and a Japanese Herbal Medicine "*Yokukansan*" in Rats with Acute Inflammatory Pain**

**Yuko Akanuma 1,2, Mami Kato 1, Yasunori Takayama 1, Hideshi Ikemoto 1, Naoki Adachi 1, Yusuke Ohashi 1,3, Wakako Yogi 1,4, Takayuki Okumo 1, Mana Tsukada 1 and Masataka Sunagawa 1,\***


Received: 12 November 2020; Accepted: 15 December 2020; Published: 17 December 2020

**Abstract: Background:** Fentanyl can induce acute opioid tolerance and postoperative hyperalgesia when administered at a single high dose; thus, this study examined the analgesic efficacy of a combination of fentanyl and *Yokukansan* (YKS). **Methods:** Rats were divided into control, formalin-injected (FOR), YKS-treated+FOR (YKS), fentanyl-treated+FOR (FEN), and YKS+FEN+FOR (YKS+FEN) groups. Acute pain was induced via subcutaneous injection of formalin into the paw, and the time engaged in pain-related behavior was measured. **Results:** In the early (0–10 min) and intermediate (10–20 min) phases, pain-related behavior in the YKS+FEN group was significantly inhibited compared with the FOR group. In the late phase (20–60 min), pain-related behavior in the FEN group was the longest and was significantly increased compared with the YKS group. We explored the influence of YKS on the extracellular signal-regulated kinase (ERK) pathway in the spinal cord; YKS suppressed phosphorylated ERK expression, which may be related to its analgesic effect in the late phase. **Conclusions:** These findings suggest that YKS could reduce the use of fentanyl and that the combined use of YKS and fentanyl may be clinically useful.

**Keywords:** *Yokukansan*; fentanyl; transient receptor potential ankyrin 1 (TRPA1) channel; phosphorylated extracellular signal-regulated kinase (pERK); whole-cell patch-clamp recording; herbal medicine

#### **1. Introduction**

Fentanyl and remifentanil are potent ultra-short-acting μ-opioid receptor agonists widely used for pain management in the perioperative period [1]. However, their use is limited because they can induce acute opioid tolerance and a post-treatment state of heightened pain sensitivity, known as opioid-induced hyperalgesia (OIH), when administered at a single high dose [2–5]. Development of OIH causes several problems, including delayed recovery after surgery, higher consumption of analgesics, and side effects associated with their administration. Remifentanil-induced hyperalgesia (RIH) has been extensively investigated [6–11]. There is some evidence that glutamate release and N-methyl-D-aspartate (NMDA) receptor activation may be important in the development of RIH [9–11]. In addition, remifentanil infusion downregulates μ-opioid receptors [12]. Moreover, glial activation is involved in RIH. Microglia and astrocytes are activated by chronic opioid use, and their inhibition seems to reduce RIH [6,13]. Some clinical [14–17] and basic studies [18–22] reported that high-dose fentanyl use may also cause hyperalgesia. Xuerong et al. [14] conducted a randomized controlled trial on 90 women undergoing total abdominal hysterectomy. Patients in the control group who were administered bupivacaine requested significantly less morphine postoperatively compared with patients treated with fentanyl. However, the combined use of ketamine, an NMDA receptor antagonist, and fentanyl significantly decreased the need for postoperative morphine administration. Richebé et al. [18] reported a similar phenomenon in fentanyl-treated rats. These results suggest that NMDA receptor activation is involved in the development of fentanyl-induced hyperalgesia (FIH).

*Yokukansan* (YKS) is a Japanese traditional herbal (Kampo) medicine that comprises seven herbs (Table 1). YKS is officially approved as an ethical pharmaceutical by the Japanese Ministry of Health, Labour and Welfare. The three-dimensional high-performance liquid chromatography (3D-HPLC) profile chart of YKS was provided by Tsumura & Co. (Figure 1). YKS is administered to patients with symptoms such as emotional irritability, neurosis, and insomnia, and to infants who suffer from night crying and convulsions [23,24]. Recently, it has been reported that YKS is effective against pain disorders, including headache, post-herpetic neuralgia, fibromyalgia, phantom-limb pain, and trigeminal neuralgia [25–28]. Studies have demonstrated antinociceptive effects of YKS in animal models with chronic neuropathic and inflammatory pain [29–32]. We previously reported that pre-administration of YKS attenuated the development of antinociceptive morphine tolerance and that suppression of glial cell activation may be one mechanism underlying this phenomenon [33,34]. YKS is also known to have an ameliorative effect on glutamate clearance in astrocytes and an antagonistic action at the NMDA receptor [35–37]. As mentioned above, NMDA receptor activation may be involved in FIH development [14,18]. Thus, we hypothesized that YKS might inhibit FIH development.



**Table 1.** Composition of *Yokukansan* (seven component herbs). Weights indicate relative amounts mixed.

In the present study, we first evaluated the effect of combined treatment with YKS and fentanyl using the well-established inflammatory pain model induced by formalin injection [38]. The injection of formalin into the plantar surface of rodent paws induces acute nociceptive responses, such as lifting, licking, and flinching of the paw, which are biphasic. In the initial period of about 10 min (phase I), behavioral responses occur due to activated primary afferent nerve terminals and are mediated by activation of the transient receptor potential ankyrin 1 (TRPA1) channel [39,40]. Thus, we performed whole-cell patch-clamp recording in HEK293T cells expressing human TRPA1 to assess the influence of YKS or fentanyl on the TRPA1 channel. Phase II (10–60 min) reflects central sensitization of neurons in the dorsal horn and peripheral sensitization of nociceptors by the formalin-induced inflammatory response [41]. Accordingly, the expression of phosphorylated extracellular signal-regulated kinase (pERK) in the spinal dorsal horn was analyzed by immunofluorescent staining. ERK is an important molecule in pain signaling and a potential novel target for pain treatment [42].

**Figure 1.** Three-dimensional high-performance liquid chromatography (3D-HPLC) profile chart of the major chemical compounds in *Yokukansan* (YKS).

#### **2. Materials and Methods**

#### *2.1. Animals*

Experiments were performed using 7–8-week-old male Wistar rats (Nippon Bio-Supp. Center, Tokyo, Japan). Animals were housed two to three per cage (W 24 × L 40 × H 20 cm) under a 12 h light/dark cycle in our animal facility at 25 °C ± 2 °C and 55% ± 5% humidity. Food (CE-2, CLEA Japan, Tokyo, Japan) and water were provided ad libitum. All experimental procedures were performed in accordance with the guidelines of, and approved by, the Committee of Animal Care and Welfare of Showa University (certificate number: 07064, date of approval: 1 April 2017). Efforts were made to minimize the number of animals used and their suffering.

#### *2.2. Administration of Drugs*

Dry powdered extracts of YKS (Lot No. 2110054010) used in the present study were supplied by Tsumura & Co. (Tokyo, Japan). The seven herbs comprising YKS (Table 1) were mixed and extracted with purified water at 95.1 °C for 1 h; the soluble extract was separated from the insoluble waste and concentrated by removing water under reduced pressure. YKS was mixed with powdered rodent chow (CE-2, CLEA Japan) at a concentration of 3% and fed to YKS-treated rats for 7 days prior to the test. This dose was chosen on the basis of effective doses of YKS in our previous study [33]. Previous studies have indicated that pre-administration of YKS may inhibit the development of antinociceptive tolerance to morphine [33,34,43]; thus, in this study, YKS administration was started 7 days before fentanyl injection. Rats that were not treated with YKS were fed powdered chow only.

Fentanyl (0.08 μg/kg; Daiichi Sankyo, Tokyo, Japan) was injected intraperitoneally 10 min before pain induction using a 27 G hypodermic needle. This dose was determined by performing a preliminary experiment according to a previous study [38]. Rats not treated with fentanyl were intraperitoneally administered saline.

#### *2.3. Assessment of Analgesia*

The analgesic effects of fentanyl and YKS were examined using the formalin test and immunofluorescence staining of pERK. Subcutaneous injections of formalin have been widely used as an animal model of acute inflammatory pain [38–40].

#### 2.3.1. Formalin Test

Rats were randomly divided into five groups as follows: control (*n* = 7), formalin-injected (FOR; *n* = 7), YKS-treated + FOR (YKS; *n* = 9), fentanyl-treated + FOR (FEN; *n* = 9), and YKS + FEN + FOR (YKS + FEN; *n* = 9). The experimental protocol is shown in Table 2. Animals were housed individually in wire observation cages and habituated for 30 min. Ten minutes prior to the formalin test, animals were injected intraperitoneally with fentanyl or saline and returned to individual housing. Acute inflammatory pain was induced via an intraplantar injection of formalin (5%, 50 μL, Polysciences, Warrington, PA, USA) into the right paw using a 30 G hypodermic needle. Rats in the control group were administered saline instead of formalin. Immediately after the injection, animals were returned to individual housing again, and the total time spent engaged in pain-related behavior was measured for the first 10 min (early phase), between 10 and 20 min (intermediate phase), and between 20 and 60 min (late phase) following the intraplantar injection of formalin or saline. Pain-related behavior was defined as paw shaking, licking, and lifting from the ground.



**Table 2.** Experimental protocol. Groups are as follows: control, formalin-injected (FOR), YKS-treated + FOR (YKS), fentanyl-treated + FOR (FEN), and YKS + FEN + FOR (YKS + FEN). YKS was mixed with powdered rodent chow at a concentration of 3% and fed to YKS-treated rats for 7 days prior to the test. YKS, *Yokukansan*; i.p., intraperitoneal injection; s.c., subcutaneous injection.

#### 2.3.2. Immunofluorescent Staining

The appearance of pERK in the dorsal horn was investigated using immunofluorescent staining. Rats were randomly divided into the same five groups (*n* = 4 in each group) and administered the same drugs as in the formalin test. Forty-five minutes after formalin injection, rats were intraperitoneally anesthetized with pentobarbital sodium (50 μg/kg; Somnopentyl, Kyoritsu Seiyaku, Tokyo, Japan) and intracardially perfused with phosphate-buffered saline at pH 7.4 until all the blood had been removed from the system. After perfusion with 4% paraformaldehyde in 0.1 M phosphate-buffered saline, fifth lumbar spinal cord (L5) samples were harvested. Tissue specimens were immersed in 20% sucrose solution for 48 h and subsequently embedded in optimum cutting temperature compound (Tissue-Tek OCT, Sakura Finetek, Torrance, CA, USA), frozen, and cut into 15 μm sections using a cryostat (CM3050S, Leica Biosystems, Nussloch, Germany). Sections were incubated overnight at 4 °C with rabbit anti-pERK antibody (1:500, #4370, Cell Signaling Technology, Danvers, MA, USA) and then for 2 h with a fluorophore-tagged secondary antibody (donkey anti-rabbit Alexa Fluor 555, 1:1000, #A31572, Thermo Fisher Scientific, Waltham, MA, USA). Nuclei were counterstained with DAPI (4′,6-diamidino-2-phenylindole, 1:1000, Thermo Fisher Scientific). Samples were imaged using a confocal laser scanning fluorescence microscope (FV1000D, Olympus, Tokyo, Japan), and cells in which pERK and DAPI co-localized in the same area of laminae I–II were counted as pERK(+) cells by a third person who was not engaged in the staining process. The mean value was calculated using five sequential sections from each rat.
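The counting criterion above (a cell counts as pERK(+) only when pERK signal co-localizes with a DAPI-stained nucleus) can be illustrated with a minimal sketch on synthetic two-channel arrays. This is purely hypothetical: the study counted cells manually and blinded, and the toy images and 0.5 threshold below are assumptions for illustration only.

```python
import numpy as np
from scipy import ndimage

# Synthetic two-channel "image": DAPI (nuclei) and pERK signal.
# Purely illustrative; a real analysis would use the confocal images.
dapi = np.zeros((64, 64))
perk = np.zeros((64, 64))
dapi[10:16, 10:16] = 1.0   # nucleus 1
dapi[40:46, 40:46] = 1.0   # nucleus 2
perk[8:18, 8:18] = 1.0     # pERK signal overlapping nucleus 1 only

def count_perk_positive(dapi_img, perk_img, thr=0.5):
    """Count nuclei whose area overlaps supra-threshold pERK signal."""
    nuclei, n = ndimage.label(dapi_img > thr)   # connected-component nuclei
    perk_mask = perk_img > thr
    return sum(bool(perk_mask[nuclei == i].any()) for i in range(1, n + 1))

print(count_perk_positive(dapi, perk))  # prints 1: one of two nuclei is pERK(+)
```

The co-localization rule is encoded by requiring each labeled nucleus region to contain at least one supra-threshold pERK pixel.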

#### *2.4. Cell Culture*

HEK293T cells were cultured in Dulbecco's modified Eagle's medium (high glucose) with L-glutamine and phenol red (FUJIFILM Wako Pure Chemical, Osaka, Japan) containing 10% fetal bovine serum (#G121-6, JR Scientific, Woodland, CA, USA), penicillin/streptomycin (FUJIFILM Wako Pure Chemical), and GlutaMax (Gibco, Thermo Fisher Scientific, Waltham, MA, USA) at 37 °C in humidified air containing 5% CO2. Cells were transfected with human TRPA1 cDNA (a generous gift from Dr. Yasuo Mori, Kyoto University) and 0.01 μg DsRed-Express2 vector (Takara Bio, Shiga, Japan) using Lipofectamine 3000 (Invitrogen, Carlsbad, CA, USA). Cells were reseeded onto cover slips after a 3 h incubation period and used 24–36 h after transfection.

#### *2.5. Whole-Cell Patch-Clamp Recording*

Transfected cells were identified by the red fluorescence signal excited by an LED illuminator, X-Cite XYLIS (Excelitas, Waltham, MA, USA). The bath solution contained 140 mM NaCl, 5 mM KCl, 2 mM CaCl2, 2 mM MgCl2, 10 mM glucose, and 10 mM HEPES (pH adjusted to 7.4 with NaOH). The pipette solution contained 140 mM CsCl, 5 mM 1,2-bis(o-aminophenoxy)ethane-N,N,N′,N′-tetraacetic acid (BAPTA), and 10 mM HEPES (pH adjusted to 7.4 with CsOH). Cells were treated with formalin (0.003%), FEN (10 μM), and YKS (1 mg/mL). YKS was decocted in standard bath solution at 60 °C for 15 min, and the supernatant after centrifugation (3000 rpm, 25 °C, 10 min) was used for the experiment. TRPA1 currents were recorded in voltage-clamp mode using a Multiclamp 700B amplifier (Molecular Devices, San Jose, CA, USA), filtered at 1 kHz with a low-pass filter, and digitized with a Digidata 1550B digitizer (Molecular Devices). Data were acquired with pCLAMP 11 (Molecular Devices). Pipette resistances were 3 ± 1 MΩ. The holding potential was −60 mV, and ramp pulses from −100 mV to +100 mV were applied for 300 ms every 5 s.

#### *2.6. Statistical Analysis*

Experimental data are presented as mean ± standard deviation. Statistical analyses were performed using one-way analysis of variance followed by Tukey's test or the Tukey–Kramer post hoc test for multiple comparisons (SPSS 24, IBM Japan, Tokyo, Japan). *p*-values < 0.05 were considered statistically significant.
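The analysis pipeline above (one-way ANOVA followed by all-pairwise Tukey comparisons) can be sketched in Python. The group sizes mirror the formalin test, but the duration values below are simulated placeholders, not the study's data, and SciPy's `tukey_hsd` (which reduces to the Tukey–Kramer procedure for unequal group sizes) stands in for the SPSS routine.

```python
import numpy as np
from scipy import stats

# Simulated late-phase durations (s); means/SDs loosely echo reported values,
# but these numbers are placeholders, not the experimental data.
rng = np.random.default_rng(0)
groups = {
    "FOR":     rng.normal(1700, 300, 7),
    "YKS":     rng.normal(1000, 220, 9),
    "FEN":     rng.normal(1835, 415, 9),
    "YKS+FEN": rng.normal(1400, 350, 9),
}

# Step 1: one-way ANOVA across the four formalin-injected groups.
f_stat, p_anova = stats.f_oneway(*groups.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.2e}")

# Step 2: Tukey's HSD for pairwise comparisons (valid after a significant
# ANOVA); with unequal n this is the Tukey-Kramer variant.
result = stats.tukey_hsd(*groups.values())
print(result)
```

Reporting each group as mean ± SD, as in the paper, then corresponds to `arr.mean()` and `arr.std(ddof=1)` per group.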

#### **3. Results**

#### *3.1. Formalin Test*

The analgesic effects of fentanyl and YKS were examined using the formalin test. The dose of fentanyl (0.08 μg/kg) was determined according to a previous study using the same experimental system, which reported that 0.04 μg/kg of fentanyl has no effect on formalin-induced pain whereas 0.16 μg/kg significantly inhibits it [38]. We then performed a preliminary confirmation test using doses of 0.04, 0.08, and 0.16 μg/kg and selected the dose (0.08 μg/kg) that produced a moderate, non-significant analgesic effect during the first 20 min, because a dose providing a significant analgesic effect (≥0.16 μg/kg) would have masked the effect of the drug combination. In the formalin test, effects are generally evaluated in two phases: phase I (0–10 min) and phase II (10–60 min). In the present study, we divided the total evaluation time into three phases, the early phase (0–10 min), the intermediate phase (10–20 min), and the late phase (20–60 min), because the effect of a single intraperitoneal administration of fentanyl lasts for approximately 30 min and administration was performed 10 min before pain induction.

In the early and intermediate phases, the duration of pain-related behavior was significantly increased by the formalin injection; however, the increase was significantly inhibited in the YKS+FEN group (early phase, *p* < 0.01; intermediate phase, *p* < 0.05), whereas no significant effect was observed in the YKS and FEN groups (Figure 2).

**Figure 2.** Duration of pain-related behavior with the formalin test. The combination of *Yokukansan* (YKS) and fentanyl (FEN) significantly inhibited pain-related behavior during the early phase (0–10 min after formalin injection) and the intermediate phase (10–20 min). The duration in the YKS group was significantly shorter compared with the FEN group during the late phase. FOR, formalin-injected group; YKS, YKS-treated+FOR group; FEN, fentanyl-treated+FOR group; YKS+FEN, YKS+FEN+FOR group. Mean ± SD. ## *p* < 0.01 (vs. control), \* *p* < 0.05, \*\* *p* < 0.01 (Tukey–Kramer test).

In the late phase, that is, after the effect of fentanyl had worn off, the duration of pain-related behavior in the FEN group (1835.8 ± 415.4 s) was the longest and was significantly increased compared with the YKS group (995.1 ± 220.3 s) (*p* < 0.01) (Figure 2).

#### *3.2. Immunofluorescent Staining of pERK(*+*) Cells*

The appearance of pERK in the late phase, 45 min after injection of formalin, was investigated to evaluate central sensitization. A tendency similar to that in the late phase of the formalin test was observed. Representative pictures are shown in Figure 3a. The number of pERK(+) cells in the FOR (5.96 ± 1.86 cells), FEN (6.83 ± 1.49 cells), and YKS+FEN (4.88 ± 1.42 cells) groups was significantly increased compared with the control group (0.83 ± 0.53 cells) (*p* < 0.01); however, the number of pERK(+) cells in the YKS group (3.19 ± 0.44 cells) was significantly lower (*p* < 0.05 vs. FOR, *p* < 0.01 vs. FEN) (Figure 3b).

**Figure 3.** Immunofluorescent staining of pERK(+) cells. (**a**) Appearance of pERK in the dorsal horn. The upper row shows pERK and nuclei, and the lower row shows only pERK. Red, pERK; blue, DAPI (nuclei). (**b**) The number of pERK(+) cells in the FOR, FEN, and YKS+FEN groups was significantly increased compared with the control group (*p* < 0.01); however, the number of pERK(+) cells in the YKS group was significantly lower (*p* < 0.05 vs. FOR, *p* < 0.01 vs. FEN). FOR, formalin-injected group; YKS, YKS-treated+FOR group; FEN, fentanyl-treated+FOR group; YKS+FEN, YKS+FEN+FOR group. Mean ± SD. ## *p* < 0.01 (vs. control), \* *p* < 0.05, \*\* *p* < 0.01 (Tukey's test).

#### *3.3. Whole-Cell Patch-Clamp Recording of TRPA1 Currents*

The TRPA1 channel is involved in formalin-induced pain sensation [39,40]. To investigate the pharmacological effects of fentanyl and YKS on TRPA1, we performed whole-cell patch-clamp recording in HEK293T cells expressing human TRPA1 and DsRed (Figure 4). TRPA1 was activated by 0.003% formalin, approximately the half-maximal effective concentration according to a previous report [39]. First, we applied YKS and fentanyl together, YKS alone, or fentanyl alone to check whether these agents directly activated TRPA1. Exposure to YKS or fentanyl alone did not evoke a change in basal currents (Figure 4a,b), whereas the concomitant application of fentanyl and YKS induced a slight current (Figure 4c). Subsequent addition of formalin activated TRPA1 and evoked a current with two phases, slow and rapid, under all experimental conditions. Thus, TRPA1 was inhibited by neither fentanyl nor YKS.

**Figure 4.** Pharmacological effects of formalin, fentanyl, and *Yokukansan* on TRPA1. Typical traces (left) and current–voltage relationships (right) of formalin-induced TRPA1 currents in HEK293T cells expressing human TRPA1. The concentrations of formalin (For), fentanyl (FEN), and *Yokukansan* (YKS) were 0.003%, 10 μM, and 1 mg/mL, respectively. Cells were pretreated with YKS (**a**), FEN (**b**), or YKS and FEN (**c**) for 1 min before the concomitant administration of formalin. TRPA1 was inhibited by neither fentanyl nor YKS. The holding potential was −60 mV, and ramp pulses (−100 to +100 mV, 300 ms) were applied every 5 s.

#### **4. Discussion**

Administration of fentanyl at a single high dose (e.g., ≥0.16 μg/kg, i.p. [38]) can inhibit pain induced by formalin; however, it may induce acute opioid tolerance and hyperalgesia [14–22]. We hypothesized that if the dose of fentanyl could be reduced, these risks might be mitigated. The present study examined the analgesic effect of a combination of fentanyl and YKS using a rat model of acute inflammatory pain. The effect of a single intraperitoneal administration of fentanyl lasted for approximately 20 min following formalin injection. In the early (0–10 min) and intermediate (10–20 min) phases, the combination of fentanyl and YKS significantly inhibited the time spent engaged in pain-related behavior, whereas the administration of fentanyl alone was not effective (Figure 2).

In the late phase (20–60 min), that is, after the effect of fentanyl had subsided, the duration of pain-related behavior in the YKS group was significantly shorter than in the FEN group (*p* < 0.01) and tended to be shorter than in the FOR group (*p* = 0.061). However, when a strong analgesic effect is necessary, such as during the perioperative period, the use of opioid analgesics cannot be avoided. Although the difference did not reach significance (*p* = 0.057), the duration in the YKS+FEN group was shorter than in the FEN group; therefore, the combined use of YKS and fentanyl is thought to be clinically useful.

To elucidate the mechanism of action, we investigated the expression of pERK in the dorsal horn of the spinal cord and the influence of YKS and fentanyl on TRPA1 in vitro. TRPA1, a calcium-permeable non-selective cation channel, is activated by various chemicals, including irritant exogenous ligands and endogenous ligands produced by inflammation [44]. TRPA1 expression in nociceptive primary sensory nerves is related to the reception of noxious stimuli and signal transduction to the secondary sensory nerve, and activation of TRPA1 induces hyperalgesia [44]. Pain in phase I (0–10 min) of the formalin test is mediated by the activation of TRPA1 and is attenuated by TRPA1-selective antagonists [39,40]. Therefore, the influence of YKS and fentanyl on TRPA1 was investigated. In the patch-clamp recordings, neither YKS nor fentanyl showed antagonistic activity at TRPA1 (Figure 4). One report suggested that morphine activates TRPA1 [45], so the influence of opioids on TRPA1 may differ depending on the type of opioid. In this study, we could not identify the mechanism underlying the analgesic efficacy of the combination of fentanyl and YKS in the early (0–10 min) and intermediate (10–20 min) phases. We previously reported that the administration of YKS increased the secretion of oxytocin in rats with acute psychological stress [46], and Gamal-Eltrabily et al. [47] reported that injection of oxytocin inhibits formalin-induced pain. Further studies of the mechanism of action, including this oxytocin pathway, are needed.

Mitogen-activated protein kinase (MAPK) pathways play an important role in nociceptive and neuropathic pain [42,48]. In phase II of the biphasic pain response caused by formalin, the ERK pathway was activated in the central nucleus, which may be involved in nociceptive plasticity [49]. U0126, a specific inhibitor of the ERK pathway, suppressed persistent pain induced by formalin [50]. Moreover, pERK(+) neurons in the spinal cord were increased following remifentanil infusion [51]. Thus, we explored the influence on the ERK pathway as the mechanism underlying the therapeutic effect of YKS. Our results show that YKS suppressed pERK expression, which may be related to the analgesic effect of YKS in the late phase. The anti-inflammatory action of saikosaponin A isolated from *Bupleuri* radix [52], the anti-inflammatory and anti-tumorigenic effects of total flavonoids from *Glycyrrhizae* radix [53], and the neuroprotective effects of liquiritigenin isolated from *Glycyrrhizae* radix [54] are exerted by inhibition of the ERK pathway. In the future, we will examine whether these components inhibit the ERK pathway in the present experimental system.

OIH, including RIH and FIH, is thought to be a complex physiological response involving glial cell activity [6,13], neuroinflammation [55], opioid receptor desensitization [12], and NMDA receptor activation [9–11,18]. YKS has an ameliorative effect on glutamate clearance in astrocytes and an antagonistic action at the NMDA receptor [35–37]. Additionally, we previously reported that administration of YKS attenuated the development of antinociceptive morphine tolerance and that suppression of glial cell activation in the spinal cord and mesencephalon may be one mechanism underlying this phenomenon [33,34]. These mechanisms are also thought to contribute to the preventative effect of YKS on the development of FIH. In this study, obvious FIH was not observed in the late phase, possibly because the dose of fentanyl was low; thus, further studies should be conducted using higher doses of fentanyl.

With respect to the analgesic effect of YKS, almost all clinical and basic studies have investigated chronic pain [25–32], and none have investigated acute pain. The results of this study suggest that combined use with opioid analgesics might contribute to a reduction in opioid dose and prevention of paradoxical reactions following opioid use; however, YKS alone cannot be expected to provide a sufficient analgesic effect against acute inflammatory pain.

#### **5. Conclusions**

Fentanyl may induce acute opioid tolerance and postoperative hyperalgesia when administered at a single high dose. In this study, although fentanyl was used at a dose (0.08 μg/kg) that cannot provide a significant analgesic effect on its own, the combined use of YKS and fentanyl significantly inhibited pain in the early and intermediate phases of the formalin test. Our findings suggest that YKS could reduce the required dose of fentanyl and that the combined use may be clinically useful.

**Author Contributions:** Conceptualization and methodology, Y.T. and M.S.; investigation, Y.A., Y.T., M.K., H.I., N.A., Y.O., W.Y., T.O., M.T., and M.S.; resources, N.A. and Y.T.; data curation, Y.A., M.K., Y.T., and M.S.; formal analysis, Y.A., Y.T., and M.S.; writing—original draft preparation, Y.A.; writing—review and editing, Y.T. and M.S.; visualization, Y.A. and M.S.; supervision and project administration, M.S. All authors have read and agreed to the published version of the manuscript.

**Funding:** This research received no external funding.

**Acknowledgments:** The authors wish to thank Tsumura & Co. (Tokyo, Japan) for generously providing *Yokukansan* (TJ-54) and Enago for English language review. The authors also thank Yasuo Mori (Kyoto University, Japan) for providing human TRPA1 cDNA.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **References**


**Publisher's Note:** MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## *Review* **PARP Inhibitors in Biliary Tract Cancer: A New Kid on the Block?**

**Angela Dalia Ricci 1,†, Alessandro Rizzo 1,\*,†, Chiara Bonucci 1, Nastassja Tober 1, Andrea Palloni 1, Veronica Mollica 1, Ilaria Maggio 1, Marzia Deserti 2, Simona Tavolari 2 and Giovanni Brandi 1**


Received: 26 July 2020; Accepted: 29 August 2020; Published: 31 August 2020

**Abstract:** Poly (adenosine diphosphate-ribose) polymerase inhibitors (PARPi) represent an effective therapeutic strategy for cancer patients harboring germline and somatic aberrations in DNA damage repair (DDR) genes. *BRCA1*/*2* mutations occur in 1–7% of biliary tract cancers (BTCs), but a broader spectrum of DDR gene alterations is reported in 28.9–63.5% of newly diagnosed BTC patients. The open question is whether alterations in genes with a well-established role in DDR could be considered emerging predictive biomarkers of response to platinum compounds and PARPi. Currently, data regarding PARPi in BTC patients harboring BRCA and DDR mutations are sparse and anecdotal; nevertheless, a variety of clinical trials are testing PARPi as monotherapy or in combination with other anticancer agents. In this review, we provide a comprehensive overview of the genetic landscape of DDR pathway deficiency and of the state of the art and future therapeutic implications of PARPi in BTC, looking at combination strategies with immune-checkpoint inhibitors and other anticancer agents to improve survival and quality of life in BTC patients.

**Keywords:** biliary tract cancer; cholangiocarcinoma; PARP; BRCA; olaparib; rucaparib; liver cancer

#### **1. Introduction**

Biliary tract cancers (BTCs) are a relatively rare group of malignancies arising from different anatomical locations of the biliary tree and including intrahepatic cholangiocarcinoma (iCCA), extrahepatic cholangiocarcinoma (eCCA), gallbladder cancer (GBC), and ampulla of Vater cancer (AVC) (Figure 1) [1,2].

BTC represents the second most frequent primary liver cancer after hepatocellular carcinoma (HCC), accounting for about 3% of all gastrointestinal tumors [3,4]. The incidence of BTC has increased in both western and eastern countries over the past two decades, concurrently with the rising incidence of iCCA, probably related to changes in tumor classification and better disease recognition [5]. Despite recent advances in the management of localized and metastatic disease, the prognosis of BTC patients remains dismal, since most cases are diagnosed when unresectable or metastatic, and the 5-year survival for patients with distant disease is about 5% [6]. To date, radical surgery is the only curative treatment option for BTC; unfortunately, these malignancies are frequently asymptomatic in early stages, and approximately 40% of patients considered resectable at the moment of diagnosis are found to be unresectable during exploratory laparotomy [7,8]. Systemic chemotherapy is the backbone of palliative treatment for BTC patients, with the combination of cisplatin plus gemcitabine representing the current standard of care in the front-line setting, following the results of the ABC-02 trial [9]. Although this phase III trial showed a survival advantage for cisplatin–gemcitabine over gemcitabine monotherapy, nearly all BTC patients develop progressive disease during first-line treatment, with a median overall survival (OS) of less than a year [10]. Thus, improving outcomes in patients affected by advanced/metastatic BTC represents an urgent need.

**Figure 1.** Anatomical subvariants of biliary tract cancer. AVC: ampulla of Vater cancer; dCCA: distal cholangiocarcinoma; eCCA: extrahepatic cholangiocarcinoma; GBC: gallbladder cancer; iCCA: intrahepatic cholangiocarcinoma; pCCA: perihilar cholangiocarcinoma.

In recent years, an unprecedented number of genomic studies have begun to unveil the complex molecular landscape of BTC, shedding new light on novel therapeutic opportunities for this poor-prognosis malignancy and opening the era of tailor-made oncology in BTC [11]. In fact, the emergence of novel therapies is modifying previous treatment algorithms for BTC, especially in iCCA, where targeting isocitrate dehydrogenase (IDH) mutations and fibroblast growth factor receptor (FGFR) fusions is entering clinical practice [12]. Comprehensive sequencing studies of BTC showed that nearly 40% of patients harbor a potentially targetable genetic alteration, emphasizing the genomic complexity of the disease, with several reports focused on cell-cycle dysregulation, DNA damage repair (DDR) pathway deficiency, and genomic instability [13].

*BRCA1*/*2* are the most well-studied DDR genes, and their prevalence ranges from 1% to 7% in patients affected by BTC [13], with *BRCA2* mutations suggested to be more frequent in GBC [14]. Although these mutations generally correlate with poor response to standard treatments, previous reports on BTC suggested a role for platinum salts and poly (ADP-ribose) polymerase inhibitors (PARPi) as successful therapeutic options in somatic and/or germline *BRCA* mutation (*BRCAm*) carriers [15]. Evidence from phase III clinical trials has led to PARPi approval in breast and ovarian cancers, and the use of PARPi is being extended to prostate and pancreatic cancer [16–18]. In fact, since the first launch of the PARPi olaparib in 2014, recent years have seen the FDA approval of other PARPi, including niraparib, rucaparib, and talazoparib, in distinct settings [16,19]. More specifically, niraparib can currently be used as maintenance therapy in recurrent platinum-sensitive epithelial ovarian cancer following the results of the PRIMA/ENGOT-OV26/GOG-3012 trial [16]. In this randomized phase III trial, median progression-free survival (PFS) was significantly longer in the niraparib arm than in the placebo group (21.9 versus 10.4 months) in patients with advanced ovarian cancer responding to platinum-based chemotherapy. Similarly, many other PARPi have entered clinical practice, as in the case of breast cancer, where the OlympiAD and EMBRACA trials inaugurated the "PARPi era" in HER2-negative *BRCAm* metastatic breast cancer [20,21]. According to OlympiAD, which compared olaparib monotherapy with single-agent chemotherapy of the physician's choice (capecitabine, eribulin, or vinorelbine), olaparib treatment provided a significant benefit in terms of PFS, with the risk of disease progression or death 42% lower with single-agent olaparib than with chemotherapy [20]. With a study design similar to OlympiAD, the randomized phase III EMBRACA trial compared talazoparib versus standard single-agent chemotherapy of the physician's choice (capecitabine, eribulin, gemcitabine, or vinorelbine) in advanced breast cancer patients with germline *BRCAm*, observing that talazoparib provided a statistically significant benefit in terms of PFS (8.6 versus 5.6 months; hazard ratio 0.54; 95% CI 0.41–0.71; *p* < 0.001) [21]. Moreover, PARPi have shown an overall manageable safety profile, with hematological toxicity (mainly anemia) representing the most frequent adverse event [20,21]. The incidence of grade 3–4 anemia has been reported to be around 19% in subjects receiving olaparib or rucaparib, 25% with niraparib, and 23% with talazoparib [22], while the incidence of neutropenia and thrombocytopenia ranges from 10% to 27%; thus, blood cell counts should be strictly monitored in patients receiving these treatments [22].

As previously stated, experiences in ovarian and breast cancer have paved the way toward a number of trials testing PARPi in several tumors, and PARPi are currently under active evaluation in *BRCA*-mutated biliary malignancies as well [8–10]. However, a larger spectrum of genes that compromise the DDR pathway has been reported to be altered in up to 28.9% of patients with newly diagnosed BTC, and to date, the optimal therapeutic strategy for BTC harboring homologous recombination deficiency (HRD) alterations is yet to be defined [23].

In this review, we provide a comprehensive overview regarding the genetic landscape of DDR pathway deficiency, the emerging therapeutic role of PARPi in BTC, and current perspectives and possible future therapeutic implications of DDR alterations across BTC.

#### **2. HRD, the Role of PARP in DDR and Synthetic Lethality**

DNA damage and its repair, or lack thereof, are central to the induction of mutations. Since mutations drive the onset of nearly all malignancies, under physiological conditions cells defend themselves through a series of molecular pathways, collectively termed the DDR, that handle genotoxic damage usually arising as single-strand breaks (SSBs) or double-strand breaks (DSBs) (Figure 2) [24].

**Figure 2.** Overview of DNA repair mechanisms. BER: base excision repair; HR: homologous recombination; MMEJ: microhomology mediated end-joining; MMR: mismatch repair; NER: nucleotide excision repair; NHEJ: non-homologous end-joining.

Critical pathways able to fix DSBs are homologous recombination repair (HRR)—a form of DNA repair using homologous DNA sequences—microhomology mediated end-joining (MMEJ), and non-homologous end-joining (NHEJ); the latter often leads to loss of genetic material and thus to genetic alterations [25,26]. SSBs, in contrast, are mainly repaired by mechanisms such as base excision repair (BER), nucleotide excision repair (NER), or mismatch repair (MMR) (Figure 2) [27,28]. Key elements in the DDR are the PARP enzymes, which play an important role in SSB repair and also take part in HRR and NHEJ [29].

PARP (poly (ADP-ribose) polymerase) is a family of enzymes including PARP1, PARP2, and PARP3 [30]. PARP1 is responsible for almost 80–90% of DDR activity. Structurally, PARP1 presents a DNA-binding domain at the N-terminus, with three zinc-finger-related domains able to recognize sites of damaged sequences [31]; moreover, PARP1 has a catalytic domain encompassing two subdomains: a helical domain and an ADP-ribosyltransferase subdomain that transfers ADP-ribose from NAD+ to protein residues, generating poly(ADP-ribose) (PAR) chains [32,33]. PARP1 and PARP2 act as DNA damage sensors and signal transducers, able to synthesize branched PAR chains on target proteins through a process termed PARylation [34]. When PARP1 binds DNA, its catalytic function is activated following several allosteric modifications, leading to PARylation and recruitment of DNA repair effectors, including XRCC1 [35].

*BRCA1* and *BRCA2* are fundamental genes involved in HRR [36]; since they are critical to DSB repair, germline *BRCA1*/*2* mutations are associated with a higher risk of carcinogenesis once a mutational event occurs on the other allele [37]. The same applies when other genes essential for HRR are mutated, resulting in HRD [38–40].

PARPi are oral small-molecule inhibitors of PARP1, PARP2, and PARP3 whose action is based on synthetic lethality, a well-known concept proposed nearly a century ago [41,42]. As schematically represented in Figure 3, under synthetic lethality the concurrent alteration of two different genes results in cell death, while the alteration of either gene alone does not. In the specific case of cancer treatment, if gene *A* represents a tumor suppressor gene or an oncogene, gene *B* can serve as a candidate therapeutic target used to kill cells carrying the *A* dysfunction.

**Figure 3.** Schematic figure representing synthetic lethality. As outlined, the simultaneous alteration of gene *A* and gene *B* results in cell death while the alteration of either gene does not. When the concept of synthetic lethality is applied to poly adenosine diphosphate-ribose polymerase inhibitors (PARPi) treatment, gene *B* represents a candidate therapeutic target used to target cells with gene *A* dysfunctions.
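The synthetic-lethality logic described above can be reduced to a two-variable truth table. The sketch below (Python, purely illustrative; the gene labels and survival function are abstractions, not a biological model) encodes the rule that loss of either pathway alone is tolerated while concurrent loss is lethal:

```python
def cell_survives(gene_a_functional: bool, gene_b_functional: bool) -> bool:
    """Synthetic lethality: a cell dies only when BOTH pathways are lost.
    Loss of either alone is tolerated because the other compensates
    (e.g., HRR vs. PARP-mediated SSB repair)."""
    return gene_a_functional or gene_b_functional

# A tumor cell with a gene-A (e.g., BRCA) defect survives until gene B
# (e.g., PARP) is inhibited pharmacologically; normal cells retain gene A.
assert cell_survives(True, True)        # normal cell, no drug
assert cell_survives(False, True)       # BRCA-mutant tumor cell, no drug
assert cell_survives(True, False)       # normal cell + PARPi -> tolerated
assert not cell_survives(False, False)  # BRCA-mutant + PARPi -> cell death
```

This is exactly why PARPi preferentially kill *BRCAm* cells while sparing normal tissue, which retains a functional copy of gene *A*.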

The inhibition of PARP causes the persistence of SSBs, which degenerate into DSBs [43,44]. More specifically, PARPi have two main mechanisms of action, both responsible for their antitumor effect. First, PARPi inhibit the catalytic activity of the enzyme, preventing both PARylation of the repair site and autoPARylation [45]. The second and even more significant mechanism is PARP trapping: PARPi trap PARP at its DNA-binding site, preventing repair processes and resulting in cell death by mitotic catastrophe [46,47]. Moreover, the inhibition of this pathway can force cells to use alternative damage repair systems, namely non-homologous recombination processes [48,49], which are more error-prone and can result in large-scale genomic rearrangements and, finally, in apoptotic cell death [50].

#### **3. DDR Deficiency and** *BRCAm* **in BTC**

The role of DDR alterations is still largely unknown in BTC, and only a few data about their clinical impact are currently available [51]. However, germline or somatic *BRCAm* are being increasingly reported, given the possibility of identifying a distinct subgroup of carriers who may benefit from a personalized treatment strategy [52,53]. Notably, *BRCAm* in BTC have been observed more frequently as somatic than as germline mutations [54].

The prevalence of DDR defects in BTC has been reported to range from 28.9% to 63.5%; unfortunately, this wide range reflects the current lack of consensus regarding methods for testing and defining DDR alterations in BTC [54,55]. The recent evolution of sequencing technologies and the use of comprehensive gene sequencing panels have improved the ability to detect variations in DDR genes beyond *BRCA1*/*2* [56]. Nevertheless, two major limitations of these methods are the unclear functional role of variants of unknown significance in DDR genes and the inability to identify epigenetic silencing of the same genes [57]. Moreover, the main open question is whether defects in genes with a well-established role in DDR can be considered predictive biomarkers of response to platinum compounds and PARPi [58].

Another issue concerns how many germline and somatic pathogenic variants should be tested in order to identify "BRCAness" phenotypes [59]. A panel of 17 germline and somatic DDR gene alterations (*ATM, BAP1, BARD1, BLM, BRCA1, BRCA2, BRIP1, CHEK2, FAM175A, FANCA, FANCC, NBN, PALB2, RAD50, RAD51, RAD51C,* and *RTEL1*) in addition to *BRCAm* has recently been proposed to evaluate a correlation with genomic instability in patients affected by pancreatic ductal adenocarcinoma (PDAC), while excluding potential emerging DDR genes such as *ARID1A, ATR, ATRX, CHEK1, RAD51L1,* and *RAD51L3* [60]. Notably, mutations in *ARID1A* have been reported in up to 14% of cholangiocarcinomas (CCAs) [61]; interestingly, *ARID1A*—a chromatin remodeler of the SWI/SNF (Switch/Sucrose Non-Fermentable) family—probably contributes to recruiting and stabilizing the SWI/SNF complex at DSBs, thus regulating the DNA damage checkpoint [62,63]. Moreover, evidence from in vivo and in vitro studies suggests that *ARID1A* deficiency may sensitize cancer cells to PARPi [57]. Another gene involved in HR mechanisms is *BAP1*, a tumor suppressor gene encoding a deubiquitinase that promotes DNA DSB repair [64]. Yu and colleagues suggested that BAP1-deficient cells are sensitive to ionizing radiation and other agents that induce DNA DSBs [65]; additionally, *BAP1*-mutant CCAs are likely to have poorer prognosis and a predisposition to bone metastasis development [66].

Patients with *BRCAm* are predisposed to BTC, as *BRCA1*/*2* alterations have been associated with early-onset BTC [51–54]. More specifically, data from the early 2000s by the Breast Cancer Linkage Consortium (BCLC) suggested that *BRCA2* carriers had a higher relative risk (RR) of developing BTC than patients affected by infection with liver parasites, hepatitis C virus, or hepatitis B virus (RR 4.97, 95% confidence interval (CI) 1.50–16.52) [67].

Importantly, defective DNA repair enhances tumor heterogeneity and promotes tumor progression [68]. Hence, *BRCAm* generally correlate with poor response to standard treatments, although notable responses to platinum-based treatment or PARPi have been reported [69]. In 2017, Golan and colleagues published a retrospective analysis of 18 patients with confirmed *BRCAm* CCA [15]. Interestingly, 44% of patients (8 of 18) had a personal or family history of *BRCA*-associated malignancy (breast, ovarian, prostate, and pancreatic cancer) [15]. Overall, germline testing for BTC risk is currently not recommended in clinical practice, and more efforts are needed to better identify high-risk groups that might benefit from screening, and to further explore and eventually confirm the potential predictive and prognostic value of DDR gene alterations.

#### **4. PARPi in BTC**

Available data regarding PARPi in BTC patients harboring *BRCAm* and DDR mutations are sparse and anecdotal, with OS ranging from 11 to 65 months; sporadic cases of sustained response to PARPi have been reported [15,70–72]. As previously stated, although based on a small number of subjects, the multicenter retrospective study by Golan and colleagues suggested some clinical features of patients affected by BTC with germline and/or somatic *BRCAm* [15]. The study included 18 patients, 5 with germline *BRCA1*/*2m* and 13 with somatic mutations; 13 patients were treated with platinum-based chemotherapy and 4 with PARPi. In terms of survival, BTC patients with stage I–II disease presented a median OS of 40.3 months (95% CI, 6.73–108.15), versus 25 months in stage III–IV BTC [15]. According to the results of this study, the presence of *BRCA1*/*2m* appeared to carry a more favorable prognosis, since patients experienced prolonged survival compared with historical BTC data [15]. In a recent report by Chae et al., DDR gene mutations were observed in 55 out of 88 (63.5%) patients receiving first-line platinum-based chemotherapy for advanced BTC, and DDR gene mutations were associated with longer OS (21.0 vs. 13.3 months, *p* = 0.009) and PFS (6.9 vs. 5.7 months, *p* = 0.013) after treatment with platinum salts [52]. This association between platinum sensitivity and DDR gene mutations has been widely described in other malignancies, including ovarian and breast cancer [73–75]. Platinum salts such as carboplatin and cisplatin exert their cytotoxic effects through distinct cellular mechanisms [76]; more specifically, after entering cells, platinum salts react with DNA, generating monoadducts and inter- and intrastrand cross-links, and are able to cause SSBs and DSBs [77]. Consequently, DNA replication and transcription are blocked by this structural distortion, resulting in cell cycle arrest, apoptosis, and necrosis [78].

Under physiological conditions, DNA lesions caused by platinum salts are properly repaired by DDR mechanisms; therefore, since platinum salts are DNA cross-linking agents, these compounds are more likely to be effective in *BRCAm* malignancies [79]. For example, higher rates of pathological complete response have been observed in *BRCAm* triple-negative breast cancer patients treated with neoadjuvant platinum salts compared with wild-type subjects [80]. Similarly, the randomized TNT trial highlighted a notable response rate and PFS benefit in metastatic *BRCAm* breast cancer patients receiving carboplatin compared with those receiving docetaxel [81]. This point is particularly relevant for BTC, where platinum-based chemotherapy represents the mainstay of palliative treatment following the results of the landmark ABC-02 trial and the more recent ABC-06 study [9,82,83].

To date, there is no evidence in the literature regarding the efficacy of PARPi in BTC patients harboring DDR gene alterations, with the exception of a recent case report describing clinical benefit from olaparib monotherapy in a patient affected by gallbladder cancer with an ataxia telangiectasia mutated (*ATM*)-inactivating mutation [84]. Following several trials assessing PARPi in breast and ovarian cancer, recent studies have tested the role of PARPi in patients affected by HRD gastrointestinal malignancies, with the pivotal POLO trial providing important data in this setting [71]. This randomized phase III trial suggested a novel option for precision oncology in PDAC by evaluating the PARPi olaparib (300 mg twice daily) as maintenance in PDAC patients with *BRCAm* whose disease had not progressed during first-line platinum-based chemotherapy [71]. Among the 154 enrolled patients, PFS was significantly longer in the olaparib maintenance arm than in the placebo group, at 7.4 versus 3.8 months (hazard ratio 0.53; 95% CI, 0.35–0.82; *p* = 0.004). Meanwhile, in analogy to previous reports in other solid malignancies, olaparib maintenance presented an acceptable and manageable safety profile, without a significant impact on quality of life [85]. More recently, a randomized phase II trial showed impressive response rates (75% and 64%, respectively) and survival in *BRCA1*/*2m* PDAC patients receiving platinum-based chemotherapy plus the PARPi veliparib or platinum-based chemotherapy alone as front-line treatment [86].

Considering the anatomical and histological analogies with PDAC, and in an attempt to translate this experience, multiple clinical trials are now evaluating the potential role of PARPi in metastatic BTC. We reviewed MEDLINE/PubMed and ClinicalTrials.gov for published or ongoing clinical trials evaluating the efficacy of PARPi in BTC up to 20 July 2020. The search terms used for PubMed were ((olaparib[Title]) OR (veliparib[Title]) OR (rucaparib[Title]) OR (niraparib[Title]) OR (talazoparib[Title]) OR (PARP[Title])) AND ((biliary[Title]) OR (cholangiocarcinoma[Title]) OR (gallbladder[Title])). The terms used for the search in ClinicalTrials.gov were ("Recruiting or not yet recruiting" as status), ("biliary tract cancer", "biliary tract neoplasm", "cholangiocarcinoma", "gallbladder cancer", "ampulla cancer" as condition/disease) and ("PARP", "olaparib", "veliparib", "niraparib", "rucaparib", or "talazoparib" as other terms). Table 1 summarizes ongoing trials of PARPi in BTC registered on ClinicalTrials.gov.
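The PubMed query above can be reproduced programmatically. The sketch below (Python standard library only) assembles the same Boolean term and a request URL for NCBI's public E-utilities `esearch` endpoint; the 20 July 2020 date cutoff would be applied separately, and no network call is made here:

```python
from urllib.parse import urlencode

# Title-field terms as stated in the search strategy
drugs = ["olaparib", "veliparib", "rucaparib", "niraparib", "talazoparib", "PARP"]
sites = ["biliary", "cholangiocarcinoma", "gallbladder"]

# Build "((drug1[Title]) OR ...) AND ((site1[Title]) OR ...)"
term = "({}) AND ({})".format(
    " OR ".join(f"({d}[Title])" for d in drugs),
    " OR ".join(f"({s}[Title])" for s in sites),
)

# esearch endpoint and parameter names per NCBI's public E-utilities interface
url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?"
       + urlencode({"db": "pubmed", "term": term, "retmode": "json"}))
```

Issuing a GET request against `url` would return the matching PubMed IDs; the same `term` string can also be pasted directly into the PubMed search box.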

**Table 1.** Current ongoing trials involving PARP inhibitors in biliary tract cancer (BTC) registered on clinicaltrials.gov.


CCA: cholangiocarcinoma; DCR: disease control rate; DDR: DNA damage repair; 5-FU: 5-fluorouracil; HRD: homologous recombination deficiency; IDH: isocitrate dehydrogenase; Nal-IRI: nanoliposomal irinotecan; ORR: overall response rate; **\*** somatic/germline mutation of *ARID1A, ATM, ATR, BACH1 [BRIP1], BAP1, BARD1, BLM, CHEK1, CHEK2, CDK2, CDK4, ERCC, FAM175A, FEN1, IDH1, IDH2, MRE11A, NBN [NBS1], PALB2, POLD1, PRKDC [DNA-PK], PTEN, RAD50, RAD51, RAD52, RAD54, RPA1, SLX4, WRN*, or *XRCC*; **\*\*** somatic/germline mutation of *ATM, ATR, CHEK2, BRCA1*/*2, RAD51, BRIP1, PALB2, PTEN, FANC, NBN, EMSY, MRE11, ARID1A*; **\*\*\*** deleterious mutation of *BRCA1, BRCA2, PALB2, RAD51C, RAD51D, BARD1, BRIP1, FANCA, NBN, RAD51,* or *RAD51B*.

#### **5. Future Directions**

With the aim of providing novel effective combinations, several ongoing clinical trials are evaluating PARPi in combination with other agents, including cytotoxic chemotherapy, immune checkpoint inhibitors (ICIs), and tyrosine kinase inhibitors (Table 1) [87].

Early preclinical reports suggested that PARP1 is implicated in STAT3 (signal transducer and activator of transcription 3) dephosphorylation, resulting in reduced STAT3 transcriptional activity and lower programmed death-ligand 1 (PD-L1) expression [88]. Conversely, inhibiting PARP would result in higher PD-L1 transcription and expression in cancer cells [88]. These preliminary findings have paved the way toward a number of studies assessing ICIs combined with PARPi in several malignancies, since PARP inhibition has been suggested to increase tumor mutational burden, augment DNA damage processes, and upregulate PD-L1 expression. The combination of PARPi with PD-1 inhibitors has shown interesting response rates and a manageable safety profile in early reports evaluating this therapeutic strategy [89]. In a phase I trial assessing the PARPi pamiparib with the PD-1 inhibitor tislelizumab in patients affected by advanced solid tumors, a response rate of 20% was observed, with two complete responses (4%) and eight partial responses (16%) [89]. Interestingly, this study included highly pretreated patients, with 14 out of 25 harboring a germline or somatic *BRCA1*/*2* mutation. More recently, the report by Spizzo and colleagues on 1292 tumor samples of BTC patients suggested a potential association between *BRCAm* and ICI response, with tumor mismatch repair, microsatellite instability status, and PD-L1 overexpression associated with *BRCAm* [54].

Another interesting strategy is based on angiogenesis. Hypoxia drives the downregulation of DNA repair processes, which in turn may result in genomic instability [90,91]. Therefore, the combination of PARPi and anti-angiogenic agents could enhance synthetic lethality, as witnessed in other solid malignancies such as ovarian cancer [92]. Unfortunately, acquired resistance to PARPi is a major issue in patients receiving these molecules; several potential mechanisms have been suggested, including the inactivation of the DNA repair proteins 53BP1 or REV7 [93,94]. Thus, novel drug combinations and treatment strategies able to overcome, or at least delay, the emergence of resistant clones are required [95]. PI3K/Akt, MAPK, and other mitogen signaling pathways have been related to reduced HR repair and, consequently, have been associated with secondary resistance to PARPi [96]. As in the case of ICIs, preclinical and early clinical reports have suggested a possible synergistic activity of PI3K and MEK inhibitors combined with PARPi [97,98], and further data are awaited.

Lastly, another strategy could be based on targeting IDH, a therapeutic option that is entering clinical practice, with *IDH1* and *IDH2* mutations occurring in about 20% of iCCA patients [99,100]. Wild-type IDH1 catalyzes the conversion of isocitrate to alpha-ketoglutarate; in the case of *IDH* mutations, the mutant enzyme converts alpha-ketoglutarate into 2-hydroxyglutarate (2-HG), which plays a role in tumor progression [101,102]. Since preclinical reports have detected alterations in the HR pathway and increased PARPi sensitivity in *IDH1*-mutated malignancies, the strategy of combining PARPi with IDH-targeted treatments is under evaluation in the subgroup of BTC patients harboring *IDH* mutations (Table 1) [103,104].

#### **6. Conclusions**

Unfortunately, patients with advanced/metastatic BTC have a dismal prognosis and few therapeutic options; therefore, there is an urgent need for novel treatment strategies in this setting. While PARPi have shown meaningful activity in several solid tumors, further efforts are needed to define the role of these novel agents in BTC. A key point will certainly be the identification of the patients most likely to benefit from PARPi monotherapy or combinations. Combination strategies of PARPi with ICIs and other anticancer treatments are being tested, and the results of these investigations are awaited, with the hope of increasing the number of medical options and improving survival and quality of life for BTC patients.

**Author Contributions:** Conceptualization, A.D.R. and A.R.; methodology, A.D.R. and A.R.; writing—original draft preparation, A.D.R., A.R., C.B., and N.T.; writing—review and editing, A.D.R., A.R., C.B., and N.T.; visualization, A.P., V.M., I.M., M.D., S.T., and G.B.; supervision, S.T. and G.B.; project administration, A.D.R., A.R., and G.B. All authors have read and agreed to the published version of the manuscript.

**Funding:** This research received no external funding.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **References**


© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## *Review* **Minimising Blood Stream Infection: Developing New Materials for Intravascular Catheters**

### **Charnete Casimero, Todd Ruddock, Catherine Hegarty, Robert Barber, Amy Devine and James Davis \***

School of Engineering, Ulster University, Jordanstown BT37 0QB, Northern Ireland, UK; casimero-c@ulster.ac.uk (C.C.); ruddock-t1@ulster.ac.uk (T.R.); hegarty-c19@ulster.ac.uk (C.H.); barber-r@ulster.ac.uk (R.B.); devine-a14@ulster.ac.uk (A.D.)

**\*** Correspondence: james.davis@ulster.ac.uk; Tel.: +44-(0)289-0366-407

Received: 30 July 2020; Accepted: 24 August 2020; Published: 26 August 2020

**Abstract:** Catheter related blood stream infection is an ever present hazard for those patients requiring venous access and particularly for those requiring long term medication. The implementation of more rigorous care bundles and greater adherence to aseptic techniques have yielded substantial reductions in infection rates, but these rates are still far from acceptable and continue to place a heavy burden on patients and healthcare providers. While advances in engineering design and the arrival of functional materials hold considerable promise for the development of a new generation of catheters, many challenges remain. The aim of this review is to identify the issues that presently impact catheter performance and provide a critical evaluation of the design considerations that are emerging in the pursuit of these new catheter systems.

**Keywords:** intravascular catheter; CRBSI; biofilm; CVC; antimicrobial; antifouling

#### **1. Introduction**

Intravascular catheters are ubiquitous in contemporary care and it has been estimated that 30–80% of hospital patients will have a peripheral venous catheter (PVC) in place at some point during their stay [1–3]. While PVCs are by far the most commonplace, a wide range of catheter designs are employed to aid the delivery of life saving fluids and these differ in terms of anticipated use. In contrast to the short-lived PVCs, central venous catheters (CVCs) are designed for main vein access and mid to long-term (months/years) applications such as: the delivery of chemotherapy drugs, nutritional fluids and haemodialysis. Used across every hospital unit as well as for outpatient management, CVCs are some of the most common indwelling medical devices of modern times with approximately 5 million catheters inserted annually in the US (cf. 330 million PVCs) [4,5]. While such devices provide ease of access to the vascular highways of the body, it must also be recognised that they can serve as an entry point for life threatening infection [6]. It has been estimated that CVCs are some 64 times more likely to result in a catheter related blood stream infection (CRBSI) than other intravascular access devices [7].

Patients in intensive care units (ICUs) across the US are exposed to an estimated 15 million catheter days of CVC usage each year [8], of which up to 8% result in CRBSI [5,9]. CRBSI-induced sepsis accounts for 25% of annual haemodialysis patient mortality [10]. The US Centers for Disease Control and Prevention (CDC) estimates that a total of 250,000 blood stream infections are diagnosed each year [11], of which 80,000–120,000 are catheter-related [8,12]. While rates of attributable mortality vary from 12% to 25% for nosocomial CRBSIs [11,13,14], a meta-analysis concluded that an average of 3% of CRBSIs result in death [8]. Further complications can also arise, such as intravascular thrombosis and endocarditis, leading to myocardial infarction or stroke [15]. It is of little surprise that the consequences of CRBSI result in a considerable economic burden on both the healthcare provider and patient. The attributable cost per infection in the US varies considerably based on infection type and healthcare factors; costs are reported to range from USD 3700 to USD 56,000 per patient, with an upper estimate of USD 2.3 billion spent on CRBSIs arising from CVC use annually [8,16].
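The quoted figures can be cross-checked with simple arithmetic. The sketch below treats the point estimates above as inputs and derives the implied annual CRBSI deaths and the catheter-related share of all blood stream infections (illustrative only; the inputs are the estimates cited in the text, not new data):

```python
# Point estimates quoted above (US, annual)
bsi_total = 250_000                   # blood stream infections diagnosed per year
catheter_related = (80_000, 120_000)  # of which catheter-related (low, high)
mortality_rate = 0.03                 # meta-analysis average: 3% of CRBSIs fatal

# Implied annual deaths attributable to CRBSI
deaths = tuple(round(n * mortality_rate) for n in catheter_related)

# Catheter-related infections as a percentage of all blood stream infections
share = tuple(round(100 * n / bsi_total) for n in catheter_related)

print(deaths)  # (2400, 3600)
print(share)   # (32, 48)
```

In other words, the cited estimates imply roughly 2400–3600 CRBSI deaths per year, with catheter-related cases accounting for about a third to a half of all diagnosed blood stream infections.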

The combined morbidity, mortality and economic burden posed by CRBSIs is an ever present concern among clinicians and has prompted much effort in the search for more effective frontline procedures (i.e., aseptic technique education and prophylactic antimicrobial lock systems) [17–19], but there have also been radical rethinks of catheter design and the development of new catheter components, materials and smart sensing solutions. There has been substantial progress in these areas in recent years but, despite concerted efforts in education and the implementation of care bundles, bacterial contamination and the resulting CRBSI remain problematic. As such, there are considerable opportunities for recent advances in materials and sensing technologies to complement improved clinical practice in providing a more integrated solution. Materials that minimise the adherence of bacteria to the intraluminal space or provide antimicrobial action, and sensors that can proactively monitor the condition of the line, herald a new generation of smart catheter systems that aim to eliminate infection or provide early warning diagnostics. The aim of this review is to provide a critical evaluation of the design considerations that are emerging in the pursuit of these new catheter systems. While the focus here is on CVCs, it should be recognised that bacterial contamination is commonplace and much of the discussion should be transferrable to a range of devices.

#### **2. Catheter Components**

There are a variety of CVC subsets available, the most common of which are: non-tunnelled, tunnelled, peripherally inserted central catheters (PICC) and totally implantable venous access ports (TIVAPs) [20]. The main components of a central venous catheter and the implanted venous access ports are highlighted in Figure 1A,B, respectively. In the case of CVCs, these are typically inserted through the jugular, subclavian or femoral veins to either the superior vena cava, right atrium or inferior vena cava of the heart [20,21].

**Figure 1.** (**A**) Tunnelled central venous catheter and associated components. (**B**) Totally implantable venous access port.

Selection of the catheter system is invariably based on patient-specific factors such as: purpose, anticipated lifetime and the frequency with which the catheter will need to be accessed. It can be anticipated that the longer the catheter is in place, the greater the risk of infection. It is of little surprise therefore that tunnelled CVCs and TIVAPs, which are intended for long term use (typically months to years), dominate the CRBSI literature. While TIVAPs are primarily designed for periodic/infrequent applications (i.e., haemodialysis) [22], tunnelled CVCs are generally targeted at those interventions where regular administration of fluids, medication, parenteral nutrition or the aspiration of blood is required. As such, the frequency with which the needle free connector (NFC) is manipulated can be particularly problematic, with the majority of infections arising as a consequence of its contamination [23,24].

A multitude of NFCs are available commercially but most share common design features relating to the access port. In almost all cases, access to the catheter is activated through the insertion of a male Luer connector (from a fluid giving set or syringe) which causes the deformation of a silicone septum and thereby provides access to the catheter line [25–27]. Three of the more common approaches are highlighted in Figure 2. While they differ in terms of internal mechanism, most rely on a split septum design which, when disconnected, acts as a physical barrier to the entry of bacteria. Solid sealed silicone surfaces (BD Max Zero™) are also available and, while these potentially reduce the surface crevices within which bacteria can adhere, they still rely on the Luer activated displacement/compression of the silicone cap within the device to enable fluid flow to or from the line.

**Figure 2.** Needle free connectors based on (**A**) simple split septum, (**B**) mechanical spring compression and (**C**) blunt cannula. Computerised tomography scans of the internal components of an intensive care unit (ICU) Medical Clave™ connector before (**D**) and after connection to a giving set with a Luer connector (**E**).

A variety of engineering features have been implemented in recent years as a means of improving the performance of such devices in terms of haemocompatibility and reducing the potential for CRBSI. The presence of blood within the fluid pathway of an NFC can result in haemolysis of red blood cells, which increases the risk of a fibrin clot leading to occlusion and ultimately prevents fluid transfer. Moreover, it provides a pool of nutrients that can promote the growth of bacteria [25]. Body movements (muscle flexing, respiration, coughing etc.) and clamping of the catheter can all induce changes in the mechanical and physiological pressure within the catheter that serve to push blood along the line [28,29]. It has been estimated that even the smallest blood reflux (4–30 mL) can result in fibrin activation and occlude the inner pathways of the NFC [25]. Removing blood from the NFC is a critical concern and there has been an increasing shift from opaque NFC structures to more transparent polymers that enable visual inspection of the internal workings of the hub. The movement of blood within the catheter, and its propensity to travel (reflux) to the needle free hub upon the insertion and disconnection of the external Luer, is however dependent on the design of the NFC hub [25,26].

It is of little surprise therefore that understanding the operation of these systems can be a major factor in minimising the risk of infection. Depending on the displacement of blood upon insertion/disconnection, hubs are generally classified as: negative, neutral, positive and anti-reflux. A summary of the various designs and their mechanism is provided in Table 1.


**Table 1.** Classification of needle free connector hubs in terms of fluid displacement.

It can be seen from Table 1 that, in the case of negative and positive displacement systems, there is a recommended clamping procedure associated with their use to prevent the inadvertent reflux of blood. There is, however, an assumption that the clinical staff (or the patient in the case of home parenteral nutrition (HPN)) are aware of the mode through which a particular NFC operates. Hadaway (2011), in a survey of healthcare workers, found that of 554 responses, some 25% were unaware of whether the NFC used in the CVC line they were managing was positive, negative or neutral [30]. Moreover, 47% were unsure as to the correct flushing and clamping procedures associated with a particular NFC, with the situation being compounded by the presence of multiple NFC brand variants in use within a given institution [27,30,31]. The early introduction of NFCs was characterised by an increase in CRBSI, and a lack of appropriate training in device operation has often been cited as a contributing factor. This has been corroborated in instances where an institution switched NFC brands and recorded an increase in CRBSI rates, only to find the rates returned to previously lower levels when use of the original NFC system was resumed [31]. While improvements in the physical design of NFCs will undoubtedly aid approaches to the prevention of CRBSI, its continuing prevalence highlights that there is much still to be done.

#### **3. Pathogens, Colonisation, Biofilms and Infection**

In general, catheter related infection occurs mainly through two mechanisms: migration of adventitious skin pathogens (e.g., *S. aureus*) along the external surfaces of the polymer tubing through the cutaneous tract to the blood stream (extraluminal) and, as noted in the previous section, ingress via the needle free connector hub (intraluminal) [24]. In either case, upon contact with the blood, the microbes interact with fibrin to yield an adherent biofilm which promotes microbial colonisation and furthers the spread of the organisms. Extraluminal contamination has historically been more common in short term intravascular devices (peripheral venous/arterial catheters and non-cuffed/non-tunnelled CVCs) and typically arises through issues encountered during insertion/implantation [24].

Breaching the skin barrier to enable the insertion of a medical device inevitably provides an opportunity for skin flora or adventitious contaminants from the healthcare environment to gain access to the underlying tissues and thereby sets a foundation for subsequent infection. Such issues are not restricted to intravascular access devices but have become increasingly common with cardiovascular implantable electronic devices (CEID) such as pacemakers, cardioverter–defibrillators, cardiac resynchronisation devices etc. This is compounded by the increasing longevity of the patient cohorts, where the number of surgical interventions (revisions, extractions and upgrades) has increased substantially year on year [32,33]. As the prime risk occurs at the time of insertion, it is of little surprise that repeated surgical replacement of a CVC (or the CEID) increases the risk of infection. The implementation of catheter care initiatives during the insertion of CVCs (such as the Keystone Central Line Bundle [34] and Epic3 guidelines [19]) has however led to significant improvements in outcomes and, in the US, to reductions in insertion related infections [23,25,35].

In contrast, contamination of the NFC and internal lumen tends to occur post-operatively as a consequence of failures in the aseptic manipulation of the connecting hub prior to the administration of fluids or aspiration of blood. Intraluminal colonisation tends to be predominant in those devices intended for longer term function such as cuffed Hickman and Broviac type catheters, cuffed haemodialysis CVCs, TIVAPs, and PICC systems [36]. It is of little surprise that the increased duration of placement and frequency with which such lines are accessed will also increase the risk of CRBSI where there will be more opportunities for the intraluminal migration of planktonic (free-swimming) bacteria arising from a contaminated hub to the bloodstream [7,24]. Examination of the microbial contamination of NFCs after periods of non-use found that colony forming units (CFU) varied from 15 to 1000 CFU which, if improperly disinfected, would be sufficient to induce colonisation of the catheter and result in bacteraemia [31,37,38]. Potential trouble spots in the use of the NFC, in terms of disinfection, relate to the point where the sterile Luer connector or syringe contacts the surfaces of the NFC—mainly the septum, side threads and side surfaces (between septum and NFC structure) [23]. The presence of grooves/gaps either in the core design between septum seal and housing (as highlighted in Figure 3 for the Clave™ NFC), or as a result of repeated use (i.e., abrasion or other physical damage) can all influence the ease with which disinfection can be achieved [26,27,30,39].

**Figure 3.** ICU Medical Clave™ needle free connector (**A**). Optical image of the septum (**B**) and cut through section (**C**) highlighting gaps in the structure.

While the majority of CRBSI are known to originate from issues in aseptic manipulation, two other sources of infection also need to be considered, both of which are effectively independent of any attempt at external decontamination of the catheter or the NFC. Contamination of the fluid to be infused will effectively bypass any aseptic precautions employed during administration and offers microbes unimpeded access to the bloodstream [40]. Fortunately, such occurrences are exceedingly rare. Haematogenous seeding occurs when pathogens already present in the bloodstream as a consequence of local infection (e.g., pneumonia) encounter the foreign extraluminal surface and subsequently colonise it (indicated in Figure 1A). It must also be recognised that a contaminated catheter can itself serve as a seeding source for the contamination of other intravascular devices [41]. The pathogens most commonly cultured from infected CVCs are listed in Table 2. *Staphylococcus aureus* and coagulase-negative staphylococci (typically *S. epidermidis*) are the two pathogens most frequently isolated in CRBSI cases, with *S. aureus* responsible for between 10% and 25% of infections [42,43].


**Table 2.** Catheter related blood stream infection (CRBSI) associated pathogens [42].

The introduction of bacteria to the lumen of the catheter line will inevitably result in the formation of a biofilm (largely polysaccharide in nature) which serves as a foundation for the sustained growth of the microbes and ultimately as a latent infective source [44,45]. The film itself is an extracellular 3D network that protects the emerging communities by serving as a physical barrier against the body's intrinsic immune response (phagocytes) and by limiting the diffusion of antibiotics. These protective qualities can be particularly problematic when attempting to salvage a catheter (rather than replace it directly), where the presence of any surviving bacteria within the biofilm can lead to a resumption of the infection [45]. As such, conventional antimicrobial therapies require concentrations some 100–1000 times greater than the normal minimum inhibitory concentrations (MIC), maintained over longer durations, in order to eradicate the biofilm [46]. A more worrisome issue is that, beyond aiding bacterial reproduction, the biofilm can promote alterations in bacterial gene expression, leading to mutations and modifications in the physiology of the pathogenic antigens, limiting immune and drug responses and contributing to antimicrobial resistant species [47,48].
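To put the 100–1000-fold figure above in concrete terms, the sketch below converts a planktonic MIC into the concentration window typically targeted when attempting biofilm eradication (as in antibiotic lock therapy). The function name and the example MIC are illustrative assumptions, not values taken from the cited studies.

```python
def biofilm_eradication_window(mic, low_fold=100, high_fold=1000):
    """Concentration window (same units as the MIC) typically required to
    eradicate biofilm-embedded bacteria: 100-1000x the planktonic minimum
    inhibitory concentration (MIC) [46]."""
    return mic * low_fold, mic * high_fold

# Hypothetical pathogen with a planktonic MIC of 2 ug/mL:
low, high = biofilm_eradication_window(2.0)
print(f"target window: {low:g}-{high:g} ug/mL")  # target window: 200-2000 ug/mL
```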

Extra and endoluminal colonisation rates are dependent on the interaction between the physiological properties of the pathogen and the surface characteristics (i.e., hydrophobicity/hydrophilicity) of the catheter [49]. The initial adhesion of pathogens is aided by the formation of a conditioning film (comprised of platelets and plasma proteins such as albumin, fibrinogen, and fibronectin) that binds to the luminal surfaces [50]. Pathogens will more readily adhere to this film than to the bare catheter material itself [49] and, once attached to the luminal surface, will rapidly establish a biofilm. It is of little surprise that there have been extensive efforts to modify the surface of the polymers used in the production of both the catheters and NFCs such that the initial deposition of the conditioning film is impeded.

#### **4. Current Practice**

#### *4.1. Disinfection*

Alcohol/antimicrobial wipes are widely employed as the primary anti-infective measure in the management of catheter lines and the decontamination of NFCs. Alcohol wipes (typically 70% isopropyl alcohol (IPA)) are the most commonly applied measure and their biocide activity relies on their ability to dehydrate the bacterial cell, both during application and as the alcohol evaporates [51,52]. While the use of the alcohol alone can be effective, its efficacy can be greatly enhanced by the presence of an appropriate antimicrobial agent (chlorhexidine or povidone iodine), where the disinfection mixture exploits the immediacy of the alcohol and the sustained action of the antimicrobial agent [23,30,53–56].

Alcohol disinfection is not however foolproof and will always be subject to human factors (time allocated to the procedure, friction applied etc.) and device designs (ability to penetrate the device crevices), and there remains a contentious debate as to whether complete removal of bacteria from NFCs is in fact possible [52,57–60]. Menyhay and Maki (2008), in an in vitro study of 30 NFCs contaminated with *E. faecalis* and then subsequently disinfected with 70% IPA, found that 67% of the NFCs continued to transmit microbial contaminants (440–25,000 CFU) [52]. The time allocated to disinfection appears to be a significant factor, with studies by Kaler and Chin [52] finding that 15 and 60 s cleaning cycles with 70% IPA eliminated all organisms, whereas studies by Smith et al. (2012), Simmons et al. (2011) and Rupp et al. (2012) highlighted that short to moderate cleansing times (3–15 s), while decreasing bacterial load, were less effective at total decontamination [57–59]. There is nevertheless considerable variability and there are contrasting results, as befits the nature of human intervention involving "scrubbing the hub". The UK EPIC3 report, based on the evidence-based evaluations of an expert panel, has recommended that NFCs be disinfected with 70% alcoholic chlorhexidine with the application of friction pre and post access [19].
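Decontamination results such as those above are commonly compared as log10 reductions in colony forming units (CFU), a standard microbiological metric; the short sketch below shows the calculation (the CFU values are illustrative, not data from the cited studies).

```python
import math

def log10_reduction(cfu_before, cfu_after):
    """log10(N0/N): a value of 3.0 corresponds to a 99.9% kill.
    A zero post-disinfection count should be replaced by the assay's
    detection limit before taking the logarithm."""
    if cfu_before <= 0 or cfu_after <= 0:
        raise ValueError("CFU counts must be positive")
    return math.log10(cfu_before / cfu_after)

print(round(log10_reduction(1000, 1), 2))     # 3.0: a 99.9% reduction
print(round(log10_reduction(25000, 440), 2))  # 1.75: partial decontamination
```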

#### *4.2. Education and Aseptic Techniques*

Training is at the core of the management of CVCs, from implantation to the day to day care of the line, but the reliance on human compliance and adherence to the main tenets of aseptic manipulation can be an inherently variable phenomenon [61–64]. While the introduction of care bundles has led to significant gains, it must also be noted that improved compliance rates are seldom universal. A study by Jeong et al. (2013) revealed compliance rates of only 37% after the intervention [63] and a meta-analysis by Ista et al. (2016) highlighted that total compliance is essentially unattainable [18].

While the significance of disinfection of the NFC prior to accessing the catheter has long been recognised, it is surprising that it remains a common point of failure [23,30,64]. Patients with long term CVCs undergoing total parenteral nutrition (TPN) are trained to administer their nutrition at home, or have it administered at home on their behalf, known as home parenteral nutrition (HPN). In a study conducted by Bond et al. (2018), 16% of HPN patients contracted at least one CRBSI, accounting for 0.31 CRBSI per 1000 catheter days [65]. Within this group, the rate of CRBSI per 1000 catheter days was 0.27 when HPN was administered by a trained home care nurse, compared to 0.342 and 0.320 when self-administered or administered by a non-medical carer (such as a family member), respectively. It is noteworthy that, although there are gains in having a trained caregiver, the benefits are only marginal compared with having the patient manage the line. There is little doubt that more attention is required for training in aseptic access and maintenance [66].
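The infection rates quoted in the Bond et al. study are normalised per 1000 catheter days, the standard denominator for comparing CRBSI incidence across cohorts. A minimal sketch of that normalisation (the cohort numbers are illustrative, not drawn from the study):

```python
def crbsi_per_1000_catheter_days(infections, catheter_days):
    """Normalise an infection count to the standard per-1000-catheter-day rate."""
    return infections / catheter_days * 1000

# Hypothetical cohort: 31 CRBSI episodes over 100,000 catheter days
print(round(crbsi_per_1000_catheter_days(31, 100_000), 2))  # 0.31
```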

#### *4.3. Catheter Locks*

A second line of defence in preventing bacterial ingress rests on the use of catheter locks. A lock is typically applied when the catheter is not in use and the line has been flushed with saline, the primary purpose being to remove blood from the line such that occlusion and bacterial growth are minimised (discussed in Section 2) [26,27,30,67]. While the use of a saline flush is standard, other components can also be introduced to perform a variety of roles: anti-occlusion (heparin), antibiotic (vancomycin, gentamicin) and antimicrobial (citrate, ethanol and taurolidine) [68–70]. Vancomycin and gentamicin are generally reserved for therapeutic measures once a CRBSI has been diagnosed [22,46,71–73]. In contrast, heparin, citrate and, increasingly, taurolidine are used prophylactically [9,74–76]. There is no standard recommendation as to the use of catheter locks, whose over-arching function is to maintain the integrity of the line. An extensive literature on their application and investigations of their efficacy has emerged in recent years, but a detailed discussion of their individual use is beyond the scope of the present study and the reader is directed to more comprehensive reviews [68–70].

#### *4.4. Barrier Caps*

In most cases, the silicone septum is the main physical barrier preventing the entry of microorganisms to the catheter lumen. Disinfection, while widely recognised as critical, still falls foul of issues relating to adherence and the vagaries of the person performing the cleansing process [23,25,27]. NFC barrier caps have come to the fore in recent years by providing a passive means of continuous disinfection whilst the catheter is not in use [77–84]. A number of different approaches have been taken in the design of the disinfectant barrier cap and their modes of operation are summarised in Figure 4. The simplest approach is the incorporation of a foam/gauze insert soaked with 70% IPA (i.e., 3M Curos™, SwabCap®, Site-Scrub®) or one that contains an antimicrobial (povidone iodine or alcoholic chlorhexidine), which is threaded onto the Luer connector when the line is not in use. As the foam pad contacts the silicone septum, the twisting of the cap provides a modicum of friction which can aid in the removal of bacteria. Menyhay et al. (2008) have reported on a refinement of the basic design whereby a two-part system is employed with the antimicrobial contained within a capsule (Figure 4A) [52]. As the cap is threaded onto the Luer, the capsule is forced into contact with a spike at the top of the cap which pierces the capsule, releasing the antimicrobial into the foam. The core advantage of this and similar systems is that their application is relatively independent of the NFC manufacturer; they simply require a Luer connector. An alternative design has been proposed by Buchmann et al. (2009) whereby the cap encapsulates the entire terminal end of the NFC including the thread (Figure 4B) and, in contrast to the simple insert cap highlighted in Figure 4A, is intended to remain attached during flush procedures [85]. Mariyaselvam et al. (2015) reported that contamination of syringe tips is an often overlooked factor in the development of CRBSI and hence retention of the antimicrobial foam as a secondary septum could aid in the decontamination of the tips prior to entering the NFC [86].

In both designs, the barrier cap does not interact directly with the catheter lumen but aims to disinfect the silicone septum and surrounding area, and both are intended for applications where the frequency of access may be high. ClearGuard®, in contrast to the previous designs, relies upon a chlorhexidine impregnated rod that is inserted directly into the end of the catheter lumen in place of the NFC connector (Figure 4C). The chlorhexidine diffuses into the intraluminal space and has been shown to be highly effective at reducing CRBSI risk within haemodialysis patients, a cohort that has hitherto been characterised by high rates of infection [82].

**Figure 4.** Needle free connector barrier caps based on simple foam insert (**A**), an encapsulating split septum (**B**) and the Clearguard® terminal cap (**C**).

#### **5. Catheter Designs and Antimicrobial Mechanisms**

The provision of training programmes to educate clinicians and patients in the maintenance of vascular access lines is a critical frontline response that can dramatically lower the risk of CRBSI but, as noted earlier in Section 4.2, achieving 100% compliance and prevention is unlikely. The provision of antibacterial caps is another advance which has demonstrated considerable gains but, as with conventional aseptic practice, there remains a human element in their successful application, as well as several contentious cost issues that can be a barrier to their widespread adoption. The development of materials resistant to microbial colonisation has long been regarded as a means through which to counter such lapses in practice, with the elimination of the propensity to form a biofilm the principal goal. The majority of the research efforts targeted at this problem are directed towards materials that possess antimicrobial and/or antifouling properties, and a summary of the various strategies is highlighted in Figure 5.

**Figure 5.** Material approaches to counter bacterial colonisation of central venous catheters.

Tunnelled CVCs, non-tunnelled CVCs and PICC-lines are typically made of polyurethane (PU) or silicone materials, with tunnelled CVC cuffs composed of polyethylene terephthalate (PET) [87]. The preference for PU stems from the versatility with which the physical and chemical properties of the material can be manipulated through judicious choice of the monomers [88–90]. In general, PU is prepared from the reaction of hydroxyl and isocyanate groups to yield the carbamate linkage (the urethane) and, with a large array of commercially available polyols and polyisocyanates bearing a range of chemical functionalities from which to choose [89,90], it is of little surprise that the properties of the polymer can be tuned to particular applications. The polymer properties can be critical in influencing the processability of the material and its resulting mechanical performance and biocompatibility [88,91–93]. Critically, the ability to alter the chemical functionality allows for modifications to the hydrophilicity/hydrophobicity of the surface and, through the presence of reactive chemical side chains, provides the ability to further tailor the catheter interface with antimicrobial/antifouling features [88,93–101].

#### *5.1. Biocide Release*

The antimicrobial systems employed on both the extra and endoluminal surfaces aim to kill the bacteria (or at the very least inhibit further growth) and a number of different mechanisms have been evaluated. Drug-eluting materials, in which the biocidal agent is released passively into the lumen, represent the most common approach (as listed in Figure 5) and cover a wide variety of chemical species [95,102–110]. Incorporation within the catheter can be achieved through simple adsorption of the active agent, but more recent strategies have involved electrostatic interactions with surfactants and polyelectrolyte systems to yield more coherent and stable coatings. Thermally stable biocides such as silver ions/complexes can be melt processed along with the polymer used in the production of the catheter such that the antimicrobial is homogeneously distributed throughout the material. Alternatively, exposing the catheter surfaces to a suitable solvent can induce swelling of the polymer and impregnation/incorporation of the biocide at low temperature. Some of the commercial antimicrobial catheter systems and their characteristics are compared in Table 3.


**Table 3.** Commercial catheter incorporating antimicrobial/antifouling features.

The efficacy of employing antimicrobial catheters has been widely studied for most of the systems outlined in Table 3 (or their equivalents), and there is a substantial body of literature reporting marked improvements over the use of unmodified catheters. There have also, however, been numerous reports that have found little benefit. It must be noted that there are a large number of factors involved in the maintenance of CVCs (as noted in earlier sections) which can make comparisons between the different systems, and between coated and uncoated catheters, challenging. Nevertheless, a number of systematic reviews have conducted meta-analyses of the available literature and there is substantive evidence that catheters coated/impregnated with chlorhexidine/silver sulfadiazine and antibiotics (5-fluorouracil, vancomycin, benzalkonium chloride, teicoplanin, miconazole/rifampicin, minocycline, and minocycline/rifampicin) were associated with lower incidences of catheter colonisation and had the greatest potential to reduce the incidence of CRBSIs per 1000 catheter days. In contrast, the efficacy of silver impregnated systems is much more contentious and, in many cases, fails to yield statistically significant results.

The Healthcare Infection Control Practices Advisory Committee (HICPAC) recommends the use of a CVC impregnated with chlorhexidine-silver sulfadiazine (CSS) or minocycline-rifampicin (MR) in patients with at least five consecutive days of catheterisation [111]. A systematic survey by Lai et al. (2016) found that the most effective system within this class of material was the rifampicin-based one, but it must be noted that such studies were of limited duration and their efficacy in CVCs destined for long term placement is questionable [112]. As such, most of the commercial systems highlighted in Table 3 recommend a maximum dwell time. A core issue is the limited repository of the drug within the polymer or coating, such that release does not simply terminate after a given time period but tails off into sub-lethal doses, which can facilitate the development of antimicrobial resistance.

Putting the issue of dwell time aside, the release of antibiotic moieties (e.g., rifampicin) has a long history, but the increasing threat of bacterial resistance has driven considerable effort to examine alternative antimicrobial agents. Various antimicrobial peptides [100,108,113–115], guanidine derivatives (i.e., polyhexamethylene biguanide, polyarginines) [103,104], quaternary ammonium compounds [96,98], nitric oxide precursors [105,106,116,117], silver [118–120] and a host of other small molecules/metal ions or nanoparticles [107,110,118,121–124] with possible biocidal activity have all been investigated as potential modifiers for use in catheters and, while these invariably impact bacterial colonisation, they have yet to make the leap to commercial exploitation and/or substantive clinical trials.

#### *5.2. Contact Kill Systems*

Contact killing of bacteria, in contrast to passive elution, relies on the immobilisation of the antimicrobial at the catheter surface and, as such, sets out to present a lethal barrier to microbes attempting to colonise the catheter surfaces [96,98,100,104,114,115,119,125,126]. Grafting through plasma processes, polymerisation of a biocide functionalised monomer, covalent linkage (i.e., click chemistry) onto side chains or the deposition of insoluble layers are common techniques through which the catheter interface can be functionalised. Quaternary ammonium compounds (QACs), guanidine derivatives, antimicrobial peptides (AMPs) and, more recently, graphene/graphene oxide [125] systems have all been evaluated as contact killing agents. In the case of QACs and AMPs, their cationic functionality and ability to disrupt the phospholipid bilayer are the main weapons through which they attack the integrity of the microbial cell wall/membrane. The mechanisms through which graphene and its various analogues work are more contentious, though there is evidence to suggest that the edge planes of graphene platelets directly exert a membrane disruption effect. The in situ generation of reactive oxygen species (ROS) through redox cycling of quinoid functionalities in graphene oxide has also been shown to be a potential cytotoxic pathway [125].

As the active agents are immobilised at the surface of the catheter, the biocidal activity is, at least in a model system, capable of being maintained indefinitely. Unfortunately, the need for direct contact between the agent and the bacteria can also be a significant limitation. The deposition of conditioning films or macromolecular debris (i.e., from dead bacteria or non-specific binding of proteins) along the lumen can negate the antimicrobial effects by preventing these killing interactions. Although the contact killing approach, like the drug elution systems, appears to offer only short-term activity, it should be noted that, unlike the latter, its underpinning mechanism has no impact on emerging antimicrobial resistance. Applied in isolation, the contact approach is clearly limited by fouling but, were fouling prevented, long term effectiveness could be envisaged.

#### *5.3. Surface Hydration*/*Hydrophilicity*

Prevention of fouling has been the second main route through which to avoid biofilm formation and minimise the risk of both CRBSI [97,99,114,127] and catheter related thrombotic complications [128–131]. The latter can be categorised into four types: mural thrombosis, ball-valve thrombosis, intraluminal thrombotic occlusion and, the most common cause, the pericatheter sheath [128–131]. In short, the fibrin sleeve arises because catheter insertion causes local venous injury, leading to the deposition of fibrin on the catheter surface and the subsequent intraluminal growth of endothelial and smooth muscle cells within hours of CVC insertion [130]. This in turn can lead to a reduction in blood flow, which further increases the risk of endoluminal cellular attachment and thus thrombus formation [129,130]. Further movement of the catheter within the vein causes endothelial erosions which prompt the formation of mural thrombosis within the catheter lumen [128,130]. On the other hand, a thrombus on the catheter tip can lead to ball-valve thrombosis, where infusion can still occur but fluid aspiration is impeded [128,130].

As indicated in Figure 5, a number of strategies aim to prevent fouling. Historically, the modulation of hydration and steric interactions was among the first approaches and typically exploits polyethylene glycol (PEG) derivatives tethered at the polymer–solution interface [99]. The rationale here is to control the hydrophilicity of the polymer interface so as to create a tightly bound water layer. This alters the thermodynamics of adhesion by making the adsorption of proteins both physically and energetically less favourable. While PEG derivatives dominated the early literature, alternative systems incorporating zwitterionic moieties (i.e., polysulfobetaine) have been used in PICC lines and shown to reduce the adhesion of bacteria and the onset of thrombosis [132]. Instead of hydrogen bonding, zwitterionic based coatings use electrostatic interactions to create the hydration layer [131,133]. Roth et al. (2020) demonstrated the use of branched polyethyleneimine (PEI) modified polyurethane as a means of reducing the coefficient of friction and the haemolysis ratio, providing a material with considerable antithrombogenic properties [131]. The intrinsic inertness of silicone-based catheter materials can be problematic, but plasma treatment can enable the introduction of more reactive surface functionalities onto which antifouling coatings can be anchored. This was adopted by Blanco et al. (2014), who demonstrated the use of a laccase/phenolic/sulfobetaine mixture to yield a zwitterionic film tethered to a plasma aminated silicone substrate [133]. The system, although initially targeting urinary catheters, demonstrated considerable antifouling capabilities which could be translated to intravascular systems. The novelty of the biocatalytic film formation is clear, but it could be argued that the complexity of the approach would detract from more widespread adoption.

It is clear that the adaptation and incorporation of such antifouling film systems could have a significant impact on mortality, as it has been estimated that some 20% to 40% of catheters develop a pericatheter thrombus or fibrin sheath [134]. The latter predisposes the patient to infection and increases the risk of thrombosis [135,136] and, if detachment occurs, the possibility of potentially fatal thromboembolism [137].

#### *5.4. Protein Layer Interactions*

The use of a protein coat to prevent the adhesion of other proteins can appear counterintuitive, but such interactions are typified by the precoating of catheter surfaces with albumin (a relatively benign protein). This approach has been shown to markedly reduce the deposition of proteins that would otherwise adhere and contribute to biofilm formation [138,139]. The effectiveness of such an approach is however relatively short term, as prolonged contact with blood eventually leads to the removal of the albumin. The use of heparin as a catheter lock is well established, where it is employed to prevent thrombus formation and occlusion of the line [67–70,84]; catheter surfaces coated with the molecule have also exhibited marked resistance to non-specific protein fouling [92,119,140]. Several conflicting mechanisms for this action have been suggested (electrostatic repulsion, protein specific interactions, inhibition of bacterial adhesins etc.) but much remains to be done in order to elucidate whether they act in concert or one predominates. The use of heparin coatings, as with its inclusion in lock solutions, can also give rise to concerns over sensitivity [141–143].

#### *5.5. Surface Energy*

The ideal solution would be a catheter composed of a material that minimises adhesion without the complexities of extensive surface modification. The adoption of materials possessing low surface energy has been proffered as one route through which to tackle biofilm formation and is typified by hydrophobic fluoropolymers (PTFE) and silicones (PDMS) [139,144–146]. Despite possessing very low surface energies (<25 mN/m), their effectiveness at preventing the non-specific adhesion of proteins is contentious, with a number of investigations offering conflicting evidence. Where such polymers have found success, it has been suggested that passivation by albumin is the main factor in hindering cell attachment rather than the intrinsically low energy of the polymer substrate [139,144]. The exploitation of hydrophobicity in anti-adhesion contexts may however require a more nuanced application, and there have been some notable advances through manipulating the surface topography [97,147]. Increasing surface roughness, or the introduction of specific patterning inspired by biological materials (i.e., Sharklet), has been shown to provide superhydrophobic coatings that are effective against *E. coli* and *S. aureus* [97]. The translation of the technology to catheter systems to prevent extra and intraluminal colonisation may be more challenging and, at present, such approaches remain speculative.

#### *5.6. Smart*/*Electronic Materials*

The majority of materials research aimed at combatting CRBSIs tends to focus on the modification of catheter surfaces and, while many of the strategies have been shown to be effective in the short term, almost all succumb to fouling with increased dwell time and a consequent loss of activity. Advances in electronics have prompted interest in the development of "smart" materials that can detect the presence of a biofilm or, upon activation by an appropriate stimulus, release a biocidal agent into the lumen. Such research is still in its infancy but could herald a wholly new avenue for tackling CRBSI.

Li et al. (2014) were among the first to consider the introduction of sensors within the catheter line as a means of detecting the presence of bacteria [148]. Their approach relied upon carbon fibre filaments (possessing the dimensional properties necessary for insertion within a typical catheter lumen) serving as electrodes that could measure changes in the pH of the intraluminal space arising from bacterial growth. This was based on examining the change in the voltammetric response of uric acid (present within the blood) with changes in pH. An alternative approach by Davis et al. (2013) took the system further through the use of polymer-modified electrodes, in which pH-dependent redox polymers based on plumbagin were used to measure pH indirectly, removing the dependence on endogenous urate [149]. This was followed by Casimero et al. (2018) with a poly(flavin) system, again exploiting the redox transition of the immobilised flavin to gauge the pH changes caused by bacterial growth [150]. These reagentless sensors, while capable of measuring pH within complex bacterial environments, also possessed the advantageous attribute of catalysing the electroreduction of oxygen, resulting in the generation of biocidal reactive oxygen species (ROS) [149]. The core mechanism of the polymer-modified electrodes relies upon the pH dependence of quinoid redox interconversions and, in this respect, they could be considered analogous to some of the more recent investigations employing graphene oxide, where similar transitions are associated with ROS. The main difference is that the amount of ROS generated via the poly(plumbagin) could be controlled through manipulation of the electrode potential. Such work was, however, purely proof of concept, and there are no supplementary data on its effectiveness in reducing or preventing bacterial colonisation [149].
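The pH dependence of the quinoid redox couples underpinning these sensors follows the Nernst equation: for an m-proton, n-electron interconversion, the formal potential shifts by 2.303mRT/nF per pH unit (about 59 mV/pH at 25 °C when m = n). The sketch below is purely illustrative of that relationship; the formal potential and stoichiometry used are hypothetical placeholders, not calibration data from the cited systems.

```python
# Illustrative Nernstian model of a pH-dependent quinoid redox couple,
# the principle behind plumbagin/flavin-type pH sensing.
# E0, m and n here are hypothetical values, not data from the cited work.

R = 8.314    # gas constant, J/(mol*K)
F = 96485.0  # Faraday constant, C/mol
T = 298.15   # temperature, K (25 C)

def peak_potential(pH, E0=0.0, m=2, n=2):
    """Formal potential (V) of an m-proton, n-electron quinoid couple."""
    slope = 2.303 * m * R * T / (n * F)  # ~0.0592 V per pH unit when m == n
    return E0 - slope * pH

def pH_from_potential(E, E0=0.0, m=2, n=2):
    """Invert the Nernstian relation to recover pH from a measured potential."""
    slope = 2.303 * m * R * T / (n * F)
    return (E0 - E) / slope

# Acidification from pH 7.4 toward pH 6 (as bacterial metabolites accumulate)
# shifts the peak anodically by roughly 1.4 x 59 mV = ~83 mV.
shift = peak_potential(6.0) - peak_potential(7.4)
```

This is why such electrodes are "reagentless": the immobilised polymer itself reports pH through its peak position, with no dependence on an endogenous analyte such as urate.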

Paredes et al. (2014) proposed a novel technique for the in-line diagnosis of bloodstream infections through impedimetric biosensing [151]. The system incorporates an interdigitated microelectrode biosensor (IDM), wireless electronics and an antenna to detect infection and then trigger an external alarm. Colonisation and subsequent biofilm growth on the IDM alter its capacitance, which is compared with a threshold generated using laboratory-based impedance spectroscopy, providing a preventative warning at the earliest signs of colonisation. The use of a label-free IDM gives the system a lifetime of over 11 months, requiring only a 50 mAh coin-cell battery. A critical issue here, however, is the assumption that material on the surface of the electrode is a biofilm, when it could equally be adsorbed macromolecular components intrinsic to blood. There is little doubt that the introduction of diagnostic systems that can inform the clinician (or patients self-administering their line) of the presence of contaminated NFCs could aid in the management of the CVC and optimise the use of hygiene care bundles.
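The decision logic of such a threshold-based system can be sketched in a few lines. The function names, threshold value and debounce count below are assumptions chosen for illustration, not details of the published device; requiring several successive exceedances is one simple way such a system might guard against the false positives noted above (e.g. transient non-specific protein adsorption).

```python
# Minimal sketch of threshold-alarm logic for an impedimetric sensor.
# All names and parameter values are illustrative assumptions, not taken
# from the Paredes et al. system.

def biofilm_alarm(capacitance_readings, threshold, consecutive=3):
    """Return the reading index at which an alarm would fire, or None.

    The alarm fires only after `consecutive` successive readings exceed
    the laboratory-derived threshold, so an isolated spike (for example,
    from transient protein adsorption) does not trigger a warning.
    """
    run = 0
    for i, c in enumerate(capacitance_readings):
        run = run + 1 if c > threshold else 0
        if run >= consecutive:
            return i
    return None
```

In the real system the threshold would come from laboratory impedance spectroscopy of colonised versus clean electrodes; the debounce parameter trades earlier warning against false-alarm rate.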

The smart approach, as indicated previously, is not necessarily limited to diagnostics but can also be harnessed to yield an "on demand" antimicrobial action. In addition to the generation of ROS, reactive nitrogen species (RNS) have also been considered, and several reports have targeted the selective release of nitric oxide (NO) as the principal weapon through which to prevent bacterial colonisation [105,106,116,117]. Such work builds on the fact that NO is a chemical transmitter with a multitude of biochemical roles but, in this context, its status as a key player in minimising platelet adhesion whilst also acting as a broad-spectrum antimicrobial is of greatest value. Numerous studies have investigated the use of NO donor molecules (e.g., diazeniumdiolates and S-nitrosothiols) immobilised at catheter surfaces [105,106]. In such cases, the mode of action following the passive release mechanism (through trans-nitrosation of other molecules within the matrix and/or homolytic cleavage) is common to conventional antimicrobial elution systems. Goudie et al. (2019) recently demonstrated the covalent linkage of N-acetyl penicillamine onto silicone catheters, in which the thiol could be further functionalised to serve as an NO donor [106]. The use of branched (and hyperbranched) methacrylate linkers, as indicated in Figure 6A, enables a greater density of NO to be stored at the interface, with the polymeric network found to retain its anti-fouling properties even after the NO biocides have been delivered.

Mihu et al. (2017) have taken a different approach through the thermal reaction between nitrite and glucose during a sol–gel process [152]. This results in the in situ generation of NO and its subsequent entrapment within the sol–gel nanoparticle matrix. In contrast to the previous investigations involving donor molecules, NO is released directly from the nanoparticles. This has been shown to afford activity against bacteria, with MRSA cellular growth decreased by 40% following incubation with 2.5 mg/mL NO-np, and by 50% at concentrations greater than 5 mg/mL. The viability of staphylococcal biofilms was also reduced, by 51.8%, under the latter conditions. Furthermore, MRSA growth was reduced by 50% after 8 h of normal exposure (no incubation) to 2.5 mg/mL NO-np, remaining at these reduced levels after 24 h. The latter results indicate a level of success similar to that of antibiotic locks, with comparable dwell times. Impregnation with nitric oxide-releasing nanoparticles could prove a viable alternative to prophylactic locking, removing the reliance on antibiotics and providing a broader effective range, but further research into their in vivo effects is required.

The electrochemical generation of NO represents a new stage in the evolution of active catheters, and various reports have examined its electrocatalytic generation from nitrite [116,117]. A variety of copper complexes have been investigated as potential redox cyclers (indicated in Figure 6B). Both the copper complex and the nitrite ion must be supplied to the system in order to facilitate the electrogeneration of NO, as neither will be present in the IV fluids being delivered. This stands in marked contrast to the plumbagin (or graphene oxide) systems, where the catalyst can be immobilised at the electrode and the principal feedstock is oxygen, which is already endogenous to both the blood and the IV fluids being administered. To counter this, the authors have suggested the use of a dual-lumen CVC in which one lumen is dedicated to NO generation and the other to IV fluid transfer. This separates the potentially harmful catalyst and nitrite from direct contact with the vascular system and relies upon the diffusion of the electrogenerated NO across the silicone membrane separating the two fluid lines, as indicated in Figure 6B. This is a much more complex arrangement from both procedural and instrumentational perspectives. However, the availability of a large reservoir of the nitrite feedstock could counter the issues of depletion and fouling that are common to most of the conventional antimicrobial/anti-adhesion approaches and prove effective over long CVC dwell times.

**Figure 6.** (**A**) Hyperbranched methacrylate-penicillamine based nitric oxide (NO) donors. (**B**) Dual lumen electrocatalytic release of nitric oxide using copper complexes.

#### **6. Conclusions**

Catheter-related bloodstream infections have proven to be among the most common nosocomial infections in the modern healthcare setting. For a condition that can seem almost incidental, they have a considerable impact on patient quality of life, up to and including mortality, and pose a serious and undue economic burden on patients and health services. While there have been advances in each of the areas discussed, from improved aseptic techniques to new antibiotics and alternative lock therapies, these advances do not negate the core issues: aseptic techniques are only as reliable as the compliance of the user, systemic antibiotics are successful only if symptoms are caught early, and the efficacy of antibiotic lock therapy is still up for debate.

Current practices for CRBSI diagnosis, prevention and management rely too heavily on therapeutic techniques, effectively waiting for patient quality of life to deteriorate before any action is taken. While there have been advances in catheter-sparing and prophylactic techniques, with new guidelines promoting the use of impregnated catheters, and the improved variety, efficacy and range (beyond antibiotics) of antimicrobial locks looking promising, the impact on patient quality of life, cost and mortality remains too significant to declare these practices viable solutions. There remains a pressing need to develop a long-term solution for the proactive monitoring of catheters, one that will detect infection earlier or prevent it altogether.

**Author Contributions:** Conceptualization, C.C., T.R., J.D.; methodology, T.R.; formal analysis, R.B., A.D., C.H.; investigation, T.R.; writing—original draft preparation, T.R., C.C.; writing—review, editing and supervision, J.D. All authors have read and agreed to the published version of the manuscript.

**Funding:** This research was funded by the Department for the Economy Northern Ireland, Kimal PLC and Abbott Diabetes Care PLC., European Union's INTERREG VA Programme, managed by the Special EU Programmes Body (SEUPB).

**Conflicts of Interest:** Charnete Casimero is presently engaged on a Cooperative Award in Science and Technology (CAST) PhD studentship co-funded by the Department for the Economy (DfE) Northern Ireland and Kimal PLC. Robert Barber is similarly engaged on a CAST PhD studentship co-funded by DfE Northern Ireland and Abbott Diabetes Care PLC.

#### **References**




© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

