#### *2.7. Data Analysis*

Statistical tests were performed using RStudio (Version 1.0.153). A Shapiro–Wilk normality test was performed and none of the data were normally distributed. Arithmetic means were calculated for displaying results due to the high proportion of non-detect values in the dataset. Wilcoxon rank sum and Kruskal–Wallis rank sum tests with post-hoc Tukey tests were performed to determine statistically significant differences. Wilcoxon tests were used for *Legionella* culture data (log transformed), pipe material, iron, and copper data, whereas the Kruskal–Wallis test was used to determine significance of chlorine data between SPPRs. Significance was set at a *p* value ≤ 0.05.
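The nonparametric workflow described above can be sketched in Python with scipy as an illustrative analogue of the R analysis; the data here are hypothetical, and the Wilcoxon rank sum test for two independent groups corresponds to scipy's `mannwhitneyu`.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical Legionella counts (CFU/mL) for two pipe materials,
# log-transformed before testing, as described above
pex = rng.lognormal(mean=2.0, sigma=0.3, size=20)
copper = rng.lognormal(mean=1.0, sigma=0.3, size=20)

# Wilcoxon rank sum test (Mann-Whitney U) for two independent groups
u_stat, p_two = stats.mannwhitneyu(
    np.log10(pex), np.log10(copper), alternative="two-sided"
)

# Kruskal-Wallis rank sum test for >2 groups
# (e.g., chlorine residuals across three hypothetical conditions, mg/L)
g1 = rng.normal(0.3, 0.1, 15)
g2 = rng.normal(0.6, 0.1, 15)
g3 = rng.normal(1.0, 0.1, 15)
h_stat, p_kw = stats.kruskal(g1, g2, g3)

# Significance threshold of p <= 0.05, as in the study
print(p_two < 0.05, p_kw < 0.05)  # → True True
```

In R, the same comparisons would be `wilcox.test()` and `kruskal.test()`; the scipy calls are drop-in equivalents for this sketch.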

#### **3. Results and Discussion**

#### *3.1. Simulated Treatment and Distribution Reproduced Key Factors of Pre-, During-, and Post-Crisis Flint Water*

#### 3.1.1. Treated Source Waters Employed in this Experiment

To recreate water quality conditions in Flint, influent water conditions were simulated by treating raw Flint River water in the lab and collecting Lake Huron-sourced water from a well-flushed tap in Flint post-crisis (Detroit tap water). The unaltered pH of treated Flint River water ranged between 7.84 and 8.57, while Detroit tap water ranged from 7.96 to 8.06, which recreated the stable pH observed when Flint was using Detroit water and the more variable pH when using Flint River water in 2014 [3,12].

The source water was added in 300-mL aliquots to six glass flasks (3 with Detroit tap water, 3 with treated Flint River water) with iron wire and mixed for 3 h to simulate six different conditions in distribution systems (SDSs). Just prior to being added to the SDSs, the source waters (treated Flint River water and Detroit tap water) were chlorinated, achieving an initial disinfectant residual of 3.10 mg/L Cl2 (Table 1 section B). Additional chlorine was added to only the **DET-Enhanced** SDS condition to achieve a higher average initial residual of 3.80 ± 0.19 mg/L (Table 1 section B). The possible short-term role of cooler temperature during distribution while on Detroit tap water was tested in this work with the **DET-Cold** SDS condition, held at an average of 18.3 ± 1.4 °C compared to an average 21.8 ± 1.3 °C of the other five SDS conditions (**FR**, *FR-NoFe*, *FR-CC*, *DET*, **DET-Enhanced**) (Table 1 section B,C). This ~3 °C difference served to recreate the reported average summer water temperature of 19.9 ± 2.24 °C (pre-crisis, Detroit) and 22.6 ± 2.14 °C (during crisis, Flint River) (Table 1 section A) [3].



\* Section (**A**): Representative chemical mean, ± standard deviation, (5–95 percentile range where available) for peak Legionnaires' Disease (LD) months of June–September for the indicated stage of the Flint Water Crisis. Representative distribution system temperature data are reported in Rhoads et al. (2017), chlorine data are from monitoring station 6, iron data are from citizen science sampling of flushed water from the same 150 homes in August 2015 (crisis) and August 2017 (post-crisis), and copper water crisis data are from a subset of first draw samples from homes that records indicate had at least partial copper service lines (n = 79). UNK = unknown. ^ Section (**B**): Same parameters as in Section (A) were measured in the influent to the SDSs, mean (5th–95th percentile) (n = 10 samples over a 10-month period) (ambient laboratory set point reported for temperature data). SDS conditions in bold were designed to simulate actual conditions found during the crisis; conditions in italics were designed to simulate hypothetical scenarios. "DET" conditions received Detroit tap water influent; "FR" conditions received treated Flint River influent. + Section (**C**): Mean and standard deviation of temperature and chlorine, and mean (5th–95th percentile) of iron, in the effluent from the SDSs (i.e., influent to the SPPRs), and mean (5th–95th percentile) of copper in the effluent of copper pipe SPPRs.

#### *Pathogens* **2020**, *9*, 730

#### 3.1.2. SDSs Chlorine

The effluent water collected following the 3-h SDSs reaction time (Figure 2) successfully replicated known trends in chlorine residuals observed in the Flint water distribution system before, during, and after the water crisis. To assess inherent chlorine demand prior to the SDSs step, treated Flint River and Detroit tap waters were aliquoted to non-reactive glass containers without iron. The chlorine residual in treated Flint River water dropped from ~3 to ~1 mg/L in 180 min, presumably due to relatively high levels of organic matter (5.2 ± 0.03 mg/L TOC), whereas little to no decay occurred in the Detroit tap water (1.2 ± 0.03 mg/L TOC) over the same time period (Figure 3A). The addition of iron wire to simulate unlined iron pipe corrosion during distribution further reduced chlorine residuals in conditions with both treated Flint River water and Detroit tap water as influents (Figure 3B). However, while some residual was consistently detected in the Detroit tap water effluents after simulated distribution (*DET*, **DET-Cold**, **DET-Enhanced**; 0.5–1 mg/L Cl2 after 180-min exposure), treated Flint River water conditions (**FR**, *FR-CC*, *FR-NoFe*) generally had no detectable residual (Figure 3B). Condition *FR-NoFe* is not shown in Figure 3B because no iron wire was added to the SDSs for that condition.
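For a rough sense of scale, the reported drop from ~3 to ~1 mg/L in 180 min can be converted to a bulk decay constant; this assumes simple first-order decay, which is an illustrative assumption on our part, not a kinetic model fitted in the study.

```python
import math

# Reported drop in chlorine residual in treated Flint River water
c0 = 3.0   # initial chlorine, mg/L
ct = 1.0   # chlorine after contact time, mg/L
t = 180.0  # contact time, minutes

# First-order bulk decay constant (assumed kinetics): C(t) = C0 * exp(-k*t)
k = math.log(c0 / ct) / t

# Corresponding chlorine half-life under that assumption
half_life = math.log(2) / k

print(round(k, 4), round(half_life, 1))  # → 0.0061 113.6
```

Under this assumption the residual halves roughly every two hours even before iron is introduced, which is consistent with the accelerated losses described once iron wire was added.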

**Figure 3.** Representative Cl2 decay in source water and simulated distribution systems (SDSs). (**A**) Control experiment of chlorine decay of treated Flint River and Detroit tap water in non-reactive glass reactors without iron. (**B**) Representative results in different SDSs conditions: **FR**, treated Flint River water aged with iron wire; *FR-CC,* treated Flint River water with added corrosion control and aged with iron wire; *DET*, Detroit tap water aged with iron wire; **DET-Cold**, Detroit tap water incubated at cooler temperature with iron wire; **DET-Enhanced**, Detroit tap water with additional corrosion control and initial elevated chlorine levels.

While there was variability due to seasonal changes in the source water and variable iron wire corrosion rates throughout the experiment, the mean chlorine concentration (n = 43) after incubation in the SDSs exhibited a general trend of (lowest to highest): *FR-CC* ≈ **FR** < *FR-NoFe* ≈ *DET* ≈ **DET-Cold** < **DET-Enhanced** (Table 1). Based on a Kruskal–Wallis rank sum test, the mean chlorine concentrations across the SDS conditions were significantly different (*p* value < 2 × 10<sup>−16</sup>), while a pairwise post-hoc Tukey test further confirmed specific differences between conditions indicated by a "<" sign in the above trend analysis (all *p* values ≤ 0.009).

Overall, key expectations were also recreated with respect to known trends resulting from water chemistry and corresponding chlorine residual in SDSs effluent. Specifically, the SDSs successfully reproduced chlorine residuals comparable to those during the crisis of 0.28 ± 0.24 mg/L (at Flint city monitoring station 6) [30], compared to levels of 0.26 ± 0.23 mg/L in our treated Flint River water simulation (**FR** condition, Table 1 section C). SDS conditions also successfully simulated pre-crisis (**DET-Cold**) and post-crisis (**DET-Enhanced**) high chlorine, with actual values only 1 mg/L higher than measured during the pre- or post-crisis conditions (Table 1 section C). Both conditions with treated Flint River water and iron present (**FR** and *FR-CC*) occasionally had undetectable chlorine residuals under the conditions tested, whereas *FR-NoFe* and all conditions with Detroit tap water consistently had a measurable chlorine disinfectant residual following simulated distribution, as hypothesized (Figure 1). Iron has been shown to decay chlorine residual in typical drinking waters [31], but the chlorine decay observed in the SDS step was accelerated beyond what is typical due to the corrosivity of the treated Flint River water and lack of corrosion control.

#### 3.1.3. SDSs Iron and Corrosion Control

Known benefits of corrosion control (**FR** vs. *FR-CC*; **FR** vs. *DET*) in terms of hindered iron release and maintenance of higher chlorine residuals in the actual Flint distribution system (Table 1) were not achieved in these simplistic simulations. Based on a prior study [2], the addition of phosphate corrosion control to treated Flint River water reduced iron weight loss by 5.1 times compared to that observed in treated Flint River water without phosphate, while also reducing chlorine decay rates. Further, iron corrosion rates were 8.6 times lower in Detroit tap water with corrosion control versus treated Flint River water without corrosion control, a trend confirmed by our citizen science field sampling throughout Flint in August 2015 versus August 2017 (Figure 1) [3,5]. However, the corrosion control simulation applied to the SDSs in this study did not produce known significant differences in mean effluent iron (i.e., **FR**, *FR-CC*, and *DET*; Table 1 section C). The only condition with relatively low iron in this work was treated Flint River water without any iron present (*FR-NoFe*), in which mean iron was 15.4 ± 19.4 μg/L compared to the 60.5 ± 212 μg/L observed in August 2017 flushed water samples collected in Flint (Table 1).

We were aware that the simple approach applied here would not effectively replicate impacts of iron corrosion control, given that phosphate inhibition of iron corrosion and associated chlorine decay can sometimes require 6–12 months to produce expected benefits even under continuous-flow conditions in water mains, and even longer under more stagnant conditions [32,33]. In this seven-month simulation, the iron was only exposed to the water approximately 6 h each week, which translates into seven days total exposure of iron to the target water over the entire study. Thus, the analysis that follows considers that this particular aspect of the simulation is not representative of what occurred in the field.

#### *3.2. Simulated Premise Plumbing Reactors Reproduced Key Water Chemistry Trends of Pre-, During-, and Post-Crisis Flint Water*

#### 3.2.1. SPPRs Chlorine

After the effluents from the SDSs were transferred to the SPPRs, the 50% water change produced an immediate dilution of chlorine. Beyond dilution, there is an immediate chlorine demand from the combination of aged pipe material, pipe surface area, biofilm, and water within each reactor [24,26]. Notably, all SDS conditions, except **DET-Enhanced**, experienced an immediate chlorine demand within the first 10 min, which exceeded the 50% loss expected from dilution. **FR** and *FR-CC* never received any initial chlorine residual flowing into the SPPRs (Figure 4), whereas *FR-NoFe* retained a low, but detectable, chlorine residual (>0.1 mg/L) for a period of 60–120 min in the PEX SPPRs and 1–10 min in the copper SPPRs (data not shown). Chlorine was reduced in the Detroit tap water SPPRs to below 0.1 mg/L within 30–60 min in both PEX and copper SPPRs, while chlorine was maintained above 0.1 mg/L for up to 120 min in **DET-Cold** with PEX (Figure 4) versus just 10–30 min in the corresponding copper SPPRs. Chlorine residuals in the **DET-Enhanced** conditions after 120 min were 0.92 and 0.38 mg/L in the PEX and copper SPPRs, respectively (Figure 3). In some instances, chlorine was still detectable in **DET-Enhanced** SPPRs after 24 h. As a general rule, when detectable chlorine residual was present in the influent to the SPPRs, levels were higher in the system with PEX after 10 min than in the equivalent system with copper, consistent with the overall hypothesis of this work and our prior research [25] (Figure 1).
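The dilution-only baseline against which that excess demand is judged can be made concrete with a small sketch; the numbers and the `mixed_chlorine` helper are hypothetical illustrations, not values or code from the study.

```python
def mixed_chlorine(fresh_mg_l: float, remaining_mg_l: float,
                   fresh_fraction: float = 0.5) -> float:
    """Chlorine expected from dilution alone after a partial water change:
    a volume-weighted mix of fresh SDS effluent and spent SPPR water."""
    return fresh_fraction * fresh_mg_l + (1 - fresh_fraction) * remaining_mg_l

# e.g., 1.0 mg/L SDS effluent mixed 50:50 with chlorine-free water
# remaining in the reactor from the previous cycle
expected = mixed_chlorine(1.0, 0.0)  # 0.5 mg/L from dilution alone

# A hypothetical 10-min reading below that baseline indicates demand
# from the pipe material, surface area, and biofilm, beyond dilution
measured = 0.2
demand = expected - measured

print(expected, round(demand, 2))  # → 0.5 0.3
```

Any measured residual below the `expected` value reflects consumption within the reactor, which is the comparison underlying the "exceeded the 50% loss expected from dilution" observation above.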

**Figure 4.** Chlorine residuals (**A**) after 3 h contact time in the simulated distribution systems (SDSs) and (**B**) 120 min after the effluent from the SDSs were fed to the simulated premise plumbing reactors (SPPRs) (50% fresh SDSs water with 50% remaining SDSs following incubation in the SPPRs the previous cycle). Dashed lines indicate the calculated initial chlorine level added for each water or reactor type. Bars represent the maximum and minimum, the upper and lower bounds of the box are the first and third quartiles, and the median is indicated by the internal dash. The detection limit was 0.02 mg/L.

Overall, these results illustrate quick and drastic decay of the chlorine disinfectant residual in premise plumbing systems (Figure 4) that added to decay in the distribution systems (Figure 3). The U.S. Environmental Protection Agency (EPA) recommends that a free chlorine disinfectant residual be detectable (often, >0.1 or >0.2 mg/L) in 95% of distribution system samples [34], which has previously been acknowledged not to be adequate for the reduction of *Legionella* in large buildings, single-family homes, or small buildings [35]. The results from the Detroit tap water SPPRs (*DET*, **DET-Cold**, **DET-Enhanced**) demonstrate that the residual was detectable (>0.1 mg/L Cl2) after 120 min in the SPPRs only when the disinfectant residual entering the SPPRs was much higher than 0.2 mg/L Cl2 (Figure 4).
