1. Prolegomena
Wastewater treatment and reuse is not new, and knowledge on this topic has evolved and advanced throughout human history. Reuse of untreated municipal wastewater has been practiced for many centuries with the objective of diverting human waste outside of urban settlements. Likewise, land application of domestic wastewater is an old and common practice, which has gone through different stages of development. This has led to a better understanding of processes and treatment technologies and the eventual development of water quality standards [1].
Domestic wastewater was used for irrigation by prehistoric civilizations (e.g., Mesopotamian, Indus valley, and Minoan) from the Bronze Age (ca. 3200–1100 BC). Thereafter, wastewater was used for disposal, irrigation, and fertilization purposes by Hellenic civilizations and later by Romans in areas surrounding cities (e.g., Athens and Rome) [2]. In more recent history, “sewage farms” (i.e., wastewater application to the land for disposal and agricultural use) were operated in Bunzlau (Silesia) in 1531 and in Edinburgh (Scotland) in 1650, where wastewater was used for beneficial crop production [3]. In the following centuries, “sewage farms” were increasingly seen in many rapidly growing cities of Europe and the United States as a solution for the disposal of large volumes of wastewater, and some of these farms are still in operation today. Paris was a typical example, with the first sewage farms established at Gennevilliers in 1872, eventually processing the wastewater of the entire town. At the beginning of the last century, the sewage farms in France, supplied with raw wastewater by the Colombes pumping station in Paris, reached their maximum extent, having been established in four different areas: Gennevilliers (900 ha), the Achères plain (1400 ha), Pierrelaye (2010 ha), and Triel (950 ha) [4]. A large “sewage farm” was also established in Melbourne, Australia in 1897 [3,4,5]. The use of land treatment systems continued into the twentieth century in central Europe, the USA, and other locations around the world, but not without causing serious public health concerns and negative environmental impacts. By the end of the first half of the twentieth century, these systems were no longer easily accepted, due to drawbacks such as large area requirements, field operation problems, and the inability to meet increasingly stringent hygiene requirements [3].
Modern sewage systems were first built in the mid-nineteenth century as a reaction to the exacerbation of unsanitary conditions brought on by heavy industrialization and urbanization. Cholera outbreaks caused by contaminated water supplies occurred in London in 1832, 1849, and 1855, killing tens of thousands of people. In addition, the Great Stink of 1858 occurred when the smell of untreated human waste in the River Thames became overpowering. This, combined with a report on sanitation reform by the Royal Commissioner Edwin Chadwick, led the Metropolitan Commission of Sewers to appoint Sir Joseph Bazalgette to construct a vast underground sewage system for the safe removal of waste [6].
Today, planning of wastewater treatment and effluent reuse projects is increasing significantly in several countries. The main (re)uses of treated wastewater are: irrigation (both agricultural and landscape), recharge of aquifers, seawater intrusion barriers, industrial applications, dual-distribution systems for toilet flushing, and other urban uses. International organizations, such as the World Bank, the Food and Agriculture Organization (FAO) of the United Nations, and the World Health Organization (WHO), estimate that the average annual increase in the reused volume of such water in the USA, China, Japan, Spain, Israel, and Australia ranges up to 25%. For example, in California only 860 Mm³/year of the 4300 Mm³/year of treated wastewater effluent produced was reused in 2010, whereas over 80% (3440 Mm³/year) was discharged to the ocean; by 2030, 2470 Mm³/year is planned to be reused [7]. In Spain, more than 500 Mm³/year of treated wastewater is currently reused, and this is expected to reach 1000 Mm³/year [8]. In Israel, over 80% of treated wastewater effluent is reused, mainly for agricultural irrigation. In Singapore, NEWater meets up to 30% of the nation’s current water needs [9], a share that may increase to 55% by 2060 [10]. Large-scale droughts in California and Texas in the USA have led to greater exploration and implementation of direct potable reuse. In California, the governor announced in 2013 that guidelines for potable reuse, including direct potable reuse, needed to be established by 2016. Texas has already moved forward with direct potable reuse, with full-scale projects in operation in Big Spring and Wichita Falls. In comparing indirect to direct potable reuse, assuming equivalent treatment trains, the only scientific difference is time. Environmental buffers such as groundwater injection do little, if anything, to improve water quality. Conversely, injection of high-purity water into aquifers can cause leaching of metals, such as arsenic, and the injected water may mix with lower-quality waters during subsurface storage. However, environmental buffers theoretically provide response time, depending on the length of time the water is stored.
Direct potable reuse systems would have, by design, much shorter response times; thus, critical control points and on-line sensor systems are essential. While surrogates and indicators are gaining acceptance for rapid monitoring of treatment process efficacy, real-time systems with immediate feedback to operators and/or autonomous operation of the treatment system are needed. In another emerging trend, growing concerns surrounding the potential health impacts of chemical mixtures continue to plague water reuse stakeholders [11]. The growing trend of rapid, high-throughput toxicity screening for water using in vitro bioassays is evident [12,13].
2. The Main Contribution of This Special Issue
The ten selected articles cover a wide spectrum of themes. They include: (1) three papers focused on traditional technologies such as land application and constructed wetlands; (2) five papers focused on biological treatment and disinfection of wastewater, including denitrification, bioreactors, and bioelectrochemical systems; (3) a paper on land application-based management of olive mill wastewater; and (4) a paper addressing the impacts of biosolids and manure application on agricultural land. Brief summaries of these papers are provided below.
1. Natural treatment systems, especially land application, have a long history and are still used because of their simplicity, low cost, and limited operation and maintenance requirements. These systems are well-suited for small communities where operator expertise is limited. They require minimal management and energy and are characterized as non-discharging systems [5].
One paper describes the effect of plant species (
Eucalyptus camaldulensis vs.
Arundo donax) on carbon (C) turnover during wastewater application to land under experimental conditions [
14]. The findings suggest differences in soil microbial community composition and/or activity in the rhizosphere of plant species impact C cycling. The results reveal an important role of plant species on C cycling in terrestrial environments with potential implications in the sequestration of C and release of nutrients [
14].
Another study presents a partial least squares (PLS) regression model to predict important land application management outcomes, including biomass production and nutrient recovery (i.e., N and P). A land application system was evaluated in which four different plant species (Acacia cyanophylla, E. camaldulensis, Populus nigra, and A. donax) were irrigated with pre-treated effluent [15]. The PLS regression analysis identified the primary influencing parameters as: effluent loading rate, soil salinity, electrical conductivity (EC), available phosphorus (Olsen-P), Na⁺, Ca²⁺, Mg²⁺, K⁺, sodium adsorption ratio (SAR), and NO₃⁻-N [15]. The model was validated with data from a previous year, strengthening its potential for predicting the response variables. This study is expected to help in the development of an appropriate methodology for design and monitoring across various agronomic practices, such as land application. The model will further aid in the identification of appropriate management strategies with respect to vegetation and field practices [15].
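For readers less familiar with the technique, the sketch below illustrates, in general terms, how a PLS regression of the kind applied in [15] can be fitted and then checked against a held-out dataset using scikit-learn. The variable names and the synthetic data are placeholders introduced here for illustration only; they are not the study's dataset or code.

```python
# Illustrative PLS regression workflow (synthetic placeholder data, not the data of [15]).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Predictors (columns): loading rate, EC, Olsen-P, Na+, Ca2+, Mg2+, K+, SAR, NO3--N
X_train = rng.normal(size=(40, 9))      # one row per monitored plot/sampling date
Y_train = rng.normal(size=(40, 2))      # responses: biomass production, nutrient recovery

# Fit the PLS model; the number of latent components would normally be chosen by cross-validation
pls = PLSRegression(n_components=3)
pls.fit(X_train, Y_train)

# "Validation with data from a previous year" corresponds to scoring on an independent set
X_holdout = rng.normal(size=(15, 9))
Y_holdout = rng.normal(size=(15, 2))
Y_pred = pls.predict(X_holdout)
print("R^2 on held-out data:", r2_score(Y_holdout, Y_pred))
```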
In another paper, three different two-stage hybrid ecological wastewater treatment systems (HEWTSs) were considered. In these systems, combinations of horizontal flow (HF) and vertical flow (VF) constructed wetlands (CWs) and stabilization ponds (SPs) were evaluated for the removal of organic N, NH₄⁺-N, NO₃⁻-N, total N, total P, total coliforms (TC), Escherichia coli, BOD, COD, and TSS [16]. The most effective systems were those containing a VF component, either HF-VF or VF-HF. In these systems, NH₄⁺-N was reduced by 85.5% and 85.0%, respectively, while NO₃⁻ increased to 91.4 ± 17.6 mg/L and 82.5 ± 17.2 mg/L, respectively. At the same time, E. coli was reduced by 99.93% and 99.99%, respectively [16]. While most wastewater treatment systems focus on reducing nutrients, the results demonstrate that two-stage HEWTSs that include a VF component can produce an effluent with a high concentration of inorganic nutrients, making it suitable for agricultural irrigation.
2. In the second area, one paper describes a vertical membrane bioreactor (VMBR) composed of anoxic and oxic zones. The study objectives were to meet water quality regulations, to reduce the volume of produced sludge, and to produce water for beneficial reuse in South Korea [17]. Excellent removal efficiencies of BOD, COD, TSS, and E. coli were demonstrated in a full-scale system. Moreover, the average removal efficiencies for total N and total P were 79% and 90%, respectively. The average specific energy consumption of the full-scale system was 0.94 kWh/m³ [17].
Another paper in this Special Issue addresses nitrifying and denitrifying biofilters. In wastewater treatment plants (WWTPs), tertiary denitrification of secondary effluent is necessary to control eutrophication of receiving water bodies through nutrient attenuation. In biofilter systems, the size of the filter media affects system performance, with smaller-diameter media achieving better nutrient removal efficiency [18]. Two denitrifying biofilters (DNBFs), one packed with quartz sand of 2–4 mm (DNBF-S) and the other of 4–6 mm (DNBF-L), were studied for tertiary denitrification under empty bed retention times (EBRTs) of 30 min, 15 min, and 7.5 min [19]. Under these operating conditions, NO₃⁻-N removal was 93%, 82%, and 83% in DNBF-S, and 92%, 68%, and 36% in DNBF-L, respectively. The N removal loading rates increased with decreasing EBRTs. The half-order denitrification coefficient of DNBF-S increased from 0.42 (mg/L)^1/2/min at an EBRT of 30 min to 0.70 (mg/L)^1/2/min at an EBRT of 7.5 min, while the DNBF-L values ranged from 0.22 to 0.25 (mg/L)^1/2/min. The performance of both DNBFs was stable within each backwashing cycle, with the NO₃⁻-N removal percentage varying by less than 5%. Better denitrification was achieved in DNBF-S, but with a decreased flow rate during operation [19].
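For clarity, the half-order coefficient quoted above corresponds to the standard half-order rate law commonly used for denitrifying biofilms; the formulation below is reproduced as general background rather than taken from [19], and the symbols (k_{1/2}, C_in, C_eff) are notation introduced here for illustration.

```latex
% Half-order denitrification kinetics: the removal rate is proportional to the
% square root of the nitrate concentration C.
\[
  -\frac{dC}{dt} = k_{1/2}\,C^{1/2}
\]
% Integrating over the empty bed retention time t gives the effluent concentration,
% with k_{1/2} expressed in (mg/L)^{1/2} per minute:
\[
  C_{\mathrm{eff}}^{1/2} = C_{\mathrm{in}}^{1/2} - \tfrac{1}{2}\,k_{1/2}\,t
\]
```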
Conventional wastewater treatment processes often rely on chemical disinfection to comply with the stringent microbiological safety requirements for water reuse. One study in this Special Issue shows that well-designed and well-operated membrane bioreactors (MBRs) can consistently achieve efficient removal of TSS and pathogens. Hai et al. [20] provide an in-depth overview of the mechanisms and influencing factors of pathogen removal by MBRs and highlight practical issues, such as reduced chemical disinfectant dosing requirements and the associated economic and environmental benefits. Additional operational aspects, such as membrane cleaning, membrane imperfections/breach, and microbial regrowth in the distribution system, are also discussed.
Bioelectrochemical systems (BES) and forward osmosis (FO) are emerging technologies with great potential for energy-efficient wastewater treatment. BES take advantage of microbial interactions with a solid electron acceptor/donor to achieve bioenergy recovery from organic compounds, while FO can produce high-quality water driven by natural osmotic pressure. The strong synergy between these two technologies can help address water-energy nexus issues. FO can assist BES by achieving water recovery (for reuse), enhancing electricity generation, and supplying energy for the cathode reactions. In turn, BES may help FO by degrading organic contaminants, providing sustainable draw solutes, and stabilizing water flux. Lu et al. [21] review recent developments that demonstrate the synergy between BES and FO, analyzing the advantages of each combination and providing perspectives for future research. The findings encourage further investigation and development of efficient coordination between BES and FO in an integrated system suitable for wastewater treatment and reuse [21].
Another paper in this Special Issue describes the application of an Analytical Hierarchy Process (AHP), integrated with a Delphi process, for selecting suitable disinfection techniques for wastewater reuse projects [22]. The proposed methodology provides a useful tool for evaluating different disinfection techniques across multiple criteria and alternatives, with expert opinions playing a major role in the selection of the most appropriate technique. Five different disinfection processes suitable for wastewater reuse were evaluated against nine criteria, weighted according to the opinions of the consulted experts. This methodology is shown to be appropriate in realistic scenarios, where multiple criteria are considered in the selection of a particular disinfection technique [22].
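As background on the mechanics of AHP (independently of the specific criteria and expert judgments used in [22]), the sketch below shows one common way to derive criterion weights from a pairwise comparison matrix and rank alternatives by weighted score. The 3 × 3 matrix, the alternative scores, and the example labels are hypothetical placeholders.

```python
# Minimal AHP sketch: criterion weights from a pairwise comparison matrix
# (principal eigenvector method) and a weighted ranking of alternatives.
# All numbers below are illustrative placeholders, not the expert judgments of [22].
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> tuple[np.ndarray, float]:
    """Return priority weights and the consistency ratio of a pairwise comparison matrix."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = int(np.argmax(eigvals.real))
    weights = np.abs(eigvecs[:, k].real)
    weights /= weights.sum()
    n = pairwise.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)             # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]              # Saaty's random index (small n)
    return weights, ci / ri

# Pairwise comparisons among three hypothetical criteria on Saaty's 1-9 scale
criteria = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])
w, cr = ahp_weights(criteria)

# Hypothetical scores of three disinfection alternatives against each criterion
scores = np.array([[0.6, 0.3, 0.5],   # alternative A (e.g., UV)
                   [0.3, 0.5, 0.2],   # alternative B (e.g., chlorination)
                   [0.1, 0.2, 0.3]])  # alternative C (e.g., ozonation)
print("criterion weights:", w, "consistency ratio:", cr)
print("overall alternative scores:", scores @ w)
```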
3. The paper in this area describes a land treatment system for olive mill wastewater. In this system, the treatment potential of natural soil planted with E. camaldulensis is assessed in terms of soil physicochemical properties and the contribution of the plant species to N and P recovery and biomass production [23]. Land application of olive mill wastewater resulted in a significant reduction of the inorganic and organic constituents of the applied wastewater. At 15 cm of the soil profile, the average removal of COD, TKN, NH₄⁺-N, TP, In-P, and total phenols approached 93%, 85%, 66%, 86%, 82%, and 85%, respectively. Effluent application increased soil organic matter (SOM), total Kjeldahl nitrogen (TKN), and available P in the soil, particularly in the upper soil layer [23]. In this land treatment system, increasing soil depth (30 and 60 cm) did not further improve treatment efficiency [24]. EC and SAR were increased by olive mill wastewater addition, but at levels that do not pose severe risks for soil texture. The enhancement of soil fertility resulting from olive mill wastewater sustained eucalyptus tree growth with high biomass yield.
4. The final paper of the Special Issue is a review focused on waterborne outbreaks and sources of microbial pollution in rural areas of the USA resulting from land application of biosolids and manure [25]. Most of the waterborne disease outbreaks observed in North America and elsewhere are associated with rural water supply systems [26]. The majority of the reported waterborne outbreaks are related to microbial agents (parasites, bacteria, and viruses). Rural areas are often characterized by high livestock density and a lack of advanced treatment systems for animal and human waste. Animal waste from livestock production facilities is often applied to land without any prior treatment. Biosolids (treated municipal wastewater sludge) from large wastewater facilities in urban areas are often transported to and applied to land in rural areas as well. This situation introduces a potential risk of human exposure to waterborne contaminants such as human and zoonotic pathogens originating from manure, biosolids, and leaking septic systems. In this study, gaps in knowledge are identified and recommendations to improve water quality in rural areas are discussed [25].
3. Challenges in Wastewater Treatment and Reuse
As the human population continues to grow and urbanize, the challenges of securing water resources and disposing of wastewater will become increasingly difficult. Today, wastewater is usually transported through collection sewers to a centralized WWTP located at the lowest elevation of the collection system, near the point of disposal to the environment. Because centralized WWTPs are generally arranged to route wastewater to these remote locations for treatment, water reuse in urban areas is often inhibited by the lack of dual distribution systems [27]. The infrastructure costs for storing and transporting reclaimed water to the points of use are often prohibitive, making reuse less economically viable. Thus, decentralized wastewater management systems should be more seriously considered in the future to treat wastewater at or near the points of waste generation. As an alternative to the conventional approach of transporting reclaimed water from a central WWTP, the concept of decentralized (satellite) treatment at upstream locations, with localized reuse and/or the recovery of wastewater solids, is becoming more appreciated [27].
Water reuse offers tremendous potential for augmenting already strained water resource portfolios, yet biosolids utilization/disposal remains challenging, particularly in dense urban settings. In both water reuse and the application of biosolids to land, the primary challenge remains public perception. While advanced technologies can help to lower the energy footprint and to increase reliability, the obstacle of perception can be far more daunting. Emerging contaminants such as pharmaceuticals and antibiotic-resistant bacteria are particularly difficult to explain to the public. Both historical and more recent examples of disease spread by water (such as cholera and cryptosporidiosis, respectively) weigh heavily on public concerns over the safety of water reuse. Advanced technologies such as on-line sensors, membranes, and advanced oxidation can help ease perception; however, a better understanding of how engineered reused water compares to existing source waters can be quite persuasive.
The challenge of emerging chemical constituents has been exacerbated by concerns over mixture toxicity. Exposure to chemicals does not happen discretely; rather, chemicals exist as complex mixtures of widely variable composition. Animal testing alone cannot reasonably address the fundamental question of “is it safe?” This is particularly true for mixtures, since a seemingly infinite number of combinations exist. Therefore, rapid biological screening assays, primarily in vitro, are gaining attention as a means to quickly and comprehensively evaluate the complex mixtures of chemicals in water. High-throughput bioassays can be used quite successfully for the qualitative and quantitative evaluation of chemicals across a wide array of biological endpoints relevant to public health. Since new chemicals are constantly introduced to the market, and considering the innumerable potential transformation products, bioassays offer a sensible path forward that will better help the public and regulators move ahead with water reuse projects.
As cities continue to grow and water resources become increasingly challenged, only water reuse, desalination, and the importation of water from outside a region can provide resources beyond those supplied by natural deposition. Water reuse in particular has numerous advantages, yet it faces real challenges in terms of public acceptance. Scientists have an opportunity to help move the field forward through more effective communication of complex data and by ensuring that reused water quality is compared to that of existing urban water resources.