Review

History and Development of Water Treatment for Human Consumption

by Philippe Hartemann 1,* and Antoine Montiel 2

1 Department of Public Health, School of Medicine, Campus Brabois, 54500 Vandoeuvre-Nancy, France
2 Past Quality Director, Water Paris, 75013 Paris, France
* Author to whom correspondence should be addressed.
Submission received: 2 April 2024 / Revised: 19 April 2024 / Accepted: 1 June 2024 / Published: 4 February 2025
(This article belongs to the Section Environmental Health)

Abstract

Throughout history, humans have sought to drink water that is good for their health, according to the knowledge of the time. Hippocrates’ definition of water quality, “good water should be clear, light, aerated, without any perceptible odor or taste, warm in winter and cold in summer”, remained virtually unchanged until 1887, when it was added that water should dissolve soap and foam well, be clear and colorless, have a pleasant taste, leave no large deposits after boiling, and cook vegetables and wash clothes well. This definition guided all treatments to remove the substances responsible for cloudiness, odor and discoloration, as well as the choice of resources: clear water and water with low mineral content. The discoveries by Pasteur and Koch led to the addition of microbiological criteria, like the absence of pathogens, and the definition of microbiological indicators. Throughout the 20th century, advances in scientific knowledge in microbiology, chemistry and toxicology led to major progress in treatment methods. These filtration and disinfection treatments are described here according to their historical implementation. Due to progress in numerous areas, e.g., both chemical and microbiological analytical detection limits, speed of information flow and origins of certain diseases that are discovered to be waterborne, the consumer is now exposed to anxiety-provoking news (microplastics, “eternal pollutants” such as per- and polyfluoroalkyl substances (PFASs), drugs, pesticide residues, etc.). Thus, the consumer tends to lose confidence in tap or bottled water and turns to buying home purifiers. Drinking water treatment will continue to evolve with more sophisticated processes, as analytical progress enables us to expect further developments.

1. Introduction

Water, essential to life, can also be the cause of disease and death. It has always been a prerequisite for the existence of populations. As a result, all the great ancient civilizations developed thanks to the mastery of water, in the basins of the great rivers: the Nile, the Indus, the Tigris, the Euphrates, the Mekong, etc. Egypt offered, in 3000 BC, the first example of global, centralized water management with the existence of a Water Office in charge of hydraulic works and water allocation [1].
Throughout history, humans have sought to drink water that is good for their health, according to the knowledge of the time. As early as 500 BC, Hippocrates wrote the following in his Treatise on Air, Water and Places: “Good water should be clear, light, aerated, without any perceptible odor or taste, warm in winter and cold in summer” [2]. This definition of water quality remained virtually unchanged until 1887, when it was complemented with the following: good water should be clear and fresh, dissolve soap and foam well, be clear and colorless, have a pleasant taste, leave no large deposits after boiling, cook vegetables well and wash clothes well. This definition guided all treatments to remove the substances responsible for cloudiness, odors and discoloration, as well as the choice of resources: clear water and water with low mineral content. Everything changed with the discovery of microorganisms. Pasteur’s famous phrase, “we drink 90% of our illnesses”, ushered in a new era in the approach to drinking water supply [3]. Advances in bacteriology represented a key element in the definition of drinking water. From 1887 onwards, following the work by Koch [4], water resources were selected according to the presence or absence of bacteria indicative of fecal contamination. By the end of the 19th century, since fresh, clear, tasteless and odorless water was not necessarily synonymous with drinking water, filtration and disinfection treatments were introduced. These had already been used empirically since antiquity but were now guaranteed by analytical results.
Throughout the 20th century, advances in scientific knowledge in microbiology, chemistry and toxicology led to major progress in treatment methods.
In microbiology, the time-consuming and often negative search for pathogenic germs was replaced by a search for fecal contamination microorganisms, a list then completed by indicators of disinfection efficiency, good filtration and deterioration of water quality in the distribution network [5]. In the 1950s, viruses were also considered for the prevention of poliomyelitis transmission [6]; then, in the 1970s and 1980s, it became clear that microorganisms responsible for diseases caused by water inhalation (Legionella pneumophila) or highly resistant microorganisms (Cryptosporidium spp., for example) also had to be looked for [7].
In chemistry, continuous analytical progress since the 1960s has enabled the measurement of trace mineral elements, including their valence (UV, visible, atomic absorption, arc emission, argon plasma emission, X-ray fluorescence spectrometry, etc.), as well as the measurement of ultra-trace organic compounds in water (gas chromatography and high-pressure liquid chromatography, coupled either with specific detectors or mass spectrometry) [8]. This has made it possible to measure pesticides, hydrocarbons, solvents, drug residues, etc. These methods have also made it possible to identify secondary reactions between certain oxidation or disinfection treatments and by-products present in water [9].
In toxicology, given the abundance of mineral or organic substances found in low concentrations in “clear” water, it was necessary to organize reasoning on long-term toxicology (whole life: 70 years, 2 L of water per day) to determine maximum admissible values for each element or molecule. The values thus set per liter of water consumed introduce a large safety margin, based on a risk of pathology of the order of 10⁻⁶ per substance, due to the lack of knowledge on possible “cocktail effects” [10].
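The lifetime-exposure reasoning above can be sketched numerically. The sketch below uses the threshold (tolerable daily intake) approach with an allocation factor; the function name and all numeric inputs are illustrative assumptions, not values from the article or from any regulator (the 10⁻⁶ risk figure applies to a separate, carcinogen-specific calculation):

```python
# Illustrative sketch: deriving a maximum admissible concentration per litre
# from a tolerable daily intake (TDI), using the lifetime assumptions quoted
# in the text (2 L of water per day). All numbers below are hypothetical.

def guideline_value_mg_per_L(tdi_mg_per_kg_day, body_weight_kg=60,
                             water_allocation=0.1, intake_L_per_day=2):
    """Concentration such that drinking `intake_L_per_day` litres supplies
    at most `water_allocation` of the TDI for a person of the given weight."""
    return tdi_mg_per_kg_day * body_weight_kg * water_allocation / intake_L_per_day

# Hypothetical substance with a TDI of 0.01 mg/kg/day:
print(guideline_value_mg_per_L(0.01))  # about 0.03 mg/L
```

The allocation factor reflects that drinking water is only one exposure route among food, air, etc., a point the conclusion of this article returns to.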

2. History of Water Treatment

2.1. Using Water That Requires No Special Treatment

For a very long time, each family fetched the water it needed, selecting the best apparent quality and, if necessary, transporting it in containers. The supply of water to cities, requiring large volumes, dates back to several centuries BC, mostly by transferring water from places where it met the standards of the time.
Initially, water was collected from a nearby spring or river, but the need for an abundant, high-quality resource meant that towns had to be built near a river or lake. Then, wells were dug to pump groundwater. The first wells date back to around 6000 BC. For larger towns, water was transported via simple canals, sand or rock dikes. Later, people began to use other kinds of pipes, such as palm branches in Egypt and bamboo in China and Japan. Later, clay, wood and then metal began to be used.
The Romans were the greatest architects and builders of water distribution networks, constructing dams to form lakes whose water was aerated before distribution. Mountain water was the most popular, thanks to its quality, and was transported by aqueduct. Water carrying was widespread within the city, providing a livelihood for many people, who had to be trustworthy.

2.2. History of Boiling Water

Prehistoric man had already observed that boiling water prevented illnesses and attributed this to a battle between good and evil spirits, the latter being driven away by boiling [11]. As water had an unpleasant taste after boiling, it was customary to infuse it with herbs: tea is the best example. This led to the belief that boiling a liquid (e.g., fruit juice) produced a beverage that was “good for you”. This explains why fermented beverages such as wine, mead, beer, rice juice or fermented palm juice can be found in every civilization. In Europe in the Middle Ages, all previous work on water and its treatment had been completely forgotten, and many waters were dangerous to health. Because of their empirically demonstrated antimicrobial activities, wine and beer were the main beverages drunk during this period, and olive oil or vinegar were added to food [12].
Even today, boiling is still recommended when the microbiological quality of the water is doubtful or bad, or when traveling in areas with uncontrolled water. At 100 °C, one log of bacteria or other microorganisms is killed per minute of contact. The WHO advises maintaining a temperature of 100 °C for at least one minute, adjusted according to the initial contamination of the water [13]. Modern electric kettles do not ensure a sufficient boiling time for heavily contaminated water. Many countries have “boiling advice” regulations, which are passed on to the whole population in the event of an incident in the water distribution system; some extend the required time at 100 °C [14].
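The one-log-per-minute rule of thumb quoted above translates directly into surviving fractions; a minimal sketch:

```python
# Sketch of the rule of thumb above: at 100 °C, roughly one log10 of
# bacteria or other microorganisms is inactivated per minute of contact.

def surviving_fraction(minutes, logs_per_minute=1.0):
    """Fraction of the initial microbial load that survives boiling."""
    return 10 ** (-logs_per_minute * minutes)

# One minute at 100 °C (the WHO minimum) still leaves ~10% of the initial
# load, which is why heavily contaminated water needs a longer hold time.
print(surviving_fraction(1))  # 0.1
print(surviving_fraction(3))  # 0.001
```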

2.3. Treatments to Reduce Water Cloudiness

2.3.1. Home Treatments

Natural decantation is depicted in Egyptian paintings dating from 1500 to 1300 BC, which show devices for water sedimentation. As knowledge evolved, water decantation was improved by the addition of “reagents” of mineral origin (such as alum and clays) or vegetable origin (powders of various seeds, fruits, ashes, etc.), treatments still used in some countries without a drinking water supply. Such “empirical treatments” can only eliminate by decantation the biggest particles, with their attached microorganisms, and part of the coloration. But this is not negligible, because some mineral and organic pollutants, like most pathogenic microorganisms (shed in feces), are bound to particles. In 1989, the GTZ (a German cooperation organization) published a book on “The use of natural coagulants for the purification of drinking water in rural areas”, which perfectly codified the methodology to be employed [15].
Decanting is often followed by charcoal or sand filtration. This method is still used in some countries, for example by those who sell water from a wineskin, adding pieces of charcoal to eliminate the taste produced by the wineskin.

2.3.2. Collective Treatment for Urban Water Supply

Slow Filtration

The presence of colloid-related turbidity in raw water requires cities to implement similar clarification and filtration steps. The colloids to be eliminated are negatively charged, and this property was initially used to adsorb them onto biofilms (bacterial exo-polysaccharides) deposited on filter media in slow filters after maturation for a few weeks to a few months, depending on water temperature. The speed of this slow filtration should be around 5 m3/m2/day. At this speed, and if the raw water turbidity is less than 20 NFU, slow sand filtration removes bacteria, viruses and parasites with eliminations on the order of 5 logs [16].
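To see why this loading rate matters at city scale, here is a back-of-the-envelope sizing sketch at the rate quoted above; the per-capita demand figure is an assumed round number, not from the article:

```python
# Rough sizing of a slow sand filter at the loading rate quoted in the text
# (about 5 m3 of water per m2 of filter surface per day).

def slow_filter_area_m2(population, demand_L_per_person_day=150,
                        rate_m3_per_m2_day=5):
    """Filter surface needed to meet daily demand at the given loading rate."""
    demand_m3_per_day = population * demand_L_per_person_day / 1000
    return demand_m3_per_day / rate_m3_per_m2_day

# A hypothetical town of 100,000 people:
print(slow_filter_area_m2(100_000))  # 3000.0 m2 of filter beds
```

This land requirement is what made slow filtration untenable for growing cities, as the Rapid Filtration section describes.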
The first slow filter used was the Peacock filter installed by Gibb in Paisley, Scotland, in 1804. It was a horizontal slow filter preceded by a settling basin and a gravel-filled roughing filter [17]. Other cities were then equipped, such as London in 1830, Poughkeepsie in the USA and Paris in France in 1898, without the addition of any reagent. As some turbid waters contained suspended solids in addition to colloids, pre-treatment stages were added (roughing by rapid filtration on gravel and rapid pre-filtration on sand of around 1 mm in size). The biological role of slow filtration was only scientifically demonstrated in the second half of the 20th century. This concerns the biological oxidation of certain mineral micropollutants, either to eliminate them by retention or to leave them after transformation in the treated water, such as iron and manganese, ammonium ions and sulfide ions, the latter two giving nitrates and sulfates, respectively. However, empirically speaking, the first attempts at slow biological filtration were made in the latter years of the nineteenth century. In Germany, for example, the definition of water quality in the 1850s included the same terms as those already mentioned, but with the addition: “The water must come out of the ground”, thus favoring groundwater. As quantities were insufficient, it was proposed to draw water from the alluvial aquifer by forcing the river water to filter through the banks, a system that protected Altona while a cholera epidemic raged in Hamburg, which drew its water directly from the Elbe. This led to Koch’s description of this fundamental step for public health [18]. In Germany, the term “bank filtration” was used, while in France the preference was for groundwater drawn from the alluvial aquifer.

Rapid Filtration

With advances in analysis, the degradation of surface water and the ever-increasing volumes required to meet the growing needs of cities, large areas of slow filtration became necessary. This was incompatible with the available land, and so, from the early 1950s, slow filtration plants were gradually replaced by fast filtration plants (filtration speed at least 24 times higher) with the need to add coagulation reagents: aluminum salts (like the Egyptians!), iron salts, then cationic polymers, requiring subsequent flocculation with the addition of agents of plant origin (alginates) or synthetic agents. The negative colloids in the water are neutralized by these chemical reagents: this technique is called coagulation. Once neutralized, the colloids can regroup and form flocs: this is flocculation. These flocs are then removed or separated from the water, either by decantation or flotation followed by rapid filtration, or by direct filtration just after the addition of the coagulation reagents, in which case we speak of coagulation on a filter. The latter technique is only suitable for low-turbidity water: turbidity must be less than 10 NFU (Nephelometric Unit) and in no case, even for short periods, exceed 20 NFU. Rapid filtration uses filters with speeds ranging from 3 to 10 m3/m2/h [19].
As organic micropollutants (pesticides, PolyAromatic Hydrocarbons (PAHs), drug residues, etc.) were quickly identified, it was proposed to add powdered activated carbon at doses of 1 to 20 ppm at the coagulation stage. At the same time, following evidence of secondary reactions of chlorine on humic acids measured by TOC (Total Organic Carbon), pre-chlorination was banned. Advanced coagulation was proposed, at pH between 5 and 5.5 with ferric salts, followed by pre-ozonation of the water as part of rapid filtration. This type of treatment is also effective in eliminating parasites [20].

Membrane Filtration

More recently, membrane filtration has been developed. As organic colloids are made up of macromolecules and colloidal clays of micelles ranging in size from 0.5 to 1 micrometer, they can be eliminated by microfiltration (cut-off point 0.5 µm) or ultrafiltration (cut-off point 100 to 200,000 Dalton). Porous membranes are then used on water influenced by surface water, or on surface water itself, replacing the sand filtration stage [21]. The knowledge acquired since the 1970s has shown that this stage cannot be installed after a powdered activated carbon (PAC) reactor, as the reagents added for the proper separation and recycling of the PAC are polymers that irreparably clog the membranes. A sand filtration stage is essential upstream. Raw water turbidity must be low (<50 NFU) for ultrafiltration, which is why these stages are often implemented after the chemical clarification stage.
Even more recently, dense membrane filtration treatments have appeared: nanofiltration, low-pressure reverse osmosis and reverse osmosis, which also act on dissolved elements. The first two are traditionally used for water softening, and reverse osmosis for demineralization. But more and more plants are using them, which means that treated water has to be remineralized to neutralize its corrosive power before distribution.
As the presence of certain mineral pollutants in water has come to light, selective adsorption techniques have been proposed to retain them, such as activated alumina for fluorides, manganese dioxide for manganese, arsenic, selenium, antimony, vanadium, uranium and radium. These treatments are mainly used on spring water intended for packaging [22].

2.4. Disinfection Treatments

Very early on, various more or less effective products were used to eliminate disease-causing “miasmas”. The usefulness of the following biocidal agents is now well established.

2.4.1. Chlorine

Chlorine in its pure form was discovered in 1774 by C. W. Scheele, but the name “chlorine” was not given to this substance until 1810 by H. Davy. It was Berthollet who discovered that this element could be used to bleach linen, using “chlorine water” in 1785 in the village of Javel (now incorporated into Paris), where washerwomen gathered to wash their clothes in the Seine river. The oxidizing properties of chlorine, stabilized in a potash solution, enabled laundry to be bleached (liqueur de Javel) [23]. Soda subsequently replaced potash in the composition of bleach.
The deodorizing properties of chlorine were well known. It had been observed that water with a “septic” odor was a source of disease, and the odor (miasma) made people sick. The addition of chlorine removed the odor and the water no longer made people sick… thus proving that odor was responsible for disease! Before the discovery of bacteria, it seems that the first to suggest disinfecting water with chlorine were L.B. Guyton de Morveau in France and W. Cumberland Cruikshank in England, both around 1800 [24]. J. Snow was the first to successfully use chlorine to disinfect the water in Soho, London, to stop the cholera epidemic of 1854. But it wasn’t until 1890 that chlorine was shown to be an effective disinfection tool capable of reducing the amount of disease transmitted in water [25].
The disinfectant chlorination of distributed water developed on a large scale in the 20th century. A technique for purifying drinking water using chlorine gas, calcium hypochlorite tablets or a sodium hypochlorite solution was developed as early as 1910 by an army major and used in France, particularly in Verdun during the First World War, to combat waterborne diseases in the trenches; hence the term “Verdunisation” of drinking water, which replaced the poor-quality wine (“picrate”) issued to the troops [26]. Chlorine was first used in Paris in 1911, on the advice of Pasteur’s disciple Dr Roux, but consumers reacted to the taste of the water. After the 1914–1918 war, chlorination became widespread around 1920, with the addition of dechlorination at reservoir level.
In the 1970s, any surface water treatment chain that did not start with a pre-chlorination treatment at the “break-point” was considered a bad treatment chain. In 1974, Rook in Holland showed that chlorine could react with natural organic matter in the water: humic acids measured by TOC. The molecules formed are trihalomethanes (CHX3), including chloroform (CHCl3), the chlorobrominated compounds CHCl2Br and CHClBr2, and the brominated compound CHBr3 [27]. As New Orleans water was found to contain high levels of chloroform, some linked it to the high incidence of cancer in the city. This triggered a major controversy in the USA, which led to the adoption of the Safe Drinking Water Act after extensive analytical research and epidemiological studies [28]. Over 100 by-products were identified, some of them carcinogenic, and by 2000 all surface water treatment processes starting with pre-chlorination were obsolete.
“Break-point” pre-chlorination was then replaced either by the addition of chlorine dioxide, which does not undergo chlorine’s secondary reaction but produces chlorites, for which a limit has been set (<0.2 mg/L, European Directive 1998), or by a chlorine dose low enough to limit the reaction to the formation of monochloramines.
The bactericidal action of chlorine is mainly due to the HOCl form, which is disinfectant at a water pH below 8.1. To guarantee good disinfection, a CT of 15 is required (0.5 mg/L of free chlorine after 30 min of real contact), and residual chlorine in the distribution network provides a good bacteriostatic effect at a dose of 0.3 to 0.5 mg/L [29].
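The CT criterion quoted above multiplies the residual disinfectant concentration (mg/L) by the real contact time (min); a minimal sketch, with the reactor values taken as examples:

```python
# Sketch of the CT (concentration x time) disinfection criterion: credit is
# the free-chlorine residual (mg/L) multiplied by real contact time (min).

def ct_value(residual_mg_L, contact_min):
    """CT in mg.min/L for a given residual and contact time."""
    return residual_mg_L * contact_min

def meets_target(residual_mg_L, contact_min, target_ct=15):
    """True if disinfection reaches the target CT (15, as quoted above)."""
    return ct_value(residual_mg_L, contact_min) >= target_ct

print(ct_value(0.5, 30))      # 15.0 -> the example quoted in the text
print(meets_target(0.3, 30))  # False: dose or contact time must increase
```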

2.4.2. Ozone

This gas was described in 1840 by C.F. Schönbein. W. von Siemens then designed an ozone generator and began studying its application to water, experiments continued for water disinfection by Froelich and Ohlmueller in 1891. The first Rhine water treatment plant was installed at Oudshoorn (now part of Alphen aan den Rijn) in the Netherlands in 1893, followed by another at Lille in France in 1897. In 1896, N. Tesla patented an ozone generator, and it was in Nice that the first drinking water treatment plant was inaugurated in 1906, with the creation in 1907 by M.P. Otto of the Compagnie Générale de l’Ozone. From then on, ozone became the disinfectant of choice for water purification, before chlorine regained the upper hand after the First World War, due to its much lower cost and the need to convert combat gas production plants [30].
It was Dr. Coin’s work in Paris in the 1950s, demonstrating the virucidal effect of ozone on the poliomyelitis virus at a dose of 0.4 mg/L for 4 min of real contact time [6], that led to the development of ozone in most major treatment plants in France, then in Europe and worldwide, with the two world leaders (Veolia and Suez) [31].

2.4.3. Chlorine Dioxide

Initially discovered by H. Davy in 1814 through the reaction of sulfuric acid with potassium chlorate, then with hypochlorous acid, this gas is explosive and difficult to transport, making it necessary to produce it on site. It was used in potabilization plants from the 1940s onwards for its biocidal activity, which is comparable to that of chlorine but has no effect on ammonium ions, and especially from the 1980s because it does not react with organic matter in raw water to form trihalomethanes [32]. On the other hand, the presence of chlorites is a disadvantage, as is its corrosiveness to polyethylene pipes.

2.4.4. Other Chemical Disinfectants

Chloramines

Chloramines are not used for treatment, as their oxidizing power is much weaker than that of chlorine. On the other hand, their persistence in the network is much greater [33]. Some countries (e.g., the USA) therefore use monochloramine in distribution networks, but this is not currently authorized in many European countries, in particular because of the risk to dialysis patients.

Hydrogen Peroxide

It is not used for drinking water disinfection, but for disinfecting storage tanks after their annual cleaning. Like monochloramine, it can also be used in hot water distribution networks (anti-legionella process).

Metals: Copper, Silver

As far back as antiquity, it was recommended to preserve water by inserting copper or silver coins. Copper was used when pre-chlorination was abolished to prevent algal blooms, or even directly in the raw water supply. This technique is no longer recommended, either in the resource or in the plant, especially in the presence of cyanobacteria.
Silver at a dose of 70 µg/L is still used to maintain water reserves at home for 2 to 3 years in the event of water shortages (survival reserves, e.g., in Switzerland), although the WHO limit for silver in drinking water is 10 µg/L. Some English-speaking countries use a copper-silver treatment in hot water pipes to limit Legionella proliferation [34].

2.4.5. Ultraviolet Radiation

Solar Disinfection (UV-A)

Since ancient times, it has been recommended to clarify water by decanting or filtering and leaving it in the sun before drinking. Research into the bactericidal effect of light began in the 19th century. In 1877, Downes and Blunt demonstrated that light inhibited bacterial growth in the laboratory, and that the inactivation of a given quantity of microorganisms depended on exposure intensity, duration and wavelength. Then, in 1885, Duclaux, another pupil of Pasteur, demonstrated the different sensitivities of different bacteria to the same dose of daylight [35]. Today, disinfection by solar radiation (UV-A) is still used in tropical or equatorial zones where there is no water supply for human consumption. Known as SODIS (Solar Disinfection), the method of disinfecting water using only solar radiation and polyethylene terephthalate (PET) bottles is recognized by the WHO. UV-A rays destroy the cell structure of bacteria and react with oxygen dissolved in water, producing a highly reactive form of oxygen (the free oxygen radical) and hydrogen peroxides that destroy pathogenic germs. Infrared radiation also heats the water, in synergy with the UV radiation. Above 50 °C, the disinfection process is three times faster than at 20 °C. At 30 °C, exposure to the sun for at least 6 h is required [36].
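The exposure guidance above can be sketched as a simple rule; treating the threefold speed-up as a step at 50 °C is a simplification of what is in reality a continuous thermal synergy:

```python
# Sketch of the SODIS exposure guidance quoted above: about 6 h of full
# sun at ~30 °C, and roughly 3x faster once the water exceeds 50 °C.

def sodis_exposure_hours(water_temp_C):
    """Recommended full-sun exposure time for PET-bottle disinfection."""
    base_hours = 6.0           # required at ~30 C in full sun
    if water_temp_C >= 50:
        return base_hours / 3  # heat/UV-A synergy above 50 C
    return base_hours

print(sodis_exposure_hours(30))  # 6.0
print(sodis_exposure_hours(55))  # 2.0
```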

UV C Disinfection

UV technology was discovered in 1845 and used at 254 nm from the late 1920s to disinfect surfaces and water (Cernovedeau 1910). In 1960, the effect of UV on DNA structure at base pair level was demonstrated. The formation of dimers between adjacent thymine bases renders the DNA code unreadable, making reproduction impossible. Disinfection using UV-C radiation at 254 nm is currently possible using either low-pressure or medium-pressure mercury vapor lamps [37].
A UV dose of 400 J/m2 provides good water disinfection, particularly for parasites (Giardia and/or Cryptosporidium), provided water turbidity is <0.5 NTU. Some viruses require much higher doses (1500 J/m2 for noroviruses). UV disinfection must therefore be complemented by final chlorination [38].
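Since UV dose is irradiance integrated over time, the doses quoted above translate directly into exposure times; the irradiance figure below is a hypothetical example, not a value from the article:

```python
# Sketch relating the UV doses quoted above (J/m2) to exposure time,
# using dose = irradiance x time for a constant lamp irradiance (W/m2).

def exposure_seconds(dose_J_m2, irradiance_W_m2):
    """Exposure time needed to deliver the target UV dose."""
    return dose_J_m2 / irradiance_W_m2

# Hypothetical reactor delivering 20 W/m2 at 254 nm:
print(exposure_seconds(400, 20))   # 20.0 s for the general 400 J/m2 dose
print(exposure_seconds(1500, 20))  # 75.0 s for norovirus-level credit
```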
Some chemical molecules are photosensitive and are transformed by photolysis. Ozone/UV-C couplings are therefore used to produce radical reactions, and for water disinfection it is essential that the quartz in the lamps is treated to cut off all wavelengths below 230 nm, and that the lamps have a tube cleaning system to prevent the deposition of oxidized metals [39].

3. Conclusions

This historical survey clearly shows that when it comes to water intended for human consumption, we’re at the limits of our knowledge, and what’s true today may not be true tomorrow. Chlorine is an obvious example; “In 1970, any surface water treatment line that didn’t start with pre-chlorination was a bad treatment line. In 2000, any treatment line that begins with pre-chlorination is banned”.
This is due to progress in numerous areas: analytical detection systems, speed of information flow, predictive toxicology, the discovery that certain diseases are waterborne (e.g., Helicobacter pylori and stomach ulcers), and progress in epidemiology.
At present, analytical progress in both chemistry and microbiology (e.g., DNA sequencing) is such that everything can be found in everything. Without being put into the context of toxicology, this information panics the consumer, who tends to lose confidence in tap water and turn to buying home purifiers or bottled water. New chapters of potential drinking water contamination open every year. One example is the presence of plastic residues in water due to resource pollution: current data cover microplastics, and future analytical progress, enabling the measurement of nanoplastics and their constituent molecules, can be expected to bring further developments, including for bottled water. After regulations addressing pesticides and drug residues, it is now necessary to face the presence in raw water of pesticide degradation products, whose toxicity is largely unknown. At the same time, analytical progress has allowed per- and polyfluoroalkyl substances (PFASs) to be isolated in many natural resources; these substances are very numerous, very difficult to degrade in nature (“eternal pollutants”) and now present in everyone’s blood. Many small cities now have to implement more elaborate water treatment (activated carbon, for example) at new cost, and even some sources of bottled water are contaminated. It is important to remember that the contribution of drinking water is very limited compared with other sources such as food and air: for pesticides it is estimated at around 10% at most, and for PFASs at around 1%.
But even though the contribution of drinking water is low, everything possible must be done to reduce it. The future of drinking water treatment will therefore include more steps and more analytical controls, with ever more parameters. At the same time, it will be necessary to apply the prevention and precaution principles when setting admissible values for all these new parameters, taking into account possible “cocktail effects” between different molecules and their potential endocrine-disrupting mechanisms of action. It is of course necessary to avoid further chemical pollution of the resource, because the residues already present create enough treatment problems. It may even become necessary to abandon some resources that are too polluted to allow the distribution, even after treatment, of a “clean” water without residues.
Climate change and the increase in water temperature will also lead to the proliferation of some “new” microorganisms that will have to be taken into account, but current disinfection treatments remain effective, even against antibiotic-multiresistant strains.

Author Contributions

Conceptualisation, writing-original draft preparation, P.H.; historical data curation, writing-review and editing, A.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ruf, T. L’histoire de la Maîtrise de L’eau en Egypte. In Aménagements Hydro-Agricoles et Systèmes de Production; ORSTOM, Ed.; CIRAD.DSA: Montpellier, France, 1987; pp. 361–374. [Google Scholar]
  2. Hippocrate. Traité des Airs, des Eaux et des Lieux; Translated by Charpentier, D.V. 1844. Available online: https://remacle.org (accessed on 19 April 2024).
  3. Pasteur, L. De l’extension de la théorie des germes à l’étiologie de quelques maladies communes. Rev. Méd. Vét. 1880, 57, 642–654. [Google Scholar] [CrossRef]
  4. King, L.S. Dr. Koch’s postulates. J. Hist. Med. Allied Sci. 1954, 7, 350–361. [Google Scholar] [CrossRef] [PubMed]
  5. Devane, M.L.; Moriarty, E.; Weaver, L.; Cookson, A.; Gilpin, B. Fecal indicator bacteria from environmental sources; strategies for identification to improve water quality monitoring. Water Res. 2020, 185, 116204. [Google Scholar] [CrossRef] [PubMed]
  6. Coin, L.; Hannoun, C.; Gomella, C. Inactivation par l’ozone du virus de la poliomyélite. Presse Med. 1964, 22, 2153. [Google Scholar]
  7. Stout, J.E.; Yu, V.L.; Best, M.L. Ecology of Legionella pneumophila within water distribution systems. Appl. Environ. Microbiol. 1985, 491, 221–228. [Google Scholar] [CrossRef]
  8. Brezonik, P.L.; Arnold, W.A. Water chemistry: Fifty years of change and progress. Environ. Sci. Technol. 2012, 46, 5650–5657. [Google Scholar] [CrossRef]
  9. Yang, M.; Zhang, X. Current trends in the analysis and identification of emerging disinfection byproducts. Trends Environ. Anal. Chem. 2016, 10, 24–34. [Google Scholar] [CrossRef]
  10. Cotruvo, J.A. Drinking water standards and risk assessment. Regul. Toxicol. Pharmacol. 1988, 8, 288–299. [Google Scholar] [CrossRef]
  11. Nichter, M. Drink boiled water: A cultural analysis of a health education message. Soc. Sci. Med. 1985, 21, 667–669. [Google Scholar] [CrossRef]
  12. Medina, E.; Romero, C.; Brenes, M.; De Castro, A. Antimicrobial activity of olive oil, vinegar, and various beverages against foodborne pathogens. J. Food Protect. 2007, 70, 1194–1199. [Google Scholar] [CrossRef]
  13. WHO. WHO Guidelines for Drinking Water Quality; WHO: Geneva, Switzerland, 2001; Available online: https://www.who.int/publications/i/item/WHO-FWC-WSH-15.02 (accessed on 25 August 2024).
  14. Sobsey, M. Managing Water in the Home: Accelerated Health Gains from Improved Water Supply; WHO: Geneva, Switzerland, 2002; 69p. [Google Scholar]
  15. Al Azharia-Jahn, S. Emploi de Coagulants Naturels Pour la Purification de L’eau Potable en Milieu Rural; Deutsche Gesellschaft für Technische Zusammenarbeit, GTZ Gmbh: Eschborn, Germany, 1989; 570p. [Google Scholar]
  16. Obe, M.B.P.; Abouzaid, H.; Sundaresan, B.B. Slow Sand Filtration. A Low Cost Treatment for Water Supplies in Developing Countries; WHO Publication, Water Research Center: Swindon, UK, 1986. [Google Scholar]
  17. Huisman, L.; Wood, W.E. Slow Sand Filtration; WHO: Geneva, Switzerland, 1974; 123p. [Google Scholar]
  18. Koch, R. Wasserfiltration und Cholera. Z. Hyg. Infekt. 1893, 14, 183–205. [Google Scholar] [CrossRef]
  19. Ives, K.J. Rapid filtration. Water Res. 1970, 4, 201–223. [Google Scholar] [CrossRef]
  20. Hidayah, E.N.; Chou, Y.C.; Zeh, H.H. Characterization and removal of natural organic matter from slow sand filter effluent followed by alum coagulation. Appl. Water Sci. 2018, 8, 3. [Google Scholar] [CrossRef]
  21. Remigy, J.-C.; Desclaux, S. Filtration membranaire (OI, NF, UF)—Présentation des membranes et modules. In Memento Technique de L’eau; Degremont: Rueil-Malmaison, France, 2007; 1718p. [Google Scholar] [CrossRef]
  22. Popoff, G.; Montiel, A. Eligible treatments applied to natural mineral waters: Advantages and disadvantages. Eur. J. Water Qual. 2006, 11, 3–12. [Google Scholar]
  23. Dakin, H.D. The antiseptic action of hypochlorites: The ancient history of the “new antiseptic”. Br. Med. J. 1915, 3, 809–810. [Google Scholar] [CrossRef] [PubMed]
  24. Lewcock, A.; Scott-Kerr, F.; Mathieson, E. Chlorine disinfection and theories of disease. In An Element of Controversy: The Life of Chlorine in Science; 2007; p. 179. [Google Scholar]
  25. Snow, J. Cholera and the water supply in the South Districts of London in 1854. J. Publ. Health Sanit. Rev. 1856, 2, 239–257. [Google Scholar]
  26. Sarrazin, J. Medical officers and Battle of Verdun. Hist. Sci. Med. 2002, 36, 133–145. [Google Scholar]
  27. Rook, J.J. Chlorination reactions of fulvic acid in natural waters. Environ. Sci. Technol. 1977, 11, 478–482. [Google Scholar] [CrossRef]
  28. DeMarini, D.M. A review of the 40th anniversary of the first regulation of drinking water disinfection by-products. Environ. Mol. Mutagen. 2020, 61, 588–601. [Google Scholar] [CrossRef] [PubMed]
  29. Tanner, B.D.; Gerba, C.P. The application of the Ct Concept for determining disinfection of microorganisms in water. J. Swim. Pool Spa Ind. 2004, 5, 8–14. [Google Scholar]
  30. Langlais, B.; Reckhow, D.A.; Brink, D.R. Ozone in Water Treatment: Application and Engineering; Taylor & Francis eBook: Abingdon, UK, 2019; 552p. [Google Scholar] [CrossRef]
  31. Hartemann, P.; Block, J.-C.; Joret, J.-C.; Foliguet, J.-M.; Richard, Y. Virological study of drinking and wastewater disinfection by ozonation. Water Sci. Technol. 1983, 15, 145–154. [Google Scholar] [CrossRef]
  32. Gordon, R.; Rosenblatt, A.A. Chlorine dioxide: The current state of the art. Ozone Sci. Eng. 2005, 27, 203–207. [Google Scholar] [CrossRef]
  33. Vikesland, P.J.; Ozekin, K.; Valentine, R.L. Monochloramine decay in model and distribution system waters. Water Res. 2001, 35, 1766–1776. [Google Scholar] [CrossRef]
  34. Sicairos-Ruelas, E.E.; Gerba, C.P.; Bright, K.R. Efficacy of copper and silver as residual disinfectants in drinking water. J. Environ. Sci. Health Part A 2019, 54, 146–155. [Google Scholar] [CrossRef]
  35. Ray, C.; Jain, R. Drinking water treatment technology-comparative analysis. In Drinking Water Treatment Strategies for Sustainability; Ray, C., Jain, R., Eds.; Springer: Berlin/Heidelberg, Germany, 2011; pp. 9–36. [Google Scholar]
  36. McGuigan, K.G.; Conroy, R.M.; Mosler, H.J.; du Preez, M.; Ubomba-Jaswa, E.; Fernandez-Ibanez, P. Solar water disinfection (SODIS): A review from bench-top to roof-top. J. Hazard. Mater. 2012, 235–236, 29–46. [Google Scholar] [CrossRef]
  37. Exner, M.; Kramer, M.; Lajoie, L.; Gebel, J.; Engelhart, S.; Hartemann, P. Prevention and control of healthcare-associated infections in health care facilities. Am. J. Infect. Control 2005, 33, S36–S40. [Google Scholar] [CrossRef]
  38. Zyara, A.M.; Torvinen, E.; Veijalainen, A.-M.; Heinonen-Tanski, H. The effect of UV and combined chlorine/UV treatment on coliphages in drinking water disinfection. Water 2016, 8, 130. [Google Scholar] [CrossRef]
  39. Gray, N.F. Ultraviolet disinfection. In Microbiology of Waterborne Diseases: Microbiological Aspects and Risks, 2nd ed.; Academic Press: Cambridge, MA, USA, 2014; Chapter 4; pp. 617–630. [Google Scholar]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Hartemann, P.; Montiel, A. History and Development of Water Treatment for Human Consumption. Hygiene 2025, 5, 6. https://doi.org/10.3390/hygiene5010006
