
Pathogen Management in Surface Waters: Practical Considerations for Reducing Public Health Risk

By Matthew R. Hipsey and Justin D. Brookes

Submitted: June 1st 2012. Reviewed: December 3rd 2012. Published: May 15th 2013.

DOI: 10.5772/55367


1. Introduction

Pathogen contamination of water systems is a major public health challenge in both developing and developed countries across the globe [1-9]. The pathogens of concern to human health vary between aquatic systems depending on the nature of the pathogen source and the intended use of the water. Due to their persistence in the environment and resistance to conventional treatment technologies, the (oo)cysts of the protozoan organisms Cryptosporidium spp. and Giardia spp. are a typical concern in water bodies used for drinking water [10-11]. In poorly treated drinking water storages and recreational waters (both fresh and marine), other problem organisms include bacteria such as Salmonella spp., Shigella spp., Vibrio spp., Clostridium spp. and Staphylococcus aureus, and numerous human enteric viruses such as those from the genera Enterovirus, Hepatovirus, Rotavirus and Norovirus [12-13]. Accordingly, the nature of disease caused by these organisms is also widely variable (Table 1).

Most concern is given to allochthonous enteric microorganisms, those that enter surface waters via external loading. However, depending on the environmental context, some autochthonous pathogens, those that develop internally, may also be important (e.g. Vibrio cholerae). The allochthonous sources typically occur when heavy rains wash infected material from surrounding agricultural and/or urban catchments into the runoff waters that ultimately supply the waterbody, or when effluent is discharged directly into watercourses (Figure 1). Inputs to lakes and rivers from recreational users can also lead to a significant increase in pathogen concentrations [14]. These major pathogen sources present a risk to humans through three main routes of exposure: direct consumption of microorganisms within drinking water, recreational contact, and consumption of microorganisms that have bio-accumulated within the tissues of consumable shellfish (Figure 2).

Figure 1.

Schematic overview of allochthonous microbial sources and receiving aquatic environments from the catchment to the ocean.

Disease | Agent | Symptoms
Amoebiasis | Protozoan (Entamoeba histolytica) | Abdominal pain, fatigue, weight loss, diarrhoea
Campylobacteriosis | Bacterium (Campylobacter jejuni) | Fever, abdominal pain, fatigue, diarrhoea
Cholera | Bacterium (Vibrio cholerae) | Fever, abdominal pain, vomiting, diarrhoea
Cryptosporidiosis | Protozoan (Cryptosporidium parvum) | Abdominal pain, vomiting, diarrhoea
Diarrhoeagenic Escherichia coli | Bacterium (Escherichia coli O157:H7) | Acute bloody diarrhoea, abdominal cramps
Giardiasis | Protozoan (Giardia lamblia) | Abdominal pain, vomiting, diarrhoea
Hepatitis | Virus (hepatitis A virus) | Fever, chills, abdominal pain, jaundice
Salmonellosis | Bacterium (Salmonella spp.) | Fever, abdominal cramps, bloody diarrhoea
Shigellosis | Bacterium (Shigella spp.) | Fever, diarrhoea, bloody stools
Viral gastroenteritis | Virus (rotavirus etc.) | Vomiting, diarrhoea, headache, fever

Table 1.

Summary of common water-borne diseases.

Pathogen distribution and transport in surface waters are a function of the pathogen load in the source water (e.g. agricultural runoff or direct wastewater discharge), the settling or entrainment characteristics of the particles that they may attach to, and the resuspension of sediment-associated organisms by turbulence at the benthic boundary layer. The distribution of organisms will also be impacted by predation [15] and degradation due to sunlight exposure, or mortality due to undesirable physico-chemical conditions [16]. For some organisms, in situ growth may also need to be considered.

Contamination of rivers, lakes and reservoirs that are primarily used for drinking water is a particular challenge for environmental scientists and water managers charged with supplying water that minimises the risk of infection to downstream consumers. The nature of the contamination and the mechanisms that control contaminant dynamics may vary considerably due to either site-specific or contaminant-specific properties and hence there is a need to better understand how to assess and mitigate the risks.

As indicated above, microbial contaminants such as the (oo)cysts of the pathogens Cryptosporidium or Giardia are mainly derived from allochthonous (external) sources, whereas others may be generated autochthonously (internally) within the water storage. It is therefore not surprising that a fundamental principle of drinking water supply is to use high quality, protected source waters as a means of reducing the potential load of drinking water contaminants and thus reducing treatment costs and subsequent health risks to consumers. In reference [17], it was reported that the mean concentration of Cryptosporidium oocysts in protected reservoirs (0.52/100L) and pristine lakes (0.3 – 9.3/100L) was considerably lower than in polluted rivers (43 – 60/100L) and polluted lakes (58/100L), which demonstrates the merit of this strategy. However, with increasing pressures on catchments, aquatic systems are not always sufficiently protected and pathogen risks must therefore be appropriately managed. In developing countries this is further confounded since both drinking and recreational waters may be subject to substantial direct and unregulated effluent discharges that are difficult to control at the source.

Figure 2.

Conceptual breakdown of routes of exposure of microbial pollutants.

When planning management measures or policies it is important to consider that the presence or absence of pathogens within the aquatic environment does not always translate directly to a high risk to human health. For example, Cryptosporidium and Giardia (oo)cysts have been identified at hazardous levels in Lake Kinneret, which has historically supplied around half of Israel's water; however, no major outbreaks were reported in Israel during an equivalent period [5]. Conversely, outbreaks of cryptosporidiosis have been documented where the water met guidelines based on standard bacterial indicator concentrations [3-4]. It must therefore be recognized that contaminant data and risk assessment procedures need to consider the wider management framework that encompasses the entire system, from source to exposure. To obtain a more realistic assessment of the overall contaminant risk it is necessary to understand the critical variables controlling contaminant fate and distribution once they enter the aquatic environment.

Allen et al. [18] went further, describing pathogen monitoring alone as being of little value and highlighting several cases where monitoring had misled regulatory authorities as to the actual risk, including both false positives and false negatives. They identify several technical and administrative barriers that explain why this is the case, and instead suggest that the human and financial resources would be better invested in enhancing treatment processes and gaining a better understanding of the system (such as the major pathogen sources and sinks, and inactivation processes). Within a drinking water system this is fairly clear since there are many points at which the quality of the source water can be controlled before the public is ultimately exposed, but for other environmental waters there is only limited ability for intervention. However, aquatic systems have an inherent natural assimilative capacity, and there are a number of beneficial water quality changes that can occur when water is stored in reservoirs, which may ultimately attenuate pathogen concentrations [16,19]. Reduced water movement increases the rate of sedimentation of particulate material. This reduces turbidity and may also result in the sequestering of the microbes associated with the particles. Many of the pathogens of concern are attenuated by environmental conditions, with mortality linked to temperature, grazing by protozoa, and incident ultraviolet radiation being the most critical factors. Of particular importance is a good knowledge of the hydrodynamic processes that control water transport in the aquatic environment, as these will ultimately determine the length of time the water is retained and the environmental conditions it will be subjected to. In reservoirs, issues such as thermal stratification and the short-circuiting of inflows have been identified as particularly important [20].

The question then becomes whether or not it is possible to optimise the performance of aquatic systems as barriers to pathogen transmission by manipulating river or reservoir conditions. The climatic and hydrological conditions that lead to the development of a specific contamination threat are highly diverse, and a clear understanding of the origin and dynamics of the potential contaminants in relation to environmental conditions is necessary in order to select the most appropriate control methodologies from the range available. Further, it is often the case that multiple contaminants occur within a single storage or system, and the optimum solution for minimising risk is then a compromise between minimising exposure to the individual contaminants, and may involve implementation of several management options. As a result it is necessary to manage the associated risks through development of a suitable risk management framework. Addressing the complexities and variability inherent in pathogen transmission requires a detailed quantitative understanding of contaminant fate and distribution through surface water systems, yet this is rarely achieved in practice and there is a need for improved assessment and mitigation of pathogen threats.

The aim of this chapter is to describe how to best gather specific information to support structured risk assessment programs for dealing with pathogen distribution in surface water systems. With this knowledge we summarise several control measures and discuss how they may be implemented within a structured framework to minimise the risk of contamination to humans. It is important to understand the key dynamics of contaminants, as described next, to provide the necessary context for the risk management framework and control measures.

2. Controls on fate and transport of microbial contaminants

Hydrodynamic controls on pathogen distribution: Hydrodynamics are a key driver in shaping the distribution of pathogens in aquatic systems [21], determining their horizontal transport, rates of dispersion and dilution, and vertical distribution. In lakes and reservoirs horizontal transport is predominantly driven by basin-scale circulation patterns including wind-driven currents, inflows and basin-scale internal waves [22]. Although wind-driven currents only influence the surface layer, inflows can occur at any depth in a stratified water column [23], and internal waves can generate significant internal currents that may act in different directions at different depths. In stratified lakes and reservoirs, internal waves have been shown to be responsible for the vertical advection of pathogens past offtake structures, resulting in periodic variations in water quality [24].

Dispersion describes both turbulent dispersion (for example in the surface mixed layer) and shear dispersion due to the presence of a horizontal or vertical velocity shear (e.g. in rivers or tidally forced systems). In river-floodplain systems, sharp velocity gradients between the floodplain and the main river cause substantial horizontal dispersion and mixing.

Since the source of most pathogens to reservoirs is via catchment inflows or engineered outfalls, the behaviour of inflowing water as it enters the water column is of particular importance. Inflow dynamics are controlled by the density and momentum of the inflow relative to that of the ambient water. For example, warm inflows will flow over the surface as a buoyant surface flow, and cold dense inflows will sink beneath the ambient water and flow along the bottom towards the deepest point. In either case, as it propagates the gravity current will entrain ambient water, increasing its volume, changing its density and diluting the concentration of pathogens and other properties. A further complication is introduced where the density difference is derived from particulate matter (a turbidity current), in which case the settling of these particles will influence the density and propagation of the inflow [25]. The speed at which the inflowing water travels, its entrainment of ambient water and resulting dilution of its properties, and its insertion depth are all of critical importance in determining the hydrodynamic distribution of pathogens. Prediction therefore requires a detailed numerical solution, often in three dimensions, which can resolve processes controlling momentum, mixing, and thermodynamics.
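A first-order screening of the likely inflow behaviour can nevertheless be obtained from a simple density comparison. The sketch below is a minimal illustration, using a standard freshwater density polynomial with a crude linear salinity correction; the inputs and the salinity coefficient are indicative assumptions, not site-specific values:

```python
def water_density(temp_c, salinity_psu=0.0):
    """Approximate water density (kg/m^3) from temperature (deg C) and
    salinity (PSU): a standard freshwater polynomial plus a rough linear
    salinity correction (adequate for screening purposes only)."""
    rho_fresh = 1000.0 * (1.0 - (temp_c + 288.9414)
                          / (508929.2 * (temp_c + 68.12963))
                          * (temp_c - 3.9863) ** 2)
    return rho_fresh + 0.75 * salinity_psu  # ~0.75 kg/m^3 per PSU near 20 degC


def classify_inflow(inflow_t, inflow_s, surface_t, surface_s, bottom_t, bottom_s):
    """Classify an inflow as overflow, interflow or underflow by comparing
    its density against the ambient surface and bottom water (Figure 5)."""
    rho_in = water_density(inflow_t, inflow_s)
    rho_surf = water_density(surface_t, surface_s)
    rho_bot = water_density(bottom_t, bottom_s)
    if rho_in <= rho_surf:
        return "overflow"   # buoyant: spreads across the surface
    if rho_in >= rho_bot:
        return "underflow"  # dense: sinks and follows the thalweg
    return "interflow"      # inserts at its level of neutral buoyancy


# Example: a cold winter inflow entering a weakly stratified reservoir
print(classify_inflow(inflow_t=8.0, inflow_s=0.1,
                      surface_t=14.0, surface_s=0.05,
                      bottom_t=10.0, bottom_s=0.05))  # -> "underflow"
```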

Kinetics: As particles are advected and mixed throughout a waterbody they are also subject to ‘non-conservative’ behaviour, i.e. growth or decay. Organisms become inactivated as they are exposed to the range of biotic and abiotic pressures that face them within the aquatic environment. In particular, organisms are known to be sensitive to temperature, salinity, pH, oxygen, turbidity, sunlight, and they are also subject to predation by larger autochthonous microorganisms [15]. Some bacteria may also be able to support growth through assimilation of nutrients from the water [26-30].
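As a concrete illustration of how these kinetic pressures are typically parameterised, the sketch below combines a temperature-dependent 'dark' mortality term with a light-dependent term in a first-order decay model. The functional form follows common practice in the pathogen modelling literature (cf. [16]), but the coefficient values here are illustrative placeholders, not calibrated rates:

```python
import numpy as np

def inactivation_rate(temp_c, irradiance_w_m2,
                      k_dark_20=0.5, theta=1.07, alpha_light=0.02):
    """Simplified first-order inactivation rate (per day): a
    temperature-dependent 'dark death' term (theta model referenced
    to 20 degC) plus a term proportional to incident irradiance.
    All coefficients are illustrative placeholders."""
    k_dark = k_dark_20 * theta ** (temp_c - 20.0)
    k_light = alpha_light * irradiance_w_m2
    return k_dark + k_light

# Concentration remaining after t days at fixed conditions: C = C0 exp(-k t)
c0 = 1e4                                    # organisms per 100 L
k = inactivation_rate(temp_c=15.0, irradiance_w_m2=50.0)
t = np.linspace(0, 10, 11)                  # days
c = c0 * np.exp(-k * t)
print(f"k = {k:.2f} /day, C after 10 days = {c[-1]:.1f} per 100 L")
```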

Sedimentation & association with particles: The settling rate of free-floating organisms is relatively small [31]. Association with inorganic and organic aquatic particles, however, can considerably increase the losses due to sedimentation, with particle settling being affected by their size and density according to Stokes' law [32-33]. Pathogens may be associated with particles via adsorption at the surface, or they may be physically enmeshed within the organic matrix of faecal material. Differences in dynamic aggregation rates between different organism classes (i.e. protozoan, bacterial, viral) are thought to be an important determinant when deciding the applicability of surrogates [21].
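Stokes' law makes the effect of particle association easy to quantify. The short sketch below evaluates the settling velocity for a free oocyst and for an organism enmeshed in a larger, denser aggregate; the particle sizes and densities are indicative assumptions only:

```python
def stokes_settling_velocity(d_m, rho_p=1080.0, rho_w=998.0, mu=1.0e-3):
    """Settling velocity (m/s) of a small sphere by Stokes' law:
    v = g d^2 (rho_p - rho_w) / (18 mu). Valid only for particle
    Reynolds numbers << 1, i.e. fine silt- and clay-sized particles."""
    g = 9.81  # gravitational acceleration (m/s^2)
    return g * d_m**2 * (rho_p - rho_w) / (18.0 * mu)

# A free oocyst (~5 um, near-neutral density) settles far more slowly
# than the same organism enmeshed in a ~50 um faecal/soil aggregate.
print(stokes_settling_velocity(5e-6))            # ~1e-6 m/s (~0.1 m/day)
print(stokes_settling_velocity(50e-6, 1300.0))   # ~4e-4 m/s (~35 m/day)
```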

Resuspension: Since pathogens may remain viable for significant periods in aquatic sediments [34-36], the resuspension and subsequent re-distribution of pathogens and indicator organisms can potentially be an important process. Sediment resuspension occurs when the shear stress due to currents and turbulent velocity fluctuations reaches a critical level. In rivers and estuaries, large currents are capable of generating bed shear that regularly exceeds this critical level. In lakes and stratified environments such high velocities are reached less frequently, but can be caused by large underflow events and by basin-scale internal wave motions, for example after a period of significant wind forcing. Turbulent motions within the benthic boundary layer driven by currents and internal wave breaking [37] result in the resuspension of particulate material [38]. In environments with an active surface wave field, such as the coastal ocean and estuaries, oscillatory currents due to wind-wave action will also be important in periodically redistributing pathogen-laden sediments near coastal cities [39-40].
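A common way to parameterise this process is an excess-shear erosion law: no resuspension occurs until the bed shear stress exceeds a critical value, after which the flux scales with the excess. The sketch below is illustrative only; the drag coefficient, critical stress and erodibility constant are placeholders that would require site-specific calibration:

```python
def bed_shear_stress(u_current, rho=1000.0, c_d=0.003):
    """Bottom shear stress (N/m^2) from a near-bed current speed (m/s)
    using a quadratic drag law, tau = rho * C_d * U^2."""
    return rho * c_d * u_current**2

def resuspension_flux(tau, tau_critical=0.05, erosion_rate=1e-4):
    """Erosion-style resuspension flux (kg/m^2/s): zero below the
    critical shear stress, linear in the excess above it."""
    excess = max(0.0, tau / tau_critical - 1.0)
    return erosion_rate * excess

# Weak lake currents rarely exceed the threshold; an underflow or
# internal-wave event with ~0.25 m/s near-bed currents can.
for u in (0.05, 0.10, 0.25):
    tau = bed_shear_stress(u)
    print(u, round(tau, 4), resuspension_flux(tau))
```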

3. Hazard analysis and risk assessment framework

In drinking supply reservoirs the major contaminant risks originate not only from the introduction of pathogens, but also from the release of dissolved iron and manganese from sediment, growth of cyanobacteria and subsequent release of associated toxins, and from natural organic matter (NOM) reacting with the chlorine used for treatment. Risk management of pathogens therefore cannot be considered in isolation. The timescale over which these hazards develop varies from days to weeks, or even months. Problems associated with iron, manganese and cyanobacteria typically develop with the evolution of thermal stratification. It may take a number of days or weeks for cyanobacteria to grow to concentrations that cause a taste and odour or toxin problem. Problematic concentrations of iron and manganese develop under the anoxic conditions created by long periods of stratification, typically weeks to months. On the other hand, the greatest risk of transport of pathogens from catchment to the treatment plant is after significant inflow events, which may present a water quality issue within hours to days depending upon the reservoir size and flow magnitude. Similarly, the highest loading of NOM and turbidity occurs during inflow events.

In addition to the variable time scales that must be considered when deciding control measures and developing a risk management framework, the potential for large spatial heterogeneity must also be considered. In long, drowned-valley reservoirs, a contaminant introduced from upstream may pass through a long reach of standing water and be progressively attenuated prior to reaching the offtake location(s), depending on the hydrodynamics and stratification at the time of the event. If the storage is more circular in shape then circulation patterns may be more complex, and considerable variability and patchiness may develop and influence the observed trends. For cyanobacteria and metal contaminants, the dynamics are strongly governed by vertical gradients in temperature and other physico-chemical properties that will vary based on meteorological conditions.

In many cases there may be coinciding water quality issues that pose different risks and require the implementation of control measures that minimize exposure to one threat but may actually increase exposure to another. For example, in a single reservoir, cyanobacterial toxins may exist at the surface, pathogens within an inflow intrusion at mid-depth, and anoxia and soluble metals in the deepest regions. In such instances care must be taken to ensure the management strategies that are implemented are well founded and the individual risks are well quantified. In these more complicated situations cost-benefit analyses may help guide the most appropriate action and help assess the relative benefits for public health, set within the context of reducing treatment costs at the downstream treatment facility.

Informed management not only requires knowledge on the spatial distribution, but also the temporal variability of threats. As the threat changes, so may the most suitable control measure, and it is therefore logical that any management framework that is implemented is adaptable and able to respond to changes in risk. To be effective in this regard, monitoring must also be flexible and reflect the nature of the risk at any given time [41].

Catchment-reservoir systems are complex, and given the high variability in these systems one can never reduce all risks to zero; however, by strategic identification of potential problems, risk can be mitigated with knowledge of some basic hydrodynamics, the evolution of hazards, and careful monitoring. In drinking water supplies that are deemed critical and face multiple pressures, more sophisticated options such as real-time management systems are emerging to assist with the assessment of risks and the ongoing monitoring of the implemented control measures.

Successful risk management of water supply reservoirs and recreational waters requires a systematic approach to contaminant monitoring and prediction in order to reduce the risk of exposure to the public. This is best achieved through a quantitative understanding of the critical processes involved in contaminant distribution and transport, such as dilution rates and time scales for inactivation or contaminant decay. Such information would enable water managers to quantify the risk to water quality associated with a contaminant threat in source water, revise monitoring protocols to detect the organisms or chemicals of concern, and to manage water treatment or recreational closures proactively based upon detected or, ideally, modelled (anticipated) risk. Due to their flexibility, such ‘adaptive’ management strategies are more effective than their rigid counterparts [42], yet they are rarely adopted due to numerous uncertainties in our understanding and due to poor numerical model predictions.

The critical first step in developing an adaptive management strategy for water supply is identifying the hazards of concern, articulating the hydrodynamic and biogeochemical conditions that lead to the development of each hazard, and understanding how these are influenced by the prevailing meteorology and hydrology (Figure 3). Within a drinking water supply reservoir the important processes can be summarised as loads entering the reservoir; transport, growth or attenuation within the reservoir; and the distribution of the hazard relative to the offtake (Figure 3). Each of the parameters contributing to risk can be monitored and the processes measured and modelled.

4. Management framework

Routine monitoring and targeted measurement of the major processes systematically build up a bank of knowledge to support reservoir management and detect and mitigate risks in a timely fashion. The aim is to measure the processes and hazards identified in Figure 3, thereby contributing to knowledge of the system (Figure 4).

System knowledge can be used to inform understanding of catchment hydrology, contaminant loads and reservoir behaviour, and can help focus the monitoring effort on the key variables. Monitoring for those variables will vary depending upon the timescales over which different hazards present, the cost associated with monitoring, and the ease with which the parameter can be monitored. Typically these fall within three classes: routine monitoring for analytes that are sampled and then measured within a laboratory; online monitoring, where sensors are deployed in the reservoir and log key parameters at relatively short time intervals (typically 10 min); and ad hoc monitoring in response to an event, such as a rain event inflow.

This monitoring serves two purposes: it allows for detection of the hazard so the appropriate risk assessment can be undertaken, and it allows managers to draw correlations between when the hazard is present and the reservoir and catchment conditions at the time. Upon detection of the hazard the situation needs to be assessed with respect to the risk it poses to water quality. This stage of the management framework can draw upon expert knowledge, specific process understanding derived from online monitoring, and simulation of the event with hydrodynamic models. Armed with this knowledge and a prediction about the severity of the event, decisions on the operational response can be made. Points of control include attenuation of inflows in engineered wetlands and biofilters, alteration of the inflow dynamics, in-reservoir treatment, and manipulation of the offtake depth to select the best quality water to send to the treatment plant.

Figure 3.

Conceptual model of the major hazards challenging drinking water supplies and their links to meteorology, hydrodynamics and biogeochemistry.

Figure 4.

Management framework for assessing and responding to health threats to drinking water sources (modified from [20]).

5. Gathering information for risk assessment

The above risk assessment framework is built around suitable streams of information that are used to inform decision making and guide mitigative measures. There are numerous types of information that contribute to the understanding of an aquatic system, ranging from static information that characterises the domain, to routinely collected water quality samples and real-time sensors, through to strategically collected data related to a particular threat. In addition to general environmental or direct microbial measurements, microbial surrogates (or bio-indicators) also have an important place in risk assessments. The application of hydrodynamic and water quality models has also significantly increased in recent years to supplement direct measurements. Hydrodynamic models are now well developed and tested, and there are now numerous published models of pathogen dynamics in aquatic systems. These various information sources are discussed in this section.

5.1. System characterisation and baseline monitoring

Catchment: In most cases, the quality of water and public health risks that may exist within a lake, river or reservoir used for drinking water will reflect land-use practices and their distribution within the surrounding catchment. Any risk determination or control measure assessment must therefore give careful consideration to the nature of the activities within the catchment and identify key sources of contaminants or contaminant precursors, and whether they are point-source or diffuse in nature. This analysis need not be complex and may simply involve plotting topography, land-use type, vegetation and soil type within a Geographical Information System (GIS), and assessing them within the context of the stream and drainage network and proximity to the water supply. The distribution of land-use activities that provide large quantities of contaminants will vary widely depending on the site-specific catchment properties, and it is therefore not possible to associate generalised risk profiles with particular land-use activities. Using the case of Cryptosporidium within areas used for dairying as an example, animal husbandry practices vary from farm to farm, and how these activities are managed relative to the stream will largely determine the downstream pathogen numbers. Nonetheless, this information is useful for identifying potential contaminant 'hot-spots'. Areas of urbanisation are generally major sources of diffuse inputs and are easily identifiable.
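Such a first-pass GIS screen is straightforward to script. The sketch below, using the open-source geopandas library, intersects a land-use layer with a buffered stream network to rank land-use classes by their area within the riparian corridor; the file names, the 100 m buffer width and the 'class' attribute are hypothetical:

```python
import geopandas as gpd

# Hypothetical input layers; any projected CRS in metres will do.
landuse = gpd.read_file("catchment_landuse.shp")   # assumed 'class' column
streams = gpd.read_file("stream_network.shp")

# 100 m riparian corridor around the stream network.
corridor = gpd.GeoDataFrame(geometry=streams.buffer(100), crs=streams.crs)

# Land-use area (ha) falling inside the corridor: a first-pass screen
# for likely contaminant 'hot-spots' close to the drainage network.
near_stream = gpd.overlay(landuse, corridor, how="intersection")
summary = (near_stream.assign(area_ha=near_stream.geometry.area / 1e4)
           .groupby("class")["area_ha"].sum()
           .sort_values(ascending=False))
print(summary)
```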

Other attributes that should not be overlooked include catchment vegetation distribution, since this influences contaminant attenuation, runoff quality and concentrations of natural organic matter and suspended material. More specific indicators such as riparian integrity can also be considered, as it is well documented that this correlates with turbidity [43] and pathogen attenuation [44-45]. When considering catchment properties that may contribute to downstream contaminant loading, it is important not only to look for potential sources of contaminants, but also to search for opportunities to implement management measures. Preventing contaminant loading by reducing the source may in fact be a more cost-effective and sustainable solution than relying purely on engineering interventions or avoidance procedures in downstream locations [42].

Rivers & Reservoirs: For most situations it is critical to understand the nature of inflows entering a river reach, lake or reservoir. They are of direct relevance from a health point of view as they are the mechanism for seeding downstream water bodies with pathogens washed from the catchment. The relationship between flow and pathogens is well established, and different phases of the hydrograph may be identified from a risk perspective [46]. For example, the first-flush concept highlights that risk is concentrated around times of large rainfall events and is therefore linked with the stochastic nature of climatic drivers for the location of interest. As pathogen concentrations vary with flow stage, so too do factors such as suspended particles, dissolved organic carbon and other variables such as predatory microorganisms. Highly turbid water is effective at binding bacteria and viruses [33], but may be ineffective at removing protozoan (oo)cysts. Dissolved organic carbon is also critical in attenuating UV light, which is an important mechanism for inducing mortality in a range of microbial organisms.

In addition to the river flow rate measurement (or stage height, should a suitable height-flow rating relationship exist), it is essential to regularly measure the temperature and salinity (or electrical conductivity) of the inflow so that its density can be characterised for later comparison with the stratification profile present at downstream stations. Depending on the nature of the site, the characteristic concentration of suspended particles in the inflow water may also be insightful, since highly turbid waters not only influence sedimentation, but can also impact the density and ultimate fate of the inflow water [25]. This may be easily measured using an optical turbidity probe, or ideally an in situ particle analysis instrument (e.g., [47]). Developing a relationship between the particle size distribution and the optical turbidity signal can also provide further information about the dynamics of suspended sediment [25].

In stratified water bodies such as estuaries and deep lakes or reservoirs, the fate of inflowing water is largely determined by the stratification profile and the domain morphology. The vertical structure of density, and how it varies seasonally, will ultimately determine whether the inflow water will flow along the surface, along the bottom thalweg of the site, or at some level in between (Figure 5). It will also control the travel time and the level of entrainment that the inflow water experiences within the ambient water profile. The environmental conditions the pathogens experience will also vary accordingly, and this may determine whether or not the site successfully attenuates pathogens [23]. To best understand the vertical structure it is essential to have vertically resolved information on temperature (and salinity where relevant), and ideally a thermistor chain with a surface meteorological station to understand the dynamics of wind mixing and surface heat fluxes.

The information detailed above permits the development of a clear picture of how the system interfaces with the surrounding catchment and the key controls on transport processes. This can be improved by collection of water quality samples that provide specific information on microbiological concentrations in the water. They should generally be collected upstream and at various locations along the waterbody. Given that they are more resource intensive (in both cost and time) than typical environmental measurements, it is essential they are targeted to specific points of concern and the monitoring program is designed within the context of the transport dynamics.

Figure 5.

Schematic of inflow scenarios that may be observed entering a lake or reservoir, illustrating the surface overflow, interflow and underflow.

5.2. Real-time sensors

Ad hoc monitoring, such as that described above, will often miss 'events' that occur on timescales shorter than the monitoring interval, unless sampling is specifically programmed to coincide with the event. Generally, the timescales of horizontal transport in rivers and lakes during flood events are significantly shorter than the routinely implemented monitoring frequencies. Remotely deployable sensors are therefore attractive to supplement other sources of information. Logistical challenges may also make deployable sensors better value as they save travel and manpower expenses, although maintenance and regular calibration are essential for these to be reliable and useful.

Instrumentation used to collect in situ data is available from numerous sources. Real-time sensors for meteorology, temperature, conductivity, turbidity and chlorophyll-a are commercially available from numerous vendors to provide high temporal resolution data which, in many cases, may be transmitted automatically to managers via telemetry. Such instrumentation is already being used widely across the globe to support decision-making activities. Advances in sensor technology continue, and sensors are now available for examining in situ concentrations of nutrients, metals and other contaminants, although they are yet to be widely adopted. As these become more widely available, practitioners will be able to take advantage of high-resolution time-series of many physical and chemical constituents. Even without the implementation of more advanced sensors, real-time information on core environmental properties (temperature, salinity, oxygen and turbidity) can provide considerable insight into the dynamics that will ultimately control pathogen fate and transport.

5.3. Strategic data collection

Strategic sampling may be undertaken to improve process understanding about the dynamics of the system. For example, routine sampling may not be able to provide sufficient information on hydrodynamic or biogeochemical processes at a scale required to truly assess the risk of a threat developing. Commonly, site-specific experimental data is collected to supplement other information on items such as the particle size distribution of suspended sediment, fine to medium scale horizontal and vertical mixing processes, and important ecological processes.

It is often the case that routine monitoring programs do not accurately portray the actual risk, and strategic data can help to build knowledge about a system and improve management practices. As an example, the reservoir shown in Figure 5 would typically be monitored by the managing water utility by collecting microbial samples at the inflow and at the surface near the centre of the basin. Although noticeable inflow concentrations would be recorded, surface grab samples from the main body of the lake would be near the minimum level of detection, and it would therefore be assumed the reservoir had sufficiently attenuated the inflowing load. Experimental studies of pathogen transport [21] showed that in such a case the reservoir may in fact have attenuated little of the incoming load, and that significant concentrations existed below the surface layer [48]. Furthermore, most microbial monitoring programs are based on regular, often weekly, sampling. Although this is acceptable where there is a constant pollution source, for lakes and reservoirs fed by rivers the highest risk is from large runoff events. It is therefore also recommended that event-based monitoring be implemented. Even for large reservoirs that have considerable residence times, it can take just a few days for contaminated inflow water to reach the extraction point (e.g., [20,23,49]).
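Event-based monitoring can be implemented as a simple trigger on the gauged hydrograph. The sketch below tightens a routine weekly sampling interval to six-hourly sampling whenever flow exceeds an event threshold; both the threshold and the intervals are placeholders to be set from the site's flow statistics:

```python
def sampling_interval_hours(flow_m3s, base_interval=168.0,
                            event_threshold=5.0, event_interval=6.0):
    """Return the microbial sampling interval: weekly (168 h) routine
    monitoring, tightened to 6-hourly whenever river flow exceeds an
    event threshold (m^3/s). All values are site-specific placeholders."""
    return event_interval if flow_m3s >= event_threshold else base_interval

# A gauged hydrograph triggers event sampling only around the flood peak.
for q in (0.8, 1.2, 12.5, 7.0, 2.1):
    print(f"flow {q:5.1f} m3/s -> sample every {sampling_interval_hours(q):5.0f} h")
```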

Often models (discussed below) can be used to assist in the design of an effective and targeted monitoring programme that reflects the dynamics of the system and can more accurately portray the risk. Sophisticated monitoring programmes also include capacity to logically adapt the monitoring regime as a particular threat is observed to develop. This approach acknowledges that we are more interested in the spatial and temporal variability in the contaminant of concern as the probability of exposure increases, and accordingly the monitoring effort intensifies to ensure the actual risk is being portrayed accurately.

5.4. Surrogate measurements for indicating threats to health

Often it is not possible or practical to directly detect the presence of a certain contaminant within a water body, and so surrogates or other indirect indicators of the dynamics may be defined. In particular, the complexity, cost and time constraints associated with the direct enumeration of pathogens to identify their distribution in large water bodies frequently limit the ability of water managers to detect the intrusion of poorer quality inflows into the aquatic environment. As a result, there is considerable discussion in regulatory organizations and water utilities as to the value of using surrogates, such as microbial indicator organisms or even physical properties such as turbidity, as a way of detecting the presence of microbial contamination and hence the threat of actual pathogenic organisms. Microbial indicators have the advantage of being low-risk, present in high concentrations relative to other organisms of concern, and simple and cheap to enumerate.

The most widely used indicator organisms of microbial pollution are the enteric coliform bacteria, which are gram-negative bacilli that belong to the family Enterobacteriaceae (e.g. Klebsiella spp., Enterobacter spp., Citrobacter spp., Escherichia coli). Specific coliform measurements include total coliforms, faecal coliforms, and in particular the specific organism E. coli [50]. The latter two are the most common since they are abundant in the faeces of humans and other warm-blooded animals, and are hence thought to be a reliable indicator of faecal pollution. Total coliforms are used less frequently since they include organisms from soil and cold-blooded animals. Except for certain strains of pathogenic E. coli (e.g. O157), coliform bacteria are not a threat to human health, but their high abundance means that they are easy to detect, thereby alerting regulatory authorities to pollution events that may contain other organisms of concern. Other routinely used indicator bacteria include the gram-positive cocci, including Enterococci and faecal streptococci. However, it is now apparent that these bacterial indicators are not suitable for assessing the risk posed by protozoan pathogens and some enteric viruses [21,51-52].

Various bacteriophages are used as index organisms for enteric viruses [53-56]. The single-stranded F-specific RNA (F+ RNA) bacteriophages (e.g. strains MS-2, F2 and Q beta) and the double-stranded somatic coliphages (e.g. strains T2, T7 and ϕX174) are routinely measured in fresh and coastal waters. However, faecal bacteriophages are not always suitable index organisms since they are present in a range of animal as well as human faeces, whereas human enteric viruses only originate from human faeces. There have also been reports of human enteric viruses being detected in waters in the absence of bacteriophages [57]. The rationale for their use as model organisms is based on their similar size and morphology, along with the low cost, ease and speed of detection compared to human enteric virus assays. The ideal host bacteria would be of human faecal origin only, consistently present in sewage in sufficient numbers for detection, and only lysed by phages that do not replicate in another host or the environment. While bacteriophages to Bacteroides fragilis strain HSB40 appear to be human specific and do not replicate in the environment [58], their phage numbers are too low for general use. Due to their high abundance, studies have focused on the coliphage systems, including the double and single-stranded DNA and RNA-containing phages listed above, and for a range of bacterial hosts. The F+ RNA coliphages attach to the sides of the bacterial pili that only occur on exponentially growing specific (F+) strains of E. coli or an engineered Salmonella typhimurium (strain WG49), and are therefore the current models of choice [53,57].

The use of spores of the gram-positive bacillus Clostridium perfringens has been suggested as a good indicator of human faecal contamination that may correlate with human parasitic protozoa and enteric viruses [21,58-59]. However, two confounding factors must be considered: first, C. perfringens spores are very persistent [35], and second, they may be excreted by various animals [60]. Hence, they may show little relationship with parasitic protozoa in animal-impacted raw waters, and could be misleading about the likely presence of infective human viruses.

Particle counting and turbidity levels have also been identified as potential surrogates of microbial pollution and weak epidemiological evidence exists that suggests waterborne illness from drinking water may be associated with the raw water turbidity [61]. The use of turbidity alone to predict pathogen presence is difficult because turbidity is dependent on a range of processes that are independent of pathogen presence. For example, it is well established that many young calves are infected with Cryptosporidium [62], however, calving is timed to coincide with the period when feed is abundant and cows are on a rising plane of nutrition. Consequently, calving and high oocyst numbers occur when catchments are well vegetated, yet this is typically when turbidity is low. Additionally, surrogates such as turbidity are influenced by catchment specific factors such as soil-type distribution and non-grazing land-use such as horticulture that do not correlate with pathogen input. In shallow systems, turbidity may be caused through resuspension of sediment during high wind events or strong currents, and therefore may exist unrelated to any catchment or wastewater discharges. Nonetheless, turbidity is a readily measurable parameter that warrants investigation as a potential early warning mechanism of increased risk.

While no single water quality indicator can reliably assess the bacterial, protozoan and viral contamination of aquatic environments in all circumstances, it is feasible that a suite of surrogates may be identified that will estimate levels of microbial contamination within defined circumstances, such as within a storage reservoir with well characterized inputs [21]. To understand how they relate to each other, it is necessary to develop a process-based understanding of surrogate organisms in order to develop a model of their behaviour and assess their dynamics relative to their pathogenic counterparts [16].

5.5. Role of numerical models

Although the processes influencing enteric organism fate and distribution are fairly well established, much uncertainty remains as to the relative importance of each process on a system-wide scale, and the spatial and temporal variability that is present. Furthermore, a detailed understanding of how pathogen dynamics vary between systems, which may differ in their loading, salinity, temperature and trophic status, remains elusive. As a result, ad hoc monitoring routines are often employed that rarely give an indication of the true risk. Numerical models are attractive since they can integrate the myriad of interacting and non-linear processes and place them within a system-wide context.

The use of numerical models to augment existing monitoring and risk-management activities is becoming increasingly widespread, since they are able to highlight the dominant processes controlling organism dynamics, and can be used to fill knowledge gaps, test catchment management scenarios or examine engineering interventions. Several models for simulating different components of microbial pollution have been reported in the literature; they range in sophistication and are relevant to different surface water environments, including freshwater lakes and reservoirs [16,63-64], streams and rivers [65], and estuaries and coastal lagoons [66-68].

Models are used by a range of organizations for a variety of applications:

  • as a scientific tool to explore the dominant processes within a given system – for managers interested in understanding the spatial and temporal variability in the dynamics that control enteric organism behaviour, and conducting pathogen budgets and exploring sensitivities;

  • to guide the design of targeted monitoring programs – the model can be run to provide information about expected transport and kinetic controls to ensure that the sampling locations and frequency are focused on the areas that present the largest risk;

  • to quantify differences between species – the model can be used to ‘correct’ the observed microbial indicator organism data so that the true risk by actual pathogenic organisms can be quantified;

  • to quantify the impact of proposed management scenarios – scenarios such as catchment remediation, climate change and engineering interventions can be compared to the base case system as part of a cost-benefit analysis prior to any remedial action;

  • to support real-time decision-making – the model can be used to provide now- and fore-casts of conditions within an aquatic system to enable managers to alter pumping regimes or issue recreational closures.

Models for assisting with the understanding of contaminant dynamics within a system range from simple web-based tools to full three-dimensional (3D) hydrodynamic-water quality models. To demonstrate their ability to describe pathogen transport dynamics, here we focus on a case study of a medium-sized Australian reservoir, Myponga Reservoir in South Australia, where pathogen transport dynamics have been measured and modelled using various approaches. The model developed to simulate pathogen kinetics is shown in Figure 6.

INFLOW: This is a simple web-based tool written in JavaScript (http://www.cwr.uwa.edu.au/) developed by [23] that estimates the entrainment experienced by a riverine intrusion as it enters and progresses through a standing body of water (e.g. lake or reservoir), along with the final insertion depth and inflow thickness. The model simply requires the inflow parameters, including temperature, salinity and channel parameters (e.g. bed slope, roughness), and reservoir information such as the vertical temperature and salinity profiles. This information is used to estimate whether the inflow will act as an overflow, interflow or underflow (Figure 5), the approximate entrainment rate (and hence the dilution), and the timescale for transport and insertion. In addition, since the model is able to estimate the time-scale of transport, a simplified loss-model (a simplified version of Figure 6) is included that estimates the loss due to settling and inactivation. Therefore, with relatively simple input information, the user is able to assess the reduction of pathogen concentrations within the reservoir by dilution, settling and inactivation. This tool is only applicable where the lake is weakly forced at the surface, and it does not indicate what happens to the contaminated inflow water once it has inserted. However, the simplicity of knowing whether the water will travel at the surface, at mid-depths or at depth, together with an approximate dilution factor, makes it a surprisingly powerful management tool.
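The kind of back-of-envelope estimate such a tool provides can be sketched in a few lines. The function below is not the INFLOW tool itself, but an illustration of the same logic for an underflow: dilution from entrainment-driven thickening of the gravity current, plus first-order inactivation and settling losses over the travel time. All coefficients are assumed for illustration only:

```python
import math

def underflow_risk_reduction(h0_m, distance_m, speed_ms,
                             entrain_coeff=1.5e-3,
                             k_inactivation_per_day=0.3,
                             settling_ms=1e-5):
    """Back-of-envelope estimate of the factors reducing pathogen
    concentration in a sinking inflow: dilution by entrainment,
    first-order inactivation, and settling. Illustrative only."""
    # Underflow thickens roughly linearly with distance as ambient
    # water is entrained; dilution scales with the thickness increase.
    h = h0_m + entrain_coeff * distance_m
    dilution = h / h0_m
    travel_days = distance_m / speed_ms / 86400.0
    survival = math.exp(-k_inactivation_per_day * travel_days)
    # Fraction lost to settling out of the underflow over the travel time.
    settled = min(1.0, settling_ms * travel_days * 86400.0 / h)
    fraction_remaining = survival * (1.0 - settled) / dilution
    return dilution, travel_days, fraction_remaining

# A 2 m thick underflow travelling 5 km at 0.1 m/s: ~5x dilution and
# ~0.5 day travel, leaving roughly 15-20% of the inflow concentration.
print(underflow_risk_reduction(h0_m=2.0, distance_m=5000.0, speed_ms=0.1))
```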

Figure 6.

Schematic indicating organism dynamics showing the major pools and kinetic processes that occur in response to sunlight (UV-B, UV-A and VIS), temperature (T), salinity (S), dissolved oxygen (DO), pH and available carbon (DOC) (modified from [16]).

CAEDYM: The Computational Aquatic Ecosystem Dynamics Model (CAEDYM) is a comprehensive water quality model available from the Centre for Water Research at The University of Western Australia [69], able to couple with a 1D lake stratification model (DYRESM) and a 3D hydrodynamic model (ELCOM). ELCOM is specifically designed for modelling circulation patterns in stratified lakes and reservoirs and can be easily run on a desktop computer. The pathogen sub-model includes inactivation due to different solar radiation bandwidths (including UV), natural mortality, sedimentation and aggregation onto suspended particles, and resuspension, as described in [16]. ELCOM-CAEDYM is well suited to applications where topographic effects may be important or where lake circulation dominates over inflow forcing. For example, if the source is a recreational area or a lakeside canal estate, then three-dimensional effects may be important in determining the mixing of the contaminated water around the lake. Similarly, if the source is a discrete effluent discharge, then the prevailing hydrodynamics will need to be well understood in order to accurately characterize the risk of the contaminated water reaching an off-take or recreational area. As an example, the application to Myponga Reservoir (Figure 7) describes a contaminated inflow entering from a side tributary before it begins to interact with a large peninsula and migrate into the main basin. Output from the model highlights the strong vertical gradients that result. Hipsey et al. [16] used this platform to further investigate the controls on E. coli dynamics over a range of time-scales for different-sized reservoirs, and showed that a different combination of transport and kinetic processes could ultimately shape the response of the microbial concentrations observed at the offtake.

Figure 7.

ELCOM-CAEDYM Cryptosporidium concentrations (oocysts/10L) presented as a slice through Myponga Reservoir (bottom-right, colour scale reflects oocyst concentration), South Australia (see inset), following a large runoff event, and highlighting Cryptosporidium oocyst concentrations as a function of time for three depths near the offtake (left).

DYRESM: DYRESM is a 1D hydrodynamic model that has been shown to accurately capture the temperature and salinity dynamics of large and small lakes and reservoirs [70]. It accommodates horizontal motions caused by inflows and outflows, in addition to a Lagrangian vertical mixing model and a surface thermodynamics module. CAEDYM couples with DYRESM and includes the same detailed microbial sub-model as outlined in Figure 6. The horizontal averaging used in DYRESM-CAEDYM significantly improves computational efficiency and means that seasonal or even multi-decadal simulations can be performed with reasonable run-times. This model is therefore well suited to applications that look at the long-term impact of, for example, different watershed management or climate change scenarios, and its ease of use makes it particularly useful to reservoir and lake managers.

GLM-FABM: The General Lake Model (GLM) is coupled with the open-source ecological model FABM as described in [71]. GLM is similar to DYRESM but based on [72], and a simplified version of the pathogen kinetic model [16] is implemented within FABM. Output from the model as applied to Myponga Reservoir for 2003 (Figure 8) demonstrates how the incoming oocyst load manifests in the water column concentrations and highlights the temporal and vertical variability that may be expected. The figure indicates the variability seen in the off-take Cryptosporidium concentrations at three different depths, and demonstrates the potential benefits of selective withdrawal and adaptive reservoir management in minimizing the potential risks.

Figure 8.

Time-series of A) viable inflow oocyst load as estimated from data; B) simulated concentrations of viable oocysts at the dam wall for three different depths, and C) the viable oocyst concentrations (oocysts/10L) throughout the water column as simulated by GLM-FABM.

6. Control measures and considerations to minimise exposure

There are a range of control measures that can be implemented to mitigate the risk of contaminants passing through to delivery systems. Here we provide a broad overview of the different approaches and their relevance to different contamination issues. The different control methods can be characterised based on the nature of the intervention:

  • reducing the delivery of contamination to the exposure point (water storage);

  • improved attenuation of the contaminant within the storage;

  • and optimal extraction of water to reduce exposure.

The methods around these are discussed next, including a discussion on operational monitoring and the potential for real-time management of water bodies for risk minimization.

6.1. Catchment management

Most contamination within drinking water storages is linked to the surrounding catchment land-use. The contamination can either originate from the catchment and enter the tributaries to the storage, or a precursor of the contamination may originate within the catchment that is later transformed to the end product that constitutes a health risk. There is therefore much scope to reduce contamination within drinking waters through strategic assessment and management of catchment land-use and the condition of tributary streams.

For Cryptosporidium and other pathogens the biggest threat is when heavy rains wash viable cells from agricultural catchments within the surrounding river basin into the floodwaters that feed the reservoirs [73-74], or in some cases when effluent from wastewater treatment plants is discharged directly into upstream watercourses [75-76]. For many reservoirs, prevention of the contaminant source from entering the hydrological network is the ultimate method for reducing downstream health risks. In particular this involves improving agricultural practices, the best example being improvement of the methods used in animal husbandry. Farm-scale alleviation techniques such as improved drainage and runoff recharge may prevent or delay contaminated runoff from entering the stream network. At the sub-catchment scale, policies for suitable riparian management are also recommended, since it has been shown that suitable riparian buffers can filter contaminated farm water before it enters streams [46].

In urban environments, principles of Water Sensitive Urban Design (WSUD) are increasingly being adopted with numerous innovations in the design of stormwater management features including biofilters and constructed wetlands at critical catchment points [77].

6.2. Inflow manipulation

A significant critical control point for managing contaminant transfer into the main reservoir body is through strategic engineering based interventions where catchment tributaries meet a standing water body. Several flow management options exist here that can reduce the delivery of contaminants, including:

  • flow diversions – pathogen concentrations over the course of an inflow event are typically highest at the leading edge of the hydrograph peak, so there is potential for 'first-flush' flows to be diverted away from the water body under consideration. Once the initial peak in cell numbers has passed, the flow diversion can be removed. The disadvantage of this approach is the potential loss of valuable water; however, the diverted water could be considered as an environmental flow.

  • sedimentation basins, or 'pre-reservoirs' – can be used to slow incoming water and encourage sedimentation of particulates. The water that overflows the basin is usually of higher quality and passes into the main water body with a lower concentration. There are potential complications with this approach, particularly considering that oocysts have a long life in the sediment and a very high flow may mobilise previously sedimented organisms in a single event.

  • constructed wetlands – like pre-reservoirs, these are useful for slowing the inflowing water and enhancing attenuation (e.g., [78]), as mentioned above.

To assess the efficiency of any of these controls and the resulting loading reduction, it is a simple case of measuring contaminant levels in the water at the outlet relative to the influent water.
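In practice this comparison is best made on loads rather than concentrations alone, since flows at the inlet and outlet can differ. A minimal sketch, with assumed event-mean values:

```python
def load_reduction_efficiency(c_in, q_in, c_out, q_out):
    """Contaminant load reduction across a pre-reservoir, wetland or
    diversion: 1 - (outlet load / inlet load), with load = C * Q.
    Paired samples should span whole events, since instantaneous
    concentrations alone can mislead."""
    load_in = c_in * q_in
    load_out = c_out * q_out
    return 1.0 - load_out / load_in

# e.g. event-mean oocyst concentrations (per 10 L) and flows (m^3/s)
print(load_reduction_efficiency(c_in=120.0, q_in=3.0,
                                c_out=35.0, q_out=2.8))  # ~0.73 (73%)
```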

6.3. In-lake controls

Aside from manipulating the inflow concentrations (above) or the offtake strategy (below), there is some potential to manipulate concentrations using in-lake interventions. There are no standard methods for controlling the concentration of pathogens within standing waters; the interventions described here are more commonly applied for the management of nuisance cyanobacteria, for example, but also offer potential to support management of high pathogen loads. The most common method for managing stratified waters is destratification, generally achieved through the introduction of compressed air at the lake bottom. The rising air-water mixture, with its lowered specific weight, forms a rising water curtain that destroys stratification. Similarly, impellers have also been suggested as a method to disperse contaminated water concentrated within critical areas to regions where it is of less concern. They operate by enhancing the water exchange rate (potentially the littoral water exchange rate, depending upon the flushing technique and the location of the devices), which, if strong enough, can prevent the formation of high concentration patches. Some flushing devices also increase vertical mixing, and this may impact on the incident light received by organisms.

Where hypolimnetic contaminants become unmanageable, for example following a large, dense flood intrusion, there have historically been examples of hypolimnetic release or purging to remove these from the system. In natural lakes this can be done with pipes known as Olszewski tubes [79], and in reservoirs by selecting an appropriate height of the discharge on the dam. However, this potentially involves losing valuable water resources and may also lead to enhanced downstream contamination.

Morillo et al. [80] numerically demonstrated the use of a suspended geo-textile curtain to manipulate the flow path of contaminated inflow water, thereby increasing the time available for natural attenuation processes to occur. These curtains change the internal flow-paths by compartmentalisation, and either slow down or redirect highly contaminated inflow pulses. They may also help address other contamination concerns, since they can have an impact on the nutrient distribution and the residence time of the lake [80-81]. In addition to controlling the path of inflowing water, they constitute a barrier to internal waves, and thus can prevent enhanced mixing due to seiching and subsequent sediment resuspension.

However, such measures are logistically difficult and may incur considerable cost, and for pathogens in particular, management of the offtake location usually remains a more practical and effective means to reduce risk.

6.4. Adapting the location of drinking-water offtakes and bathing sites

The vertical variability in reservoir water quality can be exploited to select an offtake depth with the lowest contaminant concentrations. Selective withdrawal is a widely used method for controlling the transmission of contaminants downstream of surface storages. It is mostly applicable to reservoirs with sufficient depth and an offtake structure capable of drawing water from multiple depths.

The simple INFLOW model described above enables prediction of the depth at which the riverine inflow will occur and the anticipated dilution as it travels through the reservoir. Consideration should also be given to the time of transport, and appropriate monitoring is necessary to ensure the stratification and intrusion type are known to the operator. The speed at which an inflow travels through a lake, the degree of entrainment of ambient lake water, the dilution, and the insertion depth are all important in determining the distribution of pathogens in lakes and reservoirs. Consequently it is important to know the depth of the riverine intrusion so water can be harvested outside of this intrusion.
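Operationally, selective withdrawal reduces to choosing, from the available offtake depths, the one with the lowest predicted concentration that lies outside the intrusion. The sketch below illustrates this decision with a hypothetical concentration profile; in practice the profile would come from monitoring or model output such as Figure 8:

```python
def best_offtake(profile, available_depths, intrusion_range=None):
    """Choose the available offtake depth with the lowest predicted
    pathogen concentration, excluding depths inside a known inflow
    intrusion. `profile` maps depth (m) -> concentration (per 10 L)."""
    candidates = []
    for d in available_depths:
        if intrusion_range and intrusion_range[0] <= d <= intrusion_range[1]:
            continue  # harvest outside the riverine intrusion
        candidates.append((profile[d], d))
    if not candidates:
        raise ValueError("no offtake depth available outside the intrusion")
    conc, depth = min(candidates)
    return depth, conc

# Hypothetical profile with an intrusion centred at mid-depth
profile = {5: 2.0, 15: 48.0, 25: 6.0}
print(best_offtake(profile, [5, 15, 25], intrusion_range=(10, 20)))
# -> (5, 2.0): draw from the surface offtake during this event
```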

Similarly, where bathing sites are situated around the system of interest, the best solution is temporary closure upon detection of high concentrations in the monitoring data or from the model predictions.

6.5. Real-time management systems

Given the complexity of managing environmental waters it is not surprising that there has been a considerable proliferation of decision support systems (DSS) in the water resources sector over the past several decades. In general, DSSs integrate databases, models, and data visualization tools through a user-friendly interface. The complexity of the models and the databases varies greatly depending on the intended use. A distinction exists between 'real-time' DSSs that provide advanced warning of deleterious impacts and 'non-real-time' DSSs that serve as planning and operational development tools. Real-time flood prediction DSSs have emerged as commonplace technology, providing advance warning to save life and property. The development of real-time systems for water quality concerns has been less common, but is increasing as data acquisition systems and the associated cyber-infrastructure improve.

The purpose of the DSS is to accurately predict the fate and transport of floods and contaminant dynamics. Because these types of incidents have high spatial and temporal variability in inland standing waters, complex three-dimensional simulations are often also implemented. In this case the DSS performs all of the tasks to maintain simulations of the current conditions in an automated manner so that when incidents occur, the system provides useful predictions to aid in mitigation measures (Figure 9).
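Functionally, the automated core of such a DSS is a recurring assimilate-forecast-alert cycle. The sketch below shows the control flow only; the data-fetching, model and notification components are placeholders for whatever site-specific systems are in place:

```python
def run_dss_cycle(get_observations, run_model, thresholds, notify):
    """One automated DSS cycle: ingest the latest sensor data, advance
    the model to produce a forecast, then compare forecast peak
    concentrations at each control point against alert thresholds.
    All four arguments are placeholders for site-specific components."""
    obs = get_observations()          # telemetry: flows, T, turbidity, ...
    forecast = run_model(obs)         # {control_point: peak_concentration}
    for point, peak in forecast.items():
        if peak > thresholds.get(point, float("inf")):
            notify(f"ALERT: forecast {peak:.1f} at {point} "
                   f"exceeds threshold {thresholds[point]:.1f}")
    return forecast

# A scheduler would invoke this every few hours, e.g.:
# import time
# while True:
#     run_dss_cycle(fetch_telemetry, forecast_reservoir, LIMITS, send_sms)
#     time.sleep(6 * 3600)
```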

7. Conclusions

It is apparent that managing contaminant risk to drinking water supply requires a broad management framework that encompasses the entire system, from source to exposure. Reducing the corporate and human health risk associated with these contaminants requires a realistic assessment of the overall contaminant risk, which can only be achieved with an understanding of the critical variables controlling contaminant fate and distribution once they enter the aquatic environment. Whilst it is near impossible to reduce the pathogen risk to zero, it is possible to manage the load of pathogens entering the water body, to understand and predict where they go within a drinking water supply reservoir, and to manage the withdrawal of water to ensure the best quality water is treated and distributed for potable supply. Sanitation and water treatment have dramatically reduced the burden of waterborne disease on the human population. However, the water industry cannot afford to be complacent or to neglect risk management strategies for contaminants in drinking water supply catchments and reservoirs. Failure in this regard carries significant cost, can lead to outbreaks and, at worst, can cost human lives [82]. Tragically, waterborne disease still takes an enormous toll in developing countries, but with implementation of some of the simple technologies and approaches presented here, an integrated risk management framework, and in combination with treatment and disinfection, this toll can be reduced.

Figure 9.

Framework for a Decision Support System (DSS) that has inputs from a range of monitoring programs which inform and input to hydrodynamic and management models to visually represent predictions of risk, and report exceedance of contaminant thresholds and send warnings to managers.

© 2013 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution 3.0 License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
