Chlorine vs. Chloramine: A Tale Of Two Chemistries

Source: Swan Analytical USA

In drinking water treatment’s ongoing battle between disinfection and disinfection byproducts (DBPs), most water utility customers are oblivious to the process. One thing they do notice, however, is when their water smells or tastes bad. Here are some insights that can help water treatment plant (WTP) operators deal with their internal concerns about DBPs and residual chlorine or ammonia levels, as well as their external concerns about customer perceptions of water quality.

Choosing Between Two Approaches

Maintaining a delicate balance among multiple concerns—fluctuating source water contaminants, water purity, DBPs, and residual levels of free chlorine or monochloramine—is what keeps WTP operators awake at night.

On one hand, chlorination has historically dominated water treatment, providing stronger upfront purification, although chlorine residuals can dissipate relatively quickly within the distribution system. On the other, chlorination carries a risk of generating undesirable DBPs if natural organic matter (NOM) and organism levels in the source water are high.

According to Randy Turner, technical director at Swan Analytical USA, Inc., “As a result of the U.S. EPA’s Stage 2 regulations, water utilities need to survey their systems at multiple points. If any one point is above the regulatory limit of 0.08 mg/L for total trihalomethanes (TTHM) or 0.06 mg/L for haloacetic acids (HAA), the utilities need to take action to reduce them. As a result, approximately one-quarter of U.S. municipal water plants currently chloraminate to reduce disinfection byproducts.”

Chloramination is only about one-quarter as effective as chlorination at upfront purification of organic contaminants, but it avoids DBP formation and maintains a stable residual of monochloramine for continued protection throughout the distribution system.

Assessing Residual Chlorine

The Centers for Disease Control and Prevention (CDC) cites the importance of chlorine residual in drinking water as an indication that:

  • “a sufficient amount of chlorine was added initially to the water to inactivate the bacteria and some viruses that cause diarrheal disease;” and,
  • “the water is protected from recontamination during storage.”

To ensure those levels of protection, EPA Method 334.0 (Determination of Residual Chlorine in Drinking Water Using an On-line Chlorine Analyzer) targets measurement of residual chlorine—free chlorine or total chlorine—in drinking water distribution systems; it is designed for daily monitoring when chlorine residuals range from 0.2 mg/L to 4.0 mg/L. According to that document, “Amperometric titration or N,N Diethyl-p-phenylenediamine (DPD) colorimetric methods are the most commonly used approved grab sample methods.”
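As a sketch of how Method 334.0's working range might be enforced in an operator's own monitoring scripts (the 0.2 mg/L to 4.0 mg/L bounds come from the text above; the function name and structure are illustrative assumptions, not part of the method):

```python
# Illustrative check that an online analyzer reading falls within the
# 0.2-4.0 mg/L range EPA Method 334.0 targets for daily monitoring.

METHOD_334_MIN_MG_L = 0.2
METHOD_334_MAX_MG_L = 4.0

def residual_in_range(residual_mg_l: float) -> bool:
    """Return True if a chlorine residual reading is inside the method's range."""
    return METHOD_334_MIN_MG_L <= residual_mg_l <= METHOD_334_MAX_MG_L
```

For example, `residual_in_range(1.2)` returns `True`, while a near-zero reading such as `residual_in_range(0.05)` returns `False` and would warrant follow-up.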

Better Chloramination Choices Start With Better Measurement

Choosing the best chloramination strategy starts with having a good sense of water conditions throughout the treatment process. There are two main measurement approaches:

Amperometric method: Measures monochloramine and total ammonia; subtracting the ammonia bound in monochloramine from total ammonia yields the free ammonia present, which is typically minimized to prevent nitrification risks.
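The subtraction above can be sketched numerically. Monochloramine is conventionally reported as mg/L Cl2 and ammonia as mg/L N, so a molar-mass conversion (14.007 g/mol N per 70.906 g/mol Cl2 in NH2Cl) is needed before subtracting; the function below is an illustrative sketch under that assumption, not a vendor algorithm:

```python
# Free ammonia from amperometric monochloramine + total ammonia readings.
# Monochloramine is reported as mg/L Cl2 and ammonia as mg/L N, so the
# monochloramine reading is converted to its nitrogen content first.

N_PER_CL2 = 14.007 / 70.906  # g of N per g of Cl2-equivalent in NH2Cl

def free_ammonia_n(total_ammonia_n: float, monochloramine_cl2: float) -> float:
    """Free ammonia (mg/L as N) = total ammonia - ammonia bound in NH2Cl."""
    bound_n = monochloramine_cl2 * N_PER_CL2
    return max(0.0, total_ammonia_n - bound_n)
```

For example, with 0.50 mg/L total ammonia as N and 1.8 mg/L monochloramine as Cl2, roughly 0.14 mg/L of free ammonia remains available to feed nitrifying bacteria.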

DPD-type analyzers: Provide sequential measurements for chlorine species, enabling accurate calculation of monochloramine, combined chlorine, and dichloramine.


Figure 1. Sequential outputs from a DPD analyzer provide specific chlorine/chloramine component values. (Source: Swan Analytical USA, Inc.)

These discrete values allow WTP operators to tailor chloramination control strategies effectively, minimizing free ammonia in the distribution system and preventing unwanted nitrification or elevated dichloramines that cause taste and odor issues.
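A control strategy along these lines might translate into a simple alarm check on the measured values. The thresholds below are illustrative placeholders only (real setpoints are plant-specific), and the function is an assumption for the sketch, not a described product feature:

```python
# Flag chloramination upset conditions from analyzer outputs.
# Thresholds are illustrative placeholders; actual setpoints vary by plant.

FREE_AMMONIA_ALARM_N = 0.10    # mg/L as N: excess free ammonia favors nitrification
DICHLORAMINE_ALARM_CL2 = 0.30  # mg/L as Cl2: excess dichloramine causes taste/odor

def upset_flags(free_ammonia_n: float, dichloramine_cl2: float) -> list[str]:
    """Return human-readable flags for out-of-band chloramination values."""
    flags = []
    if free_ammonia_n > FREE_AMMONIA_ALARM_N:
        flags.append("free ammonia high: nitrification risk")
    if dichloramine_cl2 > DICHLORAMINE_ALARM_CL2:
        flags.append("dichloramine high: taste and odor risk")
    return flags
```

An empty list means both values are in band; any returned flag would prompt the kind of corrective action shown in Figure 2.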


Figure 2. Automated monitoring helps detect and correct upset events quickly. (Source: Swan Analytical USA, Inc.)