October 22, 2018
Surrogate Calibration In Environmental Air Monitoring? No Thanks
By Michael D. Shaw
What, you might ask, is surrogate calibration? Let’s start with some basics. Most scholarly works on calibration first invoke the notion of a comparison. Thus, as materials testing company Instron puts it: “Calibration is the process of comparing an unknown value to a known value. To calibrate a device is to compare a characteristic of that device with the characteristic of a similar device called a ‘standard.’”
The Automation, Systems, and Instrumentation Dictionary defines calibration as “a test during which known values of measurand are applied to the transducer and corresponding output readings are recorded under specified conditions.”
In the environmental field, there is great interest in measuring the concentration of air pollutants. Standards are set by regulatory agencies specifying the maximum allowable levels of many hazardous compounds. In most cases, these concentration measurements are performed with instruments. This type of analysis is referred to as a “relative method,” since the measurement of the analyte must be compared to measurements of additional samples that are prepared with the use of analyte standards.
In contrast, an “absolute method” would give the analyte concentration from a direct measurement of the sample. No additional measurements are required (other than sample mass or volume). Absolute methods are not common in air pollution studies.
As such, these air monitoring instruments—being relative methods—must be calibrated with a known standard. Fortunately, many calibration gas mixtures are available from commercial sources, in compressed gas cylinders. Typically, such calibration gases are specified as consisting of a particular concentration of a “component” (the analyte of interest) and the “balance gas” (most often air or nitrogen).
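When a cylinder standard is more concentrated than the level needed for a span check, it is commonly blended with clean balance gas by flow-weighted dilution. The sketch below is a hypothetical illustration of that arithmetic; the compound, concentrations, and flows are invented for the example, not taken from the article.

```python
# Hypothetical example: diluting a cylinder calibration gas with clean
# balance gas to reach a lower working concentration. Only the ratio of
# the two flows matters, so any consistent flow unit (e.g. mL/min) works.

def diluted_concentration_ppm(cylinder_ppm, cylinder_flow, dilution_flow):
    """Concentration after blending a cylinder standard with dilution gas."""
    total_flow = cylinder_flow + dilution_flow
    return cylinder_ppm * cylinder_flow / total_flow

# A 50 ppm cylinder blended 1:9 with clean air yields a 5 ppm test gas.
print(diluted_concentration_ppm(50.0, 100.0, 900.0))  # 5.0
```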
Other analytes require the use of so-called “permeation devices.” Permeation devices are small, inert capsules containing a pure chemical compound in a two-phase equilibrium between its gas phase and its liquid or solid phase. At a constant temperature, the device emits the compound through its permeable portion at a constant rate. Permeation devices are typically inserted into a carrier flow to generate test atmospheres for calibrating gas analyzer systems.
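Because the emission rate is constant at a fixed temperature, the concentration a permeation device delivers follows directly from that rate and the carrier flow. The sketch below applies the standard permeation-tube formula; the specific tube rate and flow are illustrative assumptions, not figures from the article.

```python
# Standard permeation-tube relationship:
#   C(ppm) = rate(ng/min) * (24.46 / MW) / flow(mL/min)
# where 24.46 L/mol is the molar volume of an ideal gas at 25 C and 1 atm,
# and MW is the compound's molar mass in g/mol.

def permeation_concentration_ppm(rate_ng_min, molar_mass_g_mol, flow_ml_min):
    """Concentration generated by a permeation tube in a carrier flow."""
    MOLAR_VOLUME_L_PER_MOL = 24.46  # at 25 C, 1 atm
    return rate_ng_min * (MOLAR_VOLUME_L_PER_MOL / molar_mass_g_mol) / flow_ml_min

# Illustrative numbers: an SO2 tube (MW 64.07 g/mol) emitting 500 ng/min
# into 1000 mL/min of clean air generates roughly 0.19 ppm.
print(round(permeation_concentration_ppm(500.0, 64.07, 1000.0), 3))
```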
Between gas blends in cylinders and permeation devices, standards can be generated for most pollutants. However, there are some analytes that do not lend themselves to either calibration modality, and other approaches must be used. At Interscan, our instruments for hydrogen bromide, hydrogen peroxide, and peracetic acid detection are calibrated in-house as follows:
1. A suitable aqueous solution of the analyte is prepared, and a gas bubbler apparatus is used to create a vapor stream of the compound. (This is the analyte “generator.”)
2. The stream is then subjected to wet chemical analysis, standardizing it to a known concentration value. This must be done each time a stream is set up.
3. This same stream can then be used to calibrate instruments—for the next few hours.
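Steps 2 and 3 amount to deriving a span factor: the wet chemical analysis assigns a concentration to the stream, and the instrument is adjusted so its reading matches. A minimal sketch of that arithmetic, with all numbers purely illustrative:

```python
# Hypothetical span calibration against a wet-chemistry-standardized stream.
# The "assigned" value comes from the wet chemical analysis of the stream;
# the "raw" value is the uncorrected sensor reading on that same stream.

def span_factor(assigned_ppm, raw_reading):
    """Correction factor mapping raw sensor output to true concentration."""
    return assigned_ppm / raw_reading

def calibrated_ppm(raw_reading, factor):
    """Apply the span factor to a subsequent raw reading."""
    return raw_reading * factor

# Wet chemistry says the stream is 1.20 ppm; the sensor reads 0.96 raw.
factor = span_factor(1.20, 0.96)      # 1.25
# A later raw reading of 0.80 then reports as 1.0 ppm.
print(calibrated_ppm(0.80, factor))
```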
Admittedly, this is more difficult and time-consuming than using cylinder gas or permeation devices, but it is necessary for the analytes mentioned—among others. Some people, though, don’t think it’s necessary. They believe that they have discovered an alternative solution, namely surrogate calibration.
The term “surrogate calibration” refers to a practice in instrument calibration whereby a standard different from the entity to be measured is utilized. For example, it might be discovered that sulfur dioxide produces a certain response on a hydrogen peroxide sensor. Since commercial standards for sulfur dioxide are readily available, and this “certain response” is known, it is posited that sulfur dioxide could be used as a surrogate to calibrate a hydrogen peroxide sensor.
However, there are serious problems with this strategy:
- As a first principle, there must originally have been a means to calibrate the hydrogen peroxide sensor that did not involve a surrogate. So, what was this technique, and why not just keep using it?
- The “certain response” of a surrogate should ideally be characterized over time, across the measuring range, and from sensor to sensor. Except, this hardly ever happens. Instead, a generalized number, which could be off by more than 20%, is invariably employed.
- Ironically, surrogate methods are often applied in measurement scenarios involving very low concentrations of hazardous chemicals. These are the very situations in which precise measurements are essential. If the regulatory limit is 1 part-per-million, you really need to be able to distinguish between 0.9 and 1.1!
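The arithmetic behind that last objection is simple enough to show directly. The sketch below, using the 20% figure and the 1 ppm limit mentioned above, illustrates how a generalized surrogate response factor widens the band of true concentrations consistent with any given reading:

```python
# If the surrogate response factor is uncertain by +/- 20%, a single
# indicated reading maps to a band of possible true concentrations.

def possible_true_range(indicated_ppm, factor_error=0.20):
    """Range of true concentrations consistent with an indicated reading."""
    return (indicated_ppm * (1 - factor_error),
            indicated_ppm * (1 + factor_error))

lo, hi = possible_true_range(1.0)
print(lo, hi)
# The band runs from 0.8 to 1.2 ppm, which contains both 0.9 (compliant)
# and 1.1 (violation): the reading cannot tell the two apart.
```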
Expediency has its place, but not as a substitute for real analytics. After all, the idea here is to protect public health, right?