“Minimum Detectability” is one of those instrumentation terms that is used frequently but seldom defined. Indeed, even though you will encounter this term on many data sheets, its definition does not appear in any of the usual learned references, including Process Instrumentation Terminology, ANSI/ISA-51.1-1979 (R1993), and Standard Terminology Relating to Sampling and Analysis of Atmospheres, ASTM D1356-05.
However, the ASTM standard does provide us with…
Method Detection Limit: The minimum concentration of an analyte that can be reported with 99% confidence that the value is above zero, based on the standard deviation of more than seven replicate measurements of the analyte in the matrix of concern at a concentration near the low standard.
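That definition is commonly implemented as the one-tailed Student's-t value for 99% confidence multiplied by the standard deviation of the replicate measurements. A minimal sketch of the calculation, using hypothetical replicate data and assuming SciPy is available for the t statistic:

```python
# Minimal sketch of the statistics behind a method detection limit (MDL):
# the sample standard deviation of replicate low-level measurements,
# multiplied by the one-tailed Student's t value for 99% confidence.
# The replicate values below are hypothetical, purely for illustration.
from statistics import stdev
from scipy.stats import t   # assumes SciPy is available for the t statistic

replicates_ppm = [0.9, 1.1, 1.0, 0.8, 1.2, 1.0, 0.9]   # seven spiked replicates near the low standard

n = len(replicates_ppm)
s = stdev(replicates_ppm)            # sample standard deviation of the replicates
t_99 = t.ppf(0.99, df=n - 1)         # one-tailed 99% t value, n - 1 degrees of freedom

mdl_ppm = t_99 * s                   # lowest value reportable as "above zero" with 99% confidence
print(f"s = {s:.3f} ppm, t = {t_99:.3f}, MDL = {mdl_ppm:.2f} ppm")
```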
Simplifying the ASTM definition, we can say that “Minimum Detectability” is the lowest concentration of analyte that can be unambiguously discriminated from noise. [Some agencies set a standard that minimum detectability must be at least 2-2.5 times the noise level.] Fair enough, but how can this be put to use in occupational health or process gas detection?
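That rule of thumb reduces to a single comparison against the baseline noise. A minimal sketch, where the noise figure and the 2.5x factor are illustrative assumptions rather than values from any particular standard:

```python
# Sketch of the "2-2.5 times the noise level" rule of thumb: a response is
# treated as detected only if it exceeds the chosen multiple of the
# baseline noise. Noise level and factor below are illustrative.
def minimum_detectability(noise_ppm: float, factor: float = 2.5) -> float:
    """Smallest reading that can be called a detection under this rule."""
    return factor * noise_ppm

def is_detected(reading_ppm: float, noise_ppm: float, factor: float = 2.5) -> bool:
    return reading_ppm >= minimum_detectability(noise_ppm, factor)

baseline_noise_ppm = 0.4                           # peak-to-peak noise on a zero-gas baseline (hypothetical)
print(minimum_detectability(baseline_noise_ppm))   # 1.0 ppm
print(is_detected(0.6, baseline_noise_ppm))        # False: lost in the noise
print(is_detected(1.2, baseline_noise_ppm))        # True: clear of the noise band
```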
First of all, it is important to note that any data garnered at the level of minimum detectability will not be accurate. Consider a typical case in which the minimum detectability of an instrument is given as 1% of full scale, and accuracy as ± 2% of full scale. For a 0-100 ppm scale, the minimum detectable reading of 1 ppm would actually be 1 ppm ± 2 ppm, hardly a useful measurement.
Similarly, on a digital unit, the minimum detectability of a particular instrument is often given as the least significant digit. On a commonly used 3½ digit meter, for a range of 0-199.9 ppm, this would be 0.1 ppm. In this case, accuracy is specified at ± 2% of reading ± 1 least significant digit. Here, the minimum detectable reading of 0.1 ppm would actually be 0.1 ppm ± 0.002 ppm ± 0.1 ppm. Technically better than the analog example, but still of little value.
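The arithmetic behind both examples is easy to reproduce. A short sketch, with the specifications taken from the two cases above; the helper functions are named by me, and the digital case simply sums the two error terms as a worst-case treatment:

```python
# Worked uncertainty at the minimum detectable reading for the two cases
# above: an analog instrument specified as a percentage of full scale,
# and a 3-1/2 digit meter specified as a percentage of reading plus one
# least significant digit. Function names are mine; specs are from the text.
def analog_error_ppm(full_scale_ppm: float = 100.0, pct_full_scale: float = 0.02) -> float:
    """± error when accuracy is stated as a fraction of full scale."""
    return pct_full_scale * full_scale_ppm

def digital_error_ppm(reading_ppm: float, pct_reading: float = 0.02, lsd_ppm: float = 0.1) -> float:
    """± error when accuracy is a fraction of reading plus one least significant digit (worst-case sum)."""
    return pct_reading * reading_ppm + lsd_ppm

print(f"analog:  1.0 ppm ± {analog_error_ppm():.1f} ppm")         # 1 ppm ± 2 ppm
print(f"digital: 0.1 ppm ± {digital_error_ppm(0.1):.3f} ppm")     # 0.1 ppm ± 0.102 ppm
```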
Even so, knowing the minimum detectability of an instrument can be helpful in situations where “go/no-go” readings are of interest. Given a properly calibrated instrument, the smallest observable response would be, by definition, the minimum detectable level, and would indicate at least the presence of the analyte in question (any interferences notwithstanding).
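In practice such a go/no-go check amounts to flagging any reading at or above the instrument's stated minimum detectability. A minimal sketch, with the detectability figure and readings chosen purely for illustration:

```python
# Go/no-go screening against an instrument's stated minimum detectability.
# A flagged reading indicates only the presence of the analyte, not a
# quantitatively reliable concentration. Values below are illustrative.
MIN_DETECTABLE_PPM = 1.0   # from the instrument's data sheet (hypothetical)

def analyte_present(reading_ppm: float) -> bool:
    """True if the reading is at or above the minimum detectable level."""
    return reading_ppm >= MIN_DETECTABLE_PPM

for reading_ppm in (0.4, 1.0, 3.7):
    status = "go: analyte present" if analyte_present(reading_ppm) else "no-go"
    print(f"{reading_ppm:4.1f} ppm -> {status}")
```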
Of course, such an approach should be used only when instruments with more appropriate sensitivity are not available.