
Understanding the Basics of Uncertainty in Measurement and Calibration

Estimating measurement uncertainty is one of the most challenging tasks scientists and calibration technicians must handle. 

When an instrument is calibrated, the measurements should be traceable to a common standard. Calibration laboratories worldwide follow the ISO Guide to the Expression of Uncertainty in Measurement (GUM) to estimate measurement uncertainty. First, let’s examine the basics of uncertainty in measurement and calibration.

What is the Uncertainty of Measurement? 

Uncertainty is the range of possible values within which the actual value of the measurement lies. It is the “doubt” of the measurement: it tells us how good the measurement is. Every measurement has some doubt, and we need to know how large that doubt is to decide whether the measurement is good enough for its purpose. For example, a result reported as 10.00 V with an expanded uncertainty of 0.02 V means the actual value is expected to lie between 9.98 V and 10.02 V.

Many things affect the result of a measurement: the tools used, the method or process followed, and the way the person performs the job.

Error is not the same as uncertainty. In calibration, for example, when we compare the device to be calibrated against the reference standard, the error is the difference between the two readings. It is critical to be able to distinguish between uncertainty and error.
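As a minimal sketch with made-up readings, the example below shows how the error in a calibration comparison is simply the difference between the two readings, while the uncertainty is a separate figure attached to that comparison:

```python
# Illustrative example with made-up readings: error vs. uncertainty.
reference_reading = 100.000  # reading of the reference standard (e.g., in mbar)
dut_reading = 100.025        # reading of the device under test

# Error: the difference between the device under test and the reference.
error = dut_reading - reference_reading
print(f"Error: {error:+.3f}")  # Error: +0.025

# Uncertainty is a separate quantity: the doubt attached to that comparison.
# Here we simply assume a known expanded uncertainty for the comparison.
uncertainty = 0.010  # assumed expanded uncertainty of the comparison
print(f"Result: error = {error:+.3f} ± {uncertainty:.3f}")
```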

Standard Deviation of the Measurement

The standard deviation is a statistic that measures the dispersion of a dataset relative to its mean. It is calculated as the square root of the variance.

The goal is to determine the typical deviation of the whole measurement process and use that knowledge as an uncertainty component of the measurement.

The standard deviation of your calibration process is a critical component of the total uncertainty.
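A minimal sketch, using made-up repeated readings of the same calibration point, of how the standard deviation of a measurement series can be turned into a standard uncertainty component (the GUM calls this a Type A evaluation):

```python
import statistics

# Made-up repeated readings of the same calibration point.
readings = [10.02, 10.05, 9.98, 10.03, 10.01, 9.99, 10.04, 10.02]

mean = statistics.mean(readings)
# Sample standard deviation: the square root of the (sample) variance.
std_dev = statistics.stdev(readings)

# The GUM uses the experimental standard deviation of the mean as the
# Type A standard uncertainty of the averaged result.
std_uncertainty_of_mean = std_dev / len(readings) ** 0.5

print(f"Mean: {mean:.4f}")
print(f"Standard deviation: {std_dev:.4f}")
print(f"Standard uncertainty of the mean: {std_uncertainty_of_mean:.4f}")
```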

The Reference Standard (Calibrator) and its Traceability

One of the most significant sources of uncertainty is the reference standard, or calibrator, used in your measurements and calibrations. First, you should select a suitable reference standard for each measurement.

It is also essential to note that it is not enough to rely on the manufacturer’s accuracy specification for the reference standard and keep using it as the uncertainty indefinitely. You must have your reference standards calibrated regularly by a calibration laboratory with sufficient capabilities, so that the standard remains traceable.

It is essential to take into account the total uncertainty that the laboratory documents in the calibration of your reference standard. You should also monitor the stability of your reference standards between regular calibrations. Over time, you will learn the typical behavior of your reference standard, and you can use that information in your calibrations.
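As an illustrative sketch with hypothetical certificate data, one way to follow the stability of a reference standard is to track the error reported on each calibration certificate over time and estimate its drift with a simple least-squares slope:

```python
# Hypothetical calibration history of a reference standard:
# (years since first calibration, error reported on the certificate).
history = [(0.0, 0.012), (1.0, 0.015), (2.0, 0.019), (3.0, 0.022)]

# Simple least-squares slope as a drift estimate (units per year).
n = len(history)
mean_t = sum(t for t, _ in history) / n
mean_e = sum(e for _, e in history) / n
slope = sum((t - mean_t) * (e - mean_e) for t, e in history) / sum(
    (t - mean_t) ** 2 for t, _ in history
)
print(f"Estimated drift: {slope:+.4f} per year")
```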

Compliance Statement: Pass or Fail

When an instrument is calibrated, it has a pre-defined tolerance limit that it has to meet. Tolerance limits indicate the maximum amount by which the result may differ from the actual value. If the errors of the calibration results are within the tolerance limits, the calibration passes; if any errors exceed the tolerance limits, it fails.
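A minimal sketch of this pass/fail decision, assuming a symmetric tolerance limit and made-up calibration errors (in practice, decision rules may additionally require the measurement uncertainty to be taken into account when stating compliance):

```python
# Made-up calibration errors at several test points, and a symmetric
# tolerance limit the instrument has to meet.
errors = [0.012, -0.008, 0.021, -0.015]  # error at each calibration point
tolerance = 0.020                        # maximum permitted error (±)

# The calibration passes only if every point is within the tolerance limit.
for e in errors:
    verdict = "Pass" if abs(e) <= tolerance else "Fail"
    print(f"error {e:+.3f}: {verdict}")

overall = "Passed" if all(abs(e) <= tolerance for e in errors) else "Failed"
print(f"Calibration: {overall}")  # 0.021 exceeds ±0.020, so: Failed
```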

The GUM 8-step Process for Calculating Uncertainty

  1. Describe the measured value in terms of your measurement process.
  2. List the input quantities.
  3. Determine the uncertainty for each input quantity.
  4. Evaluate any covariances/correlations in input quantities.
  5. Calculate the measured value to report.
  6. Correctly combine the uncertainty components.
  7. Multiply the combined uncertainty by a coverage factor.
  8. Report the result in the proper format.
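
A minimal sketch of steps 6 to 8, assuming uncorrelated input quantities whose standard uncertainties are combined in quadrature (root sum of squares) and expanded with a coverage factor k = 2, which gives roughly 95 % coverage for a normal distribution; the component values below are made up:

```python
import math

# Made-up standard uncertainty components (all in the same unit, e.g., mbar).
components = {
    "reference standard": 0.010,
    "repeatability (Type A)": 0.006,
    "resolution of the DUT": 0.003,
}

# Step 6: combine uncorrelated components in quadrature (root sum of squares).
combined = math.sqrt(sum(u ** 2 for u in components.values()))

# Step 7: expand with a coverage factor, commonly k = 2 (about 95 % coverage).
k = 2
expanded = k * combined

# Step 8: report the result in the proper format.
print(f"Combined standard uncertainty: {combined:.4f}")
print(f"Expanded uncertainty (k={k}):  {expanded:.4f}")
```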