MINIMIZING ERROR IN CALIBRATING SPECTROMETERS – Part 1. Accuracy vs Precision: Misconceptions of Error That Influence Decisions

The terms “accuracy” and “precision” are often confused. Accuracy refers to how close a result is to a known value, while precision indicates the consistency of repeated results. Ideal measurements are both accurate and precise. However, it is possible to have high precision with poor accuracy, or low precision that still averages out to an accurate result over many trials.

Application to Octane Measurement:
In octane rating analysis, precision is a challenge: the reference octane engine, which defines accuracy for the method, is itself not very precise. Essentially, the octane engine represents the “not precise, maybe accurate” portion of the picture. Repeated measurements of the same gasoline sample show a normal distribution of values with a standard deviation of ~0.25 octane units, meaning 95% of results fall within ±0.5 units (two standard deviations).
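The relationship between the quoted standard deviation and the 95% interval can be checked with a short Monte Carlo sketch. The true octane value and the normal-error assumption below are illustrative, not from the source; only the σ ≈ 0.25 figure comes from the text.

```python
import random
import statistics

random.seed(42)

TRUE_OCTANE = 91.0   # hypothetical true value of the sample (assumed)
SIGMA = 0.25         # standard deviation quoted for the reference engine
N_RUNS = 10_000

# Simulate repeated engine ratings of the same gasoline sample,
# assuming normally distributed measurement error.
ratings = [random.gauss(TRUE_OCTANE, SIGMA) for _ in range(N_RUNS)]

within_half = sum(abs(r - TRUE_OCTANE) <= 0.5 for r in ratings) / N_RUNS
print(f"sample std dev: {statistics.stdev(ratings):.3f}")
print(f"fraction within +/-0.5 octane: {within_half:.3f}")  # ~0.95, i.e. 2 sigma
```

Because ±0.5 is two standard deviations, roughly 95% of simulated ratings land within that band, matching the figure in the text.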

The calibration process, then, is charged with mapping a very precise spectrum to a far less precise reference value, and the calibration must account for this. One can run multiple engine tests on the same sample and use the average value to reduce error; the standard error of the mean decreases with the square root of the number of runs. But when the reference itself is imprecise, there is more that we can do.
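The square-root averaging effect can be demonstrated numerically. This is a minimal sketch under the same assumptions as above (normal error, σ = 0.25, an illustrative true value); the function name is mine, not from the source.

```python
import random
import statistics

random.seed(0)

TRUE_OCTANE = 91.0   # hypothetical true value (assumed)
SIGMA = 0.25         # single-run standard deviation, per the text

def averaged_rating_std(n_runs: int, n_trials: int = 20_000) -> float:
    """Empirical standard deviation of the mean of n_runs engine tests."""
    means = [
        statistics.fmean(random.gauss(TRUE_OCTANE, SIGMA) for _ in range(n_runs))
        for _ in range(n_trials)
    ]
    return statistics.stdev(means)

for n in (1, 4, 16):
    # Theory predicts sigma / sqrt(n): 0.250, 0.125, 0.0625
    print(f"n={n:2d}  empirical std of mean: {averaged_rating_std(n):.4f}")
```

Averaging 4 runs halves the uncertainty, and 16 runs quarters it, which is why reducing error by brute-force repetition quickly becomes expensive on an engine.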

Here is the link to the full presentation on Minimizing Error in Calibrating Spectrometers from the ATC 2025 Conference. It is also available as a YouTube video, from 00:00 to 22:00.