The calibration procedure for spectroscopic models begins with selecting appropriate preprocessing methods, and choosing the best method determines how all future spectra will be processed. The optimal choice depends on the analyzer technology (e.g., NIR vs. Raman), the spectrometer manufacturer, and the chemometric algorithm. Preprocessing methods are commonly evaluated with the Root Mean Square Error of Cross-Validation (RMSECV), which offers a reliable measure of the combined bias and precision error.
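As a minimal sketch of how RMSECV is computed in practice, the snippet below cross-validates a PLS model and reports the error at each factor count. The synthetic `X` (spectra) and `y` (reference lab values) are placeholders, not data from the presentation:

```python
# Sketch: RMSECV for a PLS calibration model, assuming scikit-learn.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import KFold, cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))                              # 60 spectra x 200 wavelengths (synthetic)
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=60)   # synthetic reference values

def rmsecv(X, y, n_components, n_splits=10):
    """Root Mean Square Error of Cross-Validation for a PLS model."""
    model = PLSRegression(n_components=n_components)
    cv = KFold(n_splits=n_splits, shuffle=True, random_state=0)
    y_cv = cross_val_predict(model, X, y, cv=cv)            # out-of-fold predictions
    return float(np.sqrt(np.mean((y - y_cv.ravel()) ** 2)))

for k in range(1, 8):
    print(f"{k} factors: RMSECV = {rmsecv(X, y, k):.4f}")
```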
In one example, various combinations of preprocessing techniques (e.g., derivatives, scatter correction, normalization) were tested. Some combinations significantly outperformed others, and the best approaches were those that minimized RMSECV with the fewest model factors.
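A sketch of that kind of screening is shown below: each preprocessing combination is scored by RMSECV, and the winner is the lowest error at the fewest factors. The specific techniques here (SNV, Savitzky-Golay first derivative, vector normalization) are illustrative assumptions, not the combinations evaluated in the presentation:

```python
# Sketch: grid search over preprocessing combinations, scored by RMSECV.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import KFold, cross_val_predict

def snv(X):
    """Standard Normal Variate: center and scale each spectrum."""
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

def first_derivative(X):
    """Savitzky-Golay first derivative along the wavelength axis."""
    return savgol_filter(X, window_length=11, polyorder=2, deriv=1, axis=1)

def vector_norm(X):
    """Scale each spectrum to unit Euclidean length."""
    return X / np.linalg.norm(X, axis=1, keepdims=True)

def rmsecv(X, y, k):
    cv = KFold(n_splits=10, shuffle=True, random_state=0)
    y_cv = cross_val_predict(PLSRegression(n_components=k), X, y, cv=cv)
    return float(np.sqrt(np.mean((y - y_cv.ravel()) ** 2)))

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))                              # synthetic spectra
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=60)   # synthetic reference values

combos = {
    "raw": [],
    "snv": [snv],
    "snv + 1st deriv": [snv, first_derivative],
    "1st deriv + norm": [first_derivative, vector_norm],
}
results = []
for name, steps in combos.items():
    Xp = X
    for step in steps:
        Xp = step(Xp)
    for k in range(1, 8):
        results.append((rmsecv(Xp, y, k), k, name))

# Prefer low RMSECV, breaking near-ties in favor of fewer factors.
best = min(results, key=lambda r: (round(r[0], 3), r[1]))
print(f"best: {best[2]} with {best[1]} factors (RMSECV = {best[0]:.4f})")
```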
Partial Least Squares (PLS) regression is the standard algorithm because it is widely integrated and handles error from both the spectroscopic and the laboratory reference data effectively. However, PLS assumes a linear relationship, which does not always hold. In such cases, alternative methods such as Locally Weighted PLS (LW-PLS) can provide better performance by addressing non-linearity through localized modeling.
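To make the localized-modeling idea concrete, here is a minimal sketch that fits a small PLS model on the calibration spectra nearest each query spectrum instead of one global model. This neighbor-selection variant is a simplification of LW-PLS (which typically applies continuous distance weights rather than a hard cutoff), and the function name and parameters are illustrative:

```python
# Sketch: a nearest-neighbor simplification of locally weighted PLS.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def lw_pls_predict(X_cal, y_cal, X_query, n_neighbors=30, n_components=3):
    """Predict each query spectrum from a PLS model fit on its nearest calibration neighbors."""
    preds = []
    for x in X_query:
        d = np.linalg.norm(X_cal - x, axis=1)    # spectral distance to each calibration sample
        idx = np.argsort(d)[:n_neighbors]        # keep the closest calibration samples
        local = PLSRegression(n_components=n_components)
        local.fit(X_cal[idx], y_cal[idx])        # local model around this query point
        preds.append(local.predict(x.reshape(1, -1)).item())
    return np.array(preds)
```

Smaller neighborhoods track non-linearity more closely but increase variance, so the neighborhood size is itself a parameter worth tuning by cross-validation.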
Finally, method development is a one-time effort: once a spectral data processing method is chosen, it generally remains fixed throughout the instrument's use.
Here is the link to the full presentation, Minimizing Error in Calibrating Spectrometers, from the ATC 2025 Conference. It is also available as a YouTube video (00:00 to 22:00).