Gulf Coast Conference 2021 – Managing Calibrations for Optical Spectroscopy

Join Brian Rohrback on 10/12/2021 at the Orchid Room, 9:40 AM – 10:10 AM.

Abstract:
A multi-industry consortium came together eight years ago to rethink how calibrations should be performed for spectroscopy instruments and analyzers. The priority was placed on solutions that are non-disruptive, fully utilize legacy systems, and lessen the workload rather than layer on additional requirements. The result is a simple, foolproof approach that can be used with any type or brand of spectrometer and any type or brand of chemometrics software, and that ensures robust and reliable calibrations. For more information, email info@infometrix.com.

ISA Virtual Conference: Rethinking Calibration for Process Spectrometers

Join Brian Rohrback at the 2021 ISA Analysis Division Virtual Conference.

March 23, 2021 at 12:00 ET

Register at www.isa.org/ad and be ready to take part in these in-depth discussions.

Rethinking Calibration for Process Spectrometers

The talk focuses on a generic, machine-learning approach that addresses the primary bottlenecks of mustering data, automating analyzer calibration, and tracking data and model performance over time. The gain in efficiency has been considerable, and the fact that the approach does not disturb any legacy systems (i.e., no changes or alterations to any analyzer or software in place) made deployment simple. The result is a standardized procedure for performing calibrations that adheres to best practices, archives all data and models with easy access in mind, and delivers models in any format.
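As a rough illustration of what "tracking model performance over time" can look like, the sketch below (Python with NumPy) flags calibration drift by comparing analyzer predictions against incoming laboratory reference values. The function name, window size, and control limits are placeholders for discussion, not part of the system described in the talk.

```python
# Hypothetical sketch only: flag calibration drift by comparing analyzer
# predictions with newly arriving lab reference values.
import numpy as np

def check_drift(predicted, reference, window=20, bias_limit=0.5, rmse_limit=1.0):
    """Return True if the most recent window of prediction/lab pairs exceeds
    either a bias or an RMSE control limit (in the units of the property)."""
    pred = np.asarray(predicted[-window:], dtype=float)
    ref = np.asarray(reference[-window:], dtype=float)
    residuals = pred - ref
    bias = residuals.mean()                  # systematic offset vs. the lab
    rmse = np.sqrt((residuals ** 2).mean())  # overall prediction error
    return abs(bias) > bias_limit or rmse > rmse_limit
```

In practice the control limits would be tied to the reproducibility of the reference method rather than fixed constants.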

ISA Webinar – Practical AI: In Search of Dynamic, Autonomous Process Analytics

Join Brian Rohrback, President of Infometrix, on Feb. 25th at 1:00pm ET.

Free Webinar: Process Control & Instrumentation Series

Practical AI: In Search of Dynamic, Autonomous Process Analytics

The application of the concepts behind artificial intelligence and machine learning mandates a systematic approach to extracting information from multiple, byte-dense data sources. Effective extraction of this information leads to improvements in decision making at all levels of the chemical, petrochemical, and petroleum industries. To accomplish anything in the AI space, we need to combine traditional approaches in statistics, database organization, pattern recognition, and chemometrics with some newer concepts tied to a better understanding of data mining, neurocomputing, and machine learning. This is an introduction to a practical approach to deploying AI, illustrated by a multi-company, multi-industry, hydrocarbon processing consortium established eight years ago to re-evaluate how the calibration process for sensors and analyzers could be managed more efficiently. The focus spans optical spectrometers, chromatographs, and process sensors, independently and in combination, with a shift from current practices to approaches that take advantage of the computational power at our fingertips.

Dr. Rohrback’s expertise covers the integration of multivariate data processing for process analyzers and laboratory instruments catering to routine quality analysis. Prior to his current position, he worked for Cities Service Oil Company, now Occidental Petroleum, in positions including research scientist managing the chromatography group, exploration geologist, and manager of planning/budget for EAME. He holds a B.S. in chemistry, a Ph.D. in organic geochemistry, and an MBA. His 50-year span of published work includes topics in petroleum exploration, chemical plant optimization, clinical and pharmaceutical diagnostics, informatics, pattern recognition, and multivariate analysis.

Efficient Calibration Process and Big Data

View the latest talks on Big Data and calibration process efficiency.

Harnessing Big Data – AIChE 2020

Big Data implies a systematic approach to extracting information from multiple, byte-dense data sources. Effective extraction of this information leads to improvements in decision making at all levels of industry. Here, we combine traditional approaches in statistics, database organization, pattern recognition, and chemometrics with some newer concepts tied to data mining, neurocomputing, and machine learning. The cost is low and the benefits are high.

The Multivariate Process Paradigm – SciX 2020

This is a summary of a chemical processing consortium, established eight years ago to re-evaluate how the calibration process for sensors and analyzers could be managed more efficiently. The focus is on optical spectrometers to enable a shift from current practices to approaches that take advantage of the computational power at our fingertips. It was critical to prioritize solutions that are non-disruptive, utilize legacy systems, and lessen the workload rather than layer on additional requirements. The result is a choice of tools available to consume the data and generate actionable, process-specific information.

ISA 2020 – Rethinking Calibration for Process Spectrometers II

The Long Beach Convention Center
Long Beach, CA
1:30pm, April 27th

Brian Rohrback
Infometrix, Inc.
Will Warkentin
Chevron Richmond Refinery

KEYWORDS
Best Practices, Calibration, Cloud Computing, Database, Gasoline Blending, Optical Spectroscopy, PLS, Process Control

ABSTRACT
Optical spectroscopy is a favored technology to measure chemistry and is ubiquitous in the hydrocarbon processing industry. In a previous paper, we focused on a generic, machine-learning approach that addressed the primary bottlenecks of mustering data, automating analyzer calibration, and tracking data and model performance over time. The gain in efficiency has been considerable, and the fact that the approach does not disturb any legacy systems (i.e., no changes or alterations to any analyzer or software in place) made deployment simple.

We also standardized a procedure for performing calibrations that adheres to best practices, archives all data and models, provides ease of access, and delivers the models in any format. What remains is to assess the speed of processing and the quality of the models. To that end, a group of calibration experts was tasked with model optimization, restricting the work to selecting the proper samples to include in the computation and setting the number of factors in PLS. The amount of time spent and the quality of the models were then compared. The automated system performed the work in minutes rather than hours, and the quality of its predictions at least matched the best experts and was significantly better than that of the average expert. The conclusion is that a large amount of recoverable giveaway can be avoided through automation of this process and the consistency it brings to PLS model construction.
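The factor-count half of that optimization task can be illustrated with a short cross-validation loop. The sketch below uses scikit-learn's PLSRegression and represents only one reasonable way to automate the choice; it is not the system evaluated in the paper, and the function name and thresholds are assumptions for illustration.

```python
# Illustrative only: pick the number of PLS factors by cross-validated RMSE,
# accepting an extra factor only when it buys a clear improvement.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

def select_pls_factors(X, y, max_factors=15, min_gain=0.02, cv=10):
    """X: spectra (n_samples x n_wavelengths); y: laboratory reference values."""
    best_k, best_rmse = 1, np.inf
    for k in range(1, max_factors + 1):
        y_cv = cross_val_predict(PLSRegression(n_components=k), X, y, cv=cv)
        rmse = np.sqrt(np.mean((np.ravel(y_cv) - np.ravel(y)) ** 2))
        if rmse < best_rmse * (1 - min_gain):   # require a real gain, not noise
            best_k, best_rmse = k, rmse
    return best_k, best_rmse
```

The min_gain threshold is one way to guard against the overfitting question raised in the introduction below: once additional factors stop reducing the cross-validation error meaningfully, they are mostly fitting noise.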

INTRODUCTION
There is a lot of mundane work tied to the assembly of spectra and laboratory reference values to enable quality calibration work. There is also insufficient guidance when it comes to the model construction task. How much time should be spent on this task? How do we best assess whether a spectrum-reference pair is an outlier? How many cycles of regression and sample elimination make sense? Where do we switch over from improving the model by adding PLS factors to overfitting and incorporating destabilizing noise?
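For the outlier question, one common screening heuristic is sketched below, assuming a scikit-learn PLS model: flag samples with unusually high leverage in the score space or unusually large reference residuals, then refit without them and compare cross-validation error. The thresholds and function name are illustrative assumptions, not the algorithm used in the paper.

```python
# Hedged sketch of a common screening heuristic, not the paper's algorithm:
# flag spectrum/reference pairs with high leverage or large y-residuals.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def flag_outliers(X, y, n_factors=5, leverage_mult=3.0, resid_sigma=3.0):
    pls = PLSRegression(n_components=n_factors).fit(X, y)
    T = pls.transform(X)                                # latent-variable scores
    leverage = np.einsum('ij,jk,ik->i', T, np.linalg.pinv(T.T @ T), T)
    residuals = np.ravel(y) - np.ravel(pls.predict(X))
    high_leverage = leverage > leverage_mult * leverage.mean()
    big_residual = np.abs(residuals) > resid_sigma * residuals.std()
    return high_leverage | big_residual                 # boolean mask of suspects
```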

For more information or the full paper, contact us.