Upcoming Events 2022

We miss the in-person relationships and live experiences that virtual and hybrid meetings cannot provide. Virtual platforms have their limitations, which only strengthens our desire for in-person gatherings. We are still not back to normal, but we are easing our way there. We hope you feel the same and will join us at these upcoming events to share and engage in valuable discussion.


CPAC, Seattle, WA, May 2-3

Routine Quality Assessment – The similarities and unique strengths of machine learning and chemometrics, and how they combine to form robust solutions.

PEFTEC, Rotterdam, June 8-9

Streamlining the Use of Chemometrics – Faster response, improved flow of information, and significantly deeper process understanding, all at nearly no cost.

IFPAC, Bethesda, MD, June 12-15

Agile Process Analytics – Combining technical tools to augment or replace tasks that consume brainpower, yielding timely responses and greater profits, with the future goal of automated spectroscopy calibration.

SciX, Kentucky, Oct 2-7

Optimizing Spectroscopy Performance – Lessening the workload by automating model building and maintenance, quickly and easily, for robust, reliable, and timely calibrations.

IFPAC 2022 – Agile Process Analytics

Join Brian Rohrback of Infometrix, Inc. for his talk on Agile Process Analytics, June 14, 1:05pm EST.

Agile Process Analytics

Application knowledge and chemometrics play a vital role in the processing of all types of multivariate data into application-specific information, and have been doing so for at least 50 years. There has been a not-so-subtle shift in thinking as we integrate basic concepts, and the occasional hallucination, from the data mining, artificial intelligence, and machine learning worlds. The target is to identify combinations of our technical tools to augment or replace tasks that consume brainpower where timely response is valued and profits are at risk. The biggest focus of chemometrics has been the calibration of optical spectrometers. It is worth considering the subtasks:

  1. Optimizing the instrument settings for a given application;
  2. Optimizing the method parameters – preprocessing, transformations, wavelength ranges;
  3. Handling of calibration transfer; and
  4. Optimizing models for inliers and rank in pursuit of routine processing and adjusting to changes in ingredients and unit operation.

The first two tasks are set-once method development steps, and the third may be generic across all applications. This paper tackles subtask 4 with a project that combined traditional approaches in statistics, database organization, pattern recognition, and chemometrics with some newer concepts tied to a better understanding of data mining, neurocomputing, and machine learning. The future goal is to automate spectroscopy calibrations such that it is possible to have instrument systems tune themselves.

EAS 2021 – ChemMLometrics. Perform. Repeat., Brian Rohrback

Join Brian Rohrback for his talk on ChemMLometrics. Perform. Repeat., November 16th at 10:15 AM.

ChemMLometrics. Perform. Repeat.

Doing a one-off research project can be very satisfying, but in industry the money is made when the results of such a project can be placed into routine quality assessment.  There is much discussion today revolving around machine learning and some of this discussion has invaded the space that has been occupied by chemometrics over the last few decades. There is a large overlap as these techniques are applied in instruments aimed at process quality control.  To a large extent, any differentiation will simply be a question of the jargon chosen. An on-line Stanford University class on machine learning covers Principal Components Analysis, Principal Components Regression, Partial Least Squares, K-Nearest Neighbor, and other mainstays of the chemometrics toolbox.  However, there are techniques that are new in the machine learning realm that can be employed for targeted tasks, improving the traditional chemometrics regime of supervised learning.  The key is to narrow the focus for each component in a system and to understand the extent to which you can control the inputs.  This talk highlights examples in optical spectroscopy calibration particularly for the identification of outliers, and in chromatography for both signal processing and the management of chromatographic libraries. The emphasis is to highlight both the similarities and the uniqueness of machine learning and chemometrics and show how they combine to form robust solutions for industry.
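One of the targeted tasks mentioned above, outlier identification in spectroscopy, can be sketched with a mainstay shared by both toolboxes: the Q (squared prediction error) residual from a PCA model. The data, the simple median-based cutoff, and all names below are invented for illustration and are not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)
# 40 well-behaved "spectra" spanned by 2 factors, plus one deliberate outlier
scores = rng.normal(size=(41, 2))
loadings = rng.normal(size=(2, 60))
X = scores @ loadings + 0.02 * rng.normal(size=(41, 60))
X[40] += rng.normal(size=60)          # sample 40 drifts off the model plane

# PCA via SVD on mean-centered data; retain the 2 meaningful components
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:2].T                          # loadings of the retained model
residual = Xc - (Xc @ P) @ P.T        # the part of each spectrum the model misses
Q = (residual ** 2).sum(axis=1)       # Q (squared prediction error) per sample

# Flag samples whose Q dwarfs the typical value (a deliberately simple cutoff)
outliers = np.where(Q > 10 * np.median(Q))[0]
print(outliers)
```

A production system would replace the median-based cutoff with a proper confidence limit on Q, but the division of labor is the same: the model describes normal variation, and anything it cannot reconstruct is flagged.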

EAS 2021 – Automating Calibrations for Optical Spectroscopy

Join Brian Rohrback for his talk on Automating Calibrations for Optical Spectroscopy, November 16th at 1:30 PM.

Automating Calibrations for Optical Spectroscopy

This talk represents a summary of a multi-industry consortium established eight years ago to re-evaluate how the calibration process in optical spectroscopy could be managed more efficiently.  The idea is to enable a shift from current practices to approaches that take better advantage of the computational power and some newer concepts supplied by research into machine learning algorithms.  The result is a solution that is not disruptive of any legacy instruments or software already in place and lessens the workload rather than laying on additional requirements. The approach uses all readily available components and can be assembled easily for any specified purpose.  The use of commercial components reduces the cost of deployment and assembling pieces in a plug-and-play manner minimizes the impact of any previous selections of hardware and software.

IFPAC 2020 – Autonomous Calibration and Optimizing Chromatographic Interpretation

IFPAC 2020
Feb 23-26, 2020
Bethesda, MD

See abstracts below for papers being presented at the IFPAC 2020 conference. Join us or contact us for more information.

Autonomous Calibration
Brian Rohrback – Infometrix
Randy Pell – Infometrix
Scott Ramos – Infometrix

The use of chemometrics in processing spectroscopic data is far from new; the processing of NIR data in petroleum refineries dates to the early 1980s, and in the food industry well before that. Although computers have improved in performance, leading to speed-ups in the calibration process, the procedures being followed have not changed significantly since the 1980s. Intriguingly, we have made decisions at the corporate level that work against each other: we are installing more spectrometers and, at the same time, reducing the staffing for spectrometer calibration and maintenance. A change in approach is mandated. In the spirit of automation, there are tools from both the chemometrics and general statistics realms that can be applied to simplify the work involved in optimizing a calibration. Robust statistical techniques require some set-up of parameters, but once established for an application, they are often usable in every other instance of that application. The result is a one-pass means of selecting optimal samples for a calibration problem, which, in turn, simplifies the assignment of model rank. This approach thus solves two problems at once: choosing the calibration sample set and setting the model rank.
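The abstract does not name the selection algorithm, so as a stand-in, here is a sketch of Kennard-Stone selection, a classic one-pass technique for picking calibration samples that span the data space. The data and the function name are invented for illustration.

```python
import numpy as np

def kennard_stone(X, n_select):
    """Classic Kennard-Stone selection: greedily pick the samples that
    best span the data space (maximin distance criterion)."""
    X = np.asarray(X, dtype=float)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    # Start from the two most mutually distant samples
    selected = list(np.unravel_index(np.argmax(dist), dist.shape))
    while len(selected) < n_select:
        remaining = [i for i in range(len(X)) if i not in selected]
        # Add the sample farthest from its nearest already-selected neighbor
        next_idx = max(remaining, key=lambda i: dist[i, selected].min())
        selected.append(next_idx)
    return selected

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 5))        # 30 candidate samples, 5 variables
subset = kennard_stone(X, 10)       # 10 samples chosen in a single pass
```

Once the parameters (here, only the subset size) are set for an application, the same routine runs unattended on every new candidate pool, which is the "set once, reuse everywhere" behavior the abstract describes.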

 

Optimizing Chromatographic Interpretation
Brian Rohrback – Infometrix, Inc.

The heartbeat of the process environment is in the data we collect, but we are not always efficient in translating our data streams into actionable information. The richest sources of process information are spectrometers and chromatographs and, for many applications, these prove to be the cheapest, most adaptable, and most reliable technologies available. In chromatography there is a rich history, and the role of chemometrics is well defined but rarely placed into routine practice. This paper provides a retrospective of routine processing solutions that have solved problems in pharmaceutical, clinical, food, environmental, chemical, and petroleum applications. It also discusses how to use techniques borrowed from other fields to provide more consistent and objective GC results, automate the translation of raw traces into real-time information streams, and create databases that can be used across plant sites or even across industries.
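As a toy example of translating a raw trace into a real-time information stream, the sketch below detects peaks in a synthetic chromatogram after a crude linear baseline correction. The signal, thresholds, and two-step method are invented for illustration and are far simpler than a production GC pipeline.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic chromatogram: three Gaussian peaks on a drifting baseline
t = np.linspace(0, 10, 2000)

def gauss(center, height, width):
    return height * np.exp(-0.5 * ((t - center) / width) ** 2)

trace = gauss(2.0, 1.0, 0.08) + gauss(5.0, 0.6, 0.10) + gauss(7.5, 0.3, 0.12)
trace += 0.02 * t                                        # slow baseline drift
trace += 0.005 * np.random.default_rng(3).normal(size=t.size)  # detector noise

# Remove the drift with a crude linear baseline fit, then pick peaks
baseline = np.polyval(np.polyfit(t, trace, 1), t)
corrected = trace - baseline
peaks, props = find_peaks(corrected, height=0.1, distance=100)
retention_times = t[peaks]
print(np.round(retention_times, 2))
```

In a routine-processing setting, the retention times and peak heights recovered here would feed directly into a library lookup or a quality-control database rather than a print statement.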