Analyzer Technology Conference, Booth #511

Join Infometrix at the ATC 2026 conference, booth #511, for a presentation on Ai-Metrix and the automation of chemometric calibrations.

April 13-17, 2026

Galveston Island Convention Center

Meet with industry leaders to discuss new and innovative analyzer techniques, developments, and applications for process and laboratory measurements, as well as the fundamentals of quality control employing optical spectroscopy.

For additional information on the Analyzer Technology Conference, the brochure and event overview are available as PDFs via the links below. You can also contact info@infometrix.com with any questions. We look forward to seeing you.

ATC Brochure

ATC Event Overview

CPACT Webinar on The Intersection of Machine Learning, Chemometrics, and Spectroscopy

Presented by Brian Rohrback of Infometrix, Inc.

April 23, 2026 (7:00 PM UK time)

See the abstract below. Visit CPACT Webinars or contact info@infometrix.com for details.

AI and machine learning have stormed into our scientific and marketing lexicons. As we discuss their integration into analytical chemistry applications, we face the inevitable need to merge with the field of chemometrics. We know chemometrics as an area of study that has generated a set of tools practitioners use to extract the information content from sets of analytical data. Machine learning is the extension of this idea, just without human intervention. As we employ the tools provided by chemometrics to automate a process, with the computer making decisions based on the input data, chemometrics becomes a cog in the machine learning world. One area ripe for this combination is optical spectroscopy, particularly IR, NIR, and Raman.

Let’s do a thought experiment. What if we decided we wanted to fully automate the use of optical spectroscopy for a quality control application? What would be required to take any spectroscopy instrument, put it into a lab or process stream, have it learn the application, build an optimized model, deploy the model for QC, and maintain the calibration for the life of the instrument? Can this be done without human interaction?

To standardize the control of spectroscopy assessments, there are four primary software-related areas to tackle, two of which the user may only need to do once.

  1. At the start, a method must be established that optimizes how future spectra will be manipulated; this involves algorithm selection, choice of preprocessing, and potentially trimming the wavelength range.
  2. The other early task is to understand the precision of the laboratory methods, how they impact calibration models, and how this information should be factored into understanding system performance.
  3. On a continuous basis, the process chemistry can change, dictating a maintenance effort to determine the optimum number of factors and identify outliers that negatively impact model performance.
  4. A system has been outlined by ASTM to automatically flag when model performance has degraded.
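As an illustration of step 3 above, determining the optimum number of factors is commonly automated with cross-validation. The sketch below (pure NumPy, using principal-component regression as a stand-in; the data, function names, and algorithm choice are illustrative assumptions, not Ai-Metrix internals) picks the factor count that minimizes cross-validated prediction error on synthetic rank-3 "spectra":

```python
import numpy as np

def cv_rmse_per_factor(X, y, max_factors, k=5, seed=0):
    """Cross-validated RMSE of a principal-component regression for
    1..max_factors components. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, k)
    sse = np.zeros(max_factors)
    for fold in folds:
        train = np.setdiff1d(idx, fold)
        Xt, yt = X[train], y[train]
        xm, ym = Xt.mean(axis=0), yt.mean()
        # principal components via SVD of the mean-centered training spectra
        _, _, Vt = np.linalg.svd(Xt - xm, full_matrices=False)
        for a in range(1, max_factors + 1):
            scores = (Xt - xm) @ Vt[:a].T
            b = np.linalg.lstsq(scores, yt - ym, rcond=None)[0]
            pred = (X[fold] - xm) @ Vt[:a].T @ b + ym
            sse[a - 1] += np.sum((pred - y[fold]) ** 2)
    return np.sqrt(sse / len(X))

# synthetic rank-3 "spectra" so the right answer is knowable
rng = np.random.default_rng(1)
T = rng.normal(size=(80, 3))                   # latent concentrations
P = rng.normal(size=(3, 50))                   # pure-component "spectra"
X = T @ P + 0.05 * rng.normal(size=(80, 50))   # measured spectra + noise
y = T @ np.array([1.0, -0.5, 0.25]) + 0.01 * rng.normal(size=80)
errs = cv_rmse_per_factor(X, y, max_factors=8)
best = int(np.argmin(errs)) + 1
print(best)
```

In a production system the same loop would also sweep preprocessing options and candidate algorithms; the point here is only that the factor-count decision reduces to an objective, automatable minimization.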

Infometrix has spent the last decade and a half commercializing a system designed to fully automate and efficiently optimize all aspects of the above calibration. Components of the thought experiment are already in place, and a discussion of the approach (plus solutions to the implementation quicksand encountered along the way) shows how to blend chemometrics into machine learning for the benefit of industry.

IFPAC 2026 Presentations

SAVE THE DATE FOR IFPAC-2026! Network and share your knowledge on advancements in manufacturing science. Join Brian Rohrback for the following presentations.

Chemometrics in Chromato-Context (ID# 83)

Chromatography is one of the most useful technologies for routine chemical assessment in industry. In many cases, it is the cheapest and most adaptable technology available to fully document the composition of our samples. Chemometrics has been used to interpret chromatographic traces, although the implementation has been far less widespread than in spectroscopic applications. This gives us the chance to review where chemometrics has been utilized in the chromatographic sciences and where the advantages lie. Starting with the chromatography basics, this presentation builds up the world of chemometrics step by step to show where the technology has been used and where it can contribute by driving much more reliable results from the data we collect.

Fully Integrated Data Analysis (ID# 84)

We employ many sources of analytical information to perform quality control on the processes we manage. In many cases, we are not utilizing the information content of the data we currently collect. In most quality control situations, results from different sources must be merged into a single release metric. This can be done hierarchically, where information is factored in order of priority or response time. Another option is to process simultaneous data in a blended, data fusion model. Care must be taken to ensure that fusing several sources of data does not introduce so much complexity that the system becomes unwieldy or simply cannot be used. Here we will discuss current techniques and show how the value of the information stream can be improved by more timely integrated data analysis. An example from the pharmaceutical classification of botanicals shows the power of this approach.

Chemometrics – COPA (Chemometrics for Online Process Analysis)

Chairs: Brian Rohrback, Infometrix, Antonio Benedetti, Polymodelshub, and Hossein Hamedi, Arrantabio

Chemometrics is central to all calibration work in spectroscopy and has influence in most of the instrumentation tied to product quality control. We are investigating the challenges and the successes tied to the implementation of chemometric technology as it relates to the process industry, whether for pharmaceuticals, for consumer products, for food, or for chemicals. We seek to optimize quality control.

Globalized Spectroscopy (ID#261)

Implementing spectroscopy applications is often a complex management process, even if the deployment is restricted to a single spectrometer. When a company wants to roll out spectroscopy in multiple locations, additional potential problems arise; however, managed properly, the benefits are significant. IR, NIR, and Raman are the most common optical systems employed to measure chemistry in a quality control application, but they require a calibration to convert spectral signatures to the properties of interest. Unless an objective mechanism for performing calibration is available across the sites, product quality results will vary. It is possible to package “best practices” into a system that forces consistency and an optimal outcome. By removing the subjective nature of manual calibrations, the quality of the quality control can be assessed and maintained at a high level.

View the program preview here.

Contact info@infometrix.com for questions or for more information on presentations and event details.

Objective Tracking of Calibration Model Quality

In the petroleum industry, this approach is applied at scale. Facilities often monitor dozens or even hundreds of predictive models simultaneously, such as summer, winter, and all-season fuel grades. In this example, 27 models (a 9×3 grid) are tracked, though some refineries monitor more than 300. The system works with any optical instrument and with any chemometrics assessment software.

ASTM D6122 provides guidelines for evaluating these models. Rather than relying on simple fixed limits, it defines dynamic, sample-specific thresholds. When samples fall outside these limits, Ai-Metrix can step in to supply an updated model in minutes.

Different visual indicators convey different issues:

  • Yellow triangles represent samples that are statistically in control but unusual for the model. These are often good candidates for inclusion to improve model robustness.
  • Red squares indicate that model diagnostics are acceptable, but the predicted value does not match laboratory results—typically signaling a laboratory error.
  • X markers show both diagnostic failures and unusual samples, indicating a true system failure that requires intervention.
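The marker logic above reduces to two boolean checks per sample. This minimal sketch (the function name, argument names, and "no flag" label are illustrative assumptions; the actual thresholding of the underlying statistics is assumed to happen upstream) maps those checks onto the chart markers:

```python
def classify_sample(unusual: bool, lab_mismatch: bool) -> str:
    """Map the two validation checks onto chart markers.
    `unusual` means the model diagnostics flagged the spectrum as atypical;
    `lab_mismatch` means the prediction and laboratory value disagree."""
    if unusual and lab_mismatch:
        return "X marker"          # true system failure: intervene
    if unusual:
        return "yellow triangle"   # in control but unusual: candidate for the model
    if lab_mismatch:
        return "red square"        # diagnostics fine: suspect the lab result
    return "no flag"               # typical sample, in control

print(classify_sample(False, True))  # → red square
```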

Although the ASTM calculations are complex, they are well suited to automated computation. Once implemented, users can quickly drill into individual samples to examine diagnostics, model predictions, and laboratory values. This allows identification of discrepancies where the system is stable but results are out of specification, often revealing process or lab issues rather than model faults.

By compressing large volumes of historical data into actionable metrics and applying these models in real time, organizations can distinguish false positives, detect procedural problems, and better understand the sources of disagreement between manufacturing and laboratory measurements.

ASTM’s work is notable because it formally codifies how to evaluate model performance—something that had not been standardized before. While adoption has been strongest in refining, these methods are largely unknown in pharmaceuticals, chemicals, and food manufacturing.

With real-time feedback and rapid model updates, these systems enable smarter, more adaptive manufacturing. This is where machine learning and AI naturally fit: not as replacements, but as practical overlays that enhance existing workflows and produce outputs that can support regulatory discussions.

Learn more about Ai-Metrix automation. Contact us at info@infometrix.com for a demo.

Quality Assurance with Ai-Metrix Automated Model Validation

Ai-Metrix® now offers a fully automated model validation framework, combining real-time tracking with powerful diagnostics based on ASTM D6122 and comprehensive Nelson Rule monitoring. Designed to support high-stakes environments like gasoline blending, this feature ensures your predictive models remain accurate, reliable, and audit-ready as new data flows in.

🎯Smarter Monitoring Starts Here
As your team uploads fresh data to the Ai-Metrix server, our system continuously evaluates model performance using robust statistical tools. A streamlined dashboard gives you instant visibility:

✅ Visual Control Charts – Instantly identify anomalies with trend plots showing ±1, 2, and 3 standard deviations.
✅ Advanced Rule Integration – Choose from Nelson Rules or ASTM D6122 checks to detect early signs of model drift or calibration issues.
✅ Dynamic Model Grid – See all active models at a glance, organized by product grade and property. Click to dive deeper into sample counts and validation metrics.
✅ Flexible Metric Selection – Monitor predicted values, residuals, F-ratios, or Mahalanobis distance to match your validation strategy.
✅ Violation Alerts – Rule violations can trigger an email to a distribution list for action.
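For a sense of what the ±1/2/3-sigma bands and Nelson Rule checks compute, here is a generic sketch of the first two rules applied to a residual series (illustrative code, not the Ai-Metrix implementation; the demo data is synthetic):

```python
def nelson_flags(values, mean, sigma):
    """Flag points by Nelson Rule 1 (a point beyond 3 sigma from the mean)
    and Rule 2 (nine consecutive points on the same side of the mean)."""
    rule1 = [abs(v - mean) > 3 * sigma for v in values]
    rule2 = [False] * len(values)
    run_side, run_len = 0, 0
    for i, v in enumerate(values):
        side = 1 if v > mean else (-1 if v < mean else 0)
        run_len = run_len + 1 if (side == run_side and side != 0) else (1 if side else 0)
        run_side = side
        if run_len >= 9:
            rule2[i] = True          # sustained drift to one side of the mean
    return rule1, rule2

# demo: one gross excursion, then a sustained positive drift
residuals = [0.1, -0.2, 0.0, 3.5] + [0.4] * 9
r1, r2 = nelson_flags(residuals, mean=0.0, sigma=1.0)
print(sum(r1), any(r2))  # → 1 True
```

Rule 1 catches the single out-of-control point; Rule 2 catches the slow drift that never crosses a fixed limit, which is exactly the kind of early calibration degradation the dashboard is designed to surface.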

🎯Why It Matters
→ Regulatory Alignment – ASTM D6122 compliance, built-in.
→ Hands-Free Oversight – Continuous, automated validation as data is collected.
→ Proactive Alerts – Catch issues before they affect process quality.
→ Complete Transparency – From calibration sample size to rule violations, everything is traceable and actionable.
→ Real-Time Updates – With Ai-Metrix's calibration engine, model updates are available in real time to bring the system back into compliance.

🎯Confidence in your models isn’t optional—it’s critical.
Ai-Metrix delivers a smarter, automated approach to model validation that helps your team maintain compliance, ensure process integrity, and make better decisions, faster. Explore the new validation dashboard today and take control of your model quality.

Book a quick demo and see how Ai-Metrix can elevate your operations. info@infometrix.com

“Quality is never an accident. It is always the result of intelligent effort.”
– John Ruskin, English writer (1819-1900)