Online Statistical Techniques in Batch Operations

by Jim Cahill | Jun 6, 2008 | Services, Consulting & Training, Technologies

Jim Cahill
Chief Blogger, Social Marketing Leader

I was reading the great Control magazine article, Data Analytics in Batch Operations, by Lubrizol’s Robert Wojewodka and Emerson’s Terry Blevins. It describes the challenges in applying online analytics in batch production for early fault detection and prediction of end-of-batch quality parameters. Done right, this can help avoid out-of-spec batches, waste and rework, and the opportunity cost of a lost batch. That can be critically important in specialty chemical and pharmaceutical manufacturing, which in the words of the authors, “depend heavily on batch processing to produce low-volume, high-value product.”

The rewards are great if you can make these online analytics work, but so are the challenges. One of the biggest is the time difference between batches. Operators and events can halt and restart the process, whether for manual additive additions, waiting on common equipment to become available, or abnormal conditions that may develop.

Another challenge is that online measurement of quality parameters “may not be technically feasible or economically justified.” In those cases quality is determined from lab samples, and the lab results must be available to the online analytics toolset to perform these analytics.

Other challenges they noted included variations in feedstocks, varying operating conditions, concurrent batches, and the assembly and organization of the data.

The concept of the “Golden Batch,” comparing batches in progress or just completed with ideal ones from the past, has been around a long time. The authors point out two big weaknesses with this approach. The first is that the conditions indicated by each measurement may affect product quality in different ways. The second is that it is a univariate approach to a multivariate problem: no knowledge is gained of the relationships among process variations.
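To make that second point concrete, here is a minimal sketch, using invented numbers rather than anything from the article, of how two process variables can each sit comfortably inside their own univariate limits while their combination is clearly abnormal. A multivariate statistic such as Hotelling’s T², the kind of check a PCA-based monitor applies, catches the broken relationship:

```python
# A minimal sketch of the univariate-vs-multivariate point: two process
# variables, each inside its own +/-3-sigma limits, can still form an
# abnormal combination that only a multivariate statistic catches.
# All data here is synthetic and for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Simulated "good batch" history: temperature and pressure move together.
temp = rng.normal(100.0, 1.0, 500)
pres = 2.0 * temp + rng.normal(0.0, 0.5, 500)
X = np.column_stack([temp, pres])
mean, cov = X.mean(axis=0), np.cov(X, rowvar=False)
cov_inv = np.linalg.inv(cov)

# A faulty sample: each variable is within its univariate limits,
# but the usual temperature-pressure relationship is broken.
sample = np.array([102.0, 200.0])  # pressure "should" be near 204 here

within_univariate = np.all(np.abs(sample - mean) < 3 * X.std(axis=0))

# Hotelling's T^2 measures distance in the correlated, multivariate sense.
diff = sample - mean
t2 = diff @ cov_inv @ diff

print(f"Inside univariate 3-sigma limits: {within_univariate}")  # True
print(f"Hotelling's T^2: {t2:.1f}")  # far above a typical control limit
```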

In a prior post, I’ve discussed some of the analytic tools involved, principal component analysis (PCA) and partial least squares (PLS). By applying these online, “…changes can be made in the batch to correct for detected faults or deviations in the predicted value of key quality parameters.”
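As an illustration of the prediction side, here is a hedged sketch of using PLS to predict an end-of-batch quality parameter from batch trajectory data. The data, dimensions, and weights are all invented, and this is not the article’s implementation; a real application would build the model from unfolded historical batch records and make the prediction mid-batch, while there is still time to correct:

```python
# A sketch of the PLS idea: predict an end-of-batch quality parameter
# from in-batch trajectory data. Synthetic data; a real application
# would use historical batch records unfolded into one row per batch.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

n_batches, n_samples = 60, 50  # 60 historical batches, 50 time points each
trajectories = rng.normal(0.0, 1.0, (n_batches, n_samples))

# Pretend quality depends mostly on conditions early in the batch.
true_weights = np.linspace(1.0, 0.0, n_samples)
quality = trajectories @ true_weights + rng.normal(0.0, 0.1, n_batches)

# Fit a low-dimensional latent-variable model on the historical batches.
pls = PLSRegression(n_components=3)
pls.fit(trajectories, quality)

# Score a new batch's trajectory against the model to predict its
# final quality before the lab result is available.
new_batch = rng.normal(0.0, 1.0, (1, n_samples))
predicted = pls.predict(new_batch).ravel()[0]
print(f"Predicted end-of-batch quality: {predicted:.2f}")
```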

I mentioned to Terry that I gave this article a close read and was working on a blog post. He told me the really innovative thing they were able to do was to apply dynamic time warping (DTW). The article describes its use:

…allows such [batch time length] variations to be addressed by synchronizing batch data automatically using key characteristics of a reference trajectory.

This normalization process is covered in detail in Greg McMillan and Mike Boudreau’s book, New Directions in Bioprocess Modeling and Control, and discussed in the article, PAT Tools for Accelerated Development and Improvement.
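For readers curious what DTW actually does, here is a textbook dynamic-programming sketch, not the article’s or the book’s implementation, that aligns a new batch’s trajectory to a reference trajectory even when the batch runs at a different pace:

```python
# A minimal dynamic time warping (DTW) sketch: synchronize a batch
# trajectory to a reference trajectory of a different length, which is
# the alignment problem the authors describe. Illustrative only.
import numpy as np

def dtw_path(reference, batch):
    """Return the accumulated cost matrix and the optimal alignment path."""
    n, m = len(reference), len(batch)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(reference[i - 1] - batch[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # advance reference only
                                 cost[i, j - 1],      # advance batch only
                                 cost[i - 1, j - 1])  # advance both
    # Trace the optimal alignment back from the end of both trajectories.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return cost[1:, 1:], path[::-1]

# Reference temperature profile vs. a batch that ran ~25% longer.
reference = np.sin(np.pi * np.linspace(0, 1, 40))
batch = np.sin(np.pi * np.linspace(0, 1, 50))
batch += np.random.default_rng(2).normal(0, 0.02, 50)

_, path = dtw_path(reference, batch)
# Each pair maps a reference time index to a batch time index; this
# mapping is what lets batches of different lengths be compared.
print(path[:5], "...", path[-5:])
```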

It’s hard to do the article justice in a short blog post, so if you have 18 minutes and 30 seconds to spare, watch the interview Walt Boyes did with Bob, Terry and Philippe Moro at last year’s Emerson Exchange.
