Implementing Quality by Design

Jonathan Lustri
Life Sciences Industry Consultant


Many manufacturers in the Life Sciences industry are challenged to respond to a new paradigm for process development and manufacturing. The FDA’s cGMP for the 21st Century initiative is driving the industry to change its development and manufacturing to be based on process understanding and a risk-based approach. This new paradigm uses terminology such as Quality by Design (QbD), continued process verification (CPV), and process analytical technology.

Source: Implementing Quality by Design, Helen Winkle, Director FDA, FDA/PDA Joint Regulatory Conference 2007


A QbD strategy has a significant impact on the methods and tools for process development, process validation, and process control. Process understanding is the fundamental building block that underlies quality by design. This begins with defining the desired patient performance characteristics and understanding how product Critical Quality Attributes (CQAs) impact patient performance characteristics.

Once research has determined the CQA specifications and their acceptable variability, process development builds an understanding of the relationship between the CQAs and the Critical Process Parameters (CPPs). The state-of-the-art approach is to develop a multivariate model that describes how the CPPs relate to the CQAs and defines the range of CPP variability within which the product remains inside the acceptable CQA limits. This region is known as the process design space.
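The design-space concept can be sketched in a few lines of code. The example below is a toy illustration, not from the article: it fits a first-order response-surface model to hypothetical DOE data relating two CPPs to one CQA, then checks whether a candidate operating point keeps the predicted CQA within its acceptable range. All variable names, units, and numbers are invented.

```python
import numpy as np

# Hypothetical DOE data: rows are experiments, columns are two CPPs
# (e.g. drying temperature in deg C and airflow in m^3/h).
cpps = np.array([
    [60.0, 100.0], [60.0, 140.0], [80.0, 100.0],
    [80.0, 140.0], [70.0, 120.0],
])
# Measured CQA for each experiment (e.g. residual moisture, %).
cqa = np.array([2.8, 2.1, 2.0, 1.4, 2.0])

# Fit a first-order response-surface model: cqa ~ b0 + b1*temp + b2*flow.
X = np.column_stack([np.ones(len(cpps)), cpps])
coef, *_ = np.linalg.lstsq(X, cqa, rcond=None)

def predict_cqa(temp, flow):
    """Predict the CQA at a candidate CPP setting."""
    return coef[0] + coef[1] * temp + coef[2] * flow

def in_design_space(temp, flow, lo=1.5, hi=2.5):
    """A setting is inside the design space if the predicted CQA
    stays within its acceptable range [lo, hi]."""
    return lo <= predict_cqa(temp, flow) <= hi
```

A real design space would be built from a designed experiment with interaction and curvature terms and a statistical assessment of model adequacy; the linear fit above only shows the shape of the calculation.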

A model that defines the design space allows more flexibility in operating the process. Traditional pharmaceutical manufacturing focuses on repeating a fixed recipe and controlling the recipe parameters to fixed setpoints. The QbD strategy changes this to operating the process anywhere within the defined design space, resulting in more robust manufacturing and higher product quality. The traditional approach does not provide the flexibility to adjust the process in response to unavoidable variability.

2 CPP - 2 CQA Design Space Map


For example, it may be best to operate a fluid bed dryer at a different airflow rate or temperature when the ambient humidity or temperature is outside its normal range. Because a multivariate design-space model predicts the critical quality attributes from the critical process parameters, it becomes possible to implement process control strategies that provide real-time control of product quality. The figure (left) illustrates a design-space model that defines how two critical process parameters impact two critical quality attributes, and shows how these parameters may be controlled to stay within the design space.
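Extending the same idea to the 2 CPP - 2 CQA case in the figure, a control strategy needs to know which CPP settings keep all of the CQAs inside the design space. The sketch below uses hypothetical linear models and limits, invented purely for illustration, to scan a grid of candidate settings and keep the feasible ones:

```python
# Hypothetical linear models mapping two CPPs (airflow, inlet temperature)
# to two CQAs (residual moisture, granule size); all coefficients are
# invented for illustration.
def predict_cqas(airflow, temp):
    moisture = 5.0 - 0.010 * airflow - 0.030 * temp   # %
    size = 80.0 + 0.050 * airflow - 0.100 * temp      # microns
    return moisture, size

def feasible_setpoints(moisture_max=2.0, size_range=(75.0, 85.0)):
    """Scan a grid of CPP settings and keep those whose predicted
    CQAs all fall inside the design space."""
    ok = []
    for airflow in range(100, 201, 10):   # m^3/h
        for temp in range(50, 81, 5):     # deg C
            m, s = predict_cqas(airflow, temp)
            if m <= moisture_max and size_range[0] <= s <= size_range[1]:
                ok.append((airflow, temp))
    return ok
```

In practice the feasible region would come from the validated multivariate model rather than a grid scan, but the set-membership question a controller must answer is the same.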

Process validation is also impacted by the new paradigm. The FDA has issued a new process validation guidance that encourages continued process verification. Traditional validation relied on the principle that a manufacturer could prove it repeatably produced quality product by following a fixed procedure on specified equipment; if this was demonstrated, the process was considered validated. Under the new guidance, validation is based on the principle that the manufacturer can establish scientific evidence that the process is capable of consistently producing quality product. One aspect of this is continued process verification, which provides ongoing assurance during routine production that the process remains in a state of control and continues to produce quality product.
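Continued process verification ultimately reduces to monitoring: comparing routine-production measurements of a CQA against statistically derived limits. A minimal Shewhart-style sketch, assuming a reference data set from validated batches (all numbers below are hypothetical):

```python
import statistics

def control_limits(reference):
    """Shewhart-style 3-sigma limits from a reference (validated)
    set of CQA measurements."""
    mu = statistics.mean(reference)
    sigma = statistics.stdev(reference)
    return mu - 3 * sigma, mu + 3 * sigma

def verify_batch(values, limits):
    """Return the indices of measurements outside the control limits,
    i.e. evidence the process may not be in a state of control."""
    lo, hi = limits
    return [i for i, v in enumerate(values) if not lo <= v <= hi]
```

A production CPV program would add run rules, capability indices, and trending across batches; the point here is only that verification is a continuous statistical comparison, not a one-time demonstration.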

Process Analytical Technology (PAT) is an important tool for Quality by Design and Continued Process Verification. PAT instruments are used to measure critical quality attributes in real time.

Near-infrared (NIR), Fourier transform infrared (FT-IR), Raman spectroscopy, and Focused Beam Reflectance Measurement (FBRM) are a few of the popular PAT instruments that enable real-time measurement of critical quality attributes. The use of these instruments can significantly improve product quality and compliance. There are, however, some challenges to applying PAT instruments.

The outputs of many of these instruments are multivariate data sets, such as a light spectrum. Spectral data cannot be calibrated with the simple methods used for univariate instruments such as a pressure sensor. Instead, a chemometric model must be applied to the spectral data to predict the critical quality attribute. A variety of process conditions must be included when developing the chemometric model to ensure that it accurately predicts the critical quality attribute across the expected operating range.
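As a toy illustration of the idea, and not a real chemometric workflow, which would typically use PLS with careful validation and preprocessing, the sketch below builds synthetic NIR-like spectra whose analyte band scales with concentration and calibrates a ridge-regression model from spectrum to concentration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spectra": a single analyte absorption band whose height
# scales with concentration, plus a little noise (purely illustrative).
wavelengths = np.linspace(1100, 2500, 50)            # nm, NIR-like range
band = np.exp(-((wavelengths - 1700) / 60.0) ** 2)   # analyte band shape
conc_train = np.array([0.5, 1.0, 1.5, 2.0, 2.5])     # reference CQA values
spectra = np.outer(conc_train, band) + 0.01 * rng.standard_normal((5, 50))

# A minimal chemometric calibration: ridge regression from the full
# spectrum to concentration.
lam = 1e-3
A = spectra.T @ spectra + lam * np.eye(len(wavelengths))
b_coef = np.linalg.solve(A, spectra.T @ conc_train)

def predict_conc(spectrum):
    """Apply the calibration model to a new spectrum."""
    return float(spectrum @ b_coef)
```

Because each training spectrum is the analyte band scaled by concentration, the regression vector recovers that relationship and a new spectrum's concentration can be predicted directly, which is exactly the role the chemometric model plays between the PAT instrument and the control system.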

Example Data Output from NIR Spectrometer


PAT instruments are ideal tools for moving to real-time quality control. However, using these instruments requires new skills and new software environments. The chemometric models are developed with modeling tools such as Camo Unscrambler, MATLAB Solo Predictor, and Umetrics SIMCA-P. These are typically the tools of the analytical chemist, not the control engineer.

Also, there is the challenge of how to interface to the PAT instrument, retrieve the spectral data, apply the chemometric model to the spectral data to predict the critical quality attribute, and finally close the loop with the control system to control the process within the design space. To make matters more challenging, all of this must be done in a 21 CFR Part 11 compliant environment so that the integrity of the data is preserved to prove product quality.
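One cycle of such a closed loop might look like the sketch below. The instrument and model interfaces (`read_spectrum`, `chemometric_model`) are hypothetical placeholders, and the proportional adjustment and design-space clamp are illustrative, not a recommended control law:

```python
def pat_control_step(read_spectrum, chemometric_model, setpoint,
                     cpp_value, gain=0.5, cpp_limits=(50.0, 90.0)):
    """One cycle of a PAT control loop (hypothetical interfaces):
    measure, predict the CQA, then nudge the CPP toward the CQA
    setpoint while staying inside the validated design space."""
    spectrum = read_spectrum()            # instrument interface
    cqa = chemometric_model(spectrum)     # spectrum -> CQA prediction
    error = setpoint - cqa
    new_cpp = cpp_value + gain * error    # proportional adjustment
    lo, hi = cpp_limits                   # design-space bounds on the CPP
    return max(lo, min(hi, new_cpp)), cqa
```

Every step shown here (the instrument read, the model evaluation, the setpoint write) is also an electronic record, which is where the 21 CFR Part 11 requirements for data integrity and audit trails come in.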

The PAT Integration Challenge


This is not a small integration challenge. It can get even more challenging if the multivariate model requires data from other sources such as the lab!

To make this challenge more manageable, Emerson has partnered with Optimal Industrial Automation. Optimal enables Quality by Design by providing a software system that supports the integration of PAT instruments, multivariate models, and process control systems. synTQ, which stands for synchronized total quality, is a PAT software platform that enables the implementation of PAT and is intended for use in the lab, in process development, and in manufacturing. It provides pre-built adapters to interface with most PAT instruments, multivariate modeling packages, process control systems, and other data sources.

synTQ - Integrating Data for PAT Applications


Using a tool like synTQ enables automated PAT methods to be developed in R&D and deployed into manufacturing. Both the data used to develop the PAT methods and the run-time data are stored within the synTQ environment. The typical data storage solutions found in the industry (OSIsoft PI or Aspen InfoPlus.21) are not fit for purpose when it comes to data generated by PAT instruments. Spectral data sets are large and require special methods to organize, store, and retrieve them for use in process control and for compliance.

Clearly, there is a lot that is new for the process control engineer to learn! Implementing Quality by Design will take collaboration among chemists, process scientists, and process control engineers. Now is the time for motivated control engineers to develop the knowledge and skills to implement control strategies based on advanced analytical instruments and multivariate models as the life sciences industry speeds into the 21st century of pharmaceutical manufacturing.

We’ll be exploring the importance of batch analytics in a future post and you can connect and interact with other Life Science professionals in the Life Sciences Industry track of the Emerson Exchange 365 community.
