Tuesday, July 7, 2009

Automated In-Line Dilution - Introduction

This blog will introduce the basic terms, engineering concepts, and quality concerns for automated in-line dilution equipment. An article with this content was also published in the Journal of GXP Compliance, an Institute of Validation Technology publication.

I welcome your comments, as I hope to make this discussion interactive with those of you who are familiar with this equipment or who want to learn more.

Automated in-line dilution is an increasingly popular technology in the biopharmaceutical industry. In-line dilution is a process that can help solve some of the capacity, financial, and quality concerns that biopharmaceutical manufacturing plants may face with regard to process solution makeup and delivery. This post conveys a general knowledge of the technology, how it works, and what the quality impacts are.

It’s not uncommon for there to be some hesitation or uncertainty about implementing new technologies. After all, there is a comfort level with current, traditional processes, in which the failures are known and there is general knowledge of how to address them. Implementing new manufacturing methods requires a familiarization period, especially when people or organizations do not have a strong understanding of what risks may be present and how to mitigate them. In addition to addressing new process methods, the quality and manufacturing science organizations have to consider the FDA guidance issued in 2004, Pharmaceutical CGMPs for the 21st Century — A Risk-Based Approach, and ICH’s Q10. Among other things, these guidances urge companies to use process analytical technologies (PAT) to accomplish Quality by Design (QbD). The thought of incorporating PAT adds another level of complexity to adopting new process methodologies. To further complicate the issue, some people may already have negative feelings about PAT based on past experiences.

One of the many concerns in biopharmaceutical facilities is capacity. Biopharmaceutical companies are now obtaining much higher fermentation and cell culture yields than they were several years ago and have outgrown the capacity of downstream process equipment and production facilities. They are also challenged with manufacturing at large scales. Manufacturing 10,000 L batches of process solutions in large tanks is inherently difficult. Making a 1 L solution in a lab can be done very precisely using analytical instruments calibrated to the milligram; a 1 L solution in the lab is also very easy to mix and does not take long to mix thoroughly. On the other hand, making a 10,000 L solution requires technology like load cells or level probes, which have accuracies at least 10 times worse than lab instruments. If you add a bag of salt to a 10,000 L vessel, the mixing is anything but uniform and efficient; it requires significant time to achieve a uniform solution. Mixing is so difficult at large scale that mixing studies and validation are required to ensure the process is reliable and repeatable.

The scale-up issues here are accuracy and ergonomics. If your development process was done using lab-prepared solutions, you have to be prepared to deal with new variability once you scale up using traditional solution makeup techniques.

So where does in-line dilution fit in? In-line dilution has a huge advantage over large-scale traditional processes because the mixing and preparation are actually done at a small scale (think of the holdup volume of the skid vs. the 10,000 L buffer prep tank). In addition, in-line dilution processes can incorporate feedback control, with or without in-line mixing, to achieve accuracies equal to or better than those in the lab. As mentioned above, a process that was originally designed for a 10,000 L batch of process solution may now require twice as much solution due to increased yields. The manufacturing process now needs to make two 10,000 L batches in the preparation vessel and transfer each batch to a 20,000 L storage vessel. Is there a 20,000 L vessel available? Is there room to install a tank this size?
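The feedback control mentioned above can be sketched as a simple proportional trim loop. This is a minimal illustration only, assuming a conductivity sensor on the product stream and a controllable diluent flow; the function name, gain, and flow limits are arbitrary assumptions, not a specific vendor design.

```python
def feedback_trim(setpoint, measured, diluent_flow, gain=0.5,
                  min_flow=0.0, max_flow=200.0):
    """One step of a proportional (P-only) trim on the diluent flow.

    If the measured conductivity (mS/cm) is above the setpoint, the
    product is too concentrated, so the diluent flow (L/min) is
    increased; if it is below, the flow is decreased. The result is
    clamped to the valve's operating range.
    """
    error = measured - setpoint
    new_flow = diluent_flow + gain * error
    return max(min_flow, min(max_flow, new_flow))

# Measured 12 mS/cm against a 10 mS/cm setpoint: add more diluent.
print(feedback_trim(10.0, 12.0, 90.0))  # → 91.0
```

A real skid would layer integral action, alarm limits, and sensor validation on top of this, but the correction principle is the same.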

Below is an illustration of this situation and how in-line dilution can solve this large scale problem.




What is automated in-line dilution?
In-line dilution is a process in which two streams are brought together in a controlled fashion to meet an overall target concentration. A dilution ratio of up to 10:1 or more can often be achieved by current equipment designs, which typically allows a product solution of up to 10X concentration to be utilized. The maximum dilution ratio is limited by both equipment constraints and the properties of the concentrated solution. Much larger dilutions can be obtained by placing multiple in-line dilution processes in series.

The equipment utilized for an in-line dilution process is compact and usually portable. It is typically only capable of performing one step at a time, meaning that only two inlet streams can be combined to make an intermediate or final product. If a second process step is necessary, such as the addition of a third solution or the adjustment of another parameter (e.g., pH), then a second module can be added to the equipment. Multiple skids can also be placed in series to accomplish this task. An intermediate is the output of any one in-line dilution module or skid; the intermediate is then directed to the inlet of the subsequent module or skid to perform the next processing step. Below is an illustration of a process in which multiple dilutions or processing steps can be performed.
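The dilution ratio described above reduces to a simple steady-state mass balance on the two inlet streams. A minimal sketch in Python (the function name and flow units are illustrative assumptions):

```python
def dilution_flows(total_flow_lpm, concentrate_x, target_x=1.0):
    """Steady-state mass balance for blending a concentrate with diluent.

    total_flow_lpm: desired product flow rate (L/min)
    concentrate_x : fold-concentration of the stock (e.g., 10 for a 10X buffer)
    target_x      : fold-concentration of the product (1.0 = final strength)

    Returns (concentrate flow, diluent flow) in L/min.
    """
    if target_x > concentrate_x:
        raise ValueError("cannot concentrate a solution by dilution")
    conc_flow = total_flow_lpm * target_x / concentrate_x
    diluent_flow = total_flow_lpm - conc_flow
    return conc_flow, diluent_flow

# A 10X concentrate diluted to 1X at 100 L/min product flow requires
# 10 L/min concentrate and 90 L/min diluent (a 10:1 dilution ratio):
print(dilution_flows(100, 10))  # → (10.0, 90.0)
```

Chaining skids in series simply feeds one function's product stream into the next call as the new concentrate.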

Automation of the process allows for the final product solution to be manufactured “just in time” and the small portable equipment is capable of delivering the final product at the “point of use”. Below is a brief explanation of each of these terms:

Just in time – the product solution is manufactured on demand as required by the process in real time. The product does not require preparation prior to beginning the process, which eliminates the need for large storage vessels. This term originates from lean manufacturing and supply chain management as a method of reducing storage of intermediate products.

Point of use – the in-line dilution equipment can be placed at or near the point of use in order to eliminate intermediate storage tanks.

Portable – In-line dilution equipment can be designed as a “skid” system in which all of the components are fitted to a frame mounted on wheels. A portable skid makes transporting much easier and allows for the in-line dilution to be performed at the point of use, which could be in several areas of a facility.

Skid – A skid is a portable or semi-portable system that performs one or more unit operations and only requires simple utility hookups to operate. As an analogy, skids are to appliances as manufacturing plants are to homes.

Buffer – (Biopharmaceutical term) A process solution that is typically aqueous and contains one or more salts; the solution’s ability to maintain a stable pH value is implied but not necessarily valid.

Follow this blog for future discussions on:
  • Engineering Principles & Designs

  • Equipment Components

  • Process Materials

  • Process Solutions

  • Operation

  • Maintenance

  • Quality Concerns

Data Monitoring - Introduction

Welcome to the Kymanox Engineering blog. My name is Brandon Patterson and I work for Kymanox (ki'-mah-noks'), a growing technical project management and engineering company in the biotechnology, pharmaceutical, and medical device industries. I am currently a Sr. Process Engineer for the Midwest office. My 8 years in the pharmaceutical industry span process engineering, equipment engineering, maintenance, and validation.

I have had the opportunity to work on projects that involved the use of data monitoring / data logging instruments to validate environmental chambers, sterilizers, washers, and facilities. This post and future posts will introduce the basic terms, engineering concepts, and quality concerns for data monitoring equipment. The terms data monitoring and data logging will be used interchangeably in this blog. An article with this content was also published in the Journal of GXP Compliance, an Institute of Validation Technology publication.

I welcome your comments, as I hope to make this discussion interactive with those of you who are familiar with this equipment or who want to learn more.

Data logging is the process of collecting information over a certain period of time at predetermined time intervals. The data is generally collected sequentially and at rates faster than what is possible by human observation. Data logging instruments allow for highly accurate measurements to be obtained. One of the most common types of data logging is temperature data recording. This practice is common in the pharmaceutical, biotech, and medical device industries and will therefore be the primary focus and example in this blog. However, with appropriate instruments, many other forms of data can be collected. Examples include:
  • Relative humidity
  • Pressure
  • Electrical currents

Many of the concepts discussed in this blog can also be applied to other forms of data logging.

Applications and importance of data logging

Data logging is used in engineering and validation activities to gather sequential readings that can be analyzed to gain a greater understanding of the operation of a system. Systems that may utilize data logging include any area that has a requirement to maintain a given temperature specification, such as:
  • Warehouses
  • Production rooms
  • Refrigerators
  • Freezers
  • Sterilizers

Data is usually collected simultaneously at multiple locations within the area being tested. Locations are carefully chosen so that the entire area is “mapped”. The data is then analyzed to determine the worst-case locations within the area or system (i.e., the hottest or coldest locations). The data can then be used to perform additional calculations including:
  • Minimums
  • Maximums
  • Averages
  • Standard Deviations
  • Lethality of a sterilization cycle (F₀)

Since a complete data logging event contains hundreds to thousands of data points, these calculations make it possible to draw conclusions about the system from the data collected. Data logging is critical to obtaining an accurate, detailed representation of how a system operates. Some systems, such as a sterilizer, change temperature very quickly, while others, such as a refrigerator, change temperature slowly. These systems may also have long cycle times that make manual data recording impractical. Data logging also enables a significant number of measurements to be collected over short or long periods of time; this number can easily surpass the quantity of points required for statistical significance.
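The calculations listed above can be sketched in a few lines of Python. This is an illustrative example, not validation software; it assumes a fixed logging interval and uses the standard F₀ definition (equivalent exposure time at 121.1 °C with z = 10 °C):

```python
import statistics

def summarize(temps_c, interval_min=1.0, z=10.0, t_ref=121.1):
    """Summary statistics and F0 lethality for one probe's logged temperatures.

    F0 = sum(10 ** ((T - t_ref) / z)) * interval_min, i.e. the equivalent
    exposure time, in minutes, at the 121.1 C reference temperature.
    interval_min is the logging interval in minutes.
    """
    f0 = sum(10 ** ((t - t_ref) / z) for t in temps_c) * interval_min
    return {
        "min": min(temps_c),
        "max": max(temps_c),
        "mean": statistics.mean(temps_c),
        "stdev": statistics.stdev(temps_c),
        "F0": f0,
    }

# Five 1-minute readings held exactly at 121.1 C accumulate F0 = 5.0:
print(round(summarize([121.1] * 5)["F0"], 3))  # → 5.0
```

Running the same summary for every probe location and comparing the minimum and maximum values is how the worst-case (hottest and coldest) spots in a mapped area are identified.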

Follow this blog for future discussions on:
  • Types of data loggers, including stand-alone and computer-based instruments
  • Temperature elements used in data loggers, including thermocouples, thermistors, and resistance temperature detectors (RTD)
  • The configuration of data loggers, including wiring, multiple channels, and self-contained instruments
  • Calibration of data loggers
  • Use of data loggers in Validation (Qualification), including environmental chambers, steam sterilizers, warehouses, and production rooms
  • Collecting data, including probe placement, sampling intervals, and the duration of monitoring protocols
  • Electronic data, including electronic files, printing directly from software, exporting to Excel, and key aspects of 21 CFR Part 11 compliance