Guest Column | October 30, 2023

Understanding The Importance Of Real-time Monitoring For Protein Aggregation

A conversation with Danny Chou, founder and CEO of Compassion BioSolution


Manufacturing proteins for biopharmaceutical applications is fraught with challenges linked to maintaining and demonstrating their stability. One of the biggest issues related to protein instability is aggregation, which, depending on its nature or extent, can compromise a therapeutic’s safety or efficacy, in turn threatening patient health and triggering potentially disastrous recalls of an otherwise promising drug.

Danny Chou began his career as a clinical pharmacist before returning to academia to earn his Ph.D. in pharmaceutical biotechnology at the University of Colorado – Denver, where he became engrossed with protein stability, particularly around monitoring for protein aggregation. Since then, he has dedicated himself to the analytics that support better monitoring for protein-based therapeutics. After stints at major biopharmas such as Amgen, Sanofi, and Gilead Sciences, Chou launched Compassion BioSolution, where he has spent nearly a decade helping companies foster a more advanced analytical paradigm, built on real-time, sensitive monitoring techniques capable of providing faster, better insight into a molecule's propensity to aggregate.

While most biopharmas continue to rely on batch monitoring to gain insight into protein aggregation, these techniques are often insufficient to give a clear and complete picture of what is occurring in a process. Regulators and industry leaders alike have begun to recognize the value of deeper monitoring, and, for Chou, who has long promoted the use of more advanced process analytical technologies (PATs) to gain a better understanding of aggregation, wider adoption of these technologies by the biopharmaceutical industry is only a matter of time. We asked Chou to share his expertise in these approaches, discuss trends in the industry, and explain how real-time monitoring affects variables like operator training and data management.

Can you give us the abridged definition of real-time protein aggregate monitoring and explain how it's different from other techniques?

Danny Chou, Ph.D., Pharm.D., founder and CEO, Compassion BioSolution
The best analogy I can use is that batch monitoring for protein aggregation is like snapping pictures of an event. When you're snapping photos, you're capturing what you think you want to see at the moment you think it's happening. Batch-based methods, most of the time, are not actually in action at the moment something occurs, such as the conversion of a monomeric protein, when proteins begin to collide and form an aggregate with a neighboring molecule. Batch methods were never designed for, and are not capable of, capturing that. A batch-based method is analogous to snapping photos and then retrospectively reviewing the pictures to try to piece the whole scenario together.

In contrast, real-time monitoring is akin to having a CCTV camera trained on the site where aggregation may occur, whether that is in-vial or in-process during manufacturing. You can see the event from the moment it begins and have fluid, uninterrupted information as it continues. That's the paradigm difference: a video of the event as it happens in real time is far more powerful than the snapshot approach.

What's the current uptake or interest in real-time monitoring?

It depends on what level you're talking about. Overall, this is still in its infancy. Companies that are very interested in reducing the cost of goods sold for commercial products, and that want to transition to a continuous process, may evaluate different forms of PAT. But currently, there's very little talk or awareness of looking at aggregation, and specifically of applying PAT to monitor aggregation. Most of the conversations are around monitoring other attributes, such as concentration, osmolality, and other aspects of the molecule or in-process material. For aggregation monitoring in real time, we're still very early in its evolution. I'm biased because I've been interested in this for a long time, and it's why I spend so much time sharing my background, because from the time I entered the field I've felt that analytical ability is the bottleneck.

I have a longstanding appreciation for real-time monitoring because my Ph.D. was focused on the mechanistic aspects of these types of phenomena. That means you really have to see it right from the moment it happens. I use the analogy of being at the scene of the crime, so there's no doubt about what's causing it. Going back to the adoption rate, I think we are very early. In fact, recently, the United States Pharmacopeia formed an expert panel focused on discussing this topic and on trying to come up with a list of technologies that might be amenable to this application, with the goal of writing a new USP chapter. I happen to be on that expert panel.

In terms of implementation, I have a relatively higher level of experience because I have been using real-time monitoring in the development of drug product formulations for several years now. I have even used it to compare different molecule candidates to look at their relative propensity for aggregation. This is an area where opportunity goes uncaptured, because assessing a molecule's inherent tendency to aggregate early on allows you to select a candidate that's easier to develop and to manufacture.

At the same time, if you apply real-time techniques to monitor aggregation, you can get away from the traditional way of formulation development, where companies that use batch-based monitoring will wait for months to get results. If you compare different formulations in real time, you can see which formulations are forming aggregates sooner, and that allows you to draw a conclusion as to which one has a faster onset and greater rate of aggregation. I have only seen maybe one, or at most two, publications in which a company assessed the utility of real-time monitoring and how it could be used as a stop-control system when a trend of increasing aggregates appears during, say, protein purification.
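To make this concrete, here is a minimal sketch, with hypothetical numbers rather than data from any of the studies Chou mentions, of how a continuously monitored aggregate signal could be used to rank formulations by their onset and rate of aggregation:

```python
# Minimal sketch (hypothetical data): ranking formulations by aggregation rate
# from a continuously monitored aggregate signal, instead of waiting months
# for batch stability pulls.

import numpy as np

def aggregation_rate(times_h, aggregate_pct):
    """Slope of percent aggregate vs. time (per hour) from a simple linear fit."""
    slope, _intercept = np.polyfit(times_h, aggregate_pct, 1)
    return slope

t = np.array([0, 6, 12, 24, 48], dtype=float)              # hours of monitoring
formulations = {
    "formulation A": np.array([0.2, 0.3, 0.4, 0.7, 1.2]),  # % aggregate over time
    "formulation B": np.array([0.2, 0.6, 1.1, 2.0, 3.9]),
}

for name, agg in formulations.items():
    print(f"{name}: ~{aggregation_rate(t, agg):.3f} % aggregate per hour")

# The faster-rising trace (formulation B here) flags the less stable candidate
# within days instead of months.
```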

Downstream purification is arguably the most expensive part of bioproduction. When you're trying to remove impurities, that includes impurities from the host cell, process impurities, and, importantly, aggregates. So, being able to see effective [impurity] removal and make sure you stay within certain specifications allows you to evaluate whether the process can continue as it is or needs to be interrupted. I think the motivation behind this USP effort is to gather experts together, have conversations about this, and provide those insights to the industry.

What are the greatest barriers to that more robust adoption, and have you seen anyone successfully overcome them?

The biggest barrier is, ironically, the fundamental understanding of the protein aggregation phenomenon. I think despite the progress that's been made in the last two decades, there are still many practitioners in industry who do not fully appreciate the dynamic nature of protein aggregation. Aggregates are not uniform. They do not necessarily follow any particular trend from molecule to molecule. Because there's dynamism in terms of size, morphology, and mechanism, it's still very confusing to the industry and even academics. Unfortunately, there's still quite a high percentage of people who have not studied the fundamentals and have not bothered to read the latest research or the mechanistic research that has been done.

The industry also tends to take a cookie-cutter approach to different problems. When I was in graduate school, I was lucky to be working on a protein system that aggregated in a slow, controlled fashion and mainly formed smaller soluble aggregates that I could easily capture using HPLC and that did not dissociate. But what if I have an antibody that happens to aggregate really quickly, and large aggregates are excluded from, say, the HPLC column? The most ubiquitous way to monitor aggregation is with HPLC using a principle called size exclusion. With it, any aggregate above a certain size, usually around 200 nanometers, will be filtered out from the column completely and never be seen, detected, or quantified. When you have that situation, you're basically not even seeing it. The data, the chromatogram, can tell you that you have 99% monomer when in fact a large part of the overall population has been filtered out from the column.
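As a rough illustration of that blind spot, here is a minimal sketch with hypothetical numbers showing how a size exclusion chromatogram can overstate monomer purity when large aggregates never elute from the column:

```python
# Minimal sketch (hypothetical numbers): species above the column's exclusion
# limit never elute, so the chromatogram only reflects material that passes
# through, and the reported monomer percentage can look deceptively high.

def apparent_monomer_percent(monomer_mass, small_agg_mass, large_agg_mass):
    """Apparent purity from an SEC chromatogram vs. the true value."""
    seen = monomer_mass + small_agg_mass             # large aggregates are excluded
    apparent = 100.0 * monomer_mass / seen
    true = 100.0 * monomer_mass / (monomer_mass + small_agg_mass + large_agg_mass)
    return apparent, true

# Example: 10% of the protein mass sits in large, column-excluded aggregates.
apparent, true = apparent_monomer_percent(monomer_mass=89.0,
                                          small_agg_mass=1.0,
                                          large_agg_mass=10.0)
print(f"chromatogram reports {apparent:.1f}% monomer; true value is {true:.1f}%")
# -> chromatogram reports 98.9% monomer; true value is 89.0%
```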

I have successfully applied emerging techniques to solve real problems in formulation development and other needs of my clients. I've also been pretty active on the speaking circuit talking about this, and I think the uptake is just beginning. Based on what I hear from vendors of such technologies and from major biopharma companies, they're starting to be interested in testing these new technologies, but there are still more questions than answers.

Can you talk about data management? Presumably, a team that jumps from discrete sampling to continuous monitoring is going to produce reams of additional data.

Just like when you want to film a real-time video, you're going to need a lot of memory to store that data. I think that is really easy to overcome, because now you have cloud-based approaches; you have various data storage options that let you store massive amounts of data and pull them down from any location you like. The challenge is getting people used to seeing aggregation in a different light. Currently, with the batch method, people are looking at the loss of, say, monomeric protein from the HPLC, and that's usually what people submit to the agency as part of the stability data. And there's a certain arbitrary criterion for acceptance, usually 95% or greater monomer on the HPLC. We want to begin to say, we're using a method in which we are monitoring the average molecular weight of the population of antibodies in the process, so that during downstream processing you'll be looking at the increase or decrease in the average size of the overall population.
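To illustrate the shift Chou describes, here is a minimal sketch, with a hypothetical signal and a hypothetical action limit, of flagging a drift in the population-average molecular weight during processing rather than applying a single end-of-batch pass/fail on percent monomer:

```python
# Minimal sketch (hypothetical readings and limit): acting on a trend in the
# running average molecular weight instead of a single batch pass/fail.

alarm_limit_kda = 180.0      # hypothetical action limit on the average MW

# Simulated in-process readings from a real-time detector (kDa), one per minute;
# they start near a typical IgG monomer (~150 kDa) and drift upward.
readings = [150, 151, 150, 153, 158, 166, 175, 184, 190]

for minute, mw in enumerate(readings):
    if mw > alarm_limit_kda:
        print(f"minute {minute}: average MW {mw} kDa exceeds the {alarm_limit_kda} kDa "
              "limit; flag the step for interruption or remediation")
        break
else:
    print("run completed within the average-MW limit")
```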

When you're using these sensitive techniques, the best ones are based on first principles. So, I usually use techniques like light scattering, which is very powerful because it's sensitive to even the barest presence of the smallest aggregates. And if you know how to use all the capabilities of light scattering, you can understand whether those aggregates are composed of a few large particles or many smaller particles. I can see a day when I have answered some of these mechanistic questions, and the operator can simply sit there watching whether things are shifting, push the alarm if things go out of spec, stop the train, and make modifications to remediate the problem.
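As a back-of-the-envelope illustration of why light scattering responds so strongly to trace aggregation, here is a minimal sketch under the common approximation that static light scattering intensity scales roughly with concentration times molar mass, so the readout tracks the weight-average molecular weight (Mw) of the whole population; the numbers are hypothetical:

```python
# Minimal sketch (hypothetical numbers): why a tiny mass fraction of aggregate
# shifts a light scattering readout so strongly. The signal tracks the
# weight-average molecular weight (Mw) of the population.

def weight_average_mw(species):
    """species: list of (mass_fraction, molar_mass_kDa) pairs."""
    return sum(w * m for w, m in species) / sum(w for w, _ in species)

monomer_kda = 150.0                        # a typical IgG monomer
pure = [(1.00, monomer_kda)]
spiked = [(0.99, monomer_kda),             # 1% of the mass as a 100-mer aggregate
          (0.01, 100 * monomer_kda)]

print(f"Mw, pure monomer:       {weight_average_mw(pure):.0f} kDa")
print(f"Mw, 1% large aggregate: {weight_average_mw(spiked):.0f} kDa")
# A 1% mass fraction of large aggregate roughly doubles Mw, which is easily
# visible in the scattering signal long before batch methods would catch it.
```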

From a training perspective, what are the biggest differences between real-time monitoring and batch monitoring?

From a training perspective for batch monitoring, an analyst can learn one instrument at a time by following the protocol and then developing a method that's suitable for each case. But real-time monitoring, particularly as it applies to manufacturing, is a much more global, high-level strategic decision and adaptation. I think there will be lengthier training, from the fundamentals, as I mentioned, all the way to the ins and outs of proper implementation and how to use the data. You'll need to set up acceptance criteria and then have data to back up how you set those specifications. It would be a more continuous endeavor.

Can you describe some scenarios where real-time monitoring would be most useful and some scenarios where batch monitoring may make more sense?

Real-time monitoring is far more powerful than batch monitoring. So, it will be most useful in cases where you need to solve a problem that conventional approaches cannot solve. If you have a molecule that happens to have very peculiar aggregation kinetics, for example, then you're going to need a real-time system to capture that level of definition. Whereas if you are looking at routine stability monitoring, where you need to report to the regulatory agency and set a pass or fail at the time of manufacturing release, then batch monitoring would make more sense. Part of that is because regulators and industry are still used to seeing that type of data to determine whether it fits certain expectations. Getting the regulators to read real-time data and correlate it to a certain quality requirement is still going to take some time.

That’s why continuous research by industry and scientists in academia is important to better understand this. I see it as a tremendous positive trend that USP is involved, and that FDA regulators have always encouraged this approach, because both FDA and ICH guidance call for the application of PAT in what they call the quality-by-design paradigm for drug development and manufacturing. PAT is a key component of that, and real-time monitoring technology is the most powerful form of PAT you can have, especially in-line or online. There's no question that the FDA would love to see companies shifting to a real-time paradigm, but it takes time.

About the Author

Danny K. Chou, Ph.D., Pharm.D., is a biopharmaceutical industry veteran with expertise in biologic drug substance and drug product manufacturing process development with a focus on biopharmaceutical characterization, downstream purification, formulation development, and emerging technologies for protein aggregate/subvisible particle analysis. He is the founder and president of Compassion BioSolution, a biopharmaceutical consultancy. Previously, Dr. Chou was a senior scientist and group leader at Gilead Sciences and has also worked at Genzyme and Amgen. He received his Ph.D. from the University of Colorado Center for Pharmaceutical Biotechnology under an NIH Fellowship and his Pharm.D. from the University of Florida.