Guest Column | September 22, 2021

9 Pitfalls To Avoid In Data Integrity in Pharmaceutical and Device Development & Operations

By Peter H. Calcott, Ph.D., president and CEO, Calcott Consulting LLC


In the first article of this two-part series, I shared five common misconceptions in data integrity (DI). In this article, I will illustrate with examples areas where I have seen significant DI implementation problems in companies I have worked with. This should not be viewed as an exhaustive listing but rather those I have found that illustrate the diversity of issues.

1. Part 11 Compliance

Part 11 compliance has been around for about 20 years,1 but still today I see confusion in the industry. While most elements have been implemented successfully, in part attributable to the array of great software available in the marketplace, there are still areas where people stumble. I will illustrate two areas.

First, while almost all my clients purchase software in the marketplace, the systems fall into two types for the purpose of this point. Systems that reside in the cloud, where you access the application via the internet, are usually very robust. However, I have seen clients disable certain features that can compromise DI (e.g., audit trails). On systems that reside on a PC at the site of use, I have witnessed several problematic incidents. These all focus on the integrity of the clock used to date-stamp data. In these cases, the software used the Windows system clock rather than a protected application clock to date-stamp the data. The system clock was not locked down, and I could change the date and time with a click of the mouse, rendering the date stamp worthless. The simple fix was for the administrator to lock the computer setting so that analysts could not change it. Of course, the system administrator can still do that.
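The underlying problem is tamper evidence: any date stamp that depends on an unprotected clock can be rewritten silently. As a minimal illustration (not a Part 11 implementation; the record fields are hypothetical), a hash-chained log makes after-the-fact edits to recorded timestamps detectable, even by an administrator:

```python
import hashlib
import json

def append_entry(log, record):
    """Append a record, chaining it to the hash of the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    payload = json.dumps(record, sort_keys=True)
    entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    log.append({"record": record, "prev": prev_hash, "hash": entry_hash})
    return log

def verify_chain(log):
    """Recompute every hash; any retroactive edit breaks the chain."""
    prev_hash = "0" * 64
    for entry in log:
        payload = json.dumps(entry["record"], sort_keys=True)
        expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        if entry["prev"] != prev_hash or entry["hash"] != expected:
            return False
        prev_hash = entry["hash"]
    return True

log = []
append_entry(log, {"sample": "A-001", "timestamp": "2021-09-22T10:15:00Z"})
append_entry(log, {"sample": "A-002", "timestamp": "2021-09-22T10:45:00Z"})
assert verify_chain(log)

# Back-dating the first record is now detectable:
log[0]["record"]["timestamp"] = "2021-09-21T10:15:00Z"
assert not verify_chain(log)
```

Commercial systems implement this kind of protection internally; the sketch simply shows why an audit trail, unlike a bare file date, cannot be quietly rewritten.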

Second, assignment of appropriate access based on job function is critical for meeting Part 11 compliance. At larger companies, there are at least three easily recognizable levels of access. I will use setting up access levels for a high-performance liquid chromatography (HPLC) system as an example. The most restrictive level is for staff who simply review data, either rejecting or approving the results. This is common for supervisor- or manager-level staff. The next level is assigned to analysts, who need to be able to set up runs, review data, and make adjustments to, for instance, integration parameters. And finally, there is the system administrator, who has complete access to the inner workings of the system. In some smaller companies, these assignments are often blurred, with "super users" having administrator rights even though they actually run samples and process data. This means these analysts can access the file systems and make major adjustments, which leads to situations where DI can be questioned. It is particularly important to assign access based on job function and to separate administrative functions from analyst roles; blurred roles tend to be a problem in smaller companies.
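The three tiers above can be sketched as a simple permission map. This is an illustrative model only (the role names and permission strings are assumptions, not any vendor's chromatography data system schema); the point is that no non-administrator role combines data generation with file-system access:

```python
# Hypothetical permission model for an HPLC data system (illustrative only).
PERMISSIONS = {
    "reviewer": {"review_data", "approve_result", "reject_result"},
    "analyst": {"set_up_run", "review_data", "adjust_integration"},
    "administrator": {"manage_users", "configure_system", "access_file_system"},
}

def can(role, action):
    """Return True if the role is permitted to perform the action."""
    return action in PERMISSIONS.get(role, set())

# Separation of duties: an analyst runs samples but cannot touch the
# file system, and an administrator cannot also process sample data.
assert can("analyst", "set_up_run")
assert not can("analyst", "access_file_system")
assert not can("administrator", "adjust_integration")
```

A "super user" in this model would be a role whose permission set is the union of analyst and administrator rights, which is exactly the combination that invites DI questions.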

2. Integration Of HPLC Chromatograms

Ideally, software should be set up to run automatically and integrate correctly every time. However, in many analyses, integration is not perfect, requiring post-analysis adjustments. If this is the case, it is paramount to incorporate into the method SOP a procedure to follow, with appropriate documentation, so the reintegration is performed in a reproducible, compliant, and documented manner. Without these controls and checks, the company is open to DI questions.
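One way to make manual reintegration reproducible and documented is to require that every reintegration leave a structured record. The sketch below is hypothetical (the field names and SOP number are assumptions, not any CDS format); it shows the minimum such an entry should capture: who, when, what changed, and why:

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass(frozen=True)
class ReintegrationRecord:
    """Audit entry for one manual reintegration of a chromatogram peak."""
    analyst: str
    peak_id: str
    reason: str                 # justification required by the method SOP
    old_parameters: dict
    new_parameters: dict
    recorded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

record = ReintegrationRecord(
    analyst="jdoe",
    peak_id="main-peak",
    reason="Baseline drift split the peak; manual baseline applied per SOP-123.",
    old_parameters={"baseline": "auto"},
    new_parameters={"baseline": "valley-to-valley"},
)
print(asdict(record)["reason"])
```

With both the old and new parameter sets on record, a reviewer or inspector can reproduce the reintegration and judge whether the justification holds.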

3. Out Of Specification (OOS) Investigations

Even after the “Barr Case” of 1993,2 companies run into problems with how they conduct OOS investigations. There needs to be a robust SOP detailing how you proceed when a suspected OOS is encountered. Initially, before any investigation into product quality, there needs to be an assessment as to whether the method was run correctly in the laboratory. If an error can be demonstrated, the whole result or even the whole run can be nullified and the run or sample repeated. Once the run is shown valid, or at least cannot be nullified, then a product investigation can be considered. A detailed investigation plan, including repeat testing or retesting, must be drawn up and executed. Failure to follow a structured plan can create doubt about your DI status and conclusions.
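The decision flow above can be reduced to a tiny decision function. This is a schematic of the two-phase logic described in the text, not a regulatory template; the phase names and return strings are my own:

```python
def oos_next_step(lab_error_demonstrated, run_valid_or_not_nullified):
    """Return the next phase of an OOS investigation.

    Phase 1 always comes first: assess whether the method was run
    correctly in the laboratory. Only if the result cannot be nullified
    does a product-quality investigation begin.
    """
    if lab_error_demonstrated:
        return "nullify result and repeat the run or sample"
    if run_valid_or_not_nullified:
        return "open product investigation with a detailed, pre-approved plan"
    return "continue laboratory-phase investigation"

assert oos_next_step(True, False) == "nullify result and repeat the run or sample"
assert oos_next_step(False, True).startswith("open product investigation")
```

Note that a demonstrated lab error short-circuits everything else; skipping that first assessment and going straight to retesting is exactly the pattern that draws DI scrutiny.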

4. Environmental Monitoring (EM) Data

Particularly in sterile or aseptic processing operations, many EM data are generated and analyzed. Obviously, the counts recorded on test forms must match the colonies on the plates exactly. So, when I routinely audit operations, I often review these forms. Even well-run operations will quite normally pick up counts on plates. If I see page upon page of zero colony-forming units, my suspicions are triggered. If it looks too good to be true, it usually is.
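That "too good to be true" instinct can be given a rough statistical form. Under an assumed Poisson model with a historical mean of λ CFU per plate (the model and the threshold below are illustrative assumptions, not a regulatory criterion), the probability of a long streak of zero-count plates collapses quickly:

```python
import math

def prob_all_zero(mean_cfu_per_plate, n_plates):
    """P(every plate shows 0 CFU) under a Poisson model: exp(-lambda * n)."""
    return math.exp(-mean_cfu_per_plate * n_plates)

def streak_is_implausible(mean_cfu_per_plate, n_plates, threshold=1e-6):
    """Flag a zero-count streak whose probability falls below the threshold."""
    return prob_all_zero(mean_cfu_per_plate, n_plates) < threshold

# Even a modest historical rate makes page after page of zeros suspect:
p = prob_all_zero(0.2, 100)   # 100 plates, mean 0.2 CFU per plate
print(f"P(all zero) = {p:.2e}")
assert streak_is_implausible(0.2, 100)
```

A flagged streak is not proof of falsification, only a prompt to look closer at the raw plates, the incubation records, and the transcription step.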

5. Reports That De-emphasize Data That Compromise The Study – Validation And Investigations

As I indicated in part 1 of this article series, any discrepancy, deviation, or anomalous data generated in an investigation or validation must be considered in the context of the end result of the report. Too many times, I have found that results which might cast doubt on a conclusion are ignored and not discussed. Not all "failing" results or deviations will necessarily nullify the conclusion; in many cases, other tests can be performed to address the anomaly. At the end of the day, you want a report that can be read by others (including an inspector) that is correct and convincing.

6. Cherry-Picking Data

In many presentations and guidances,3-7 the MHRA and FDA have described cherry-picking of data: the tendency to keep testing until you get the result you "want" – usually a passing result. It often manifests in using unofficial databases to house data, running trial samples, using test samples to "calibrate" systems, and the list goes on. In the GMP world of validated methods, you get a chance to run a sample once according to the SOP. Only if you can prove there was a lab error can you justify nullifying the test and repeating it. While FDA warning letters are rife with incidents, I have found in auditing that it often happens in smaller companies transitioning from being solely a research company to a development company moving into clinical trial manufacturing. Often, the senior staff is research-trained and not familiar with the GMP requirements. It is a hard transition to make in a career. I know because I made that transition a long time ago.

7. Making Data And Sampling Ports Accessible

Although I have never seen an example of this in my years of auditing, I am including it because the MHRA used this example in its 2018 guidance.6 By this, they mean that a sampler might be tempted to sample not from the correct port in, for instance, a water-for-injection (WFI) loop, but from another, more accessible one if the correct port is difficult to reach. So, we must be vigilant to assure we do not install obstacles in the way of our staff getting their jobs done correctly.

8. QA-Issued Forms

Any blank form used in your operations (on the shop floor or QC labs) must be a controlled form. That is, it must be QA issued (appropriately reviewed and approved) and be issued with a unique identifier. QA needs to keep an inventory of those issued, when, and to whom. If the forms are available online and can be printed off by an operator, then the control is lost. A form can be filled in or destroyed with no record. I have found this is a difficult principle for some smaller companies to grasp. If you do not track issuance of your forms, you are open to questions about the integrity of your documentation.
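A minimal issuance registry (illustrative only; the identifier format and fields are assumptions) shows what "QA keeps an inventory" means in practice: every form gets a unique ID at issuance, and reconciliation flags any form that was issued but never returned:

```python
import itertools

class FormRegistry:
    """Toy QA form-issuance log: unique IDs, issuance records, reconciliation."""

    def __init__(self):
        self._counter = itertools.count(1)
        self._issued = {}      # form_id -> (form_name, issued_to, date)
        self._returned = set()

    def issue(self, form_name, issued_to, date):
        """Issue one controlled form under a unique identifier."""
        form_id = f"FRM-{next(self._counter):05d}"
        self._issued[form_id] = (form_name, issued_to, date)
        return form_id

    def mark_returned(self, form_id):
        if form_id not in self._issued:
            raise KeyError(f"{form_id} was never issued")
        self._returned.add(form_id)

    def unreconciled(self):
        """Forms issued but not returned -- each one needs an explanation."""
        return sorted(set(self._issued) - self._returned)

reg = FormRegistry()
a = reg.issue("Batch record page 7", "Operator 12", "2021-09-22")
b = reg.issue("EM plate-count form", "Analyst 4", "2021-09-22")
reg.mark_returned(a)
print(reg.unreconciled())  # the EM form is still outstanding
```

An uncontrolled form printed from a shared drive never enters such a registry, which is precisely why its disappearance leaves no trace.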

9. Transitioning From Research To Regulated Environment

I have found that the transition from a completely research organization to a development and commercial organization can be problematic in this area. Researchers are used to experiments not working, resulting in their repeating the process until it works. It is part of the research method. So, when those staffers transition over to a more regulated operation, for instance during clinical manufacturing, that transition can be very traumatic. Many actions acceptable in research are simply not compliant with the GMP regulations. This is particularly true in DI, as illustrated in some of the areas above.

DI is a hot topic with regulators at the present time. There are good guidances out there to help and educate you.3-9 What is most important is to recognize that DI issues can affect any person or organization. I would encourage all to consider that it might actually be happening in your organization as you read this article. Do not wait for the regulators to discover it in your operation. Rather, be proactive and seek out assurances that it is not occurring in your operations. With the intelligent use of ICH Q9 – Quality Risk Management10 techniques, you can systematically assess your processes, identify weaknesses, and remedy them before you have a data integrity crisis on your hands.

References

  1. 21 CFR Part 11 Electronic Records; Electronic Signatures — Scope and Application
  2. United States v. Barr Laboratories, Inc., 812 F. Supp. 458 (D.N.J. 1993) https://law.justia.com/cases/federal/district-courts/FSupp/812/458/1762275/
  3. MHRA GMP Data Integrity Definitions and Guidance to Industry March 2015
  4. FDA Data Integrity and Compliance with GMP April 2016
  5. EMA Data Integrity August 2016
  6. MHRA “GX” Data Integrity Guidance and Definitions March 2018
  7. FDA Data Integrity and Compliance with Drug CGMP December 2018
  8. PIC/S Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments August 2016
  9. WHO Technical Report Series Number 996 2016
  10. ICH Q9 Quality Risk Management
  11. PIC/S Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments July 2021

About The Author:

Peter H. Calcott, D.Phil., is president and CEO of Calcott Consulting LLC, which delivers solutions to pharmaceutical and biotechnology companies in the areas of corporate strategy, supply chain, quality, clinical development, regulatory affairs, corporate compliance, and enterprise e-solutions. He has also served as an expert witness. He also teaches at the University of California, Berkeley in the biotechnology and pharmaceutics postgraduate programs. Previously, he was executive VP at PDL BioPharma, chief quality officer at Chiron and Immunex Corporations, and director of quality assurance for SmithKline Beecham and for Bayer. He has also held positions in R&D, regulatory affairs, process development, and manufacturing at other major pharmaceutical companies. He has successfully licensed products in the biologics, drugs, and device sectors on all six continents. Calcott holds a doctorate in microbial physiology and biochemistry from the University of Sussex in England. He has been a consultant for more than 20 years to government, industry, and academia.