
Life Sciences Blog

Importance of Change Management - case study

(Change Management, IT QMS)

by Paul Labas

 

Apparently, it is not following proper change management practices that sinks ships


The US Navy has recently been forced to restrict the operations of its three newest fast-attack nuclear submarines – including one that was commissioned less than a week earlier – over “unauthorized and undocumented weld repairs having been performed.”  With a price tag of $2.6 billion per submarine, one must consider the impact that implementing uncontrolled changes will have on the Navy’s finances, operational readiness and reputation, and on the performance of the individuals involved.  Someone, at some point, decided that the required repairs were minor enough to bypass the mandated change management procedures, or perhaps that there was no time to deal with the required paperwork.  That erroneous decision apparently resulted in great cost to everyone involved.

 

We see the same situation in the world of regulated computerized systems, where the lack of proper change management procedures, or individuals choosing not to follow such procedures, forces systems as a whole to be declared non-compliant and unreliable, with all the expected consequences of such a declaration.

 

I recently led a team tasked with assessing the state of compliance of a large product Quality and Compliance Management system used to support QA areas such as CAPA, non-conformances, product change control, audit management, etc.  The team discovered a number of serious issues, but the one that stood out was that the system had been updated 29 times in the previous 12 months without a single change following the applicable change control procedures.  There were likely additional changes that could no longer be tracked.

 

Although an in-depth investigation included reviews of relevant emails and meeting notes, as well as interviews with various personnel involved in the changes, the necessary information about the performed changes, such as what was done, when, why and by whom, could never be recovered.  We interviewed the individuals responsible for implementing the changes, and the primary reasons given for bypassing the required process were “we didn’t have the time for documentation”, “it was just a minor change”, “it was just a collection of bug fixes”, and, perhaps my ‘favorite’, “we couldn’t get the system users to agree on the change, so we decided to implement it in production to see if they like it”.  Although the process was in place, the individuals involved felt that circumstances had forced them not to follow it.
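
The information that could never be recovered, namely what was done, when, why and by whom, is precisely what a change record exists to capture. As a minimal, hypothetical sketch (not RCM’s or any client’s actual tooling), a change request could be modeled along these lines:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class ChangeStatus(Enum):
    """Lifecycle states a controlled change passes through."""
    DRAFT = "draft"
    RISK_ASSESSED = "risk_assessed"
    APPROVED = "approved"
    TESTED = "tested"
    IMPLEMENTED = "implemented"


@dataclass
class ChangeRecord:
    """Minimal change-control record: the what/when/why/whom that the
    assessment team above could not reconstruct."""
    change_id: str
    description: str       # WHAT was changed
    justification: str     # WHY it was changed
    requested_by: str      # WHO requested the change
    implemented_by: str    # WHO performed the change
    risk_assessment: str   # documented risk and impact analysis
    status: ChangeStatus = ChangeStatus.DRAFT
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    approvals: list[str] = field(default_factory=list)  # named approvers
```

Even a record this small, filled in before each of those 29 updates, would have answered every question the investigation could not.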

 

The system stakeholders acknowledged that the changes were implemented without due risk and impact assessments, and were not formally authorized, adequately tested or approved for implementation.  Technical analysis revealed that system documentation such as the User Requirements and Design Specifications may be only 60-70% accurate, and that the system no longer behaved as described in the system test scripts.  When executive management was informed, they had to decide whether to take the system offline, forcing the business to revert to costly paper-based processes and erasing the millions of dollars spent on the initial system implementation and validation, or to operate the non-compliant system “at risk” while rushing to fully revalidate it.

 

Although the above case is extreme, we must never forget that inadequately controlled system changes erode the validated and compliant state of a system, one change at a time.  Regardless of how much funding and attention are invested in the initial system purchase, implementation and validation, it is uncontrolled system changes that may ultimately obliterate all of these efforts, rendering the system non-compliant.

 

It is important to remember that although major system releases normally receive the attention and funding required to follow the applicable procedures, ‘minor’ system changes are often made by smaller teams, if not individuals, who may not follow the required procedures, whether through a conscious decision or through a lack of training in and understanding of the required processes.

 

Organizations must proactively establish, communicate and enforce appropriate change management processes and procedures, and stay mindful of the high-pressure environments that may push individuals to circumvent due process for the sake of saving time and effort.  Although each individual ‘minor’ change may seem harmless, the cumulative effect may eventually be devastating.


    


 

For RCM offerings relevant to the above article, please visit:

 

IT Part 11 and Other Regulatory Compliance

IT Quality Management Systems

 

Data Integrity Case Study

(Data Integrity, 21 CFR Part 11, FDA Inspections)

by Paul Labas


In my previous blog post I listed six FDA Warning Letters issued thus far in 2015 that specifically cite data integrity violations discovered by the inspectors.  It is interesting to note that the findings in all six Warning Letters concentrate on data generated within GMP QC Laboratories, using instruments such as HPLC, UPLC, FTIR, IR, IMS, etc.

 

RCM Life Sciences was recently contacted by the leading manufacturer of topical pharmaceutical products in the U.S. (the Client), which felt the need for an in-depth assessment of the compliant state of its QC Lab instruments and computerized systems.  After a thorough analysis, RCM was retained to perform a periodic review with elements of a gap assessment for 20 QC Lab instrument computerized systems, under the defined Project/Solution model.

 

RCM engaged a Project Leader and two SME Specialists under my oversight.

 

The RCM team:


  1. Assisted the Client with defining the project scope, approach, costs, resourcing and timelines based on recent relevant RCM experience.


  2. Improved the Client’s existing Periodic Review procedural framework and progressed it through reviews, approvals and implementation.


  3. Performed and documented an assessment of the compliant state of the 20 QC Lab instrument computerized systems in scope, including HPLC, GC, FTIR, UPLC, UV Spec, Raman IR, Near IR, IMS, TOC Analyzers, etc.  The assessment covered data integrity, Part 11 compliance, validation/qualification, change controls, operations, security and user access, data backup and recovery, instrument calibration, etc. (a simple structure for tracking these review areas is sketched after this list).


  4. Documented detailed itemized remediation activities required to address the identified findings.


  5. Assisted the Client with identifying the “next step” options and selecting the most effective way to proceed toward systems remediation.
       

  6. Provided the Client personnel with on-going guidance and coaching relevant to the computerized systems compliance and data integrity, including analysis of FDA warning letters and latest industry trends.
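
For illustration only, and not the procedural framework actually delivered to the Client, the review areas from step 3 above could be tracked per system with a structure as simple as this (the system name and finding are invented):

```python
# Assessment areas mirroring the scope described in step 3 above.
REVIEW_AREAS = [
    "data integrity",
    "Part 11 compliance",
    "validation/qualification",
    "change controls",
    "operations",
    "security and user access",
    "data backup and recovery",
    "instrument calibration",
]


def new_review_sheet(system_name: str) -> dict:
    """Return an empty periodic-review finding sheet for one instrument system."""
    return {
        "system": system_name,
        "findings": {area: [] for area in REVIEW_AREAS},  # findings per area
        "remediation_actions": [],  # itemized actions, as in step 4
    }


# Example: open a sheet for one of the 20 in-scope systems (name is hypothetical).
sheet = new_review_sheet("HPLC-01")
sheet["findings"]["change controls"].append(
    "Two software updates lacked documented change records"
)
```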


Following completion of the assessment, RCM personnel were further retained by the Client to assist with the remediation projects.


Please review the full case study for further details. 

  





For RCM offerings relevant to the above article, please visit:


IT Part 11 and Other Regulatory Compliance

IT Quality Management Systems

First half of 2015: FDA Warning Letters on Data Integrity

(Data Integrity, 21 CFR Part 11, FDA Inspections)

by Paul Labas


“Data integrity problems break trust. In the time between inspections, we trust you to do the right thing. And if we finish an inspection and that trust has been broken, then we need to go through a few exercises to build that trust [again] before we can go forward and trust you until the next inspection.”

 

Karen Takahashi

Senior Policy Advisor

Office of Compliance

FDA Center for Drug Evaluation and Research (CDER)

     



In 2013, FDA cited data integrity problems in only seven out of 26 warning letters it issued for GMP violations.

 

In 2014, FDA called out data integrity in 13 out of 18 GMP warning letters. 

 

During the first six months of 2015, data integrity issues were specifically documented in six warning letters published on the FDA website. 

 

The trend described in my previous blog post clearly continues, and FDA is upholding its promise to focus on data integrity enforcement.

 

Here are just a few selected quotations from the latest warning letters (follow the provided links to explore all of the data integrity violations described in the letters):

 

Apotex, Inc.

  1. “Your firm failed to ensure that laboratory records included complete data derived from all tests”

  2. “data […] only included the most favorable result obtained from multiple test results without any justification.”

  3. Conclusion: “The foregoing examples are of serious CGMP violations demonstrating that your quality system does not adequately ensure the accuracy and integrity of the data generated at your facility to ensure the safety, effectiveness, and quality of the drug products you manufacture.”

 

Cadila Pharmaceuticals Limited


  1. “Your firm did not have proper controls in place to prevent the unauthorized manipulation of your electronic raw data.”

  2. Conclusion: “The items listed above, as well as other deficiencies our investigator found, lead us to question the effectiveness of your current quality system to achieve overall compliance with CGMP at your facility. It is apparent that you have not implemented a robust quality system at your firm.”

 

Novacyl


  1. “Failure to prevent unauthorized access or changes to data and to provide adequate controls to prevent omission of data.”

  2. Conclusion: “The inadequate controls over access to your data raise questions about the authenticity and reliability of your data and the quality of the APIs you produce.” 

 

VUAB Pharma


  1. “Failure to prevent unauthorized access or changes to data and to provide adequate controls preventing data omissions.”

  2. “You also failed to have proper controls in place to prevent unauthorized manipulation of your laboratory’s raw electronic data”

  3. Conclusion: “This lack of control over the integrity of your data raises questions about your analytical data’s authenticity and reliability, and about the quality of your APIs.”

 

Hospira


  1. “Your firm failed to exercise appropriate controls over computer or related systems to assure that only authorized personnel institute changes in master production and control records, or other records”

  2. “your […] data acquisition software, TotalChrom®, did not have sufficient controls to prevent the deletion or alteration of raw data files.”

  3. “because no audit trail function was enabled […], your firm was unable to verify what types of injections were made, who made them, or the date or time of deletion.”

  4. “you are unable to assure that all raw data generated is included and evaluated when you review analytical test results to determine whether your products conform with their established specifications and standards.”

  5. Conclusion: “These are serious CGMP violations that demonstrate that your quality system does not adequately ensure the accuracy and integrity of the data you generate to support the safety, effectiveness, and quality of the drug products you manufacture.” 

 

Yunnan Hande Bio-Tech. Co. Ltd.


  1. “Failure to prevent unauthorized access or changes to data and to provide adequate controls to prevent omission of data.”

  2. “the computer software for this equipment lacked active audit trail functions to record changes to data”

  3. “Laboratory control records must include accurate and truthful documentation of all raw data”

  4. Conclusion: “The above examples are serious CGMP deviations demonstrating that your quality system does not adequately ensure the accuracy and integrity of the data generated at your facility to support the safety, effectiveness, and quality of the drug products you manufacture.”

   

The conclusions mirror the statements from FDA’s Deputy Director Richard L. Friedman.

 

Every one of these letters closes with a recommended remediation action similar to this one:


“In response to this letter, provide specific details about the comprehensive controls in place to ensure the integrity of electronic raw data generated by all computerized systems during the manufacture and testing of your drugs.” 
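
What such “comprehensive controls” look like in practice is left to each firm; as noted below, FDA prescribes no implementation. As one small, hypothetical illustration of a technical control (a sketch, not an FDA-endorsed method), raw data files can be fingerprinted at acquisition time so that later deletion or alteration becomes detectable:

```python
import hashlib
import json
from pathlib import Path


def fingerprint(path: Path) -> str:
    """Return the SHA-256 digest of a raw data file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def register(path: Path, manifest: Path) -> None:
    """Record a file's digest in a manifest at acquisition time."""
    entries = json.loads(manifest.read_text()) if manifest.exists() else {}
    entries[str(path)] = fingerprint(path)
    manifest.write_text(json.dumps(entries, indent=2))


def verify(manifest: Path) -> list[str]:
    """Return the files that have been deleted or altered since registration."""
    problems = []
    for name, digest in json.loads(manifest.read_text()).items():
        p = Path(name)
        if not p.exists():
            problems.append(f"{name}: deleted")
        elif fingerprint(p) != digest:
            problems.append(f"{name}: altered")
    return problems
```

A control like this only detects tampering after the fact; preventing it in the first place still requires the access and audit trail controls the letters cite.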

 

While FDA clearly expects “comprehensive controls” that ensure data integrity, it has not yet published any specific definitions, expectations or guidance on this topic.  In the next blog post or a white paper due shortly, we’ll analyze FDA’s and EMA’s input, the content of FDA-483s, warning letters and consent decrees, the opinions of top industry experts, and our own organizational experience to answer two questions:



  1. WHAT are the agencies’ expectations when it comes to data integrity; and


  2. HOW can companies meet these expectations in a cost-effective manner, especially when regulated data is collected, modified or maintained by computerized systems.

 





For RCM offerings relevant to the above article, please visit:


IT Part 11 and Other Regulatory Compliance

IT Quality Management Systems

Data Integrity and FDA Warning Letter

(Data Integrity, FDA Inspections)

by Paul Labas


In my previous blog post I described an FDA Inspector’s opinion of, and experience with, the data integrity issues that FDA encounters during industry inspections, as well as the agency’s uncompromising stance on such issues.  Although FDA’s opinion is important, nothing speaks louder to the industry than the agency’s enforcement actions.

 

FDA uses a number of means to enforce compliance with the laws under its jurisdiction, including Warning Letters.  According to FDA, “Warning Letters are issued only for violations of regulatory significance.  Significant violations are those violations that may lead to enforcement action if not promptly and adequately corrected.”  Warning Letters are public records that are carefully monitored by industry experts to understand the current FDA enforcement direction and trends.

 

On April 6 of this year, FDA issued a Warning Letter to a company that specializes in R&D of biotechnology, citing “significant deviations from current good manufacturing practice (CGMP) regulations for the manufacture of active pharmaceutical ingredients (APIs).” 

 

The Letter cites three specific deviations, and all three concern data integrity:

 

  1. “Failure to prevent unauthorized access or changes to data and to provide adequate controls to prevent omission of data. 


    You lacked controls to prevent the unauthorized manipulation of your laboratory's electronic raw data.  Specifically, your infrared (IR) spectrometer did not have access controls to prevent deletion or alteration of raw data. 


    Furthermore, the computer software for this equipment lacked active audit trail functions to record changes to data, including information on original results, the identity of the person making the change, and the date of the change.”



  2. “Failure of your quality unit to ensure that materials are appropriately tested and the results are reported.  

    …the analyst at your firm altered the file name in the spectrophotometer containing the sample identification information for (b)(4) API lot # (b)(4), tested on April 2, 2014, to support the release of two previously manufactured lots, # (b)(4) and (b)(4).”


  3. "Failure of your quality unit to exercise its responsibility to ensure the APIs manufactured at your facility are in compliance with CGMP, and meet established specifications for quality and purity.

    For example, your quality unit failed to detect that your laboratory altered IR raw data and misrepresented the results for approval and release of (b)(4), API lots# (b)(4) and (b)(4).”  

 

The importance of data integrity assurance cannot be overstated, and the above highlights the rightfully rigid and forceful position that the agencies will maintain on the topic.  We must remember that the goal of life sciences quality is to minimize risk to “the big three”: patient safety, product quality and data integrity.


 



 

For RCM offerings relevant to the above article, please visit:

 

IT Part 11 and Other Regulatory Compliance

IT Quality Management Systems

 



Data Integrity - Meeting FDA Expectations

(Data Integrity, 21 CFR Part 11, FDA Inspections)

by Paul Labas


Recently, RCM sponsored an ISPE event where an FDA Investigator from the Pharmaceutical Inspectorate gave a presentation titled “Risky Business? Quality Systems and Drug Lifecycle Risk Management.” The presentation was adapted from one given by Richard L. Friedman, M.S., Deputy Director, Science and Regulatory Policy, Office of Manufacturing & Product Quality, FDA/CDER/OC. The event attracted over 250 industry participants.

Although the presentation was designed to focus on the manufacturing environment and processes, The Inspector made it a point to specifically emphasize Data Integrity as one of the core FDA requirements.


The Inspector didn’t try to soften the message, saying that “If there is no data integrity assurance, the data cannot be given to FDA and must be discarded.” He then displayed the relevant PowerPoint slide with the following on it:


FDA:


Data integrity breaches:


  1. Undermine assurance of pharmaceutical quality

  2. Break down basic trust with regulator and public

  3. Are a fundamental failure of the Quality System

Yes, the FDA presentation had the last point in bold – one of very few in the 55-slide presentation.


Speaking with potential clients, I often encounter a philosophy that the quality and compliance of computerized systems is disconnected from, and a distant second to, the quality and compliance of the actual pharmaceutical, biotech or medical device products at the core of the company’s business. What such a philosophy fails to recognize is that computerized systems are relied on to capture, secure and store the product data that must be maintained under Health Authority laws and regulations. Computerized systems are the very means of ensuring the expected data integrity and of allowing the company to provide evidence of product quality to the regulatory agencies, thus defending its quality assurance practices.


Although the agencies’ use of the term “data integrity” is on a fast rise, the principles behind the term are not new: FDA documented the requirements for electronic data integrity by issuing 21 CFR Part 11 in 1997. Part 11 outlines the criteria under which FDA considers electronic data to be trustworthy, reliable, and acceptable.


As another FDA slide listed examples of data integrity breaches that The Inspector had personally discovered and noted in FDA-483s and Warning Letters, it was clear that every one of them could be matched to Part 11 provisions that, if met, would have eliminated the findings (a minimal audit trail sketch follows the list below):


  1. “The system permitted unauthorized data changes”

    Part 11.10(d) and (g) require that system access and the ability to modify data be limited to authorized users.

  2. “The older system records could not be found”

    Part 11.10(b) and (c) require that data be stored, protected and readily retrievable throughout the retention period.

  3. “Data deleted or altered, with no audit trail”

    Part 11.10(e) requires automated audit trails that capture the information relevant to data creation, modification and deletion.

  4. “Password sharing”

    Part 11.10(d) and (g) require that system access be limited to authorized users, while Part 11.300 outlines the controls required for user credentials and passwords.
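
To make the audit trail expectation of Part 11.10(e) concrete, here is a minimal, hypothetical sketch (not a validated implementation) of an append-only trail that records the original value, the new value, the identity of the person making the change, and the timestamp:

```python
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass(frozen=True)
class AuditEntry:
    """One immutable audit trail entry: who changed what, when,
    the original value and the new value."""
    record_id: str
    field_name: str
    old_value: str
    new_value: str
    changed_by: str
    changed_at: datetime
    reason: str


class AuditedRecord:
    """A record whose every modification is logged; trail entries are
    only ever appended, never edited or deleted."""

    def __init__(self, record_id: str):
        self.record_id = record_id
        self._data: dict[str, str] = {}
        self._trail: list[AuditEntry] = []

    def set(self, field_name: str, value: str, user: str, reason: str) -> None:
        """Apply a change, capturing it in the audit trail first."""
        self._trail.append(AuditEntry(
            record_id=self.record_id,
            field_name=field_name,
            old_value=self._data.get(field_name, ""),
            new_value=value,
            changed_by=user,
            changed_at=datetime.now(timezone.utc),
            reason=reason,
        ))
        self._data[field_name] = value

    @property
    def trail(self) -> tuple:
        """Read-only view of the trail, in chronological order."""
        return tuple(self._trail)
```

In a regulated system the trail would live in protected storage with its own access controls (Part 11.10(d)), but the principle is the same: no change without a corresponding, unalterable record.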

It is imperative to recognize the importance of computerized systems quality and compliance to the company’s ability to provide the regulatory agencies with complete and accurate evidence of its actual product quality assurance practices. Compliance with 21 CFR Part 11 is not only mandatory under US law, but also ensures the expected data integrity, thus avoiding what FDA calls “a fundamental failure of the Quality System”.


After all, as The Inspector reminded the audience, “if it’s not documented, it never happened”.




For RCM offerings relevant to the above article, please visit:


IT Part 11 and Other Regulatory Compliance

IT Quality Management Systems




Why Validate?

(CSV, 21 CFR Part 11)

by Pat Lennon

Why DO we validate systems? Of course, there is the simple, nearly automatic answer, “because the regulatory agencies require it”, but is that a good enough reason? We can fall back on FDA’s explanation of what validation means: making sure our system performs the functions expected of it in a reliable and acceptable manner. But that comes very close to a circular argument, i.e., we validate because it is a good idea, if not common business sense.

I propose that we validate our systems because it is the most efficient way to make sure that their output can be used for our business purposes. In other words, validation provides economic benefits. How can I say such a thing? If validation is so often declared a costly money pit, what “economic benefits” can it possibly render?

The validation process provides us many things beyond evidence that the SYSTEM works as expected. It also:

  • Allows us to make sure our business processes work as expected.
  • Provides us a basis for quality control risk decisions, i.e.:
    • How much can we depend on our data? How accurate is it?
    • What level of verification do we need to add to the generation of each report?
    • How much quality testing do we have to do before we can safely release our product, in good faith that it will do more good than harm?
  • Reduces the manpower required to ensure the safety and efficacy of our product, regardless of whether that product is data, pharmaceuticals, medical devices or submission reports.
  • Identifies the risks that our system represents, which determines the amount and extent of external controls we need to implement around that system, as well as the manpower required to implement those controls.


Validation costs are much like investing in the automation of a production process: they replace long-term, repetitive costs with an up-front investment. Like any other capital investment, it looks expensive if you do not consider the alternative, but when compared with the cost of not investing, the initial outlay typically yields significant savings over the duration of the activities.
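
To make the analogy concrete with deliberately invented figures (illustrative only, not industry benchmarks): suppose validation costs $200,000 up front and reduces the per-release cost of manual verification from $50,000 to $10,000. A few lines of arithmetic show the break-even point:

```python
# Hypothetical figures for illustration only -- not industry benchmarks.
VALIDATION_COST = 200_000            # one-time, up-front investment
MANUAL_COST_PER_RELEASE = 50_000     # repeated cost without validation
VALIDATED_COST_PER_RELEASE = 10_000  # residual per-release cost after validation

for releases in range(1, 9):
    without = MANUAL_COST_PER_RELEASE * releases
    with_validation = VALIDATION_COST + VALIDATED_COST_PER_RELEASE * releases
    marker = "  <- break-even" if with_validation <= without else ""
    print(f"{releases} releases: ${without:,} manual vs ${with_validation:,} validated{marker}")
```

Under these assumptions the investment pays for itself by the fifth release; every release after that is pure savings.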

We need to stop thinking about validation as an unnecessary, regulatory-agency-imposed, burdensome cost and see it for what it is: the process systems’ version of automation. When we begin to see validation as a capital investment and not as lost money, then we will really begin to understand why we validate.