Understanding FDA’s CSA Guidance in the Context of Current Regulations and GAMP®

The opinions in this article are those of the authors and do not reflect the opinions of the company they work for.

Computer Software Assurance (CSA) is a Case for Quality initiative that arose when computer systems validation was identified as a barrier to adopting technology for modernization and innovation. The FDA recognizes the value of using advanced technologies to enable the industry to make medicines of more reliable quality.1

Removal of CSV barriers is critical to realizing Pharma 4.0 and the FDA’s own commitments to advanced manufacturing2 (e.g., the Center for Drug Evaluation and Research’s (CDER) Emerging Technology Program, the Center for Devices and Radiological Health’s (CDRH) Case for Quality, and the Center for Biologics Evaluation and Research’s (CBER) Advanced Technologies Program).

CSA is a paradigm shift, with the changes summarized in Table 1:3

Table 1.

CSA is the application of critical thinking to validation: it adds risk-based documentation to risk-based testing while taking a lifecycle approach, so that companies can “take credit” for activities already performed and reduce the validation effort. It is a “least burdensome approach,” with the effort focused on risk to patient safety, product quality, and data integrity. That said, CSA does not reduce testing; it advocates more testing with less documentation, because it is more important to remove defects than to collect documentation for inspection purposes. CSA ensures the efforts taken are value-added and the documentation available is sufficient to confirm that the patient safety, product quality, and data integrity requirements are upheld. CSA also champions the acceptability of modern testing techniques, such as exploratory testing and automation, in validation as part of the least burdensome approach.*
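
As a purely illustrative sketch of this kind of risk-based critical thinking, the snippet below maps a feature’s GxP impact and novelty to a proportionate assurance activity. The risk levels, categories, and activity names are assumptions made for illustration; they are not taken from the guidance or from GAMP®.

```python
# Illustrative sketch only: a simple decision rule that scales the assurance
# activity to risk. The risk levels and activity names are assumptions made
# for illustration, not prescriptions from the CSA guidance.

def assurance_activity(gxp_impact: str, novelty: str) -> str:
    """Return a proportionate test approach for a feature.

    gxp_impact: "high" if a failure could affect patient safety, product
                quality, or data integrity; otherwise "low".
    novelty:    "custom" for bespoke functionality, "standard" for
                out-of-the-box, widely used functionality.
    """
    if gxp_impact == "high" and novelty == "custom":
        return "scripted testing with risk-based documentation"
    if gxp_impact == "high":
        return "limited scripted plus unscripted (exploratory) testing"
    return "unscripted or ad hoc testing with a simple pass/fail record"

# Example: an out-of-the-box report with high GxP impact
print(assurance_activity("high", "standard"))
```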

Dispelling Fallacies About CSA

Fallacy #1: CSA cannot be implemented until the guidance comes out.

It is important to note that the FDA’s upcoming guidance on CSA is “Nothing outside of the scope of current FDA regulations. Matter of clarifying misperceptions.” (Cisco Vicenty, Program Manager, FDA Case for Quality). Consider that the 1997 preamble to the 21 CFR Part 11 final rule stated, “The agency believes that, within part 11, firms have the flexibility they need to adjust the extent and stringency of controls based on any factors they choose, including the economic value of the transaction”.6 CSA falls within this flexibility, and regulated companies should follow the guidance within 21 CFR Part 11 Scope and Application and incorporate CSA as part of their “...justified and documented risk assessment and a determination of potential of the system to affect product quality and safety, and record integrity”.7

For further guidance on validation, Scope and Application points to the FDA’s guidance for industry and FDA staff, General Principles of Software Validation (GPSV), and to GAMP®. We will focus on GAMP® because CDRH, the author of GPSV, is also the creator of CSA, so it is no surprise that GAMP® is referenced. The Pharmaceutical Inspection Co-operation Scheme (PIC/S), with participation from over 50 regulatory authorities globally, states, “The GAMP® Guide (and appendices) has evolved largely to define best practices in specifying, designing, building, testing, qualifying and documenting these systems to a rigorous validation management scheme, largely for the controlling system”.8 That is, GAMP® represents the “c,” or current, in computerized systems validation practices.

In late 2019, ISPE GAMP® publicly announced its support for the CDRH Case for Quality Program and specifically called out CSA for its significance and alignment with GAMP®’s risk-based approach to computerized systems.9 By October 2020, ISPE GAMP® had incorporated CSA concepts into GAMP® with the publication of the Good Practice Guide: Data Integrity by Design.10 The power of critical thinking is emphasized in the guide, where CSA is applied in two different ways to the same computerized system. In fact, the upcoming CSA guidance document provides only one way of implementing CSA and should not be interpreted as the only way of doing it; interpreting the guidance as the only way would be contrary to the CSA mission of critical thinking.

Fallacy #2: CSA is limited to medical devices because it is a CDRH concept.

Actually, CSA excludes medical devices themselves (e.g., pacemakers, diagnostic equipment) and applies to the non-product computerized systems (e.g., MES, ERP, QMS, EDMS) used to support operations. This means CSA is, if anything, more applicable to biologics and pharmaceutical companies, which do not manufacture medical devices, than to medical device companies, which have both devices and non-product computerized systems.

CDRH has always been the source of further guidance on computerized systems validation. The historic reliance on CDRH is clearly illustrated in Figure 1.11 The size of each circle reflects the number of times a document is referenced; 21 CFR Part 11 is the most referenced and therefore the largest circle. General Principles of Software Validation (2002) is the second largest, as it is repeatedly referenced for further guidance on validation by other branches of the FDA.

Figure 1.

Unfortunately, General Principles of Software Validation (GPSV) is primarily focused on medical device software, and industry ended up misapplying the rigor recommended for medical devices to systems such as ERP and MES. Treating these systems as medical devices is the opposite of the least burdensome approach advocated by the agency.† CSA is the “missing bubble” in the upper right-hand corner: a guidance document for all the other computerized systems.

Fallacy #3: Screenshots are required as test evidence.

CSA joins GAMP® in pushing back on the over-reliance on screenshots as test evidence. GAMP® 5 states it clearly: “Supporting documentation such as printouts, screenshots, notes, pictures, etc., may be helpful to support test results depending on the nature of the test, and the GxP impact, complexity, and novelty of the area tested. Unnecessary supporting documentation that does not add value to the normal test results should be avoided”.13

Screenshots are little more than affidavits when results can be manipulated undetectably using a browser’s native functionality before a screenshot is taken, or after the fact using a multitude of freely available editing tools.‡ Given the ease of manipulation, screenshots are equivalent to attestations of pass/fail. The belief that screenshots provide additional assurance is one of the primary reasons validation is inefficient. Traditional validation testing is measured by the absence of errors, typically achieved through repetitive dry runs prior to execution, with screenshots captured for both passes and failures; modern testing is measured by the number and severity of issues detected using unscripted and scripted techniques, with screenshots captured only for failures. The efficiencies gained can be repurposed to increase patient safety, product quality, and data integrity through automation, increased testing, etc.
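
To make the contrast concrete, here is a minimal sketch (in Python, with hypothetical names such as capture_evidence and test_audit_log.jsonl) of an automated check runner that writes a timestamped pass/fail record for every check and retains detailed evidence only for failures. It illustrates the idea; it is not a validated tool or a prescribed method.

```python
# Minimal sketch, not a validated tool: run automated checks, append a
# timestamped, structured record for every check to an audit log, and keep
# detailed evidence only for failures. File names and helpers are hypothetical.

import json
from datetime import datetime, timezone

def capture_evidence(name, details):
    """Hypothetical helper: persist detailed failure evidence
    (e.g., a screenshot path or response dump) to a file."""
    path = f"evidence_{name}.json"
    with open(path, "w") as f:
        json.dump(details, f, indent=2)
    return path

def run_checks(checks, log_path="test_audit_log.jsonl"):
    """Run each check; log pass/fail for all, attach evidence only on failure."""
    with open(log_path, "a") as log:
        for name, check in checks.items():
            record = {"check": name,
                      "timestamp": datetime.now(timezone.utc).isoformat()}
            try:
                check()
                record["result"] = "pass"      # pass/fail attestation only
            except AssertionError as exc:
                record["result"] = "fail"
                record["evidence"] = capture_evidence(name, {"error": str(exc)})
            log.write(json.dumps(record) + "\n")

def check_audit_trail_enabled():
    assert True                                  # stands in for a real check

def check_batch_status():
    assert "approved" == "rejected", "status mismatch"   # deliberately fails

if __name__ == "__main__":
    run_checks({"audit_trail_enabled": check_audit_trail_enabled,
                "batch_status": check_batch_status})
```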

Fallacy #4: Automation and ancillary tools must be qualified.

CSA promotes the use of automation and ancillary tools (e.g., ALM, database compare utilities). CSA is reiterating what is already stated in the final 21 CFR Part 11 Scope and Application guidance, which clarifies that these tools do not trigger separate validation efforts unless there is a predicate rule other than Part 11. Overly broad interpretations have led to the “...unnecessary controls and costs and could discourage innovation and technological advances without providing added benefit to the public health” that Scope and Application was seeking to avoid. For example, we have seen acceptance sampling used instead of a tool that could perform 100% verification; the decision was made to avoid the perceived, and unnecessary, documentation burden of using the tool. Consider Application Lifecycle Management (ALM) tools: an ALM tool used at a regulated company typically does not need to be validated, given the absence of predicate rules, but the regulated company will need to validate it if they choose to use it for predicate-rule design control activities. Note that the absence of a validation requirement does not obviate the need to test; the regulated company must still ensure the tool meets its business needs for its own purposes rather than for inspection purposes. The documentation of such testing is at the discretion of the regulated company.
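
As an example of the kind of ancillary tool in question, the sketch below performs a record-by-record comparison of two data exports, verifying 100% of records rather than an acceptance sample. The file names and the "id" key column are hypothetical assumptions, and a real migration check would add handling for data types, encodings, and field-level reporting.

```python
# Minimal sketch, assuming two CSV exports keyed by a unique "id" column
# (the file names and key column are hypothetical): a record-by-record
# compare that verifies 100% of migrated records instead of a sample.

import csv

def load_records(path, key="id"):
    with open(path, newline="") as f:
        return {row[key]: row for row in csv.DictReader(f)}

def compare(source_path, target_path, key="id"):
    source = load_records(source_path, key)
    target = load_records(target_path, key)
    return {
        "records_checked": len(source),
        "missing_in_target": sorted(set(source) - set(target)),
        "unexpected_in_target": sorted(set(target) - set(source)),
        "field_mismatches": sorted(k for k in source.keys() & target.keys()
                                   if source[k] != target[k]),
    }

if __name__ == "__main__":
    # Hypothetical exports from the legacy and new systems
    print(compare("legacy_export.csv", "new_system_export.csv"))
```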

Fallacy #5: Internal development testing cannot be leveraged in validation efforts.

One of the most critical aspects of CSA is leveraging work that has already been completed, whether that testing was done at the vendor or internally through the SDLC, e.g., internal test cycles. Regulators’ positions substantiate this view: a) vendor testing can be leveraged for functional verification;14 and b) the sponsor may rely on qualification documentation provided by the vendor, if the qualification activities performed by the vendor have been assessed as adequate.15 Annex 11, clause 3.1, stipulates that IT be treated analogously to a supplier.16

Some regulated companies struggle with this concept and attempt to stipulate QA oversight that transforms technical issues into needless compliance issues (e.g., QA sign-offs and GxP procedures), essentially constraining modern testing approaches to inefficient validation cycles based on software engineering practices from the 1990s. One notable example is refusing to accept regression testing based on technical impact assessment during pre-validation testing, i.e., insisting the system be retested 100% each time a change is made for that testing to be leveraged in validation. This is notable because it is contrary to how validation itself operates: we do not retest 100% of the system each time a change is introduced; we retest based on a technical impact assessment for defects, releases, changes, etc.
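
A technical impact assessment can be as simple as a maintained trace matrix from system modules to the test suites that cover them. The sketch below (the module and suite names are hypothetical) selects only the regression tests affected by a change, rather than re-running 100% of the suite.

```python
# Minimal sketch, assuming a maintained trace matrix from system modules to
# the test suites that cover them (module and suite names are hypothetical):
# select regression tests from the technical impact of a change rather than
# re-running 100% of the suite.

TRACE_MATRIX = {
    "batch_record":   ["test_batch_record", "test_e_signature"],
    "label_printing": ["test_label_printing"],
    "user_admin":     ["test_user_roles", "test_audit_trail"],
}

def select_regression_tests(changed_modules):
    """Return the union of test suites covering the changed modules."""
    selected = set()
    for module in changed_modules:
        selected.update(TRACE_MATRIX.get(module, []))
    return sorted(selected)

# A change touching only label printing triggers only the related suite.
print(select_regression_tests(["label_printing"]))  # ['test_label_printing']
```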

While Appendix S2 on CSA in the GAMP® GPG Data Integrity by Design discusses the concept in depth, the following quote from the FDA’s Vicenty captures it all:

“If you’ve already done the work, use the work that is done (SQA+ or high risk requirements) … (Are you not doing it because the) right work hasn’t been done vs. work not done compliantly? Just put it in your SOP, you will be leveraging SQA and now it will be a compliant process”

Conclusion

An airplane pilot does not fly more carefully when she is carrying passengers than when she is carrying cargo. By focusing on her own safety, she protects her life by flying carefully, and both passengers and cargo arrive safely. Validation should be the same. The regulated company should focus on ensuring the system meets its needs with respect to functionality, patient safety, product quality, and data integrity. Screenshots and perceived regulatory burden are red herrings that distract us from the primary goals. The purpose of validation is to protect the public health. In fact, when regulators want to confirm a system is working properly, they will go to the system in production rather than to the validation environment or a stack of screenshots in a validation package. When regulators want to assess the validation program, they will review it from two perspectives: do you do what you say you do, and is it enough? To answer the first question, CSA should be incorporated into the regulated company’s QMS or validation plans. To answer the second, you can show it is enough using the risk-based documentation, the audit trail from the test cycle, metrics, direct demonstrations of the system in production, etc.

Focusing on the quality of the computer software will lead to a quality culture in which compliance is a by-product of the culture. Focusing on documentation for inspection leads to “...unnecessary controls and costs and could discourage innovation and technological advances without providing added benefit to the public health”.7

In January 2021, the FDA-Industry CSA team polled more than 570 industry peers, practitioners, and subject matter experts during the live webinar session “Panel Discussion with FDA and Industry Team”.17 The key insights were:

When will you start implementing CSA?

  • 46% immediately or this year
  • 47% unsure
  • 7% had no plans

If unsure or no plans (54%), what is preventing you from adopting CSA?

  • 40% need help implementing
  • 33% waiting for others
  • 27% facing internal resistance

We are at a tipping point and can boldly speculate, based on the second question, that the 46% will grow to roughly 68% once those needing help receive it, and to over 85% once people stop waiting for others. We hope this article provides the guidance and clarification necessary to start your journey toward the adoption of CSA, and that we have swayed some of you who need help, or are waiting for others, to explore implementing it now.

Having worked with the FDA-Industry CSA team on CSA and with ISPE GAMP® on the GPG on Data Integrity by Design, we can say, CSA is already here and within the scope of current regulations.

References

  1. Safety, Efficacy, and Quality Remain Top Priorities as We Continue Our Work to Expand Access to Cost-Saving Generic Drugs for the American Public, FDA. Available at: https://www.fda.gov/news-events/fda-voices/safety-efficacy-and-quality-remain-top-priorities-we-continue-our-work-expand-access-cost-saving. Accessed: February 28, 2021.
  2. Accelerating the Adoption of Advanced Manufacturing Technologies to Strengthen Our Public Health Infrastructure, FDA. Available at: https://www.fda.gov/news-events/fda-voices/accelerating-adoption-advanced-manufacturing-technologies-strengthen-our-public-health. Accessed: February 28, 2021.
  3. Vicenty F, Murray J, McPhillips D, Bill F, Spiegler J. A Collaborative FDA and Industry Perspective: Automation + Non-Product CSV. Paper presented at: Collaborative Industry Event: FDA Siemens PLM Medtronic; May 15, 2018; Minneapolis, MN
  4. FDA, Guidance for Industry - The Least Burdensome Provisions: Concept and Principles, February 5, 2019
  5. MDIC- Medical Device Innovation Consortium, Panel Discussion FDA and Industry Collaboration on Computer Software Assurance (CSA), 20th Annual Computer and IT Systems Validation, April 23, 2019
  6. Department of Health and Human Services, Food and Drug Administration. 21 CFR Part 11 Electronic Records; Electronic Signatures; Final Rule Electronic Submissions; Establishment of. Public Docket; Notice. Docket No. 92N-0251. Published March 20, 1997
  7. FDA, Guidance for Industry, Part 11, Electronic Records; Electronic Signatures — Scope and Application, August 2003
  8. PIC/S, Good Practices for Computerized Systems in Regulated “GxP” Environments, September 25, 2007.
  9. Why ISPE GAMP® Supports the FDA CDRH: Case for Quality Program, ISPE Pharmaceutical Engineering, by Sion W, et al., November/December 2019. https://ispe.org/pharmaceutical-engineering/why-ispe-gamp-supports-fda-cdrh-case-quality-program. Accessed: February 28, 2021.
  10. ISPE, GAMP RDI Good Practice Guide: Data Integrity by Design, October, 2020.
  11. Shitamoto K, Gurumoorthi S. Computer Systems Assurance: Understanding FDA’s CSA Guidance in the context of Current Regulations and GAMP. Presented at: Eminence Business Media (EBM) and Compliance Groups, Computer Software Assurance (CSA) by FDA-Industry Team, GAMP GPG Data Integrity by design co-authors, February 18-19, 2021.
  12. Information is Beautiful. Millions of Lines of Code. https://www.informationisbeautiful.net/visualizations/million-lines-of-code. Accessed: February 27, 2021.
  13. ISPE, GAMP 5 Guide: Compliant GxP Computerized Systems, February 2008.
  14. Medicines & Healthcare Products Regulatory Agency (MHRA) - GxP Data Integrity Guidance and Definitions, March 2018
  15. European Medicines Agency (EMA)- Notice to sponsors on Validation and Qualification of Computerized systems used in Clinical Trials, April 7, 2020.
  16. European Commission Health and Consumers Directorate-General, EudraLex The Rules Governing Medicinal Products in the European Union Volume 4 Good Manufacturing Practice Medicinal Products for Human and Veterinary Use, Annex 11: Computerized Systems, June 30, 2011.
  17. Computer Software Assurance – CSA Revolution Series, FDA-Industry CSA Team, “FDA Discusses Data Integrity Challenges & CSA Application”, Season 2, Episode 2, Live Webinar Series, January 26, 2021.

Author Biographies


Ken Shitamoto is an avid fundraiser who spends his time focused on his family and giving back. He champions diversity and equality on behalf of his 16-year-old daughter.

Ken is the head of IT Quality Engineering at Gilead, a member of the FDA-Industry CSA Team, and a contributing author of the GAMP GPG Data Integrity by Design. He holds a BA in Molecular Cell Biology from UC Berkeley and an MS in Computer Science from San Jose State University.


Senthil Gurumoorthi has over 17 years of diverse experience in biopharmaceutical business technologies, with leadership expertise in technology delivery, risk, inspection, audit, and quality management. Senthil leads the IT Quality function at Gilead Sciences, is a member of the FDA-Industry CSA Team, and is a contributing author of the GAMP GPG Data Integrity by Design. He holds a B.E. in Electronics & Communication from PSG College of Technology, an MS in Electrical Engineering from New Jersey Institute of Technology, and a Master of Business Administration (MBA) from Imperial College London.


*FDA defines “least burdensome” as the minimum amount of information necessary to adequately address a relevant regulatory question or issue most efficiently at the right time.4 Moreover, FDA has further clarified the use of least burdensome in the context of computer software assurance: “Record needs to be of value to the regulated company, not the investigator or auditor”.5

†To make matters worse, the software engineering texts in the reference section of GPSV date, on average, to 1993. Imagine a mechanic attempting to work on a Tesla with practices and tools from 1993, when the average modern high-end car is estimated to contain 100 million lines of code.12 Would we accept a doctor who only used practices and therapies from 1993? We would not accept a 1993 standard of care in any part of our lives, whether from doctors, automobiles, or homes.

‡Even videos can be edited with “deep fake” face swaps by children using applications such as Reface on their iPhones.

+SQA is the development/pre-validation test cycle used at Gilead Sciences, which was the context of the conversation.

