Data Integrity and Laboratory Analytical Instruments

When we talk about pharmaceutical and biotech drugs, an immediate image of patients, brands and labels comes to mind; but it is less widely appreciated, at least by those who do not work in the industry, how complex these products are and how, every day, a battalion of professionals protects their integrity and safety. The heart of any pharmaceutical company resides in its Quality Control. I know, I know… as an engineer I may be biased, but the truth is the truth: Quality Control is present from the moment a drug is born to the last minute before it reaches our patients, and ourselves. That said, Quality Control does not have an easy task to fulfill, and I am here to talk about something that every drug developed, taken into clinical studies and then into commercial distribution must have: a process called Data Integrity.


Data integrity starts with the quality culture within the organization. Common elements are embedded into the business culture and leadership, and these translate into the company’s behavior. When I speak of data integrity, I always start with this note: we can excel in our quality control, technical and validation approaches, but if the values and business culture do not support awareness of the topic, transparency, prompt and accurate communication, and mechanisms to report issues, data integrity events will be observed on a frequent basis. As the ISPE GAMP Guide “Records and Data Integrity” mentions, “People, Process and Technology are interconnected and is the responsibility of the Data Governance and Senior Leadership to design an effective framework within the business.” But what does that mean, and how do we apply this concept in our labs?

One of the best processes that can support a reliable data integrity program is your computer system validation approach. But how do we get there? Simple. The idea centers on how the ALCOA+ concept (Attributable, Legible, Contemporaneous, Original and Accurate; plus Complete, Consistent, Enduring and Available) is integrated into the computer system validation lifecycle, and which elements are critical for its successful implementation. Below are four areas that I consider essential to start with; a minimal sketch of an ALCOA+ checklist follows the list:

  1. Senior Leadership acknowledges and supports the quality culture and data integrity governance.
  2. Controls are established and “lived” by the business. Controls need to be defined for the behavioral, procedural and technical spaces.
  3. Supporting processes are created. These can be numerous and will depend on the complexity of the business, its processes and its products. Examples of supporting processes include:
    1. GMP Audit Processes (including System Vendor Audits and Quality Technical Agreements)
    2. Validation Process
    3. Quality Metrics Reviews
    4. Process Engineering / Lean Quality Controls
  4. Business processes are created to define the groups and mechanisms that will maintain a compliant state (e.g., QC Business Analysts, IT Infrastructure Group).
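
To make the ALCOA+ integration a bit more tangible, here is a minimal sketch, assuming a simple Python checklist structure; the names and the example deliverable are hypothetical and not a prescribed implementation:

```python
from dataclasses import dataclass, field

# ALCOA+ attributes used as a simple checklist for validation deliverables.
ALCOA_PLUS = (
    "Attributable", "Legible", "Contemporaneous", "Original", "Accurate",
    "Complete", "Consistent", "Enduring", "Available",
)

@dataclass
class DeliverableAssessment:
    """Records which ALCOA+ attributes a validation deliverable addresses."""
    deliverable: str  # e.g. a URS, test protocol or SOP (illustrative)
    covered: dict = field(default_factory=lambda: {a: False for a in ALCOA_PLUS})

    def gaps(self):
        """Return the ALCOA+ attributes not yet addressed by this deliverable."""
        return [a for a, ok in self.covered.items() if not ok]

# Example usage (illustrative values only):
assessment = DeliverableAssessment("Chromatography data system URS")
assessment.covered["Attributable"] = True
assessment.covered["Contemporaneous"] = True
print(assessment.gaps())  # remaining attributes to address in the lifecycle
```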

Now that we have an overview of the main building blocks and foundation, let’s focus on analytical instruments: how, if they are properly installed, qualified and validated, they ensure a compliant and data-integrity-reliable environment, and how the computer system validation process can support a reliable data integrity program.

Currently, given the pace of technology, most analytical instruments should follow a computerized system lifecycle, including computer system validation. The computer system validation program must follow a lifecycle approach in which data integrity elements are introduced into each phase. For this, a risk management methodology is highly recommended. Risk management is the process through which risks related to an analytical instrument that is itself a computerized system, or is integrated with one, are identified, and controls are implemented to reduce those risks to acceptable levels. Three main concepts are essential to a successful risk management approach (a simple scoring sketch follows the list):

        1. Vendor Risk Assessment: Performed during the selection of a potential analytical instrument vendor and its software developer. This applies as well to system, application or SaaS vendors, and includes an evaluation of the supplier’s ability to support the specific software or service against applicable regulations, industry guidelines, and internal policies and procedures.
        2. System Risk Assessment: Performed during the planning phase; it determines the overall risk of the system against its intended use, applicable regulations and internal procedures. Note: it is recommended that the User Requirements Specification document be defined at this point.
        3. Functional Risk Assessment: Performed after the user requirements are developed; it determines risks related to individual functions and process steps, and the controls identified will be tested during validation testing. Note: many, if not most, analytical instruments are COTS (commercial off-the-shelf) products and do not necessarily require a functional risk assessment. This is where the company’s responsibility steps in: as “users” we must ask the vendor to provide risk assessments, system configuration, testing, support and maintenance documentation. The vendor may provide a qualification/validation package as well. These packages are not to be taken at face value; internal qualification per your company’s policies and procedures is highly recommended.
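
As an illustration only, a functional risk assessment is frequently reduced to a severity/probability/detectability scoring. The scales, thresholds and suggested controls in the sketch below are hypothetical and would be defined in your own risk management procedure:

```python
def risk_priority(severity: int, probability: int, detectability: int) -> str:
    """Classify a function's risk from 1-3 scores (3 = worst for each factor).

    Hypothetical scales and thresholds for illustration only; real programs
    define their own in the risk management SOP.
    """
    score = severity * probability * detectability  # ranges from 1 to 27
    if score >= 18:
        return "High"    # e.g. technical control plus dedicated validation test
    if score >= 6:
        return "Medium"  # e.g. procedural control plus targeted testing
    return "Low"         # e.g. rely on vendor documentation

# Example: audit trail function with high severity, low probability, poor detectability
print(risk_priority(severity=3, probability=1, detectability=3))  # -> "Medium"
```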

As the ISPE guide “GAMP – A Risk Based Approach to GxP Compliant Laboratory Computerized Systems” describes, “the number and level of detail of the specifications will vary depending upon the type of system and its intended use”. As an example, the guide notes that “software design specifications are not expected from the regulated organization for non-configured products.”

Going back to the three main concepts, we can say they are vital not only because they embrace the latest regulatory emphasis on process knowledge and risk identification, but also because they establish the link between the definition of requirements, the validation process and the data lifecycle. This takes us to a more exciting topic, which happens to be the “core” of any validation or qualification testing: the user requirements.

The user requirements must reflect how the analytical instrument in question is required to function from a procedural and technical perspective, including but not limited to the business processes, potential methods or assays, and data workflows. They should be developed for all laboratory analytical instruments, and it is important to keep in mind that each requirement should be traceable back to the corresponding regulations and risk assessments.
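
A minimal sketch of that traceability, assuming a simple matrix kept alongside the URS (the identifiers and test case numbers are invented for illustration), could look like this:

```python
# Hypothetical requirements traceability matrix: URS item -> regulation,
# risk assessment reference and validation test case. Identifiers are illustrative.
traceability_matrix = [
    {
        "urs_id": "URS-AT-001",
        "requirement": "System shall automatically generate an audit trail for record changes",
        "regulation": "21 CFR Part 11.10(e)",
        "risk_ref": "FRA-012",
        "test_case": "OQ-TC-045",
    },
    {
        "urs_id": "URS-ES-003",
        "requirement": "Electronic signature shall record meaning, date and time",
        "regulation": "21 CFR Part 11.50",
        "risk_ref": "FRA-020",
        "test_case": "OQ-TC-051",
    },
]

# A quick completeness check: every requirement needs a regulation and a test case.
untraced = [r["urs_id"] for r in traceability_matrix if not (r["regulation"] and r["test_case"])]
print("Untraced requirements:", untraced or "none")
```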

There are many critical areas that should be included in a URS, but within the scope of this article the following five are directly related to data integrity and must be included within the URS content to satisfy data integrity elements within the validation space. The areas and example requirements are listed below, followed by a short audit trail sketch:

      1. Electronic Records
        • System records retention; records are retrievable and reproducible
        • System provides the ability to prevent records from deletion
        • System generates accurate and complete copies of electronic records in human readable and electronic form
        • System applies electronic record time stamps with an accurate system date/time
      2. Electronic Signatures
        • Link between electronic and handwritten signatures
        • Unique user IDs and passwords attributable to the electronic signature
        • Meaning, date and time of the electronic signature recorded
      3. Access and Security
        • System shall require and authenticate a unique User ID and password combination for each individual user account to control access
        • System shall store passwords in non-human-readable form
        • System controls shall provide the administrator the ability to change user group authorization
        • System shall have password expiration controls
      4. Audit Trail
        • System shall automatically generate an audit trail for user actions that includes: creation, modification, deletion of electronic records and electronic signatures
        • System shall capture user ID, date and time at minimum
        • System shall prevent previously recorded data from being obscured
        • Audit trail information shall be accessible for review
        • Audit trail shall not be modified, deleted or disabled by users
      5. General Data Integrity
        • Original data shall be available for access, review or transcription; transcribed data, if any, shall be traceable to its source
        • All data shall be backed up, and backed-up data shall be restorable on the operational system
        • When system data is archived, requirements shall address the following:
          • Ability to track changes to archived data
          • Record retention period
          • Data restore services (third party)
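
To illustrate the audit trail expectations above, here is a minimal, hypothetical sketch of an append-only audit trail structure that captures user ID, date/time and action without obscuring previously recorded data; it is not any vendor’s actual implementation:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)
class AuditTrailEntry:
    """Immutable audit trail record: who, when, what, and the old/new values."""
    user_id: str
    action: str                 # e.g. "create", "modify", "delete"
    record_id: str
    old_value: Optional[str]
    new_value: Optional[str]
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class AuditTrail:
    """Append-only log: entries can be added and reviewed, never altered in place."""
    def __init__(self) -> None:
        self._entries: list = []

    def record(self, entry: AuditTrailEntry) -> None:
        self._entries.append(entry)  # previously recorded data is never overwritten

    def review(self) -> tuple:
        return tuple(self._entries)  # read-only view for periodic audit trail review

# Example: a modification to a sample result, attributable to a named analyst
trail = AuditTrail()
trail.record(AuditTrailEntry("analyst01", "modify", "SAMPLE-123", "98.1", "98.3"))
print(trail.review()[0])
```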

Once the user requirements are properly identified, the computer system validation lifecycle will drive the proper sequence of validation testing per the associated risk level, ensuring that the correct testing is performed against the data integrity elements previously defined. A V-model validation approach and USP <1058> (Analytical Instrument Qualification) are highly recommended to confirm the relationship between requirements and testing. After this, it is up to the data integrity program to define a data lifecycle process in which data flows within a system can be mapped. This supports reliable decision making when changes are proposed, because it identifies data ownership and the potential risks and controls within the creation, processing, review, reporting and use, retention/archival and retrieval, and destruction of the data (the data lifecycle phases).
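
As a hypothetical illustration of such a data flow map (the system name, owners and controls are invented for the example), the lifecycle phases can be captured per system so that ownership and controls are visible whenever a change is proposed:

```python
# Hypothetical data lifecycle map for one lab system; names and controls are illustrative.
data_lifecycle_map = {
    "system": "HPLC chromatography data system",
    "phases": {
        "creation":           {"owner": "QC Analyst",        "controls": ["unique login", "audit trail"]},
        "processing":         {"owner": "QC Analyst",        "controls": ["locked processing methods"]},
        "review":             {"owner": "QC Reviewer",       "controls": ["second-person review", "audit trail review"]},
        "reporting_and_use":  {"owner": "QC Management",     "controls": ["approved report templates"]},
        "retention_archival": {"owner": "IT Infrastructure", "controls": ["validated archive", "retention schedule"]},
        "retrieval":          {"owner": "IT Infrastructure", "controls": ["restore verification"]},
        "destruction":        {"owner": "Records Management","controls": ["documented approval"]},
    },
}

# When a change is proposed, list the owners who must assess its impact.
impacted_owners = sorted({p["owner"] for p in data_lifecycle_map["phases"].values()})
print(impacted_owners)
```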

But here is where it gets interesting. How do we make sure that our analytical instruments are maintained correctly and in accordance with data integrity expectations? How do we make sure that hybrid processes (manual and electronic) are properly risk assessed so that controls can protect the records and raw data generated? There is no simple answer. One recommendation is to map out the relationships between the laboratory computerized system, the applicable regulations, any instrumentation attached to the system, and the meaning of its records and raw data. This last step leads you to identify the metadata and to make sure that raw data is attributable to a specific meaning. The process can start with an understanding of the predicate rule regulations and the identification of a single approach that satisfies all applicable regulations. I will say that most data integrity problems and current citations arise from a lack of evaluation of the current lab software or computerized system state, and its associated processes, against the “new expectations” in regulations and industry guidance. The evaluation may result in the acknowledgement of a “lack of technical controls”, and that might be… OK; to reach a data integrity compliant environment, both procedural and technical controls must be in place. As an example, the CFR (Code of Federal Regulations) requires policies and procedural controls to govern the use of electronic records, because software may contain technical controls that support the regulations, but that does not mean it complies with them. That is why it is our responsibility, as the owners, to qualify, validate, audit, assess and monitor.
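
The sketch below is a hypothetical example of that metadata mapping (the file names, required attributes and values are invented): for each raw data file, confirm that the metadata that makes it attributable and meaningful is actually present.

```python
# Hypothetical metadata inventory check; required attributes and records are illustrative.
REQUIRED_METADATA = ("analyst_id", "acquired_at", "instrument_id", "method_id", "sample_id")

raw_data_records = [
    {"file": "inj_0001.dat", "analyst_id": "analyst01", "acquired_at": "2023-05-02T10:14:00Z",
     "instrument_id": "HPLC-07", "method_id": "ASSAY-XYZ-v3", "sample_id": "LOT-42-01"},
    {"file": "inj_0002.dat", "analyst_id": "analyst01", "acquired_at": "2023-05-02T10:41:00Z",
     "instrument_id": "HPLC-07", "method_id": "ASSAY-XYZ-v3"},  # sample_id missing
]

for record in raw_data_records:
    missing = [attr for attr in REQUIRED_METADATA if attr not in record]
    if missing:
        # Raw data without this metadata is not fully attributable to its meaning.
        print(f"{record['file']}: missing metadata -> {missing}")
```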

Procedural controls will support QC lab instrument software and its compliant state in terms of data integrity, but they need to satisfy the following requirements:

      • Staff are trained on the SOP
      • There is assurance that the SOP is followed
      • Adherence to the SOP is confirmed by Quality Systems oversight and the internal audit process

In summary, the key concepts for a compliant analytical instrument implementation are bucketed into these six categories:

      • Procedures
      • Training of personnel
      • Software development activities (software technical controls per URS)
      • Testing activities (Validation)
      • Quality management systems
      • Infrastructure

Wrapping up this topic, I need to cover one more area that is gaining high visibility in the biotech industry: Bioanalytical Method Validation. The new guidance document was issued in May 2018, and we must note that data integrity elements were embedded into it. Four areas in which data integrity resonates are bioanalytical method development, validation, reference standards, and documentation.

On the bioanalytical method development topic, we must note that the guidance now specifies that the sponsor or owner shall record changes to procedures, as well as any issues and their resolution, during development and provide a documented rationale. Method development must ensure that the method is suitable for validation and, once in the validation environment, the expectation is that validation of the method demonstrates its suitability for the analysis of the specific study samples.

On reference standards and documentation, data integrity topics are noted as well. For reference standards, the sponsor/owner shall use authentic analytical reference standards with known identities and purities to prepare solutions of known concentrations; additionally, the reference standard should be identical to the analyte, with exceptions such as the use of an established chemical form of known purity (e.g., free base, free acid, salt). It is also expected that the sponsor/owner provide a Certificate of Analysis (CoA) with all the critical traceability information (source, lot number, expiration date). As for documentation, the revised guideline has a whole section dedicated to its requirements. Some of the top requirements in relation to data integrity cover 1) a summary of the assays used and cross-references, 2) a summary table with all relevant method validations and codes, 3) references to the analytical site, instruments and validation report per item (e.g., blank matrix, SOP, repeat analysis), and 4) per parameter, validation requirements shall include elements and acceptance criteria as well as “in-study” analysis requirements.

As a result, applying these concepts will drive the use of appropriate risk management methodologies, analytical instrument qualification and computerized system validation, and the technical and procedural controls that support adequate analytical instrument use and a sound data integrity model within the Quality Control organization, directing your “QC labs” toward a more mature data integrity state.
