GAMP Good Practice Guide for Validation of Laboratory Computerized Systems, Part 1

Article

LCGC Europe

1 May 2006
Volume 19
Issue 5
Pages: 274–282

The best parts of the GAMP laboratory system GPG are the life cycle models for both development and implementation of computerized laboratory systems.

Over the past few years I have not discussed guidance documents on computer validation for chromatographic systems and chromatography data systems (CDS) in any great detail, but have concentrated on specific topics from the regulations themselves. This is because most guidance has focused largely on computerized manufacturing and corporate systems, rather than laboratory systems.

This has changed now with the publication of the Good Automated Manufacturing Practice (GAMP) Forum's Good Practice Guide (GPG) on Validation of Laboratory Computerized Systems.1 However, this publication needs to be compared and contrasted with the AAPS publication on Qualification of Analytical Instruments (AIQ).2 Both publications have been written by a combination of representatives from the pharmaceutical industry, regulators, equipment vendors and consultants.

This will be a two-part discussion of the guide and of where we should go to adequately cover both equipment qualification and the validation of chromatography-based laboratory systems.

Overview of the Guide

Published in 2005, the stated aim of the GPG is to develop a rational approach for computerized system validation in the laboratory and provide guidance for strategic and tactical issues in the area. Section 5 of the GPG also notes that "...the focus should be on the risk to data integrity and the risk to business continuity. The Guide assumes that these two factors are of equal importance."1

The GPG notes that companies need to establish their own policies and procedures based on their own risk management approaches. Of interest, the inside page of the GPG states that if companies manage their laboratory systems with the principles in the guide there is no guarantee that they will pass an inspection — therefore caveat emptor!

The guide consists of a number of chapters and appendices, as shown in Table 1. As you can see, the order of some of the chapters is a little strange: for example, why is the validation plan written so late in the life cycle, and why is the chapter on training of personnel positioned after the validation report has been written? However, at least the main computer validation subjects are covered across the whole life cycle, including system retirement. The GPG also cross-references the main GAMP version 4 publication for further information on a number of topic areas where appropriate.3

Table 1: Contents of the GAMP GPG on validation of laboratory systems.

One major criticism is that the nine references cited in Appendix 5 are very selective and, therefore, the GPG ignores some key publications in this area such as:

  • Furman et al. on the debate of holistic (or system) versus modular validation or qualification of computerized equipment.4 This paper was written by three FDA personnel about the validation of computerized chromatographic equipment; ignoring it is not an option as it provides a scientific rationale for this two-level approach.

  • PDA Technical Report 18 on validation of computer-related systems that contains a more specific computer validation definition than the FDA process validation definition quoted in Section 3.1 of the GPG.5,6

  • AAPS Analytical Instrument Qualification white paper published in 2004, which was the outcome of a joint FDA-AAPS conference from 2003.2

Ignoring these papers biases the approach that this guide has taken and is a fatal flaw as we shall discuss later in this article.

Overall, the problem with this GPG is that you have to cherry pick the good bits from the bad. As with any performance appraisal system, let's start with the good news first and work our way downhill afterwards.

The Good News

The best parts of the GAMP laboratory system GPG are the life cycle models for both development and implementation of computerized laboratory systems.1 The writers of the guide are to be congratulated for producing life cycle models for development and implementation that reflect computerized systems rather than the manufacturing and process equipment presented in the original GAMP guide.3 The latter's V model life cycle is totally inappropriate for computerized and laboratory systems as it bears little resemblance to reality. The problem with the GAMP V model is that, immediately after programming, the system undergoes installation qualification (IQ): unit, module and integration or system testing is conveniently forgotten, ignored totally or implied rather than explicitly stated. The two models illustrated in the laboratory GPG are shown in Figure 1: the left-hand side shows the system development life cycle (SDLC), which is intended for more complex systems, and the right-hand side shows the system implementation life cycle (SILC) for simpler systems.

Figure 1: System development and system implementation life cycles (adapted from the GAMP laboratory GPG).

This reflects the fact that we can purchase a system, install it and then operate it, as shown on the right-hand side of Figure 1. The vast majority of equipment and systems in our laboratories are similar to this, but consider the question: has the SILC oversimplified the implementation process for all spectroscopy and other laboratory computer systems?

The argument for the SILC: For most computerized chromatographs and CDS in a post-Part 11 world, you will need to add user types and users to the system, and these will need to be documented for regulatory reasons, for example, the authorized users and access levels required by both the predicate rules and 21 CFR 11. However, after doing this for simpler systems, such as an integrator, we can go into the performance qualification stage; this is not mentioned specifically in the SILC.

The argument against the SILC: The software used in some laboratory computerized systems may need to be configured, a term that means selecting options in the software to alter its function within limits set by the vendor. This can be as simple as selecting which function will be used from two or three options or, in more complex systems such as a laboratory information management system (LIMS), using a language to set up a laboratory procedure or method. This factor is not accommodated specifically in either of the life cycle models. However, regardless of the approach taken, the system configuration must be documented, partly for your business, to allow reconfiguration of the system in case of disaster, but also for performing the validation.
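
One pragmatic way of meeting this documentation requirement is to capture the configuration as a structured, version-controlled record. The minimal sketch below, written in Python, is purely illustrative: the system name, option names, roles and values are hypothetical assumptions and are not taken from the GPG or from any vendor's software.

    # Hypothetical configuration record for a laboratory computerized system.
    # All names and values here are illustrative assumptions only.
    from dataclasses import dataclass, field, asdict
    import json

    @dataclass
    class SystemConfiguration:
        system_name: str
        software_version: str
        configuration_version: str
        options: dict = field(default_factory=dict)     # functions turned on, off or modified
        user_roles: dict = field(default_factory=dict)  # access levels for each user type

    config = SystemConfiguration(
        system_name="Example CDS",
        software_version="3.1",
        configuration_version="1.0",
        options={"audit_trail": "enabled", "electronic_signatures": "enabled"},
        user_roles={"analyst": "acquire and process data", "administrator": "full access"},
    )

    # Writing the record out produces a document that supports both
    # reconfiguration after a disaster and validation of the configured system.
    print(json.dumps(asdict(config), indent=2))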

In my view, the concept of the SILC is good but the scope and extent of it can be taken further than the Laboratory GPG suggests. Furthermore, it can also be aligned with the existing software categories contained in Appendix M4 of the current GAMP Guide.3 The rationale is that many laboratory systems are configured rather than customized; therefore we need more flexibility than the simple SILC presented in the Laboratory GPG. This is where the GAMP software categories, outlined in Version 4 Appendix M4, come in and where the Laboratory GPG makes computer validation more complex than it needs to be.

Figure 2: Modified system implementation life cycle options and macro development.

Figure 2 is my attempt to take the SILC principles further than the GPG does and align them with the existing GAMP software categories. A user requirements specification is the entry point for all the variations illustrated in Figure 2; the exit from the figure is the route to the qualification and configuration of the system. The definitions of the different GAMP software categories are:

  • GAMP 3 software: This is commercial off-the-shelf software (COTS). The SILC for this type of software is shown on the right-hand side of Figure 2. In essence, this is a modification of the GPG implementation cycle in which the documentation of security, access control and any other minor software configuration for run-time operation is substituted for the design qualification.

  • GAMP 4 software: This is configurable commercial off-the-shelf software (configurable COTS). Once the software functions have been understood, an application configuration specification can be written stating which functions in the software will be used, turned on, turned off or modified. After the software has been installed and the IQ and operational qualification (OQ) have been performed, the software can be configured according to the configuration specification documents.

  • GAMP 5 software: This is usually a unique and custom application; however, in the context of CDS software, it is typically a custom calculation, macro or custom program written to perform a specific function. Therefore, both GAMP 4 and GAMP 5 software can exist in the same system, and the GAMP 5 aspects are typically an addition to the normal functionality of the software rather than a substitute for it.

Therefore, the life cycles for a CDS in this category will be the GAMP 4 software cycle (central flow in Figure 2) plus the additional steps for GAMP 5 macros or calculations (pictured on the left-hand side of the same diagram). Here, there needs to be a specification for the macro or calculation (name plus version number), followed by the programming or recording of the macro. Against this specification there will be formal testing to ensure that the functionality works as specified. Once this has been done, the macro is installed with the application and tested under the performance qualification (PQ) phase of validation as an integral part of the overall system.
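
To make the specify-then-test step concrete, the short sketch below shows, in Python, the kind of custom calculation a CDS macro might implement, together with a formal test against its specification. The function name, formula, specification identifier and acceptance values are all hypothetical assumptions for illustration; they are not taken from the GPG or from any particular CDS.

    # Hypothetical custom calculation (specification "MACRO-001 v1.0"):
    # assay as a percentage of the nominal sample concentration, calculated
    # from peak areas and the standard concentration. Illustrative only.

    def percent_assay(sample_area: float, standard_area: float,
                      standard_conc: float, nominal_conc: float) -> float:
        """Return the assay result as a percentage of the nominal concentration."""
        if standard_area <= 0 or nominal_conc <= 0:
            raise ValueError("standard_area and nominal_conc must be positive")
        found_conc = (sample_area / standard_area) * standard_conc
        return 100.0 * found_conc / nominal_conc

    # Formal test against the specification: a worked example with a known answer.
    # 500/1000 x 0.2 mg/mL = 0.1 mg/mL found; 0.1/0.1 nominal = 100.0%
    def test_percent_assay():
        result = percent_assay(sample_area=500.0, standard_area=1000.0,
                               standard_conc=0.2, nominal_conc=0.1)
        assert abs(result - 100.0) < 1e-9

    if __name__ == "__main__":
        test_percent_assay()
        print("MACRO-001 v1.0: test passed")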

The Bad News

Here, in my view, is where the GPG creates problems rather than solves them. My rationale is that computer validation is considered difficult by some people — therefore conceptual simplicity is a key issue for communication and understanding to ensure that we do not do more than is necessary, dependent on the risk posed by the data generated by a system. In the words of Albert Einstein: keep it as simple as possible — but no simpler. Don't look for simplicity in certain sections of this guide as it's not there.

Do We Validate or Qualify?

There is always a debate in the laboratory over the qualification or validation of laboratory equipment and computerized systems. Section 3.1 of the GPG discusses the qualify-equipment-or-validate-computer-systems debate; it proposes to simplify the approach by classifying all equipment and systems under the single topic of "validation". However, this goes against how the rest of the organization works; it is important to emphasize that laboratories are not unique islands inside an organization; rather, they are an integral component of it.

The inclusion of ANY item of laboratory equipment with a computer chip, from a pH meter upwards (GAMP Category 2 software and above), as a "computerized laboratory system" is wrong, in my view, as it will create much confusion, especially as it goes against the advice of the GAMP Guide in Appendix M4, which states that the validation approach for Category 2 systems consists of qualification steps or activities. Therefore, we now have conflicting guidance from the same organization on the same subject: you can't make this stuff up!

The GPG is doubly wrong following the publication of the draft general chapter <1058> on Analytical Equipment Qualification for the US Pharmacopeia, which kills stone dead the rationale for including everything under the term validation.7 I will discuss this publication alongside the AAPS AIQ paper in part two of this column.

Therefore, let us get the terminology right. We qualify

  • instruments

  • equipment

We validate

  • systems

  • processes

  • methods

We also calibrate

  • instruments

  • equipment.

These simple principles are easy to grasp and allow any laboratory the full flexibility to make use of risk-based approaches to regulatory compliance. You do not usually need to do as much work to qualify an instrument for an intended purpose as you would to validate a computerized system. In overview, the reason is that for a computerized system you will typically need to qualify the instrument as well as validate the software, which means more work because it is usually a more complex system.

Is this separation of "qualify equipment" and "validate systems" too simplistic? Yes for two reasons:

1. Do we have clear and agreed definitions of "laboratory equipment" and "laboratory system"? No.

2. Have we forgotten that all CDS have both the instrument (equipment) and system components (computer and training elements)? You can't operate the equipment without the system and vice-versa. Therefore, we need an integrated approach to these two issues which will be discussed in part two of this column.

The debate is also clouded by the lack of a suitable definition of "qualification". It is a difficult word to define as it is used in a variety of ways, such as in design, installation, operational and performance qualification. Qualification is defined in ICH Q7A GMP for active pharmaceutical ingredients as:

Action of proving and documenting that equipment or ancillary systems are properly installed, work correctly and actually lead to the expected results. Qualification is part of validation, but the individual qualification steps alone do not constitute process validation.8

The first part of the definition is fine for equipment but the qualifying (sorry!) sentence means that here qualification is inextricably linked to validation. So we have a problem. However, a PIC/S guidance document with the snappy title of Validation Master Plan, Installation and Operational Qualification, Non-Sterile Process Validation, Cleaning Validation has some thoughts on the qualification versus validation debate:9

2.5.2: The concept of equipment qualification is not a new one. Many suppliers have always performed equipment checks to confirm functionality of their equipment to defined specifications, both prior to and after installation.

So for the purposes of our discussion we can start to tease out what a qualification process actually is:

  • Equipment is specified by the laboratory

  • Installation is properly undertaken

  • Equipment works correctly.

Of course, all stages are associated with appropriate documentation.

We must also consider the requirements of the GMP regulations under §211.160(b) for scientific soundness:10

Laboratory controls shall include the establishment of scientifically sound and appropriate specifications, standards, sampling plans and test procedures designed to assure that components, drug product containers, closures, in-process materials, labelling and drug products conform to appropriate standards of identity, strength, quality and purity.

Therefore, the specifications for equipment and/or computerized systems and the tests used to qualify or validate them should be grounded in good science and include where necessary the use of traceable reference standards.

Therefore, don't forget the impact of calibration: both on a formal basis against traceable standards (typically after a chromatograph has been serviced) and on a regular basis before a system is used to make a measurement, for example, a system suitability test. This all adds up to scientifically based control of the system, the chromatograph and potentially also a method.

Categorization of Laboratory Systems

Section 2 of the Lab GPG notes:1

In GAMP 4, systems are viewed as a combination of individually categorized software and hardware elements. The proposed approach in this Guide is that Laboratory Computerized Systems can be assigned a single classification based upon the technical complexity of the system as a whole and risk to data integrity.

Appendix M4 of the GAMP Guide classifies software into five categories, from operating systems (Category 1) to custom or bespoke software (Category 5); this is shown on the left-hand side of Figure 3. Note, as we have discussed earlier, that more than one category of software can exist in a system; for example, GAMP Categories 1 and 3 for a basic CDS integrator (a commercial off-the-shelf package running on a PC) plus Category 2 firmware within the chromatograph.

Figure 3: Classification of laboratory systems by GAMP main guide and the laboratory GPG.

In an attempt to be all-encompassing for laboratory systems, the GPG has included ALL instruments, equipment or systems with software of any description. Instead of five categories of software, we now have seven (Categories A to G). The categories devised for the Laboratory GPG are based on four principles:

1. Configuration: The software used in the system varies from firmware that cannot be modified, through parameterization of firmware operating functions and proprietary configurable elements, up to bespoke software (these are encompassed in GAMP version 4 software categories 2–5).

2. Interfaces: From stand-alone instruments to a single interface to another system and through to multiple interfaces to the system.

3. Data processing: From conversion of analogue to digital signals to post-acquisition processing.

4. Results and data storage: From no data generated to methods, electronic records and post-acquisition processing results.

However, the approach outlined in the GPG is wrong again as it separates and isolates the laboratory from the rest of the organization when, in reality, the laboratory is an integral part of any regulated operation from R&D to manufacturing. We cannot have an interpreter at the door of the laboratory who translates the GAMP categories used in the rest of an organization into Lablish (laboratory computerized system validation English). There needs to be a single, unified approach to computerized system validation throughout an organization at a high level, one that acknowledges that there will be differences in approach as one gets closer to the individual quality systems, for example, GMP, GLP etc., and the individual computer systems. To do otherwise is sheer stupidity.

Do You Really Want to Validate a Dishwasher?

Some of the typical systems classified by the GPG are shown on the right-hand side of Figure 3. In contrast, the left-hand side and centre columns show how systems from the traditional GAMP software categories map to the new GPG categories. You'll also note that a system can be classified in more than one GPG class depending on its software functions. In devising this classification system, the GPG proposes to include balances, pH meters, centrifuges and glass washers as "laboratory computerized systems". Strictly speaking this is correct: the items of equipment mentioned above all have firmware or ROM chips that allow them to function.

According to the main GAMP Guide, all these items of equipment would be classified as Category 2 and "qualified" as fit for intended use. Under the Laboratory GPG, they are split into two classes (A and B) and are "validated" as fit for purpose. The comparison of the GAMP Guide and Laboratory GPG software classifications is shown in Figure 3, and the arrows in the middle of the diagram indicate how the two classification systems map onto and compare with each other.

The horror that some of you may now be feeling at the suggestion to validate a balance, pH meter or centrifuge is more about the terminology used than about the work that you would do. Moreover, as we get to more complex laboratory systems, such as a LIMS, the Laboratory GPG suggests that the GAMP 4 categories may be more suitable! It really depends on the functions that the equipment or system performs and how critical it is.

Table 2 shows the comparison between the GAMP Guide and the GPG for classifying typical laboratory systems; under the latter, an HPLC with or without a data system can fall into two or three categories. Looking at most of today's CDS, it is difficult to imagine that any of them can fit into Lab Categories C and D (equivalent to GAMP Category 3), as shown in Table 2. The main CDS used within a regulated laboratory must be configured to work correctly. This, coupled with their use either to release or to develop product, means they are high-profile systems in any inspection. If there is any doubt, look at the Able Laboratories inspection of May 2005 to see how a company collapsed after an FDA inspection of its CDS.11 In addition, regulated CDS can have custom calculations input; these are unique to an individual laboratory or organization and must be validated as such.

Table 2: Comparison of system classification by GAMP and the laboratory GPG.

Summary

In today's risk-based environment, computer validation and equipment qualification should be getting easier, quicker and simpler. Although the GAMP GPG for laboratory computerized systems was published in 2005, it reads as if it were written under the older and more stringent regulatory approach that existed before 2002 and FDA's pharmaceutical quality initiative of 2004.

The good points of this document are the system implementation life cycle and, at last, a realistic system development life cycle. In the next column, I'll look at the risk assessment methodology outlined in the guide, the work by the AAPS and the USP on equipment qualification and a proposed way forward.

Robert McDowall is principal at McDowall Consulting, Bromley, Kent, UK. He is also a member of the Editorial Advisory Board for LCGC Europe.

References

1. GAMP Good Practice Guide: Validation of Laboratory Computerized Systems, International Society for Pharmaceutical Engineering, Tampa, Florida, USA (2005).

2. S.K. Bansal et al., Qualification of Analytical Instruments for Use in the Pharmaceutical Industry: A Scientific Approach, American Association of Pharmaceutical Scientists, USA (2004).

3. Good Automated Manufacturing Practice (GAMP) guidelines version 4, International Society for Pharmaceutical Engineering: Tampa, Florida, USA (2001).

4. W. Furman, R. Tetzlaff and T. Layloff, J. AOAC Int., 77, 1314–1317 (1994).

5. Validation of Computer-Related Systems, Parenteral Drug Association Technical Report 18, Journal of the PDA, 49, S1–S17 (1995).

6. FDA Guidance on Process Validation (1987).

7. Pharmacopeial Forum, <1058> Analytical Equipment Qualification, January 2005.

8. ICH Q7A Good Manufacturing Practice for Active Pharmaceutical Ingredients (2000).

9. Validation Master Plan, Installation and Operational Qualification, Non-Sterile Process Validation, Cleaning Validation PIC/S Guide PI 006-1 PIC/S Geneva (2001).

10. FDA Current Good Manufacturing Practice for Finished Pharmaceutical Products (21 CFR 211) (1978).

11. R.D. McDowall, Quality Assurance Journal, 11(1) (2006).