GAMP Good Practice Guide for Validation of Laboratory Computerized Systems, Part 1

Guidance to help with the validation of computerized systems used within regulated laboratories is always welcome, but is it always helpful? In this first part, I present an overview of the Guide, its approach to life cycle validation and its system classification.
May 01, 2006

Over the past few years I have not discussed in any great detail the guidance documents on computer validation for chromatographic systems and chromatography data systems (CDS), but have instead concentrated on specific topics from the regulations themselves. This is because most guidance has focused largely on computerized manufacturing and corporate systems rather than laboratory systems.

This has now changed with the publication of the Good Automated Manufacturing Practice (GAMP) Forum's Good Practice Guide (GPG) on Validation of Laboratory Computerized Systems.1 However, this publication needs to be compared and contrasted with the AAPS white paper on Analytical Instrument Qualification (AIQ).2 Both publications were written by a combination of representatives from the pharmaceutical industry, regulators, equipment vendors and consultants.

This will be a two-part discussion of the guide and of where we should go to adequately cover both equipment qualification and validation of chromatography-based laboratory systems.

Overview of the Guide

The GPG was published in 2005; its stated aim is to develop a rational approach to computerized system validation in the laboratory and to provide guidance on strategic and tactical issues in this area. Section 5 of the GPG also notes that "...the focus should be on the risk to data integrity and the risk to business continuity. The Guide assumes that these two factors are of equal importance."1

The GPG notes that companies need to establish their own policies and procedures based on their own risk management approaches. Of interest, the inside page of the GPG states that even if companies manage their laboratory systems according to the principles in the guide, there is no guarantee that they will pass an inspection, so caveat emptor!


Table 1: Contents of the GAMP GPG on validation of laboratory systems.
The guide consists of a number of chapters and appendices, as shown in Table 1. As you can see, the order of some of the chapters is a little strange. For example, why is the validation plan written so late in the life cycle, and why is the chapter on training of personnel positioned after the validation report has been written? However, at least the main computer validation subjects are covered across the whole life cycle, including system retirement. The GPG also cross-references the main GAMP version 4 publication for further information on a number of topic areas where appropriate.3

One major criticism is that the nine references cited in Appendix 5 are very selective; as a result, the GPG ignores some key publications in this area, such as:

  • Furman et al. on the debate over holistic (or system) versus modular validation or qualification of computerized equipment.4 This paper was written by three FDA personnel about the validation of computerized chromatographic equipment; ignoring it is not an option, as it provides a scientific rationale for this two-level approach.
  • PDA Technical Report 18 on the validation of computer-related systems, which contains a more specific definition of computer validation than the FDA process validation definition quoted in Section 3.1 of the GPG.5,6
  • The AAPS Analytical Instrument Qualification white paper published in 2004, which was the outcome of a joint FDA-AAPS conference held in 2003.2

Ignoring these papers biases the approach that the guide has taken and is a fatal flaw, as we shall discuss later in this article.

Overall, the problem with this GPG is that you have to cherry-pick the good bits from the bad. As with any performance appraisal system, let's start with the good news first and work our way downhill afterwards.