Information from guidance documents used in clinical investigations can be very useful to help meet FDA regulations regarding computer security in analytical laboratories.
R.D. McDowall, McDowall Consulting, Bromley, Kent, UK.
In May 2007 the FDA released the final version of a guidance document that you may be unaware of: Computerised Systems Used in Clinical Investigations.1 So why is this relevant to the computerised systems used in chromatography laboratories, you may ask? The answer is simple: in the absence of a rerelease of 21 CFR 11, this document provides the current formal thinking of the FDA on security and access control applied to computerised systems. This has become more important since the Able Laboratories fraud2,3 and will have an impact on the FDA's new approach to Pre-Approval Inspections,4 which aims to assess data integrity and detect fraud. This places security and access control as prime areas that must be managed to prevent fraud and to establish and maintain data integrity.
This guidance document has a long history: it was first released as a draft in April 1997 and finalized in April 1999 as the Good Clinical Practice (GCP) guidelines, which are very vague when it comes to discussing computerized systems. In the light of the FDA's reassessment of Part 11 in 2003, the clinical guidance document was reissued as a second edition in September 2004, and in May 2007 the final version was released under a slightly modified title that extends the scope of the document; this extension need not concern us in this discussion.
The minimum list of standard operating procedures (SOPs) required for computerized systems, described in Appendix A of the document, is also of interest. This is shown in the left-hand column of Table 1; you will need to interpret it in terms of what is relevant to an analytical laboratory. My interpretation is shown in the right-hand column, and it is my attempt to demonstrate that the list can be as useful in a chromatography laboratory as in a clinical environment. This minimum list of SOPs is a common-sense approach, consistent with current good practice in a non-regulated environment (e.g., ISO 17799:2005),5 and should not be thought of as an onerous set of regulatory requirements. Each item we discuss can be a single SOP or the topics can be distributed amongst several procedures; the choice is yours.
Table 1: List of SOPs and interpretation for an analytical laboratory.
However, the main aim of this column is to look in more depth at security and access controls, because these will come under more regulatory scrutiny in the future. The parts of the FDA clinical guidance that I will discuss are the internal and external security safeguards, covered in Sections D and E of the document respectively.
I'll present the points made by the FDA in my own sequence (Table 1) rather than the sequence presented in the guidance because, in my view, my way is more logical. The FDA writers have distributed facets of the same item between internal and external security safeguards; I prefer to take a different perspective of access control and logical security, which is a combination of the two. In the following text I'll make reference back to Section D or E of the original guidance document so that you can trace my thought process to the source.
It is also important to realise that this is a guidance document that contains in bold type at the top of each page the words "Contains Non-binding Recommendations" and alternative approaches to those outlined are acceptable.
The first part of system security is to limit access to your systems to only those people who need to use them. Section D1 of the guidance states: Access must be limited to authorized individuals (21 CFR 11.10(d)). This is a basic requirement not only of the FDA and EU GMP and GLP regulations and 21 CFR 11, but also of ISO 17799. This latter standard provides much practical advice on how to implement many of the security requirements for our computerized systems and comply with the regulations.
Section E also notes: Staff should be kept thoroughly aware of system security measures and the importance of limiting access to authorized personnel. This highlights the importance of initial and ongoing training of users.
The guidance states: We recommend that each user of the system have an individual account (D1). This is simply common sense, as it is the only way that any computer system can uniquely identify an individual and their actions; it is at the heart of the requirements for data integrity and authenticity. Where user identities have been shared, the system cannot differentiate between users and data integrity is severely compromised. This was the situation behind the Concord Laboratories warning letter issued by the FDA in July 2006:6 the laboratory managers logged on and set up chromatography systems that the staff then used for analysis. Because the managers' accounts were shared in this way, the individual responsible for any changes could not be identified; hence the unrequited love letter from the FDA.
Individual user accounts are also a requirement of ISO 17799. The standard goes further and recommends that, before a user is allowed to use a system, there is a formal authorization process initiated by the head of function, and that the access privileges granted are appropriate to the user's actual need. The last point may mean restricted access while a user trains on the system, with increased privileges coming once the skills to use the system correctly have been acquired.
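To make the idea of graded privileges concrete, here is a minimal sketch in Python; the role names and the rights assigned to them are purely my own illustration, not taken from the guidance or from ISO 17799:

```python
# Illustrative privilege tiers: the role names and rights below are my own
# assumptions, reflecting the idea that a trainee starts with restricted
# access and gains privileges as competence is demonstrated and documented.
ROLES = {
    "trainee": {"run_sequence"},
    "analyst": {"run_sequence", "integrate", "report"},
    "manager": {"run_sequence", "integrate", "report", "approve_results"},
}

def can(role, action):
    """Return True when the role is authorized for the action."""
    return action in ROLES.get(role, set())

def promote(user_roles, user, new_role):
    """Record a privilege change; in practice this needs documented authorization."""
    user_roles[user] = new_role

users = {"jsmith": "trainee"}
print(can(users["jsmith"], "integrate"))  # False while still training
promote(users, "jsmith", "analyst")
print(can(users["jsmith"], "integrate"))  # True after training sign-off
```

The key design point is that the check happens before the action is allowed, and privilege changes are an explicit, recordable event rather than an edit to the user's account that leaves no trace.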
Here the FDA requires a means, either inside the system or external to it, to maintain a cumulative record that indicates, for any point in time, the names of authorized personnel, their titles, and a description of their access privileges (E). So why would the FDA request this? Again, it comes down to the principles of data integrity. If an entry in the audit trail uses only the user ID to link actions, there may not be sufficient detail to identify an individual; therefore the cumulative list is recommended. It must also be possible to determine when an individual was first granted access to the system and when that access was terminated. This information will aid the integrity of data enormously.
Ideally, the chromatography data system should be able to provide this information; however, most do not because users have not requested this function. To meet this recommendation, most system administrators will therefore be left with a paper alternative, such as a regularly updated Excel or Word list. The information for each user should be:

- name and user identity;
- job title;
- a description of access privileges;
- the date access was granted;
- the date access was terminated (where applicable).
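As an illustration of such a cumulative record, here is a minimal Python sketch; the field names and helper functions are my own invention, but the content follows the guidance: who had access, their title and privileges, and when access was granted and withdrawn. Entries are only ever appended or closed, never deleted, so the record stays cumulative:

```python
# Sketch of a cumulative access record. Fields follow the guidance (name,
# title, access privileges) plus the grant/withdrawal dates; the structure
# and example names are my own illustration.
def add_user(log, user_id, name, title, privileges, granted):
    """Append a new entry; the record is cumulative, so nothing is deleted."""
    log.append({"user_id": user_id, "name": name, "title": title,
                "privileges": privileges, "granted": granted, "withdrawn": ""})

def withdraw_user(log, user_id, withdrawn):
    """Close the most recent open entry for user_id with a withdrawal date."""
    for entry in reversed(log):
        if entry["user_id"] == user_id and not entry["withdrawn"]:
            entry["withdrawn"] = withdrawn
            return
    raise ValueError(f"no open entry for {user_id}")

def users_at(log, when):
    """Who held access on a given ISO date: the question an inspector may ask."""
    return [e["user_id"] for e in log
            if e["granted"] <= when and (not e["withdrawn"] or e["withdrawn"] > when)]

log = []
add_user(log, "jsmith", "J. Smith", "Analyst", "run, integrate", "2007-01-15")
add_user(log, "alee", "A. Lee", "Lab Manager", "run, integrate, approve", "2007-02-01")
withdraw_user(log, "jsmith", "2007-06-30")
print(users_at(log, "2007-03-01"))  # ['jsmith', 'alee']
print(users_at(log, "2007-07-01"))  # ['alee']
```

The same append-only structure maps directly onto a dated table in Excel or Word: one row per grant of access, closed with a termination date rather than deleted.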
In addition to having unique user accounts, passwords must not be shared. As the FDA notes: Individuals should work only under their own password or other access key and not share these with others (I). This is also a requirement of most organisations' corporate computing or security policies, so it is not merely a regulatory matter. In some organisations, staff can be dismissed if they are caught sharing passwords.
Passwords should also be changed: We also recommend that passwords or other access keys be changed at established intervals commensurate with a documented risk assessment (I). This is good practice; however, do we really need a documented risk assessment when most corporate security standards already define password length, structure (e.g., number and composition of characters) and expiry time? My suggestion would therefore be to follow corporate standards, as these should be sufficient to meet the intent of this guidance.
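As a sketch of what a typical corporate password standard might check, consider the following Python fragment; the minimum length and maximum age used here are illustrative values only, and the real numbers should come from your own security standard:

```python
from datetime import date

# Illustrative policy values only: a real corporate security standard
# defines the actual length, composition and expiry rules.
MIN_LENGTH = 8
MAX_AGE_DAYS = 90

def password_ok(password: str) -> bool:
    """Check length and composition (letters plus digits) against the policy."""
    return (len(password) >= MIN_LENGTH
            and any(c.isalpha() for c in password)
            and any(c.isdigit() for c in password))

def password_expired(last_changed: date, today: date) -> bool:
    """True once the password is older than the permitted maximum age."""
    return (today - last_changed).days > MAX_AGE_DAYS

print(password_ok("chrom2007lab"))  # True: long enough, letters and digits
print(password_ok("short1"))        # False: too short
print(password_expired(date(2007, 1, 1), date(2007, 6, 1)))  # True: > 90 days
```

If such rules are already enforced centrally, pointing the system at the corporate standard is simpler than documenting a separate risk assessment for each laboratory application.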
The user should log into that account at the beginning of a data entry session, input information (including changes) on the e-record, and log out at the completion of data entry session. The system should not allow an individual to log onto the system to provide another person access to the system (I).
The recommendation here is a technical control to prevent the sharing of accounts; the main way this would be met is to have the user enter their personal password at key stages of the workflow. If this seems a little prescriptive, remember that the guidance is aimed at a clinical environment where many clinical investigators may not have basic compliance training.
However, here's a good one for translation to the laboratory: When someone leaves a workstation, the person should log off the system. Alternatively, an automatic log off may be appropriate for long idle periods. For short periods of inactivity, … an automatic screen saver can prevent data entry until a password is entered (I).
Now, the password-protected screen saver is fine in principle but may be poor in practice unless most of your systems are networked. If a screen saver comes on, it has the potential to prevent many stand-alone systems from working, and an automatic log off may stop your application from working altogether! Secure, but you don't get much work done.
In contrast, many networked CDS systems can be used like this as the instrument can be set up to run from a data or instrument server and the user's session can be stopped or terminated without impacting the analysis. Users of networked CDS systems should also be trained to lock their screens if they will be away from their terminals for any length of time.
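The behaviour described above, an inactivity lock that demands the user's own password without stopping the acquisition, can be sketched as follows; the idle limit and the class design are my own illustration, not taken from any particular CDS:

```python
# Sketch of an inactivity lock: after a defined idle period the session
# demands the user's password before further entries are accepted, while
# the acquisition itself (on a data or instrument server) keeps running.
IDLE_LIMIT = 300  # seconds; the actual value should come from policy

class Session:
    def __init__(self, password, now=0):
        self.password = password
        self.last_activity = now
        self.locked = False

    def touch(self, now):
        """Register user activity; lock the session after a long idle gap."""
        if now - self.last_activity > IDLE_LIMIT:
            self.locked = True  # screen-saver style lock; analysis continues
        self.last_activity = now
        return not self.locked

    def unlock(self, password):
        """Only the session owner's own password releases the lock."""
        if password == self.password:
            self.locked = False
        return not self.locked

s = Session("s3cret", now=0)
print(s.touch(100))        # True: within the idle limit
print(s.touch(1000))       # False: locked after a long idle gap
print(s.unlock("s3cret"))  # True: unlocked with the user's own password
```

The point of the design is that locking affects only the user interface session, which is why this approach suits networked systems where the instrument run continues on the server.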
A recommendation of the guidance is that: The system should be designed to limit the number of log-in attempts and to record unauthorized access log-in attempts (I).
This sounds fine in principle, but what constitutes unauthorised access? In the context of a chromatography laboratory, there is usually physical security restricting unauthorised access to the site, and sometimes to the laboratory itself, so an unauthorised access attempt will come either from untrained users of the software within the laboratory or from people external to it. A system therefore needs to log or audit trail all access attempts, both successful and unsuccessful; within a single session, the maximum number of unsuccessful attempts before an account locks is normally three or four. On a regular basis, a system administrator should review the unsuccessful attempts to determine whether any action needs to be taken.
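A minimal sketch of such a lockout control, assuming a three-attempt limit and an in-memory audit trail (both illustrative choices, not requirements of the guidance), might look like this:

```python
MAX_ATTEMPTS = 3  # three or four is typical; the exact value is an assumption

class LoginGuard:
    """Logs every attempt, successful or not, and locks after repeated failures."""
    def __init__(self, valid_password):
        self.valid_password = valid_password
        self.failures = 0
        self.locked = False
        self.audit = []  # every attempt is recorded for administrator review

    def attempt(self, password):
        if self.locked:
            self.audit.append("rejected: account locked")
            return False
        ok = password == self.valid_password
        self.audit.append("success" if ok else "failure")
        if ok:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= MAX_ATTEMPTS:
            self.locked = True
            self.audit.append("account locked after %d failures" % MAX_ATTEMPTS)
        return False

guard = LoginGuard("s3cret")
for pw in ["guess1", "guess2", "guess3", "s3cret"]:
    guard.attempt(pw)
print(guard.locked)  # True: locked before the correct password arrived
```

Note that successful attempts are recorded as well as failures; the administrator's periodic review described above depends on both being in the trail.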
Rather than access a system via the application, a more surreptitious way of accessing data is to use the file manager system of the operating system or a web browser. Therefore the guidance has the following to say: Procedures and controls should be put in place to prevent the altering, browsing, querying, or reporting of data via external software applications that do not enter through the protective system software (E).
This has already been cited in a 483 observation to the Gaines Chemical company, where an inspector reviewed a chromatography data system that used operating system files rather than a database to manage the data files generated.7 The observation noted that there were no validation data to demonstrate that an authorised user of the corporate WAN could not access analytical data on the laboratory's LAN.
Therefore some controls are required to prevent unauthorized users from accessing the data. These can include write-protected drives, so that modified files can only be saved under a different name, or laboratory network drives that are hidden from the general user community. Training is also a major element here: users must be instructed not to access data via the operating system.
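One of the technical controls mentioned above, write-protecting data files at the operating system level so they cannot be altered in place, can be sketched as follows; this is a simplified illustration, not how any particular CDS implements it:

```python
import os, stat, tempfile

def write_protect(path):
    """Clear the write bits so the file can be read but not altered in place."""
    os.chmod(path, stat.S_IREAD | stat.S_IRGRP | stat.S_IROTH)

def is_write_protected(path):
    """True when the owner write bit is cleared."""
    return not (os.stat(path).st_mode & stat.S_IWUSR)

# Create a stand-in for an acquired data file (name and contents illustrative).
with tempfile.NamedTemporaryFile(suffix=".dat", delete=False) as f:
    f.write(b"raw chromatogram data")
    path = f.name

protected_before = is_write_protected(path)  # False: freshly created, writable
write_protect(path)
protected_after = is_write_protected(path)   # True: in-place edits now blocked
os.chmod(path, stat.S_IREAD | stat.S_IWRITE) # restore so cleanup can proceed
os.remove(path)
print(protected_before, protected_after)
```

An edited copy of a protected file must then be saved under a different name, which preserves the original raw data and makes the modification visible.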
Finally, there is also a requirement for anti-virus software: Controls be implemented to prevent, detect and mitigate effects of computer viruses, worms, or other potentially harmful software code on study data and software (E).
These are basic controls that are essential in today's computing environment, unless you have a stand-alone system in the laboratory that is not connected to a network and the movement of files to and from it is checked for malicious software. However, it is important that when software of this nature is used it is adequately documented. Also, antivirus definition updates should be excluded from the scope of your change control SOP, as receiving them is the normal mode of operation of this software.
This column is intended as a summary of the security and access control requirements of a new FDA guidance on computerised systems in clinical investigations that has been interpreted for the analytical laboratory.
Bob McDowall, PhD, is principal at McDowall Consulting, Bromley, Kent, UK. He is also a member of the Editorial Advisory Board for LCGC Europe.
1. FDA Guidance for Industry, Computerised Systems Used in Clinical Investigations, May 2007.
2. Able Laboratories, 483 Observations, July 2005.
3. R.D. McDowall, LCGC Europe, 20(3), 144–149.
4. The Gold Sheet, August 2007.
5. ISO 17799:2005, Information Security Management, ISO, Geneva.
6. Concord Laboratories, FDA Warning Letter, July 2006.
7. Gaines Chemical Company, 483 Observations, December 1999.