A Whole Lot of Data Going On: Book Review of Data Integrity and Data Governance by R.D. McDowall

The Column, 07-09-2019
Volume 15
Issue 7
Pages: 22–25

R.D. McDowall has written an excellent book on data integrity and data governance. Those who need to understand this topic should read the book and follow his advice. He includes both what the regulations and regulators say and what we need to do to be in compliance. His approach is scalable, with enhancements for larger organizations and shortcuts for smaller ones, and there are numerous examples throughout the book.

 


Twenty years ago, the big buzzword in the pharmaceutical industry was 21 CFR Part 11, the US Food and Drug Administration’s (FDA) new standard for electronic records and electronic signatures. Discussions raged about how to interpret the standard and how strict it should be. 21 CFR Part 11 actually contained only 2.5 pages of requirements, and those were probably among the most expensive 2.5 pages ever written, at least for the pharma industry. Maybe until now.

The newer buzzwords, “data integrity” and “data governance”, have been floating around for a while. Every agency worth its salt (FDA, WHO, MHRA, PIC/S, GAMP, and several others) has published requirements of its own on the subject. There are also the good manufacturing practice (GMP), good laboratory practice (GLP), and good clinical practice (GCP) regulations, in their many variations around the world, that have to be followed for production and analytical laboratory work, nonclinical testing, and clinical testing. Most of them include something about data integrity, perhaps not always using that terminology, but with text that can be interpreted as meaning “data integrity”. So how on earth do we navigate this enormous mass of requirements?

The answer lies in R.D. McDowall’s newest book, Data Integrity and Data Governance: Practical Implementation in Regulated Laboratories (Royal Society of Chemistry, 2019). This is a massive tome of 600 pages and may seem a tough one to digest, with content that may not appear all that interesting. Still, managers in regulated industry need to read all 600 pages in order to set the standard for how every single person in their organization needs to work, themselves included.

The book is not just another boring textbook; it has a lot of fun comments scattered through the text. One example is “Human inventiveness knows no bounds when it comes to data falsification”; another is “the poor reader must hack through a jungle of words to figure out what is needed, this being the analytical equivalent of an Indiana Jones archaeological expedition”, which refers to the data integrity guidance documents from the authorities! McDowall has done the excavation of this giant mass of paper for us.

 

The book explains in detail all aspects of data integrity, opening each chapter with its regulatory basis. McDowall has made things a lot easier for everyone by publishing this book. The amount of investigation he has carried out to establish a system within the overwhelming craziness of regulatory documents is astonishing. Every manager, and everyone else who has to make sense of all this, should be thankful that he has cleared an understandable path through this jungle of words and requirements.

While 21 CFR Part 11 was a job to be done (making sure that all the IT systems were OK), data integrity is different. It is a mindset and has to be worked on continuously by everyone in the organization without exception. Or, as the author writes, it is a programme, not a job to be done.

From my own experience, if management is not involved, things will not get done. If management does not require work to be done in a specific way, there will always be people who could not care less about following procedures, especially if it is easier or quicker not to. All the things that need to be in place for good data integrity will never be in place unless management itself takes an active part and interest in the work and sets requirements for everyone in the organization to follow. This is actually no different from any other quality work, but data integrity work is even more pervasive throughout the organization.

Poor management; they have to deal with a lot. Quality assurance and quality control require management involvement, reviews, and assessments. Throughput of the work in production or the laboratory requires management involvement, risk analyses, assessments, and maybe changes to the processes. Then there are all the business aspects. Is there no end to what management needs to be involved in? Maybe there is, but data integrity is definitely not one of the things they can skip.

Before the book even starts, there is a comprehensive glossary covering words, abbreviations, and data integrity terms. Chapter 1 explains how to use the book, useful reading before diving into the rest of it.

Every chapter starts with an overview and includes citations of the relevant parts of the various regulations. This is followed by a short summary from the author before the detailed explanations of the topic and how to comply. At the end of each chapter there is a comprehensive reference list, and throughout the text there are references to other chapters, other texts, and other books. If you want to read more, these references make it easy.

The first few chapters cover the background and include citations from several FDA warning letters to organizations that did not have their data integrity in place. These warning letters should be read because they show the current views that the FDA holds. Data integrity problems range from a basic lack of data integrity and documentation to outright fraud. Several of these warning letters formed the background against which the regulatory agencies responded with more detailed requirements.

 

McDowall has created a very useful data integrity model in chapter 5 that puts the various parts of the topic in their right place. The foundation requires the right culture and ethos for data integrity, which in turn needs management leadership, policies, procedures, and staff training; hence the management involvement. For process development and production, the levels above the foundation are: the right equipment for the right job (including validation and qualification); the right manufacturing process for the right job (including validation and ongoing control); and batch manufacturing of the right product (with data supporting the quality of the product and records meeting the marketing authorization). For analytical development and quality control, the corresponding levels are: the right instruments and systems for the right job (including qualification and validation); the right analytical procedure for the right job (validated and verified under the right conditions); and the right analysis for the right reportable result (complete, accurate, and consistent). If the foundation is not in place, the levels above it will not work, and if level 2 is not in place, level 3 will be compromised. With this model, McDowall has produced a system out of the various requirements.
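
To make the layering concrete, here is a minimal sketch of how the analytical stream of this model could be expressed as a data structure. The level names paraphrase the review’s summary rather than the book’s exact wording, and the Level class and first_gap helper are my own illustration, not anything from the book.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Level:
    """One layer of the model; each layer rests on the ones below it."""
    name: str
    requirements: tuple[str, ...]

# Paraphrase of the analytical stream described above (illustrative only).
MODEL = (
    Level("Foundation: culture and ethos for data integrity",
          ("management leadership", "policies", "procedures", "staff training")),
    Level("Right instruments and systems for the right job",
          ("qualification", "validation")),
    Level("Right analytical procedure for the right job",
          ("validated and verified under the right conditions",)),
    Level("Right analysis for the right reportable result",
          ("complete", "accurate", "consistent")),
)

def first_gap(in_place: dict[str, bool]) -> str | None:
    """Return the lowest level not in place; every level above it is suspect."""
    for level in MODEL:
        if not in_place.get(level.name, False):
            return level.name
    return None
```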

Other chapters go into detail on each subtopic. Topics include roles and responsibilities; policies, procedures, and training; and how to keep an open culture for data integrity. An open culture means that it should be possible to blow the whistle or admit to errors without fear of retaliation. In some cultures this is a very different mindset and may be almost impossible to adopt. For others it may be difficult if people’s performance is measured by throughput and by doing their work correctly. Within a good “quality culture”, finding errors is positive, because it means improvements can be made so that the same error is not made again.

McDowall then moves on to the analytical data life cycle and the assessment and remediation of laboratory processes and systems. There is a lot of information about paper records, electronic records, and hybrid systems, which combine both. He strongly advocates getting rid of paper altogether, which is not news to anyone who has worked with 21 CFR Part 11 over the past 20 years. He also covers data integrity of analytical instruments and their qualification, as well as computerized system validation. How to validate analytical procedures is covered in detail, and this validation needs to be in place before anyone starts performing them. Some regulations call for second person review, and when and how to do this is also discussed.

A difficult aspect of any quality work is quality metrics, and the same goes for metrics for data integrity. He explains how to approach these, and also how to raise data integrity concerns. Data integrity audits are much like any other quality audit, but include different checkpoints. How to conduct a data integrity investigation is covered, as are corrective actions and preventive actions (CAPA). Outsourcing is common in the pharmaceutical industry, and checking an outsourced company’s work naturally includes a check on its data integrity; there is also a list of items to include in the contract between the company and the outsourced company. The final chapter is a data integrity treatise that lists all the items that need to be checked during an audit. It is not a checklist, and the author advises auditors not to use one during an audit because it may limit their investigation.

Each of the chapters has many examples to show the practical implications of the requirements. Well-known analytical methods are discussed in detail. One example is observational methods (looking at the sample to assess colour, appearance, and smell): should these be documented on paper or electronically? Second person review here includes checking that what has been documented on paper has been entered correctly into the laboratory information management system (LIMS); or should there be some other verification of the result? Another example is chromatographic methods: what are the raw data? A stand-alone chromatography data system (CDS) needs to be handled and controlled one way, while a networked CDS that is also connected to a LIMS should be handled differently. And what about the simple process of weighing a sample? How do we document this in order to have data integrity?
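
As one illustration of what a second person review of a manual LIMS entry might check, here is a minimal sketch under my own assumptions: the ObservationRecord fields and second_person_review function are invented for this example and do not come from the book or any particular LIMS API.

```python
from dataclasses import dataclass

@dataclass
class ObservationRecord:
    """A manually observed result: written on paper, then typed into the LIMS."""
    sample_id: str
    paper_value: str               # what the analyst wrote on the worksheet
    lims_value: str                # what was typed into the LIMS
    entered_by: str
    reviewed_by: str | None = None

def second_person_review(record: ObservationRecord, reviewer: str) -> bool:
    """Confirm the LIMS entry matches the paper record; the reviewer must not
    be the person who made the entry."""
    if reviewer == record.entered_by:
        raise ValueError("second person review requires a different person")
    if record.paper_value.strip() != record.lims_value.strip():
        return False  # discrepancy: investigate and document, never silently fix
    record.reviewed_by = reviewer
    return True
```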

 

The audit trail is important in any computerized system because it shows who has done what and when, and why anything has been changed. The automatic audit trail includes both first entries and subsequent changes. The requirements are that the audit trail is linked to the data and that it can be reviewed. Most instrument systems, as well as LIMS, have had this functionality built in for decades, but we need to prove that it cannot be turned off, and has not been turned off at any time, to prevent fraud.
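
A minimal sketch of what such an append-only audit trail might look like in code; the class and field names are illustrative assumptions, not a real LIMS interface. The key point is that the trail deliberately has no method to edit, delete, or switch anything off.

```python
import datetime
from dataclasses import dataclass

@dataclass(frozen=True)
class AuditEntry:
    """One immutable record: who did what, when, to which data, and why."""
    timestamp: datetime.datetime
    user: str
    action: str             # e.g. "create", "modify", "delete"
    record_id: str          # links the entry to the data it describes
    old_value: str | None   # None for a first entry
    new_value: str | None
    reason: str             # why the change was made

class AuditTrail:
    """Append-only by construction: entries can be added and reviewed,
    never edited, removed, or disabled."""

    def __init__(self) -> None:
        self._entries: list[AuditEntry] = []

    def log(self, user: str, action: str, record_id: str,
            old_value: str | None, new_value: str | None, reason: str) -> None:
        self._entries.append(AuditEntry(
            datetime.datetime.now(datetime.timezone.utc),
            user, action, record_id, old_value, new_value, reason))

    def review(self) -> list[AuditEntry]:
        # Hand back a copy so a reviewer cannot alter the trail itself.
        return list(self._entries)
```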

Computerized systems also need named user accounts, with no shared group accounts such as “analyst”, and each user should have access to what he or she needs and no more. What do you do when a user is not only a user, but also an application manager with many privileges? How can changing the time on the computer, turning off the audit trail, or deleting records be prevented? We need to prove that nobody has back-door access to the system to change times, audit trails, or other data. The book discusses this in detail.
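
A minimal sketch of a least-privilege check along these lines; the role names, permission strings, and is_allowed function are assumptions made for illustration, not drawn from the book.

```python
# Named individual accounts, each mapped to the least set of privileges needed.
# "analyst" here is a role attached to a named person, not a shared login.
ROLE_PERMISSIONS: dict[str, set[str]] = {
    "analyst":       {"create_result", "view_result"},
    "reviewer":      {"view_result", "approve_result"},
    "administrator": {"manage_accounts", "configure_system"},
}

# Actions nobody should hold in a compliant system, regardless of role.
FORBIDDEN = {"disable_audit_trail", "change_system_clock", "delete_raw_data"}

def is_allowed(user_roles: set[str], action: str) -> bool:
    """Allow an action only if some role grants it and it is never forbidden."""
    if action in FORBIDDEN:
        return False
    return any(action in ROLE_PERMISSIONS.get(role, set()) for role in user_roles)

# Example: an analyst who is also an application manager still cannot
# turn off the audit trail or reset the system clock.
assert not is_allowed({"analyst", "administrator"}, "disable_audit_trail")
assert is_allowed({"analyst"}, "create_result")
```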

The many examples in the book describe familiar events that happen in the laboratory, with well-known analytical methods and instruments, and help the reader understand the author’s points. The analyst or manager will then have pegs on which to hang the sometimes vague requirements. Today’s data integrity requirements are far more detailed than the first requirements for computer validation in the late 1980s (“One must make sure that the system is validated”, without any explanation of how to get there), or the old EU GCP (“Computer systems must be validated and error free”). Oh, well...

The author emphasizes that all his lists and control points do not mean that data integrity is “one size fits all”. Every organization should create a risk-based approach, and the key is first to understand your own processes and go from there. The lists give a good overview after reading the text, but the reader is encouraged to use their own head and make a proper assessment based on their organization, their processes, and the tools they have available. The lists then come in handy because you can pick and choose from them to do a good (or perfect?) job in your own organization. Personally, I think the lists get me thinking in the right direction, and points can be added or deleted once I understand what my organization needs. In this respect the content of the whole book is scalable, but you first need to understand what data integrity is really all about. That is no different from any “quality thinking”.

 

When you have read the whole book (or at least most of it), a chapter can easily be re-read on its own to make sense of whatever specific theme it covers, as each chapter contains all the regulatory background, the explanations, and the how-to. Such free-standing chapters mean that there is repetition in the text, but they also mean that each chapter can be read by itself. In my opinion this is a good approach, even if it results in more pages than strictly necessary.

What I really like about the book are the figures, flowcharts, and tables, which summarize the text very well. But the publisher has made things hard for readers, as some of the figures are set in fonts that are difficult to read or incredibly small. Some figures have uneven fonts, which results in divided words, as in figure 2.2, which reads “No s eparation of sys tems” and “res ponsible”. Figure 16.1, where the capital letters are 1 mm high, was given half a page instead of the full page that would have been far more readable. Some of the tables are very long and would benefit from a little space or a horizontal line between their parts: column one has a key word and column two has several points belonging to it, and when reading column two you frequently run straight on into a new key word because there is nothing to mark where one entry ends.

A few sentences could definitely have been made shorter by adding a comma or splitting them up. The trouble with the English language is that many nouns and verbs look the same, and you have to work out from the context which one you are reading. When sentences run to five lines without anything to break them up, they may need to be read a couple of times before you understand them. There are a few examples of this, but it is not a general problem.

The book is mainly focused on the laboratory and the pharmaceutical industry, but every industry and any type of work (for example, production, testing laboratories, healthcare, forensics) would benefit from reading it if data integrity matters, and data integrity is something everyone should worry about. It only takes a little translation to change the word “lab” to whatever your work is about, and the details in the text can easily be translated to your line of work to get you thinking in the right direction. The pharmaceutical standards cited are in principle not very different from other quality standards, so it would definitely help to see what they say even if you work with International Organization for Standardization (ISO), ASTM, healthcare, or any other standards. Data integrity is everything: if you don’t have data integrity, you really cannot trust your data.

Siri Segalstad has worked with LIMS and all aspects of IT systems validation for over 30 years, first as an employee of a pharmaceutical company, and then with her own consulting company since 1995. She took part in an EU Leonardo da Vinci project to create a complete master’s degree in IT validation. As a result, she wrote the book International IT Regulations and Compliance, which she used when teaching at the Hogeschool Gent in Belgium. In addition, she has written a large number of publications and given classes and presentations from Taiwan in the east to California in the west. She is based in Norway.

E-mail: siri@segalstad.com
Website: www.limsconsultant.com
