The Current Landscape of Data Integrity

LCGC North America, 01-01-2020, Volume 38, Issue 1, Page 58

A look at how data integrity requirements and the focus of data integrity have changed over the past five years.

Ensuring data integrity is currently a significant concern in pharmaceutical laboratories, as regulators have increased their focus on this aspect of compliance. To get a sense of the current landscape, we recently spoke to Monica Cahilly, the president and founder of Green Mountain Quality Assurance, in a podcast interview. Cahilly has more than 25 years of consulting experience, with specialized interest and enthusiasm for data integrity assurance and data governance. She has worked with U.S. and international health authorities and with companies regulated by good manufacturing practice (GMP), good laboratory practice (GLP), and quality system regulations, as well as in university and research settings. An excerpt of our interview appears here. In this segment, we asked how data integrity requirements and the focus of data integrity have changed over the past five years.

The data integrity requirements themselves have actually not changed. They have been the same for as long as the regulations governing medical devices, finished pharmaceuticals, biopharmaceuticals, and all other commodities regulated by health authorities such as the U.S. FDA have been in place, because, fundamentally, the regulations require that data be able to be fully reconstructed and fully evaluated to provide objective evidence of the conduct of scientific work. So the requirements themselves haven’t changed. What has changed, in my experience, is our understanding of how to apply those requirements to the current choices that we’re making.

A common example is that many individuals understood that original records, when the records were paper, needed to be reviewed, and that if someone made a cross-out on a paper record, the reviewer would also need to look at that cross-out to evaluate whether it had an impact on the overall decision making with regard to the data set on that piece of paper. What was not well understood is that when computers were implemented in the GxP-regulated industry, as far back as the 1970s and 1980s, those computerized systems, even the less sophisticated ones, generated electronic data sets that are, in fact, the original records.

Equally not well understood is that those original electronic records are subject to the same requirements that have historically applied to paper: they need to be retained in a manner that is complete, and they need to be subject to review by a second person, because that second person is looking at what the person who generated the data did in the electronic system. The reviewer also needs to be able to look at the crossed-out text in the electronic data sets, which many computer system developers refer to as computer-generated audit trails, that helps reconstruct who did what, when, and why.

I think the majority of the current focus on data integrity has to do with what I refer to as “blind spots,” where we made a choice, but our control strategy was outdated and didn’t really help us govern the choice that we were making. So we’re trying to catch up and undo these blind spots, so that we can gain some of the efficiency and effectiveness of the choices that we’re making, such as the use of computerized systems, but also because, at the end of the day, the most important thing is, obviously, our patients on the receiving end of what we do. We want to be able to make better and better decisions, because those decisions affect our patients.

One of the most curious questions that I get with regard to data integrity is “Can you give me a checklist? Just give me a checklist, and if I fill it out, I will know everything I need to know about data integrity.” While that question, I believe, comes from a good place, it reflects a lack of understanding about the nature of data integrity and data integrity requirements, because to understand data integrity, I think you have to think in a holistic way. You have to think using quality risk management principles, you have to think using critical thinking skills, and you really have to say, “I need to take my existing quality systems, my processes, and my control strategy, and I have to update and modernize them to make these requirements integral to everything that I do that is currently governed by my existing quality system.”

So asking to be given a checklist for data integrity is equivalent to saying, “Just give me a checklist for GMP, and if I fill it all out, at the end of that process I’ll be GMP compliant.” Anyone who is an experienced professional in any of the “Good Practice” (GxP) regulated industries knows that you can’t just complete a checklist and, at the end of that process, understand what to do. Data integrity is a similar concept, because it has many moving parts, and it requires a holistic quality risk management approach.

This interview has been lightly edited for style and space.