Laboratory Accreditation Is Not a Cloak of Infallibility

LCGC North America, April 2022, Volume 40, Issue 4
Pages: 190

Those working in a technical field are always trying to innovate and increase operational efficiency to extract the most value from any given project. This requires making data-driven decisions to identify patterns and to forecast the potential implications of specific strategic moves. As with anything, these decisions are only as good as the information they are predicated upon, and the data are only as good as the people and processes that generate them.

If your decision making relies on analytical chemistry, then you want to be confident that the measurements are an accurate representation of the matrix that is being analyzed, and that they are of “publication” quality. This is a major feather in one’s cap, especially when litigation is involved, because the state and federal court systems regard peer-reviewed data as the gold standard.

But how can you know for sure if the analytical laboratory that you’ve selected is producing reliable data? Often, we are quick to assume that analytical data are sound and accurate based on an accreditation or certification held by the laboratory in question. These sorts of credentials are meant to indicate reliable data production, because a series of compliance trainings, proficiency tests, and validations is generally required to achieve such qualifications. Even so, they are not the broad-scale cloak of infallibility that many might think they are. In other words, just because a given laboratory carries a certain accreditation does not mean that the data it generates can be trusted without performing a deeper dive into its quality assurance and quality control (QA/QC) protocols.

From our experience in the energy, environmental, agricultural, and forensic sectors, one should not take accreditations at face value. A bounty of questions should be asked about control and surrogate recovery measurements, calibrations, validations, and instrumental upkeep. Fortunately, the majority of this information is made available in a good laboratory’s QA/QC reports.

Below are some suggestions regarding what to look for in QA/QC reports, so that you can secure confidence in the laboratory’s work and the data that it produces.

Control and Surrogate Measurements

The use of surrogate, “spiked,” or control measurements is a standard practice whereby laboratory operators use samples of known concentration to assess the accuracy and precision of their instruments’ response.
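The arithmetic behind a spiked-sample check is simple: the laboratory adds a known amount of analyte to a sample and reports what fraction of that addition the method measures back. A minimal sketch of a percent-recovery calculation, using entirely illustrative numbers (the function name and values are our own, not from any specific method):

```python
def percent_recovery(measured: float, native: float, spiked: float) -> float:
    """Recovery of a known spiked addition: what fraction of the amount
    added to the sample the method actually reports back, as a percent."""
    if spiked <= 0:
        raise ValueError("spiked amount must be positive")
    return 100.0 * (measured - native) / spiked

# Hypothetical check: 5.0 units of analyte spiked into a sample that
# natively contained 2.0 units; the instrument reports 6.8 units total.
recovery = percent_recovery(measured=6.8, native=2.0, spiked=5.0)
print(f"{recovery:.1f}%")  # 96.0%
```

Acceptance windows for recovery (often something like 70–130%, but this varies by method and industry) are what a QA/QC report should state explicitly.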

Calibrations

Accurate instrumental calibration may single-handedly be the most important aspect of reliable data generation in analytical chemistry. Proper calibration allows the operator of a given instrument to interpolate the concentration of a particular constituent based on the instrumental response to samples of known concentrations.
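The interpolation described above usually rests on a least-squares line fit to standards of known concentration, which is then inverted to convert an unknown sample’s instrument response into a concentration. A minimal sketch with made-up standards (the data and function names are illustrative, not from any published calibration):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for y = slope*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Calibration standards: known concentrations (e.g., mg/L) and the
# instrument responses they produced (illustrative values).
conc = [0.0, 1.0, 2.0, 5.0, 10.0]
resp = [0.02, 1.05, 1.98, 5.10, 9.95]
slope, intercept = fit_line(conc, resp)

def to_concentration(response):
    """Invert the calibration curve: instrument response -> concentration."""
    return (response - intercept) / slope

print(round(to_concentration(4.0), 2))  # ~3.98 mg/L for this toy curve
```

A QA/QC report should show the calibration range, the fit statistics (such as r²), and evidence that reported results fall within the calibrated range rather than being extrapolated beyond it.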

Validations

Laboratories performing quantitative and qualitative analyses need to follow certain protocols for method validation. These are generally industry-specific but are similar in the types of the measures they prescribe to demonstrate that a method is fit for use and can be expected to provide reliable results.
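One measure that nearly all validation protocols prescribe is precision, commonly reported as percent relative standard deviation (%RSD) across replicate analyses of the same sample. A minimal sketch, with illustrative replicate values of our own invention:

```python
import statistics

def percent_rsd(values):
    """Percent relative standard deviation: a standard precision metric
    reported during method validation (sample standard deviation / mean)."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Seven hypothetical replicate analyses of one QC sample.
replicates = [10.1, 9.9, 10.2, 10.0, 9.8, 10.1, 10.0]
print(f"%RSD = {percent_rsd(replicates):.2f}")
```

What counts as acceptable %RSD is method- and industry-specific; the point is that a validation package should state the criterion and show the replicate data that meet it.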

Instrumental Upkeep

Analytical instruments need routine maintenance and often require major repairs after a significant amount of use. Laboratories should keep regular maintenance logs to document instrumental upkeep. Additionally, major alterations, such as the installation of new columns, require re-validation of the method according to guidelines. There should be documented re-validation of the instrument performance following major maintenance efforts.

Conclusion

Collectively, reviewing these QA/QC-related items can reveal a considerable amount about the veracity of the data being generated by a given laboratory. The technical fidelity of a laboratory’s QA/QC report is far more insightful than a simple recitation of standardized accreditations and certifications that may be held, which as we’ve mentioned, should be taken with a grain of salt. All of this feeds back into the simple truth that the only thing worse than no data is bad data. Given our evolving world where correlations, projections, and ultimately, data-driven decision making are the standard, it is paramount that the information that we are reliant upon is generated from a trustworthy source that exercises proper QA/QC procedures.

Kevin A. Schug is a Full Professor and the Shimadzu Distinguished Professor of Analytical Chemistry in the Department of Chemistry & Biochemistry at The University of Texas at Arlington. Zacariah L. Hildenbrand is a Research Professor at the University of Texas at El Paso. Direct correspondence to: kschug@uta.edu