The LCGC Blog: Forensics, Lawyers, and Method Validation—Surprising Knowledge Gaps

June 8, 2015
Kevin A. Schug

Kevin A. Schug is a Full Professor and Shimadzu Distinguished Professor of Analytical Chemistry in the Department of Chemistry & Biochemistry at The University of Texas (UT) at Arlington. He joined the faculty at UT Arlington in 2005 after completing a Ph.D. in Chemistry at Virginia Tech under the direction of Prof. Harold M. McNair and a post-doctoral fellowship at the University of Vienna under Prof. Wolfgang Lindner. Research in the Schug group spans fundamental and applied areas of separation science and mass spectrometry. Schug was named the LCGC Emerging Leader in Chromatography in 2009 and, most recently, the 2012 American Chemical Society Division of Analytical Chemistry Young Investigator in Separation Science awardee.


Recently, I served as an expert witness in a case involving the detection of a cocaine metabolite, benzoylecgonine, in a defendant’s urine using gas chromatography–mass spectrometry (GC–MS). This test was performed at a forensics laboratory following a reported positive result from a preceding immunoassay screen for drug metabolites. I will not relay any more details than this, because the problem in question dealt with an apparent lack of GC–MS method validation. For the analytical community, method validation in some form or another is a natural extension of best practice in the analytical laboratory. However, the notion of method validation, and many aspects of detailed forensics analysis, are not well understood by most lawyers and judges. I suppose that this might not be surprising to most, but it does present a serious knowledge gap that must be bridged in cases involving substance or alcohol abuse, if the associated case is to be properly litigated. In this particular case, I was called to testify on the basics of GC–MS, its complexities in analytical method development, and the necessity of method validation and verification. As part of my testimony, I was asked to write a very basic account of the importance of method validation. Below is the bulk of the text that I submitted for this purpose. I thought it might be interesting to relay it in the LCGC Blog forum to raise awareness that such a knowledge gap widely exists, and that it is vitally important for analytical scientists to be able to convey such concepts to the general public in fairly simple terms. At the end, I give a bit more detail about the problem associated with the case in question, which itself is fairly surprising.

Method validation, the comprehensive performance and documentation of measurements to verify a method is reliable and fit for purpose, is an essential component of any analytical chemical measurement. The failure to appropriately validate and document a method makes it impossible to prove the validity of the scientific test performed by that method. Such a result would be scientifically unacceptable.

There exist many degrees of rigor for method validation that depend on the value and use of an analytical measurement. On the one hand, simple “quick and dirty” feasibility measurements at the outset of a research project may only require that basic system suitability requirements are met, and that a result can be reproduced. These may only be recorded in one’s lab notebook and alone would not be appropriate for broad dissemination. On the other hand, unequivocal verification of drug purity for a marketed pharmaceutical, determination of the presence or absence of substances in a biological sample for the purpose of legal action, or dissemination of a new method in peer-reviewed literature require the highest degree of rigor in method validation to ensure the validity of the result. There are many aspects to the validation process, and there is a great deal of guidance available to ensure that proper steps are taken throughout the experimental design and performance of work (1–3).

The components of method validation depend on the desired output of the method. At the outset, a method validation plan, which sets the parameters to be measured and the desired performance metrics, should be devised and documented. The parameters may vary generally depending on whether the method is meant to be quantitative (to determine the amount of a chemical compound in a sample) or qualitative (to identify or verify the presence of specific chemical compounds in a sample). There are metrics associated with each manipulation of the sample from acquisition through preparation and chemical analysis.

When a new instrument is installed in the laboratory, it must first meet manufacturer specifications for proper operation. It must be validated to provide the appropriate information for its intended purpose. The proper operation of the instrumentation must also be verified on the day it will be used to measure real samples (for example, casework samples), as well as when regular maintenance is performed. All of these activities should be carefully documented and verified as correct by a laboratory manager. For some chemical analyses, the sample must first be treated in some way to remove interferences (for example, solid-phase extraction or liquid–liquid extraction are common sample preparation procedures) or to better prepare the analytes for chemical analysis (for example, derivatization of analytes to improve chromatographic separation or MS detection). Each step in a chemical analysis adds variance, or error, to the method. The sum of the variances of the individual steps equals the total variance of the method, and some steps are more prone to error than others. Therefore, each step must be carefully validated to understand and document its contribution to the total error of the method, so that the overall reliability of the end result can be judged.
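The additivity of variances described above can be illustrated with a short sketch. The three method steps and their standard deviations below are hypothetical numbers chosen only to show how independent error sources combine:

```python
import math

# Sketch: for independent steps, variance contributions combine additively
# to give the total method variance. Step names and relative standard
# deviations (as fractions) below are hypothetical.
step_sd = {
    "extraction": 0.05,       # sample preparation variability (assumed)
    "derivatization": 0.03,   # derivatization yield variability (assumed)
    "gc_ms_injection": 0.02,  # instrument/injection variability (assumed)
}

# Total variance is the sum of the individual step variances
total_variance = sum(sd ** 2 for sd in step_sd.values())
total_sd = math.sqrt(total_variance)

for step, sd in step_sd.items():
    share = sd ** 2 / total_variance
    print(f"{step}: SD = {sd:.3f}, contributes {share:.0%} of total variance")
print(f"Combined method SD: {total_sd:.4f}")
```

Note that the largest single step (here, extraction) dominates the total error, which is why validation effort is best focused on the most error-prone steps.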

Quantitative analysis requires the documented validation of multiple figures of merit (1,2). Required parameters include establishing the accuracy, precision, limit of detection, limit of quantification, and specificity of the method. A reliable calibration model (a correlation between analyte concentration and instrument response) must be established. The potential for interferences (from the sample matrix as well as from carryover) must be assessed. Throughout the analytical method, instances of sample loss must be evaluated. In some cases, where deemed necessary, stability of samples and robustness of the method should also be determined. The definition and precise procedure associated with each of the above parameters varies depending on the laboratory setting, the types of samples measured, the techniques used, and the intended use of the measured results. When a method is first established on a new instrument, all of these parameters should be determined and documented with the utmost rigor. As the method is used, the performance must be periodically verified (on a regular basis, in conjunction with instrumentation maintenance, and when real samples are analyzed) using appropriate subsets of the full validation procedures. All documentation should be curated, maintained, and made available when it is necessary to verify that these procedures have been properly implemented and that reliable performance has been proven.
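As a minimal sketch of what establishing a calibration model can look like, the code below fits a straight line to made-up calibration standards and estimates a detection limit as 3.3 times the residual standard deviation divided by the slope — one common convention from validation guidance. All of the concentrations, responses, and the choice of that convention are illustrative, not taken from the case discussed here:

```python
# Sketch: ordinary least-squares calibration (instrument response vs.
# analyte concentration) with an LOD estimate of 3.3 * SD / slope.
# The calibration data below are invented for illustration.

def fit_line(x, y):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration standards (ng/mL) and peak-area responses
conc = [10, 25, 50, 100, 200]
area = [1020, 2480, 5050, 9980, 20100]

slope, intercept = fit_line(conc, area)

# Standard deviation of the residuals (n - 2 degrees of freedom)
residual_sd = (sum((a - (slope * c + intercept)) ** 2
                   for c, a in zip(conc, area)) / (len(conc) - 2)) ** 0.5
lod = 3.3 * residual_sd / slope  # detection-limit estimate

print(f"slope = {slope:.2f}, intercept = {intercept:.1f}, LOD ~ {lod:.2f} ng/mL")
```

A validation plan would additionally specify acceptance criteria for the fit (for example, a minimum correlation coefficient and maximum back-calculated error at each level), which this sketch omits.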

In some cases, only qualitative verification of the presence of a substance in a chemical sample is desired (for example, by mass spectrometry). Such an analysis may be triggered if a decision point from a previous measurement (for example, by immunoassay) was positive and needs to be verified. Several of the validation parameters mentioned above should still be determined and documented to ensure a reliable qualitative analysis (2). This is most important when an analyte in a highly complex biological sample is to be scrutinized. Biological matrices contain many small and large molecules with variable chemical character that can interfere with different measurements. Thus, specificity must be ensured by assessing potential interferences from exogenous constituents that may be in the sample and that may respond similarly to the analyte of interest. Additionally, multiple samples of the biological fluid of interest from different sources should be measured to ensure that endogenous components, which might vary from sample to sample, do not interfere with the intended measurement.
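The idea of screening multiple blank-matrix sources for endogenous interference can be sketched as a simple acceptance check. The donor lots, responses, and the 20%-of-LOD-response cutoff below are all hypothetical, chosen only to show the style of check a validation plan might specify:

```python
# Sketch: specificity check across blank matrix lots. Responses measured
# at the analyte's retention time in blank urine from several donors must
# each fall below a threshold. All numbers are invented for illustration.

LOD_RESPONSE = 500  # hypothetical instrument response at the limit of detection

blank_lots = {  # hypothetical blank-urine responses by donor lot
    "lot_A": 12, "lot_B": 45, "lot_C": 8, "lot_D": 60, "lot_E": 22,
}

threshold = 0.20 * LOD_RESPONSE
interfering = [lot for lot, r in blank_lots.items() if r >= threshold]

if interfering:
    print("Specificity concern in lots:", ", ".join(interfering))
else:
    print("No significant endogenous interference in", len(blank_lots), "lots")
```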


Many of today’s instruments are exceedingly sensitive. It is important to assess the limit of detection of an instrument for a given analyte, even as part of a qualitative analysis, as well as to determine carryover. Many instruments can measure into the parts-per-billion concentration range. If any residue of the analyte remains in the instrument or in a sample introduction device, then a false positive reading could result. It is important to verify that the level of the measured analyte is consistent with the decision point that prompted the qualitative analysis in question. Ideally, the qualitative confirmation by the complementary technique would also involve a quantitative analysis. This is often not mandated, but it would improve the reliability of the determination. That said, to reiterate, at a minimum, the qualitative analysis requires validation of specificity (interference and carryover determination) and the limit of detection of the method. Some guidance documents also suggest determination of precision in these cases (3).
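A carryover assessment of the kind described above is often reduced to a simple pass/fail rule: inject a blank immediately after the highest standard and require the blank's response to stay below some fraction of the response at the detection limit. The function, readings, and 10% cutoff below are a hypothetical sketch of such a rule, not a prescribed procedure:

```python
# Sketch: carryover acceptance check. A blank injected after a high
# standard must respond below a fraction of the LOD response.
# Threshold and readings are hypothetical.

def carryover_ok(blank_response, lod_response, max_fraction=0.10):
    """Return True if the post-high-standard blank passes the check."""
    return blank_response <= max_fraction * lod_response

# Hypothetical readings (arbitrary instrument units)
lod_response = 500     # response at the method's limit of detection
blank_after_high = 30  # blank injected right after the highest standard

result = "PASS" if carryover_ok(blank_after_high, lod_response) else "FAIL"
print("carryover check:", result)
```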

Many of the specifics associated with the recommended actions for validation can be significantly detailed in terms of procedures and expected performance levels when a particular analytical method is in question. It is important to remember that all steps of sample handling and analyte determination require validation to ensure reproducible results. Similarly, validations and verifications are best carried out by the analyst who will also handle the real samples, and the documents generated should be approved by a laboratory manager. Overall, clear documentation is critical for all aspects of validation, verification, maintenance, methodological changes, and other factors that might influence a final reported result. Such information provides the confidence necessary to make decisions and draw conclusions about situations that are important for safety and well-being.

For the case in question, the qualitative determination of benzoylecgonine in urine, the forensic laboratory had never performed a method validation. In 2007, when new GC–MS instruments were installed, they had simply made one injection of a benzoylecgonine standard to visually confirm that an appropriate electron ionization spectrum was recorded. There were no attempts made to assess specificity in urine, limit of detection, carryover, or precision throughout an approximately 7–8-year period. While my task was to explain to the prosecutors and judge why this was a problem, and to bridge the knowledge gap, I was most surprised that the forensic laboratory had not performed due diligence. Guidance on such method validations in this case was to come from ISO 17025, which clearly conveys recommended procedures for validation in this and other determinations. According to the information I was given, not only did the laboratory fail to follow these recommendations, but the associated laboratory accrediting body charged with verifying compliance and good laboratory practice also failed to notice this inconsistency.

I fear that this type of situation might be more widespread than we know. I do believe that many forensics laboratories operate at the highest standard of performance and validation. Yet, there are many forensics laboratories, and the compliance and competence surely vary among them. If you couple the knowledge gap in communicating science to some lawyers and judges with a potential lack of quality lab results in some cases, it is a little scary to think of the number of decisions that may have been incorrectly rendered. Perhaps this is another justification that all citizens should be scientifically literate: an extra set of eyes to locate these potential inconsistencies would seem to be a good idea, especially if someone’s money and livelihood are at stake.
I think most people would agree that the system needs to be meticulously authenticated so that those who are guilty are found guilty, and those who are not, are not.

References

(1) U.S. Department of Health and Human Services, Food and Drug Administration, Center for Drug Evaluation and Research, Guidance for Industry: Bioanalytical Method Validation, Fed. Reg. 64, 1–20 (2001).

(2) Scientific Working Group for Forensic Toxicology (SWGTOX) Standard Practices for Method Validation in Forensic Toxicology, SWGTOX Doc 003 (Rev. 1), 2013, pp. 1–52.

(3) United Nations Office on Drugs and Crime, Guidance for the Validation of Analytical Methodology and Calibration of Equipment Used for Testing of Illicit Drugs in Seized Materials and Biological Specimens, New York, 2009, pp. 1–67.

