Q. You recently published a paper detailing a three-step workflow for a retention index system for use in comprehensive two-dimensional gas chromatography (GC×GC) analysis of ignitable liquid residues (ILR) in arson investigations (1). What is novel about this approach, and what benefits does it offer over other methods?
Nadin Boegelsack: Alignment is a key step, and a common challenge, in comprehensive two-dimensional gas chromatography–mass spectrometry (MS) data management. When we started method development for ILR analysis, there was no standard approach to aligning chromatograms in both dimensions, or to calculating resolution in the second dimension. The use of Kovats indices in the first dimension is relatively common, as it allows overlays between GC and GC×GC chromatograms. The challenge arises with the second dimension, as existing methods require thermodynamic models or isovolatility curves, which are very involved approaches. An additional challenge is defining the starting measurement point (zero-value) in the second dimension. This is often achieved by relating each compound to its bracketing n-alkanes, which requires several calculation steps.

We wanted to simplify this time-consuming approach for an easier transition to routine analysis by making it more user-friendly and condensing the whole system into three steps: defining zero across the whole chromatogram, obtaining the first-dimension retention index (RI) with a single calculation, and obtaining the second-dimension RI with a single calculation. We therefore developed this RI system to address alignment in an inexpensive, quick, and user-friendly fashion by combining two well-known RI systems from GC. The combination of Kovats and Lee indices correlates seamlessly with the ASTM ILR classification scheme, which bases light, medium, and heavy product classification on n-alkane range (Kovats indices) and compound grouping on polarity and aromatic ring number (Lee indices). Both RI systems have readily available standard mixtures due to their popularity in GC analysis, are inexpensive, and store easily.
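The two single-calculation steps can be sketched as a linear interpolation between bracketing reference compounds (100 × carbon number for Kovats n-alkanes, 100 × ring number for Lee polycyclic aromatic references). This is an illustrative sketch only, with invented retention times and a hypothetical helper function, not code from the published method:

```python
def linear_ri(t_x, t_ref_lo, t_ref_hi, idx_lo, idx_hi):
    """Linear (temperature-programmed) retention index: interpolate the
    analyte's retention time between the two bracketing references."""
    return idx_lo + (idx_hi - idx_lo) * (t_x - t_ref_lo) / (t_ref_hi - t_ref_lo)

# Kovats: n-alkane references, index = 100 * carbon number.
# Hypothetical analyte eluting between C10 (8.2 min) and C11 (9.6 min):
kovats = linear_ri(8.9, 8.2, 9.6, 1000, 1100)   # halfway -> 1050

# Lee: PAH references, index = 100 * ring number
# (naphthalene 200, phenanthrene 300, chrysene 400, picene 500).
# Hypothetical analyte eluting between naphthalene and phenanthrene:
lee = linear_ri(14.5, 12.0, 18.0, 200, 300)
```

The same interpolation is applied once per dimension, which is what keeps the workflow to a single calculation per RI.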
Q. What challenges did you encounter when developing this method and how did you overcome them?
Gwen O’Sullivan: GC×GC is more widespread in academia than in industry, so there is very little standardization in how to approach data handling across a wide range of applications. As a result, we needed to evaluate the workflow for ease of use, the RI system for its flexibility and performance, and then ensure it was applicable to ILR analysis. Making the RI system user-friendly was achieved by building on existing single-dimension GC RI systems. The standards were readily available at low cost, and the systems themselves had been validated over decades of GC applications. Evaluating flexibility and performance was more challenging, as published results varied with GC system setup and no acceptable error margins had been published, especially for the second dimension. We established criteria, based on theoretical expectations and the requirements of general application, against which to evaluate performance, and assessed flexibility based on the potential area covered in the chromatogram, as well as on options to expand the range of compounds covered in terms of boiling point and polarity. The application to ILR analysis was demonstrated through the development of the ILR classification map in relation to ASTM E1618 (2) and its practical application to a variety of neat ignitable liquids, as well as a fire scene sample.
Q. What challenges exist when analyzing wildfire samples?
GO: Wildfires present exceptionally difficult scenes to investigate. ILRs are typically present at low concentrations and are dispersed; natural compounds are present at very high concentrations and can be structurally similar to ILR compounds. The remoteness of the locations and efforts to extinguish these fires can result in the destruction of evidence. Limitations in the sensitivity of analytical equipment lead to false negative reports, frequently contradicting canine detection on scene. Advances in analytical power (for example, using GC×GC–MS) allow laboratory analysis to keep pace with canine detection rates and provide fewer ambiguities before court. Critical issues in the field of wildfire investigation are: the presence of ILR at low levels; complex matrices; competitive adsorption; interferences and pyrolysis background; subjective determination and judgment; and the need to determine error rates for court cases. With the prevalence of wildfires increasing globally, it is even more pressing to address these knowledge gaps within the field of wildfire forensics.
Q. What can chromatography reveal about how wildfires impact the environment?
GO: In the last decade, large-scale wildfires have become increasingly prevalent across North America, Europe, and Australia. Although wildfires can be caused by natural phenomena, such as lightning strikes, the majority are caused by human activity. The cost of human-induced fires has risen significantly as the frequency, severity, and associated damages grow. Climate change causes earlier snow melts and overall drier conditions, exacerbating fire risk. Investigations into the origin and cause of wildfires are important for understanding the direct role of humans, the influence of climate change, and the development of necessary and appropriate forest management strategies. Chromatography contributes significantly to our understanding of wildfire initiation and propagation, enhances investigative processes, increases prosecutions and allocation of liability, and contributes information that informs policy and wildfire management.
Q. You have also recently developed a method for optimizing the analysis of ILRs using flow-modulated comprehensive GC×GC (3). Please could you talk a little about this research.
NB: We wanted to utilize the selectivity and sensitivity offered by GC×GC–MS to address the particular challenges associated with wildfire analysis: separation of target compounds from matrix interferences, increasing sensitivity to avoid false negatives, and offering a seamless transition to the currently used standard (ASTM E1618).
When looking at existing method development papers, we noted that few went into detail on hardware optimization, that is, column selection and modulator setup, beyond phase chemistry and modulation period. Flow-modulated GC×GC is often considered complicated, mostly because a suitable flow equilibrium must be established specifically for each method. In this paper, we provided a comprehensive overview of the most important factors to consider and suggested a template workflow for achieving the desired results, with efficient decision-making illustrated by our approach to an ILR method.
For column selection, this covers phase chemistry and its effect on selectivity, column dimensions and column order with their effect on the flow equilibrium, and orthogonality and film thickness with their effect on selectivity and effective chromatography area usage. We observed effects on total percentage of chromatogram area covered (with consideration of wraparound peaks, since they can lead to more demanding manual data processing), as well as separation and resolution in each dimension, and total peak capacity for the system with the potential number of distinguishable peaks.
In the modulator setup section, we first looked at establishing a suitable flow ratio between the columns to ensure good peak shapes across the chromatogram for consistent integration. This flow ratio was also applied during the column selection process to keep results comparable. Second, we investigated carbon loading and the dilution effect to ensure adequate sensitivity of the method. The flows across the 7-port valve connecting the columns depend on the flow ratio and the resulting individual column flows. They need to be balanced with the bleed line to allow consistent loop filling, which in turn governs carbon loading and peak shapes in the second dimension. Finally, the flow equilibrium at the 3-port valve determines the detector split. To achieve the optimum split ratio, transfer line dimensions can easily be optimized with a modelled approach and the manufacturer-provided flow calculator.
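As a rough illustration of the flow bookkeeping involved, the sketch below works through the ratio, loop-fill, and split arithmetic. All flow and volume values are hypothetical placeholders, and the variable names are ours, not the published method's:

```python
# Hypothetical flows (mL/min) for a flow-modulated GC×GC setup.
f1 = 0.5           # first-dimension column flow into the modulator
f2 = 20.0          # second-dimension column flow out of the modulator
flow_ratio = f2 / f1          # equilibrium the modulator must sustain

# Loop fill time: sample loop volume divided by the incoming flow.
loop_volume_uL = 25.0
fill_time_s = (loop_volume_uL / 1000.0) / f1 * 60.0   # mL / (mL/min) -> s

# Detector split at the 3-port valve: the fraction of effluent reaching
# each detector is set by the restriction of its transfer line.
to_ms, to_fid = 4.0, 16.0     # hypothetical transfer-line flows (mL/min)
split_to_ms = to_ms / (to_ms + to_fid)
```

Keeping the fill time shorter than the modulation period, and the split consistent across runs, is what makes second-dimension peak shapes and carbon loading reproducible.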
Hardware optimization is not usually covered in method development approaches, despite its significance. Changes to columns and lines require system downtime, which may be a contributing factor. Regardless, following very general GC method recommendations for method parameters (for example, the oven ramp) after the hardware optimization meant we met all target separations in the verification, which consisted of a certified standard mixture and simulated wildfire debris.
Since real wildfire matrices are more varied and complex, parameter optimization was still required. We used design of experiments (DoE) to evaluate the three most significant parameters against the same performance criteria as the column selection process. The resulting optimized method successfully addressed the associated challenges (separation from interferences, increased method sensitivity, seamless transition to the current ASTM standard) on real wildfire debris samples. Without our previous work on retention indices, evaluating resolution across different column combinations or parameter settings would have been incredibly challenging. The indices also allowed us to create an ILR classification map relating to the ASTM classification scheme, which provides a very user-friendly visual comparison of different samples and has been applied to various wildfire debris samples in our research.
Q. What projects are you working on next?
GO: The biggest area for growth in the application of GC×GC analysis to fire debris lies in interrogating the large volumes of data generated by each analysis. The compositional information obtained in each analysis is unequalled. However, the size and complexity of the data make analysis a challenge. Work in this area has begun, including the development of new software and workflows for non-targeted screening and targeted analysis. Techniques such as chemometrics assess both chromatographic and spectral data (for example, total ion counts, total ion spectra, and extracted and summed ion counts) using tools such as principal component analysis (PCA), hierarchical cluster analysis, and partial least squares (PLS) regression to reduce and rationalize data processing outputs. Our work continues in this area, and we look forward to publishing it in the early new year.
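As a minimal illustration of the kind of reduction PCA performs on such data, here is a library-free sketch that projects per-sample feature vectors onto the leading principal component, found by power iteration. The sample data are invented, and real workflows would use dedicated chemometrics or statistics software:

```python
import math

# Invented per-sample feature vectors, e.g. summed ion counts per compound class.
samples = {
    "gasoline":   [9.0, 1.0, 0.5],
    "diesel":     [1.0, 8.5, 3.0],
    "background": [0.8, 1.2, 0.4],
}

def mean_center(rows):
    """Subtract each column's mean, the standard pre-step for PCA."""
    n = len(rows)
    means = [sum(col) / n for col in zip(*rows)]
    return [[x - m for x, m in zip(row, means)] for row in rows]

def first_pc(rows, iters=200):
    """Leading principal component via power iteration on X^T X."""
    x = mean_center(rows)
    v = [1.0] * len(x[0])
    for _ in range(iters):
        xv = [sum(a * b for a, b in zip(row, v)) for row in x]   # X v
        v = [sum(row[j] * s for row, s in zip(x, xv))            # X^T (X v)
             for j in range(len(v))]
        norm = math.sqrt(sum(c * c for c in v))
        v = [c / norm for c in v]
    return v

centered = mean_center(list(samples.values()))
pc1 = first_pc(list(samples.values()))
# Score = projection of each centered sample onto the first component.
scores = {name: sum(a * b for a, b in zip(row, pc1))
          for name, row in zip(samples, centered)}
```

Samples with similar compositional profiles land close together in score space, which is what makes PCA useful for screening many fire debris extracts at once.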
Nadin Boegelsack is a passionate analytical scientist with 10+ years of experience working internationally in industry and academia. She is currently finishing her Ph.D. at the University of Saskatchewan (Canada) while working as an instructional assistant at Mount Royal University (Canada), exploring solutions to analytical challenges in forensic and environmental sciences.
Gwen O’Sullivan is Associate Professor and Chair of the Department of Earth & Environmental Sciences at Mount Royal University. Her areas of expertise include environmental chemistry, environmental forensics, and contaminated land and groundwater. She has over 20 years of experience in a variety of environmental and criminal forensic projects, including legal sampling, chemical fingerprinting, and statistical evaluation of data for source identification of ignitable liquid residues, persistent organic pollutants (POPs), petroleum hydrocarbons, methane, nitrates, and dioxins in water, soils, and sediments.
E-mail: email@example.com / firstname.lastname@example.org