This month we interview Katelynn Perrault, Associate Professor of Forensic Sciences and Chemistry at Chaminade University of Honolulu in Hawaii, about her work translating 1D GC methods to effective comprehensive 2D GC (GC×GC) methods for forensic applications and the benefits that GC×GC offers the analyst.
Q. When did you first encounter chromatography and what attracted you to the subject?
A: I first encountered chromatography during my Bachelor’s degree while studying forensic science. We were introduced to chromatographic techniques in our forensic chemistry classes at the same time we were starting to learn about them in analytical chemistry. I remember my chemistry courses reiterating the fundamentals—how and why separation works based on a mobile phase, stationary phase, and analyte affinity. What I remember most is my forensic classes reiterating when chromatography would be used—drug analysis, arson analysis, explosive investigations, and more. I remember being introduced to the concept of using chromatography to solve problems and answer questions. This combined view of what chromatography was and how it could be used made me really excited about it.
Q. What did your Ph.D. focus on?
A: My Ph.D. focused on isolating, separating, and characterizing the odour from soil samples associated with decomposing remains. I worked on understanding how volatiles were evolved from decomposing remains and how they partitioned into the environment around them. One specific focus was determining which compounds in decomposition odour remain within soil when a body is removed from a site, say due to scavenging, an attempt at concealment, or other environmental factors, such as a flood or storm. We used gas chromatography–mass spectrometry (GC–MS) and developed new comprehensive two-dimensional gas chromatography–time-of-flight mass spectrometry (GC×GC–TOF-MS) methods to better characterize the soil matrix. We also worked with cadaver-detection dogs at a local police agency to complement the analytical information we were obtaining with real-world application information.
Q. What chromatographic techniques have you worked with?
A: The main chromatographic techniques that I am currently using for my work are GC–MS and comprehensive two-dimensional gas chromatography with quadrupole mass spectrometry and flame ionization detection (GC×GC–qMS/FID). I also use a lot of thin layer chromatography (TLC) within my courses to teach the principles of chromatography. Sometimes I also work with paper chromatography in outreach settings, which can be really fun too, especially when you get to see a young scientist react to seeing chromatography at work for the first time.
Q. You were recently involved in a project translating a one-dimensional (1D) GC method to a comprehensive 2D GC (GC×GC) method for forensic applications (1). How did this project arise and what type of forensic applications did you study?
A: This project actually arose from a desire to implement GC×GC in the laboratory in which I was establishing my research group as a new faculty member. In looking towards GC×GC options, I was really intrigued by the idea of converting a one-dimensional GC system with qMS to a multidimensional instrument. The motivation behind this was that I am really invested in promoting the adoption of multidimensional techniques in forensic laboratories, and GC–qMS systems already have an established presence there. We wanted to demonstrate that GC×GC is an accessible technology to implement and can be added to a previously existing setup with great success. In this work we demonstrated the parameters we monitored in converting our system to GC×GC by adding a modulator and splitting device, and how we use simultaneous detection (qMS/FID) to overcome some of the limitations of qMS as a detector for GC×GC. We focused largely on compounds present in decomposition odour, but this work could certainly be applied to other areas of forensic and medical analysis, among others.
Q. What were the main challenges you encountered when translating this 1D method to a GC×GC method?
A: We used components from several different suppliers and manufacturers, so a large part of the challenge in this type of approach is making sure that all of the components communicate with one another effectively. Our sample introduction methods (liquid injection, SPME Arrow, and thermal desorption unit) have to speak to the GC system, the GC system has to speak to the modulator, and everything has to communicate with the detectors that are functioning simultaneously. In addition, the data we generate have to be exported, converted to a different file format, and then imported into GC×GC software for processing and analysis. Making each of these steps flow smoothly is one of the bigger challenges of a retrofitted instrument, but it is possible and can be made straightforward if you have the appropriate guidance. We also recently published a paper explaining our workflows, along with open-access datasets, so that our workflow can be followed and reproduced by other researchers who are looking to learn these steps (2).
Q. What were the advantages of the comprehensive GC×GC approach you developed?
A: The main advantage of the GC×GC approach we developed is the improved peak capacity and detectability it provides for complex samples. This is the central benefit of multidimensional chromatographic analyses in general—you see more peaks within your sample. There is a large misconception within the chromatography community that coeluting analytes can always be resolved using deconvolution techniques. In our experience, most of our complex samples have many coelutions (sometimes upwards of five compounds spanning a broad dynamic range), which makes deconvolution algorithms problematic and creates the potential for both false positives and false negatives in a peak table. Improving the physical separation of analytes through multidimensional chromatography is a much more feasible and successful approach to comprehensively characterizing your sample.
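As a rough illustration of the peak-capacity gain described above (a standard textbook approximation, not a figure from this interview): for two fully orthogonal dimensions, the total peak capacity of a GC×GC separation is approximately the product of the capacities of the individual dimensions, rather than their sum.

```latex
% Ideal peak capacity of a comprehensive 2D separation
% (assumes fully orthogonal retention mechanisms):
\[
  n_{c,\mathrm{2D}} \;\approx\; {}^{1}n_{c} \times {}^{2}n_{c}
\]
% Illustrative (hypothetical) numbers: a first dimension
% resolving ~500 peaks combined with a fast second dimension
% resolving ~10 peaks gives ~5000 resolvable peaks, versus
% ~500 for the 1D method alone.
```

In practice the realized capacity is lower than this ideal because the two retention mechanisms are never perfectly orthogonal, but the multiplicative scaling is why GC×GC reveals so many more peaks in complex matrices.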
Q. Can you elaborate on how your new approach was applied to forensic analysis?
A: Our new approach was published to demonstrate how flow modulation can be used as a technology to convert a one-dimensional GC system within a forensic laboratory into a GC×GC instrument (1). We applied it to the analysis of chemical standards to monitor the translation workflow and be sure that no quality is lost in the translation, especially related to identification and quantification. Most often, people raise concerns about losing data when making this translation and switching to a new approach. There are also concerns about losing the ability to quantify accurately, and so with the chemical standards we analyzed we demonstrated the retention of standard calibration and other performance parameters. We mostly focused on volatile organic compounds (VOCs) of forensic interest, but these concepts help to better understand how GC×GC might be used in the future in drug analysis, toxicology, arson investigation, ink analysis, and other areas of forensic science currently using GC.
Q. You have also used a GC×GC method in a forensic analysis to investigate decomposition odour in tropical climates (3). What is novel about this approach?
A: Since the start of graduate school I have always been interested in the idea of validating the work I was doing in different areas of the world. We have conducted studies in Canada, Australia, Belgium, and now, for the first time, in a tropical region—Hawaii. Before now, decomposition odour production had never been studied in a tropical region. It is therefore fitting and helpful that we can investigate the profile in this new region and verify the subset of the profile that is reproducible from one location to another. One thing we are particularly interested in is determining the “core” VOC profile from decomposing remains that can be detected anywhere in the world and with any instrumental setup—that is, what is actually common about this profile no matter what other factors are at play. We are certainly getting much closer to answering that key question, which lends a lot of validity to this active research area.
Q. GC×GC is often regarded as complicated. Is this the case and have there been any recent technology advances that have simplified ease of use?
A: In my opinion, GC×GC is no more complicated than many other currently used analytical techniques out there. That being said, I think many researchers using GC×GC tend to overcomplicate it when explaining it to new audiences—and I include myself in this statement too! Over the past few years I have been teaching GC×GC at the undergraduate level within my classes, and over time I have developed a lot of new strategies for effectively communicating GC×GC concepts and incorporating project-based learning to solidify topics. New technology advances are happening so frequently for GC×GC, and I find that in teaching this technique, sometimes we have to take a step back and just focus on the fundamentals of how each component works rather than all the different options that are currently commercially available. I think we are moving towards the availability of instruments that are more user-friendly, and hopefully that will help with adoption. I do think the most important component is making communication, training, and resources helpful for the next generation of GC×GC users.
Q. Are there any recent research projects using GC×GC that you find particularly innovative and exciting?
A: Most of the articles that draw my attention and excitement these days are ones that take different approaches to dealing with large batch data, usually with chemometric analysis. I recently read a very interesting paper by Favela et al. (4), which applied chemometrics to a large number of extracts from mask materials to see which chemicals people are exposed to through inhalation while wearing a mask. This study employed GC×GC to analyze the mask samples and two proprietary tools to extract data and classification information from them. I found the application interesting, and the approach to the data processing was also intriguing. I am always interested in seeing whether there are new ideas from other applications that I can apply to forensic samples to help us get more out of our experiments. One thing that I am currently very interested in is incorporating statistical analyses of GC×GC data collected as time series, because a lot of our studies involve analyzing the same samples repeatedly over a period of time. I think we can improve the way we incorporate time as a factor in our analyses, and I am looking forward to exploring this area further.
Katelynn Perrault earned a Ph.D. from the University of Technology Sydney and a Bachelor of Science (Hons) in forensic sciences from the University of Ontario Institute of Technology. She specializes in the development of multidimensional separations for the comprehensive characterization of odours of forensic relevance. Kate is the Principal Investigator of the Laboratory of Forensic and Bioanalytical Chemistry, which is supported through several federal grants, foundation grants, and industry support. She researches decomposition odour
for forensic search and recovery, and mentors numerous undergraduate researchers as part of her integrated teaching and research programme. Her current interests include odour production from post-mortem microbes, development of data processing workflows for multidimensional chromatography, promoting the adoption of multidimensional separations in the forensic sciences, and producing curriculum on multidimensional separations to be taught in
undergraduate chemistry classes.