As the GC–MS market trends toward high throughput and fast GC analyses, data analysis software must keep pace.
The terms "high sample throughput" and "fast GC" have become increasingly common in the gas chromatography (GC) market. Whether it's a QC chemist who is looking for a way to handle samples whose numbers increase with the legislation that govern them or an environmental chemist looking for an edge in an extremely competitive market, increased throughput is a concern throughout the gas chromatography–mass spectrometry (GC–MS) world. As the need for reduced sample run times climbs the list of importance for those looking to purchase new capital equipment, instrument manufacturers have been developing systems to meet new productivity requirements.
This can be seen in gas chromatographs capable of faster ramps or reduced cool-down times between runs, and in mass spectrometers capable of higher acquisition rates. As an example, a GC–time-of-flight (TOF) MS system (Pegasus, LECO Corporation, St. Joseph, Michigan) can acquire data at 500 spectra per second, fast enough to sample adequately across extremely narrow peaks (less than 50 ms wide). With hardware capable of these speeds, analysis times routinely can be halved, quartered, or reduced even further. However, this increased throughput does not come without challenges of its own. Two significant challenges arise as a direct result of this much-sought-after speed: coelution in the analyses and review of the resulting data. To handle both properly, adequate software becomes an absolute necessity. The software must evolve from a tool that simply acquires and reports data into a dynamic and powerful tool that can address both of these issues. This article shows how software can accomplish these tasks when used with a GC–TOFMS system.
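The arithmetic behind that claim is straightforward: the number of spectra falling across a peak is simply the peak width multiplied by the acquisition rate. The following Python sketch makes the calculation explicit; the roughly 10-points-per-peak guideline in the comment is a common chromatographic rule of thumb, not a figure from this article.

```python
def points_across_peak(peak_width_s: float, acquisition_rate_hz: float) -> float:
    """Number of spectra acquired across a peak of the given base width."""
    return peak_width_s * acquisition_rate_hz

# A 50-ms peak sampled at 500 spectra per second yields 25 points,
# comfortably above the ~10 points per peak commonly cited as the
# minimum for faithful peak reconstruction and quantification.
print(points_across_peak(0.050, 500))  # 25.0
```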
Figure 1: Chromatogram displaying nine pesticides eluting within a four-second window, following deconvolution.
Coelution
Coelution is familiar to most analysts, many of whom have spent much of their careers attempting to reduce its frequency and severity. When performing time-compressed chromatography, this is no longer possible: reduced run times leave less chromatographic space over which to spread the analytes. Instead, software must be capable of deconvoluting the newly coeluted peaks. Figure 1 illustrates the symbiotic relationship between hardware capable of high acquisition rates (in this case, 40 spectra per second) and software capable of handling that data by deconvoluting the resulting spectra. The deconvolution algorithm within ChromaTOF software (LECO) separates the nine pesticides present in a 4-s window. One spectrum containing three of the pesticides, acquired at 158.529 s, is shown in Figure 2, with the masses corresponding to fenthion, chlorpyrifos, and parathion identified. Following the algorithm's deconvolution of this spectrum, the spectrum for chlorpyrifos can be extracted easily, as shown in Figure 3.
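The article does not describe the internals of the deconvolution algorithm, but one common way to frame the problem is as spectral unmixing: each acquired spectrum is modeled as a non-negative combination of candidate component spectra, and the contribution of each component is solved for mathematically. The Python sketch below illustrates that idea with non-negative least squares on invented toy spectra; it is an illustration under those assumptions, not ChromaTOF's implementation.

```python
import numpy as np
from scipy.optimize import nnls

# Toy reference spectra (rows = m/z channels, columns = components),
# standing in for, e.g., fenthion, chlorpyrifos, and parathion.
references = np.array([
    [100.0,  0.0, 20.0],
    [  5.0, 80.0,  0.0],
    [ 40.0, 10.0, 60.0],
    [  0.0, 50.0, 30.0],
])

# A coeluted spectrum observed at one acquisition time (e.g., 158.529 s),
# simulated here as a known mixture of the references.
observed = references @ np.array([0.5, 1.2, 0.3])

# Solve for the non-negative contribution of each component.
contributions, residual = nnls(references, observed)
print(contributions)  # approximately [0.5, 1.2, 0.3]
```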
Figure 2: Resulting spectrum at 158.529 s identifying fenthion, chlorpyrifos, and parathion.
Data Review
The second challenge posed by increased throughput is considered less often but is equally important. As sample run times drop significantly, the number of samples analyzed increases, and so does the amount of data to review. Take as an example an analyst who has run the same analysis for years with a standard run time of 1 h. If, in the interest of increased throughput, a high-speed system is purchased and run times are reduced to 10 min, the laboratory benefits from dramatically increased data acquisition, but the analyst now has approximately six times as much data to review. Data review becomes the most likely bottleneck in the process. The software therefore must reduce the time taken to review each sample by roughly the same factor by which the run time was reduced. Effective software meets this goal in two ways. First, it should increase automation so that analysts have less to do between samples. Second, it should be laid out so that the same tasks can be accomplished quickly and efficiently.
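The factor involved is worth making explicit. A minimal sketch of the arithmetic follows, with the 12-min review time assumed purely for illustration:

```python
old_run_min, new_run_min = 60, 10
throughput_gain = old_run_min / new_run_min       # 6x more samples in the same time

review_min_per_sample = 12                        # assumed current review cost
target = review_min_per_sample / throughput_gain  # review time needed to keep up
print(f"{throughput_gain:.0f}x samples -> {target:.0f} min review per sample")
```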
Figure 3: Deconvoluted spectrum (top) and NIST library spectrum (bottom) for chlorpyrifos.
Many functions in the software used here (ChromaTOF, LECO) can be automated toward this goal. For example, more than 15 processes can be applied to a sample with a single mouse click, including Automated Peak Find, library searching, quantification of analytes, application of a retention index, printing a series of reports, uploading data to a laboratory information management system (LIMS), and exporting files to a backup drive. By automating these functions, the software frees analysts from performing these time-consuming processes and lets them focus on reviewing the results instead.
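As a rough illustration of this one-click idea, the Python sketch below chains hypothetical processing steps so that a single call runs all of them on a sample unattended. Every function name here is invented for the example; none are ChromaTOF API calls, which the article does not document.

```python
from typing import Callable

def find_peaks(sample: dict) -> dict:
    sample["peaks"] = []        # automated peak finding
    return sample

def search_library(sample: dict) -> dict:
    sample["hits"] = []         # library search of each deconvoluted spectrum
    return sample

def quantify_analytes(sample: dict) -> dict:
    sample["amounts"] = {}      # apply the calibration to target analytes
    return sample

# Retention indexing, report printing, LIMS upload, and backup export
# would slot into the same list in the same way.
PIPELINE: list[Callable[[dict], dict]] = [
    find_peaks,
    search_library,
    quantify_analytes,
]

def process(sample: dict) -> dict:
    """Run every configured step on one sample, in order, unattended."""
    for step in PIPELINE:
        sample = step(sample)
    return sample

print(process({"path": "unknown_01.raw"}))
```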
While automation is extremely important for increasing data-review throughput, the layout and design of the software are equally important. Users must be able to work seamlessly within one region of the software without moving back and forth between sections to accomplish a single goal. It also is essential that information be displayed in a manner that affords speed, ease, and flexibility. The ability to see relevant information while reviewing samples or calibrations, the flexibility in how it is displayed, and control over where it appears on the screen all increase operator productivity. Figure 4 shows a screen shot of the calibration view for several analytes. From this single view, analysts can see the list of analytes in the calibration, the standards in which each appeared, and the chromatogram and spectrum for every analyte in each standard.
Figure 4: Screen shot generated from software displaying calibrations for several analytes (ChromaTOF, LECO).
Conclusion
As today's laboratories look for further ways to increase throughput and productivity, faster techniques such as GC–TOFMS are gaining interest. These instruments can acquire large amounts of data in a short time, enabling more samples to be run. However, the increased volume of data presents a new set of challenges in data analysis: with so much data on hand, sorting through it in a time-efficient manner is often difficult, and a poor data-review process can undermine the very goal of increased throughput.
With the proper software, however, TOFMS instruments have the potential to become some of the most sought-after tools for GC analysis. To benefit the user, that software must overcome two challenges: the coelution of analytes and the review of the resulting data. This article provided examples of how an integrated software package can address both.
Lucas Smith
LECO Corporation
Please direct correspondence to lucas_smith@leco.com