Gas Chromatography Instrumentation: The State of the Art

Article

LCGC Europe, 12-01-2012, Volume 25, Issue 12, Pages: 675–681

A panel of experts give their opinions about the current trends in gas chromatography (GC) instrumentation and how the technology could develop in the future.

Laura Bush, editorial director of LC•GC North America and LC•GC Europe, recently organized a panel discussion on current trends in gas chromatography (GC) instrumentation. The experts (listed in the sidebar) were asked to discuss the most important trends in GC instrumentation and to predict how the technology will develop in the future.

The Future of Fast GC

We started by asking our expert panel about some of the latest methods, including fast GC. All our experts agree that the future for fast GC is bright. Naturally, chromatographers would like to get their results faster with lower consumption of carrier and detector gas. Also, fast GC is synergistic with the shift from helium to hydrogen carrier gas, and it can increase overall sample throughput. This speed is particularly valuable in applications like high-throughput screening.

The main route toward faster GC analysis, according to Paola Dugo, Luigi Mondello and Peter Q. Tranchida of the University of Messina, is to use microbore capillaries. Such an approach is readily accessible, because most commercially available GC systems can deliver the instrument performance needed with such columns. Also, microbore capillaries with a 0.1-mm internal diameter are now available with a wide variety of stationary phases. "All this means that it is now possible to shorten analysis times by a factor of 4–5, with no price to pay in terms of resolution," they say.
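
A rough back-of-envelope sketch, assuming thin-film columns run near their optimum carrier-gas velocity, shows where that gain comes from: the minimum plate height scales roughly with the column internal diameter, while the optimum linear velocity scales roughly with its inverse, so the time needed to generate a fixed plate count N scales with the square of the internal diameter:

\[
t \;\approx\; \frac{N\,H_{\min}}{u_{\mathrm{opt}}} \;\propto\; N\,d_{\mathrm{c}}^{2}
\qquad\Rightarrow\qquad
\frac{t_{0.25\,\mathrm{mm}}}{t_{0.10\,\mathrm{mm}}} \;\approx\; \left(\frac{0.25}{0.10}\right)^{2} \;\approx\; 6
\]

Inlet-pressure limits and temperature-programming overheads trim that idealized factor of about six down to the practical 4–5 quoted above.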

But there are challenges with fast GC, says John Hinshaw, a senior scientist at BPL Global and a longtime columnist for LCGC. For example, the instrumentation must present the column with fast enough injection speeds and rapid oven-programming rates while capturing the resulting fast peaks. Also, existing methods must be translated to preserve peak order, and they may still require revalidation. Lastly, he says, narrow-bore, thinner-film columns may have reduced sample capacities and injection-volume limits that require adjusted sample concentrations and volumes.

Fast GC can also get bogged down by slow sample preparation or delays in cooling the oven at the end of the run. "There's no point in a 2-min fast GC run if it is accompanied by a 6-min oven cool down," says Alastair Lewis of the University of York. This points the way, he says, toward resistively heated columns, but with really low thermal masses. "The annual energy savings for labs could turn out to be the clincher here, rather than better analytical capability."

Nor will fast GC solve problems of poor resolution, according to Frank Dorman of Penn State University: "Fast GC really needs to be coupled with the optimization of the other GC parameters, most notably the selectivity of the stationary phase," he says. "This is especially true for targeted separations, which is where more of the fast GC applications have been directed." So he predicts that fast GC will remain a niche technique for the short term. "But that could change if we move away from 30-m fused-silica columns," he adds.

Hans-Gerd Janssen of Unilever Research and Development argues that there is not that much need for fast GC. A standard GC run, including cooling, reconditioning and cleaning the syringe, rarely takes more than 45 min, he points out, which means at least 30 unattended runs can be done in one day. "For most laboratories, that is enough and more than the analyst can handle in sample preparation and interpretation," he says. "Also, there are not many laboratories that need a short time between arrival of the sample in the laboratory and the results being available."

Two-Dimensional GC

We also asked our experts about another emerging method, comprehensive two-dimensional (2D) GC (GC×GC). Will this technique become the norm for the analysis of volatile compounds, we wondered?

Our experts generally agree that GC×GC is extremely powerful, but unlikely to become a mainstream method in the near future. "Issues such as high instrument costs, use of cryogenic gases and optimization difficulties inhibit the widespread and routine use of this approach," says the group from the University of Messina.

John Seeley of Oakland University agrees: "In GC×GC, initial runs often produce a nearly unrecognizable 2D chromatogram and the appropriate changes to the run conditions require both an understanding of the GC×GC technique and knowledge of the sample mixture," he says. "I'm not sure that there is widespread commitment by the community to take the time to become proficient in the use of GC×GC."

Perhaps just as important, the resolving power of 2D GC often is not necessary. "In many cases, typical problems can be solved with conventional GC technology, and this will continue to be the norm," says Hernan Cortes of H.J. Cortes Consulting and the University of Tasmania.

As a result, Hinshaw believes that GC×GC will remain limited to complex or difficult separations in which two orthogonal separation chemistries are required to fully pull apart a mixture.

But for such analyses, says Dorman, GC×GC usage will grow. He points out that current commercial GC×GC systems are quite robust, often with software that is fully integrated with the hardware, making these instruments capable of routine analysis. "The technique is still viewed by many as a niche, but each year, more practitioners are realizing that GC×GC is no longer a 'research tool' only," he concludes.

Philip Marriott of Monash University, who considers himself a proponent of comprehensive 2D GC, would like to see it play a wider role in the chemical analysis of volatile samples, but doesn't expect that to happen quickly. "Few standard methods of analysis describe dedicated GC×GC protocols, and these take a long time to develop," he says. "It would be interesting if a reviewer (or even a judge) were to call into question the chemical analysis of a volatile sample if only 1D GC was used, and proposed or required that a GC×GC method should be the logical methodology!"

Stan Stearns of Valco Instruments is more optimistic about the expansion of the method. "Comprehensive GC will grow as users realize how much simpler multidimensional systems can be than current GC methods," he says. For example, he says, the analysis of trace impurities in ultrapure gases is routinely accomplished by the use of valves to divert unresolved components into separate columns for determination at the part-per-billion level. "Partial modulation has been shown to provide a simple alternative to the many complex multidimensional systems requiring off-line valves and electronic pressure controls," he points out.

Janssen agrees and believes 2D GC methods will become the norm, and not just for volatile compounds. "Comprehensive GC×GC is getting easier to do," he says. "So rather than occasionally using heartcut 2D GC only if everything else fails, comprehensive GC×GC will be done for all samples." He also believes GC×GC is a good solution for laboratories that need flexibility. "Because of the very high separation power of 2D GC, many different applications previously requiring different columns can now be done on one GC×GC instrument with one column set."

The Impact of Miniaturized Sample Preparation

We also asked our expert panel if the miniaturization of sample preparation procedures will lead to increased automation or greater use of GC in the future. Several of our experts believe that improved sample preparation can expand the use of GC, both in the laboratory and in field applications.

"If we can get better miniaturization of the sample prep, it could completely change how GC is viewed and used," says Lewis. "GC instrumentation really needs to be viewed as not just the column and the oven but the whole sample prep as well." He feels the GC community has been very slow to adopt the microfabricated technologies and lab-on-a-chip approaches that dominate sample preparation for biomedical applications. "There is a lot of catching up to do," he says.

Even without automation, expanding the use of popular sample preparation techniques such as solid-phase microextraction (SPME) and stir-bar sorptive extraction (SBSE) will continue to extend GC into other areas of analysis, says Jared Anderson of the University of Toledo: "There are many types of samples in which the concentration of the analytes is well below the detection limits of GC instruments, even when mass spectrometry (MS) is used as the detector," he points out. "Sample preparation methods that enable preconcentration of analytes have broadened the types of samples that can be analysed by GC."

Techniques such as SPME, adds Seeley, allow users to analyse a wider range of samples without spending tens of thousands of dollars on a specialized sample inlet.

Stearns says he is already seeing evidence that the miniaturization of GC components is leading to greater automation and GC usage. "We see this in the growth of component sales to end users who are unable to find commercial system offerings," he says. "We have seen great interest in nanovolume devices for sample introduction and column switching."

Automation, furthermore, can really make a difference in applications where large numbers of samples are handled, such as metabolomics and forensics. "[Speed] becomes the decision point as to whether a technique will be used or not," says Dorman.

But, Janssen notes, automation requires an investment in equipment and user training. "Automated sample prep devices are easier to use, but more difficult to maintain than their manual counterparts," he says. "Manufacturers should probably also sell maintenance deals, at realistic prices."

The Messina group, however, doesn't foresee significant expansion of GC as a result of improved sample preparation. "GC will always be the preferred choice for GC-amenable analytes," they say.

Large-Volume Injection Techniques

Our panel also considered whether large-volume injection (LVI) techniques have a promising future. They generally agree that LVI will always be very useful when enhanced sensitivity is needed, given that it is an easy way to improve sensitivity by a factor of 10 or 100. "Any new sample introduction technique improving performance of the overall method is promising," observes Janusz Pawliszyn of the University of Waterloo.
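
The arithmetic behind that factor is simple. Assuming, for instance, a programmed-temperature vaporizing (PTV) inlet operated in solvent-vent mode (one common LVI configuration) and negligible analyte losses during solvent elimination, the sensitivity gain is essentially the ratio of injected volumes:

\[
\text{gain} \;\approx\; \frac{V_{\mathrm{LVI}}}{V_{\mathrm{conventional}}}
\;=\; \frac{10\text{--}100\ \mu\mathrm{L}}{1\ \mu\mathrm{L}}
\;=\; 10\text{--}100
\]

In practice, incomplete solvent venting and losses of the most volatile analytes erode part of that gain.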

Dorman cautions, however, that when used for difficult sample analysis, LVI techniques are only of value when coupled to sample preparation techniques that minimize matrix complexity. "Merely injecting more onto the column, even if solvent elimination is performed, causes difficulty from the other materials in the sample," he warns. This is routinely an issue in food analysis, he says, where the samples often contain high levels of native compounds (such as chlorophyll or fatty acids) that may preclude the use of LVI.

Hinshaw adds that with small-bore columns, large injections come with their own set of side effects and problems that must be managed, such as peak shape distortion, irreproducible retention times and rapid column deterioration.

Indeed, many people find LVI far too complex. Janssen points out that a small mistake in a parameter setting can have dramatic consequences, causing the GC to show very severe carryover for weeks. "For LVI to become really accepted, much better software is needed for parameter selection, combined with hardware that protects the inlet, column and detector from the dramatic consequences of slight mistakes in the setting of the LVI conditions," he says.

Because of concerns related to the introduction of matrix interferences for some sample types, adds Marriott, additional column or injector maintenance may be required with LVI. (LVI normally requires a retention gap or other means of protection of the main analytical column.) "Obviously the ease of making a regular injection is the counter-balancing argument to the more complex LVI method," he says.

The Continuing Evolution of Sample Introduction Methods

Many sample introduction methods are available for GC. Our question for the panel: Will these all survive the next decade?

Many agreed that there is a natural evolution in methods as new techniques are developed and gradually adopted, and the popularity of older ones fades. But, they add, given the very wide range of compounds, concentrations and matrices that can be studied by GC, multiple injection methods will always be needed for full coverage. "Every new injection method adds to the versatility of GC and complements rather than replaces other techniques," observed Stearns.

Hinshaw countered that the need for specialized injection is driven by limitations of the column and detector. "Splitless injection exists because the inlets and columns of the time, and to a large extent today's inlets and columns, absorb or interfere with very small solute amounts," he explains. Similarly, he says, most detectors are not sensitive enough to detect low levels. "If inlet and column deactivation continues to improve and if detector sensitivity becomes greater, then specialized sample introduction techniques will become less important," he concludes.

New Stationary-Phase Chemistries

When we asked whether there is a need for new stationary-phase chemistry for GC, Dorman did not equivocate.

"In a word, 'absolutely'," he declared. "The trend of separating almost everything on a '1' or a '5' column has to come to a stop." GC practitioners rely more and more on MS to resolve chromatographic coelutions, he continued, and this is a bad practice. Topping his list of needed chemistries are thermally stable polar and shape-selective phases.

For his part, Cortes feels that chiral separations are an obvious example where specific selectivities are not provided by conventional stationary phases.

Marriott cites several other areas of unmet demand for improved phases. "For instance, a true high-temperature and improved-stability polyethylene glycol–type phase will be a valuable addition to GC," he says. He would also like to see liquid-crystal phases more widely available in capillary format.

Seeley points to the work of Colin Poole, Peter Carr, Michael Abraham and others as demonstration that the vast majority of commercially produced stationary phases have highly redundant selectivities and do not exploit the full range of intermolecular interactions. "For example, virtually all liquid stationary phases have negligible hydrogen bond acidity," he says. "Luckily, the recent introduction of new stationary phases, like ionic liquids, is starting to address these deficits."

Others agree that ionic liquids are a fertile area of growth. "Recent advances in ionic-liquid phases are a good example of a new phase class engendering a whole new set of separation capabilities," says Hinshaw.

The Messina group finds the introduction of novel phases most attractive for the field of multidimensional GC. Given the high complexity of real-world samples, they say, a change in single-column stationary phase does little to improve the final number of fully isolated compounds. But with multidimensional GC approaches (both heart-cutting and comprehensive), they point out, the combination of stationary phases with entirely different retention mechanisms can greatly improve the separation of complex mixtures of volatiles.

The Future of Field-Portable Gas Chromatographs

Field-portable GC instruments offer great advantages, particularly for applications such as homeland security and the emergency response to environmental disasters. In addition to improving speed and sample robustness, "taking the laboratory into the field" also dramatically decreases the overall cost of an analysis. But a limited number of samples are amenable to field analysis, and field-portable instrumentation has historically given up a lot of analytical power relative to its fixed-laboratory counterparts.

"Ease of use, ruggedness and price [of field-portable GC instruments] are all acceptable, but better performance in terms of numbers of plates and an easier method to change the selectivity of the systems are needed," says Janssen. "Instruments are needed that provide separation powers similar to that of laboratory instruments, with easy methods to change and replace columns.

But some believe the performance differences between laboratory and field systems can be reduced. "With more advances in micromachining and microelectromechanical systems, it is likely that this gap will be narrowed, making field instruments capable of fixed lab method compliance," says Dorman. And once the analytical power of field instruments approaches that of fixed lab instruments, he says, the throughput advantage will drive more separations to these devices.

Improving detectors will also aid in this process. "As systems become smaller and faster and are combined with selective detectors, particularly compact mass spectrometers, their use will expand," says Stearns.

Some also foresee growth in field-deployable systems not just for emergencies, but for routine monitoring.

The Messina group holds a contrarian view, generally preferring the option of fast and effective field sampling rather than analysis in the field, and notes that great advances have been made in the sample preparation field over the past two decades. "Essentially, it is preferable to bring the sample to the GC–MS system, instead of the opposite," they say.

Preparative GC

Unlike preparative LC, which is used routinely in the chemical and pharmaceutical industries to isolate sufficient quantities of material for further study, preparative GC has not been a workhorse in separation science. The classical preparative GC systems of the 1960s have largely fallen into disuse, notes Hinshaw, as advances in stereoselective synthesis have reduced the need for post-synthesis enantiomeric purification. Furthermore, GC–MS often yields sufficient information so that further large-scale isolation is not necessary.

But preparative GC can be valuable, say the experts, particularly for niche applications. Examples include the isolation of specific compounds for a variety of purposes, such as thorough structural elucidation using nuclear magnetic resonance (NMR) spectroscopy, use as standards, or evaluation of biological activity.

Marriott also sees potential in preparative-scale GC using capillary columns, rather than the packed columns normally used in this method. "Some recent studies (such as our studies with NMR, and also preparative-scale capillary GC collection of material for crystal structure analysis) point to the value of having capability in capillary prep GC," he says.

New Detectors

We asked our experts if GC requires new detectors to keep pace with changes in analytical methods. Most feel that what is needed is not new detector technologies, but improvements to make existing detectors perform better.

"The detectors available should be made much more user-friendly to operate and more rugged," says Janssen. "All selective detectors are troublesome to use. NPD [nitrogen–phosphorus detection], SCD [sulphur chemiluminescence detection], ECD [electron-capture detection], FPD [flame photometric detection] and so forth are notorious for their problems."

Particular improvements needed include minimizing detector internal volumes, reducing detector rise times and increasing acquisition frequencies, add the Messina group. Such improvements are especially needed for techniques that generate narrow chromatographic bands, such as fast GC and comprehensive 2D GC.

Improvements are also needed in terms of reliability and serviceability, as well as sensitivity. "There are few substances that cannot be detected, even with the alphabet soup of available detectors," adds Hinshaw.

The experts also seemed to have a wish list of specific developments.

Seeley would like to see smaller-volume versions of existing detectors. "For example, the current generation of electron-capture detectors can be used with GC×GC, but they must employ very high make-up gas flows," which decreases sensitivity, he says. "An electron-capture detector specifically designed for detecting 100-ms wide peaks would be very advantageous."

Lewis's list includes simpler, highly sensitive, portable detectors that could be coupled to a field GC instrument, a micro flame ionization detector with built-in hydrogen generation or electrolysis, and a compact, high-speed photoionization detector.

Cortes sees promise in differential mobility spectrometry as a detection system for GC. "I expect to see further developments in this direction," he says.

The development of portable MS detectors is another important direction, says Pawliszyn. Lewis agrees. "Compact mass spectrometers already exist, but need to match lab sensitivity to find real application," he says.

The Role of Mass Spectrometry

The panelists generally agree that improvements in mass spectrometers, such as better sensitivity, higher mass accuracy, tandem MS and easier use and maintenance, will be beneficial to GC techniques. "Such developments will make many existing GC applications simpler, and enable additional applications that currently cannot be done by GC," says Janssen. For example, compounds that are thermally unstable could be analysed on very short GC columns if the MS detector provided additional selectivity, and samples that are too complex for regular GC–MS could be done with improved GC–MS methods.

Lewis sees the need for higher-resolution MS systems at a lower cost. "Some astonishingly capable basic mass specs exist for GC," he says, noting that current quadrupole instruments offer "superb capability for a very low price." But as one moves to higher-performance instruments, he says, the cost of incremental increases in performance goes up exponentially. "Low cost (meaning approaching quadrupole) TOF [time-of-flight]-MS with high mass accuracy would be a game changer," he declares.

MS Will Not Make GC Obsolete

Seeley also pointed to the need for affordable TOF-MS instruments. He wonders, though, if growth in MS leads to stagnation in GC. "When analysts feel that all of their separation problems can be solved by changing the electronic settings on a mass spectrometer, will they take the time to fully adjust the column arrangements and chromatographic conditions?" he asks.

Many of the other panelists have also considered this concern, including the notion that advances in MS will make GC separations unnecessary.

"Although there have been suggestions that as MS technology develops further the need for a separation system will decrease or perhaps be eliminated, I am of the opinion that separations will always be necessary," says Cortes. "I do not foresee MS yielding all the necessary information without chromatography in the front end, especially for the very complex samples we deal with day-to-day."

Marriott agrees: "This is not so much a question of using MS as a substitute for GC, but rather a question of how MS developments might impact the belief in 'how much separation is needed' in the GC step," he says. He agrees with Seeley that GC innovation seems to have suffered from the focus on MS. "Definitive studies in this area should be encouraged," he concludes.

"Future developments in MS will still highlight the need for an efficient GC separation step, because real-world mixtures of volatiles can reach extremely high degrees of complexity," adds the Messina group. "In short, there is a need to develop both powerful GC and MS methods."

Ill-Prepared Analysts?

We also asked our panelists how the status and use of GC are affected by the poor state of knowledge of most users today.

All lamented the current state of affairs. "In the last decade the user has become the limiting factor," says Janssen. "There are nice technologies available allowing much more efficient use of GC. But many users find these too difficult. Who is using heart-cut, or backflush, or large volume injection, or coupled columns? People prefer LC–MS-MS: It's more expensive, but easier and more sexy."

Many also worry that the quality of GC analyses suffers as a result of poor training. "This problem has even invaded the academic institutions where separation science is often thought of as a 'technician' function and the 'real science' is considered to lie in the interpretation of the results," says Dorman. "If the method development chemist and the operator of the instrumentation do not know how to properly optimize and maintain the instrumentation, then the data interpretation will be flawed from the start."

This problem can also affect the development of advanced methods like comprehensive GC. "One of the major concerns will be if papers reporting GC studies are not fully informed of the technology of GC, and this might be the case for the emergence of GC×GC today; poor background can lead to misinformed research or incorrect interpretation of research," says Marriott.

Similarly, poor preparation can discourage the development of new instrumentation. "The lack of knowledge leads to a 'black box' mentality by the users, which in turn limits the motivation of the major instrument manufacturers to develop new technology," says Seeley. "To avoid a customer support nightmare, manufacturers are reluctant to introduce new methods unless they are highly evolved or fairly simple."

Lewis also sees a vicious circle. He blames it not just on the poor state of GC knowledge, but also on the influence of inflexible regulatory methods that use GC, which can leave the best scientists uninterested in pursuing the technique. "GC is left as the preserve of the poorly trained lab technician following a well tested recipe," he says. "The poor image has real impacts; it limits the depth of ongoing parallel academic research that in turn stifles the flow of smart people into the field." And he feels the dire public relations image of GC is partly deserved. "Commercial offerings have barely changed in 30 years, [we see a] smarter looking box, slightly neater software, but no real game changing introductions," he observes.

So that raises the next question: Is the current teaching and training of scientists equipping them properly to function in the workplace?

The educators on our panel say it is hard to generalize, but they all feel that they make a concerted effort to prepare their students as well as possible, with both theory and hands-on experience with the instrumentation.

Anderson is one such educator. "In my laboratory, the undergraduate and graduate students are exposed to all aspects of GC," he says. "For example, we often coat our own capillary columns in addition to the students' learning the theoretical basis for GC separations as well as the function and purpose of all components that comprise the gas chromatograph."

But the academics also acknowledge limitations and constraints. Given the range of work skills that employers need, the university system cannot possibly fully prepare students for all possibilities. Also, because of time and financial constraints, says Seeley, students in most universities are given very little time (if any) to learn how to design separations from the ground up.

Larger pressures from the university system also have an effect. Not many university courses are prepared to make the time commitment to ensure analytical chemistry is covered sufficiently broadly, says Marriott, and that may be partly a result of the under-representation of analytical chemists in the academic hierarchy. As a result, students may get a broader organic or inorganic chemistry education, an emphasis that overlooks the fact that analytical chemistry is a major destination for most graduates. "But it is generally true that administrators find it easy to force a reduction in the skills-transfer experience that results from laboratory sessions," he concludes.

Lack of access to good instrumentation can be a problem, too, says Lewis, and in some institutions, there may be a tendency to follow overly prescriptive methods or experiments. "Ideally, there would be more open-ended experimentation that allowed students to explore the possibilities for analysis with GC," he says.

Training deficits also occur in industry. "Many companies have training procedures that are designed to document training, but not necessarily truly educate," says Dorman. "Modern instrumentation is so user-friendly that it is easy to get results. That does not mean that the results are correct, however, and education is critical to understanding the data generated by these systems."

Janssen also observes that inexperienced laboratory managers may underestimate how quickly the skills can be learned. "They do not understand that taking a week-long course may be sufficient to learn the principles, but becoming a real GC artist requires many years of experience and free time to play with the instruments," he concludes.

Future Developments

Finally, we asked our panel what are likely to be the most important instrument developments in the next decade.

A number of experts observed that miniaturization and portability are the future frontiers in instrument development. Lewis believes a key development to enable that will be a move away from the turbulent fan oven to resistive low-thermal-mass columns. "This would bring with it the intrinsic reduction in instrument size, power consumption and potential field-ability," he says. "It would stimulate activity researching new small-scale detectors and introduce GC to new markets." Others noted that miniaturized MS detectors will also play a key role in the evolution of portability.

Janssen and Seeley both foresee important developments in fluidics and column coupling. "Microfluidics will be used to direct gas flows and compounds to additional columns for selectivity tuning and to bypass contaminated column segments," predicts Janssen. For those hardware configurations to be adopted, Seeley adds, user-friendly software will be needed to help manage the additional degrees of freedom.

Both Janssen and Stearns foresee the transformation of the current "boxes" into computer-controlled, component-based systems that might even make it possible to change columns from the keyboard.

Dorman takes that idea one step further, envisioning that the development of a GC system utilizing a modular column could revolutionize GC. "If the column were able to be automatically installed by the instrument, and encoded with its own performance specifications, then the instrument could even diagnose when the column needed to be changed, and perform this operation automatically," he says. "This would further simplify the technique, and aid in an area where operator skill can be most problematic."

Marriott, in turn, believes the first major development on the horizon is a much greater integration of multidimensional GC into the standard operating process of GC instruments. "Comprehensive GC has revealed that sample complexity — whilst often suspected, but also often overlooked — demands a new paradigm in the practice of routine GC," he says. "Having special devices (such as microfluidic Deans switches) that the experienced user has to fit to a gas chromatograph does not make the technology widely available." So if GC manufacturers are serious about their interest in multidimensional GC, he says, they will have to produce smart methods that simplify the process for the general GC user.

Other answers were more about wishes than predictions. The Messina group would like to see an improvement in the sample injection process, namely the capability to inject a very narrow chromatographic band.

Hinshaw dreams of a portable device that combines Raman backscatter with a micro GC–MS system and chip-LC. "That would come close to the apocryphal tricorder," he says.

Acknowledgment

I would like to extend a special thank you to Colin Poole for serving as the chair of this GC Instrumentation section. Poole is a professor of chemistry at Wayne State University in Detroit, Michigan, USA, and a member of the Editorial Advisory Board of LCGC.

This article was first published in the 30th Anniversary issue of LCGC North America in August 2012.
