Following the publication of “Analysis of the State of the Art: GC Instrumentation” in the August 2012 issue of LCGC, a reader notes further factors to consider, including poor user training, unanswered questions in fast GC, and the need for further instrument improvements, particularly better deactivation of the internal surfaces to minimize loss of trace samples.
In reading the article “Analysis of the State of the Art: GC Instrumentation” in the August 2012 issue of LCGC (L. Bush, LCGC 30, 656–663), I noticed a few things that were missed.
Over the past 30 years, gas chromatography (GC) and GC–mass spectrometry (MS) instrumentation have become much more user friendly and reliable.
The issue is that while it is now possible for anyone to use these instruments, and indeed a great many “anyones” are using them, many users do not understand how the systems really work, what affects the results, or how to verify and validate their results. There are many ways to produce a bad analysis and usually only one or two ways to produce a good, reproducible one.
Back in the day, when instruments were as quirky as the scientists that ran them, the scientists verified that everything was working properly and that the data looked right and made sense.
Now that results are spit out as automated reports across networks, from instruments fed by personnel who have never darkened the door of an instrumental analysis class, let alone a chromatography or MS class that dealt with the instrumental issues that affect results, good luck. I am not saying this is always the case, but it is the case too often. It is especially true with autosampling systems, where the samples are loaded, the queue is set, and the start and data-processing functions run through automatically without any review of baselines, peak fits, and so forth. Even worse is when the user has never taken a basic organic chemistry class, or any chemistry class at all.
Another issue: How many points do you really need across a GC peak for reliable peak integration? This is a big unanswered question in fast GC. How fast a time constant does the detector circuit need for fast GC? What is the settling time? Have these questions been addressed in the GC systems you are using for microbore or other fast GC work?
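The data-rate and time-constant questions above can be put on a back-of-the-envelope footing. The sketch below assumes, as a common rule of thumb not stated in the letter, that you want at least 10 points across a peak's width at base, and that the detector time constant should stay below about one-tenth of the peak's standard deviation to avoid visible broadening; the function names and the 0.5-s example peak are illustrative.

```python
# Rough arithmetic for the fast-GC sampling questions. Assumptions
# (not from the letter): >= 10 points per peak at base, and a detector
# time constant tau <= sigma/10, where base width ~ 4 sigma.

def min_sampling_rate_hz(peak_base_width_s, points_per_peak=10):
    """Minimum data rate needed to place `points_per_peak` samples
    across a peak of the given width at base (in seconds)."""
    return points_per_peak / peak_base_width_s

def max_time_constant_s(peak_base_width_s):
    """Largest detector time constant that keeps broadening small,
    using the heuristic tau <= sigma/10 with base width ~ 4 sigma."""
    sigma = peak_base_width_s / 4.0
    return sigma / 10.0

# Example: a 0.5-s-wide fast-GC peak demands a 20 Hz data rate and a
# time constant of roughly 12 ms; a conventional 200-ms electrometer
# filter would smear such a peak badly.
rate = min_sampling_rate_hz(0.5)   # 20.0 Hz
tau = max_time_constant_s(0.5)     # 0.0125 s
```

The point of the exercise is that as peaks narrow from seconds to fractions of a second, both the acquisition rate and the analog filtering of the detector electronics must be re-examined, not just the column and oven.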
In addressing future improvements in GC technology, better deactivation of the internal surfaces of injectors, detectors, and other components is necessary to minimize loss of trace samples. In the aerospace industry, the composition of trace components in very small samples is often critically important. When you have a total of 1 µg of sample and need to verify the presence or absence of a fraction of a thousandth of that amount, you cannot afford nonspecific adsorption in your system. If it is present, you had better know why, where, and what its impact is.
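The arithmetic behind that adsorption budget is worth making explicit. The 1-µg sample and the one-thousandth target fraction come from the letter; the function name and the framing of surface loss as a fraction of the target are my own illustration.

```python
# Illustrative arithmetic for the trace-adsorption point above.
# The 1 ug total and 1/1000 fraction are from the letter; the rest
# is a hypothetical sketch of the loss budget.

def analyte_mass_ng(total_sample_ug, target_fraction):
    """Mass of the target analyte in nanograms, given the total
    sample mass (ug) and the analyte's fraction of that total."""
    return total_sample_ug * 1000.0 * target_fraction

# A 1/1000 fraction of a 1 ug sample is only 1 ng of analyte.
target = analyte_mass_ng(total_sample_ug=1.0, target_fraction=1e-3)

# If active sites on the liner, transfer lines, or detector retain
# even a nanogram, the measurement is compromised: an analyte that
# is present can appear absent, which matters when absence itself
# is the result being reported.
```

Seen this way, "better deactivation" is not a convenience feature; for sub-microgram samples it is the difference between a valid negative result and an artifact.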
These issues extend into the planetary science community, where the performance capabilities of the Viking lander’s GC–MS system are still being argued. Does anyone know how to verify the capabilities of a 30-m packed capillary GC column across a known temperature profile, with a known fixed flow rate of hydrogen gas and a published procedure for creating the packing materials from commercially available materials?
John S. Canham, PhD,
Senior Staff Scientist, Systems Engineering Group, ATK Space Systems, Inc., Beltsville, Maryland