During the past year, LCGC examined current trends in the application of liquid chromatography (LC), gas chromatography (GC), and related techniques
in environmental, food, forensics, and pharmaceutical analysis. This article presents some developments made by separation
scientists working in these application areas and offers insights into the current trends in each field.
Our interviews with separation science experts in specific application areas, such as environmental, food, forensics, and
pharmaceutical analysis, have provided the LCGC audience with insights into what's going on in those fields. Here, we have excerpted several recent interviews that were
published in our application-focused newsletters and our digital magazine, The Column.
Detecting Pharmaceuticals in Water
LCGC recently spoke with Edward T. Furlong of the Methods Research and Development Program at the National Water Quality Laboratory
of the U.S. Geological Survey (USGS) about his group's work on developing new methods to detect pharmaceutical contaminants in water.
What prompted you to develop a direct-injection high performance liquid chromatography (HPLC) tandem mass spectrometry (MS-MS)
method to determine pharmaceuticals in filtered-water samples?
Furlong: For more than a decade the presence of pharmaceuticals and personal-care products (PPCPs) in the aquatic environment has
been a topic of increasing interest to the public and to scientists. A simple search of the terms "antibiotics, drugs, or
pharmaceuticals" in the six environmental science journals that have published the bulk of papers on this topic would have
resulted in 18 to 25 papers total in any one year between 1998 and 2002; in 2002 that number increased to over 40, and in
2012 the number of publications in those six journals was over 350. We expect this publication trend to continue.
Concurrently, mass spectrometers have become more and more sensitive, so that triple-quadrupole instruments can now be routinely
used to detect many small molecules, including most pharmaceuticals, at subpicogram amounts (on column). This level of performance
suggested that a direct-injection HPLC–MS-MS method was possible at the nanogram-per-liter concentrations typically observed
in natural aquatic environments.
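A back-of-the-envelope calculation shows why subpicogram on-column sensitivity makes direct injection feasible at these levels. The 100-µL injection volume below is an assumed, illustrative figure, not a parameter of the published method:

```python
# Illustrative check: mass of analyte delivered to the column when
# water at a typical ambient concentration is injected directly,
# with no preconcentration step. Numbers are assumptions.
injection_volume_uL = 100.0        # assumed direct-injection volume
concentration_ng_per_L = 1.0       # typical ambient level, ~1 ng/L

volume_L = injection_volume_uL * 1e-6          # convert uL to L
mass_ng = concentration_ng_per_L * volume_L    # (ng/L) x L = ng
mass_on_column_pg = mass_ng * 1e3              # 1 ng = 1000 pg

print(mass_on_column_pg)  # ~0.1 pg, within subpicogram detection limits
```

Under these assumptions, a nanogram-per-liter sample delivers roughly a tenth of a picogram on column, comfortably within the detection range Furlong describes.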
Finally, as ecotoxicologists and other environmental scientists have studied the effects of exposure to single pharmaceuticals
and pharmaceutical mixtures on fish and other aquatic life, both in the laboratory and in the field, significant sublethal
effects, often behavioral in nature, have been demonstrated at the ambient parts-per-trillion concentrations that are routinely observed.
Thus, we saw that there was a compelling need to develop a single method that could comprehensively, sensitively, and specifically
identify and quantify pharmaceuticals in environmental samples. In addition to sensitivity and specificity, we hoped to include
the widest range of pharmaceuticals, particularly human-use pharmaceuticals that researchers at the USGS might expect to encounter
in samples from across the range of water types and sources present in the United States. Our final list of analytes for the
method was the result of our own understanding of prescribing trends, our knowledge of the available scientific literature,
and input from and collaboration with our USGS colleagues, particularly the USGS's Toxic Substances Hydrology
Program-Emerging Contaminants Project and the National Water Quality Assessment.
What kind of environmental impact do you expect this method to have?
Furlong: We hope the method itself is environmentally friendly, since it reduces the consumables and disposal costs associated with sample
preparation, along with the carbon footprint associated with sample collection and transport.
More broadly, I think having this method available to our USGS and other federal, state, and university collaborators
will allow scientists to more comprehensively "map" the distributions, compositions, and concentrations of pharmaceuticals
in US water resources. The method will become part of national-scale monitoring, such as that already initiated by the
USGS Toxics Program, and is now being incorporated into the USGS's National Water Quality Assessment Program
and into more regionally or locally focused studies conducted by the USGS's state-based Water Science Centers.
The method also will have a major impact on the quality and depth of the applied research being conducted to elucidate the
sources, fates, and ultimate effects of pharmaceuticals and other emerging contaminants, particularly that undertaken by the
USGS Toxics Emerging Contaminants Project and its many collaborators. That project will apply the method at some key long-term
research sites and in specific projects aimed at providing a more detailed, hydrologically grounded understanding of the effects
of these compounds on ecosystem and human health.
One such project is a joint study conducted by the USGS and the US EPA in which we sampled the
source and treated waters of 25 municipal drinking water treatment plants (DWTPs) across the United States for pharmaceuticals
and other compounds that the US EPA classifies as contaminants of emerging concern (CECs). This study, which is being prepared
for publication, will provide insight into the compositions and concentrations of pharmaceuticals and many other CECs entering
DWTPs and their subsequent removal or reduction during treatment.
The Long-Term Impact of Oil Spills
Chris Reddy from the Woods Hole Oceanographic Institution spoke to LCGC about the role of chromatography in the ongoing environmental analysis of the Deepwater Horizon oil spill and how comprehensive
GC×GC works in practice.
Tell us about your group's involvement in the work at the Deepwater Horizon disaster site. What were the objectives?
Reddy: In April of 2010, the Deepwater Horizon (DWH) drilling rig exploded and released approximately 200 million gallons of crude oil along with large quantities of methane,
ethane, and propane.
It was an ongoing spill for 87 days, and we have continued to find oil residues along the Gulf beaches as recently as November
2013. We have studied, and continue to study, a wide range of research questions, from determining the flow rate and analyzing
how nature breaks down or "weathers" the oil, to fingerprinting samples to confirm that the oiled material we have found came
from the DWH disaster.
Our field work has included collecting samples with a robot right at the pipe where the oil was escaping, which you may
have seen on TV at the time. We have also walked many miles of the Gulf of Mexico coastline, even 300 or 400 miles away from
the explosion. So we have gone from analyzing oil samples a foot away from the source of the spill to hundreds of miles away.
I expect to be working at the site for the next 10 years, alongside work on other oil spills and projects.
One of the main techniques you used was comprehensive two-dimensional gas chromatography (GC×GC). Why did you choose this technique?
Reddy: My team has extensive experience in tackling interesting research questions with GC×GC by studying numerous oil spills
as well as natural oil seeps. What makes GC×GC so powerful is its capacity to resolve and detect
many more compounds than traditional analytical techniques, such as gas chromatography with mass spectrometry (GC–MS).
Now, I want to be very clear here. A lot of people hear me say GC×GC can do more than GC–MS, and they immediately assume that
GC×GC is a replacement for GC–MS, but it is not. It is just another tool in the laboratory that allows us to address specific
questions where a regular benchtop GC–MS cannot.
On the other hand, for polycyclic aromatic hydrocarbons (PAHs), I don't think GC×GC can do any better than what a benchtop
GC–MS can do and the GC–MS software is much, much more user-friendly. And so, in my lab, we don't quantify PAHs by GC×GC.
There's no point. It's easier and faster to do so with a GC–MS.
One of the main factors that makes GC×GC so powerful, which I think a lot of people overlook, is that when you look at
a chromatogram in two-dimensional GC space you are not only able to identify and measure compounds (many, many more compounds
than with traditional techniques), but can also convert retention times in the first and second dimensions to vapor pressure
and water solubility.
If you're interested in the fate of oil, there are two key things you want to know: What is the vapor pressure of a compound,
and what is the water solubility of the compound?
Now if we use some newly developed algorithms, we can allocate how much of a compound evaporated versus how much dissolved
in water. That is the real major leap in my mind: GC×GC allows us to discover where the compounds are going. It's beyond just
making your Excel spreadsheet bigger and identifying many more compounds. It allows us to say where a compound is going,
or where it has gone.
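The idea Reddy describes can be sketched in code. Everything below is illustrative: the linear retention-to-property mappings, the proportional loss split, the function names, and all coefficients are invented placeholders for the published calibrations and mass-transfer models, not the actual algorithms his group uses:

```python
# Toy sketch of the GC x GC weathering idea: map retention times to
# physical properties, then split a compound's observed loss between
# evaporation and dissolution. All coefficients are made up.

def log_vapor_pressure(rt1_min: float) -> float:
    """Hypothetical map from first-dimension retention time (min) to
    log10 vapor pressure; later-eluting compounds are less volatile."""
    return 2.0 - 0.12 * rt1_min  # placeholder coefficients

def log_water_solubility(rt2_s: float) -> float:
    """Hypothetical map from second-dimension retention time (s) to
    log10 aqueous solubility."""
    return -1.0 - 0.9 * rt2_s    # placeholder coefficients

def allocate_loss(fraction_remaining: float, rt1_min: float, rt2_s: float):
    """Split the fraction of a compound lost from weathered oil between
    evaporation and dissolution, in proportion to its estimated vapor
    pressure and water solubility. Returns (evaporated, dissolved)."""
    lost = 1.0 - fraction_remaining
    p_vap = 10 ** log_vapor_pressure(rt1_min)
    s_aq = 10 ** log_water_solubility(rt2_s)
    # Toy proportional split; real algorithms use full mass-transfer
    # models rather than a simple ratio of property estimates.
    evaporated = lost * p_vap / (p_vap + s_aq)
    dissolved = lost * s_aq / (p_vap + s_aq)
    return evaporated, dissolved

# Example: an early-eluting (volatile) compound that has lost 60% of
# its mass is attributed mostly to evaporation.
evap, diss = allocate_loss(0.4, rt1_min=10.0, rt2_s=0.5)
```

The design point is the one Reddy makes: because both retention dimensions correlate with physical properties, a GC×GC peak position carries fate information, not just identity, so the same chromatogram that identifies a compound can also suggest where it went.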