A panel of experts gives its opinions on current trends in high performance liquid chromatography (HPLC) instrumentation and how the technology could develop in the future.
Laura Bush, editorial director of LCGC North America and LCGC Europe, recently organized a panel discussion on current trends in high performance liquid chromatography (HPLC) instrumentation and how the technology could develop in the future.
A panel of experts (listed in the sidebar) was asked to discuss the most important developments in HPLC instrumentation over the past two decades. Ultrahigh-pressure liquid chromatography (UHPLC) — with its small-particle columns, ultralow-dispersion fluidics, low dead volumes and ability to achieve fast separations — topped the list.
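The pressure penalty that comes with small particles follows from the standard column pressure-drop relation — textbook background added here for context, not a result cited by the panel:

```latex
% Darcy-type pressure drop across a packed column:
%   \phi  = flow resistance factor (dimensionless)
%   \eta  = mobile-phase viscosity
%   L     = column length
%   u     = mobile-phase linear velocity
%   d_p   = packing particle diameter
\Delta P = \frac{\phi \, \eta \, L \, u}{d_p^{2}}
% Because \Delta P scales as 1/d_p^2, halving the particle size at
% constant length and velocity quadruples the operating pressure --
% hence the need for dedicated ultrahigh-pressure hardware.
```

This inverse-square dependence is why moving from 5-µm to sub-2-µm particles pushed instruments into the 15,000-psi range discussed below.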
LC Instrumentation Panel of Experts
"In many industries, UHPLC has opened up new approaches to working practices in terms of speed of analysis and ability to better 'fingerprint' samples," said Paul Ferguson of The Chromatographic Society in the UK.
Richard A. Henry of the Pennsylvania State University agrees that UHPLC was a significant development, but sees important downsides to it. "The current benefits of UHPLC instruments may be offset by reliability issues, high instrument pressure drop, loss of temperature control, and more," he says. "And the cost of doing HPLC has gone up very dramatically."
Mansoor Saeed of Syngenta feels the most important developments were two of the enabling technologies for UHPLC. One is the design of pumps that made it possible to increase operating pressures to 15,000 psi, reduce dwell volumes to below 45 µL and improve solvent blending to refine gradient operations. The other was increasing detector sampling rates from 10 Hz to more than 100 Hz. "This has led to the emergence of eleven different instruments from ten manufacturers in the HPLC and UHPLC segment, each with a different pressure range," he says.
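The link between sampling rate and the narrow peaks of UHPLC can be made concrete with a common rule of thumb — an illustrative estimate, not a figure given by the panel: reliable integration needs roughly 20 data points across a peak, so the required frequency scales inversely with peak width:

```latex
% Rule-of-thumb minimum detector sampling frequency:
%   N = target number of data points per peak (commonly ~20)
%   w = peak width at base, in seconds
f \;\geq\; \frac{N}{w} \;\approx\; \frac{20}{w}
% A 2-s-wide conventional HPLC peak thus needs roughly 10 Hz,
% while a 0.2-s UHPLC peak needs on the order of 100 Hz --
% consistent with the jump in sampling rates described above.
```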
Mass spectrometry (MS) detection (with the introduction of electrospray ionization, ESI), was cited as another key development that greatly expanded the applications of LC, especially in the life sciences. Many pointed to not just the sensitivity of LC–MS but also its current ease of use, which has made it a standard analytical tool.
"The availability of effective, commercially available and reliable LC–MS instruments that can be used routinely by chromatographers rather than mass spectrometrists was important," said Bob McDowall of McDowall Consulting. "These instruments provide selectivity and sensitivity, making them instruments of choice in areas where these parameters are required."
Atmospheric-pressure ionization techniques were also singled out several times. "While not quite a universal detector, atmospheric-pressure MS has made it possible to measure compounds at increasingly low levels with relatively greater ease," says Howard Hill of Huntingdon Life Sciences. "It also provides the possibility of increased throughput, even though the latter has not always been exploited."
Data processing and systems integration were high on the list for a number of our experts. Mel Euerby, of Hichrom Limited and Strathclyde University, particularly values the development of robust, automated method development platforms, including column- and solvent-switching technology coupled with triple detectors (UV, MS, and charged aerosol detection [CAD]) and retention modeling software programs.
Mark Argentine of Eli Lilly and Company said the development of highly flexible and precise variable-volume, programmable autosamplers approximately two decades ago had a tremendous impact on laboratory efficiency and analytical data quality. "While we take such technology for granted in most analytical equipment today, tools such as these have enabled analysis of precious (microvolume) samples in a wide variety of applications, including high-throughput screening and analysis of biological materials," he said.
Ferguson feels the introduction of reagent-free ion chromatography (IC) was a key development. "This instrument, co-developed by Paul Haddad's group at the University of Tasmania and Dionex, has greatly simplified the way analysts can perform IC, and importantly, transfer these methods across organizations," he said.
Panel co-chair Tony Fell, of the University of Bradford, stressed the important achievement of chiral separations — for chiral analytes and biological targets, with diverse reversed-phase and normal-phase packings coupled with sensitive MS detection.
Fell also noted an important underdeveloped area. "The lack of sensitive detectors based on optical activity — such as circular dichroism and polarimetry — is a major disappointment," he says.
Clearly MS detection has been a critical recent development. So what do our panelists see for the future of detection technology?
"Mass spectrometry increasingly will become a standard detector for any lab environment, from research down to QA/QC," predicts Wim Decrop of Thermo Fisher Scientific. "MS will continue to improve in terms of ease of operation, data acquisition times, sensitivity, and resolving power."
"Detector improvements will still revolve around MS, such as advances in high-resolution MS and the ability to rapidly and automatically optimize these systems, enhancing their appeal," says Hill. But for these developments to have maximum impact, he says, parallel developments in software, computer storage, and data accessibility will be essential.
Both Euerby and Ferguson expect to see smaller footprints and greater portability for LC–MS instruments. That, adds Euerby, should also lead to the development of more affordable, pared-down mass spectrometers for certain applications — such as single-quadrupole MS systems for fast scanning.
Progress is also anticipated with other types of detection, particularly evaporative light scattering detection (ELSD) — useful for large molecules — and CAD coupled with flow-through detectors to enhance sensitivity and identification with appropriate solutes and mobile phases. Panelists see the potential for increased scan speeds, reduced dispersion and better linear responses with these instruments.
Fell recalls that since his prediction of UV diode-array detection (DAD) as the standard LC detector some 30 years ago, this detection method continues to make a major impact. "In the past two decades, detection has improved the most with diode-array UV spectrometers and MS, making major contributions to lowering detection limits and providing solute identification," noted Henry.
Gordon Marr of Patheon, a pharmaceutical contract research and manufacturing organization, sees a need for more development in in-line chiral detection technology, as noted above by Fell. "This technology has been available for more than 20 years, but because it is treated like a premium market, little progress has been made," he says.
Decrop, for his part, foresees that electrochemical detection will become more compatible with UHPLC. "By improving ease-of-use and integration into LC systems, multichannel electrochemical detection has a great potential to become a standard LC detector," he says.
But the quest continues for the holy grail of a universal detector — one that will respond equally to all analytes regardless of physicochemical properties.
"Such a detector would be able to provide a relative measurement in metabolic routes, particularly where no synthetic standard is available, and avoid the requirement for a radiolabeled analogue," anticipates Saeed. "The detector technology would require properties similar to chemiluminescent nitrogen and corona aerosol detectors but with high dynamic range." One possible way to get there, he continues, would be to use novel ionization sources in LC–MS that can impart ionization to components of interest and reduce matrix effects.
Ferguson believes the path to a universal detector may be through the development of flame ionization detection (FID) compatible with liquid chromatography.
Fell, however, foresees another route to universality. "Perhaps this will be based on ELSD or CAD, but only when the technical issues with response linearity, dead-volume, and reliability have been resolved."
The panel showed broad agreement that MS detection was a critical development, and that this and other detection methods will continue to develop. But what is the current status of hyphenated LC techniques?
Saeed notes that dramatic advances — in sensitivity, ease of use, the wide application to the analysis of large and small molecules, as well as the development of suitable software for identification of proteins and metabolites — have greatly expanded the use of hyphenated technologies, particularly those that couple chromatography to high-resolution accurate mass spectrometry. The major bottleneck, he says, is data generation, increasing the demand for automated data processing software.
In bioanalysis, says Hill, all MS technologies play a major role and are spilling over to the formulation analysis field, where they are finding a role as a replacement for DAD. "This is certainly true in the method development lab, if not yet in routine analysis," he says.
LC–MS is a mature technique and has migrated from an expert system to an open-access, walk-up format in many laboratories, says Argentine. He also notes that software advances have greatly advanced the productivity and utility of HPLC–MS systems for routine use in the analytical laboratory. "These tools afford opportunity for advanced analyses of extremely complex sample matrices, including biological samples," he says.
The panelists also see strong potential for multidimensional, or 2D, LC–MS, in which two different, or orthogonal, columns typically are used to achieve greater resolution of complex samples, particularly in fields such as bioanalysis, proteomics and biopharmaceutical drug development.
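The resolution gain the panelists refer to is usually expressed through the peak-capacity product — a textbook idealization, added here for context, that assumes fully orthogonal dimensions and no undersampling:

```latex
% Ideal peak capacity of a comprehensive 2D separation is the
% product of the peak capacities of the two dimensions:
n_{c,\mathrm{2D}} \;\approx\; n_{c,1} \times n_{c,2}
% e.g., dimensions of capacity 100 and 50 give a nominal 2D peak
% capacity of 5000. In practice, partial correlation between the
% two retention mechanisms and undersampling of first-dimension
% peaks reduce the realized value well below this ideal.
```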
"Two-dimensional HPLC instrument platforms are relatively new but have the potential to provide significant impact, particularly for complex biological separations and stressed degradation studies of pharmaceutical compounds," note Argentine and his colleague Todd Maloney of Lilly.
"Targeted heart-cutting, which can be a high-speed sample preparation step prior to subsequent analysis, may actually have the highest potential for wider adoption, but is often overlooked," says Decrop. But, he notes, high-resolution UHPLC with its dramatically improved peak capacity has cannibalized part of the multidimensional application field.
"Multidimensional LC promises to be an exciting area of development in the next 10 years," says Euerby. "The use of LC–LC systems with two reversed-phase columns hopefully will be expanded to real two-dimensional LC (using two orthogonal columns, such as ion-exchange and reversed-phase columns) yielding phenomenal increases in peak capacity of separations." He also hopes this will be further expanded to combinations like supercritical fluid chromatography (SFC)–LC.
Peter Myers of the University of Liverpool also foresees continued development of 2D and 3D separations, but says new interface methods must be developed. "In GC, the 2D methods work because there is a focusing step between each dimension," he says. "Such a focusing step needs to be developed for LC. The simple valve switching used today is just not suitable."
Like Myers, Ferguson also sees potential benefits in dual analysis types such as SFC and LC in a single instrument, so that samples containing both highly polar and hydrophobic constituents may be analysed on the same platform. But such instruments will be more complex, he warns, so building additional robustness into these platforms will be essential.
Fell agrees that greater robustness is needed. "Hyphenation of orthogonal separation systems (LC–LC, SFC–LC) has demonstrated substantial benefits in solute selectivity and peak capacity, especially in bioanalysis, but remains to be packaged in a standardized format for robust industrial applications," he says.
Henry notes that HPLC has become more versatile since the introduction of carbon dioxide as a mobile-phase solvent, with improvements in speed and selectivity for both chiral and achiral solutes — experiments that could be described as "high-fluidity" HPLC.
We also asked our panelists about the status of in-line sample preparation technologies for HPLC in biological, industrial and environmental analysis.
"Automated in-line sample preparation stations have greatly enhanced laboratory productivity, particularly for sampling liquids, slurries and suspensions for biological and industrial applications," say Argentine and Maloney. "The ability to automate sample prep and bring the analytical instrument closer to the process or test system (on-line HPLC) will further promote advancements in process control."
Rainer Bischoff of the University of Groningen agrees. "Integrating sample preparation into the overall LC–MS workflow is critical not only for processing many samples in an unattended fashion, but also for achieving better reproducibility and robustness," he says. He also notes that since LC methods give insufficient resolution for truly complex samples, this presents the MS system with a classical dilemma — the balance between comprehensive analysis (fragmentation of all incoming molecular ions) and sensitivity. The overwhelming number of compounds ionized, plus matrix effects in the mass spectrometer, introduce measurement bias and sample-to-sample variation.
In-line separation technologies have seen successful development to provide more-or-less standardized systems for pre-LC cleanup, notes Fell. Still, their adoption is not widespread — they are common in bioanalysis but less so in other application areas — and challenges remain.
"In-line sample preparation is not used as much as it should or will be in the future," says panel co-chair Roy Eksteen of Sigma-Aldrich/Supelco. He and others noted that the impact of matrix effects has been a major challenge for such systems, particularly in fields such as environmental analysis.
Even in bioanalysis, Hill adds, in-line sample preparation technology has not spread as widely as might have been expected. "Indeed, it is perhaps the silver standard, the gold standard being no or minimal sample preparation at all," he observes.
Hill wonders if ion mobility spectrometry (IMS) — which can carry out separations in tens of seconds at relatively high levels of sensitivity — is likely to have greater impact on minimizing sample preparation requirements in the future. "While examples of IMS in bioanalysis in its purest form are few, a derivative form, field asymmetric ion mobility spectrometry (FAIMS), is more widely used as part of a hyphenated technique (such as LC–ESI-FAIMS-MS–MS) post the source, where it can be used as a partial clean-up, minimizing extraneous material entering the mass spectrometer," he says. He acknowledges, however, that this method is likely to remain a niche enabling technology that makes the MS–MS process more efficient.
Kerry Nugent of Bruker, in turn, believes in-line sample preparation for LC will become increasingly popular as a way to purify and simplify complex samples, as such systems become more automated and easier to use.
The technology for solvent gradient elution systems has become more robust in recent years, and, as Decrop points out, is now widely practiced in biopharmaceutical analysis.
Important limitations remain, however. "In routine pharmaceutical analysis, solvent gradient elution systems are generally regarded as methods of last resort, because of the lack of transferability across company and international boundaries," says Fell.
McDowall puts it more bluntly: "Why would you want to use gradient elution for a routine analytical procedure?" he asks rhetorically. "Only if you were forced to because you could not get an effective isocratic separation."
Euerby, however, believes that gradient HPLC can be developed and performed reliably in routine use in the hands of experienced operators. The problem, he says, is that few practitioners know how to use it properly or understand the theory behind it.
Nonetheless, solvent gradient methods have gained broad acceptance in protein and peptide separations. According to Decrop, many scientists currently also are evaluating pH-gradient ion-exchange chromatography for biopharmaceutical analytes.
In bioanalysis, adds Hill, gradient-elution systems are used as adjuncts to sample cleanup, to remove late-eluting peaks and allow rapid separation of closely related or interfering peaks. "While reproducibility of gradients has always been an issue, the use of modular systems with minimal user plumbing choices has certainly improved the situation," he notes.
Thinking more broadly about the mobile phase, Ferguson advocates more research into greener separation technologies. UHPLC, he points out, is a key enabler to this approach, as it uses significantly less mobile phase (and thus organic solvent) than conventional HPLC. He refers to research into the addition of carbon dioxide into reversed-phase HPLC mobile-phase systems ("enhanced fluidity chromatography") as well as work using high temperatures for separations that are faster, use less solvent, or have alternative selectivity. "If these extremes could be pushed further, supercritical water separations might be possible, and would provide an alternative approach for eluting a wide polarity range of compounds in a very green manner," he concludes.
Henry also sees promise in such approaches: "The ability to chill carbon dioxide and deliver it as a nonpolar solvent may turn out to be very important and popular," he says. "Carbon dioxide is an environmentally friendly solvent that has properties similar to low-molecular-weight hydrocarbons, yet blends easily with useful polar solvents like methanol and acetonitrile. It also interfaces easily to common detectors."
Marr, however, would rather see more effort put into isocratic methods. "Developments in gradient-elution methods would be advantageous, but I would settle for better isocratic method development," he says.
There is little doubt that UHPLC has expanded the technological boundaries of separation science. "The remarkable selectivity, speed and applicability of UHPLC with current sensitive detection systems indicates that it is likely to remain a useful and expanding resource in the analytical armory," says Fell.
Yet in spite of its advantages, there continue to be barriers to wider adoption of UHPLC.
An important consideration in the pharmaceutical world, says Marr, is that few companies want to risk delaying product approval by including new analytical technologies in a regulatory filing. "While we have invested in and continue to explore the technology, the first question often asked is, 'Have any [pharmaceutical] products been approved using this technology?'" he says. "No one wants to be first to adopt a new technology. Everyone wants to be second."
Another challenge with UHPLC, says McDowall, is that data handling has not kept up. "Only by enabling electronic data handling will the real potential of UHPLC be realized," he says. "However, most chromatography data systems on the market are not designed for effective electronic workflows."
Instrument robustness is also a concern with UHPLC. "HPLC instruments have become so sophisticated that maintenance and downtime have become major concerns again," laments Henry. "We have also moved beyond the point where the average user can easily modify or maintain a UHPLC instrument."
Because of the challenges with UHPLC, many chromatographers prefer to use core–shell particles, which can provide sensitivities and resolution close to those of UHPLC but at lower pressures.
As a result, observes Eksteen, the future of UHPLC is not as bright as it once was. "The use of core–shell particles may bring us back to what was normal in the 1980s and 1990s, when most practitioners rarely operated at more than 40% of the pressure capability of the HPLC instrumentation," he says.
Many separation scientists even object to making a strong delineation between UHPLC and HPLC, seeing one as just an extension of the other. "UHPLC is nothing new," says Euerby. "It's just that we have expanded the working operating parameters."
We also asked our panelists about the competitive position of HPLC and the various modes of capillary electrophoresis (CE) for regulatory applications, such as in the pharmaceutical industry. All agree that HPLC dominates the field.
HPLC is more robust and easier to use, making it superior to CE, says Decrop. "Method development is more straightforward in HPLC than in CE, and CE typically requires more user experience than HPLC," he adds.
Hill agrees. "There are few drivers for CE to replace LC–MS in bioanalysis," he says. "Its lack of robustness and the difficulty in using it routinely with high sample throughput have all militated against it. HPLC will reign supreme for the foreseeable future."
Another important barrier to CE adoption is that changing analytical methods in regulated environments is never easy or quick. "There is such a lag in the regulatory area that I think we will see HPLC as the major force for a number of years," says Myers.
McDowall agrees. "When an analytical procedure is cast in stone (in the European Pharmacopoeia or the United States Pharmacopeia) there is little incentive for a laboratory to change, because if legally challenged, the pharmacopeial method is always right," he explains.
Nevertheless, this is starting to change. The gradual shift in focus in drug development from small to large molecules has greatly expanded the use of CE methods (capillary gel electrophoresis, capillary isoelectric focusing, and immunoaffinity capillary electrophoresis) over the last decade, says Saeed, as an alternative separation technique and in some cases as a direct replacement to slab gel electrophoresis and HPLC. "With CE methods in pharmacopeial monographs and gaining FDA approval, it is clear CE has a major part to play in the delivery of biologic regulatory packages," he concludes.
Overall, concludes Fell, it seems that CE and its various modalities have a niche role to play. "These applications will include the analysis of chiral analytes and provide an orthogonal approach to the validation of peak purity in method development, complementary to the insight provided by atmospheric-pressure ionization MS and its variants," he says.
We also asked about the future role of capillary electrochromatography (CEC) in separation science as a whole. In the late 1990s, CEC certainly looked promising, but then its progress stalled.
Ferguson explains that the method suffered from issues in the reproducible production and physical stability of columns (particularly the retaining frits), and from a fundamental problem: the mechanism that generates the electroosmotic flow (underivatized silanol groups on the packing surface) leads to poor chromatographic peak shape for cationic analytes.
Euerby agrees, noting that the experts got good results with CEC, but the instrumentation and capillary columns were just not robust enough to be transferred to routine use.
Argentine and Maloney share that view, adding that the technique also suffers from lack of a dedicated commercial instrument platform.
Myers, nevertheless, sees some hope for the future of the technology. "Today things are changing," he says. "New detectors such as total on-column detection and MS, together with new support and electrode systems, are starting to deliver research results."
Euerby is also hopeful. "It is hoped that in the future CEC may rise like a phoenix in the form of microfabricated chip CEC technology where most of the disadvantages will be eliminated," he says. Argentine and Maloney anticipate a similar path. "The future of CEC will likely be in a lab-on-a-chip format," they predict.
Our panelists also considered the maturity of "lab-on-a-chip" technology. Its primary benefit, of course, is that it minimizes sample volume. "That directly contributes to the well-being of patients and test animals alike," notes Eksteen.
Decrop noted, however, that the current performance and robustness of lab-on-a-chip technologies do not surpass the performance of current nano-UHPLC systems.
"'Lab-on-a-chip' has failed to meet its promises and is currently only used in niche applications where small sample sizes (such as single-cell analysis) and volumes are critical to the assay," suggests Nugent.
Ferguson adds that there are significant limitations in detection with these systems. "Detection is typically limited to MS and fluorescence, the latter often requiring analyte derivatization," he says.
Bischoff agrees, adding that chip LC–MS systems would benefit from an "on-chip" UV or fluorescence detector. "This is commonly done in microfluidics but seems not to be transferred to the LC world," he says.
Multidimensional separations have become part of the routine workflow in nano-UHPLC, adds Decrop. "Chip-based workflows typically imply limited versatility in comparison with column-based multidimensional workflows," he says. "Chip-based systems nevertheless show promise for multidimensional separations providing ease of use for dedicated workflows."
Marr notes that just like with UHPLC, lab-on-a-chip adoption also suffers in the pharmaceutical environment because end users don't want to stick their necks out. "It suffers from the same 'no one wants to be first, but everyone wants to be second' conundrum in regulated laboratories."
"It will take another decade for lab-on-a-chip technology to clearly differentiate itself from micro versions of traditional HPLC systems," agrees Eksteen.
Nevertheless, work is ongoing. "The technology continues to mature in an impressive way," says Fell.
Lastly, we asked our panelists if they foresee any significant developments in lab-on-a-chip technology that would extend its range of applications.
Some advances, such as reducing the size of the instrumentation used with it, and improvements in automated data interpretation, might extend its application range and utility, our panelists said. Potential examples include use in a doctor's office to aid patient diagnosis and remote environmental applications by nonexperts.
"Professor George Whitesides at Harvard University is currently working on paper-based lab-on-a-chip diagnostic tools that could be fabricated simply in the third world using common machinery like dot-matrix printers," notes Ferguson. In that system, capillary wicking is used to move fluids around the paper chip, which can contain enzymes or chemicals to produce a simple visual response.
Euerby and Henry think the method will be used mainly for specialized applications with large numbers of samples, where end users collaborate with the chip manufacturers to produce expensive customized systems. "Probably the first lab-on-a-chip instrumentation to the market will be generic chips for high-throughput sample purity determinations in the drug discovery arena," Euerby predicts.
Myers, in turn, feels the real benefit of lab-on-a-chip technology will be in its capability of delivering a "total solution" concept. "Chips will be supplied that contain a computer, memory and of course the column," he predicts. "They will be plug-and-play devices that will deliver a total black box solution to the standard problems."
Development in this technology would advance faster if a "killer application" were found for it, observes Nugent. "That would help to fund research into better sample handling, instrumentation and separation devices, which would then open it up to more analytical applications," he says.
Eksteen, however, thinks flow injection on a chip is more likely to be commercialized before HPLC on a chip.
LCGC North America and LCGC Europe would like to extend a special thank you to Tony Fell and Roy Eksteen for serving as the co-chairs for this special section on LC Instrumentation. They provided extensive guidance throughout the entire process, by forming the panel, developing the questions and aiding in preparing the manuscript from the excellent responses received. Fell is an Emeritus Professor of Pharmaceutical Chemistry at the University of Bradford (Bradford, UK), and Dr Eksteen is the Biopolymer Separations Market Segment Manager at Sigma-Aldrich/Supelco. Both are members of the Editorial Advisory Board of LCGC North America and LCGC Europe.
This article was first published in the 30th Anniversary issue of LCGC North America in August 2012.