LCGC Supplements, Special Issues, 03-01-2018, Volume 31, Issue 3, Pages: 8–15

A snapshot of key trends and developments in the chromatography sector according to selected panelists from companies exhibiting at Analytica 2018.

LCGC: What trends do you see emerging in LC or LC–MS?

Hansjörg Majer: Hydrophilic interaction liquid chromatography (HILIC) broadens the range of target compounds to small polar molecules. Although HILIC has been discussed for a long time, the method itself has become more routine because of better tools to perform this technique with confidence. Comprehensive two‑dimensional liquid chromatography (LC×LC) will leave the research field and enter routine analysis if more tools are developed to make this powerful technology simpler.

Higher peak capacities can now be obtained by hyphenating multidimensional liquid chromatography with ion mobility spectrometry, and high-resolution mass spectrometry is driving the “omics” scene.

Ashley Sage: I see continued growth in the use of liquid chromatography with tandem mass spectrometry (LC–MS/MS) technology where previously other types of analytical instrumentation would have been preferable. This is because improvements in MS technology, in both hardware and software, allow highly selective and sensitive methods to be developed for applications including food safety, food authenticity, and security; environmental protection, including water quality testing; biopharmaceutical research and development; and clinical research, forensics, and “omics”-related research. The use of high-resolution accurate mass LC–MS in all these applications is a growing trend.
Atsuhiko Toyama: Automation has always been the trend in LC and LC–MS and has been driving the development of various “analyzers” that alleviate the need for method optimization and streamline the sample preparation workflow. I predict that this trend will continue, but with increased sophistication: instrument vendors are now beginning to provide fully automated, “sample-to-result” analytical platforms.

Kai Scheffler: On the LC side the trend towards ultrahigh-performance liquid chromatography (UHPLC) using sub‑2‑µm particles is ongoing, whereas for low‑flow applications the trend splits into two directions: towards very low (nanolitre-per-minute) flow rates and towards capillary-flow LC. Very low flow is required where applications demand extremely high sensitivity. The demand for increased sensitivity, throughput, and robustness has seen capillary-flow LC become more important because it provides increased MS sensitivity compared with typical analytical-flow LC–MS, with the additional advantages of lower solvent consumption and higher throughput while maintaining sensitivity similar to nanoflow LC. For the routine and quality control (QC) markets we see demand for increased productivity, robustness, reliability, and accuracy with high selectivity and sensitivity; this can be addressed by LC–MS, with a continuing trend towards the use of high-resolution accurate mass (HRAM) MS in these environments.

LCGC: What is the future of LC or LC–MS?

Hansjörg Majer: Nano-LC and capillary LC are the way forward for greener and faster liquid chromatography technology.

Fast targeted analysis using fast, specific columns is the way forward for routine analysis, while increasing peak capacity is the way forward for understanding complex samples.

The routine analyst will ask for automation of the complete workflow, including sample preparation, offered as a kit or as a dedicated analyzer.

Ashley Sage: The future of LC–MS is the continued growth and adoption of the technology to solve challenging analytical problems. Mass spectrometry development over the past 20 years or so has produced smaller, faster, more selective, and more sensitive instrumentation for analytical assays that were once considered extremely complicated. The analysis of multiple analytes with minimal sample preparation, or the identification of protein structure, is no longer a complex challenge. One of the biggest challenges going forward is processing the data and understanding what all of it means to scientists.

Atsuhiko Toyama: Although much effort has been put into simplifying data processing and review, these steps still remain far from fully automated, consuming a lot of time and analytical expertise. It is easy to envisage the implementation of artificial intelligence as an integral part of an analytical platform to assist data processing and review, as well as enabling self‑diagnosis, self‑tuning, and self-maintenance to further reduce human intervention in routine operation.

Kai Scheffler: In the future, LC will continue to be a separation technique of choice, and we expect to see increasing connectivity with MS, in particular for workflows in the routine applied markets and quality control. MS, and in particular HRAM MS, provides an additional level of confidence that most optical detectors simply cannot provide. In addition, with recent technological advances, both triple quadrupole (QqQ) and HRAM MS can now provide a level of sensitivity that opens doors for replacing other technologies, for example, costly immunoassays in clinical laboratories. This does, however, require very easy‑to‑use, robust instruments and workflows providing reliable, high-quality data, regardless of the user’s experience.

LCGC: What is the LC or LC–MS application area that you see growing the fastest?

Hansjörg Majer: Food safety continues to be one of the fastest-growing areas. Fast biomarker analysis will become a more prominent area in clinical applications when drug monitoring becomes more routine.

Multidimensional LC combined with multidimensional spectroscopy will drive all the “omics” sectors, and will also become important in other complex areas, including food fraud.

Ashley Sage: Several application areas are growing fast with the use of LC–MS. These include food safety (protection of the consumer from contaminants and adulteration of food ingredients and products); pharmaceutical development for health protection, particularly the design and development of new biopharmaceutical‑related therapeutic medicines; clinical research, especially related to disease identification and patient treatment; and metabolomics and proteomics, including the understanding of how human health is affected by external factors and how debilitating diseases, such as cancer and Parkinson’s, can be better understood to hopefully find better treatments, and potentially cures, in the future.

Atsuhiko Toyama: Multitarget screening and general unknown screening for food safety, toxicology, and drugs-of-abuse testing have been showing dramatic changes in the past few years. Recently developed data-independent acquisition schemes have been implemented in this area to enable both quantification and qualification, in an attempt to replace high-sensitivity targeted quantification by QqQ MS. Further improvements in the scanning speed of QqQ MS have enabled thousands of multiple reaction monitoring (MRM) transitions to be programmed into a single analysis to keep up with the number of compounds to be screened and quantified (see the sketch below). High scan speed contributes not only to increased compound coverage, but also to improved identification confidence, through MRM approaches that reduce false-positive reporting.
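As a rough illustration of the scheduling arithmetic behind such methods, the sketch below walks through how the dwell time available per transition shrinks as more transitions co-elute, which is why scanning speed matters once thousands of transitions are programmed into a single run; all compound lists, retention-time windows, and cycle times in it are hypothetical.

```python
# Hypothetical illustration of scheduled MRM arithmetic; all compound names,
# retention times, window widths, and cycle times below are made up.
# Each compound is monitored only around its expected retention time, so the
# instrument cycle time is shared among whichever transitions co-elute.

from dataclasses import dataclass

@dataclass
class Compound:
    name: str
    rt_min: float        # expected retention time (minutes)
    n_transitions: int   # MRM transitions monitored for this compound

def max_concurrent_transitions(compounds, window_min=0.5, step_min=0.01, run_min=15.0):
    """Walk across the run and return the worst-case number of co-eluting transitions."""
    worst, t = 0, 0.0
    while t <= run_min:
        active = sum(c.n_transitions for c in compounds
                     if abs(c.rt_min - t) <= window_min / 2)
        worst = max(worst, active)
        t += step_min
    return worst

# 500 hypothetical targets with 2 transitions each = 1000 transitions in one run
compounds = [Compound(f"target_{i}", rt_min=1.0 + (i % 140) * 0.1, n_transitions=2)
             for i in range(500)]

worst = max_concurrent_transitions(compounds)
cycle_time_s = 0.5                        # chosen to keep enough points across each peak
dwell_ms = 1000 * cycle_time_s / worst    # dwell time left per co-eluting transition
print(f"worst-case concurrent transitions: {worst}, dwell per transition: {dwell_ms:.1f} ms")
```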

Kai Scheffler: The biopharmaceutical market is rapidly growing, with an increasing need for full characterization and monitoring of critical attributes of biopharmaceuticals from development to production, requiring high-performance separation coupled with HRAM MS. The approaching patent cliff for biotherapeutics opens opportunities for biosimilars. This, combined with the ability to address stringent regulatory requirements, is a key growth driver for LC–MS technology. Whereas most biopharmaceutical QC laboratories currently use LC–ultraviolet (UV)-only methods alongside a whole variety of other technologies, there is strong interest in applying LC–MS to monitor multiple critical quality attributes within a single assay, thus increasing analysis speed and confidence while reducing the number of required assays. Demand for targeted quantitative assays in drug metabolism and pharmacokinetics (DMPK) is also growing, in both regulated and nonregulated settings, requiring robust, reliable, sensitive, and reproducible workflows that address all regulatory requirements.

LCGC: What obstacles stand in the way of LC or LC–MS development?

Hansjörg Majer: Instrument developments that improve micro-LC methods, for example by decreasing dead volumes to make them more robust, will advance this technology.

Software developments to improve “feature recognition” in multidimensional LC combined with multidimensional spectroscopy should be investigated further.

Mass spectrometry has become prevalent, and at times so good, that the LC part is regarded in some quarters as unnecessary. However, the lower the concentration of the compounds of interest, the more complex the sample composition, and the more the MS is expected to detect, the greater the benefit of a tool that de-complexes the sample before it enters the mass spectrometer inlet. This is often neglected when new direct-injection techniques for MS systems are discussed. Given the trade-offs between sensitivity, speed, and the technological limitations that exist today, LC remains a great way to ensure the MS system can handle the entire contents of the sample.

Ashley Sage: Smaller, faster, more sensitive, and higher-performing instruments are always the holy grail of LC–MS development. However, improvements to sample analysis workflows that streamline and simplify processes are continually being pursued. These include simpler and faster sample preparation routines; developments in chromatography systems, particularly easier-to-use microflow LC systems for efficient and sensitive methods; newer phases for improved separation power; and developments in MS technology in terms of sample introduction techniques, such as ion sources, and MS scan functions, such as mass analyzers. Software developments belong here too: as analytical scientists generate more data, better software routines are required to process the data and understand what it is saying.

Atsuhiko Toyama: The main obstacle is the long-unsolved trade-off between good LC separation and LC–MS sensitivity. Researchers are forced to choose between nanoflow and conventional LC–MS: nanoflow can achieve roughly 100 times higher sensitivity than conventional flow, but sacrifices robustness, flexibility, throughput, and chromatographic resolution. Achieving the same level of analyte ionization in the presence of a high volume of mobile phase, or in a miniaturized LC system, would be groundbreaking and would dramatically expand the application of LC–MS in all fields. Unravelling the enigma of electrospray ionization might be the key to a major breakthrough.

Kai Scheffler: As the trend towards increased speed and sensitivity continues for both LC and MS, instrument developments have to balance this with system robustness and reliability, ensuring high overall quality of data as well as matching the acquired MS data points to the narrow UHPLC peak widths.
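As a back-of-the-envelope illustration of that matching requirement, the short calculation below shows how the required acquisition rate follows from the peak width; the peak widths and the target of roughly a dozen points per peak are assumptions rather than figures from the interview.

```python
# Hypothetical rule-of-thumb: the MS acquisition rate needed to place roughly a
# dozen data points across a chromatographic peak of a given width (in seconds).

def required_scan_rate_hz(peak_width_s: float, points_per_peak: int = 12) -> float:
    return points_per_peak / peak_width_s

for width_s in (6.0, 3.0, 1.5):  # typical to very narrow UHPLC peak widths (assumed)
    print(f"{width_s:>4.1f} s peak -> {required_scan_rate_hz(width_s):.1f} Hz needed")
```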

Bringing LC–MS to routine markets and QC requires a holistic solution that is easy to use, offering intuitive and compliant software for all analytical requirements, as well as instrumentation with smaller footprints as a result of limited space in these high-throughput environments. LC–MS is still perceived as a challenging and complex technology requiring high levels of expertise. Lastly, for analysis of intact proteins, MS instrument platforms have significantly improved; however, more column chemistries providing the separation capabilities required for complex protein mixtures would benefit analysts.

LCGC: What was the biggest accomplishment or news in 2017/2018 for LC or LC–MS?

Hansjörg Majer: The commercialization of LC–MS/MS systems as analyzers dedicated to automating a complete workflow has ramifications for the industry. While these new analyzers can make routine jobs easier, they lessen the skills of LC–MS/MS practitioners at a time when more knowledge and understanding are necessary.

Ashley Sage: A very good question! From my perspective, it is the increased use of LC–MS for all the applications discussed, from ensuring the safety of our food during crises such as the contamination of eggs with fipronil through to the characterization of new biotherapeutic proteins to help treat diseases. I believe that LC–MS will continue to be a routine front-line analytical technique, and we are yet to see new applications for its use. Maybe we will even see some new MS developments over the coming years to advance the technology even further. As a science, we have even put mass spectrometers in space, so who knows what may come next!

Atsuhiko Toyama: Micro-pillar array columns can be seen as one exciting technology delivered to market in the past year. I see this as an interfacing technology between conventional nanoflow HPLC and next‑generation microfluidics, though the real impact still needs to be demonstrated in more applications and pioneering research areas. Nowadays, the speed improvement of LC–MS/MS makes it possible to increase the number of MRM transitions acquired and to convert the responses of multiple MRM transitions into a pseudo‑MS/MS spectrum, which provides higher confidence in compound identification than measuring a full MS/MS spectrum.
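As a minimal sketch of the idea Toyama describes, the example below assembles responses from several MRM transitions that share a precursor into a pseudo-MS/MS spectrum and matches it against a library entry with a simple cosine score; all masses, intensities, and the matching behaviour are hypothetical.

```python
# Hypothetical illustration: assemble the responses of several MRM transitions
# sharing one precursor into a pseudo-MS/MS spectrum, then score it against a
# library spectrum with a cosine (dot-product) similarity. All m/z values and
# intensities are made up; real software would also bin m/z within a tolerance.

import math

def cosine_similarity(spec_a: dict, spec_b: dict) -> float:
    """Spectra are {product_mz: intensity}; ions are matched on identical keys here."""
    keys = set(spec_a) | set(spec_b)
    dot = sum(spec_a.get(k, 0.0) * spec_b.get(k, 0.0) for k in keys)
    norm_a = math.sqrt(sum(v * v for v in spec_a.values()))
    norm_b = math.sqrt(sum(v * v for v in spec_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Measured responses of four MRM transitions for one precursor (hypothetical)
pseudo_spectrum = {105.0: 8.2e4, 182.1: 3.1e4, 230.0: 1.2e4, 286.1: 6.5e3}

# Library MS/MS spectrum of the candidate compound (relative intensities, hypothetical)
library_spectrum = {105.0: 100.0, 182.1: 42.0, 230.0: 18.0, 286.1: 9.0, 120.1: 2.0}

score = cosine_similarity(pseudo_spectrum, library_spectrum)
print(f"pseudo-MS/MS vs. library similarity: {score:.3f}")
```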

Kai Scheffler: The market introduction of different MS instruments, with advances for triple quadrupoles as well as high-resolution-based platforms, provided increased sensitivity, scan speeds up to 40 Hz, resolving power up to 1,000,000 FWHM, and ultraviolet photodissociation (UVPD), a novel fragmentation technique previously available only as a customized instrument modification. Together, these provide a variety of tools for significantly improved qualitative and quantitative analysis of different molecules, in particular of proteins at the intact and peptide level in complex biological samples, showcased by the record of more than 1200 unique peptides identified per gradient minute (1).

References

  1. J. Olsen et al., J. Prot. Res. (2017).

Hansjörg Majer is the European Business Development Manager at Restek Corporation.

Ashley Sage is a Senior Manager, Applied Markets Development, EMEAI at Sciex.

Atsuhiko Toyama is a Manager, Marketing Innovation Centre at Shimadzu Corporation.

Kai Scheffler is a Product Manager, Chromatography and Mass Spectrometry Division at Thermo Fisher Scientific.
