
Voices of Separation Science: Expert Insights, Emerging Trends, and Innovations that Shaped Chromatography in 2025
Key Takeaways
- GC×GC–TOF-MS and AI-driven data analysis are revolutionizing forensic and analytical chemistry, enabling more comprehensive and accurate analyses.
- Lipidomics and metabolomics are advancing through novel statistical workflows, improving data processing, analysis, and visualization.
LCGC International presents a sampler of our 2025 interviews with key opinion leaders in the field of chromatography, exploring emerging trends, persistent analytical challenges, and the innovations poised to redefine separation science.
As the field of chromatography accelerated through 2025, with an eye on the years (and decades) to come, it was reshaped by groundbreaking instrumentation, smarter data workflows, and an expanding array of real-world applications. To capture these rapid developments from the front lines, LCGC International brings together insights from interviews conducted with leading voices across industry, academia, and government laboratories. Each discussion offered a candid look at where chromatography is headed—and how the experts driving its evolution are navigating the opportunities and disruptions ahead.
Below is a curated selection of our standout interviews from the past year, featuring the most memorable insights and quotes. Enjoy the read!
In this interview, Petr Vozka, associate professor of chemistry and biochemistry at California State University, Los Angeles, USA, and director of the Complex Chemical Composition Analysis Laboratory (C3AL), highlighted how comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC×GC–TOF-MS) reveals time-dependent chemical changes in fingerprints, enabling age estimation through chemometric modeling and advancing forensic timelines beyond traditional ridge-pattern analysis.
What emerging trends or technologies do you believe will shape the future of forensic analysis?
I think that one of the most transformative trends in (not just) forensic science is the integration of chemometrics and machine learning to interpret high-dimensional data sets from techniques such as GC×GC–TOF-MS. As forensic chemistry moves beyond targeted assays toward untargeted or semi-targeted analysis of complex mixtures, the ability to extract meaningful information from large data sets is becoming essential.
In our fingerprint aging research, we apply chemometric techniques to identify key molecular markers and temporal trends, reduce data dimensionality, and improve model robustness. We anticipate that these data-driven approaches will become increasingly central, not just in fingerprint analysis but also across the broader spectrum of forensic applications.
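As a generic illustration of that kind of workflow (not the C3AL group's actual pipeline), the sketch below compresses a simulated high-dimensional feature table with principal component analysis before fitting a simple regression of fingerprint deposition age; all data, dimensions, and model choices are assumed for demonstration.

```python
# Illustrative sketch, not the C3AL group's actual pipeline: compress a
# simulated GC×GC–TOF-MS feature table with PCA, then regress fingerprint
# deposition age on the principal components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_prints, n_features = 40, 500            # hypothetical: 40 prints, 500 peaks
ages_h = rng.uniform(0, 72, n_prints)     # simulated deposition age in hours
X = rng.normal(size=(n_prints, n_features))
# Make the first five "compounds" change with age (the temporal markers):
X[:, :5] += np.outer(ages_h, rng.uniform(0.05, 0.2, 5))

model = make_pipeline(StandardScaler(), PCA(n_components=10), Ridge(alpha=1.0))
model.fit(X, ages_h)
print(f"training R^2: {model.score(X, ages_h):.2f}")
```

The point of the dimensionality reduction step is robustness: with far more measured features than samples, regressing directly on the raw table invites overfitting, whereas a handful of components captures the dominant age-related variance.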
Robert B. “Chip” Cody—recognized in Stanford-Elsevier’s 2025 top 2% of scientists—considered the future of separation science over the next two decades. In this interview, he reflects on his career and the most transformative trends shaping the field.
Across your career, you’ve seen the shift from hardware innovation to software-driven insight. Which frontier do you think will define the next 20 years of chromatography–mass spectrometry?
GC–MS is a mature technology, so the most recent developments I have seen have been with two-dimensional gas chromatography (GC×GC) and data analysis. JEOL recently added support for GC×GC–high-resolution mass spectrometry (HRMS) data to their qualitative analysis software. That software makes use of all the available information from a GC–MS or GC×GC–MS analysis: electron impact (EI) and soft ionization, chromatographic deconvolution, accurate mass and isotope data, database search, fragment ion assignments, and retention index matching. A searchable AI-generated database of EI mass spectra and predicted retention indices is invoked to assist in identifying compounds that are not registered in the NIST and Wiley databases. By using the information from our software in combination with other software tools like the NIST Hybrid Search and MS Interpreter, I’ve had several successes in identifying complete unknowns.
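As a side note on retention index matching: the retention index places analytes on a column-independent scale anchored to n-alkane retention times. The snippet below implements the textbook isothermal Kovats formula with hypothetical retention times (the function name, dead time, and all values are illustrative, not drawn from JEOL's software).

```python
# Side-note sketch (general GC practice, values hypothetical): the isothermal
# Kovats retention index from bracketing n-alkane retention times.
import math

def kovats_ri(t_x, t_n, t_n1, n_carbons, t0=0.5):
    """Isothermal Kovats RI; t0 is the column dead time (all times in min)."""
    adj = lambda t: math.log10(t - t0)       # log adjusted retention time
    return 100 * (n_carbons + (adj(t_x) - adj(t_n)) / (adj(t_n1) - adj(t_n)))

# Unknown at 5.5 min, bracketed by decane (4.2 min) and undecane (6.9 min):
print(round(kovats_ri(5.5, 4.2, 6.9, 10)))
```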
We’re using all the mass spectral and chromatographic data, so there’s no more information left to extract from GC–MS for qualitative analysis of unknowns. Combining GC–MS with information from other analytical techniques like Fourier transform infrared spectroscopy (FTIR), Raman, nuclear magnetic resonance (NMR), and ultraviolet (UV) is a possible approach to add more information. Wiley’s Know-It-All product is an example of software that is attempting to do just that.
On the other hand, LC–MS is still a rapidly evolving technology. I have not been heavily involved in LC–MS in recent years, so I’ll leave any predictions to those who work more closely in that area. As an observer, I have seen great advances in the expansion of LC–MS databases and data analysis tools. I have also seen a growing number of presentations on two-dimensional liquid chromatography (with or without mass spectrometry) and even multidimensional chromatography combining GC and LC separations, so I won’t be surprised to see the development of new and creative analytical solutions that combine different separation methods.
Supercritical fluid chromatography (SFC) hasn’t received as much attention as gas and liquid chromatography, but SFC–MS is a promising alternative for applications such as the characterization of jet fuel. Thin layer chromatography (TLC) is still widely used for reaction monitoring. In fact, TLC was coupled with direct analysis in real time (DART) mass spectrometry as early as 2006. The coupling of mass spectrometry with microfluidic separations is in its early stages, but I expect to see a lot of growth in that area. We must also consider the growing role of ion mobility, both as an alternative to, and in combination with, chromatography.
I don’t believe in one-size-fits-all analytical solutions. Each piece of the analytical puzzle brings different information, and the number of different puzzle pieces we need to solve a problem is directly related to the complexity of the problem we’re trying to solve. I frequently cite Curt Brunnée’s tour-de-force article “The Ideal Mass Analyzer: Fact or Fiction?” in which he compared the strengths and weaknesses of different mass analyzers and concluded that no one mass analyzer can do everything. I think the same applies to all separation methods, detectors, and analyzers. We need to choose the appropriate technology and system complexity that answers the analytical problems we are trying to solve.
Recent developments in lipidomics and metabolomics have highlighted persistent challenges in data processing, statistical analysis, and visualization—areas that often hinder reproducibility, transparency, and accessibility within the field. This year saw the publication of a guideline addressing these issues by presenting a comprehensive, code-based framework for the statistical treatment and graphical representation of omics data using R and Python. LCGC International spoke to Michal Holcapek (professor of analytical chemistry at the Faculty of Chemical Technology, UPCE, Pardubice [Czech Republic]) and Jakub Idkowiak (research associate at KU Leuven [Belgium]) about the rationale behind this research and the benefits it offers separation scientists.
What future developments do you foresee in this area, particularly regarding automation, artificial intelligence (AI)-driven annotation, or the miniaturization of separation platforms? Are you planning to explore any of these directions?
We anticipate increased automation in annotation, such as AI-driven feature assignment and lipid identification, which is already emerging in mass spectrometry. Simultaneously, we expect closer integration with separation methods, for example through real-time QC feedback or analysis of retention patterns. As separation platforms continue to miniaturize and throughput rises, scalable, parallel, and adaptive preprocessing will become essential. There is also strong potential for leveraging machine learning models for anomaly detection, automatic drift correction, and semi-supervised identification of novel lipids or metabolites, helping to capture features that might otherwise be overlooked by analytical chemists. However, careful manual supervision of all reported information is crucial to ensure the highest possible confidence and to avoid overreporting.
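To make the drift-correction idea concrete, here is a minimal, hypothetical sketch (not code from the published guideline): a pooled QC sample is injected at regular intervals, a linear trend is fitted to the QC responses across injection order, and every sample is rescaled against that fitted trend. All numbers are simulated.

```python
# Hypothetical sketch (not the published guideline's code): QC-anchored
# drift correction for a single feature across an injection sequence.
import numpy as np

rng = np.random.default_rng(1)
order = np.arange(60)                             # injection order
drift = 1.0 - 0.004 * order                       # simulated sensitivity loss
obs = 100.0 * drift * rng.normal(1.0, 0.02, 60)   # observed responses
is_qc = order % 10 == 0                           # pooled QC every 10th injection

# Fit a linear trend to the QC responses only, then rescale every sample
# so the fitted trend becomes flat at the mean QC level.
coef = np.polyfit(order[is_qc], obs[is_qc], deg=1)
corrected = obs / np.polyval(coef, order) * obs[is_qc].mean()

rsd_before = obs.std() / obs.mean()
rsd_after = corrected.std() / corrected.mean()
print(f"RSD before: {rsd_before:.1%}, after: {rsd_after:.1%}")
```

In practice, nonlinear fits (for example LOESS) are often used instead of a straight line, and correction is applied feature by feature; the linear version above is the simplest case of the same idea.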
LCGC International spoke with Joachim Weiss, an internationally recognized expert in analytical chemistry and author of the Handbook of Ion Chromatography, who shared insights into the past, present, and future of the technique.
What advice would you give to a young chromatographer considering specialization in IC, and how does the field stay vibrant and innovative going forward?
I would highly recommend specializing in IC to a young chromatographer, as there are not that many experts left in this field of science today.
The scientists in industry and academia who were driving the development of IC during the past decades are about to retire or have already done so. To avoid a knowledge drain, it is necessary to pass on the accumulated experience to the younger generation of chromatographers. For a young chromatographer considering a specialization in ion chromatography, it is important to develop a solid understanding not only of the fundamental separation and detection principles, but also of the latest advances in hardware and software. It is also crucial to gain practical experience to be able to understand the wide applicability of IC.
This year, we are celebrating the golden jubilee of ion chromatography, as this analytical method has matured over the past 50 years. Although “revolutionary” developments may not be occurring within the foreseeable future, the field will stay vibrant as stationary phase materials, detection methods, and automation techniques are continuously improved. Moreover, the demand for precise analyses in almost all IC application areas is steadily increasing. If a young chromatographer is willing to invest in self-education and is open to new technologies, this field can be further advanced, and innovative solutions can be created.
An article written by researchers at Imperial College London (United Kingdom) and published in npj Parkinson’s Disease explored the potential of volatile organic compounds (VOCs) as early, non-invasive biomarkers for Parkinson’s disease. In this interview, Ilaria Belluomo, lead author of the review, spoke about how GC–MS will play a part in future diagnosis of the disease.
How close are we to integrating validated GC–MS-based breath or skin tests into clinical workflows for Parkinson’s disease (PD) screening or diagnosis?
We are still some distance from fully integrating validated GC–MS-based breath or skin tests into clinical workflows for Parkinson’s disease screening or diagnosis, but the path forward is becoming clearer. Such non-invasive tests would represent a major step toward earlier, easier, and more accessible diagnosis, which is crucial since motor symptoms appear only after substantial neurodegeneration has already occurred.
However, several critical steps remain. Large-scale, high-quality studies are needed to validate findings across diverse and representative populations, with appropriate control groups and standardized protocols that account for variables like diet, environment, and medication. Many studies to date have small sample sizes and inconsistent control matching, limiting the use of the results. Moreover, deep profiling using high-resolution GC–MS is essential to confidently identify VOCs related to PD pathology. Once key biomarkers are selected, more practical targeted approaches can be developed for population-level screening, such as simplified GC–MS setups or even sensor-based platforms.
There is an urgent need for simpler, more accessible methodologies, especially in Parkinson’s disease and other conditions where reduced mobility can make traditional diagnostic approaches challenging. Sensors are expected to play a central role in the future of non-invasive diagnostics due to their portability and ease of use. However, as mentioned earlier, technical refinement is still required to improve their specificity, sensitivity, and reproducibility, and to better align their outputs with those of gold-standard GC–MS methods. While we are not there yet, with continued investment in robust, large-scale, multi-center research and methodological standardization, breath or skin testing could soon become a valuable part of PD clinical care.
With green chemistry becoming more of a priority for laboratories in general, LCGC International spoke with Sakil Islam of the University of Texas at Arlington about the environmental gains achieved using ultrahigh-performance liquid chromatography (UHPLC), reversed-phase liquid chromatography (RPLC), hydrophilic interaction chromatography (HILIC), and normal-phase liquid chromatography (NPLC) separations of model compounds.
What are the environmental gains of UHPLC, and how do they compare to its cost, maintenance complexity, and instrument lifetime?
UHPLC cuts solvent use and shortens run times, leading to higher throughput with less waste and lower energy consumption per analysis. However, there are trade-offs: systems are more costly, run at higher pressures, solvent filtration and degassing matter more, and columns can be expensive. It is also important to use fully miscible mobile phases and to avoid very viscous blends, which can push operating pressures too high.
When methods use shorter columns, narrower internal diameters, and appropriate flow rates, the solvent savings and faster runs can outweigh the added cost and maintenance of UHPLC. With guard columns, inline filters, and clean samples, both column and instrument lifetimes can be extended.
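As a back-of-envelope illustration of those solvent savings, the sketch below applies the standard geometric rules for transferring a method from a conventional HPLC column to a UHPLC column (flow scaled with the square of the internal diameter, run time with column volume over flow). The column dimensions, flow rate, and run time are assumed for illustration, not figures from the interview.

```python
# Back-of-envelope sketch using standard geometric method-transfer rules;
# all column dimensions, flow rates, and run times are assumed examples,
# not figures from the interview.
d1, L1, F1, t1 = 4.6, 150.0, 1.0, 20.0  # HPLC: i.d. (mm), length (mm), flow (mL/min), run (min)
d2, L2 = 2.1, 50.0                      # UHPLC column: i.d. (mm), length (mm)

F2 = F1 * (d2 / d1) ** 2                # scale flow to keep linear velocity
t2 = t1 * (L2 / L1) * (F1 / F2) * (d2 / d1) ** 2  # run time scales with column volume / flow
v1, v2 = F1 * t1, F2 * t2               # solvent consumed per run (mL)
print(f"{v1:.1f} mL -> {v2:.1f} mL per run ({100 * (1 - v2 / v1):.0f}% less)")
```

Under these assumptions the per-run solvent volume drops by roughly an order of magnitude, which is where much of the environmental benefit comes from.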