Liquid chromatography–mass spectrometry (LC–MS) analysis continues to evolve in ways that make it a versatile technique in laboratories worldwide. Participants in this forum are Rob Ellis of AB Sciex, Rohan A. Thakur of Bruker Chemical and Applied Markets, and Iain Mylchreest of Thermo Fisher Scientific.
What trends have emerged in LC–MS recently?
Ellis: Some of the trends we see include enhancing the capacity of LC–MS analysis, elevating confidence in bioanalytical results, and a continued push toward higher sensitivity in the instrumentation.
Thakur: The rise of emerging markets such as China, India, Brazil, and Southeast Asia is driving a paradigm shift in the user experience. While LC–MS designs today reflect the user experience of the developed pharmaceutical market that has driven this technology since the 1990s, the move toward LC–MS in applied markets, such as food testing and pain management, has shifted the emphasis from hardware to software.
Mylchreest: Two major trends: First, the greater adoption of MS as a detector for LC, particularly in applied areas such as food safety and environmental analysis. Second, the adoption and rapid expansion of high-resolution MS systems across all application spaces, for screening, qualitative structural elucidation, and, more recently, quantitative analysis with increased specificity and accuracy. This will continue to be a trend as these instruments become easier to use and therefore far more accessible to all laboratories.
How do you see LC–MS evolving in the future?
Ellis: I see LC–MS evolving to include new strategies based on increased throughput of sample processing and on extracting more information from each experiment. For example, instead of focusing MS analysis on the detection of a particular target analyte, all of the ions are detected, generating a complete dataset that can then be processed and analyzed to obtain various types of information and to test multiple hypotheses.
Another point to make is that improving confidence in bioanalytical results will require optimizing the speed and accuracy of existing systems, performing multiple reaction monitoring (MRM) more efficiently, and collecting more data points per peak. On the LC side, improvements in sensitivity are being achieved using techniques such as micro LC, which enables separation of small molecules at flow rates of 10–15 µL/min. As both sampling efficiency and the efficiency of MS ion sources improve, more ions will be generated in a shorter amount of time, yielding a better signal.
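The link between MRM efficiency and data points per peak is simple arithmetic: cycle time grows with the number of monitored transitions, and the number of points sampled across a chromatographic peak shrinks accordingly. A small illustration, using assumed (not vendor-specified) dwell and pause times:

```python
# Illustrative arithmetic with assumed values: why faster MRM cycling
# yields more data points per peak, and hence better quantitation.

def points_per_peak(n_transitions, dwell_ms, pause_ms, peak_width_s):
    """Approximate data points acquired across one chromatographic peak.

    Cycle time = transitions * (dwell + inter-transition pause);
    points per peak = peak width / cycle time.
    """
    cycle_s = n_transitions * (dwell_ms + pause_ms) / 1000.0
    return peak_width_s / cycle_s

# 100 transitions at 5 ms dwell + 3 ms pause, on a 3-s-wide peak:
print(points_per_peak(100, 5, 3, 3.0))   # 3.75 points: too few to define the peak
# Cutting dwell + pause to 2 ms total recovers 15 points per peak:
print(points_per_peak(100, 1, 1, 3.0))   # 15.0
```

With narrower peaks from fast LC separations, the cycle-time budget tightens further, which is why more efficient MRM scheduling is a prerequisite for the throughput gains Ellis describes.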
Thakur: LC–MS will evolve toward "push-button" operation driven by software. The problem today is a data deluge, with little real effort going into converting data into usable knowledge. The LC–MS manufacturers remain hardware focused and do not have any real solution for content management. The conversion of data into information, upon which decisions can be made, will be the technology evolution of the future.
Mylchreest: We will continue to see increased simplification of operation even for the high-end systems, and more and more integration into defined workflow solutions for targeted applications. The interface between LC and MS will become increasingly seamless for workflow-based applications.
What applications areas are growing the fastest?
Ellis: Among the many applications for which LC–MS is relevant, I'd say that food safety testing is the fastest growing, especially when it comes to allergens. We are seeing tremendous interest in allergen testing applications.
Thakur: Food testing and pain management (clinical diagnostics) are the fastest growing areas for LC–MS applications.
Mylchreest: Food safety and clinical research applications in applied and research areas, quantitative proteomics for biomarker validation, pathway analysis, and metabolomics.
Is data handling evolving at the same rate as LC–MS?
Thakur: Data handling is significantly behind the hardware technology that is driving LC–MS. The hardware is generating more data than ever, but processing this data into usable information remains a challenge.
Mylchreest: Data processing and visualization will always be an area of opportunity. Each scientist and laboratory has different questions to answer and may want to view data in different ways. This is clearly recognized, and much more emphasis is being put on rapidly developing new software for the new application areas. The most complex experiments generate a great deal of data, and the new high-resolution systems create even more possibilities to dig deeper experimentally; it is our job to turn that data into useful information. Most applications now require their own software suites and tools, as the experiments and requirements have become so specific to each area of application. The demand for new software tools will always be high, and it is one of the areas of greatest investment. It's the new questions being asked that make the field exciting.
Is LC–MS being used in areas more commonly associated with other analytical techniques?
Ellis: Absolutely. Forensic toxicology, vitamin D analysis, and immunosuppressant analysis, just to name a few, are commonly associated with other analytical techniques, but data have already been produced showing that LC–MS delivers faster and more accurate results for these applications than the methods currently in wide use. There is also potential to use LC–MS in analyses required to produce biofuels.
Thakur: Not really. Every analytical technique has its advantage. The common mistake is to shoehorn a technology for the sake of technology. The idea should be to use the right tool for the right job, and not use the tool because it is the only tool you have.
Mylchreest: We are all starting to see MS replace or complement some of the optical detector technologies typically associated with high performance liquid chromatography (HPLC). That trend will continue to expand as the sensitivity, specificity, and information content of MS extends the scientist’s available knowledge base.
Can you illustrate with a practical example an LC–MS application that you found particularly inspiring?
Ellis: One recent example is a new LC–MS application specifically tailored for laboratories in India to ensure the safety of spices, teas, fruits, vegetables, and other food products for India’s domestic and export markets.
Thakur: Inspiring is perhaps not the appropriate word. Michelangelo wasn't inspired by his paintbrushes; he simply used them to create inspiring art. Similarly, LC–MS provides a paintbrush for the various scientific and regulatory disciplines so that science and commerce can progress. LC–MS is simply a tool in the scientist's arsenal, tying together the various disciplines of science, whether the goal is ensuring that food is safe or that an identified protein is indicative of a cancer therapy successfully at work.
Mylchreest: There is not one specific example, but an area making great advancements is the contribution that LC–MS, specifically high-resolution MS, is making to the understanding of biological systems. This is one of the most exciting areas of scientific development. The ability to contribute to the understanding of disease mechanisms, by probing molecular pathways in cellular systems with proteomic and metabolomic LC–MS methods, then taking those learnings toward markers that could predict or diagnose a disease state, and using the same capabilities to develop and monitor new therapies, is motivating the industry to develop better tools. Contributing technologies that improve disease detection and enable therapies to manage or cure disease has to be inspirational to all of us.