HPLC Diagnostic Skills—Noisy Baselines

March 1, 2020
Tony Taylor

Tony Taylor is Group Technical Director of Crawford Scientific Group and CHROMacademy. His background is in pharmaceutical R&D and polymer chemistry, but he has spent the past 20 years in training and consulting, working with Crawford Scientific Group clients to ensure they attain the very best analytical science possible. He has trained and consulted with thousands of analytical chemists globally and is passionate about professional development in separation science, developing CHROMacademy as a means to provide high-quality online education to analytical chemists. His current research interests include HPLC column selectivity codification, advanced automated sample preparation, and LC–MS and GC–MS for materials characterization, especially in the field of extractables and leachables analysis.

LCGC North America, Volume 38, Issue 3

To avoid producing data that are not fit for purpose, chromatographers need to know how to identify worrying symptoms in HPLC instrument output.

Just as medical practitioners are able to discern worrying features from a variety of medical physics devices (electrocardiogram, electroencephalogram, and ultrasound, for example), we need to develop the skill to identify worrying symptoms from our high performance liquid chromatography (HPLC) instrument output. Medical professionals develop an almost instinctive ability to separate critical symptoms (signals) from the noise, or random variation, in the instrument output, and we need to develop these same skills to avoid producing data that are not fit for purpose, or suffering instrument failure.

One of the most useful diagnostics in HPLC is the nature of the baseline produced by the detector while the eluent is flowing. While there can be many baseline characteristics, such as drift, irregularity, or more regular cycling (pulsations), baseline noise is perhaps the most commonly encountered, and can arise from a variety of different sources. One needs to be aware of what constitutes a "normal" baseline, as opposed to "unusual" levels of baseline noise, depending upon the instrument configuration. Of course, the business imperative is not only to spot problems, but also to deal with them quickly and efficiently, and that is the subject of this article.

The signal-to-noise ratio (S/N) of the HPLC output is usually measured as the ratio of the detector signal to the inherent background signal variation, and is a useful measure of the "normal" noise within the system (Figure 1). The inherent or background noise is typically measured over a predefined portion of the baseline, and most data systems are capable of making this measurement and reporting the result.

 

When inherent or background noise within the system is unusually high, this can affect system performance, and will usually result in an increase in the limit of quantitation and issues with reproducible integration. This is why, as chromatographers, we get so worked up about noise levels that are higher than expected.

The smallest detectable signal is usually estimated to be equivalent to three times the height of the average baseline noise, giving an S/N ratio of 3:1 at the limit of detection (LOD) of the detector. If the analyte signal falls below this level, it ceases to be distinguishable from the noise. For quantitative analysis, an S/N ratio of 10:1 is recommended for the limit of quantitation (LOQ).
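As a rough illustration of these thresholds, the sketch below estimates S/N as peak height divided by peak-to-peak baseline noise, then applies the 3:1 and 10:1 rules of thumb. The numbers are invented for illustration, and note that compendial methods define S/N somewhat differently (for example, the pharmacopoeial S/N = 2H/h formulation); your data system's reported value should always be preferred.

```python
import numpy as np

def signal_to_noise(peak_height, baseline):
    """Estimate S/N as peak height over peak-to-peak baseline noise.

    Simplified sketch; pharmacopoeial methods (e.g., S/N = 2H/h)
    define the ratio slightly differently.
    """
    noise = float(np.max(baseline) - np.min(baseline))  # peak-to-peak noise
    return peak_height / noise

def classify(sn):
    """Apply the 3:1 (LOD) and 10:1 (LOQ) rules of thumb."""
    if sn < 3:
        return "below LOD: not distinguishable from noise"
    if sn < 10:
        return "above LOD but below LOQ: detectable, not quantifiable"
    return "above LOQ: suitable for quantitation"

# Hypothetical values: a 0.02 AU peak over a quiet baseline segment (AU)
baseline = np.array([0.000, 0.002, -0.002, 0.001, -0.001])
sn = signal_to_noise(0.02, baseline)   # 0.02 / 0.004 = 5.0
verdict = classify(sn)
```

With these invented numbers the peak is detectable but would not meet the 10:1 criterion for quantitation.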

The magnitude of the analyte signal cannot be used in isolation when calculating detector sensitivity; the sensitivity of detection is usually defined in terms of S/N ratio, a measurement of the ratio of the analyte signal to the variation in baseline. S/N measurements are usually performed by the data system.

One needs to begin by establishing, preferably for each method and set of instrument conditions, the S/N when the method (or instrument) is performing well, and perhaps even set a system suitability performance criterion (usually a range or lower acceptable limit) for the determination. Of course, the seasoned chromatographer will typically know at a glance whether the inherent baseline noise is "usual," and this comes only through experience. One should also take care to assess the noise at a reasonable screen magnification or signal attenuation, as any baseline can be made to appear noisy at a sufficiently high magnification!

However, once again the data system may be able to help us out by reporting what is known as the peak-to-peak noise, which may be expressed in absorbance units. This measurement reflects the variation in the normal baseline portion, rather than a ratio to the height of a signal, and can be very useful for establishing acceptable limits for the background noise. Most HPLC detectors will run a noise test evaluation as part of their initialization routine, or can perform a longer test using ASTM criteria with HPLC-grade water flowing through the flow cell. Specifications for acceptable noise levels will be given in the manufacturer's literature.
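A peak-to-peak noise evaluation along these lines can be sketched as follows. The segmented, drift-corrected approach below only loosely mimics the ASTM-style evaluation mentioned above; the segment count, the synthetic baseline, and the 0.001 AU acceptance limit are all invented for illustration, and real acceptance criteria should come from the manufacturer's specifications.

```python
import numpy as np

def peak_to_peak_noise(trace, n_segments=10):
    """Average peak-to-peak noise over baseline segments.

    Each segment is detrended with a linear fit so that slow drift is
    not counted as noise; a rough sketch of the ASTM-style approach.
    """
    noises = []
    for seg in np.array_split(np.asarray(trace, dtype=float), n_segments):
        x = np.arange(seg.size)
        slope, intercept = np.polyfit(x, seg, 1)      # remove linear drift
        residuals = seg - (slope * x + intercept)
        noises.append(residuals.max() - residuals.min())
    return float(np.mean(noises))

# Hypothetical baseline: slow drift plus a small periodic disturbance (AU)
t = np.linspace(0.0, 1.0, 200)
trace = 0.01 * t + 0.0003 * np.sin(2 * np.pi * 25 * t)
noise = peak_to_peak_noise(trace)
acceptable = noise < 0.001   # invented acceptance limit, in AU
```

Detrending each segment separately is what lets this measurement report noise rather than drift, which the text treats as a distinct baseline characteristic.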

Although noise is typically thought of as a detector phenomenon, there are many contributors to the noise within an HPLC system. Noise can be either random or periodic, depending upon the nature of the underlying cause of the problem, and this difference can, in itself, give us some clues to the nature of the issue.
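One quick way to separate periodic from random noise is to inspect the frequency spectrum of a baseline segment: a sharp spectral peak points to a periodic source (pump pulsation, for example), whereas random noise spreads its power across the spectrum. The sketch below is illustrative only; the 2 Hz disturbance, sampling rate, and amplitudes are invented.

```python
import numpy as np

def dominant_periodic_component(trace, sampling_hz):
    """Return (frequency_hz, fraction_of_power) of the strongest
    spectral peak in a baseline trace.

    A single dominant peak suggests periodic noise (e.g., pump
    pulsation) rather than random noise. Illustrative sketch only.
    """
    trace = np.asarray(trace, dtype=float)
    trace = trace - trace.mean()                  # remove DC offset
    spectrum = np.abs(np.fft.rfft(trace)) ** 2    # power spectrum
    freqs = np.fft.rfftfreq(trace.size, d=1.0 / sampling_hz)
    spectrum[0] = 0.0                             # ignore residual DC term
    peak = int(np.argmax(spectrum))
    return freqs[peak], spectrum[peak] / spectrum.sum()

# Hypothetical 2 Hz pulsation riding on random noise, sampled at 20 Hz
rng = np.random.default_rng(0)
t = np.arange(0, 30, 1 / 20)
trace = 0.001 * np.sin(2 * np.pi * 2.0 * t) + 0.0001 * rng.normal(size=t.size)
freq, frac = dominant_periodic_component(trace, sampling_hz=20)
```

Here most of the spectral power sits at 2 Hz, flagging a periodic cause; a purely random baseline would show no such concentration.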

In future installments, we will examine a few of the main culprits of baseline noise.

This article is an excerpt of an installment of the LCGC blog.

Tony Taylor is the technical director of Crawford Scientific and CHROMacademy. Direct correspondence to: LCGCedit@mmhgroup.edu

 
