Core–Shell Particles for HPLC — Present and Future

May 12, 2014
By LCGC Editors

An excerpt from LCGC's e-learning tutorial on core–shell particles for HPLC.

In a recent Essential Guide webcast, the CHROMacademy team was joined by Fabrice Gritti, PhD, of the Department of Chemistry at the University of Tennessee (Knoxville, Tennessee, USA), to discuss the present theory and practice — and future — of core–shell particles for high performance liquid chromatography (HPLC).

Core–shell particles consist of a solid core coated with a layer of porous silica, deposited either in multiple layers or as a single coating depending on the manufacturer. The diameters of the solid core and the porous layer vary between manufacturers and with the required overall particle size. A 2.7-µm particle may, for example, consist of a 1.7-µm solid core and a 0.5-µm coating. Core–shell columns are now commercially available in particle sizes ranging from 1.3 to 5 µm. A key feature of core–shell particles is the much narrower particle size distribution that can be achieved during manufacturing, typically 3–6% relative standard deviation (RSD) compared with the 10–30% RSD more typical of fully porous materials of a comparable particle size.

Columns packed with 2.7-µm core–shell particles will usually produce efficiencies approaching those of columns packed with 1.8-µm fully porous particles, but at significantly lower back pressures, because back pressure is proportional to the inverse square of the particle diameter. These particles therefore offer a high-efficiency option for those of us using traditional (400 or 600 bar) HPLC equipment, delivering either a significant increase in peak capacity through improved efficiency or a significant reduction in analysis time.
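The pressure advantage quoted above can be put into numbers with the inverse-square scaling alone. The sketch below assumes everything else (column length, flow rate, eluent viscosity, packing porosity) is held equal, which is an idealization:

```python
def relative_pressure(dp_a_um: float, dp_b_um: float) -> float:
    """Back pressure of a column packed with particles of diameter dp_a,
    relative to one packed with dp_b, assuming identical length, flow
    rate, eluent, and packing porosity (pressure scales as 1/dp^2)."""
    return (dp_b_um / dp_a_um) ** 2

# A 2.7-um core-shell column versus a 1.8-um fully porous column:
ratio = relative_pressure(2.7, 1.8)
print(f"2.7 um column runs at {ratio:.0%} of the 1.8 um column's pressure")
# -> roughly 44%, i.e. less than half the back pressure
```

This is why comparable efficiency at 2.7 µm fits within the 400–600 bar limits of conventional HPLC instruments.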

Initially, the increase in efficiency with core–shell particles was attributed to a reduction in mass-transfer resistance (the van Deemter C term) because of a decrease in the possible pathlengths of the pores within each particle. Latterly, however, it has been established that for analytes below around 500 Da, the majority of the efficiency gain derives from a reduction in the van Deemter A term (eddy diffusion), whose contributors include the distribution of paths through the packed bed and differences in analyte velocity between particles and at the column walls. Because the particle size distribution of core–shell particles is much narrower than that of fully porous materials, the reduced A-term contribution was initially attributed to the greater packed-bed homogeneity (a reduced number of possible paths of different length through the column) that core–shell particles afford. However, Gritti's work has indicated that this may not be the case; the reduction in eddy diffusion may instead relate to "trans-column" (long-range) effects, potentially attributable to increased particle surface roughness. Greater roughness raises the shear stresses between particles, reducing the radial strain from the centre to the column wall and therefore providing a more homogeneous interstitial velocity across the column diameter.
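The effect of a smaller A term on overall efficiency can be illustrated with the van Deemter equation, H = A + B/u + C·u. The coefficients below are purely illustrative, not measured values for any real column; the point is only that a reduced A term lowers the plate height by the same amount at every velocity, including at the optimum:

```python
import math

def plate_height(u: float, A: float, B: float, C: float) -> float:
    """van Deemter plate height H = A + B/u + C*u, with u the
    mobile-phase linear velocity."""
    return A + B / u + C * u

# Hypothetical coefficient sets (illustrative only): same B and C,
# but the "core-shell" set has a halved eddy-diffusion (A) term.
fully_porous = dict(A=2.0, B=2.0, C=0.05)
core_shell = dict(A=1.0, B=2.0, C=0.05)

# The minimum of H lies at u_opt = sqrt(B/C), where H_min = A + 2*sqrt(B*C)
for name, k in (("fully porous", fully_porous), ("core-shell", core_shell)):
    u_opt = math.sqrt(k["B"] / k["C"])
    h_min = plate_height(u_opt, **k)
    print(f"{name}: H_min = {h_min:.2f} at u_opt = {u_opt:.1f}")
```

Because B and C are unchanged in this toy comparison, the entire gain at the optimum velocity comes from the A term, mirroring the small-molecule behaviour described above.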

There was initial concern regarding the loading capacity of core–shell columns (the amount of analyte that can be loaded onto the column before significant peak shape deformation occurs) because of their reduced pore volume; however, it can easily be demonstrated that the porous volume fraction of modern core–shell particles is typically 60–75% of that of a fully porous material. Combined with the fact that core–shell columns typically contain more particles per column volume because of improved packing homogeneity, this results in columns that do not typically suffer the sensitivity and resolution problems associated with significantly reduced column loadability.
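The porous-volume figure can be sanity-checked from the particle geometry quoted earlier (a 2.7-µm particle with a 1.7-µm solid core), since for spheres the solid core occupies a fraction (d_core/d_particle)³ of the total volume:

```python
def porous_volume_fraction(d_particle_um: float, d_core_um: float) -> float:
    """Fraction of a core-shell particle's volume that is porous shell,
    i.e. its porous volume relative to a fully porous particle of the
    same overall diameter (spherical geometry assumed)."""
    return 1.0 - (d_core_um / d_particle_um) ** 3

# 2.7-um particle with a 1.7-um solid core (the example from the text):
frac = porous_volume_fraction(2.7, 1.7)
print(f"porous fraction: {frac:.0%}")  # -> 75%, at the top of the 60-75% range
```

The cube in the ratio is why even a fairly large solid core leaves most of the particle volume porous, which is the geometric basis for the modest loss in loadability.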

Counterintuitively, it can be demonstrated that the performance of columns packed with core–shell particles actually improves with increasing column diameter. A more highly ordered packing arrangement of stationary-phase particles is observed at the column walls, compared with the more random ordering in the centre of the column. In narrow-internal-diameter columns, analytes are much more likely to experience both packing arrangements because the wall region makes a greater relative contribution to the cross-section. In larger-internal-diameter columns, the majority of analyte molecules encounter only the more randomly ordered packing in the centre, because this region occupies a much larger area.

The particle size of the packing material is crucial not only in determining the column back pressure but is also the dominant factor in frictional heating. In essence, if the particle size of the packing material is reduced, frictional heating increases, which has limited the performance of highly efficient fully porous sub-2-µm particles, especially at higher velocities. It has been demonstrated that core–shell particles have improved heat-dissipation properties compared with their fully porous counterparts, and so may be used at higher eluent linear velocities before frictional heating effects begin to affect the chromatography.
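The scale of the problem can be estimated from the fact that the viscous heat generated in the bed per unit time equals the volumetric flow rate multiplied by the pressure drop. The operating values below are illustrative, not taken from the webcast:

```python
def frictional_power_watts(flow_ml_min: float, delta_p_bar: float) -> float:
    """Viscous frictional heat generated in the packed bed, P = F * dP,
    converted to SI units (watts)."""
    flow_m3_s = flow_ml_min * 1e-6 / 60.0   # mL/min -> m^3/s
    delta_p_pa = delta_p_bar * 1e5          # bar -> Pa
    return flow_m3_s * delta_p_pa

# Illustrative UHPLC-like conditions: 0.5 mL/min against a 1000 bar drop
print(f"{frictional_power_watts(0.5, 1000):.2f} W")  # -> 0.83 W
```

Nearly a watt dissipated inside a narrow-bore column produces the radial and axial temperature gradients that degrade efficiency at high velocity, which is where the better heat conduction of the solid cores pays off.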

Gritti also noted that to take full advantage of core–shell materials, as with any highly efficient particle, including fully porous 1.7- and 1.8-µm particles, the contribution of extracolumn volume is highly important: every system must be minimized in terms of tubing volume, flow cell volume, injection (loop) volume, and column connection volumes. This is especially true if we are to take advantage of sub-1-µm core–shell particles.
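Why extracolumn volume matters so much for efficient columns can be seen from the additivity of peak variances: the observed variance is the column's own variance (V_R²/N) plus the extracolumn variance, and the narrower the column's peaks, the larger the relative penalty. The figures below are illustrative assumptions, not measured values:

```python
def apparent_plates(n_col: float, v_r_ul: float, sigma_ec_ul: float) -> float:
    """Observed plate count when extracolumn band broadening (standard
    deviation sigma_ec, in volume units) adds to the column's intrinsic
    variance V_R^2 / N. Variances are additive."""
    var_col = v_r_ul ** 2 / n_col           # column's own peak variance
    return v_r_ul ** 2 / (var_col + sigma_ec_ul ** 2)

# Illustrative: a 20,000-plate column, 500 uL retention volume,
# and 10 uL of extracolumn standard deviation
print(f"{apparent_plates(20000, 500.0, 10.0):.0f} apparent plates")
```

With these assumed numbers, the intrinsic peak standard deviation is only about 3.5 µL, so a 10-µL extracolumn contribution dominates and the observed plate count collapses to a small fraction of the column's capability, which is exactly the loss-minimization point made above.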

Further interesting work in the future will concentrate on optimizing the ratio of core diameter to overall particle diameter for analytes of various sizes (hydrodynamic volumes) and molecular weights. The development of core–shell particles capable of operating over wide pH ranges and at much higher back pressures is also an area of future interest.

This article is from The Column.