
LCGC Blog: Rewiring the Fundamentals
Key Takeaways
- The shift from process-focused to goal-oriented learning is crucial in the age of AI, emphasizing efficient progress toward the end goal over mastery of every underlying process.
- Early programming experiences provided foundational logic skills, which remain relevant despite advancements in high-level languages and AI.
This month's LCGC Blog, from Jonathan Shackman of the American Chemical Society (ACS), reflects on how early experiences with computer programming shaped the author’s understanding of fundamental logic, and how today’s artificial intelligence (AI) tools similarly shift the balance between mastering low-level processes and focusing on analytical goals.
When I was seven years old, my family bought their first home computer: a Leading Edge Model D. This budget IBM clone came packed with a solid 4.77 MHz processor (we didn’t have the turbo), 256 KB of RAM, and a whopping 20 MB hard drive. Pull up your current computer’s system info, and I’ll bet those specifications are not even in the decimal points. However, back then, it was a step change in technology, with similarities to what we are experiencing today with artificial intelligence (AI). The democratization of these advances can be both powerful and frightening. I see the parallels between me playing simple programming games on an Apple IIe in elementary school and my own child learning how to interact with large language models (LLMs) at school.
The family’s Leading Edge also came with a stack of manuals: technical, MS-DOS, and BASIC programming. Not content to simply die of dysentery on The Oregon Trail, I struggled to comprehend what seemed like a second language in those volumes. Between the manuals and a slowly growing collection of programming language books at the local library, I progressed to creating new worlds in the land of BASIC (mostly text adventures, with a dash of ASCII-based “graphics”). More importantly, during those hours staring into the orange monochrome glow of the screen, I was learning the fundamental structure and logic of programming, which was far more useful than the dictionary of BASIC commands.
For my undergraduate Analytical Instrumentation course, our text was Principles of Instrumental Analysis, 5th Ed. (1). Interestingly, it had an early section on binary and performing basic binary operations to introduce digital electronics. By this time, the Leading Edge was long forgotten, and my homework (and after-hours Netscape internet surfing) was being knocked out on a “lightweight” eight-pound Compaq Presario 1200 running Windows 98. Binary had been around for centuries.
The 5th edition of Principles was brand new. Was it necessary for me to learn binary to operate an analytical instrument (even a decade-old one) in the nineties? I certainly didn’t need it to recreate the land of Zork on that Leading Edge of the eighties. A fair response may be that I would need it to build a new analytical instrument (perhaps designed in Microsoft Office 97 with the help of Copilot’s ancestor, Clippy).
The following semester I was fortunate enough to get into a graduate-level course, where we were actually tasked with building a fully functional, computer-controlled “instrument”: a LabVIEW light-bulb oven. While Principles’ teaching of signals, analog-to-digital conversion, and basic measurement theory was useful in the exercise, the time spent on how a “high-level language” gets compiled and assembled down into binary wasn’t really necessary to meet the end goal. In fact, those hours spent struggling with logical flows and Boolean logic on the Leading Edge a decade prior were far more useful. Perhaps we need to rethink what a “high-level language” is in the age of natural language processing. In the nineties, LabVIEW’s graphical means of programming made my BASIC code lines sound like the grunts of Neanderthals. Today, interacting with generative AI might push yesterday’s “high levels” into the same afterthought status as machine language. I would like to think that programming logic and design will still be applicable, though.
Shooting on Goal
In order to realize true utility with AI, we still must guide it to our desired end goal. That means focusing less on the process and more on the end state to maximize our efficiency. While it is fascinating to learn exactly how electrons hop between holes in a transistor, understanding the physics is not a barrier to using your favorite chromatograph to solve the next challenging mixture analysis (despite the astounding number of transistors enabling it to happen). If your goal is to build the next-generation LC capable of bazillion-bar pressures, you probably still don’t need to know the difference between a PNP- and an NPN-type transistor to be successful. If your goal is to build a better processor to run the LC… well then, you’ve likely found a required fundamental in learning about transistors.
We are separation scientists. Our goal is to (say it with me) separate things! How do we commonly characterize separations? Resolution, in all the wonderful ways that it can be calculated (2). How does a new chromatographer best learn this universal metric and how to optimize it? Partition theory? Plate theory? Statistical moments? Experience? They’re all relevant. When teaching, I subjected graduate students to building up an Excel spreadsheet that would ultimately allow them to visualize how different statistical moments impact resolution. Many struggled with it, both with understanding the concepts and with implementing them in what was probably not the best-suited application for the task. The problem was that I was having them focus on the process and not the goal. Yes, Excel skills are (were?) invaluable in the field, but I was teaching Graduate Separations, not Programming in Excel. I missed the goal.
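For readers who want a concrete anchor for that exercise: the most common textbook form of resolution needs only the first two statistical moments of each peak, the retention time (first moment) and the peak standard deviation (square root of the second central moment). A minimal Python sketch of that calculation (my assumption of one such formula, not my students’ spreadsheet) might look like this:

```python
def resolution(t1, sigma1, t2, sigma2):
    """Resolution from the first two statistical moments of two peaks.

    Uses the common Gaussian approximation Rs = |t2 - t1| / (2 * (sigma1 + sigma2)),
    equivalent to delta-t over the average baseline width (w = 4 * sigma).
    """
    return abs(t2 - t1) / (2.0 * (sigma1 + sigma2))

# Example: peaks eluting at 5.0 and 5.6 min, each with a 0.1 min standard deviation
print(resolution(5.0, 0.1, 5.6, 0.1))  # -> 1.5
```

For Gaussian peaks, an Rs of about 1.5 corresponds to baseline separation, which is exactly the kind of intuition the spreadsheet exercise was meant to build.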
Reliving the Moments
If I had to do it again today, I think the graduate class exercise might go something like this: “Using AI, create an interactive interface to teach yourself how peak shape and relative peak size impact chromatographic resolution.” Note this approach does not specify the “how”: that comes from the student creators’ prompts. In other words, it could be any application: Excel, Python, Mathematica, BASIC; it could be textual or graphical; it could be presentation-style or gamified. As a teacher, I could then provide various scenarios across the breadth of the parameters, discuss why they occur, and how to mitigate them if possible. The students would still need to wrestle with the underlying concepts to get a workable product. Hopefully, by being less prescriptive, the process barrier would be reduced, and the result would be tailored to their individual learning needs. Needless to say, we would still need to understand the fundamentals to guide, verify, and improve what our high-level language helper interface spits out.
True Believer
The above scenario is not just a thought exercise. With the goal in mind (the hardest part!), I spent two minutes crafting a prompt for Anthropic’s Claude-4-Sonnet, let it reason for a minute, and then had a first-version product, as shown in Figure 1. It’s not perfect, as it was a pretty barebones prompt:
Create a python graphical user interface (GUI) to visualize how two chromatographic peaks' resolution changes based on the first four statistical moments. Default all parameters to 1. Allow the user to change the values with 1 digit of precision. Display a numerical output of resolution. Have a dropdown with 5 scenarios, such as void volumes, and apply those values to the display.
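The article shows only the resulting screenshot (Figure 1), not the generated code, but a stripped-down sketch of the kind of script such a prompt might produce could look like the following. This sketch assumes Gaussian peaks, so only the first two moments are interactive; skew and excess kurtosis would need a non-Gaussian peak model (such as an exponentially modified Gaussian), and the scenario dropdown is omitted for brevity:

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.widgets import Slider

def gaussian(t, area, mean, sigma):
    # Gaussian peak defined by its zeroth (area), first (mean), and second (sigma**2) moments
    return area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((t - mean) / sigma) ** 2)

def resolution(m1, s1, m2, s2):
    # Common Gaussian approximation: Rs = |t2 - t1| / (2 * (sigma1 + sigma2))
    return abs(m2 - m1) / (2.0 * (s1 + s2))

t = np.linspace(0, 10, 2000)
m1, s1 = 4.0, 0.2            # peak 1 held fixed for brevity
m2_init, s2_init = 5.0, 0.2  # peak 2 starts one minute later

fig, ax = plt.subplots()
fig.subplots_adjust(bottom=0.3)
(line,) = ax.plot(t, gaussian(t, 1, m1, s1) + gaussian(t, 1, m2_init, s2_init))
title = ax.set_title(f"Rs = {resolution(m1, s1, m2_init, s2_init):.2f}")
ax.set_xlabel("time (min)")
ax.set_ylabel("response")
ax.set_ylim(0, 8.5)  # tall enough for the sharpest allowed peak

# Sliders for the retention time and standard deviation of peak 2
slider_m2 = Slider(fig.add_axes([0.15, 0.15, 0.7, 0.03]), "t_R 2", 3.0, 8.0, valinit=m2_init)
slider_s2 = Slider(fig.add_axes([0.15, 0.07, 0.7, 0.03]), "sigma 2", 0.05, 1.0, valinit=s2_init)

def update(_):
    # Redraw the overlaid chromatogram and the numerical resolution readout
    m2, s2 = slider_m2.val, slider_s2.val
    line.set_ydata(gaussian(t, 1, m1, s1) + gaussian(t, 1, m2, s2))
    title.set_text(f"Rs = {resolution(m1, s1, m2, s2):.2f}")
    fig.canvas.draw_idle()

slider_m2.on_changed(update)
slider_s2.on_changed(update)
plt.show()
```

Even this toy version makes the core point: drag the retention time or width of the second peak and watch the displayed Rs respond.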
I’d want to iterate a bit with my teaching-assistant GUI and verify the output, but it would have been a game changer for my former students as well as myself. I can imagine generating training tools on the fly in the classroom. This is now the process I use in my daily work for reaching analytical goals. Is there an analysis feature missing from my chromatographic data system? Why wait on the vendor to implement it when I can export the data and build my own with AI assistance? Have a folder full of old chromatogram PDFs that need their peak profiles extracted? What was previously an immensely laborious manual operation just to get a starting point for data analysis is eliminated, and the real goal, trending the data, can begin immediately. I’m gratified that logic and program design are still important in these tasks. Even more so are creativity and curiosity. I now try to force myself to question every task’s process and whether it’s the best means to the end. AI can help rewrite the process, but I still need the fundamentals to assess whether it did it the correct way or the wrong way. Our time is finite. If in the end I need to delve into a lower level of fundamentals, I’m finding it’s more efficient to learn it just in time. This also gives me just enough free time to write blogs the old way.
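As one illustration of that export-and-build workflow (a sketch under assumed conditions, not the exact tools described above), a few lines of Python can pull the statistical moments straight out of an exported single-peak chromatogram; the file name and column names below are hypothetical:

```python
import numpy as np
import pandas as pd

# Hypothetical export: a CSV with "time" (min) and "signal" columns, uniformly sampled
chrom = pd.read_csv("exported_chromatogram.csv")
t = chrom["time"].to_numpy()
y = np.clip(chrom["signal"].to_numpy(), 0.0, None)  # crude baseline floor at zero

# Signal-weighted statistical moments of the (assumed single-peak) profile
area = y.sum()                                              # zeroth moment
mean = (t * y).sum() / area                                 # first moment: retention time
var = (((t - mean) ** 2) * y).sum() / area                  # second central moment
skew = (((t - mean) ** 3) * y).sum() / area / var ** 1.5    # normalized third moment

print(f"tR = {mean:.2f} min, sigma = {np.sqrt(var):.3f} min, skew = {skew:.2f}")
```

Run that over a folder of exports and the trending spreadsheet essentially writes itself.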
In memory of Prof. James Winefordner, a passionate teacher, amazing analytical chemist, and wonderful person.
References
(1) Skoog, D. A.; Holler, F. J.; Nieman, T. A. Principles of Instrumental Analysis, 5th ed.; Harcourt Brace & Co., 1998.
(2) Shackman, J. G. Resolving Resolution. The Column 2023, 19, 25–27.
Jonathan Shackman is a scientific director in the Chemical Process Development department at Bristol Myers Squibb (BMS) and is based in New Jersey, USA. He earned his two B.S. degrees at the University of Arizona and his Ph.D. in chemistry from the University of Michigan under the direction of Prof. Robert T. Kennedy. Before joining BMS, he held a National Research Council position at the National Institute of Standards and Technology (NIST) and was a professor of chemistry at Temple University in Philadelphia, PA. To date, he has authored more than 40 manuscripts and two book chapters. He has presented more than 40 oral or poster presentations and holds one patent in the field of separation science. Jonathan has proudly served on the executive board of the ACS Subdivision on Chromatography and Separations Chemistry (SCSC) for three terms and is a first-term Councilor of the ACS Analytical Division.