Digital Watch

Sep 12, 2017
Volume 13, Issue 13, pg 2–4

Incognito wonders where we are, where we have been, and where we are going digitally in the analytical laboratory.


In a secret experiment, I monitored the usage of two pieces of chromatography equipment within our laboratory—one operated via a touchscreen, the other via a computer terminal next to the chromatograph. Analysis of the usage data showed that coworkers below the age of 40 preferred the touchscreen version, whilst older workers preferred the computer terminal. The study was as “blind” as I could hope for without overt intervention. I don’t believe there were any confounding factors, and a quick test showed my results to be statistically significant. This set me thinking.
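For readers curious what such a “quick test” might look like, a chi-square test of independence on a 2×2 contingency table is the classic choice. The counts below are invented for illustration—they are not Incognito’s actual data—but they show the shape of the calculation:

```python
# Hypothetical usage counts (illustrative only, not the real study data):
# rows are age groups, columns are preference (touchscreen, terminal).
observed = [[34, 9],   # under 40
            [11, 27]]  # 40 and over

def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

stat = chi_square_2x2(observed)
# The critical value for 1 degree of freedom at p = 0.05 is 3.841;
# a statistic above that would be declared significant.
print(f"chi-square = {stat:.2f}, significant: {stat > 3.841}")
```

With counts as lopsided as these, the statistic comfortably exceeds the critical value—consistent with the age-related preference described above.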

Most of you will have heard the phrase digital native, which pertains to those who were born and brought up in the age of digital technology, and the phrase digital immigrant, which refers to those who were not, but who have come to appreciate the usefulness of digital technology. But what does this mean for the present and the future of those working in the field of analytical chemistry? What are the upsides and downsides for the digital natives, and are we serving them properly in terms of technology and information handling systems? Further, what does the future hold for the digital immigrants, and how do we prevent ourselves from becoming “digital exiles”?

Our digital natives could manipulate controls and information on the touchscreen instrument by quickly toggling between screens, whereas my own preference (I consider myself a digital immigrant—my coworkers may disagree) is to have as much information as possible displayed simultaneously on the computer monitor. Think here of how much younger folks (or savvy older folks!) can do now on the small displays of their smartphones—only resorting to computers when absolutely necessary. Perhaps this points the way to more instrument control or data processing and analysis becoming possible using tablets or even mobile telephone-based applications. This would certainly make the workplace more flexible in terms of its physical organization. But is a smartphone screen large enough to visualize the subtle nuances of chromatographic or mass spectrometric output (my own myopic failings being corrected with spectacles)? Is it enough to rely on scroll and zoom functions—or do we sometimes, literally, need to see the big picture? If the answer is the latter, what has happened to the concept of digital paper, and does this herald a possible way forward for visualization “on the fly”? Or would it be better to have glasses (perhaps safety spectacles) with back projection facilities to allow us to “see” the data on the lenses of our glasses to help us analyze and manipulate data or control instruments on our train ride home from the office or as we walk between meetings? Take this one step further and we can begin to imagine the virtual laboratory where we can control instruments and conduct experiments remotely. Am I too impatient or does it feel like a long time since these technologies were heralded, with nothing tangible being delivered to date?

I had a hilarious conversation recently with a colleague who suggested that they never use e-mail for anything important these days, citing that the chances of eliciting a response were far higher via text or WhatsApp message. I believe they were highlighting the dilution of e-mail’s effectiveness: no matter how much we prioritize or use “smart rules” to manage our digital information flow, the effectiveness of e-mail decreases (I would say proportionately) with the number of e-mails we receive. Does this point to the need for new modalities of communication? Why is WhatsApp or Snapchat more attention-grabbing? Is it simply related to the number of messages, or is it more to do with the restrictions and personalization of the groups that we create? Would it be better to have our work arranged by small groups of individuals who, for example, control instrument use and scheduling in a laboratory using instant messaging applications? Should we be organizing this ourselves at all? Surely technology exists that would better enable instrument scheduling and could be interfaced to a commonly used, or a bespoke, social media app.

Take this discussion one step further and we move into the field of remote telemetry and robotics. How many of us still walk away from a liquid chromatography (LC) or gas chromatography (GC) instrument hoping that our results will be available later that day, or the next, because the system seemed a little “flaky” when we set it up? I know that systems exist to remotely monitor equipment, but how many of them allow a remote intervention—the ability to adjust parameters or take corrective action remotely to ensure that problems are identified, avoided, or perhaps even fixed? How many of these systems use machine learning to interpret chromatographic problems, baseline issues, or signal variance, compare them with a database of previous issues and their remedies, and then suggest fixes that we might apply remotely or that the instrument can apply itself? Robotics comes into play when we consider the possibilities for submitting samples to sample “hotels”, which then deploy them onto instruments as they become free. Surely this would help our productivity and instrument utilization—perhaps allowing the control of workflows when multiple tests are required on a single sample. I accept that laboratory information management systems exist, but I have yet to see a truly “smart” laboratory where robotics and telemetry work hand in hand to fully optimize the workflow and improve the number of right-first-time determinations.
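Even the simplest version of such remote monitoring need not involve sophisticated machine learning. A minimal sketch—invented here for illustration, not any vendor’s actual software—might flag baseline excursions by comparing each detector reading with a rolling window of recent signal:

```python
# Illustrative sketch (not a real instrument API): flag baseline
# excursions in a detector trace by comparing each point with the
# mean and spread of a rolling window of preceding points.
from statistics import mean, stdev

def flag_baseline_drift(signal, window=20, threshold=3.0):
    """Return indices where the signal deviates from the rolling
    baseline by more than `threshold` standard deviations."""
    flags = []
    for i in range(window, len(signal)):
        base = signal[i - window:i]
        mu, sigma = mean(base), stdev(base)
        if sigma > 0 and abs(signal[i] - mu) > threshold * sigma:
            flags.append(i)
    return flags

# A quiet baseline with small deterministic "noise" and one
# simulated spike at index 50:
trace = [0.01 * ((i * 7) % 5) for i in range(100)]
trace[50] += 1.0
print(flag_baseline_drift(trace))
```

A remote system built on this idea could then look up the flagged pattern in a library of known faults before suggesting (or applying) a fix.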

How many workplaces really require such a high rate of data generation? Would this approach be detrimental to the future of analytical scientists because fewer of us would be needed? Perhaps, but why then do I hear so often that analysts and laboratory managers are under pressure to get data out of the door? It may be that the analysis or interpretation of data, or the development of these automatable methods in the first place, is the rate-limiting step.

This brings me to my final musing on the digital workplace: big data. Or at least connected data with a common data standard.

Can you imagine a world in which every method or application developed in academia, industry (where intellectual property rights would allow), or by instrument vendors conforms to a common data format, and the results of those experiments are similarly encoded to allow search and comparison? If I want to analyze analyte W in matrix X to resolution Y and detection limits of Z, I could literally search for any methods that claim to meet these standards. Surely this would be a step in the right direction? I know it would certainly give my computer search engine a well-earned rest. So, what’s stopping us from working towards this? I suggest the answer may contain a currency symbol, and I don’t mean the one associated with the cost of developing such a standard—but I’ll let you ponder this one further yourselves.
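To make the idea concrete, here is a toy sketch of what querying such a common format could feel like. The field names and example methods are entirely invented—no such standard or library exists today, which is rather the point:

```python
# Hypothetical illustration of a common method-description format.
# Field names and entries are invented, not an existing standard.
from dataclasses import dataclass

@dataclass
class Method:
    analyte: str        # "analyte W"
    matrix: str         # "matrix X"
    resolution: float   # claimed resolution "Y"
    lod_ng_ml: float    # claimed detection limit "Z" (ng/mL)

# A tiny stand-in for the shared, searchable library of methods:
library = [
    Method("caffeine", "plasma", 2.1, 0.5),
    Method("caffeine", "plasma", 1.4, 5.0),
    Method("ibuprofen", "urine", 1.8, 1.0),
]

def find_methods(analyte, matrix, min_resolution, max_lod):
    """Return every method that claims to meet the stated criteria."""
    return [m for m in library
            if m.analyte == analyte and m.matrix == matrix
            and m.resolution >= min_resolution
            and m.lod_ng_ml <= max_lod]

matches = find_methods("caffeine", "plasma", 1.5, 1.0)
```

Only the first caffeine-in-plasma method survives this query: the second claims too coarse a resolution and too high a detection limit. Scale the library up to every published method and the search engine really could take that rest.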

In terms of big data applied to the analysis and interpretation of analytical results, such universal sharing of data is unlikely, because companies protective of their intellectual capital would not share their competitive edge. However, the usefulness of this approach on an intracompany basis could be huge, and we could link with the limited information already in the public domain. The data mining capabilities of searching and comparing massive databases of relational data are almost beyond our comprehension. Drawing inspiration from—dare one say it—more advanced industries allows us to consider what it might be like to trend data; to compare a chromatogram with the last time the analysis was performed within the business for quality purposes; to contrast the differences between reference and sample spectra and draw data on the origin of spurious ions or ion ratios; to consider the origin and best treatment of our outlier data; and to suggest variable settings for the processing of our data (integration algorithms or deconvolution settings). This would certainly help us to overcome some of the productivity limitations associated with data analysis and interpretation, but it would surely also increase our capabilities and the quality of our data. It may draw us into the murky world of giving attention only to outliers or out-of-specification results, and we would need to resist such moves carefully.
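One of the possibilities listed above—comparing today’s chromatogram with the last time the analysis was run—can be sketched very simply. A hedged, minimal version (the traces and threshold are invented for illustration) might score similarity with a Pearson correlation and flag low-scoring runs for quality review:

```python
# Sketch only: compare today's chromatogram against a stored reference
# trace using Pearson correlation; the 0.95 threshold is invented.
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length traces."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

reference = [0, 1, 8, 3, 0, 0, 5, 1, 0]   # stored "golden" run
sample    = [0, 1, 7, 3, 0, 0, 6, 1, 0]   # today's run, slightly different
needs_review = pearson(reference, sample) < 0.95
# Here the traces agree closely, so no review is flagged.
```

A production system would of course align retention times and compare peak tables rather than raw points, but the principle—trend against history, flag the divergent—is the same.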

But perhaps I’ve strayed too far into the future and should conclude by returning to my original thoughts on digital natives and immigrants. I propose that the near future will see the rapid progression of mobile technologies that delink instrument operation and data analysis from the physical space around the instrument. I also propose that communication, workflow management, and instrument utilization technologies will evolve to be more fully automated and truly accessible on smartphones and tablets.

Whether we will ever get to the stage where we can set up and run analyses by simply donning our “smartglasses” from wherever we are in the world, I can’t say. Whether we will ever achieve the big data objectives that would enable leaps forward in application development or data analysis is also unknown. What I do know is that if you struggle to send a text by mobile phone or don’t know much about WhatsApp or Snapchat, I would abandon plans to live in exile right now, and think instead about becoming a chromatography expert—in the oldest sense of the word!

Contact author: Incognito
E-mail: [email protected]
