Technology Forum: Software

August 4, 2009

E-Separation Solutions

E-Separation Solutions-08-06-2009, Volume 0, Issue 0

Joining us for this discussion are Fraser McLeod of Dionex Corporation; Bob McDowall of McDowall Consulting; and Linda Doherty, Dan Holmes, Jeff Louie, and Leon Kopelev of Agilent Technologies, Inc.

Software is the one element of research that all chromatographers share. Whether working in GC, HPLC, or one of the many hyphenated techniques, at some point, you will need software to identify, quantify, and generally get the work done.

What trends do you see emerging in software? How has software been evolving?

McLeod: I think one of the key trends is simplicity. In the past, a lot of software, especially in the scientific industry, has been focused on delivering more and more functionality to users – to the point that use of the software becomes extremely complex. That means that users are faced with two options: either use the most basic features that the software offers, or spend a long time learning the software so that they can fully exploit the benefits offered by the more advanced functionality. This pattern can be broken by making advanced features more accessible and easier to use – which is why simplicity is a key trend.

McDowall: • Software for the laboratory has been developed as point solutions: LIMS, CDS, ELN, etc.
• Interfacing of instrumentation is still an issue: we do not have plug-and-play interfacing of instruments, so better interfacing is needed.
• We need a middleware product that will enable instruments and applications to communicate with each other.
• Electronic working is in its infancy and needs to evolve further.
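The middleware idea above can be illustrated with a minimal sketch. The class and method names here are hypothetical, invented for illustration only; the point is that an application written against an abstract driver interface can work with any instrument whose vendor supplies a conforming driver.

```python
from abc import ABC, abstractmethod


class InstrumentDriver(ABC):
    """Hypothetical vendor-neutral driver interface (illustrative only)."""

    @abstractmethod
    def connect(self) -> None:
        """Establish communication with the instrument."""

    @abstractmethod
    def acquire(self) -> dict:
        """Return a run's data in a common, vendor-neutral structure."""


class ExampleGCDriver(InstrumentDriver):
    """Stand-in driver; a real one would wrap a vendor's control library."""

    def connect(self) -> None:
        self.connected = True

    def acquire(self) -> dict:
        return {"technique": "GC", "signal": [0.0, 1.2, 0.8]}


def run_method(driver: InstrumentDriver) -> dict:
    # The application talks only to the abstract interface, so any
    # conforming driver can be plugged in without code changes.
    driver.connect()
    return driver.acquire()
```

This is the same pattern that gives operating systems plug-and-play device support: the application depends on the interface, and only the driver knows the vendor's proprietary protocol.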

Agilent group: • Analytical data systems are moving from single-workstation, proprietary solutions to systems that are connected through centralized SDMS, LIMS, and ELN.
• Open standards and shared services (e.g., licensing, user management, instrument management) provide increased productivity and the ability to focus on integration and development of application-specific software solutions.
• Demand for multi-vendor instrument control and data interchange standards (technology-neutral formats (TNF) such as AnIML).
• Improved efficiency through workflow automation (optimized utilization of instruments and lab personnel to maximize lab productivity).
• Lab personnel requiring adoption of software industry standards (user interface design, search capabilities) to improve their day-to-day operations.
• Movement from paper-based processes to electronic-based processes (e-signatures, IP protection).

What is the software application you see growing the fastest?

McLeod: With regard to scientific software, growth is typically in the area where we see the highest instrument growth. Even today, liquid chromatography instruments are the fastest growing instrument segment, so naturally that means that chromatography data systems are the fastest growing software applications.

Another area where we see strong growth is with data archiving applications, and I include LIMS, ELN, and ECM under that umbrella. They are all trying to do the same thing – archive data in a way that makes it accessible and usable. This offers many benefits to companies and is a key driver of their growth rate.

McDowall: • The LIMS and CDS markets are relatively mature, and most of the growth here is in replacement systems.
• ELN is probably the fastest growing application at the moment.

Agilent group: • Electronic laboratory notebooks are the fastest growing area, focused mostly on IP protection and discovery rights; however, the market is very fragmented because it is based on workflow, much like LIMS.
• Data management is moving from archival/storage to true SDMS (federated searching, review of scientific information such as calibration curves and electrophoresis gels, reporting, collaboration, and data presentation). This is the future of laboratory software, as it combines the workflows of ELNs and LIMS with the data generation of instrument systems.
• Laboratories (including pharma) will extend their borders to outsource more of the services that they should buy rather than own (judicious employment of cloud computing). This includes the need for adaptability (e.g., integration of new entities, collaboration between and across entities, remote access and mobility, optimizing asset usage, and the accommodation of local practices).

What obstacles stand in the way of future software development?

McLeod: The main obstacle is the prevalence of proprietary instrument control standards and data storage formats. This places a large burden on software developers, as they often have to design software that communicates with instruments from different suppliers and that reads data from different software packages. This effort is continuously growing due to the increasing number of instruments and applications in the marketplace. The consequence is that developers have less time to create the new features that the market is requesting.

McDowall: 1. The laboratory exists in a proprietary software world:
• When you buy an application (with or without an instrument), you are stuck with it forever because there are no universal standards.
• Interfacing of instruments is still an issue: we do not have plug-and-play as we do with operating systems.
• The current standards are inadequate: they either do not go far enough to include all the metadata associated with the record, or they allow changes by individual vendors and fail to allow interoperability of records (acquired on one vendor's system and interpreted on another's).
• Emerging standards are too slow to emerge and become effective; vendors have vested interests, and users don't push for standards or aren't willing to pay for them.

2. Software is not geared up for fully electronic working:
• Audit trails are not integrated with the rest of the application to inform a reviewer when data has been modified.
• In many cases, the application does not inform the user of work that is waiting to be performed.
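The audit-trail point above can be sketched briefly. This is a hypothetical illustration, not any vendor's implementation: the change history lives on the record itself, so a review screen can immediately flag results whose data has been modified instead of forcing the reviewer to consult a separate log.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class Result:
    """A hypothetical result record with an embedded audit trail."""

    value: float
    audit_trail: list = field(default_factory=list)

    def modify(self, new_value: float, user: str, reason: str) -> None:
        # Every change is captured alongside the data it affects,
        # rather than in a separate, disconnected log.
        self.audit_trail.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "who": user,
            "old": self.value,
            "new": new_value,
            "reason": reason,
        })
        self.value = new_value

    @property
    def modified(self) -> bool:
        """Lets a review screen flag results whose data has changed."""
        return bool(self.audit_trail)
```

With this structure, a reviewer checking `modified` sees at a glance which results need scrutiny, which is the integration the current generation of applications lacks.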

Agilent group: • Regulatory compliance and the training of employees/end users affect the acceptance of newer technologies.
• The software technology lifecycle is outpacing the analytical industry's ability to absorb necessary changes.
• Instrument hardware lifecycles are much longer than typical PC lifecycles; support of mature solutions is mandatory in the lab, so managing older data and the original data systems must be considered and supported.
• Some vendors resist trends and try to maintain proprietary linkages to protect their business.

What is the future of software?

McLeod: Fortunately, I think a lot of the future will be based around open standards for data storage and for instrument control. A group called the Open Chromatography Association is already pursuing an open standard for control of chromatography instruments, and more and more people are starting to adopt the AnIML data format. These attempts at standardization will ensure that companies will be able to spend more time developing the high-value features that users are looking for.
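The appeal of an open, XML-based format like AnIML is that any application can read the data with an ordinary XML parser. The sketch below uses simplified, made-up element names rather than the actual AnIML schema; it only illustrates the principle that open formats decouple the data from the vendor's software.

```python
import xml.etree.ElementTree as ET

# A simplified, AnIML-like document; the element names here are
# illustrative and do not follow the actual AnIML schema.
doc = """
<Experiment technique="HPLC">
  <Series name="absorbance" unit="mAU">
    <Value>0.1</Value><Value>2.4</Value><Value>1.7</Value>
  </Series>
</Experiment>
"""

root = ET.fromstring(doc)
series = root.find("Series")
# Because the format is plain XML, any application can read the
# values without the originating vendor's own software.
values = [float(v.text) for v in series.findall("Value")]
print(series.get("name"), series.get("unit"), values)
```

The same file could be opened years later, on any platform, even if the software that produced it is no longer supported, which is exactly the interoperability argument made above.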

McDowall: Software is an essential component of the laboratory – without it, our instruments won't work and we cannot work. Therefore, the future will be more reliable software.

Agilent group: • Integration and collaboration across vendors in the analytical/life science space, with tools to manage and review data in order to handle the explosion of data and quickly turn data into decisions.
• Open systems (plug-and-play architecture), enabling ease of integration into customers' IT environments and the flexibility to accommodate customer workflows.
• Movement from islands of laboratory information (instrument based) to an integrated enterprise view (workflow based), which drives reuse of data to reach decisions and discoveries faster.

If you are interested in participating in any upcoming Technology Forums please contact Assistant Editor Meg Evans for more information.

