Incognito navigates the potentially precarious path to a successful relationship with a contract research organization (CRO).
The global healthcare analytical testing services market is estimated to grow at a compound annual growth rate (CAGR) of 11.3% from 2016 to 2021, reaching $4.13 billion by 2021 from $2.42 billion in 2016 (1).
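As a quick sanity check, the projected figure follows directly from compounding the 2016 market size at the stated CAGR (a minimal sketch using only the numbers cited above):

```python
# Verify the cited market projection: $2.42 billion in 2016
# compounded at 11.3% per year over the 5 years to 2021.
base_2016 = 2.42        # market size in $ billion (cited)
cagr = 0.113            # compound annual growth rate (cited)
years = 2021 - 2016

projected_2021 = base_2016 * (1 + cagr) ** years
print(round(projected_2021, 2))  # ≈ 4.13, matching the cited $4.13 billion
```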
As the statistics show, many of us will face the situation where we need to outsource analytical work. It’s very important that we select the most appropriate provider, manage the relationship well, and achieve the desired outcomes in the most time- and cost-efficient manner.
The internet is littered with articles advising on how this might best be done (see references 2–6 for just a few examples); however, this advice, for the most part, lacks the practical, pragmatic detail of how to get things done properly at the “bench” level. Much of it focuses on the quality, financial, and staff audit, with the technical aspects considered at a very cursory level. Furthermore, the nature of the actual “point-to-point” relationships, that is, between the people on both sides who must regularly communicate, solve problems, and deliver the results, is very poorly covered.
As someone who has, at various times in my career, had a foot in both camps, I wanted to offer some practical advice on how to establish and strive for a highly productive working relationship with your analytical outsourcing provider. This almost always comes down to good diligence, good communication and transparency, the ability to challenge constructively, the humility to accept that sometimes we are wrong (both sides!), and a willingness to adopt a co-operative mindset.
From experience in delivering projects from both sponsor and outsourcing provider perspectives, I categorize the keys to productive “bench-level” CRO relationships into the following broad categories: people; culture and communications; technical capabilities; and documentation, process, and delivery.
People

The advice given in many of the articles I have read is very wide of the mark on the people relationships that need to be established and maintained.
Frankly, whilst we can’t like everyone we work with in an outsourcing relationship, it really does help if we do, and you can tell a lot about the outsourcing provider by the people that you meet when engaging with or undertaking your project.
I have read a lot about the importance of the “client relationship manager” or that having a “single point of contact” is important. I find the latter to be a hindrance rather than an enabler in most cases. It will certainly slow down communications and progress, but more on that later.
Some CROs don’t treat their bench-level chemists very well; one worker described themselves to me in the past as “an advanced autosampler who is sometimes allowed to process data”. When selecting a partner, one should be very careful to assess the “mood” of those folks who will ultimately be preparing your samples and acquiring and processing your data. If they appear to be downtrodden, only allowed to see part of the big picture, constantly assigned to mundane tasks, and given very little opportunity for development, then alarm bells should be ringing.
One of the most destructive problems in any CRO relationship is staff turnover. I’ve seen many projects significantly delayed or even fail because key staff have left or there are insufficient suitably qualified staff to carry out the work placed with them. This needs to be carefully evaluated when selecting a partner and monitored as you move forwards with project work. Whilst you wouldn’t expect to always have “Bob” or “Mary” carry out your analyses, the number of different people you must communicate with, and their knowledge of your analyses, will give an indication of the focus your CRO is putting on your projects.
The ability to develop closer personal ties can also very much help to foster a spirit of emotional investment in your CRO partners, and I have established relationships with CRO partner staff that enable discussions on sport, family members, and holidays. Provided conversations of this nature are not protracted (which would rightly be frowned upon by senior management on both sides), this does not get in the way of productivity; rather, it helps to create a bond of common purpose and should not be overlooked.
Culture and Communications
These link closely to people considerations and describe how “alike” the CRO is to your own working environment. Whilst it is unlikely that the CRO is a mirror image of your own business, there are certain traits that one can assess in order to highlight any potential issues going forwards.
Culture describes the management style and working practices adopted by the outsourcing provider and can include aspects such as:
People Development: What training and development opportunities does the CRO provide to their staff, and is there budget and time available for these? What non-technical training is provided? This is a good measure of how the organization cares for their people beyond treating them as “work units”.
Flexibility: The modern zeitgeist within the CRO is to enable a flexible workforce who can move across techniques and disciplines as the workload shifts. Whilst this is a generally very positive approach, one must ensure that the knowledge and skills of the workforce are sufficient to generate high-quality outputs.
Management Style: How “inclusive” is the management style? Do the folks at the bench have the “whole picture” of the projects on which they are working, or is there a silo mentality in which the “jigsaw” of data is pieced together by someone more senior? This can often lead to subtle implications in the way data are generated or interpreted being missed, and can be highly detrimental.
Workload: Are there enough people to do the work you require, and are they suitably qualified? If planning and workload management are not first class, then you may find projects are delayed or work is carried out to a lower standard as uninformed or inexperienced analysts are drafted in as extra pairs of hands. No one likes the plumber or builder who “has to slip away for a couple of hours to go and look at another job” whilst still working on yours; the same is true of the staff working on your analytical project!
Staff Contracts: What type of contracts do the analytical staff have? Are they similar to the terms and conditions within your own workplace? Do they seem punitive, or are they flexible and provide benefits beyond monetary compensation? Are salaries in line with other CROs or businesses in the local area?
Non-Disclosure Agreement/Confidential Disclosure Agreement (NDA/CDA): Whilst seemingly esoteric, the nature of the standard non-disclosure or confidentiality agreement supplied by the laboratory can tell you a lot about the business’s openness and attitude to risk. If the agreement contains a host of defensive clauses, then this will tell you something of the company culture. In general, outsourcing partners will ask for the NDA to be generated on your side; however, I always ask to review the CRO’s standard documents in the first instance, just to see how risk averse they are, how carefully they promise to treat our confidential information, and how they propose to handle the data and information during, and crucially after, the project.
Of course, the items above can be difficult to measure quantitatively, but there should be no hesitation, when evaluating a provider, in asking for interviews or a chat with bench-level scientists. I have often sought to review typical staff contracts to assess the way in which the provider compensates staff at various levels within the business, as well as training records to gauge how much non-technical staff development is available. Once project work begins, day-to-day conversations will quickly reveal the satisfaction or dissatisfaction of staff with their working environment, management, level of inclusivity, and workload; it is much better to attempt to assess this ahead of time wherever possible.
In this regard, larger, more widely experienced organizations don’t necessarily fare better than smaller outfits.
Culture and communications meet when there are issues that need to be resolved, and we need to ensure that honesty and transparency are valued. It is always good to test the water in this regard by arranging a situation in which constructive challenge can be raised in order to evaluate how the provider deals with it. This can be raised as part of the initial audit process, the quoting, or initial project engagement phases.
By far the most effective way I have found is to arrange a presentation or “What’s your approach to…” type session where staff from the provider are asked to present on a particular topic relevant to the work that you intend to subcontract, or a meeting is arranged to review laboratory documentation (procedures or instrument standard operating procedures [SOPs]). It is usual to be able to find topics on which one might question the laboratory staff on scientific or quality aspects and press them to justify their science, process, or regulatory knowledge, making it easy to assess the honesty and transparency of their responses. It is also reassuring when staff admit that they do not know the correct answer, but that they have a colleague who would know the answer. Always allow them to seek the correct answer; this also assesses the internal communications and interoperability within the business.
Project briefing meetings can also tell you a lot about the commitment of the CRO to your work, and you should note the level of questioning coming from the partner organization, especially around the nature of the samples, project background and work undertaken so far, expected analytical outcomes, potential pitfalls you anticipate, unusual aspects of sample preparation, data acquisition conditions, and data processing. The more questions that are asked, the more engaged and inquisitive the partner and the more experienced they tend to be in scoping and identifying possible problems from the outset. Also be wary of groups who appear to know more than they do. I always find questions starting with “Can you explain further”, “We have not come across that before”, “What do you mean by…” reassuring as, provided these aren’t about basic concepts, they demonstrate an openness to admitting what is known or unknown and the ability to seek clarification without concerns over losing face in front of the customer.
The more mundane aspects of communication are also important to establish. How will meetings be conducted? Is video conferencing available? Is screen sharing a possibility? What is the project review cadence? Who will attend project review meetings? What format will the meetings take and how will interim data be presented? However, other less mundane communication scenarios can also be pressure tested early in a CRO evaluation or relationship. How will bad news be communicated to you? Are there escalation processes in place? What circumstances would generate a call to discuss out-of-specification or unusual results and how quickly will this be done? What is the dispute management policy? How open is the organization to troubleshooting visits by the sponsor? How easy is it to get access to senior management at the CRO? Very often, dispute management is needed when issues around unacceptable performance arise. Here it is very helpful to define issues that you believe fall into this category and to work through these in order to assess how the provider would handle these specific instances.
Invariably, problems will arise when outsourcing analytical work and it is of vital importance to establish how these will be resolved as early as possible. If these processes resemble your own internal approaches, then so much the better.
Technical Capabilities

This is typically the area that receives the most focus when evaluating a supplier or when establishing and reviewing ongoing project work. However, there are aspects that are often not considered, and evaluations tend to be focused at a high level, when more esoteric, practical factors can often be the marker for a successful project outcome.
Life can be so much more straightforward when the CRO uses the same equipment as you do, especially when transferring methods. However, even when the equipment is nominally the same, there can be subtle differences. I always ask the CRO laboratory if they measure the dwell volumes of their high performance liquid chromatography (HPLC) systems in order to compare with my own and hence intercept any possible issues with gradient separations. I feel that this is a marker of a laboratory who are really on top of their game. I also ask for the instrument or method SOPs to see if there is any treatment of secondary parameters such as pump compressibility settings, diode-array detector (DAD) bandwidth, and slit width settings.
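The practical consequence of a dwell-volume mismatch is a shift in when the programmed gradient actually reaches the column. A common compensation is to adjust the initial isocratic hold by the difference in gradient delay times; the sketch below illustrates this with made-up dwell volumes and flow rate (they are assumptions for illustration, not measurements from any particular system):

```python
def hold_time_adjustment(vd_sending_ml, vd_receiving_ml, flow_ml_min):
    """Extra initial isocratic hold (in minutes) to add on the receiving
    HPLC system so the gradient reaches the column at the same time as
    on the sending system. A negative value means the receiving system
    has the larger dwell volume and the hold should be shortened."""
    return (vd_sending_ml - vd_receiving_ml) / flow_ml_min

# Illustrative values: 1.1 mL dwell volume on the sending system,
# 0.6 mL on the receiving system, running at 1.0 mL/min.
extra_hold = hold_time_adjustment(1.1, 0.6, 1.0)
print(round(extra_hold, 2))  # 0.5 min of extra initial hold needed
```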
For gas chromatography (GC), the differences in inlet liners used, settings for the detector supply gases, and splitless times are all worthy of consideration, and an evaluation of CRO personnel knowledge of the possible effects of “unwritten” differences between the sending and receiving laboratory protocols is important.
When transferring a method, I often ask for a critical review of our methods, including issues that they might face in implementation or questions they may have over the scientific approach and method variables, to assess the organization’s ability to spot issues ahead of time and how they would seek to address them. Discussions such as these can also tease out potential misinterpretation of methods and help to get everyone on the same page. Remember that adage of “Our practice is not necessarily their practice”? Just because it’s written in an SOP doesn’t mean it will be interpreted or indeed implemented in the same way. A favourite of mine is to remove the units from a trifluoroacetic acid (TFA) solution in the analytical conditions summary, just to see if anyone asks if the 0.1% TFA is weight/weight or weight/volume prior to getting to the experimental section. Again, a good mark of a laboratory who can intercept problems before they occur. Note in project kick-off meetings the numbers of questions raised by the CRO around method variables and the degree of clarification sought on information not included but that could be key (secondary or esoteric variables as described above). Here, the key is level of engagement and inquisitiveness based on their previous experiences of method transfer.
Don’t be afraid to set up meetings where technical questions may be asked, much like a technical interview, based on problem methods, poor chromatography, or issues with quantitative analysis, in order to really dig into the scientific knowledge within the business. Undoubtedly, the CRO will offer their “expert” for these sessions, but push back and get a selection of personnel into the room to assess levels of understanding from the bench upwards. Of course, the organization’s willingness to offer up multiple staff in such a meeting will depend upon the amount of work (and money) you are offering; however, it is the sponsor’s responsibility to ensure they are comfortable with the technical credibility of the CRO staff!
I also try to insist on a laboratory tour and some time to linger and observe working practices. I do feel that watching operations, such as weighing and pipetting, gives me a sense of how good the project work will be.
Where method development or validation is being undertaken, I always insist on the production of a protocol for the work ahead of the project meeting. The approach taken to this type of work is again very revealing about the capabilities of the people and their resources. If I’m spending company money outsourcing a project, I’d certainly like to know that there are efficient processes in place to expedite the work as quickly as possible whilst retaining the highest quality. Do they have a column switching HPLC system that can run screening analyses overnight? Are they equipped with method development software to help model and predict the optimum separation conditions from these screening runs? Is there any statistical design for robustness testing during method validation? What is their approach to emerging initiatives, such as analytical life cycle management, and do they have redacted examples of analytical target profiles developed for other clients? Again, these considerations can be highly informative regarding the experience and abilities of the laboratory undertaking the work.
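To illustrate what a statistically designed robustness study might look like, the sketch below enumerates a two-level full factorial across three method variables (the factor names and levels are hypothetical, chosen only for illustration, not taken from any real protocol):

```python
from itertools import product

# Hypothetical robustness factors, each at a low and high level
# bracketing the nominal method conditions.
factors = {
    "pH": (2.8, 3.2),
    "column_temp_C": (35, 45),
    "flow_ml_min": (0.9, 1.1),
}

# Full factorial design: every combination of low/high levels,
# giving 2^3 = 8 experimental runs.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for i, run in enumerate(runs, 1):
    print(f"Run {i}: {run}")
print(len(runs))  # 8 experiments in total
```

In practice a fractional factorial or other screening design is often used to reduce the run count when more factors are in play; the point is simply that the combinations are enumerated systematically rather than varied one factor at a time.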
In general, I find that the more redacted reports for similar types of work the CRO is willing to share, the more comfort one might take that the work will be performed to sufficiently high technical standards, and that the partner organization has the depth of understanding to deliver the data and insight that you require to take your projects forwards. This is especially true when the work you are outsourcing becomes more “specialist” in nature; some types of analyses take years of experience to plan properly and to generate data that are fit for purpose. One may ask, where appropriate, how many regulatory submissions have been successfully made using the data that the CRO has generated, what questions were asked by the regulator, and how these were addressed.
The ultimate question one is trying to answer here is “Can I trust the data?”, and it is your responsibility to ensure that you can. I’ve often heard the phrase “We would have expected you to be doing that” in sponsor–CRO discussions, and it is my firm belief that this reflects a sponsor diligence oversight!
Documentation, Process, and Delivery
Here again, advice and guidance tend to be at a fairly high level and based on “management evaluation” of the provider. However, the areas of documentation, process, and delivery can be of crucial importance in the day-to-day working relationships with your CRO and can lead to an easier and more productive working relationship.
Of course, the “working” documentation provided should be carefully checked when first evaluating the supplier and should include SOPs and procedures relating to the general laboratory operations and anything specific to projects that you are planning, including the instrument SOPs. However, there are many instances where these are checked only as part of an audit, and those involved in the day-to-day work are not given the opportunity to review these documents. This is a dangerous situation and it is important that all sponsor personnel involved in the project are given the opportunity to comment, considering not only the specific technical and scientific details but also the flow and layout and how they compare to your own documentation. By the way, don’t be surprised if the documents are of a higher standard than your own. Swallow your pride and learn what you can. Some CRO organizations are highly professional and can often teach sponsors a lesson or two!
If you are involved in the documentation review process, be prepared to adapt your own internal assessment and evaluation criteria documents to be specific to the project that you are working on. Often these documents and their criteria are broad or more generic and it is important that this stage does not become a pro forma exercise; you should be able to flag specific technical issues that may be troublesome further down the line.
There are several key aspects that can help to quickly identify red flags, some of which I have summarized below:
System Suitability Testing: Are the correct aspects being measured to ensure the system is fit for the purpose of producing high quality data? What, if any, subjective criteria are included? How are the pass/fail criteria defined? What action criteria are there in the event of a test failure?
Out-of-Specification (OOS) Data: How are these defined? How are they flagged? What are the action criteria? What are the re-test criteria? How are OOS results reported?
Quantitative Analysis Protocols: How is instrument calibration defined and implemented? How are calibration curve data handled? What testing is done to verify linearity or heteroskedasticity beyond the usual r² regression coefficient? How are nonlinear instrument responses handled? How is bias verified? What checks are in place to ensure the instrument response is not drifting or changing over time, and are these verified using statistical significance tests? What criteria are defined for the omission of data points in calibration curves? How sensible is the approach to calculation of the results, and is this done mainly within the computer data system, or are there manual steps, typically using Microsoft Excel? Could you, as the sponsor, take the raw numbers from the sample preparation and the peak areas and manually carry out the result calculation? That is, how well does the CRO know exactly how the result is derived?
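To illustrate why r² alone is a weak linearity test, the sketch below fits an ordinary least-squares line to made-up calibration data with a slight detector roll-off at high concentration: r² exceeds 0.999, yet the relative residual at the lowest level is of the order of tens of percent (all numbers here are invented for illustration):

```python
def ols_fit(x, y):
    """Ordinary least-squares line y = a*x + b, returning the slope,
    intercept, r^2, and the relative residuals at each point."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(u * v for u, v in zip(x, y))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    pred = [a * v + b for v in x]
    ss_res = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))
    y_bar = sy / n
    ss_tot = sum((yi - y_bar) ** 2 for yi in y)
    r2 = 1 - ss_res / ss_tot
    rel_resid = [(yi - pi) / yi for yi, pi in zip(y, pred)]
    return a, b, r2, rel_resid

# Invented calibration data with a slight high-end roll-off
# (a perfectly linear detector would give response = 10 * conc).
conc = [1, 2, 5, 10, 20, 50]
resp = [9.98, 19.92, 49.5, 98.0, 192.0, 450.0]

a, b, r2, rel_resid = ols_fit(conc, resp)
print(f"r^2 = {r2:.4f}")  # > 0.999, so the curve "looks" linear
print(f"worst relative residual = {max(abs(r) for r in rel_resid):.0%}")
```

Inspecting relative residuals (or a formal lack-of-fit test) catches exactly the kind of low-level bias that a headline r² value hides, which is why the question is worth asking of a CRO.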
Method Development/Validation Protocols: How well defined are the analytical outcomes? How clear are the proposed tests and their pass or fail criteria? How systematic is the flow of the protocol? What statistical design or verification elements are there and how appropriate are they?
Reporting Formats: How amenable is the CRO to reporting data and outcomes using your own templates to avoid transcription errors on your side?
Data Access: What facilities are there for you to access either test results or raw data remotely? Is there an opportunity for you to “walk through” a data analysis exercise with the CRO staff? (This can help enormously when troubleshooting issues.)
Many sponsor organizations take the attitude that once the initial audits have been successfully completed, then a study of lesser value or importance will be given to the new CRO to test their mettle. I would strongly advise against this and, wherever possible, one should place a study using a known reference material or, preferably, samples that have been previously tested, either internally or at another CRO. This will really help to build confidence in your new partner and may even, as has happened to me on at least one occasion, prove that the initial testing was not properly carried out.
At the very least, one should be asking the provider to produce a protocol for the work that is to be carried out to assess the scientific approach taken.
As per my earlier comment, I have worked on both sides of the sponsor–CRO relationship and I can say with absolute certainty that the more diligence work that is carried out when selecting a partner or defining a new project, the more successful the outsourcing will be. By setting clear expectations for both parties across all the areas outlined above, and by properly defining the outcomes and the routes taken to achieve them, much angst, time, and money can be saved. Whilst we are all under great pressure to generate data and analytical insight to help take our projects forwards, it is always the case that more haste leads to less speed. Having a scientifically competent partner with whom you can communicate regularly and honestly will make your analytical outputs better and your life so much easier.
Contact author: Incognito
E-mail the editor: firstname.lastname@example.org