A Holistic Approach to Assay Development and Subsequent Data Management – An Overview


Developing an assay means having the opportunity to hand-select the most important attributes by which you’ll collect and assess your data. For this reason, the selection process should ideally involve a thoughtfully holistic and integrated approach that yields robust, repeatable, and reproducible assays.

Assays are the backbone of modern pharma research. Countless hours of R&D are spent identifying, analyzing, refining, and developing compounds that profoundly affect our everyday lives. All of these activities are based on increasingly sophisticated screens that mimic physiologic conditions more precisely than ever before. From drug discovery and development through manufacturing and QA/QC, developing robust and repeatable assays and managing big data are high priorities for pharma labs.

In the following sections, we will discuss a holistic approach to developing robust, repeatable, reproducible assays, and provide a glimpse of the value of an integrated IT and informatics infrastructure that significantly benefits the entire process. We will also provide specific examples of common gaps and discuss how labs can overcome them.

Steps to Ensuring Success in Assay Development

1. Choose the best model for your assay

First, when developing an in vitro assay, it is important to use an authentic cell model from a reliable source. Using a misidentified or, worse, contaminated cell line can lead to false conclusions and irreproducible results. A good cell line (or tissue sample) forms the underpinning of an experiment, and thus it is critical to be diligent in cell line selection.

Once a cell model has been determined, ensure a sufficient supply of the appropriate cells or tissue, ideally from the same batch or lot, to run the experiments. Within a phase of research, materials from the same lot ensure consistency and reliability, avoiding the delays, costs, and lost time that lot-to-lot variability can introduce into a project.

Another trending consideration in selecting the appropriate model for more functional assays is physiological relevance. Spheroid (3D) and organoid systems mimic in vivo conditions more closely than their 2D monolayer predecessors. The 3D model approach can potentially reduce the number of animal studies needed during both the target validation and preclinical stages of drug development. There is therefore an increasing preference for 3D cell/tissue systems upfront, for ethical and cost reasons in addition to the physiological relevance and “fail drug early” rationale.

However, the choice between a 3D and a 2D model will ideally be made in the context of other assay requirements, discussed in the next section. Although a 3D model system can closely mimic a physiologically relevant state in vitro (or ex vivo), a minimal, confirmatory level of in vivo studies would still need to be performed within the preclinical drug development phase to evaluate ADME (Absorption, Distribution, Metabolism, and Excretion) and toxicity effects of the drug candidate(s).

For in vivo studies, selecting the appropriate animal model system, ideally one specific to the disease under investigation, is important for both organ/tissue imaging assays and in vitro assays on biological samples.

2. Ask the right questions

A holistic approach to assay development includes “big picture” considerations alongside end-to-end workflow and technical considerations. Hence, one of the primary questions is whether the assay will elucidate a mechanism of action involving a signaling pathway, receptor-ligand binding, or protein-protein interactions, and/or identify and characterize biomarker panel(s).

In a well-designed assay, for example, biochemical and/or cell-based events such as activation, modulation, and inhibition of a target can be inferred. Similarly, phenotypic assays provide functional and subcellular localization insights, although these may not require the same background rigor as a targeted approach. It remains critical to consider experimental controls, specificity, and content or antibody selection for a particular target. Considerations such as a streamlined workflow and an automation environment amenable to screening operations are important as well.

3. Consider the technical aspects of the assay

Beyond the “big picture,” there are a number of technical considerations in selecting and developing an assay. These include compatibility of the starting material or sample matrix with the assay (particularly for complex matrices, if a wash step is required within the workflow), a high signal-to-background ratio, sensitivity, dynamic range, reproducibility, and accuracy.

Although there are numerous types of assays, end users can select from a spectrum of choices, from simple formats prized for ease of use and high signal-to-background to complex ones that deliver greater sensitivity, dynamic range, and multiparametric data for downstream analysis.

For example, in immunoassays, to achieve greater sensitivity and a broader dynamic range, end users may consider migrating from wash-based ELISA to no-wash (homogeneous) assays such as ALPHA (Amplified Luminescent Proximity Homogeneous Assay) or TR-FRET-based HTRF or LANCE Ultra. Eliminating the wash steps helps decrease variability, streamline the assay, and increase its overall throughput.

An automated no-wash technology can be easily miniaturized and offers a much wider dynamic range of detection, as well as a more sensitive read-out of low analyte concentrations against soluble proteins. No-wash assays can accommodate a broad range of analytes, from small hormones to large protein complexes.

Irrespective of the choice of a wash or no-wash assay, the performance attributes of signal-to-background, sensitivity, dynamic range, and reproducibility take precedence.
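These performance attributes can be quantified from control wells during development. As one illustration (not drawn from a specific assay in this article), the widely used Z′-factor statistic combines the separation and spread of the positive and negative controls; the sketch below computes it alongside signal-to-background, using made-up control readings:

```python
from statistics import mean, stdev

def assay_quality(pos, neg):
    """Compute signal-to-background and Z'-factor from control wells.

    pos: raw signals from positive (maximum-signal) control wells
    neg: raw signals from negative (background) control wells
    """
    s2b = mean(pos) / mean(neg)  # signal-to-background ratio
    # Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|
    z_prime = 1 - 3 * (stdev(pos) + stdev(neg)) / abs(mean(pos) - mean(neg))
    return s2b, z_prime

# Hypothetical control-well readings (arbitrary units)
pos = [980, 1010, 995, 1005, 990, 1020]
neg = [52, 48, 50, 49, 51, 50]
s2b, z = assay_quality(pos, neg)
print(f"S:B = {s2b:.1f}, Z' = {z:.2f}")
```

A Z′ of 0.5 or above is a commonly cited threshold for a screening-quality assay; low signal-to-background or noisy controls drag the statistic down.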

4. Selecting the appropriate microplate

In general, microplates are designed to ANSI/SLAS compliance guidelines, are fabricated from a variety of materials, come in different format sizes and geometries, and may include substrate coatings. The geometry of the microplate bottom is engineered with several different assays in mind, including flat-bottom, U-bottom, and V-bottom wells.

The latter is typically used in compound library preparation and automated delivery of materials to screening plates. For cell-based assays, microplates are typically designed with tissue culture treatment, are optically clear for imaging assays, and may be coated with ECM substrates to promote healthy cell growth and maintenance.

Just as important is choosing the right microplate for biochemical assays, taking into account factors such as the required volume and density of wells, and whether the new plate will be optically compatible with the lab’s existing screening instrumentation.

Failing to achieve the required well volume and/or density can have a profound impact on the time spent processing samples, and can also derail results, especially when soluble analyte concentrations are low.
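A simple plate-format sanity check can catch volume mismatches before a run. The working-volume ranges below are approximate and vary by vendor and well geometry; treat them as illustrative, not specification values:

```python
# Common ANSI/SLAS microplate formats with approximate working-volume
# ranges in microliters (illustrative numbers; consult the vendor's
# datasheet for real limits).
PLATE_FORMATS = {
    96:   {"layout": (8, 12),  "working_ul": (75, 200)},
    384:  {"layout": (16, 24), "working_ul": (20, 80)},
    1536: {"layout": (32, 48), "working_ul": (2, 10)},
}

def check_volume(wells, assay_volume_ul):
    """Flag a planned assay volume that falls outside a plate's typical range."""
    lo, hi = PLATE_FORMATS[wells]["working_ul"]
    if not lo <= assay_volume_ul <= hi:
        return (f"{assay_volume_ul} uL is outside the typical "
                f"{lo}-{hi} uL range for {wells}-well plates")
    return "ok"

print(check_volume(384, 50))   # within range
print(check_volume(1536, 50))  # too large for a 1536-well plate
```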

5. QC for success: ensure the quality, titration, and optimization of all reagents used in an assay

When running high-throughput assays, it is critical to lean on QC measures to ensure the robustness and consistency of results. This means having a protocol for assessing the state of buffer solutions, antibodies, and other reagents, as well as the instruments themselves. Plan to use reagents within the manufacturer’s recommended specifications. When any of the above have reached the end of their lifespan, it is important to remove them from the assay to ensure the integrity of the screen.

This same concept applies to the use of assay instruments or materials that are no longer (or were never) industry standard. In developing a custom assay, the goal is progress. Using instruments that have become outdated, do not integrate easily, or generally do not support the next stages of research undermines best practices in data collection and somewhat defeats the purpose of investing in a new assay.

6. Consider fluid handling technologies

For microplates, the ability to dispense accurate volumes is critical to the success and reproducibility of an assay. An error-prone practice in this area is selecting a microplate with well volumes far too small for a lab’s pipetting capabilities, resulting in inconsistency and variability.
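One common way to verify that a liquid handler can serve a given plate format is to measure the percent coefficient of variation (%CV) of replicate dispenses, for example from a gravimetric test. The replicate values below are hypothetical:

```python
from statistics import mean, stdev

def dispense_cv(volumes_ul):
    """Percent coefficient of variation of replicate dispensed volumes."""
    return 100 * stdev(volumes_ul) / mean(volumes_ul)

# Hypothetical replicate dispenses targeting 10 uL
reps = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0]
cv = dispense_cv(reps)
print(f"CV = {cv:.1f}%")
```

If the %CV at a plate's working volume exceeds the lab's acceptance criterion, the format is too small for that dispenser and variability will propagate into the assay results.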

7. Choose the right type of plate reader or imager

Microplate reader capabilities determine the spectrum of assay readouts. If multiple detection modalities are required to read various types of assays, a multimode reader should be used. Microplate readers and imaging instruments detect, acquire, and export data into software or file formats for further analysis.
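As a sketch of that export-and-analyze step, the snippet below parses a hypothetical two-column CSV export into a well-to-signal map. The "Well,Signal" layout is an assumption for illustration; real instruments each define their own export formats:

```python
import csv
import io

# Simulated reader export; in practice this would be open("export.csv")
raw = io.StringIO("Well,Signal\nA1,1032\nA2,995\nB1,51\nB2,49\n")

def load_plate(fh):
    """Read a Well,Signal CSV export into a {well: value} dictionary."""
    return {row["Well"]: float(row["Signal"]) for row in csv.DictReader(fh)}

plate = load_plate(raw)
print(plate["A1"])  # 1032.0
```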

Detection and/or imaging instruments should ideally be integrated with the lab’s other instrumentation and assay systems so that end users can seamlessly access data and keep the infrastructure running smoothly. Independent workstations or siloed lab environments can negatively impact productivity, data management, and communication between end users.

Robust informatics tools and services play a vital role throughout the assay development process

Data management, analysis, and visualization informatics tools and services are vital to the assay development process. Clunky or obsolete technology that does not support the evolving lab workflow, or is difficult to navigate or learn, can decelerate an entire project. Although investment in IT infrastructure and additional informatics resources, both skilled personnel and solutions or tools, adds to the upfront costs of a project, in the long term it ensures a robust, streamlined workflow and broadens data capabilities to onboard next-generation technologies.

In addition, at the research and operational levels, integrated IT and informatics solutions not only ensure upstream reproducibility of results, but also deliver compliance via electronic security and audit trails, respectively.

For example, creating, managing, and analyzing data in an electronic lab notebook (ELN) enables organization of metadata such as notes, assay protocols, and methods, as well as structured data interpretation and visualization. Within the ELN, audit trail capabilities become more important for compliance as the drug compound moves downstream into preclinical development and into manufacturing and QA/QC, within regulated GLP and GMP environments, respectively.
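As a rough illustration of the kind of structured record and audit trail an ELN maintains, consider the sketch below. The schema is entirely hypothetical; the field names are illustrative and do not correspond to any vendor’s product:

```python
import json
from datetime import datetime, timezone

# Hypothetical minimal ELN assay record: metadata plus an append-only
# audit trail (illustrative field names, not a real ELN schema).
record = {
    "assay_id": "HTRF-0421",
    "protocol": "TR-FRET kinase activity, v3",
    "metadata": {"cell_line": "HEK293", "lot": "A1287", "plate_format": 384},
    "results_file": "plates/HTRF-0421_run01.csv",
    "audit_trail": [],
}

def log_change(rec, user, action):
    """Append a timestamped entry to the record's audit trail."""
    rec["audit_trail"].append({
        "user": user,
        "action": action,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

log_change(record, "jdoe", "created record")
log_change(record, "asmith", "attached results_file")
print(json.dumps(record["audit_trail"], indent=2))
```

The point of the append-only trail is that every change carries a user and a timestamp, which is the kind of traceability GLP/GMP auditors expect from electronic records.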


Developing a suite of assays for various applications within the workflow is an opportunity to evaluate and prioritize the key attributes of each assay, followed by the collection, management, interpretation, and security of data. Hence, the project management process for assay development and downstream evaluation should ideally follow the thoughtfully charted, holistic, and integrated path discussed above.

A solutions-and-systems approach results in an integrated infrastructure that delivers value and confidence across the end-to-end drug discovery and development process.

