Open access peer-reviewed chapter

Going Small: Using Biophysical Screening to Implement Fragment Based Drug Discovery

Written By

John J. Bowling, William R. Shadrick, Elizabeth C. Griffith and Richard E. Lee

Submitted: 08 April 2016 Reviewed: 20 October 2016 Published: 30 November 2016

DOI: 10.5772/66423

From the Edited Volume

Special Topics in Drug Discovery

Edited by Taosheng Chen and Sergio C. Chai


Abstract

Screening against biochemical targets with compact chemical fragments has developed a reputation as a successful early‐stage drug discovery approach, thanks to recent drug approvals. Because fragments have weak initial target affinities, they require the use of sensitive biophysical technologies (NMR, SPR, thermal shift, ITC, and X‐ray crystallography) to accommodate the practical limits of going smaller. Application of optimized fragment biophysical screening approaches now routinely allows for the rapid identification of fragments with high binding efficiencies. The aim of this chapter is to provide an introduction to fragment library selection and to discuss the suitability of screening approaches adapted for lower‐throughput biophysical techniques. A general description of the metrics used in the progression of fragment hits, the need for orthogonal assay testing, and guidance on potential pitfalls are included to assist scientists considering initiating their own fragment discovery program.

Keywords

  • fragment‐based drug discovery
  • biophysical screening
  • efficiency metrics
  • nuclear magnetic resonance spectroscopy
  • surface plasmon resonance
  • thermal shift assay
  • isothermal titration calorimetry
  • fluorescence polarization
  • X‐ray crystallography

1. Introduction

“Going small” with fragment‐based drug discovery (FBDD) denotes using low molecular weight compounds to probe a therapeutic target. It also means using smaller, tailored libraries and lower screening throughput in more carefully measured assays, a consequence of relying on biophysical technologies rather than classical high‐throughput screening (HTS) approaches performed in 384‐well plates that detect product formation. FBDD at its core is target‐based drug discovery, but the initial approach of fragment screening differs from standard lead‐like screening, which utilizes much larger, higher molecular weight screening libraries.

In theory, modern target‐based drug discovery screening libraries are designed to maximize coverage of chemical space. This is especially important for groups that use HTS technologies and screen against diverse targets. These target‐based programs tend to rely on screening the highest practical number of chemical entities from their screening libraries, sometimes accumulating millions of compounds [1]. However, with over 166 billion possible synthetically accessible organic molecules containing up to 17 heavy (nonhydrogen) atoms [2], even the biggest screening libraries cannot statistically represent this vast chemical space [3].

From modeling described in a 2001 article, Hann and colleagues showed how higher molecular complexity (i.e., ligand size) significantly decreased the probability of protein‐site molecular recognition [4]. The authors outlined this as a primary shortcoming of the combinatorial chemistry/HTS approach to drug discovery and promoted the idea that screening smaller libraries with reduced complexity could be a complementary approach. Thus, by reducing compound complexity, FBDD evades the pitfall of scaffold bias which develops in large lead‐like screening libraries.

In this chapter, the reader should come to appreciate how going small is an intrinsically orthogonal screening platform that is easily integrated with established biophysical techniques. The methods, examples, and citations discussed are intended to guide a newcomer to FBDD, specifically scientists who have some prior experience with drug screening principles.


2. The rise of fragments as a screening ideology

Fragment‐based drug discovery started as a concept published in 1981 [5] by biochemist William P. Jencks, who characterized the binding affinities of molecules to proteins as being built from components. Citing several examples, he described how the Gibbs free energy of a two‐component molecule binding to a protein could be expressed as ΔG°AB = ΔGiA + ΔGiB + ΔGs, where the ΔGi values are the “intrinsic binding energies” of components A and B and ΔGs is their “connection Gibbs energy.” A key aspect of the concept is that each component's contribution to the observed binding energy (ΔG°AB) would be significant but individually weak, thus creating a challenge for detection. Jencks' concept was subsequently overshadowed by the excitement around combinatorial chemistry/HTS drug discovery approaches. In 1996, Jencks' conceptual observations were experimentally validated by Fesik and colleagues, who produced a drug lead by linking fragment hits detected with a sensitive two‐dimensional nuclear magnetic resonance (NMR) spectroscopy binding assay (an approach often referred to as SAR by NMR) [6]. Since then, fragment‐based drug discovery has energized the pharmaceutical industry. In 2011, a drug for BRAF‐mutated metastatic melanoma became the first FDA‐approved drug discovered via FBDD [7]. A second, for chronic lymphocytic leukemia, was approved in 2016 [8], and several related candidates are currently making their way through the pipeline [9].
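To make the additivity idea concrete, the short calculation below is a hypothetical numerical sketch (not taken from Jencks' paper): it converts two weak component affinities into free energies, adds an assumed connection term, and converts the sum back into a dissociation constant.

```python
# Hypothetical illustration of Jencks' additivity concept: two weakly binding
# fragments can, in principle, combine into a much tighter linked ligand.
import math

R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 298.15     # temperature, K

def dG_from_Kd(Kd_molar):
    """Binding free energy (kcal/mol) from a dissociation constant (M)."""
    return R * T * math.log(Kd_molar)

def Kd_from_dG(dG):
    """Dissociation constant (M) from a binding free energy (kcal/mol)."""
    return math.exp(dG / (R * T))

dG_A = dG_from_Kd(5e-3)   # fragment A, Kd ~5 mM  -> about -3.1 kcal/mol
dG_B = dG_from_Kd(1e-3)   # fragment B, Kd ~1 mM  -> about -4.1 kcal/mol
dG_s = 1.0                # assumed unfavorable "connection" term, kcal/mol

dG_AB = dG_A + dG_B + dG_s
print(f"Linked ligand: dG = {dG_AB:.1f} kcal/mol, Kd ~ {Kd_from_dG(dG_AB)*1e6:.0f} uM")
```

With these assumed numbers, two millimolar binders combine into a ligand in the tens of micromolar range, illustrating why individually weak components are hard to detect yet valuable once linked.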


3. Fragment primary screening

At present, NMR, surface plasmon resonance (SPR), thermal shift assay (TSA), isothermal titration calorimetry (ITC), and X‐ray crystallography (Sections 3.2–3.4, 4.1 and 4.2) are the most widely used techniques in FBDD. Given their respective throughput capacities, NMR, SPR, and TSA are often used as the primary screening technologies, with ITC and X‐ray crystallography reserved for secondary screening. Figure 1 shows the effective ligand affinity coverage of each technique, which partly demonstrates its utility for FBDD. All biophysical screening techniques work best in combination, and individual hits need careful orthogonal validation. Crystallography is the gold standard because the information gained allows for rapid ligand advancement. However, its application as a primary screen is often impractical due to resource and time limitations. High‐concentration biochemical inhibition or fluorescence polarization (FP) assays can be used in some cases for orthogonal validation of primary screening hits where crystallography is not an option.

Figure 1.

The range of fragment affinities covered by the biophysical techniques described in this chapter. The techniques are ranked from the top downwards in order of their typical frequency of use in FBDD programs (SPR being a close second). NMR (yellow outline) is represented as a medium‐throughput method but can be low‐throughput based on the availability of protein and whether recycling is required. *Fluorescence polarization, technically a biochemical technique and highly dependent on probe affinity, is included for comparison but can be applied to fragment screening.

The workflow represented in Figure 2 shows how these techniques might be organized into a traditional screening paradigm. It is common to use some of these techniques in parallel, particularly at the secondary screening stage since there are always fewer compounds to evaluate. Pragmatically, scientists should obtain structural insights at this secondary stage to validate primary screening results. Structural information is preferred when deciding to progress a fragment to the hit generation stage (discussed further in Section 5.2).

Figure 2.

An example of the typical FBDD screening workflow. The workflow assumes structural information by X‐ray or NMR. The hierarchy will not accurately depict the resources of all drug discovery programs. Dashed connections represent screening options at the primary and secondary screening stages. Arrows point to the results from each screen to be ranked or compared to those of other techniques. Fragments promoted to secondary screening will eventually require structural information to be progressed to hit generation. *The ITC technique is typically used for ranking fragment hits after secondary screening.

3.1. Fragment library design

An obvious first step for any primary fragment screen is to source compounds. However, wielding a proper fragment library as a tool for hit generation conflicts with traditional lead‐like screening practices. Central to the conflict is the routine use of high compound concentrations to accommodate the expected low binding affinities. Practical pitfalls, such as compound aggregation, compound precipitation, dramatic pH changes, detector saturation, and nonspecific interactions, are a minor concern at the nanomolar‐to‐micromolar concentrations used to screen larger, lead‐like molecules, but they can become major issues when millimolar concentrations are used in fragment screening.

In 2003, scientists at Astex Pharmaceuticals published a synopsis of their emerging fragment drug‐discovery program and noted that the average physical properties of their fragment hits conveniently clustered around the number 3 (molecular weight <300, hydrogen bond donors ≤3, hydrogen bond acceptors ≤3, and cLogP ≤3) [10]. As a complement to Lipinski's rule of 5 (RO5), the fragment rule of 3 (RO3) became a convenient target toward which chemical suppliers built fragment libraries from their existing stores. Ten years later, Astex Pharmaceuticals revised their position [11], stating that, like Lipinski's RO5, the RO3 was more of a guideline and that their refined library consisted of fragments with fewer than 17 heavy (nonhydrogen) atoms and molecular mass <230 Daltons. This exemplifies how well‐constructed fragment libraries should rely heavily on practicality to be effective tools and speaks to the need for additional scrutiny of (in order of importance) solubility, stability, and reactivity for effective fragment screening.
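Property cutoffs such as the RO3 are simple to apply computationally. The sketch below uses the open‐source RDKit toolkit and assumes fragments are supplied as SMILES strings; the example molecules are arbitrary placeholders.

```python
# Minimal RO3-style property filter using RDKit (fragments supplied as SMILES).
from rdkit import Chem
from rdkit.Chem import Descriptors, Crippen, Lipinski

def passes_rule_of_three(smiles):
    """Return True if the molecule satisfies the RO3 guideline described above."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return False  # unparsable structure
    return (Descriptors.MolWt(mol) < 300
            and Lipinski.NumHDonors(mol) <= 3
            and Lipinski.NumHAcceptors(mol) <= 3
            and Crippen.MolLogP(mol) <= 3)

fragments = ["c1ccccc1C(=O)N", "CC(C)Cc1ccc(cc1)C(C)C(=O)O"]  # example SMILES
print([s for s in fragments if passes_rule_of_three(s)])
```

In practice, such a filter would be combined with the empirical solubility, stability, and reactivity checks discussed above rather than used in isolation.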

Commercial and nonprofit access to fragment libraries exists, and several examples have been characterized [12, 13]. A custom library allows for existing libraries to be used in addition to catalog resources with some tailoring based on specific cheminformatics principles, such as optimization of the representative chemical space, avoidance of nuisance compounds, and guidance with pharmacophore models. One early‐stage example of note is the Global Fragment Initiative (GFI) by Pfizer [14], wherein the library was built from compounds on hand, purchased, and synthesized. Each member of the GFI library was rigorously characterized and empirically tested for aqueous solubility up to 1 mM, with the intent of using the library for multiple biophysical techniques. How a fragment library is procured will depend on an acceptable balance of convenience and cost, but it is highly recommended that the end user have methods in place to reliably assess each compound for practical use at high concentrations. For an example of this workflow, see Figure 3.

Figure 3.

A suggested library construction workflow for FBDD. Once compounds are obtained as dry stocks, the workflow proceeds left to right. Having redundancy built into the screening stocks helps rule out contamination or mishandling. It is presumed that fragments will eventually be used in NMR studies and are therefore dissolved in deuterated solvents (d6‐DMSO and D2O). Rigid quality control is recommended to eliminate spoiled or misidentified compounds and should be repeated for hits or in the event of significant additions of fragments to the library.

Fragment hits can have limited traction toward chemical expansion or linking as a consequence of their size. As a safeguard, Merck strategically redesigned its general FBDD library to accommodate more structure‐activity relationships and to fill structural gaps by visual inspection [15]. The purposeful move away from diversity in its general library was a concerted effort of cheminformatics and crowdsourcing of medicinal chemists to gain pipeline traction. The strategy leads to larger general screening libraries and effectively restricts widespread application to appropriately equipped programs. Regardless, any FBDD program design must account for this pitfall and ensure the potential to develop fragment hits through chemistry or catalogs.

Finally, Pan‐Assay Interference Compounds (PAINS) are a well‐known classification of chemical entities that show activity across multiple assays and proteins; their practical impact on FBDD has been thoroughly reviewed [16], and related cheminformatics filters are available via the Internet [17]. The reduced chemical complexity of fragments inherently diminishes the number of “worst offenders” in a fragment library, and bad fragments are often quickly identified and triaged from screening collections.
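For readers who want to apply such filters programmatically, RDKit also ships a PAINS filter catalog. The sketch below uses an arbitrary example SMILES, and flagged compounds should be reviewed rather than discarded automatically.

```python
# Flagging potential PAINS with RDKit's built-in filter catalog (a sketch; curated
# catalogs differ between groups, so alerts are a prompt for review, not a verdict).
from rdkit import Chem
from rdkit.Chem import FilterCatalog

params = FilterCatalog.FilterCatalogParams()
params.AddCatalog(FilterCatalog.FilterCatalogParams.FilterCatalogs.PAINS)
catalog = FilterCatalog.FilterCatalog(params)

mol = Chem.MolFromSmiles("O=C(c1ccccc1)C=Cc1ccc(O)cc1")  # example enone-containing fragment
entry = catalog.GetFirstMatch(mol)
print(entry.GetDescription() if entry else "no PAINS alert")
```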

In conclusion, for FBDD, it is prudent to prioritize highly soluble fragment libraries with a diversity of ring shapes that can match a broad range of hydrogen bonding interactions from the protein target. A minimalistic approach would be to eliminate only the most predictable fragment “show stoppers,” such as those containing toxicophores subject to xenobiotic metabolism, since it is often easy to scaffold hop in the early stages of FBDD to remove unwanted motifs.

3.2. Nuclear magnetic resonance

Modern NMR spectroscopy is best known for enabling the three‐dimensional characterization of ordered molecular structures in solution and was the first technique to be used for fragment screening [6]. It is also one of the few biophysical techniques that can easily be switched between perspectives of the small molecule and the protein at run time. A growing list of NMR experiments used in fragment screening can help validate hits without using additional biophysical techniques.

Samples are prepared in situ by using automation (such as the Gilson GX‐271 in Figure 4) or by manual pipetting. One immediate benefit of the manual method is the ease with which a scientist can eliminate precipitated or turbid samples by visual (optical) inspection. Individual inspection of samples is time‐consuming but a must for any successful FBDD program, providing important feedback about the fragment library. Fragments must be dissolved in deuterated solvent (e.g., 99.9% d6‐DMSO, Cambridge Isotope Laboratories, Inc., USA) for programs using NMR at any stage (see Figure 3). Typical concentration ratios for test samples are 10:1 up to 30:1 fragment to protein in the chosen buffer (phosphate buffers being the most common). Prior knowledge of the Kd and stoichiometry is not required but can be used to tune concentrations to avoid unintended site saturation by an individual fragment, which is a general concern when screening fragment mixtures or performing competition experiments. Finally, whether to use surfactants (e.g., 0.05% Triton X‐100) to help eliminate false positives is best decided on a case‐by‐case basis; if used, the conditions should be consistent across orthogonal techniques, with the exception of crystallography.

Figure 4.

Screening fragments by NMR may include the use of automated sample preparation and handling. Common NMR experiments used in FBDD programs are listed, categorized according to their use of the magnetization pathways indicated. For ligand‐detected experiments, additional pathways to the unbound ligand also exist. The white arrow indicates magnetization transfer from the protein to its bound ligand and is specific to the saturation transfer difference (STD) experiment.

Spectroscopy experiments that indicate binding from the fragment's perspective are structurally less informative but have a higher dynamic range than protein‐detected experiments. Because significant cost savings can be made by using unlabeled protein, early‐stage, budget‐conscious programs may focus on the ligand‐detected suite of experiments shown in Figure 4, often acquiring them in parallel for each sample. The saturation transfer difference (STD) experiment can provide a binding‐epitope map, as magnetization can only travel through the protein to the bound fragment [18]. The epitope map enables a scientist using unlabeled protein to identify the portions of the fragment in closest proximity to the protein and, conversely, the portions available for expansion or linking during hit generation. Specialized experiments, such as the interligand nuclear Overhauser effect and target‐immobilized NMR screening (ILOE [19] and TINS [20], respectively), are best reserved for the study of difficult proteins or for competition experiments. From the protein perspective, several variants of the two‐dimensional HSQC experiment (typically with 1H nuclei measured directly and X nuclei indirectly) help to disperse the numerous signals of a protein target; the choice of variant depends on the protein isotopic enrichment strategy. TROSY (another HSQC variant) can be used for large, usually perdeuterated proteins but requires a high‐field instrument (i.e., 800 MHz and up); the discovery of methods to selectively label methyls [21] has simplified the resulting spectra and enabled screening on lower‐field instruments (e.g., 500 MHz) [22].

To reduce protein consumption, NMR FBDD relies on the screening of equimolar fragment mixtures. This step is immediately followed by mixture dereplication, usually involving manual interpretation of spectra, although with careful sample preparation and a well‐curated fragment spectral database, hits can be identified by software (e.g., Mnova Suite, Mestrelab Research S.L.; ACD/Spectrus Suite, Advanced Chemistry Development, Inc.; Topspin, Bruker Inc.). Individual reference spectra are acquired during the quality control steps of fragment library design, and mixtures are then designed to avoid spectral overlap and reactivity. The number of fragments used per mixture is not standardized, but basic statistical principles support using as few fragments as possible while avoiding mixing acids with bases or nucleophiles with electrophiles. For 1H‐observed experiments, 5–7 compounds per mixture is a reasonable starting point. Validation of hits is accomplished by a second round of screening of individual fragments. At the validation stage, the use of spectroscopy experiments that provide binding site information (e.g., HSQC, STD mapping, ILOE) is highly recommended if crystallography is not readily available. Any resulting structural information is crucial for promoting the fragment into the downstream processes of medicinal chemistry and hit generation.
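The pooling principle just described lends itself to simple automation. The following Python sketch uses hypothetical helper names and reactivity annotations; a real implementation would also consult the reference spectra to avoid spectral overlap.

```python
# Naive sketch: pool fragments into NMR screening mixtures while keeping
# incompatible reactivity classes (acid/base, nucleophile/electrophile) apart.
from collections import namedtuple

Fragment = namedtuple("Fragment", ["name", "reactivity"])  # e.g. "acid", "base", "neutral"

INCOMPATIBLE = {("acid", "base"), ("base", "acid"),
                ("nucleophile", "electrophile"), ("electrophile", "nucleophile")}

def build_mixtures(fragments, pool_size=6):
    """Greedily assign each fragment to the first pool with no reactivity clash."""
    pools = []
    for frag in fragments:
        for pool in pools:
            clash = any((frag.reactivity, other.reactivity) in INCOMPATIBLE for other in pool)
            if not clash and len(pool) < pool_size:
                pool.append(frag)
                break
        else:
            pools.append([frag])  # start a new mixture if none fits
    return pools

library = [Fragment("F1", "acid"), Fragment("F2", "base"), Fragment("F3", "neutral"),
           Fragment("F4", "acid"), Fragment("F5", "neutral")]
for i, pool in enumerate(build_mixtures(library, pool_size=6), 1):
    print(f"Mixture {i}: {[f.name for f in pool]}")
```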

With the exception of WaterLOGSY, titration and analysis of the resulting signals from these NMR experiments can provide binding affinity estimates for fragments with reasonable accuracy. In addition to the previously described STD experiment, 19F NMR screening by filtered transverse relaxation (T2), a filter also referred to as a Carr‐Purcell‐Meiboom‐Gill (CPMG) scheme, can be a powerful option if used in competition with a known fluorinated ligand whose Kd has been measured carefully by more rigorous techniques (e.g., SPR, ITC, FP). The potential benefit is that one sample can be analyzed for fragments in competition with, or perhaps having an allosteric contribution to, binding, with the binding affinity back‐calculated relative to an internal nonbinding control or electronic reference signal [23].

There are relatively few drawbacks to using available NMR facilities in an FBDD program, considering the method's ability to contribute to every aspect of the workflow, such as library quality control and hit generation. The two drawbacks most often cited are the speed of the screen from sample preparation to data analysis and the demands on protein production for screening by NMR.

3.3. Surface plasmon resonance

Surface plasmon resonance shares the spotlight with NMR as a major screening technique for FBDD programs. The hurdles that come with using an immobilized protein for screening are counterbalanced by increased sensitivity and immediate access to kinetics data. Although absolute binding kinetics are not assured when dealing with weak affinities, for well‐optimized experiments, obtaining the ka (association) and kd (dissociation) rate constants and the KD (equilibrium dissociation constant) is certainly possible. Interpreting the resulting sensorgrams can be challenging, but thankfully, the biosensor community has over 25 years of experiments to help set standards; hundreds of such studies were published in 2009 alone [24]. With improvements in software, including experiment wizards, relatively new users can screen numerous fragments.

These experiments generally have long lead‐in times, as protein immobilization chemistry and buffer conditions need to be carefully optimized prior to screening. Setting aside routine instrument maintenance and screen validation using appropriate controls, the traditional stages of each experiment are buffer and compound preparation, target immobilization, start‐up samples (i.e., buffer match, blanks, and positive controls), the fragment primary screen, data reduction and analysis, hit selection, and secondary dose response of hits.

Successful FBDD programs focus most practical attention on the preparation of the chip immobilization surface, ensuring stability between experiments. This focus on stability assumes that the loading conditions for the protein have been standardized and validated across multiple coupling methods, such as covalent amine coupling [25] (to the amine terminus or lysine side chains) or the less FBDD‐amenable coupling via protein tags (e.g., biotinylation or poly‐histidine).

Once the target is loaded, screening samples should be prepared as single‐point concentrations, for example 100 μM. The concentrations can be variable, but the expectation is that they are carefully prepared to avoid common problems such as precipitation and aggregate formation that may produce nonstoichiometric binding to the target. The use of detergents is allowable for the sake of the target but further complicates buffer matching in the reference channel. Troublesome fragments identified by their atypical sensorgrams are usually triaged from screening collections.

Once the data have been collected, data reduction and normalization follow, and some practice is required to perform these steps efficiently. An experienced scientist can perform a first pass of the entire data set to quickly exclude sensorgrams for which a reasonable curve fit is unlikely due to compound incompatibilities or systematic problems with the instrument (Figure 5). Next, data reduction can seem like a tedious process to simply “clean” the sensorgrams, but in addition to aligning the injection time points, it checks the soundness of the blank injections, which yield important problem‐solving data. Finally, the configuration of most instruments requires multiple runs to cover an entire fragment library, so run‐to‐run variation naturally exists. Normalization seeks to enable the comparison of experimental responses regardless of target density, binding activity, molecular weight, and buffer mismatch. The most expedient way to achieve this is to use the known concentrations and binding affinities of the control compound injections to convert the response units (RU) to a relative occupancy (θ), wherein 100% would be saturated binding (i.e., θ = Req/Rmax).

Figure 5.

Ideal (green) and problematic (red) sensorgrams simulating binding and nonbinding events during an SPR experiment. Aggregation is typically concentration‐dependent, whereas incorrect stoichiometry will not fit simple binding models. A relative occupancy (θ) of 100% with 1:1 stoichiometry is indicated by the dashed line; fragments used at high concentrations that appear to exceed it may be binding nonspecifically, interacting weakly with multiple sites on the protein, with the chip surface, or with nothing at all.
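As a minimal illustration of the normalization described above, the following Python sketch (hypothetical function names and example numbers) estimates Rmax from a control injection of known concentration and KD, assuming simple 1:1 Langmuir binding, and then converts a fragment response into a relative occupancy.

```python
# Converting SPR response units to relative occupancy (theta = Req/Rmax), assuming
# 1:1 binding. Rmax is estimated from a control compound of known concentration and
# KD rather than from theoretical immobilization levels.
def rmax_from_control(Req_control, conc_control, Kd_control):
    """Estimate Rmax (RU) from a control injection using the 1:1 Langmuir isotherm."""
    fractional = conc_control / (conc_control + Kd_control)
    return Req_control / fractional

def occupancy(Req, Rmax):
    """Relative occupancy (0-1); 1.0 corresponds to saturated 1:1 binding."""
    return Req / Rmax

Rmax = rmax_from_control(Req_control=28.0, conc_control=10e-6, Kd_control=2e-6)  # RU, M, M
theta = occupancy(Req=15.0, Rmax=Rmax)
print(f"Rmax ~ {Rmax:.1f} RU, fragment occupancy ~ {theta*100:.0f}%")
```

Responses that substantially exceed the estimated Rmax are the nonstoichiometric binders flagged in Figure 5.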

Recently developed methods have increased the throughput of dose response studies, providing kinetic information earlier in the screening workflow and practically eliminating the need for a secondary set of experiments for hits. For example, a nonbinding diluent (i.e., 20%, w/v sucrose) can be used to create a range of compound concentrations from the same sample [26], or individually prepared sample concentrations can be sequentially injected (e.g., “single cycle kinetics” or kinetic titration) [27]. Both methods save time by avoiding multiple regeneration steps. Further gains have been realized by using Taylor dispersion injections [28] in a longer flow path (e.g., OneStep™, SensiQ Technologies Inc.) to deliberately produce a gradient of analyte concentration flowing across the chip surface: by modeling [29] this dispersion from a known initial concentration, the same kinetic data can be obtained during the injection phase of the experiment.

3.4. Thermal shift assay

Proteins exist in a thermodynamic equilibrium between the folded and unfolded states. As the temperature of the system increases, the ratio of folded to unfolded protein shifts toward the unfolded state, making it possible to determine the temperature at which half of the protein is unfolded. This point is referred to as the melting temperature of the protein (Tm). The thermal shift assay relies on ligand‐induced conformational stabilization, which is based on the energetic coupling of ligand binding and protein denaturation. In short, fragment binding alters the ratio of folded to unfolded protein by stabilizing the folded state (Figure 6). Adding a stabilizing fragment therefore shifts the Tm, allowing the calculation of ΔTm as an indicator of fragment binding [30–32]. Several methods are available to measure the ratio of folded to unfolded protein under a given condition: circular dichroism, infrared spectroscopy, differential scanning calorimetry, measurement of the intrinsic fluorescence of exposed tryptophan residues, and dye‐based fluorescence, the last being the basis of TSA and the most commonly applied. The dye‐based approach was first adapted as a simple and inexpensive biophysical method for drug discovery in 2001 [30]. This work determined that ligand‐induced protein stabilization could be tracked with environmentally sensitive dyes over a range of experimental temperatures. In an aqueous environment, the dye is quenched, giving minimal quantum yield. As the protein denatures with increasing temperature, its hydrophobic core is exposed and interacts with the dye. This interaction measurably increases the quantum yield of the dye.

Figure 6.

This figure demonstrates the theory associated with thermal shift interactions. Ligand binding causes stabilization of the protein in the more ordered folded state (represented in green). Thus, more thermal energy is required to move ligand‐stabilized protein from the folded to the unfolded state. Protein in the absence of ligand is represented in red; this protein requires less thermal energy to shift to the disordered state.

Thermal shift has since been revised and optimized for use as an economical fragment screening method. A typical TSA has minimal requirements for the quantity of the target protein. A pilot assay should be completed wherein the concentration of protein is varied over a given range, as running TSA with an excess of protein can saturate the detector. Conversely, too little protein will give a flat curve and negatively affect the signal‐to‐noise ratio in the resulting data (Figure 7). Thermal shift assays are commonly completed with 1–10 μM target protein, which should be as pure as possible. Gross impurities within the protein sample can lead to multiple melting transitions, reducing the accuracy of the resulting data.

Figure 7.

The importance of protein concentration in TSA. As the protein concentration moves above or below the optimal range for the assay, the curves fail to accurately represent the system and increase the error of the melting temperature calculation.

In addition to protein, TSA requires an environmentally sensitive dye. Historically, naphthylamine sulfonic acid dyes (such as 1,8‐ANS, 2,6‐ANS, or 2,6‐TNS) were used for this purpose. Currently, SYPRO Orange is used more commonly, as its fluorescence (Ex/Em of 300, 550/630 nm) is better suited to real‐time PCR instruments. This dye is supplied by the vendor as a 5000X concentrate in DMSO. A pilot assay should be conducted to determine the appropriate concentration of SYPRO Orange for a given assay, as is done for protein concentration. In practice, many assays can be effectively run at a 5X concentration of SYPRO Orange. After the appropriate dye concentration has been established, a master mix of dye and buffer can be applied to the assay plate. At this point, the plate is ready for the application of the fragment library.

As in many fragment‐screening assays, the quality of the library is paramount. In the assay development stage, a pilot assay should be run in which the concentration of solubilizing agent is varied over a range to define any effects that the agent might have on the stability of the protein. If time and fragment availability allow, a screen of fragments alone (without protein) should be performed to check for intrinsic fluorescence; this additional screen can yield data useful for removing problem compounds from the library. With an appropriate fragment library, it is possible to apply fragments to the screening plates. This is typically accomplished by using a pin tool or similar device capable of accurately delivering small volumes. After the plate has been exposed to the fragment library, it should be sealed with a fluorescently inert plate seal to avoid sample evaporation during the course of the assay. The assay plate should then be centrifuged to remove any air pockets from the samples, as these might reduce the quality of the data.

It has been established that TSA can be completed in a standard qPCR instrument [33]. The minimal requirement is the ability to heat the samples evenly over a suitable range of temperatures and record fluorescence. Deconvolution of the resulting data output varies with the instrumentation. This step can be time‐consuming, and automated data processing is helpful if large‐scale projects are planned. Results should be recorded as fluorescence units at each temperature in each well. This information can then be moved into data analysis software (e.g., GraphPad Prism, GraphPad Software, Inc.). Plotting fluorescence units against temperature should result in a sigmoidal curve reflecting the folded and unfolded states of the protein over a range of temperatures (Figure 8a and b). The signal commonly drops after it has reached a plateau. This drop is the result of aggregation of the protein‐dye complex after denaturation [33]. Failure to remove data points resulting from this drop in signal can detrimentally affect subsequent curve fitting (Figure 8a and b). The Boltzmann equation can be adapted to calculate the exact Tm for each protein. Alternatively, it is possible to plot the derivative of the signal against temperature, recording the maximum of this derivative as the melting temperature (Figure 8c). The appropriate method for calculating the Tm can vary by target. Different methods should be tested to determine which calculation most accurately reflects the Tm of the experimental protein [32]. The thermal stability of a protein is increased to varying degrees when ligand is bound, and the extent of this shift can vary greatly. In the case of fragment binding, thermal stabilization can be as little as 0.5°C, making it crucial to establish the baseline stability of the protein within the experimental environment. It is then possible to establish the threshold of ΔTm for a positive result in fragment stabilization of the protein. As a rule of thumb, the standard deviation should not be greater than 10% of the ΔTm [34].

Figure 8.

Curve fitting. (a) Fluorescence intensity is plotted against temperature in blue. Above 41°C the protein‐dye complex begins to aggregate, causing a decrease in signal; fitting the full curve with the Boltzmann equation (shown in red) would give an inaccurate estimate of Tm. (b) The same data trimmed where the signal begins to decrease, improving the curve fit (shown in green). (c) The change in fluorescence intensity over the change in temperature is plotted against temperature in purple.
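As a sketch of the curve‐fitting step described above (synthetic data and assumed parameter values, not from a real assay), the following Python code fits a two‐state Boltzmann sigmoid to a trimmed fluorescence trace and compares the fitted Tm with a simple derivative‐maximum estimate.

```python
# Estimating Tm from TSA data: a Boltzmann sigmoid fit and a derivative-based check.
# Assumes the trace has already been trimmed at the post-transition signal drop.
import numpy as np
from scipy.optimize import curve_fit

def boltzmann(T, F_min, F_max, Tm, slope):
    """Two-state unfolding curve: fluorescence as a function of temperature."""
    return F_min + (F_max - F_min) / (1.0 + np.exp((Tm - T) / slope))

# Synthetic example data (temperature in deg C, arbitrary fluorescence units)
T = np.linspace(25, 41, 60)
F = boltzmann(T, 100, 1000, 34.0, 1.2) + np.random.normal(0, 10, T.size)

p0 = [F.min(), F.max(), T[np.argmax(np.gradient(F))], 1.0]  # crude initial guesses
popt, _ = curve_fit(boltzmann, T, F, p0=p0)
print(f"Boltzmann fit Tm = {popt[2]:.2f} C")
print(f"Derivative-maximum Tm ~ {T[np.argmax(np.gradient(F, T))]:.2f} C")
```

With clean data the two estimates agree closely; divergence between them is a useful flag for noisy or multi-transition wells.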

There are many benefits to using TSA for the initial biophysical screening of a fragment library. First, the assay does not rely on the biochemical activity of the target and can be performed with limited knowledge of the target's function, which is beneficial for FBDD because fragment binding often does not yield a measurable biochemical result. Additionally, TSA requires only a small amount of minimally stable protein whose thermal stability can be tracked in the presence and absence of the ligand [30]. Thermal shift assays can be completed by using widely available real‐time PCR instruments [33] and are relatively simple to perform with limited training, reducing the up‐front cost of implementation. This medium‐ to high‐throughput assay typically enables the testing of up to 384 compounds in only 30–40 min.

Thermal shift assay is not a silver bullet per se and has some limitations and drawbacks. Traditional methods of assaying thermal shift will not work if a protein does not contain a hydrophobic core, as there will be nothing for the dye to differentially interact with when the target unfolds. Similarly, this assay will not produce valuable data if the surface of the protein is hydrophobic because the dye will fluoresce before the protein unfolds. Changing the dye used in the assay can mitigate these issues. The fluorescent readout of this assay also creates limitations. Some fragments commonly found in screening libraries fluoresce and interfere with the signal from SYPRO Orange. This phenomenon is readily evident upon inspection of the resulting data but requires a deconvolution step to avoid false‐positive or false‐negative results. Additionally, TSA does not provide accurate affinity data. However, a concentration versus ΔTm curve can be fit to generate an EC50 value that can estimate the range in which subsequent biochemical or biophysical assays can be more effective [34, 35]. With these limitations in mind, TSA can be a powerful tool for detecting fragment binding.

3.5. Fragment validation

For the purposes of validation, a “good” fragment hit should be spatially described within a known target site via crystal structure, two‐dimensional NMR studies, or at least, a ligand epitope map. The structural information enables the chemical expansion or linking of fragments during hit generation.

Hit rates in fragment‐based screening are typically high, frequently at least an order of magnitude higher than those of lead‐like screening. Hits from the primary screen can be narrowed by using an orthogonal technique of comparable throughput for validation. Considering the numerous techniques available at the primary screening stage, the path to validation can vary. Using an orthogonal approach to primary screening assumes that fragments that hit in both techniques will translate successfully in secondary screening.

Using multiple techniques for fragment primary screening may yield diminishing returns. For example, after solving 71 crystal structures from soaking 361 fragments and statistically comparing the results with those of other fragment screening techniques [36], one group found that nearly half of the 71 “good” fragment hits had been missed by the other techniques. When the other techniques were used in combination, hit validation was statistically worse, although this figure was inflated by the inclusion of hits that were originally missed by crystallography. Therefore, orthogonal primary screening with at least two techniques still achieves the goal of fragment hit validation. However, if the primary fragment screening techniques do not provide meaningful structural characterization, then NMR or crystallography is required in a secondary screening capacity.


4. Fragment secondary screening

4.1. X‐ray crystallography

Several companies, including Astex Pharmaceuticals, SGX Pharmaceuticals, Plexxikon, and Abbott, have effectively used structural biology in their fragment‐based lead discovery efforts. This valuable tool avoids the pitfalls of false‐positive results and nonspecific binding that may arise with other fragment‐screening methods. Any fragment hit discovered via crystallography is inherently validated for the given target. Crystallography gives a clear picture of the fragment binding pose within the active site. This information can greatly facilitate the design of libraries based on the initial hit.

Using crystallography as a method of fragment‐based lead discovery has some limitations. It has long been associated with slow throughput. Additionally, some targets, such as membrane‐associated proteins, do not readily lend themselves to crystallization. Crystallography often requires extensive and time‐consuming efforts to arrive at crystallization conditions suitable for fragment soaking experiments. Even once these conditions have been determined, ready access to a suitable beam line and expertise in crystallography and data reduction can be hurdles in the rest of the lead discovery process. Protein under crystallization conditions is held in a crystal lattice, which does not completely reflect a physiological environment. This artificial environment can lead to artifacts in the data and an inaccurate picture of fragment binding. Although a crystal structure is rich in information, it does not reflect the potency or the biochemical activity of the bound fragment. Even with a wealth of structural information, it is not possible to rank hits on these criteria; orthogonal assays are critical for this purpose, and crystallography alone will not suffice.

The process of generating a fragment structure typically follows a set path. First, the protein must be purified. Then, crystallization conditions for the purified protein are determined. Crystals can then be grown in the presence of a fragment, or the fragment can be soaked into a preexisting crystal. The resulting crystals are flash frozen and used for data collection either in house or at a larger beam line. The data are analyzed to generate a three‐dimensional model of the fragment binding site within the target protein. This model can be used in iterative design efforts to grow the fragment into the binding site or link it to other fragments in neighboring sites.

Perhaps one of the most critical hurdles to successfully implementing structural biology into the FBLD process is having a suitable supply of the target protein. In most cases, protein used for X‐ray crystallography must be pure and in high yields. A typical screen for crystallization conditions is completed by using as much as 20 mg/ml of protein. If initial crystal screening efforts using native protein are not effective, then it may be necessary to modify the target via removal of mobile loop regions or trimming the terminal ends. Creating multiple variants is commonly a valuable step in generating robust high‐resolution structural data.

Once protein is available in sufficient yield and purity, screening for crystallization conditions begins. This is typically performed as a high‐throughput screen with as many as 1000 distinct conditions in a single experiment. Many commercially available sparse matrix and additive crystallization screens use conditions that have historically yielded crystals. When these experiments yield a hit, the conditions can then be optimized to yield larger, highly reproducible crystals. Crystals suitable for fragment soaking should diffract to fairly high resolution (<2 Å). Starting with a higher‐resolution structure improves the resulting maps and increases the chances of producing an accurate model of fragment binding. If multiple crystallization conditions are available, it is best to choose the one that most closely represents physiological conditions, even at a slight cost to resolution. This trade‐off ensures that the fragments with the best chance of advancement are prioritized.

Fragments suitable for X‐ray crystallography benefit from good solubility, as insoluble fragments have a low probability of yielding a structure with suitable ligand occupancy. Compounds are soaked at high concentrations to improve the chances of high occupancy within the structure; accordingly, a fragment should have a solubility of at least 1 mM [37]. SGX Pharmaceuticals benefited from generating a brominated fragment library, which enabled the detection of anomalous scatter as an indication of successful soaking, streamlining the data collection process. To further increase throughput, fragments were soaked in mixtures composed of fragments with diverse shapes. The resulting structures could then be deconvoluted based on the shape of the ligand in the active site [36]. Fragment mixtures, however, run the risk of decreasing the effective concentration of each individual fragment, which can be detrimental because high fragment concentrations are what drive high‐occupancy crystal structures. In addition, these mixtures increase the chances of damaging the crystal in the soaking process, and fragments within the mix can interact with one another, skewing the results of the experiment. In one report, fragment mixtures yielded 11 structures, whereas individual soaking experiments yielded 20 [37]. These data suggest that, if time and resources permit, soaking individual fragments is preferable to using mixtures.

If all other factors fall into place, a data set is collected. Improper treatment of this data set can result in an inaccurate and misleading model of fragment binding. Methods of data reduction and refinement are highly variable, and model building is a continuously evolving process. Certain steps in model building are especially pertinent when dealing with fragments. When searching for ligand density in a map, it is tempting to perform a quick refinement and presume the location of a ligand. This approach is especially hazardous when modeling small fragments. If the map is not of high enough quality when the search for ligands begins, it is possible that water, a poorly resolved side chain, or even commonly used buffer components could masquerade as bound fragments [38]. It is best to perform several rounds of refinement before the ligand hunt begins. Model in any waters and then refine a few more times [37]. If the fragment is bound, a convincing map should take shape, and an accurate model of the target protein bound to the fragment should be possible.

As technology progresses, many of the limitations of structural biology are being addressed. Most notably, throughput is being increased by incorporating automation. As this occurs, obtaining structural data is no longer the rate‐limiting step in lead development. In several cases, automation of this process has improved to the extent that structural studies are successfully being used as a primary screen. Although this approach might not yet be a feasible option in laboratories with limited resources and limited access to beam lines, it holds promise for an improved fragment screening process in the future.

4.2. Isothermal titration calorimetry

Calorimetry measures the thermodynamics of a molecular interaction by observing the heat change of a reaction occurring in an adiabatic (thermally isolated) system [39]. In the context of drug discovery, the molecular interaction most commonly measured is the heat of binding of a small molecule to a protein target, although reaction kinetics can be measured under specific circumstances [40]. Measuring the heat associated with a molecular interaction allows direct measurement of the extent of breakage and formation of noncovalent interactions upon complex formation [39]. With other methods, such as coupled reactions (e.g., product release) and fluorescent binding techniques, the change in enthalpy can only be inferred via the van't Hoff relationship [41].

Classically, calorimetry has been applied to measurement of a binding interaction in two different ways: isothermal titration calorimetry, which measures heat release upon binding; and differential scanning calorimetry (DSC), which measures thermal stabilization of a protein due to binding of a small molecule. These methods offer a very detailed look at the thermodynamics of binding and have been used successfully in optimization after initial fragment hits have been identified. For the purposes of this chapter, discussion will be limited to ITC, as it is more directly applicable to FBDD. As a direct measurement of the heat of binding, ITC allows the researcher to remove the effects of fluorescent tags, antibody relationships, or coupling chemistry from the investigation of a binding relationship. Because ITC is a solution‐based method, surface physical effects that interfere with binding (often seen with SPR) are not an issue.

Directly determining the thermodynamic components of overall binding allows a researcher to optimize a lead compound for a chosen target through specific binding interactions while minimizing off‐target effects that often derail drug discovery programs. When combined with X‐ray‐based binding information or applied to analysis of structure‐activity relationships, ITC can be a powerful tool in drug discovery. Overall binding of a small molecule to a target (as expressed by KD) can be broken down into enthalpic (specific interactions such as H‐bonding and π‐stacking) and entropic (nonspecific events such as bound water release and increases in conformational flexibility) components.
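To make the decomposition concrete, the short calculation below uses illustrative numbers only (an assumed KD and an assumed ITC‐fitted enthalpy): it converts KD into ΔG and splits it into enthalpic and entropic terms.

```python
# Decomposing a binding interaction into enthalpic and entropic terms,
# using dG = RT*ln(KD) and dG = dH - T*dS (illustrative values, not real data).
import math

R = 1.987e-3   # kcal/(mol*K)
T = 298.15     # K

KD = 5e-6      # assumed dissociation constant, M
dH = -8.0      # assumed enthalpy of binding from an ITC fit, kcal/mol

dG = R * T * math.log(KD)      # about -7.2 kcal/mol
minus_TdS = dG - dH            # the entropic contribution, -T*dS
print(f"dG = {dG:.1f}, dH = {dH:.1f}, -TdS = {minus_TdS:.1f} kcal/mol")
```

In this hypothetical case the binding is enthalpy driven (ΔH more favorable than ΔG, with a small entropic penalty), the profile usually preferred when optimizing for specific interactions.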

The normal range of dissociation constants that can be measured by ITC is from 10 nM to 100 μM [41]. This range can be extended below 1 nM or above 1 mM by using displacement methods [42, 43], although a suitable displacement ligand (independently characterized) must be identified beforehand. Displacement ITC has not yet gained wide acceptance in drug discovery as of this writing, with most researchers reporting results of direct binding studies. Most fragments have binding affinities in the millimolar range, limiting the applicability of ITC as a screening method. In addition, relatively large amounts of protein are typically required (usually 0.1–0.3 mg of protein per experiment; this moderate amount adds up over multiple samples and repeats). Each titration requires a moderate amount of time (45 min–1 h), so for a large number of samples, experiment time becomes a hurdle to using ITC as a screening method.

For these reasons, ITC is usually brought into the drug discovery process after screening, as part of hit validation and lead optimization [44]. Fewer compounds are involved, allowing more focus on the large amount of information provided by ITC [44–46]. After a compound of interest is identified, a small set of structurally similar compounds can be purchased or synthesized to gain insight into the nature of binding to a target [47, 48]. At this stage, ITC offers the most benefit, as small molecules can be identified that maximize enthalpic interactions with the target while minimizing entropic contributions to binding.

Recent research into high‐throughput calorimetry offers solutions to researchers wanting to incorporate calorimetry at an earlier stage in their screening cascade. Research into technologies in pursuit of the so‐called “lab on a chip” has led to the development of both microfluidic [49] and droplet‐based systems [50–52]. The droplet‐based system has been applied to both binding and kinetics measurements.

4.3. Fluorescence polarization

Biochemical screening is not a typical choice for the primary screening of fragments, but it can be used to verify inhibition of function and inhibition by a known mechanism, which may help discriminate fragments binding to alternative target sites.

Fluorescence polarization‐based assays are an option in FBDD when preliminary information about a target, such as small molecules that bind to it, is known. FP assays are competition assays in that they indirectly measure the effect of a compound on the binding of an enzyme to a fluorescent probe. A fluorescent probe ideally starts out as a small molecule that binds tightly to the enzyme with known stoichiometry. This small molecule is chemically modified by the addition of a fluorescent label via an aliphatic or polyol linker to generate the fluorescent probe. The signal readout, represented in millipolarization units (mP), is calculated by measuring the emitted light through two polarizing filters (parallel and perpendicular to the plane of the polarized excitation light) for a solution containing the probe and relating the difference between the parallel and perpendicular intensities to their sum [53, 54]. In an FP assay, a plate reader measures the difference in relative tumbling between a free probe (a high amount of tumbling, leading to more scattered emission and thus a low FP signal) and probe bound to a protein (a low amount of tumbling, with emission more ordered relative to the incident light, leading to a high FP signal).
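The polarization calculation is straightforward to express in code. The sketch below uses illustrative intensities; the optional G‐factor is an instrument‐specific correction and is assumed to be 1 here.

```python
# Millipolarization (mP) from raw parallel and perpendicular emission intensities,
# with an optional G-factor to correct instrument bias (values are illustrative).
def millipolarization(I_parallel, I_perpendicular, g_factor=1.0):
    """mP = 1000 * (I_par - G*I_perp) / (I_par + G*I_perp)."""
    corrected = g_factor * I_perpendicular
    return 1000.0 * (I_parallel - corrected) / (I_parallel + corrected)

free_probe = millipolarization(52000, 48000)    # fast tumbling -> low mP (~40)
bound_probe = millipolarization(70000, 40000)   # slow tumbling -> high mP (~270)
print(f"free probe ~{free_probe:.0f} mP, bound probe ~{bound_probe:.0f} mP")
```

A fragment that displaces the probe shifts the measured signal from the bound value back toward the free value, which is the basis of the competition readout.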

Depending on the assay design, FP can be applied to either binding or activity assays, with ready‐to‐use kits available from BellBrook Labs [55] or Cayman Chemical [56]. Activity assays using FP rely on endpoint detection of binding to a probe and are thus modified binding assays. For applications for which no kit is available, a small amount of synthesis can combine a small molecule of interest with one of the wide variety of synthetic fluorophores available from major vendors. When designing a probe for FP, resources such as the Molecular Probes Handbook [57] can be valuable.

FP is less commonly used in fragment screening than perhaps it should be. The range of binding affinities normally found in fragment libraries is in the high micromolar to low millimolar range, previously considered to be out of the range of FP when tight‐binding probes are used [58]. Although still somewhat limited in application by the need for a well‐characterized probe, FP offers an inexpensive way to screen large numbers of compounds and has been used for many years in conventional drug discovery programs. With the advent of fragment‐based techniques, FP is finding use both as a site‐directed primary [59, 60] and secondary [61–64] screening method, and it has been used to validate new methods [65].


5. Fragment hit progression

For fragments, it is important to see activity in more than one assay, but equivalent potency across assays is less important and is not the best way to determine which molecules to promote. Various combinations of advanced metrics (i.e., efficiencies) with empirically driven evaluations (e.g., PAINS, metabolic stability) can help scientists make informed decisions on hit progression. An outside example of the successful application of metrics is sabermetrics. Now widely used in baseball, yet still heavily scrutinized and evolving, sabermetrics uses advanced statistics to define in‐game performance and improve decision‐making by managers. Just like a game manager, scientists must be aware of the limitations and effects of following a metric's indication, remaining consistent about hit progression even, occasionally, in the light of conflicting results. Some experts [66] suggest using ligand efficiencies that can be easily determined without a calculator to facilitate discussions. Several metric‐focused reviews are available to consult; one in particular covers a large number of reported hit‐to‐lead programs [67] for a wider perspective.

5.1. Empirically based fragment evaluations

Fragment screening methods virtually ensure that most screens will produce multiple hits for any target. Thus, the challenge to the researcher is not identifying compounds that interfere with a specific enzyme but determining which of many is the best to carry forward. Several metrics have emerged to guide the selection of fragment lead molecules through the drug discovery process; these metrics combine physical properties (e.g., molecular weight, cLogP, polar surface area, number of H‐bond donors and acceptors) with potency data. The earliest of these was simply termed ligand efficiency (LE) and involves dividing the free energy (ΔG) of binding by the total number of nonhydrogen atoms in the molecule [68]. With the introduction of LE, researchers gained a relatively simple way to keep focused on the specificity of binding to the target, potentially avoiding downstream problems due to nonspecific binding [69].

Another metric in wide use in FBDD is lipophilic ligand efficiency (LLE) [70], which takes into account the total lipophilicity and potency of a molecule (IC50, KD, Ki). A useful modification of LLE (LLEAstex) also controls for molecule size [71]. These statistical means of grading performance can support the early and late stages of the FBDD workflow and can be extended into progression analyses used during lead development (Figure 9). Which metric to use is up to the individual researcher and is based on the specific goals of the research program. Further reading to find an appropriate metric to use is recommended.

Figure 9.

A typical progression analysis found in lead development can also include ligand efficiency metrics in a seamless fashion.
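As a simple illustration of these metrics, the sketch below computes LE and LLE for a hypothetical fragment hit (assumed KD, heavy atom count, and cLogP); ΔG is obtained as RT ln KD at 298 K.

```python
# Ligand efficiency (LE) and lipophilic ligand efficiency (LLE) for a
# hypothetical fragment hit (illustrative values only).
import math

def ligand_efficiency(Kd_molar, heavy_atoms):
    """LE = -dG / HA, in kcal/mol per heavy atom."""
    dG = 1.987e-3 * 298.15 * math.log(Kd_molar)   # kcal/mol
    return -dG / heavy_atoms

def lipophilic_ligand_efficiency(Kd_molar, clogp):
    """LLE = pKD - cLogP."""
    return -math.log10(Kd_molar) - clogp

# Hypothetical fragment: KD = 500 uM, 13 heavy atoms, cLogP = 1.2
print(f"LE  = {ligand_efficiency(5e-4, 13):.2f} kcal/mol per heavy atom")
print(f"LLE = {lipophilic_ligand_efficiency(5e-4, 1.2):.2f}")
```

In this example the fragment's LE of roughly 0.35 kcal/mol per heavy atom would generally be considered an efficient starting point despite its modest absolute potency.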

5.2. Hit generation

Fragments that progress to the hit‐generation stage typically do so with structural insight that either describes the fragment bound to its protein target or maps the binding epitope onto the fragment. A downstream workflow for hit generation can include a structure‐blind path, but this is essentially a diversion from traditional target‐based discovery and may lead to an SAR bottleneck. The hit‐generation stage refers to the acquisition and screening of larger, nonfragment ligands, which are obtained from catalogs or prepared synthetically. This workflow includes either the chemical elaboration of individual fragments or the linking of at least two fragments, which then requires some optimization of the linker between them.

The hit generation phase is a practical place for virtual screening to be used to assist in fragment development. Method comparisons [72, 73] suggest that the currently available force fields and docking procedures based on lead‐like molecules will provide adequate results for fragments (i.e., better than randomized screening). However, the careful consideration of a scoring function to reliably discern weak interactions cannot be overemphasized. Scientists have recognized this distinction within FBDD and have sought to improve the scoring functions for fragments [74]. One notable viewpoint is that fragment elaboration or linking may be facilitated when binding poses are expressed as Gibbs energy [75].


6. Conclusions

Considering that FBDD has been credited with at least two FDA drug approvals, and that the platform is relatively easy to integrate with existing technologies, many companies and academic groups have started their own fragment‐based discovery programs. The biophysical techniques each group uses will be dictated by the target, the available facilities, and the individual preferences of the investigators. Generally speaking, as long as the protein target has been successfully used with a technique in lead‐like screening and structural information is available, there are virtually no other major obstacles to generating new chemical matter for a given target‐based screening program. What remain are practical challenges, two of which bear repeating for those who are in the beginning stages of FBDD.

First, the key practical difference between lead‐like screening and fragment screening is the use of high concentrations of fragments. The increased concentration affects the compound library that is used and the clarity with which hits are delineated. Some suggestions have been made as to the optimal concentrations to use for a given technique. These are only suggestions and will likely change based on the system and techniques employed. With some practice, these procedures can be suitably optimized and will need less attention going forward.

Second is the challenge of directing fragment build‐out and/or fragment‐linking chemistry, which can be resource intensive. As such, medicinal chemists are aided by the use of a preferred ligand efficiency metric early in the FBDD process to assist the ranking of fragment hits. Certainly, other empirical and nonempirical factors will influence the progression of fragments, but metrics will help organize the structure‐activity relationship, which is a key driver of the expansion or linking of fragments during hit generation.


Acknowledgments

The authors acknowledge funding support from National Institutes of Health grant R01AI110578 and from the American Lebanese Syrian Associated Charities (ALSAC) at St. Jude Children's Research Hospital. We thank Cherise Guess, PhD, ELS, of the SJCRH Department of Scientific Editing for her assistance with editing.

References

  1. Macarron R, Banks MN, Bojanic D, Burns DJ, Cirovic DA, Garyantes T, Green DVS, Hertzberg RP, Janzen WP, Paslay JW, Schopfer U, Sittampalam GS. Impact of high-throughput screening in biomedical research. Nat Rev Drug Discov. 2011; 10(3): 188-95. DOI: 10.1038/nrd3368
  2. Reymond J-L. The chemical space project. Acc Chem Res. 2015; 48(3): 722-30. DOI: 10.1021/ar500432k
  3. Hert J, Irwin JJ, Laggner C, Keiser MJ, Shoichet BK. Quantifying biogenic bias in screening libraries. Nat Chem Biol. 2009; 5(7): 479-83. DOI: 10.1038/nchembio.180
  4. Hann MM, Leach AR, Harper G. Molecular complexity and its impact on the probability of finding leads for drug discovery. J Chem Inf Comp Sci. 2001; 41(3): 856-64. DOI: 10.1021/ci000403i
  5. Jencks WP. On the attribution and additivity of binding energies. Proc Natl Acad Sci USA. 1981; 78(7): 4046-50.
  6. Shuker SB, Hajduk PJ, Meadows RP, Fesik SW. Discovering high-affinity ligands for proteins: SAR by NMR. Science. 1996; 274(5292): 1531-4. DOI: 10.1126/science.274.5292.1531
  7. Bollag G, Tsai J, Zhang J, Zhang C, Ibrahim P, Nolop K, Hirth P. Vemurafenib: the first drug approved for BRAF-mutant cancer. Nat Rev Drug Discov. 2012; 11(11): 873-86. DOI: 10.1038/nrd3847
  8. Deeks ED. Venetoclax: first global approval. Drugs. 2016; 76(9): 979-87. DOI: 10.1007/s40265-016-0596-x
  9. Erlanson DA, Fesik SW, Hubbard RE, Jahnke W, Jhoti H. Twenty years on: the impact of fragments on drug discovery. Nat Rev Drug Discov. 2016; 15(9): 605-19. DOI: 10.1038/nrd.2016.109
  10. Congreve M, Carr R, Murray C, Jhoti H. A 'Rule of Three' for fragment-based lead discovery? Drug Discov Today. 2003; 8(19): 876-7. DOI: 10.1016/S1359-6446(03)02831-9
  11. Jhoti H, Williams G, Rees DC, Murray CW. The 'rule of three' for fragment-based drug discovery: where are we now? Nat Rev Drug Discov. 2013; 12(8): 644. DOI: 10.1038/nrd3926-c1
  12. Keseru GM, Erlanson DA, Ferenczy GG, Hann MM, Murray CW, Pickett SD. Design principles for fragment libraries: maximizing the value of learnings from pharma fragment-based drug discovery (FBDD) programs for use in academia. J Med Chem. 2016; 59(18): 8189-206. DOI: 10.1021/acs.jmedchem.6b00197
  13. Swain C. Fragment Collections [Internet]. 2016. Available from: http://www.cambridgemedchemconsulting.com/resources/hit_identification/fragment_collections.html [Accessed: 2016-09-09]
  14. Lau WF, Withka JM, Hepworth D, Magee TV, Du YJ, Bakken GA, Miller MD, Hendsch ZS, Thanabal V, Kolodziej SA, Xing L, Hu Q, Narasimhan LS, Love R, Charlton ME, Hughes S, van Hoorn WP, Mills JE. Design of a multi-purpose fragment screening library using molecular complexity and orthogonal diversity metrics. J Comput Aid Mol Des. 2011; 25(7): 621-36. DOI: 10.1007/s10822-011-9434-0
  15. Kutchukian PS, So S-S, Fischer C, Waller CL. Fragment library design: using cheminformatics and expert chemists to fill gaps in existing fragment libraries. In: Klon EA, editor. Fragment-Based Methods in Drug Discovery. New York, NY: Springer New York; 2015. pp. 43-53. DOI: 10.1007/978-1-4939-2486-8_5
  16. Davis BJ, Erlanson DA. Learning from our mistakes: the 'unknown knowns' in fragment screening. Bioorg Med Chem Lett. 2013; 23(10): 2844-52. DOI: 10.1016/j.bmcl.2013.03.028
  17. Lagorce D, Sperandio O, Baell JB, Miteva MA, Villoutreix BO. FAF-Drugs3: a web server for compound property calculation and chemical library design. Nucleic Acids Res. 2015; 43(W1): W200-W7. DOI: 10.1093/nar/gkv353
  18. Mayer M, Meyer B. Group epitope mapping by saturation transfer difference NMR to identify segments of a ligand in direct contact with a protein receptor. J Am Chem Soc. 2001; 123(25): 6108-17. DOI: 10.1021/ja0100120
  19. Sledz P, Silvestre HL, Hung AW, Ciulli A, Blundell TL, Abell C. Optimization of the interligand Overhauser effect for fragment linking: application to inhibitor discovery against Mycobacterium tuberculosis pantothenate synthetase. J Am Chem Soc. 2010; 132(13): 4544-5. DOI: 10.1021/ja100595u
  20. Vanwetswinkel S, Heetebrij RJ, van Duynhoven J, Hollander JG, Filippov DV, Hajduk PJ, Siegal G. TINS, target immobilized NMR screening: an efficient and sensitive method for ligand discovery. Chem Biol. 2005; 12(2): 207-16. DOI: 10.1016/j.chembiol.2004.12.004
  21. Gardner KH, Kay LE. Production and incorporation of 15N, 13C, 2H (1H-δ1 methyl) isoleucine into proteins for multidimensional NMR studies. J Am Chem Soc. 1997; 119(32): 7599-600. DOI: 10.1021/ja9706514
  22. Hajduk PJ, Augeri DJ, Mack J, Mendoza R, Yang J, Betz SF, Fesik SW. NMR-based screening of proteins containing 13C-labeled methyl groups. J Am Chem Soc. 2000; 122(33): 7898-904. DOI: 10.1021/ja000350l
  23. Dalvit C. Ligand- and substrate-based 19F NMR screening: principles and applications to drug discovery. Prog Nucl Mag Res Sp. 2007; 51(4): 243-71. DOI: 10.1016/j.pnmrs.2007.07.002
  24. Rich RL, Myszka DG. Survey of the 2009 commercial optical biosensor literature. J Mol Recognit. 2011; 24(6): 892-914. DOI: 10.1002/jmr.1138
  25. Johnsson B, Löfås S, Lindquist G. Immobilization of proteins to a carboxymethyldextran-modified gold surface for biospecific interaction analysis in surface plasmon resonance sensors. Anal Biochem. 1991; 198(2): 268-77. DOI: 10.1016/0003-2697(91)90424-R
  26. Rich RL, Quinn JG, Morton T, Stepp JD, Myszka DG. Biosensor-based fragment screening using FastStep injections. Anal Biochem. 2010; 407(2): 270-7. DOI: 10.1016/j.ab.2010.08.024
  27. Karlsson R, Katsamba PS, Nordin H, Pol E, Myszka DG. Analyzing a kinetic titration series using affinity biosensors. Anal Biochem. 2006; 349(1): 136-47. DOI: 10.1016/j.ab.2005.09.034
  28. Quinn JG. Evaluation of Taylor dispersion injections: determining kinetic/affinity interaction constants and diffusion coefficients in label-free biosensing. Anal Biochem. 2012; 421(2): 401-10. DOI: 10.1016/j.ab.2011.11.023
  29. Quinn JG. Modeling Taylor dispersion injections: determination of kinetic/affinity interaction constants and diffusion coefficients in label-free biosensing. Anal Biochem. 2012; 421(2): 391-400. DOI: 10.1016/j.ab.2011.11.024
  30. Pantoliano MW, Petrella EC, Kwasnoski JD, Lobanov VS, Myslik J, Graf E, Carver T, Asel E, Springer BA, Lane P, Salemme FR. High-density miniaturized thermal shift assays as a general strategy for drug discovery. J Biomol Screen. 2001; 6(6): 429-40. DOI: 10.1089/108705701753364922
  31. Ciulli A. Biophysical screening for the discovery of small-molecule ligands. In: Williams M, Daviter T, editors. Methods in Molecular Biology. 2nd ed. New York: Springer; 2013. pp. 357-88. DOI: 10.1007/978-1-62703-398-5_13
  32. Schiebel J, Radeva N, Koster H, Metz A, Krotzky T, Kuhnert M, Diederich WE, Heine A, Neumann L, Atmanene C, Roecklin D, Vivat-Hannah V, Renaud JP, Meinecke R, Schlinck N, Sitte A, Popp F, Zeeb M, Klebe G. One question, multiple answers: biochemical and biophysical screening methods retrieve deviating fragment hit lists. ChemMedChem. 2015; 10(9): 1511-21. DOI: 10.1002/cmdc.201500267
  33. Lo MC, Aulabaugh A, Jin G, Cowling R, Bard J, Malamas M, Ellestad G. Evaluation of fluorescence-based thermal shift assays for hit identification in drug discovery. Anal Biochem. 2004; 332(1): 153-9. DOI: 10.1016/j.ab.2004.04.031
  34. Vivoli M, Novak HR, Littlechild JA, Harmer NJ. Determination of protein-ligand interactions using differential scanning fluorimetry. J Vis Exp. 2014; 91: 51809. DOI: 10.3791/51809
  35. Groftehauge MK, Hajizadeh NR, Swann MJ, Pohl E. Protein-ligand interactions investigated by thermal shift assays (TSA) and dual polarization interferometry (DPI). Acta Crystallogr D. 2015; 71(Pt 1): 36-44. DOI: 10.1107/S1399004714016617
  36. Schiebel J, Radeva N, Krimmer SG, Wang X, Stieler M, Ehrmann FR, Fu K, Metz A, Huschmann FU, Weiss MS, Mueller U, Heine A, Klebe G. Six biophysical screening methods miss a large proportion of crystallographically discovered fragment hits: a case study. ACS Chem Biol. 2016; 11(6): 1693-701. DOI: 10.1021/acschembio.5b01034
  37. Schiebel J, Krimmer SG, Rower K, Knorlein A, Wang X, Park AY, Stieler M, Ehrmann FR, Fu K, Radeva N, Krug M, Huschmann FU, Glockner S, Weiss MS, Mueller U, Klebe G, Heine A. High-throughput crystallography: reliable and efficient identification of fragment hits. Structure. 2016; 24(8): 1398-409. DOI: 10.1016/j.str.2016.06.010
  38. Kleywegt GJ. Crystallographic refinement of ligand complexes. Acta Crystallogr D. 2007; 63(Pt 1): 94-100. DOI: 10.1107/S0907444906022657
  39. Ladbury JE, Klebe G, Freire E. Adding calorimetric data to decision making in lead discovery: a hot tip. Nat Rev Drug Discov. 2010; 9(1): 23-7. DOI: 10.1038/nrd3054
  40. Hansen LD, Transtrum MK, Quinn C, Demarse N. Enzyme-catalyzed and binding reaction kinetics determined by titration calorimetry. Biochim Biophys Acta. 2016; 1860(5): 957-66. DOI: 10.1016/j.bbagen.2015.12.018
  41. Ladbury JE. Calorimetry as a tool for understanding biomolecular interactions and an aid to drug design. Biochem Soc T. 2010; 38(4): 888-93. DOI: 10.1042/bst0380888
  42. Velazquez-Campoy A, Freire E. Isothermal titration calorimetry to determine association constants for high-affinity ligands. Nat Protoc. 2006; 1(1): 186-91. DOI: 10.1038/nprot.2006.28
  43. Ruhmann E, Betz M, Fricke M, Heine A, Schafer M, Klebe G. Thermodynamic signatures of fragment binding: validation of direct versus displacement ITC titrations. Biochim Biophys Acta. 2015; 1850(4): 647-56. DOI: 10.1016/j.bbagen.2014.12.007
  44. Mashalidis EH, Sledz P, Lang S, Abell C. A three-stage biophysical screening cascade for fragment-based drug discovery. Nat Protoc. 2013; 8(11): 2309-24. DOI: 10.1038/nprot.2013.130
  45. Ladbury JE. Isothermal titration calorimetry: application to structure-based drug design. Thermochim Acta. 2001; 380(2): 209-15. DOI: 10.1016/S0040-6031(01)00674-8
  46. Scott DE, Ehebauer MT, Pukala T, Marsh M, Blundell TL, Venkitaraman AR, Abell C, Hyvonen M. Using a fragment-based approach to target protein-protein interactions. ChemBioChem. 2013; 14(3): 332-42. DOI: 10.1002/cbic.201200521
  47. Banerjee DR, Dutta D, Saha B, Bhattacharyya S, Senapati K, Das AK, Basak A. Design, synthesis and characterization of novel inhibitors against mycobacterial beta-ketoacyl CoA reductase FabG4. Org Biomol Chem. 2014; 12(1): 73-85. DOI: 10.1039/c3ob41676c
  48. Kišonaite M, Zubriene A, Čapkauskaite E, Smirnov A, Smirnoviene J, Kairys V, Michailoviene V, Manakova E, Gražulis S, Matulis D. Intrinsic thermodynamics and structure correlation of benzenesulfonamides with a pyrimidine moiety binding to carbonic anhydrases I, II, VII, XII, and XIII. PLoS One. 2014; 9(12): e114106. DOI: 10.1371/journal.pone.0114106
  49. Wolf A, Hartmann T, Bertolini M, Schemberg J, Grodrian A, Lemke K, Förster T, Kessler E, Hänschke F, Mertens F, Paus R, Lerchner J. Toward high-throughput chip calorimetry by use of segmented-flow technology. Thermochim Acta. 2015; 603: 172-83. DOI: 10.1016/j.tca.2014.10.021
  50. Recht MI, Nienaber V, Torres FE. Fragment-based screening for enzyme inhibitors using calorimetry. Methods Enzymol. 2016; 567: 47-69. DOI: 10.1016/bs.mie.2015.07.023
  51. Recht MI, Sridhar V, Badger J, Bounaud PY, Logan C, Chie-Leon B, Nienaber V, Torres FE. Identification and optimization of PDE10A inhibitors using fragment-based screening by nanocalorimetry and X-ray crystallography. J Biomol Screen. 2014; 19(4): 497-507. DOI: 10.1177/1087057113516493
  52. Torres FE, Kuhn P, De Bruyker D, Bell AG, Wolkin MV, Peeters E, Williamson JR, Anderson GB, Schmitz GP, Recht MI, Schweizer S, Scott LG, Ho JH, Elrod SA, Schultz PG, Lerner RA, Bruce RH. Enthalpy arrays. Proc Natl Acad Sci USA. 2004; 101(26): 9517-22. DOI: 10.1073/pnas.0403573101
  53. Lea WA, Simeonov A. Fluorescence polarization assays in small molecule screening. Expert Opin Drug Discov. 2011; 6(1): 17-32. DOI: 10.1517/17460441.2011.537322
  54. Rossi AM, Taylor CW. Analysis of protein-ligand interactions by fluorescence polarization. Nat Protoc. 2011; 6(3): 365-87. DOI: 10.1038/nprot.2011.305
  55. BellBrook Labs. Transcreener® HTS Assays [Internet]. 2016. Available from: https://www.bellbrooklabs.com/products-services/transcreener-hts-assays/ [Accessed: 2016-09-23]
  56. Cayman Chemical. Assay Kits [Internet]. 2016. Available from: https://www.caymanchem.com/Products/kits [Accessed: 2016-09-23]
  57. Thermo Fisher Scientific. The Molecular Probes Handbook [Internet]. 2016. Available from: http://www.thermofisher.com/us/en/home/references/molecular-probes-the-handbook.html [Accessed: 2016-09-23]
  58. Huang X. Fluorescence polarization competition assay: the range of resolvable inhibitor potency is limited by the affinity of the fluorescent ligand. J Biomol Screen. 2003; 8(1): 34-8. DOI: 10.1177/1087057102239666
  59. Baughman BM, Jake Slavish P, DuBois RM, Boyd VA, White SW, Webb TR. Identification of influenza endonuclease inhibitors using a novel fluorescence polarization assay. ACS Chem Biol. 2012; 7(3): 526-34. DOI: 10.1021/cb200439z
  60. Carson MW, Zhang J, Chalmers MJ, Bocchinfuso WP, Holifield KD, Masquelin T, Stites RE, Stayrook KR, Griffin PR, Dodge JA. HDX reveals unique fragment ligands for the vitamin D receptor. Bioorg Med Chem Lett. 2014; 24(15): 3459-63. DOI: 10.1016/j.bmcl.2014.05.070
  61. Davies NGM, Browne H, Davis B, Drysdale MJ, Foloppe N, Geoffrey S, Gibbons B, Hart T, Hubbard R, Jensen MR, Mansell H, Massey A, Matassova N, Moore JD, Murray J, Pratt R, Ray S, Robertson A, Roughley SD, Schoepfer J, Scriven K, Simmonite H, Stokes S, Surgenor A, Webb P, Wood M, Wright L, Brough P. Targeting conserved water molecules: design of 4-aryl-5-cyanopyrrolo[2,3-d]pyrimidine Hsp90 inhibitors using fragment-based screening and structure-based optimization. Bioorg Med Chem. 2012; 20(22): 6770-89. DOI: 10.1016/j.bmc.2012.08.050
  62. Yin Z, Whittell LR, Wang Y, Jergic S, Liu M, Harry EJ, Dixon NE, Beck JL, Kelso MJ, Oakley AJ. Discovery of lead compounds targeting the bacterial sliding clamp using a fragment-based approach. J Med Chem. 2014; 57(6): 2799-806. DOI: 10.1021/jm500122r
  63. Yu W, Xiao H, Lin J, Li C. Discovery of novel STAT3 small molecule inhibitors via in silico site-directed fragment-based drug design. J Med Chem. 2013; 56(11): 4402-12. DOI: 10.1021/jm400080c
  64. Zhao L, Cao D, Chen T, Wang Y, Miao Z, Xu Y, Chen W, Wang X, Li Y, Du Z, Xiong B, Li J, Xu C, Zhang N, He J, Shen J. Fragment-based drug discovery of 2-thiazolidinones as inhibitors of the histone reader BRD4 bromodomain. J Med Chem. 2013; 56(10): 3833-51. DOI: 10.1021/jm301793a
  65. Meiby E, Simmonite H, le Strat L, Davis B, Matassova N, Moore JD, Mrosek M, Murray J, Hubbard RE, Ohlson S. Fragment screening by weak affinity chromatography: comparison with established techniques for screening against HSP90. Anal Chem. 2013; 85(14): 6756-66. DOI: 10.1021/ac400715t
  66. Zartler E. Practical Fragments [Internet]. Erlanson D, editor. Blogger: Google Inc. 2015. Available from: http://practicalfragments.blogspot.com/2015/09/is-this-still-thing-and-why.html [Accessed: 2016-08-01]
  67. Ferenczy GG, Keseru GM. How are fragments optimized? A retrospective analysis of 145 fragment optimizations. J Med Chem. 2013; 56(6): 2478-86. DOI: 10.1021/jm301851v
  68. Hopkins AL, Groom CR, Alex A. Ligand efficiency: a useful metric for lead selection. Drug Discov Today. 2004; 9(10): 430-1. DOI: 10.1016/s1359-6446(04)03069-7
  69. Hughes JP, Rees S, Kalindjian SB, Philpott KL. Principles of early drug discovery. Brit J Pharmacol. 2011; 162(6): 1239-49. DOI: 10.1111/j.1476-5381.2010.01127.x
  70. Leeson PD, Springthorpe B. The influence of drug-like concepts on decision-making in medicinal chemistry. Nat Rev Drug Discov. 2007; 6(11): 881-90. DOI: 10.1038/nrd2445
  71. Mortenson PN, Murray CW. Assessing the lipophilicity of fragments and early hits. J Comput Aid Mol Des. 2011; 25(7): 663-7. DOI: 10.1007/s10822-011-9435-z
  72. Kawatkar S, Wang H, Czerminski R, Joseph-McCarthy D. Virtual fragment screening: an exploration of various docking and scoring protocols for fragments using Glide. J Comput Aid Mol Des. 2009; 23(8): 527-39. DOI: 10.1007/s10822-009-9281-4
  73. Sándor M, Kiss R, Keseru GM. Virtual fragment docking by Glide: a validation study on 190 protein-fragment complexes. J Chem Inf Model. 2010; 50(6): 1165-72. DOI: 10.1021/ci1000407
  74. Wang J-C, Lin J-H. Scoring functions for prediction of protein-ligand interactions. Curr Pharm Design. 2013; 19(12): 2174-82. DOI: 10.2174/1381612811319120005
  75. Kozakov D, Hall DR, Jehle S, Luo L, Ochiana SO, Jones EV, Pollastri M, Allen KN, Whitty A, Vajda S. Ligand deconstruction: why some fragment binding positions are conserved and others are not. Proc Natl Acad Sci USA. 2015; 112(20): E2585-E94. DOI: 10.1073/pnas.1501567112
