From Ubiquitin to Induced Proximity: Rewriting Cellular Disposal Logic
The modern conception of targeted protein degradation begins not with chemistry, but with a biological inevitability: proteins are transient. The ubiquitin–proteasome system operates as a highly regulated intracellular sanitation network, where proteins marked by ubiquitin chains are selectively dismantled by the 26S proteasome. This system does not merely eliminate waste; it governs signaling, cell cycle progression, and adaptive responses to environmental stress. The realization that this endogenous machinery could be redirected toward therapeutic ends marked a conceptual departure from inhibition toward elimination.
Early mechanistic insights into ubiquitin biology revealed a hierarchical enzymatic cascade involving E1 activation, E2 conjugation, and E3 ligase–mediated substrate recognition. These discoveries established the principle that degradation is not stochastic but encoded through recognition motifs and structural compatibility. By the early twenty-first century, chemists began to envision molecules that could artificially induce proximity between a protein of interest and an E3 ligase. This was not a passive binding event but a deliberate reprogramming of substrate identity.
The first generation of heterobifunctional degraders embodied this principle through modular architecture: a warhead engaging the target protein, an E3 ligand recruiting the ligase, and a linker orchestrating spatial orientation. Unlike classical inhibitors, these molecules operate catalytically, enabling a single compound to trigger repeated rounds of protein destruction. The pharmacological consequence is profound—activity is no longer dictated by occupancy but by the probability of productive molecular encounters.
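The occupancy-versus-encounter distinction can be made concrete with a toy Python sketch. The copy numbers and the turnover count below are invented for illustration; real turnover numbers depend on the compound and cellular context.

```python
# Toy contrast between occupancy-driven inhibition and event-driven
# (catalytic) degradation. Copy numbers and the turnover count are
# invented for illustration only.

def targets_neutralized_by_inhibitor(drug_copies, target_copies):
    """A stoichiometric inhibitor occupies at most one target per molecule."""
    return min(drug_copies, target_copies)

def targets_destroyed_by_degrader(drug_copies, target_copies, turnovers):
    """A degrader is released after each ubiquitination event and can
    cycle through `turnovers` rounds of destruction."""
    return min(drug_copies * turnovers, target_copies)

# 100 drug molecules against 10,000 target copies:
inhibited = targets_neutralized_by_inhibitor(100, 10_000)   # 100
degraded = targets_destroyed_by_degrader(100, 10_000, 25)   # 2,500
```

The asymmetry is the whole point: under these assumed numbers, the same 100 molecules neutralize 100 targets as an inhibitor but destroy 2,500 as a catalytic degrader.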
Yet the transition from concept to controllable system exposed an unexpected complexity. The ternary complex formed between protein, degrader, and ligase is not a rigid assembly but a dynamic ensemble. Productive degradation depends not only on binding affinity but on the geometry and kinetics of interaction, including the accessibility of lysine residues to ubiquitin transfer. Consequently, the field has moved from viewing degradation as a static endpoint toward understanding it as a multistep, probabilistic process embedded in cellular dynamics, setting the stage for a computational rethinking of drug design.
The Ternary Problem: Why Degraders Defy Classical Drug Design
At the heart of targeted protein degradation lies a structural paradox: the most critical entity—the ternary complex—is also the least amenable to traditional modeling. Conventional drug discovery assumes a binary interaction between ligand and target, optimized through affinity and specificity. In contrast, degraders must simultaneously engage two proteins while inducing a non-native interface between them. This transforms a well-posed docking problem into a multidimensional search across conformational and spatial landscapes.
The challenge is compounded by the intrinsic properties of heterobifunctional molecules. Their size places them beyond classical drug-like chemical space, while their flexible linkers introduce vast conformational entropy. These molecules can adopt multiple shapes depending on their environment, exposing polar groups in aqueous media while shielding them during membrane permeation. This “chameleonic” behavior complicates predictions of solubility and permeability, rendering traditional descriptors insufficient.
Moreover, degradation efficacy cannot be inferred directly from binding. A degrader may bind tightly to both the target and the ligase yet fail to induce ubiquitination if the spatial orientation is misaligned. Conversely, weaker binding interactions can yield efficient degradation if they stabilize a productive geometry. The relationship between structure and function is therefore mediated by emergent properties of the ternary system, including cooperativity and conformational sampling.
These factors expose the limitations of standard computational tools. Small-molecule docking fails to account for protein–protein rearrangements, while protein–protein docking neglects the influence of the bridging molecule. The result is a fragmented modeling landscape in which no single method captures the full complexity of the system. In response, the field has adopted hybrid strategies that integrate elements of molecular docking, protein interaction modeling, and dynamic simulation, thereby redefining what it means to computationally design a drug.
Dynamics Over Structure: Simulation as the Language of Degradation
If static structures fail to explain degradation, then motion becomes the essential variable. Molecular dynamics simulations have emerged as a central tool for capturing the transient interactions that define ternary complex behavior. Rather than producing a single “correct” structure, these simulations generate ensembles that reveal how the system fluctuates across time, exposing metastable states that may govern functional outcomes.
In this framework, the degrader is not simply a connector but a constraint that reshapes the conformational landscape of two proteins. By restricting their relative orientations, it biases the system toward specific interaction geometries. Simulations have shown that productive ternary complexes often correspond to low-energy states that differ from crystallographic structures, suggesting that experimentally resolved conformations may not always represent functional ones. This insight challenges long-standing assumptions about structure–activity relationships.
Enhanced sampling techniques extend this approach by enabling exploration of rare but biologically relevant events, such as the assembly of the ternary complex or the positioning of lysine residues for ubiquitination. These methods reveal not only which states are accessible but how the system transitions between them. The resulting kinetic and thermodynamic information provides a more complete picture of degrader function, bridging the gap between molecular interaction and biological outcome.
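One concrete quantity such ensemble analyses yield is the fraction of simulation frames in which a surface lysine sits within ubiquitin-transfer range of the catalytic site. The sketch below uses synthetic stand-in coordinates and an assumed 10 Å cutoff; a real analysis would read atom positions from an MD trajectory.

```python
import math

def accessible_fraction(frames, cutoff=10.0):
    """Fraction of ensemble frames in which any surface lysine NZ atom
    lies within `cutoff` angstroms of the catalytic site.
    `frames` is a list of (lysine_coords, catalytic_site) tuples, one per
    MD snapshot; the coordinates used here are synthetic stand-ins."""
    hits = 0
    for lysines, site in frames:
        if any(math.dist(nz, site) <= cutoff for nz in lysines):
            hits += 1
    return hits / len(frames)

# Three mock snapshots: in two of them a lysine drifts within range.
frames = [
    ([(4.0, 0.0, 0.0), (30.0, 0.0, 0.0)], (0.0, 0.0, 0.0)),  # hit
    ([(25.0, 0.0, 0.0)],                  (0.0, 0.0, 0.0)),  # miss
    ([(8.0, 0.0, 0.0)],                   (0.0, 0.0, 0.0)),  # hit
]
fraction = accessible_fraction(frames)   # 2 of 3 frames
```

A metric of this kind is a probabilistic readout, not a yes/no answer, which is exactly the shift the section describes: degradation as a distribution over states rather than a single structure.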
Equally important is the integration of simulation with experimental data. Techniques such as hydrogen–deuterium exchange and proteomics provide information on binding interfaces and ubiquitination patterns, which can be incorporated into computational models to improve accuracy. This iterative exchange between experiment and simulation transforms modeling from a predictive exercise into a collaborative process of hypothesis generation and refinement. As these approaches mature, they redefine drug design as an exploration of dynamic systems rather than static targets, naturally leading into the question of how such insights can be operationalized into design strategies.
Designing Degraders: Integrative Computation and the Future of TPD
The practical challenge of targeted protein degradation lies not in understanding its principles but in translating them into molecules that function reliably in biological systems. Modern design strategies therefore rely on the integration of multiple computational tools into cohesive workflows. These pipelines begin with the generation of candidate molecules, often through combinatorial assembly of warheads, ligands, and linkers, followed by filtering based on physicochemical and synthetic feasibility.
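The enumerate-then-filter stage can be sketched with plain Python. The fragment names, molecular weights, and the 1000 Da ceiling below are all illustrative assumptions; real pipelines apply far richer physicochemical and synthesizability filters.

```python
from itertools import product

# Hypothetical fragment libraries: each entry is (name, molecular_weight).
# Names and weights are invented for illustration.
warheads   = [("warhead_A", 350.0), ("warhead_B", 420.0)]
linkers    = [("PEG2", 120.0), ("PEG4", 208.0), ("alkyl_C6", 98.0)]
e3_ligands = [("VHL_ligand", 430.0), ("CRBN_ligand", 260.0)]

def enumerate_degraders(max_mw=1000.0):
    """Assemble every warhead-linker-E3 combination and keep those under
    a crude molecular-weight ceiling (a stand-in for the physicochemical
    and synthetic-feasibility filters used in real pipelines)."""
    candidates = []
    for (w, w_mw), (l, l_mw), (e, e_mw) in product(warheads, linkers, e3_ligands):
        total = w_mw + l_mw + e_mw
        if total <= max_mw:
            candidates.append((f"{w}-{l}-{e}", total))
    return candidates

hits = enumerate_degraders()   # 11 of the 12 combinations pass
```

Even this toy library shows why filtering matters: combinatorial assembly grows multiplicatively with each fragment class, so downstream ternary-complex modeling must be reserved for candidates that survive cheap early filters.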
Central to this process is the prediction of ternary complex formation. Docking methods provide initial structural hypotheses, which are then refined through molecular dynamics simulations to assess stability and interaction patterns. Importantly, these simulations are not performed in isolation but within the context of larger assemblies, such as the Cullin–RING ligase complex, allowing estimation of ubiquitination likelihood. This multiscale approach captures the transition from molecular interaction to functional outcome.
Machine learning has begun to augment this framework by identifying patterns that distinguish active degraders from inactive ones. Rather than relying solely on traditional descriptors, these models incorporate features derived from simulations, such as conformational flexibility and interface contacts. The resulting classifiers do not merely rank compounds but reveal which molecular properties are most predictive of degradation, guiding subsequent design iterations.
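A minimal stand-in for such feature attribution is to ask how far apart each simulation-derived feature sits between active and inactive compounds. The features, values, and labels below are invented, and the separation score is a crude proxy for the importances a trained classifier would report.

```python
from statistics import mean, pstdev

# Mock compounds: simulation-derived features plus an active/inactive label.
# All values are invented for illustration.
features = ["linker_flexibility", "interface_contacts", "ternary_lifetime_ns"]
data = [
    # (linker_flexibility, interface_contacts, ternary_lifetime_ns, active?)
    (0.9, 12, 45.0, True),
    (0.8, 10, 38.0, True),
    (0.7, 11, 50.0, True),
    (0.6,  4,  8.0, False),
    (0.9,  5, 12.0, False),
    (0.5,  3,  6.0, False),
]

def rank_features(data, n_features):
    """Rank features by the z-scaled gap between class means - a crude
    proxy for classifier feature importance, not a substitute for one."""
    scores = {}
    for i in range(n_features):
        col = [row[i] for row in data]
        sd = pstdev(col) or 1.0
        active = [row[i] for row in data if row[-1]]
        inactive = [row[i] for row in data if not row[-1]]
        scores[features[i]] = abs(mean(active) - mean(inactive)) / sd
    return sorted(scores, key=scores.get, reverse=True)

ranking = rank_features(data, 3)
```

On this mock data, ternary-complex features (lifetime, interface contacts) separate the classes far better than linker flexibility alone, mirroring the section's point that degradation-relevant signal lives in the ternary system rather than in binary descriptors.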
Yet the future of computational TPD extends beyond incremental improvements. Emerging approaches aim to model entire degradation pathways, linking molecular properties to cellular responses through mechanistic and pharmacodynamic frameworks. These models have the potential to disentangle the contributions of binding, cooperativity, and catalytic turnover, providing a unified description of degrader activity. As computational power increases and experimental data accumulate, the field is poised to transition from empirical optimization to predictive design, where the elimination of proteins can be engineered with the same precision once reserved for their inhibition.
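The kind of mechanistic pharmacodynamic framework described above can be caricatured as a one-compartment turnover model: target synthesis balanced against basal and degrader-induced clearance. The rate constants and the ternary-occupancy fraction below are assumed values chosen for illustration.

```python
def simulate_target_level(hours, dt=0.01, k_syn=10.0, k_basal=0.1,
                          k_ternary_deg=0.9, f_ternary=0.5):
    """Euler integration of a minimal pharmacodynamic model:
        dT/dt = k_syn - (k_basal + k_ternary_deg * f_ternary) * T
    where f_ternary is the fraction of target engaged in a productive
    ternary complex. All parameter values are illustrative assumptions."""
    T = k_syn / k_basal                 # start at drug-free steady state
    for _ in range(int(hours / dt)):
        dT = k_syn - (k_basal + k_ternary_deg * f_ternary) * T
        T += dT * dt
    return T

baseline = 10.0 / 0.1                   # 100.0 in the absence of degrader
treated = simulate_target_level(hours=48.0)
# analytic steady state: k_syn / (k_basal + 0.9 * 0.5) = 10 / 0.55
```

Even this caricature disentangles the contributions the section names: binding and cooperativity enter through f_ternary, catalytic turnover through k_ternary_deg, and cellular context through synthesis and basal clearance, so the steady-state knockdown is a property of the whole system rather than of affinity alone.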
Study DOI: https://doi.org/10.1021/acs.jcim.3c00603
Engr. Dex Marco Tiu Guibelondo, B.Sc. Pharm, R.Ph., B.Sc. CpE
Editor-in-Chief, PharmaFEATURES