
From Biased to Unbiased Dynamics: An Infinitesimal Generator Approach

Neural Information Processing Systems

To overcome this bottleneck, data are collected via biased simulations that explore the state space more rapidly. We propose a framework for learning from biased simulations rooted in the infinitesimal generator of the process and the associated resolvent operator. We contrast our approach with more common ones based on the transfer operator, showing that it can provably learn the spectral properties of the unbiased system from biased data.
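The basic premise of learning unbiased properties from biased data can be illustrated with a minimal importance-reweighting sketch (this is not the paper's generator/resolvent method, just the underlying idea): samples drawn under a known bias potential V are reweighted by exp(βV) to recover unbiased expectations. The 1D potentials below are hypothetical choices made so the answer is known in closed form.

```python
import numpy as np

# Toy setup: unbiased potential U(x) = x^2/2 at beta = 1, so the
# unbiased ensemble is N(0, 1). A hypothetical bias V(x) = -x^2/4
# flattens the landscape, giving a biased ensemble N(0, 2).
# The density ratio p_unbiased / p_biased is proportional to exp(beta*V).
rng = np.random.default_rng(0)
beta = 1.0
x = rng.normal(0.0, np.sqrt(2.0), size=200_000)  # biased samples

V = -0.25 * x**2                 # bias potential evaluated at the samples
w = np.exp(beta * V)
w /= w.sum()                     # self-normalized importance weights

est_var = np.sum(w * x**2)       # unbiased <x^2>; should be close to 1
```

The weighted estimate recovers the unbiased second moment (variance 1) even though the raw biased samples have variance 2.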





BoltzNCE: Learning Likelihoods for Boltzmann Generation with Stochastic Interpolants and Noise Contrastive Estimation

Aggarwal, Rishal, Chen, Jacky, Boffi, Nicholas M., Koes, David Ryan

arXiv.org Artificial Intelligence

Efficient sampling from the Boltzmann distribution given its energy function is a key challenge for modeling complex physical systems such as molecules. Boltzmann Generators address this problem by leveraging continuous normalizing flows to transform a simple prior into a distribution that can be reweighted to match the target using sample likelihoods. Despite the elegance of this approach, obtaining these likelihoods requires computing costly Jacobians during integration, which is impractical for large molecular systems. To overcome this difficulty, we train an energy-based model (EBM) to approximate likelihoods using both noise contrastive estimation (NCE) and score matching, which we show outperforms the use of either objective in isolation. On 2d synthetic systems where failure can be easily visualized, NCE improves mode weighting relative to score matching alone. On alanine dipeptide, our method yields free energy profiles and energy distributions that closely match those obtained using exact likelihoods while achieving $100\times$ faster inference. By training on multiple dipeptide systems, we show that our approach also exhibits effective transfer learning, generalizing to new systems at inference time and achieving at least a $6\times$ speedup over standard MD. While many recent efforts in generative modeling have prioritized models with fast sampling, our work demonstrates the design of models with accelerated likelihoods, enabling the application of reweighting schemes that ensure unbiased Boltzmann statistics at scale. Our code is available at https://github.com/RishalAggarwal/BoltzNCE.
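The noise contrastive estimation component of the objective can be sketched in a toy 1D setting: a classifier logit is built from the model's log-density minus a known noise log-density, and the logistic loss is minimized exactly when the model matches the data distribution. The Gaussian densities and standard deviations below are illustrative assumptions, not the paper's EBM or training setup.

```python
import numpy as np

def log_gauss(x, std):
    # Normalized Gaussian log-density; stands in for the model's log-likelihood.
    return -0.5 * (x / std) ** 2 - np.log(std) - 0.5 * np.log(2 * np.pi)

def nce_loss(x_data, x_noise, std_model, std_noise=1.0):
    # NCE: logistic regression distinguishing data from noise, with
    # logit G(x) = log p_model(x) - log p_noise(x).
    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))
    g_d = log_gauss(x_data, std_model) - log_gauss(x_data, std_noise)
    g_n = log_gauss(x_noise, std_model) - log_gauss(x_noise, std_noise)
    eps = 1e-12
    return -(np.log(sigmoid(g_d) + eps).mean()
             + np.log(1.0 - sigmoid(g_n) + eps).mean())

rng = np.random.default_rng(1)
x_data = rng.normal(0.0, 2.0, 50_000)   # stand-in for Boltzmann samples
x_noise = rng.normal(0.0, 1.0, 50_000)  # samples from the noise distribution

# The loss is lowest at the model matching the data (std = 2.0).
losses = {s: nce_loss(x_data, x_noise, s) for s in (1.5, 2.0, 3.0)}
```

Evaluating the loss over a grid of model parameters shows the NCE objective is minimized when the model density equals the data density, which is what makes it usable as a likelihood-training signal.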




PEER: A Comprehensive and Multi-Task Benchmark for Protein Sequence Understanding (Supplementary Material)

Neural Information Processing Systems

For example, the feature of the dipeptide "st" is defined by its dipeptide composition. The Moran feature descriptor defines the distribution of amino acid properties along a protein sequence. It should be noted that there are evident class imbalances in two multi-class classification tasks. Table 1: Balanced metric (weighted F1) compared with accuracy on multi-class classification tasks. We report mean (std) for each experiment. Used as a feature extractor with pre-trained weights frozen.
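As a rough illustration of the dipeptide-composition feature mentioned above, one can count each of the 400 ordered amino-acid pairs in a sequence and normalize by the number of pairs; the helper name and toy sequence below are our own, not from the benchmark code.

```python
from itertools import product

AAS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def dipeptide_composition(seq):
    # Fraction of each of the 400 ordered amino-acid pairs in `seq`.
    pairs = ["".join(p) for p in product(AAS, repeat=2)]
    counts = {p: 0 for p in pairs}
    for i in range(len(seq) - 1):
        pair = seq[i:i + 2]
        if pair in counts:   # skips pairs containing non-standard residues
            counts[pair] += 1
    total = max(len(seq) - 1, 1)
    return {p: c / total for p, c in counts.items()}

feat = dipeptide_composition("MSTSTV")  # "ST" appears in 2 of 5 pairs
```

The result is a fixed-length 400-dimensional vector regardless of sequence length, which is what makes it convenient as a baseline protein feature.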


Operator Forces For Coarse-Grained Molecular Dynamics

Klein, Leon, Kelkar, Atharva, Durumeric, Aleksander, Chen, Yaoyi, Noé, Frank

arXiv.org Machine Learning

Coarse-grained (CG) molecular dynamics simulations extend the length and time scale of atomistic simulations by replacing groups of correlated atoms with CG beads. Machine-learned coarse-graining (MLCG) has recently emerged as a promising approach to construct highly accurate force fields for CG molecular dynamics. However, the calibration of MLCG force fields typically hinges on force matching, which demands extensive reference atomistic trajectories with corresponding force labels. In practice, atomistic forces are often not recorded, making traditional force matching infeasible on pre-existing datasets. Recently, noise-based kernels have been introduced to adapt force matching to the low-data regime, including situations in which reference atomistic forces are not present. While this approach produces force fields which recapitulate slow collective motion, it introduces significant local distortions due to the corrupting effects of the noise-based kernel. In this work, we introduce more general kernels based on normalizing flows that substantially reduce these local distortions while preserving global conformational accuracy. We demonstrate our method on small proteins, showing that flow-based kernels can generate high-quality CG forces solely from configurational samples.
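The force-matching calibration that the abstract says MLCG typically hinges on can be sketched in a minimal 1D setting: fit a parametric CG force to mapped reference forces by least squares. The linear force field and synthetic data below are illustrative assumptions, not the paper's flow-based kernels.

```python
import numpy as np

# Hypothetical 1D CG bead: fit a linear force field F(R) = -k*R to
# noisy reference forces (standing in for mapped atomistic forces).
rng = np.random.default_rng(2)
k_true = 4.0
R = rng.normal(0.0, 1.0, 10_000)                    # CG configurations
F_ref = -k_true * R + rng.normal(0.0, 0.5, R.size)  # noisy reference forces

# Force matching: minimize sum_i (F_ref_i - (-k * R_i))^2 over k.
# The closed-form least-squares solution is k = -(R . F_ref) / (R . R).
k_fit = -(R @ F_ref) / (R @ R)
```

Even with noisy force labels, the quadratic objective recovers the underlying force constant; the difficulty the paper addresses is precisely the case where such force labels were never recorded.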


Consistent Sampling and Simulation: Molecular Dynamics with Energy-Based Diffusion Models

Plainer, Michael, Wu, Hao, Klein, Leon, Günnemann, Stephan, Noé, Frank

arXiv.org Machine Learning

Diffusion models have recently gained significant attention due to their effectiveness in various scientific domains, including biochemistry. When trained on equilibrium molecular distributions, diffusion models provide both a generative procedure to sample equilibrium conformations and associated forces derived from the model's scores. However, using these forces for coarse-grained molecular dynamics simulations uncovers inconsistencies between samples generated via classical diffusion inference and via simulation, despite both originating from the same model. Particularly at the small diffusion timesteps required for simulations, diffusion models fail to satisfy the Fokker-Planck equation, which governs how the score should evolve over time. We interpret this deviation as an indication of the observed inconsistencies and propose an energy-based diffusion model with a Fokker-Planck-derived regularization term enforcing consistency. We demonstrate the effectiveness of our approach on toy systems and alanine dipeptide, and introduce a state-of-the-art transferable Boltzmann emulator for dipeptides that supports simulation and demonstrates enhanced consistency and efficient sampling.
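Using a model's score as a force in overdamped Langevin dynamics, as the abstract describes, can be sketched for a toy quadratic energy where the stationary distribution is known exactly (this is a generic Langevin sampler, not the paper's Fokker-Planck-regularized model).

```python
import numpy as np

# Toy energy-based model: E(x) = x^2/2, so the score (force) is
# s(x) = -dE/dx = -x and the stationary distribution is N(0, 1).
def score(x):
    return -x

rng = np.random.default_rng(3)
dt, n_steps = 0.01, 2_000
x = rng.normal(0.0, 2.0, 2_000)  # walkers, deliberately mis-initialized

# Euler-Maruyama discretization of dx = s(x) dt + sqrt(2) dW.
for _ in range(n_steps):
    x = x + score(x) * dt + np.sqrt(2.0 * dt) * rng.normal(size=x.size)

var = x.var()  # should relax toward the target variance of 1
```

With a consistent score, simulation and direct sampling agree on the same distribution; the paper's point is that small-timestep diffusion scores violate this consistency unless the Fokker-Planck equation is enforced.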