Approximation-Aware Bayesian Optimization
Neural Information Processing Systems
High-dimensional Bayesian optimization (BO) tasks such as molecular design often require >10,000 function evaluations before obtaining meaningful results. While methods like sparse variational Gaussian processes (SVGPs) reduce computational requirements in these settings, the underlying approximations result in suboptimal data acquisitions that slow the progress of optimization. In this paper we modify SVGPs to better align with the goals of BO: targeting informed data acquisition rather than global posterior fidelity. Using the framework of utility-calibrated variational inference, we unify GP approximation and data acquisition into a joint optimization problem, thereby ensuring optimal decisions under a limited computational budget. Our approach can be used with any decision-theoretic acquisition function and is readily compatible with trust region methods like TuRBO. We derive efficient joint objectives for the expected improvement and knowledge gradient acquisition functions for standard and batch BO. Our approach outperforms standard SVGPs on high-dimensional benchmark tasks in control and molecular design.
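As context for the acquisition functions the abstract names, here is a minimal sketch of the closed-form expected improvement (EI) under a Gaussian predictive posterior, such as the one an SVGP surrogate produces. This is illustrative background only, not the paper's joint utility-calibrated objective; the function name and the `xi` exploration parameter are our own conventions.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best, xi=0.0):
    """Closed-form EI for maximization, given a Gaussian posterior N(mu, sigma^2)
    at each candidate point and the incumbent best observed value.

    EI(x) = (mu - best - xi) * Phi(z) + sigma * phi(z),  z = (mu - best - xi) / sigma
    """
    sigma = np.maximum(np.asarray(sigma, dtype=float), 1e-12)  # guard zero variance
    improvement = np.asarray(mu, dtype=float) - best - xi
    z = improvement / sigma
    return improvement * norm.cdf(z) + sigma * norm.pdf(z)

# Candidates with equal uncertainty: the one with higher posterior mean
# has higher EI, which is what drives acquisition toward promising regions.
mu = np.array([0.0, 1.0])
sigma = np.array([1.0, 1.0])
ei = expected_improvement(mu, sigma, best=0.0)
```

In a standard SVGP-based BO loop, `mu` and `sigma` would come from the approximate posterior; the paper's point is that fitting that posterior for global fidelity, rather than jointly with the acquisition utility, can distort exactly these EI values and hence the chosen query points.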