Multi-Information Source Optimization
We consider Bayesian methods for multi-information source optimization (MISO), in which we seek to optimize an expensive-to-evaluate black-box objective function while also accessing cheaper but biased and noisy approximations (information sources). We present a novel algorithm that outperforms the state of the art for this problem by using a Gaussian process covariance kernel better suited to MISO than those used by previous approaches, and an acquisition function based on a one-step optimality analysis supported by efficient parallelization. We also provide a novel technique to guarantee the asymptotic quality of the solution provided by this algorithm. Experimental evaluations demonstrate that this algorithm consistently finds designs of higher value at less cost than previous approaches.
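The "covariance kernel better suited to MISO" can be illustrated with a common additive construction for this setting (a sketch of the modeling idea, not necessarily the paper's exact kernel): each cheap information source is modeled as the objective plus an independent Gaussian-process bias term, so two observations share the objective's covariance, plus a source-specific bias covariance only when they come from the same source. A minimal NumPy sketch, where `rbf`, `miso_kernel`, and all parameter values are illustrative:

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between rows of X1 (n1, d) and X2 (n2, d)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def miso_kernel(s1, X1, s2, X2, k0_params, bias_params):
    """Covariance between observations (s, x) tagged with source labels:
    Cov((s, x), (s', x')) = k0(x, x') + 1[s == s'] * k_s(x, x'),
    where k0 models the true objective (source 0 has no bias term) and each
    cheap source s >= 1 contributes an independent bias kernel k_s."""
    K = rbf(X1, X2, **k0_params)
    for s, params in bias_params.items():
        # Bias covariance applies only between points queried at the same source.
        same_source = (s1[:, None] == s) & (s2[None, :] == s)
        K = K + same_source * rbf(X1, X2, **params)
    return K
```

Because the bias terms are independent across sources, observations from a cheap source inform the objective through the shared `k0` component while the model remains free to attribute systematic discrepancies to that source's bias.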
Reviews: Multi-Information Source Optimization
This paper deals with the important topic of optimization in cases where, in addition to costly evaluations of the objective function, it is possible to evaluate cheaper approximations of it. This framework is referred to as MISO (multi-information source optimization) in the paper, where Bayesian optimization strategies relying on Gaussian process models are considered. An original MISO approach is presented, misoKG, that relies on an adaptation of the Knowledge Gradient algorithm to multi-information source optimization settings. The method is shown to achieve very good results and to outperform the considered competitors on three test cases. Overall, I am fond of the ideas and the research directions tackled by the paper, and I found the contributions principled and practically relevant.
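The Knowledge Gradient criterion mentioned in the review is the "one-step optimality analysis" of the abstract: the expected improvement in the best posterior mean after one more hypothetical sample. A minimal Monte Carlo sketch on a discretized design space follows; the per-sample posterior-mean update is standard Gaussian-process algebra, and the function name and parameters are illustrative rather than the paper's implementation:

```python
import numpy as np

def knowledge_gradient(mu, Sigma, j, noise_var, n_samples=10_000, rng=None):
    """Monte Carlo estimate of the one-step value of sampling design j.

    The current belief over objective values on a discretized domain is
    N(mu, Sigma); observing design j with Gaussian noise of variance
    noise_var shifts the posterior mean along sigma_tilde by a standard
    normal amount. KG(j) = E[max_i mu_new[i]] - max_i mu[i] >= 0.
    """
    rng = np.random.default_rng(rng)
    # Direction and scale of the posterior-mean update for a sample at j.
    sigma_tilde = Sigma[:, j] / np.sqrt(Sigma[j, j] + noise_var)
    z = rng.standard_normal(n_samples)
    # Simulated updated posterior means over all designs (n_samples, n_designs).
    new_means = mu[None, :] + z[:, None] * sigma_tilde[None, :]
    return new_means.max(axis=1).mean() - mu.max()
```

In a MISO setting, a misoKG-style policy would score every (source, design) pair this way, typically trading the one-step value off against each source's query cost, and sample the best pair; this inner maximization is what the abstract reports parallelizing.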
Matthias Poloczek, Jialei Wang, Peter Frazier
Published at the Neural Information Processing Systems Conference.