Appendix to: Bayesian Optimization over Discrete and Mixed Spaces via Probabilistic Reparameterization

Neural Information Processing Systems 

With this paper, we improve the performance of Bayesian optimization on problems with mixed input types. Given the ubiquity of such problems in many practical applications, we believe that our method could lead to positive broader impacts by solving these problems more effectively and efficiently while reducing the costs incurred in solving them. Concrete, high-stakes examples where our method could potentially be applied (some of which have already been demonstrated by the benchmark problems considered in the paper) include, but are not limited to, applications in communications, chemical synthesis, drug discovery, engineering optimization, tuning of recommender systems, and automation of machine learning systems. On the flip side, while the proposed method is ethically neutral, there is potential for misuse, since the exact objective of optimization is ultimately decided by the end users; we believe that practitioners and researchers should be aware of this possibility and aim to mitigate any potential negative impacts to the fullest extent.

Let $\mathcal{X}$ be a compact metric space, let $\mathcal{Z}$ be a finite set, and consider the set of functionals $\Phi = \{\varphi \text{ s.t. } \varphi: \Theta \to \mathcal{P}(\mathcal{Z})\}$, where $\Theta$ is a compact parameter space and $\mathcal{P}(\mathcal{Z})$ denotes the set of probability distributions over $\mathcal{Z}$. Since $\mathcal{Z}$ is finite, each element $\varphi \in \Phi$ can be expressed as a mapping from $\Theta$ to $\mathbb{R}^{|\mathcal{Z}|}$.

Lemma 1. Suppose $a(x, z)$ is continuous in $x$ for every $z \in \mathcal{Z}$ and that $\varphi: \theta \mapsto p(\cdot \mid \theta) \in \mathcal{P}(\mathcal{Z})$ is continuous. Then the reparameterized objective
$$\hat{a}(x, \theta) = \mathbb{E}_{z \sim p(\cdot \mid \theta)}\big[a(x, z)\big] = \sum_{z \in \mathcal{Z}} a(x, z)\, p(z \mid \theta)$$
is continuous in $(x, \theta)$ (using that $\varphi$ is continuous and $a$ is bounded). Since both $\mathcal{X}$ and $\Theta$ are compact, $\hat{a}$ attains its maximum, i.e., the set of maximizers $J = \arg\max_{(x, \theta) \in \mathcal{X} \times \Theta} \hat{a}(x, \theta)$ is nonempty.

Corollary 2. Suppose the optimizer of $a$ is unique, i.e., that $H = \arg\max_{(x, z) \in \mathcal{X} \times \mathcal{Z}} a(x, z)$ is a singleton $\{(x^*, z^*)\}$. Then every maximizer $(x^*, \theta^*) \in J$ corresponds to a distribution $p(\cdot \mid \theta^*)$ that places all of its mass on $z^*$.

Corollary 3. Consider the following mapping: Binary: $\varphi: [0, 1] \to \mathcal{P}(\{0, 1\})$ with $p(z = 1 \mid \theta) = \theta$. This mapping satisfies the conditions of Lemma 1.
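The result for the binary mapping can be checked numerically. The sketch below is a hypothetical illustration (the toy acquisition function `a` and the grid resolution are assumptions, not from the paper): for a single binary variable, the reparameterized objective is linear in $\theta$, so its maximum over $[0, 1]$ is attained at an endpoint and coincides with the discrete maximum over $z \in \{0, 1\}$.

```python
# Toy check of the probabilistic reparameterization for one binary variable.
# Both a(x, z) and the grid sizes are illustrative assumptions.
import numpy as np

def a(x, z):
    # Toy acquisition function: continuous in x for each fixed z in {0, 1}.
    return -(x - 0.3) ** 2 if z == 0 else -(x - 0.7) ** 2 + 0.1

def a_pr(x, theta):
    # Reparameterized objective under the Binary mapping
    # phi: [0, 1] -> P({0, 1}) with p(z = 1 | theta) = theta.
    return (1 - theta) * a(x, 0) + theta * a(x, 1)

xs = np.linspace(0.0, 1.0, 1001)
thetas = np.linspace(0.0, 1.0, 1001)

# Discrete maximum: max over x in [0, 1] and z in {0, 1}.
discrete_max = max(a(x, z) for x in xs for z in (0, 1))

# Reparameterized maximum: max over (x, theta) on a grid.
X, T = np.meshgrid(xs, thetas)
pr_max = a_pr(X, T).max()

print(np.isclose(discrete_max, pr_max))  # the two maxima coincide
```

Because $\hat{a}(x, \cdot)$ is an affine function of $\theta$ for each fixed $x$, its maximizer lies at $\theta \in \{0, 1\}$, i.e., at a degenerate distribution concentrated on a single $z$, which is exactly why the two maxima agree.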