
This Wobbly Robot Wants to Be Your Pet's New Playmate

WIRED

For the past few months, I've been living with a pet-friendly robot made to entertain cats. Enabot's Ebo Pro is a circular bot that fits in the palm of your hand and has a cute pixelated digital face. Like a fun WALL-E knockoff, it even lets out an excited "Eebow" in a childlike robotronic voice as it powers up. It has wheels to move, a camera to see, a laser to tease your cat with, and a built-in speaker so you can chat with your pet from afar (or your kids, as the website notes). You can use the Ebo Pro in a few ways: drive it manually with the controls in its app, turn on automatic mode, or set schedules so it activates and plays with your pet at certain times of day.


Batched Large-scale Bayesian Optimization in High-dimensional Spaces

arXiv.org Machine Learning

Bayesian optimization (BO) has become an effective approach for black-box function optimization problems when function evaluations are expensive and the optimum can be achieved within a relatively small number of queries. However, many problems, such as those with high-dimensional inputs, may require far more observations to optimize. Despite an abundance of observations made possible by parallel experiments, current BO techniques have been limited to merely a few thousand observations. In this paper, we propose ensemble Bayesian optimization (EBO) to address three current challenges in BO simultaneously: (1) large-scale observations; (2) high-dimensional input spaces; and (3) selection of batch queries that balance quality and diversity. The key idea of EBO is to operate on an ensemble of additive Gaussian process models, each of which possesses a randomized strategy to divide and conquer. We demonstrate results previously out of reach, scaling BO to tens of thousands of observations within minutes of computation.
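To make the ensemble idea concrete, here is a minimal sketch, not the authors' implementation: it assumes scikit-learn's GaussianProcessRegressor as a stand-in for the paper's additive GP models, random data subsets in place of its randomized divide-and-conquer partitions, and a simple averaged upper-confidence-bound acquisition; fit_ensemble and ensemble_ucb are hypothetical helper names.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def fit_ensemble(X, y, n_models=8, subset_size=500, rng=None):
    """Fit one GP per random subset of the data (divide and conquer)."""
    if rng is None:
        rng = np.random.default_rng(0)
    models = []
    for _ in range(n_models):
        # Randomized partition: each member sees a different data subset,
        # keeping each GP fit cheap even with tens of thousands of points.
        idx = rng.choice(len(X), size=min(len(X), subset_size), replace=False)
        gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True)
        gp.fit(X[idx], y[idx])
        models.append(gp)
    return models

def ensemble_ucb(models, X_cand, beta=2.0):
    """Average the members' posteriors into one acquisition value (UCB)."""
    mus, sigmas = zip(*(gp.predict(X_cand, return_std=True) for gp in models))
    return np.mean(mus, axis=0) + beta * np.mean(sigmas, axis=0)
```

Averaging the members' means and standard deviations is only one simple way to aggregate an ensemble's posteriors; the point of the sketch is that each member is cheap to fit, so many can be trained and queried where a single GP on all the data would be intractable.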



@machinelearnbot

In this paper, we propose Ensemble Bayesian Optimization (EBO) to overcome this problem. Unlike conventional BO methods that operate on a single posterior GP model, EBO works with an ensemble of posterior GP models. Our approach gains speedups by parallelizing the time-consuming hyper-parameter posterior inference and function evaluations across hundreds of cores and aggregating the models in every iteration of BO. We demonstrate EBO's ability to handle hard, sample-intensive optimization problems by applying it to a rover navigation problem with tens of thousands of observations.
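As a rough illustration of that parallel pattern, the sketch below, an assumption-laden stand-in rather than the paper's code, farms out each ensemble member's GP fit to a separate process with Python's multiprocessing; fit_one and fit_ensemble_parallel are illustrative names, and a small process pool replaces the hundreds of cores the authors use.

```python
import numpy as np
from multiprocessing import Pool
from sklearn.gaussian_process import GaussianProcessRegressor

def fit_one(shard):
    """Fit a single ensemble member on its data shard (runs in a worker)."""
    X_shard, y_shard = shard
    gp = GaussianProcessRegressor(normalize_y=True)
    gp.fit(X_shard, y_shard)
    return gp

def fit_ensemble_parallel(X, y, n_models=8, subset_size=500, n_workers=4):
    rng = np.random.default_rng(0)
    shards = []
    for _ in range(n_models):
        idx = rng.choice(len(X), size=min(len(X), subset_size), replace=False)
        shards.append((X[idx], y[idx]))
    with Pool(n_workers) as pool:
        # Each per-member fit (the expensive posterior inference) runs on
        # its own core; the fitted models are gathered for the next round.
        return pool.map(fit_one, shards)

if __name__ == "__main__":  # guard required for multiprocessing
    X = np.random.default_rng(1).uniform(size=(2000, 10))
    y = X.sum(axis=1)
    models = fit_ensemble_parallel(X, y)
    print(len(models), "models fitted in parallel")
```

Because the members are independent given their shards, the fits embarrassingly parallelize; in each BO iteration the freshly fitted models are aggregated (for example, as in the acquisition sketch above) before the next batch of queries is selected.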