Black-Box Optimization with Local Generative Surrogates

Supplementary Material

A Surrogates Implementation Details

Neural Information Processing Systems 

A.1 GAN Implementation

For training the GANs, we used a conditional generator network with three hidden layers of size 100 and a conditional discriminator network with two hidden layers of size 100. All hidden layers except the last use a tanh activation; the last hidden layer uses leaky_relu. Conditioning is performed by concatenating the input noise z with the input parameters ψ. The learning rate and batch size are set to 0.0008 and 512, respectively.
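The architecture above can be sketched in PyTorch as follows. Layer widths, activations, conditioning by concatenation, learning rate, and batch size follow the text; the noise, parameter, and output dimensions (NOISE_DIM, PSI_DIM, X_DIM) and the choice of Adam as the optimizer are illustrative assumptions, not specified here.

```python
import torch
import torch.nn as nn

# Illustrative dimensions (assumptions; the text does not fix them).
NOISE_DIM, PSI_DIM, X_DIM = 16, 2, 1


class Generator(nn.Module):
    """Conditional generator: three hidden layers of size 100."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM + PSI_DIM, 100), nn.Tanh(),
            nn.Linear(100, 100), nn.Tanh(),
            nn.Linear(100, 100), nn.LeakyReLU(),  # last hidden layer
            nn.Linear(100, X_DIM),
        )

    def forward(self, z, psi):
        # Conditioning: concatenate noise z with parameters psi.
        return self.net(torch.cat([z, psi], dim=1))


class Discriminator(nn.Module):
    """Conditional discriminator: two hidden layers of size 100."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(X_DIM + PSI_DIM, 100), nn.Tanh(),
            nn.Linear(100, 100), nn.LeakyReLU(),  # last hidden layer
            nn.Linear(100, 1),
        )

    def forward(self, x, psi):
        return self.net(torch.cat([x, psi], dim=1))


gen, disc = Generator(), Discriminator()
# Optimizer choice is an assumption; learning rate 0.0008 is from the text.
opt_g = torch.optim.Adam(gen.parameters(), lr=0.0008)
opt_d = torch.optim.Adam(disc.parameters(), lr=0.0008)

# One forward pass at the stated batch size of 512.
z = torch.randn(512, NOISE_DIM)
psi = torch.randn(512, PSI_DIM)
x_fake = gen(z, psi)
score = disc(x_fake, psi)
```

A training step would then alternate the usual discriminator and generator updates using these optimizers; only the network shapes and conditioning scheme are taken from the text above.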