Training GANs in Julia's Flux


In order to run machine-learning experiments effectively, we need a fast turnaround time for model training. Simply implementing the model is therefore not the only thing to worry about: we also want a convenient way to change the hyperparameters, either through a configuration file or through command-line arguments. This post demonstrates how I train a vanilla GAN on the MNIST dataset. It is not about GAN theory; for that, the original paper by Goodfellow et al. [[1]] is a good starting point. Instead, I focus on how to structure the code and on subtle implementation issues I came across while writing it. You can find the current version of the code on GitHub.
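One lightweight way to make hyperparameters easy to change in Julia is a keyword-defined struct, so defaults live in one place and individual fields can be overridden at the call site (for example from parsed command-line arguments). The field names and default values below are illustrative assumptions, not the exact ones used in this post:

```julia
# Hypothetical hyperparameter container; Base.@kwdef gives a keyword
# constructor with defaults (available in Julia >= 1.1).
Base.@kwdef struct HyperParams
    lr::Float64     = 2e-4   # learning rate for generator and discriminator
    batch_size::Int = 128    # minibatch size for MNIST
    epochs::Int     = 20     # number of passes over the dataset
    latent_dim::Int = 100    # size of the generator's noise input
end

# Override a single field, keep the remaining defaults.
hp = HyperParams(lr = 1e-4)
```

A struct like this can then be passed to the training loop as a single argument, which keeps function signatures stable as hyperparameters are added or removed.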