pixelcnn


Conditional Image Generation with PixelCNN Decoders

Neural Information Processing Systems

This work explores conditional image generation with a new image density model based on the PixelCNN architecture. The model can be conditioned on any vector, including descriptive labels or tags, or latent embeddings created by other networks. When conditioned on class labels from the ImageNet database, the model is able to generate diverse, realistic scenes representing distinct animals, objects, landscapes and structures. When conditioned on an embedding produced by a convolutional network given a single image of an unseen face, it generates a variety of new portraits of the same person with different facial expressions, poses and lighting conditions. We also show that conditional PixelCNN can serve as a powerful decoder in an image autoencoder. Additionally, the gated convolutional layers in the proposed model improve the log-likelihood of PixelCNN to match the state-of-the-art performance of PixelRNN on ImageNet, with greatly reduced computational cost.
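The gated convolutional layer mentioned in the abstract combines a tanh and a sigmoid branch, each shifted by a projection of the conditioning vector. The following is a minimal per-feature sketch of that gated activation (illustrative names and random weights; the actual model applies it with masked convolutions over feature maps):

```python
import numpy as np

def gated_activation(x, h, W_f, W_g, V_f, V_g):
    """Gated activation unit in the style of the conditional PixelCNN:
    y = tanh(W_f x + V_f h) * sigmoid(W_g x + V_g h),
    where h is the conditioning vector (e.g. a class embedding)."""
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    return np.tanh(W_f @ x + V_f @ h) * sigmoid(W_g @ x + V_g @ h)

rng = np.random.default_rng(0)
d, c = 8, 4                      # feature size d, conditioning size c
x = rng.normal(size=d)           # features from the masked convolution
h = rng.normal(size=c)           # conditioning vector
W_f, W_g = rng.normal(size=(d, d)), rng.normal(size=(d, d))
V_f, V_g = rng.normal(size=(d, c)), rng.normal(size=(d, c))
y = gated_activation(x, h, W_f, W_g, V_f, V_g)
print(y.shape)  # (8,)
```

Because the tanh branch is bounded in (-1, 1) and the sigmoid gate in (0, 1), the output stays strictly inside (-1, 1), which is part of what makes the gated unit easier to optimize than a plain ReLU stack.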



PixelGAN Autoencoders

Neural Information Processing Systems

In this paper, we describe the "PixelGAN autoencoder", a generative autoencoder in which the generative path is a convolutional autoregressive neural network on pixels (PixelCNN) conditioned on a latent code, and the recognition path uses a generative adversarial network (GAN) to impose a prior distribution on the latent code. Both networks are jointly trained to maximize a variational lower bound on the data log-likelihood. In Section 2.1, we show that by imposing a Gaussian distribution on the latent code, we can achieve a global vs. local decomposition of information.



ANTN: Bridging Autoregressive Neural Networks and Tensor Networks for Quantum Many-Body Simulation

Neural Information Processing Systems

The Autoregressive Neural TensorNet (ANTN) parameterizes normalized wavefunctions, allows for exact sampling, generalizes the expressivity of tensor networks and autoregressive neural networks, and inherits a variety of symmetries from autoregressive neural networks.


A Linear and Quadratic Splines as Flows

Neural Information Processing Systems

For self-containedness, we here summarize linear and quadratic spline flows, i.e. flows built from piecewise transformations. For a quadratic spline, the density will be piecewise linear. We then show how one can rewrite this distribution in an autoregressive form. The categories should have decreasing dequantization gaps in the listed order. The results are shown in Table 4. The CIFAR-10 experiments are summarized in Table 5. Table 5 gives the PixelCNN architecture used for the CIFAR-10 dataset.
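A spline flow of this kind can be made concrete with a small sketch (illustrative only, not the paper's code): a linear spline flow on [0, 1) with equal-width bins, where the transform is the piecewise-linear CDF and the log-determinant is the log of the piecewise-constant density.

```python
import numpy as np

def linear_spline_flow(x, bin_probs):
    """Piecewise-linear CDF on [0, 1) built from K equal-width bins.
    bin_probs[k] is the probability mass of bin k; the flow maps x to
    z = CDF(x), and the log-determinant of the transform is the log of
    the piecewise-constant density K * bin_probs[k]."""
    K = len(bin_probs)
    k = min(int(x * K), K - 1)           # index of the bin containing x
    cdf_left = bin_probs[:k].sum()       # mass strictly left of bin k
    z = cdf_left + (x * K - k) * bin_probs[k]
    logdet = np.log(K * bin_probs[k])
    return z, logdet

probs = np.array([0.1, 0.2, 0.3, 0.4])  # bin masses, must sum to 1
z, ld = linear_spline_flow(0.6, probs)   # x = 0.6 falls in bin 2
```

Here x = 0.6 lands in bin 2, so z = 0.3 + 0.4 * 0.3 = 0.42 and the log-determinant is log(4 * 0.3). A quadratic spline replaces the constant density per bin with a linear one, which makes the CDF piecewise quadratic.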



importance and

Neural Information Processing Systems

We thank all reviewers for their useful comments. Table 1: AUCROC obtained from Likelihood Regret on Glow and PixelCNN. We now address the detailed concerns of each reviewer. We apologize for the confusion caused by the notation. Please refer to items 1-3 for concerns regarding why we focus on VAE's OOD detection.


Hierarchical autoregressive neural networks in three-dimensional statistical system

Białas, Piotr, Chahar, Vaibhav, Korcyl, Piotr, Stebel, Tomasz, Winiarski, Mateusz, Zapolski, Dawid

arXiv.org Artificial Intelligence

Autoregressive Neural Networks (ANN) have been recently proposed as a mechanism to improve the efficiency of Monte Carlo algorithms for several spin systems. The idea relies on the fact that the total probability of a configuration can be factorized into conditional probabilities of each spin, which in turn can be approximated by a neural network. Once trained, the ANNs can be used to sample configurations from the approximated probability distribution and to evaluate explicitly this probability for a given configuration. It has also been observed that such conditional probabilities give access to information-theoretic observables such as mutual information or entanglement entropy. So far, these methods have been applied to two-dimensional statistical systems or one-dimensional quantum systems. In this paper, we describe a generalization of the hierarchical algorithm to three spatial dimensions and study its performance on the example of the Ising model. We discuss the efficiency of the training and also describe the scaling with the system's dimensionality by comparing results for two- and three-dimensional Ising models with the same number of spins. Finally, we provide estimates of thermodynamical observables for the three-dimensional Ising model, such as the entropy and free energy in a range of temperatures across the phase transition.
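The factorization described above, p(s) = prod_i p(s_i | s_<i), is what lets an autoregressive network both sample configurations and evaluate their exact probability. A minimal sketch with a toy conditional model standing in for the trained network (the `toy_cond` rule below is a hypothetical placeholder, not the paper's model):

```python
import numpy as np

def sample_and_logprob(cond_prob, n, rng):
    """Sample a spin configuration autoregressively and return its exact
    log-probability. cond_prob(prefix) returns p(s_i = +1 | s_<i); in
    practice this would be the trained neural network."""
    spins, logp = [], 0.0
    for _ in range(n):
        p_up = cond_prob(spins)
        s = 1 if rng.random() < p_up else -1
        logp += np.log(p_up if s == 1 else 1.0 - p_up)
        spins.append(s)
    return spins, logp

def toy_cond(prefix):
    """Toy conditional: slight preference to align with the previous spin."""
    if not prefix:
        return 0.5
    return 0.7 if prefix[-1] == 1 else 0.3

rng = np.random.default_rng(1)
spins, logp = sample_and_logprob(toy_cond, 16, rng)
```

Because sampling and evaluation share the same conditionals, the sampled configuration comes with its exact log-probability for free, which is what makes these models usable as proposal distributions in Monte Carlo algorithms and gives direct access to entropy-like observables.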


Figure 1: Architecture of the PixelGAN autoencoder

Neural Information Processing Systems

In this paper, we describe the "PixelGAN autoencoder", a generative autoencoder in which the generative path is a convolutional autoregressive neural network on pixels (PixelCNN) that is conditioned on a latent code, and the recognition path uses a generative adversarial network (GAN) to impose a prior distribution on the latent code. We show that different priors result in different decompositions of information between the latent code and the autoregressive decoder. For example, by imposing a Gaussian distribution as the prior, we can achieve a global vs. local decomposition, or by imposing a categorical distribution as the prior, we can disentangle the style and content information of images in an unsupervised fashion. We further show how the PixelGAN autoencoder with a categorical prior can be directly used in semi-supervised settings and achieve competitive semi-supervised classification results on the MNIST, SVHN and NORB datasets.