Controlling Posterior Collapse by an Inverse Lipschitz Constraint on the Decoder Network
Kinoshita, Yuri, Oono, Kenta, Fukumizu, Kenji, Yoshida, Yuichi, Maeda, Shin-ichi
–arXiv.org Artificial Intelligence
While VAEs are nowadays omnipresent in the field of machine learning, it is also widely recognized that in practice there remain some major challenges that still require effective solutions. Notably, they suffer from the problem of posterior collapse, which occurs when the distribution corresponding to the encoder coincides, or collapses, with the prior, taking no information from the latent structure of the input data into consideration. Also known as KL vanishing or over-pruning, this phenomenon makes VAEs incapable of producing pertinent representations and has been reportedly observed in many fields (e.g., Bowman et al. (2016); Fu et al. (2019); Wang & Ziyin (2022); Yeung et al. (2017)); there exists now a large body of literature that examines its underlying causes. In this work, we introduce an inverse Lipschitz neural network into the decoder and, based on this architecture, provide a new method that can control, in a simple and clear manner, the degree of posterior collapse for a wide range of VAE models, equipped with a concrete theoretical guarantee. We also illustrate the effectiveness of our method through several numerical experiments.
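For illustration only, below is a minimal sketch of how an inverse Lipschitz map can be built in practice. It is not the authors' released implementation or their actual architecture; the PyTorch framing, the use of spectral normalization, and the names `InverseLipschitzBlock`, `m`, and `alpha` are all assumptions made for this example. It shows only the generic construction f(z) = m*z + alpha*g(z) with g at most 1-Lipschitz, which by the reverse triangle inequality is inverse Lipschitz with constant at least m - alpha.

```python
# Hypothetical sketch, not the paper's code: an inverse Lipschitz block of the form
# f(z) = m*z + alpha*g(z), where g is kept (approximately) 1-Lipschitz via spectral
# normalization, so ||f(z1) - f(z2)|| >= (m - alpha) * ||z1 - z2|| for all z1, z2.
import torch
import torch.nn as nn


class InverseLipschitzBlock(nn.Module):
    """Maps z -> m*z + alpha*g(z); inverse Lipschitz with constant >= m - alpha."""

    def __init__(self, dim: int, m: float = 1.0, alpha: float = 0.5):
        super().__init__()
        assert 0.0 <= alpha < m, "residual branch must stay strictly below m"
        self.m, self.alpha = m, alpha
        # Spectral normalization keeps each linear map approximately 1-Lipschitz,
        # and tanh is 1-Lipschitz, so the residual branch is approximately 1-Lipschitz.
        self.residual = nn.Sequential(
            nn.utils.spectral_norm(nn.Linear(dim, dim)),
            nn.Tanh(),
            nn.utils.spectral_norm(nn.Linear(dim, dim)),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # The lower bound m - alpha acts as a knob: a larger bound forces the
        # decoder's first stage to distinguish latent codes more strongly.
        return self.m * z + self.alpha * self.residual(z)


# Usage: prepend the block to an otherwise ordinary decoder head.
decoder_front = InverseLipschitzBlock(dim=16, m=1.0, alpha=0.5)
z = torch.randn(8, 16)
h = decoder_front(z)  # same shape as z: (8, 16)
```

Note that this sketch only constrains the first stage of the decoder; the paper's actual construction, how the constraint is propagated through the full decoder, and the resulting theoretical guarantee are as described by the authors and are not reproduced here.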
Apr-25-2023