Para-CFlows: $C^k$-universal diffeomorphism approximators as superior neural surrogates

Neural Information Processing Systems

Invertible neural networks based on Coupling Flows (CFlows) have various applications, such as image synthesis and data compression. Approximation universality is of paramount importance for ensuring the expressiveness of CFlows. In this paper, we prove that CFlows can approximate any diffeomorphism in $C^k$-norm if their layers can approximate certain single-coordinate transforms. Specifically, we derive that a composition of affine coupling layers and invertible linear transforms achieves this universality. Furthermore, in parametric cases where the diffeomorphism depends on extra parameters, we prove the corresponding approximation theorems for parametric coupling flows, named Para-CFlows. In practice, we apply Para-CFlows as a neural surrogate model in contextual Bayesian optimization tasks to demonstrate their superiority over other neural surrogate models in terms of optimization performance and gradient approximation.
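The affine coupling layers mentioned in the abstract are invertible by construction: the input is split into two halves, and one half is rescaled and shifted by networks that depend only on the other half, so the map can be inverted in closed form. A minimal NumPy sketch (illustrative names and tiny random networks, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(d_in, d_out, hidden=16):
    """A tiny fixed random MLP standing in for the learned scale/shift nets."""
    W1 = rng.normal(0, 0.1, (d_in, hidden))
    W2 = rng.normal(0, 0.1, (hidden, d_out))
    return lambda x: np.tanh(x @ W1) @ W2

class AffineCoupling:
    """One affine coupling layer: y1 = x1, y2 = x2 * exp(s(x1)) + t(x1)."""
    def __init__(self, dim):
        self.half = dim // 2
        self.s_net = make_mlp(self.half, dim - self.half)  # log-scale
        self.t_net = make_mlp(self.half, dim - self.half)  # shift

    def forward(self, x):
        x1, x2 = x[:self.half], x[self.half:]
        y2 = x2 * np.exp(self.s_net(x1)) + self.t_net(x1)
        return np.concatenate([x1, y2])

    def inverse(self, y):
        # Closed-form inverse: x1 is untouched, so s(x1) and t(x1)
        # can be recomputed from the output and undone exactly.
        y1, y2 = y[:self.half], y[self.half:]
        x2 = (y2 - self.t_net(y1)) * np.exp(-self.s_net(y1))
        return np.concatenate([y1, x2])

layer = AffineCoupling(4)
x = rng.normal(size=4)
assert np.allclose(layer.inverse(layer.forward(x)), x)
```

Since each layer leaves one half of the coordinates fixed, practical flows interleave these layers with invertible linear transforms (or permutations) so that every coordinate is eventually transformed, which is the composition the universality result above refers to.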



Therefore, beyond distributional universality, it is also important to investigate universality from the mapping perspective. Since INNs are invertible by construction, it is natural to consider their ability to approximate diffeomorphisms.



Universality of parametric Coupling Flows over parametric diffeomorphisms

Lyu, Junlong, Chen, Zhitang, Feng, Chang, Cun, Wenjing, Zhu, Shengyu, Geng, Yanhui, Xu, Zhijie, Chen, Yongwei

arXiv.org Artificial Intelligence

Invertible neural networks (INNs) such as coupling flows were first introduced as a class of generative models with a tractable likelihood [11, 25, 40], and in recent years have proven useful and powerful in various machine learning tasks such as inverse problems [2], probabilistic inference [29], and feature extraction [22]. Given the many successful applications of INNs, one may wonder whether this type of model has universal expressiveness. As most generative models are mainly concerned with transformations between distributions, existing works such as [19, 23] focused on expressiveness from the distribution perspective. However, expressiveness from the distribution perspective does not imply expressiveness from the mapping perspective, as there is a large (or even infinite) number of diffeomorphisms mapping a given source µ to a given target ν. In many applications, knowing the distributional universality is not enough: one may be interested in whether the optimal transport map [41], which finds emerging applications in many fields, e.g., machine learning [32], wireless communication [30], and economics [15], can be approximated by invertible neural networks.