Supplementary Material for "Non-Asymptotic Error Bounds for Bidirectional GANs"
Neural Information Processing Systems
Department of Mathematics, The Hong Kong University of Science and Technology, Clear Water Bay, Hong Kong, China
yyangdc@connect.ust.hk

In this supplementary material, we first prove Theorem 3.2, and then Theorems 3.1 and 3.3. We use σ to denote the ReLU activation function in neural networks, defined by σ(x) = max{x, 0}. We use the notations O(·) and Õ(·) to express orders of magnitude, with a slight distinction: O(·) omits universal constants that do not depend on d, while Õ(·) omits constants that may depend on d. So far, most related works assume that the target distribution µ is supported on a compact set; see, for example, Chen et al. (2020) and Liang (2020).
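To make the ReLU convention concrete, the following is a minimal, hypothetical sketch (not taken from the paper) of a feedforward ReLU network of the kind whose error bounds are analyzed here: every hidden layer applies an affine map followed by σ(x) = max{x, 0}, and the output layer is affine. The layer widths, depth, and function names are illustrative assumptions only.

```python
import numpy as np

def relu(x):
    # sigma(x) = max{x, 0}, applied elementwise
    return np.maximum(x, 0.0)

def relu_network(x, weights, biases):
    """Evaluate a ReLU feedforward network: affine map + ReLU on each
    hidden layer, affine map only on the output layer."""
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + b)
    W_out, b_out = weights[-1], biases[-1]
    return W_out @ h + b_out

# Illustrative example: input dimension d = 3, two hidden layers of width 8,
# scalar output (e.g., a discriminator-style map from R^d to R).
rng = np.random.default_rng(0)
dims = [3, 8, 8, 1]
weights = [rng.standard_normal((dims[i + 1], dims[i])) for i in range(len(dims) - 1)]
biases = [rng.standard_normal(dims[i + 1]) for i in range(len(dims) - 1)]
print(relu_network(rng.standard_normal(3), weights, biases))
```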