Supplementary Material for "Non-Asymptotic Error Bounds for Bidirectional GANs"
Neural Information Processing Systems
Department of Mathematics, The Hong Kong University of Science and Technology, Clear Water Bay, Hong Kong, China. Email: yyangdc@connect.ust.hk

In this supplementary material, we first prove Theorem 3.2, and then Theorems 3.1 and 3.3. We use σ to denote the ReLU activation function in neural networks, defined by σ(x) = max{x, 0}. We use the notations O(·) and O_d(·) to express the order of a function in slightly different ways: O(·) omits universal constants that do not depend on d, while O_d(·) omits constants that may depend on d. So far, most related works assume that the target distribution µ is supported on a compact set; see, for example, Chen et al. (2020) and Liang (2020).
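The ReLU activation above can be sketched in a few lines of NumPy (a minimal illustration; the function name `relu` is ours, not from the paper's code):

```python
import numpy as np

def relu(x):
    """Elementwise ReLU activation: sigma(x) = max{x, 0}."""
    return np.maximum(x, 0.0)

# Example: negative entries are clipped to zero, non-negative ones pass through.
print(relu(np.array([-2.0, 0.0, 3.5])))
```

Applied to the vector [-2.0, 0.0, 3.5], this returns [0.0, 0.0, 3.5].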