Supplementary Material for CircleGAN
Neural Information Processing Systems
We provide the hyperparameter settings and architectural details used in our work. All experiments use the same hyperparameters. The major differences are as follows: 1) we use dropout and batch normalization (BN) with weight normalization (WN) as regularizers, instead of existing techniques such as spectral normalization (SN) and gradient penalties (GP). SNGAN sets the learning rates of the discriminator and the generator to 0.0004 and 0.0001, respectively.

The architectural details for ImageNet are presented in Table 2. The quantitative results are shown in Figure 1.

Figure 2: Comparison of IS and FID on ImageNet with Proj.
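To make the regularization choices concrete, below is a minimal NumPy sketch of the three components mentioned above (weight normalization, batch normalization, and dropout) applied to one hypothetical discriminator layer. All function names, shapes, and the layer ordering here are illustrative assumptions, not the exact implementation used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def weight_norm(v, g):
    # Weight normalization (Salimans & Kingma): w = g * v / ||v||,
    # decoupling each output unit's weight direction from its magnitude.
    norm = np.linalg.norm(v, axis=1, keepdims=True)
    return g[:, None] * v / norm

def batch_norm(x, eps=1e-5):
    # Batch normalization over the batch dimension
    # (learned scale/shift omitted for brevity).
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mu) / np.sqrt(var + eps)

def dropout(x, p, training=True):
    # Inverted dropout: zero units with probability p,
    # rescale survivors by 1/(1-p) so expectations match at test time.
    if not training:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

# Hypothetical layer: linear (WN) -> BN -> ReLU -> dropout.
v = rng.standard_normal((64, 128))   # unnormalized weight directions
g = np.ones(64)                      # per-unit learned scales
x = rng.standard_normal((16, 128))   # a batch of 16 input features
h = dropout(np.maximum(batch_norm(x @ weight_norm(v, g).T), 0.0), p=0.5)
```

In this reparameterization, each row of the effective weight matrix has norm exactly `g`, so the optimizer controls scale and direction separately, which is the property that lets WN act as a lightweight alternative to SN or GP on the discriminator.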