Neural Information Processing Systems
A.1 Model Architecture

The architecture of the SinGAN used in our paper follows that in [4]. The trade-off parameter for the gradient penalty in WGAN-GP [3] is set to 0.1. Adam [5] is adopted as the stochastic optimizer with an initial learning rate of 0.0005 and a decay factor of 0.1 applied after finishing 80% of the iterations, and we set the maximum number of training iterations to 2,000.

C.2 Per-Stage Weight Distribution

In addition to the total weight distribution, a comparison of the per-stage weight distributions is also provided.
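The learning-rate schedule described in A.1 can be sketched as a piecewise-constant function: the initial rate of 0.0005 is multiplied by the decay factor 0.1 once 80% of the 2,000 iterations (i.e., iteration 1,600) have completed. The function below is an illustrative sketch of this schedule, not the authors' released training code; the function name and signature are our own.

```python
def learning_rate(iteration, max_iters=2000, base_lr=5e-4, decay=0.1):
    """Piecewise-constant schedule matching the setup in A.1.

    Returns base_lr for the first 80% of training, then base_lr * decay
    for the remaining iterations.  (Hypothetical helper for illustration.)
    """
    decay_at = int(0.8 * max_iters)  # iteration 1600 for max_iters = 2000
    return base_lr * decay if iteration >= decay_at else base_lr
```

For example, `learning_rate(0)` and `learning_rate(1599)` both return 0.0005, while `learning_rate(1600)` returns 0.00005; in a framework such as PyTorch, the same behavior could be obtained with a one-milestone step scheduler attached to the Adam optimizer.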