Appendix of GAN Memory with No Forgetting

Neural Information Processing Systems 

Figure 7: (a) The generator architecture adopted in this paper. It inherits the architecture from GP-GAN, which is the same as the green/frozen part of our GAN memory (see Figure 1(a)). Given a target task/dataset (e.g., Flowers, Cathedrals, or Cats), all parameters are trainable and fine-tuned to fit the target data. Although the modulation is illustrated for one FC/Conv layer, one needs to apply it to all layers in a real implementation. The compared settings are: "None", where no modulation from the target data is applied; "All", where all style parameters come from the target data ("None" and "All" are obtained in a similar way to that of Figure 2(a)); "FC", obtained by applying a newly designed FC layer for the style parameters; "Our", which is our GAN memory; "NoNorm", a modified version of our GAN memory that removes the normalization; and "NoBias", a modified version of our GAN memory that removes the bias term.

Here we discuss the detailed techniques for interpolation among different generative processes with our GAN memory, and show more examples in Figures 8, 9, 10, and 11.
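The per-layer style modulation and its ablations ("None", "NoNorm", "NoBias") can be illustrated with a minimal sketch. This is not the paper's implementation: the function name `modulate` and the normalize-then-scale-and-shift form are assumptions made for illustration, with `gamma`/`beta` standing in for the task-specific style parameters applied to a frozen weight tensor.

```python
import numpy as np

def modulate(W, gamma, beta, use_norm=True, use_bias=True):
    """Hypothetical per-layer modulation of a frozen weight W.

    gamma (scale) and beta (shift) play the role of the task-specific
    style parameters; W itself stays frozen. The flags mimic the
    ablations in Figure 7: use_norm=False ~ "NoNorm",
    use_bias=False ~ "NoBias", and gamma=1, beta=0 ~ "None".
    """
    Wm = W
    if use_norm:
        # Normalize the frozen weights before applying the style scale.
        Wm = (W - W.mean()) / (W.std() + 1e-8)
    Wm = gamma * Wm
    if use_bias:
        Wm = Wm + beta
    return Wm

# "None": identity modulation leaves the frozen weights unchanged.
W = np.array([[1.0, 2.0], [3.0, 4.0]])
W_none = modulate(W, gamma=1.0, beta=0.0, use_norm=False)

# "Our"-style setting: normalized weights rescaled and shifted per task.
W_our = modulate(W, gamma=2.0, beta=0.5)
```

In this reading, each task only stores its small set of (`gamma`, `beta`) parameters per layer, which is why the frozen base generator suffers no forgetting.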
