Towards Diverse and Faithful One-shot Adaption of Generative Adversarial Networks
–Neural Information Processing Systems
In this paper, we present a novel one-shot generative domain adaption method, i.e., DiFa, for diverse generation and faithful adaptation. For global-level adaptation, we leverage the difference between the CLIP embedding of the reference image and the mean embedding of source images to constrain the target generator. For local-level adaptation, we introduce an attentive style loss which aligns each intermediate token of the adapted image with the corresponding token of the reference image.
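The two objectives described above can be sketched in code. This is a minimal illustration, not the paper's implementation: the function names, signatures, and the softmax weighting in the token loss are assumptions, and precomputed tensors stand in for CLIP image embeddings and intermediate tokens.

```python
import torch
import torch.nn.functional as F

def global_direction_loss(e_ref, e_src_mean, e_adapt, e_src):
    """Global-level adaptation: align the per-sample adaptation direction
    with the reference-minus-mean-source direction in CLIP embedding space."""
    d_target = e_ref - e_src_mean      # direction from source domain to reference image
    d_sample = e_adapt - e_src         # direction from a source image to its adapted version
    # Cosine similarity of 1 means the sample moves along the target direction.
    return 1.0 - F.cosine_similarity(d_sample, d_target, dim=-1).mean()

def attentive_token_loss(tokens_adapt, tokens_ref):
    """Local-level adaptation: align each intermediate token of the adapted
    image with the corresponding reference token. The softmax weighting that
    emphasizes poorly aligned tokens is an illustrative assumption."""
    sim = F.cosine_similarity(tokens_adapt, tokens_ref, dim=-1)  # (batch, n_tokens)
    weights = F.softmax(1.0 - sim, dim=-1)                       # attend to misaligned tokens
    return ((1.0 - sim) * weights).sum(dim=-1).mean()
```

In practice the embeddings would come from a frozen CLIP image encoder, and the intermediate tokens from its transformer layers; both losses reach zero when the adapted outputs move exactly along the target direction and match the reference tokens.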