Training-free Diffusion Model Adaptation for Variable-Sized Text-to-Image Synthesis (Supplementary Materials)

Neural Information Processing Systems

We now investigate the relation between the attention entropy and the token number. The revised code is shown in Algorithm 1. Both of them are top-ranked parameter files for download. Experiments are conducted on a server with Intel(R) Xeon(R) Gold 6226R CPUs @ 2.90GHz. We conduct a text-based pairwise preference test; the screenshot is depicted in Figure 1.
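As a rough illustration of the relation between attention entropy and token number (this is our own sketch, not the paper's Algorithm 1; the function name and uniform-logit setup are assumptions): for a softmax attention distribution that is uniform over n tokens, the Shannon entropy equals log n, so the entropy grows with the token number.

```python
import numpy as np

def attention_entropy(scores):
    """Mean Shannon entropy of softmax attention.

    scores: (num_queries, num_keys) array of unnormalized attention logits.
    """
    scores = scores - scores.max(axis=-1, keepdims=True)  # numerical stability
    p = np.exp(scores)
    p /= p.sum(axis=-1, keepdims=True)
    # Entropy per query, averaged over queries; epsilon guards log(0).
    return float(np.mean(-np.sum(p * np.log(p + 1e-12), axis=-1)))

# Uniform logits give uniform attention, whose entropy is log(n):
# the entropy increases as the number of attended tokens grows.
for n in (64, 256, 1024):
    print(n, attention_entropy(np.zeros((4, n))), np.log(n))
```

This toy case only bounds the behavior from above (uniform attention maximizes entropy); real attention maps are sharper, but the log-scale dependence on the token count is the quantity the section studies.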





Supplemental Material: A Proof for Proposition

Neural Information Processing Systems

Reversing the process is not immediately obvious, and thus several schedulers have been proposed [23, 26, 31, 58]. In this paper, we employ the DDIM scheduler [58], a popular deterministic scheduler. Other deterministic schedulers would also be suitable, and we show in Section I below that our method performs well with other schedulers.