Return of Unconditional Generation: A Self-supervised Representation Generation Method
Unconditional generation--the problem of modeling the data distribution without relying on human-annotated labels--is a long-standing and fundamental challenge in generative modeling, opening up the potential of learning from large-scale unlabeled data. In the literature, the generation quality of unconditional methods has been much worse than that of their conditional counterparts. This gap can be attributed to the lack of semantic information provided by labels. In this work, we show that one can close this gap by generating semantic representations in the representation space produced by a self-supervised encoder. These representations can then be used to condition the image generator.
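The two-stage pipeline the abstract describes (a frozen self-supervised encoder, a generator over its representation space, and a representation-conditioned image generator) can be sketched as below. Everything here is a toy linear stand-in under stated assumptions: the paper uses a pretrained self-supervised encoder and learned generative models for both stages, whereas this sketch fits a Gaussian over representations and uses linear maps, purely to show the interface. Names like `sample_representation` and `generate` are illustrative, not the paper's API.

```python
import numpy as np

rng = np.random.default_rng(0)
D_IMG, D_REP = 16, 4

# Frozen "self-supervised encoder": a fixed linear projection standing in
# for a real pretrained encoder.
W_enc = rng.standard_normal((D_IMG, D_REP)) / np.sqrt(D_IMG)

def encode(x):
    return x @ W_enc

# Stage 1: a representation generator fitted on encoder outputs of
# unlabeled data. Here it is just a Gaussian; the paper uses a learned
# generative model over the representation space.
unlabeled = rng.standard_normal((1000, D_IMG))
reps = encode(unlabeled)
mu, cov = reps.mean(axis=0), np.cov(reps.T)

def sample_representation(n):
    return rng.multivariate_normal(mu, cov, size=n)

# Stage 2: a representation-conditioned image generator g(noise, rep);
# the conditioning enters additively in this toy version.
W_noise = rng.standard_normal((D_IMG, D_IMG)) / np.sqrt(D_IMG)
W_cond = rng.standard_normal((D_REP, D_IMG)) / np.sqrt(D_REP)

def generate(n):
    z = rng.standard_normal((n, D_IMG))
    r = sample_representation(n)  # no human-annotated labels used anywhere
    return z @ W_noise + r @ W_cond

samples = generate(5)
print(samples.shape)  # (5, 16)
```

The point of the sketch is the data flow: labels never appear, yet the image generator still receives a semantically meaningful conditioning signal, namely a sample from a distribution fitted in the encoder's representation space.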
A Derivations of Variance Controlled Diffusion

A.1 Proof of Proposition 4.1

Proposition 4.1. For any bounded measurable function $\tau(t)\colon [0, T] \to \mathbb{R}$, the reverse SDEs
\[
\mathrm{d}x_t = \Big[f(t)\,x_t - \frac{1+\tau^2(t)}{2}\,g^2(t)\,\nabla_x \log p_t(x_t)\Big]\,\mathrm{d}t + \tau(t)\,g(t)\,\mathrm{d}\bar{w}_t
\]
share the same marginal distributions as the forward SDE. Eq. (20) is a reverse-time SDE running from T to 0; thus there are two additional minus signs in Eq. (21) before that term.

A.2 Two Reparameterizations and Exact Solution under the Exponential Integrator

In this subsection, we present the exact solution of the SDE under both the data-prediction and the noise-prediction reparameterization. The noise term under data prediction has smaller variance than under noise prediction, implying the necessity of adopting the data-prediction reparameterization for the SDE sampler. The computation of the variance uses the Itô isometry, a crucial property of the Itô integral. Similar to Proposition 4.2, Eq. (37) can be solved analytically, as shown in the following proposition. Following the derivation in Proposition 4.2, the mean of the Itô integral term is zero, since an Itô integral with a deterministic integrand has zero expectation.

A.2.4 Comparison between Data and Noise Reparameterizations

In Table 1 we perform an ablation study on the data and noise reparameterizations. The experimental results show that, under the same magnitude of stochasticity, the proposed SA-Solver with the data reparameterization converges better, which leads to better FID results at the same NFEs. In this subsection, we provide a theoretical view of this phenomenon.
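The variance comparison invoked above rests on the Itô isometry, which for a square-integrable deterministic integrand $\sigma$ reads:
\[
\mathbb{E}\!\left[\left(\int_s^t \sigma(u)\,\mathrm{d}w_u\right)^{\!2}\right] = \int_s^t \sigma^2(u)\,\mathrm{d}u .
\]
The variance of the noise term under each reparameterization is therefore obtained by integrating the squared coefficient of $\mathrm{d}w_u$ in the corresponding exact solution, which is how the two reparameterizations can be compared analytically.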
Improving the Training of Rectified Flows
One approach to tackling this problem is rectified flows, which iteratively learn smooth ODE paths that are less susceptible to truncation error. However, rectified flows still require a relatively large number of function evaluations (NFEs). In this work, we propose improved techniques for training rectified flows, allowing them to compete with knowledge-distillation methods even in the low-NFE setting.
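The core rectified-flow training objective underlying these techniques can be sketched as follows: sample a pair (x0, x1) of noise and data, interpolate linearly, and regress a velocity model onto the straight-line velocity x1 - x0. This is a minimal sketch of the standard objective, not the paper's improved recipe; the `velocity_field` baseline below is a hypothetical stand-in for a trained network.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 2

def rectified_flow_loss(velocity_field, x0, x1):
    """One training step's loss: v(x_t, t) should match the constant
    velocity x1 - x0 of the straight path from x0 to x1."""
    t = rng.uniform(size=(x0.shape[0], 1))
    xt = (1.0 - t) * x0 + t * x1   # linear interpolation between noise and data
    target = x1 - x0               # velocity of the straight-line path
    pred = velocity_field(xt, t)
    return np.mean((pred - target) ** 2)

# Toy setup: noise samples and shifted "data" samples.
x0 = rng.standard_normal((256, DIM))
x1 = rng.standard_normal((256, DIM)) + 3.0

# Hypothetical baseline model: always predicts the mean displacement.
mean_disp = (x1 - x0).mean(axis=0)
loss = rectified_flow_loss(lambda xt, t: np.broadcast_to(mean_disp, xt.shape),
                           x0, x1)
print(f"loss = {loss:.3f}")
```

Because the regression target is the velocity of a straight path, a perfectly trained model yields nearly straight ODE trajectories, which is what lets rectified flows be integrated accurately with few function evaluations.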