Divergence Frontiers for Generative Models: Sample Complexity, Quantization Effects, and Frontier Integrals
– Neural Information Processing Systems
The spectacular success of deep generative models calls for quantitative tools to measure their statistical performance. Divergence frontiers have recently been proposed as an evaluation framework for generative models, due to their ability to measure the quality-diversity trade-off inherent to deep generative modeling. We establish non-asymptotic bounds on the sample complexity of divergence frontiers.
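As a rough illustration of the object being analyzed: a divergence frontier between two discrete distributions can be traced by sweeping a mixture weight and recording a pair of KL divergences to the mixture at each weight. The sketch below is an assumption-laden toy (the function name `divergence_frontier`, the 3-symbol alphabet, and the choice of mixture weights are all illustrative, not from the paper), intended only to show the quality-diversity curve the abstract refers to.

```python
import numpy as np

def divergence_frontier(p, q, lambdas):
    """Illustrative sketch: for each mixture weight lam, form the
    mixture r = lam*p + (1-lam)*q and record the frontier point
    (KL(p || r), KL(q || r)). One coordinate tracks quality, the
    other diversity; sweeping lam traces the frontier."""
    def kl(a, b):
        # KL divergence over the support of a (0 * log 0 := 0);
        # b > 0 on that support since lam is in (0, 1).
        mask = a > 0
        return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))
    points = []
    for lam in lambdas:
        r = lam * p + (1 - lam) * q
        points.append((kl(p, r), kl(q, r)))
    return points

# Toy example on a 3-symbol alphabet (hypothetical distributions).
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.1, 0.3, 0.6])
frontier = divergence_frontier(p, q, lambdas=[0.25, 0.5, 0.75])
```

In practice `p` and `q` would be empirical distributions estimated from samples, which is exactly where the paper's sample-complexity question enters: how fast does the empirical frontier converge to the population one?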