Evidential Softmax for Sparse Multimodal Distributions in Deep Generative Models
Neural Information Processing Systems
Many applications of generative models rely on the marginalization of their high-dimensional output probability distributions. Normalization functions that yield sparse probability distributions can make exact marginalization more computationally tractable. However, sparse normalization functions usually require alternative loss functions for training since the log-likelihood is undefined for sparse probability distributions.
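To illustrate the problem the abstract describes, here is a minimal sketch (not the paper's evidential softmax) using sparsemax, a well-known sparse normalization function: it projects logits onto the probability simplex and assigns exact zeros to low-scoring entries, so the log-likelihood of those outcomes is negative infinity and the standard cross-entropy loss cannot be used directly.

```python
import numpy as np

def sparsemax(z: np.ndarray) -> np.ndarray:
    """Euclidean projection of logits z onto the probability simplex.

    Unlike softmax, the result can contain exact zeros, which is what
    makes the log-likelihood undefined on the zeroed-out outcomes.
    """
    z_sorted = np.sort(z)[::-1]          # sort logits in descending order
    cssv = np.cumsum(z_sorted)           # cumulative sums of sorted logits
    k = np.arange(1, len(z) + 1)
    support = 1 + k * z_sorted > cssv    # entries kept in the support
    k_z = k[support][-1]                 # size of the support
    tau = (cssv[k_z - 1] - 1) / k_z      # threshold shifting mass to zero
    return np.maximum(z - tau, 0.0)

logits = np.array([2.0, 1.0, -1.0])
p_soft = np.exp(logits) / np.exp(logits).sum()  # dense: all entries > 0
p_sparse = sparsemax(logits)                    # sparse: some entries == 0
```

With these logits, softmax keeps every outcome strictly positive, while sparsemax concentrates all mass on the top entry; `np.log(p_sparse)` then yields `-inf` for the zeroed outcomes, which is why sparse normalization functions are typically paired with alternative losses during training.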