Exact inference and learning for cumulative distribution functions on loopy graphs
Nebojsa Jojic, Chris Meek, Jim C. Huang
Neural Information Processing Systems
Probabilistic graphical models use local factors to represent dependence among sets of variables. For many problem domains, such as climatology and epidemiology, we may wish to model not only local dependencies but also heavy-tailed statistics, where extreme deviations should not be treated as outliers. Specifying such distributions using graphical models for probability density functions (PDFs) generally leads to intractable inference and learning. Cumulative distribution networks (CDNs) provide a means of tractably specifying multivariate heavy-tailed models as a product of cumulative distribution functions (CDFs). Currently, however, algorithms for inference and learning, which correspond to computing mixed derivatives, are exact only for tree-structured graphs.
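
The core computation the abstract refers to, recovering a joint density from a product of CDF factors by differentiating once with respect to each variable, can be illustrated on a toy two-variable model. The sketch below is not the authors' algorithm; it uses brute-force symbolic differentiation in Python/SymPy, and the heavy-tailed Fréchet-type factors are illustrative assumptions rather than models taken from the paper.

    # Minimal CDN sketch: the joint CDF is a product of CDF factors, and the
    # joint PDF is the mixed derivative of that product with respect to all
    # variables. Factor choices here are illustrative, not from the paper.
    import sympy as sp

    x1, x2 = sp.symbols("x1 x2", positive=True)

    # A bivariate extreme-value CDF factor with Frechet margins and a
    # univariate unit Frechet CDF factor; both are heavy-tailed.
    phi_a = sp.exp(-sp.sqrt(x1**-2 + x2**-2))
    phi_b = sp.exp(-1 / x2)

    # CDN model: the joint CDF is the product of the local CDF factors.
    F = phi_a * phi_b

    # Inference: the joint density is the mixed derivative d^2 F / (dx1 dx2).
    pdf = sp.simplify(sp.diff(F, x1, x2))
    print(pdf)
    print(pdf.subs({x1: 2, x2: 3}).evalf())  # density at a sample point

For more than a handful of variables, brute-force symbolic differentiation of the full product becomes impractical, which is why the mixed derivatives are instead computed by the derivative-passing algorithms the abstract describes.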