Total Variation Distance Estimation Is as Easy as Probabilistic Inference

Bhattacharyya, Arnab, Gayen, Sutanu, Meel, Kuldeep S., Myrisiotis, Dimitrios, Pavan, A., Vinodchandran, N. V.

arXiv.org Artificial Intelligence 

Machine learning and data science heavily rely on probability distributions that are widely used to capture dependencies among a large number of variables. Such high-dimensional distributions naturally appear in various domains, including neuroscience [ROL02, CTY06], bioinformatics [BB01], text and image processing [Mur22], and causal inference [Pea09]. Substantial research has been devoted to developing models that represent high-dimensional probability distributions succinctly. One prevalent approach is through graphical models. In a graphical model, a graph describes the conditional dependencies among variables, and the probability distribution factorizes according to the adjacency relationships in the graph [KF09]. When the underlying graph is directed, the model is known as a Bayesian network, or Bayes net. Two fundamental computational tasks on distributions are distance computation and probabilistic inference. In this work, we establish a novel connection between these two seemingly different computational tasks.
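To make the two tasks concrete, here is a minimal sketch (an illustration, not the paper's algorithm): a two-node Bayes net X1 → X2 over binary variables, whose joint factorizes as P(x1, x2) = P(x1) · P(x2 | x1), together with the total variation distance TV(P, Q) = ½ Σ_x |P(x) − Q(x)| computed by brute-force enumeration. The function names and parameter values are hypothetical.

```python
from itertools import product

def joint(p_x1, p_x2_given_x1):
    """Enumerate the joint of a two-node Bayes net X1 -> X2.

    p_x1: probability that X1 = 1.
    p_x2_given_x1: dict mapping x1 in {0, 1} to P(X2 = 1 | X1 = x1).
    """
    return {
        (x1, x2): (p_x1 if x1 else 1 - p_x1)
        * (p_x2_given_x1[x1] if x2 else 1 - p_x2_given_x1[x1])
        for x1, x2 in product([0, 1], repeat=2)
    }

def tv_distance(p, q):
    """Total variation distance: half the L1 distance between the joints."""
    return 0.5 * sum(abs(p[x] - q[x]) for x in p)

# Two Bayes nets that differ only in P(X2 = 1 | X1 = 1).
P = joint(0.5, {0: 0.2, 1: 0.9})
Q = joint(0.5, {0: 0.2, 1: 0.8})
print(tv_distance(P, Q))  # 0.05 (up to floating point)
```

Brute-force enumeration is exponential in the number of variables, which is exactly why reductions to probabilistic inference, as studied in this paper, matter for high-dimensional distributions.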
