C-MI-GAN : Estimation of Conditional Mutual Information using MinMax formulation

Arnab Kumar Mondal, Arnab Bhattacharya, Sudipto Mukherjee, Prathosh AP, Sreeram Kannan, Himanshu Asnani

arXiv.org Machine Learning 

Estimation of information-theoretic quantities such as mutual information and its conditional variant has drawn interest in recent times owing to their multifaceted applications. Newly proposed neural estimators for these quantities have overcome severe drawbacks of classical kNN-based estimators in high dimensions. In this work, we focus on conditional mutual information (CMI) estimation by utilizing its formulation as a minmax optimization problem.

Two noteworthy quantities of widespread interest are the mutual information (MI) and the conditional mutual information (CMI). In this work, we focus on estimating CMI, a quantity which provides the degree of dependence between two random variables X and Y given a third variable Z. CMI provides a strong theoretical guarantee: I(X; Y | Z) = 0 if and only if X ⊥ Y | Z. So, one motivation for estimating CMI is its use in conditional independence (CI) testing and in detecting causal associations. A CI tester built using a kNN-based CMI estimator coupled with …
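The guarantee that I(X; Y | Z) = 0 exactly when X and Y are conditionally independent given Z can be checked directly in the discrete case, where CMI is computable in closed form from the joint distribution. The sketch below is illustrative only and is not the paper's neural minmax estimator; the function name and example distributions are our own.

```python
import numpy as np

def conditional_mutual_information(pxyz):
    """I(X; Y | Z) in nats, from a joint pmf array p[x, y, z].

    Uses the identity
        I(X; Y | Z) = sum_{x,y,z} p(x,y,z) * log( p(x,y,z) p(z)
                                                   / (p(x,z) p(y,z)) ).
    """
    pz = pxyz.sum(axis=(0, 1))   # marginal p(z)
    pxz = pxyz.sum(axis=1)       # marginal p(x, z)
    pyz = pxyz.sum(axis=0)       # marginal p(y, z)
    cmi = 0.0
    for x in range(pxyz.shape[0]):
        for y in range(pxyz.shape[1]):
            for z in range(pxyz.shape[2]):
                p = pxyz[x, y, z]
                if p > 0:
                    cmi += p * np.log(p * pz[z] / (pxz[x, z] * pyz[y, z]))
    return cmi

# Conditionally independent case: p(x,y,z) = p(z) p(x|z) p(y|z).
pz = np.array([0.4, 0.6])
px_given_z = np.array([[0.7, 0.2], [0.3, 0.8]])   # px_given_z[x, z]
py_given_z = np.array([[0.1, 0.9], [0.9, 0.1]])   # py_given_z[y, z]
p_ci = np.einsum('z,xz,yz->xyz', pz, px_given_z, py_given_z)

# Dependent case: given Z, X is uniform and Y is a copy of X,
# so I(X; Y | Z) = H(X | Z) = log 2.
p_dep = np.zeros((2, 2, 2))
for z in range(2):
    for x in range(2):
        p_dep[x, x, z] = pz[z] * 0.5

print(conditional_mutual_information(p_ci))   # ~0: X and Y are CI given Z
print(conditional_mutual_information(p_dep))  # ~log 2, strictly positive
```

For the conditionally independent joint, the log term vanishes pointwise, so the estimate is exactly zero up to floating-point error; the dependent case returns log 2 ≈ 0.693 nats. This is the population quantity that a CI tester thresholds in practice.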
