Function Trees: Transparent Machine Learning

Friedman, Jerome H.

arXiv.org Machine Learning

A fundamental exercise in machine learning is the approximation of a function of several to many variables, given values of the function, often contaminated with noise, at observed joint values of the input variables. The result can then be used to estimate unknown function values given corresponding inputs. The goal is to accurately estimate the underlying (non-noisy) outcome values, since the noise is by definition unpredictable. To the extent that this is successful, the estimated function may, in addition, be used to try to understand the underlying phenomena giving rise to the data. Even when prediction accuracy is the dominant concern, being able to comprehend the way in which the input variables jointly combine to produce predictions can provide important sanity checks on the validity of the function estimate. Besides accuracy, the success of this latter exercise requires that the structure of the function estimate be represented in a comprehensible form.
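The setup the abstract describes, recovering the underlying non-noisy function from noisy observations, can be illustrated with a minimal numpy sketch. This is not the paper's function-tree method; the polynomial model, noise level, and sample size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=500)
y_true = 2.0 * x ** 2 - x                      # underlying (non-noisy) function
y = y_true + 0.1 * rng.normal(size=x.shape)    # observed values, contaminated with noise

# Least-squares fit of a degree-2 polynomial as the function estimate
# (stand-in for a generic learner; coefficients are highest-degree first)
coeffs = np.polyfit(x, y, deg=2)
y_hat = np.polyval(coeffs, x)

# The estimate should track the noise-free target, not the noise itself
rmse_vs_truth = np.sqrt(np.mean((y_hat - y_true) ** 2))
```

Because the noise averages out across observations, the fitted function lands much closer to the noise-free target than any individual noisy observation does, which is the sense in which the noise is "unpredictable" but the underlying function is estimable.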


A Kernel Test for Three-Variable Interactions

Sejdinovic, Dino, Gretton, Arthur, Bergsma, Wicher

Neural Information Processing Systems

We introduce kernel nonparametric tests for Lancaster three-variable interaction and for total independence, using embeddings of signed measures into a reproducing kernel Hilbert space. The resulting test statistics are straightforward to compute, and are used in powerful three-variable interaction tests, which are consistent against all alternatives for a large family of reproducing kernels. We show the Lancaster test to be sensitive to cases where two independent causes individually have weak influence on a third dependent variable, but their combined effect has a strong influence. This makes the Lancaster test especially suited to finding structure in directed graphical models, where it outperforms competing nonparametric tests in detecting such V-structures.
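The empirical Lancaster interaction statistic can be sketched in a few lines of numpy: it is the mean of the entrywise product of the three doubly centered Gram matrices, one per variable. This is a minimal illustrative sketch, not the authors' implementation; the Gaussian kernel, fixed bandwidth, and the V-structure example (z = x·y, so each cause alone is only weakly informative about z) are assumptions for illustration.

```python
import numpy as np

def centered_gram(x, sigma=1.0):
    """Doubly centered Gaussian RBF Gram matrix H K H, with H = I - (1/n) 11^T."""
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    n = K.shape[0]
    H = np.eye(n) - np.full((n, n), 1.0 / n)
    return H @ K @ H

def lancaster_statistic(x, y, z, sigma=1.0):
    # Empirical statistic: mean over all (i, j) of the entrywise product
    # of the three centered Gram matrices.
    Kx, Ky, Kz = (centered_gram(v, sigma) for v in (x, y, z))
    return float(np.mean(Kx * Ky * Kz))

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=(n, 1))
y = rng.normal(size=(n, 1))
z_dep = x * y + 0.1 * rng.normal(size=(n, 1))  # V-structure: z driven by (x, y) jointly
z_ind = rng.normal(size=(n, 1))                # totally independent baseline

stat_dep = lancaster_statistic(x, y, z_dep)
stat_ind = lancaster_statistic(x, y, z_ind)
```

In practice the statistic is calibrated by a permutation test (shuffling one variable to simulate the null); here the dependent triple yields a clearly larger value than the independent baseline, which is the behavior the test exploits to detect V-structures.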