Kernel Treatment Effects with Adaptively Collected Data

Houssam Zenati, Bariscan Bozkurt, Arthur Gretton

arXiv.org Machine Learning 

Adaptive experiments improve efficiency by adjusting treatment assignments based on past outcomes, but this adaptivity breaks the i.i.d. assumption that underpins classical asymptotics. At the same time, many questions of interest are distributional, extending beyond average effects. Kernel treatment effects (KTE) provide a flexible framework by representing counterfactual outcome distributions in an RKHS and comparing them via kernel distances. We present the first kernel-based framework for distributional inference under adaptive data collection. Our method combines doubly robust scores with variance stabilization to ensure asymptotic normality via a Hilbert-space martingale CLT, and introduces a sample-fitted stabilized test with valid type-I error control. Experiments show the test is well calibrated and effective for both mean shifts and higher-moment differences, outperforming adaptive baselines limited to scalar effects.
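As a rough illustration of the kernel distance underlying KTE (not the paper's doubly robust, adaptivity-corrected estimator), the sketch below computes a plain V-statistic estimate of the squared maximum mean discrepancy (MMD) between treated and control outcome samples with a Gaussian kernel; the bandwidth and sample sizes are arbitrary choices for the example.

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth=1.0):
    # k(x, y) = exp(-(x - y)^2 / (2 * bandwidth^2)) for scalar outcomes
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * bandwidth ** 2))

def mmd_squared(y_treat, y_control, bandwidth=1.0):
    """V-statistic (biased) estimate of the squared MMD between the
    treated and control outcome distributions in the Gaussian RKHS."""
    k_tt = gaussian_kernel(y_treat, y_treat, bandwidth).mean()
    k_cc = gaussian_kernel(y_control, y_control, bandwidth).mean()
    k_tc = gaussian_kernel(y_treat, y_control, bandwidth).mean()
    return k_tt + k_cc - 2.0 * k_tc

rng = np.random.default_rng(0)
y1 = rng.normal(0.0, 1.0, size=500)  # treated outcomes
y0 = rng.normal(0.0, 2.0, size=500)  # control: same mean, larger variance
print(mmd_squared(y1, y0))  # positive: the distance detects a higher-moment difference
```

Because the MMD compares full distributions rather than means, it picks up the variance difference above even though both groups share the same average outcome, which is the kind of higher-moment effect a scalar ATE test would miss.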
