Learning Fair Canonical Polyadic Decompositions using a Kernel Independence Criterion
This work proposes to learn fair low-rank tensor decompositions by regularizing the Canonical Polyadic Decomposition factorization with the kernel Hilbert-Schmidt independence criterion (KHSIC). It is shown, theoretically and empirically, that a small KHSIC between a latent factor and the sensitive features guarantees approximate statistical parity. The proposed algorithm surpasses the state-of-the-art algorithm, FATR (Zhu et al., 2018), in controlling the trade-off between fairness and residual fit on synthetic and real data sets.

Tensor factorizations are used in many machine learning applications, including link prediction (Dunlavy et al., 2011), clustering (Shashua et al., 2006), and recommendation (Kutty et al., 2012), where they are used to find vector representations (embeddings) of entities. With the widespread use of tensor factorization, we hope that decisions made from tensor data are both accurate and fair.
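To make the regularizer concrete, the following is a minimal sketch of an empirical (biased) HSIC estimate between a factor matrix and sensitive features, using the standard formula HSIC = tr(K H L H) / (n-1)^2 with Gaussian kernels and the centering matrix H = I - 11ᵀ/n. The function name, bandwidth parameter, and kernel choice are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC between paired samples X (n x p) and Y (n x q).

    Uses Gaussian kernels with bandwidth `sigma` (an assumed default,
    not taken from the paper). Returns a nonnegative scalar; values near
    zero indicate approximate statistical independence.
    """
    n = X.shape[0]

    def gram(Z):
        # Pairwise squared Euclidean distances, then Gaussian kernel.
        sq = np.sum(Z ** 2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
        return np.exp(-d2 / (2.0 * sigma ** 2))

    K, L = gram(X), gram(Y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return float(np.trace(K @ H @ L @ H)) / (n - 1) ** 2
```

In a fair-decomposition setting, a penalty of this form would be evaluated between one CP factor matrix and the matrix of sensitive attributes and added to the reconstruction loss; driving it toward zero encourages the learned embeddings to carry little information about the sensitive features.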
Apr-27-2021