Supplement: Scalable and Stable Surrogates for Flexible Classifiers with Fairness Constraints

Neural Information Processing Systems 

All relaxations are optimized via our Lagrangian framework. All code was implemented in PyTorch and optimized using L-BFGS, with an initial learning rate of 0.1. On the right, the difference framework is used to achieve equality of opportunity on COMPAS. Here we define equality of opportunity in terms of false negative rates. Setting s = b, however, causes the linear relaxation to degenerate. For our deep learning experiments, we used the approach of Sec.
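As a rough illustration of the setup described above, the following is a minimal sketch of a Lagrangian relaxation for equality of opportunity (a sigmoid-relaxed false-negative-rate gap between two groups) on a linear classifier, optimized with PyTorch's L-BFGS at the stated initial learning rate of 0.1. The synthetic data, the fixed multiplier `lam`, and the helper `soft_fnr` are illustrative assumptions, not the paper's actual implementation.

```python
import torch

torch.manual_seed(0)

# Synthetic data (illustrative only): features X, labels y, binary group g.
n, d = 200, 5
X = torch.randn(n, d)
y = (torch.rand(n) < 0.5).float()
g = (torch.rand(n) < 0.5).float()

# Linear classifier parameters.
w = torch.zeros(d, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
lam = 1.0  # fixed Lagrange multiplier for this sketch

def soft_fnr(scores, y, mask):
    # Sigmoid relaxation of the false-negative rate over positives in a group.
    pos = mask * y
    return (pos * torch.sigmoid(-scores)).sum() / pos.sum().clamp(min=1.0)

def objective():
    scores = X @ w + b
    loss = torch.nn.functional.binary_cross_entropy_with_logits(scores, y)
    # Equality-of-opportunity penalty: gap in relaxed FNRs between groups.
    gap = soft_fnr(scores, y, g) - soft_fnr(scores, y, 1 - g)
    return loss + lam * gap.abs()

opt = torch.optim.LBFGS([w, b], lr=0.1, max_iter=50)

def closure():
    opt.zero_grad()
    loss = objective()
    loss.backward()
    return loss

opt.step(closure)
```

In a full Lagrangian scheme the multiplier would itself be updated (e.g. by gradient ascent on the constraint violation) rather than held fixed as here.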
