Supplementary Material for "DECAF: Generating Fair Synthetic Data Using Causally-Aware Generative Networks"

Neural Information Processing Systems 

The bottom graph is a historical example of unfairness: even if there were no bias between Loan and Race, redlining (i.e., the practice of refusing a loan to people living in certain areas) would discriminate indirectly based on race [1,2,3,4]. This example also shows why simply removing or not measuring a sensitive attribute does not suffice: not only does this ignore indirect bias, but hiding the protected attribute leads to an (additional) correlation between Postcode and Loan due to confounding. In Table 1, we observe that naively removing the protected attribute only ensures FTU fairness, as shown by GAN-PR, WGAN-GP-PR, and DECAF-PR. This is a direct result of the construction of generator $G$ and follows an argument similar to Proposition 2 of [6]. Given that each $G_i$ (see Eq. 2 of the paper) has enough capacity to model $P(X_i \mid \{X_j : (X_j \to X_i) \in E\})$, $G$ can thus express the full distribution $P_X(X)$.
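The redlining argument can be made concrete with a small simulation. The sketch below (hypothetical numbers, not from the paper) builds the chain Race → Postcode → Loan with no direct Race → Loan edge, and shows that Loan nevertheless remains correlated with Race, so dropping the Race column from the data does not remove the indirect bias:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical causal chain Race -> Postcode -> Loan (no direct Race -> Loan edge).
race = rng.binomial(1, 0.5, n)
postcode = rng.binomial(1, 0.2 + 0.6 * race)   # postcode depends on race
loan = rng.binomial(1, 0.8 - 0.5 * postcode)   # loan decided from postcode only

# Even though the loan decision never reads Race directly, Loan is still
# strongly correlated with Race through Postcode; hiding the Race column
# would leave this dependence intact.
corr = np.corrcoef(race, loan)[0, 1]
print(f"corr(Race, Loan) = {corr:.2f}")
```

This mirrors the FTU observation above: a generator (or decision rule) that simply omits the protected attribute still reproduces the indirect pathway through its proxies.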
