Differentially Private Post-Processing for Fair Regression
Xian, Ruicheng, Li, Qiaobo, Kamath, Gautam, Zhao, Han
–arXiv.org Artificial Intelligence
Prediction and forecasting models trained with machine learning algorithms are ubiquitous in real-world applications, and their performance hinges on the availability and quality of training data, often collected from end-users or customers. This reliance on data has raised ethical concerns, including fairness and privacy. Models trained on past data may propagate and exacerbate historical biases against disadvantaged demographics, producing less favorable predictions (Bolukbasi et al., 2016; Buolamwini and Gebru, 2018) and resulting in unfair treatment and outcomes, especially in areas such as criminal justice, healthcare, and finance (Barocas and Selbst, 2016; Berk et al., 2021). Models also risk leaking highly sensitive private information from the training data collected for these applications (Dwork and Roth, 2014). While there has been significant effort to address these concerns, few works treat them in combination, i.e., design algorithms that train fair models in a privacy-preserving manner. One difficulty is that privacy and fairness may be incompatible: exactly achieving group fairness criteria such as statistical parity or equalized odds requires precise (estimates of) group-level statistics, whereas ensuring privacy under the notion of differential privacy permits only noisy statistics. Resorting to approximate fairness, prior work has proposed private learning algorithms for reducing disparity, but the focus has been on the classification setting (Jagielski et al., 2019; Xu et al., 2019; Mozannar et al., 2020; Tran et al., 2021). In this paper, we propose and analyze a differentially private post-processing algorithm for learning attribute-aware fair regressors under the squared loss, with respect to the fairness notion of statistical parity.
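To illustrate the general idea of differentially private post-processing for statistical parity in regression, here is a minimal toy sketch: it releases each group's mean prediction through the Laplace mechanism and shifts every group's predictions toward the common noisy mean, reducing the disparity in average outputs across groups. This is an assumption-laden simplification for intuition only; it is not the paper's algorithm, and the function name, parameters, and budget accounting are hypothetical.

```python
import numpy as np

def dp_fair_postprocess(preds, groups, epsilon, pred_range, rng=None):
    """Toy DP post-processing for statistical parity in regression.

    Shifts each group's predictions so that the group means align with
    the average of the noisy group means. Illustrative sketch only --
    not the algorithm proposed in the paper.
    """
    rng = np.random.default_rng(rng)
    lo, hi = pred_range
    preds = np.clip(np.asarray(preds, dtype=float), lo, hi)
    groups = np.asarray(groups)
    out = preds.copy()

    noisy_means = {}
    for g in np.unique(groups):
        mask = groups == g
        n = int(mask.sum())
        # Sensitivity of a mean over n points bounded in [lo, hi] is
        # (hi - lo) / n; groups are disjoint, so by parallel composition
        # each noisy group mean can use the full epsilon budget.
        sens = (hi - lo) / n
        noisy_means[g] = preds[mask].mean() + rng.laplace(scale=sens / epsilon)

    # Common target: unweighted average of the noisy group means.
    target = np.mean(list(noisy_means.values()))
    for g, m in noisy_means.items():
        out[groups == g] += target - m
    return np.clip(out, lo, hi)
```

With predictions bounded in a known range, only the group means touch the data, so the Laplace noise scale shrinks as 1/n per group; the residual disparity after the shift is driven entirely by that noise, illustrating the privacy-fairness tension the abstract describes.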
May-7-2024