Fairness in Matching under Uncertainty
Devic, Siddartha, Kempe, David, Sharan, Vatsal, Korolova, Aleksandra
Systems based on algorithms and machine learning are increasingly used to guide or outright make decisions that strongly impact human lives; it is therefore imperative to take fairness into account when designing such systems. Notions of fairness in computer science can be classified into those that try to capture fairness towards a group (Hardt et al., 2016; Hébert-Johnson et al., 2018; Kearns et al., 2018; Kleinberg et al., 2017) and those that try to be fair to each individual (Dwork et al., 2012; Kim et al., 2018, 2020). In our work, we focus on the latter notion. The most widely studied notion of individual fairness is due to the seminal work of Dwork et al. (2012): it assumes that a metric space on observable features of individuals captures similarity, and requires that the outcomes of a resource allocation mechanism satisfy a certain Lipschitz continuity condition with respect to the given metric. Intuitively, this ensures that individuals who are similar according to the metric will be treated similarly by the mechanism. We consider a setting in which individuals have preferences over the outcomes of the resource allocation mechanism, focusing on the important setting of two-sided markets. Applications of this setting abound: matching students to schools, job fair participants to interviews, doctors to hospitals, patients to treatments, drivers to passengers in ride hailing, or advertisers to ad slots/users in online advertising (Abdulkadiroğlu and Sönmez, 2003; Bronfman et al., 2015; Mehta et al., 2013; Roth, 1986; Roth et al., 2007), to name a few.
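As a brief illustration of the Lipschitz continuity condition referenced in the abstract, here is a sketch of the standard Dwork et al. (2012) formulation in generic notation (the symbols $M$, $d$, and $D$ are not taken from this paper): a randomized mechanism $M$ mapping each individual $x$ to a distribution $M(x)$ over outcomes is individually fair with respect to a similarity metric $d$ if

$$D\big(M(x), M(y)\big) \;\le\; d(x, y) \qquad \text{for all individuals } x, y,$$

where $D$ is a distance between distributions (e.g., total variation distance). In words, individuals whom the metric deems similar must receive nearly identical distributions over outcomes.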
arXiv.org Artificial Intelligence
Jun-16-2023
- Country:
- North America > United States > California (0.46)
- Genre:
- Research Report (0.64)
- Industry:
- Education (1.00)
- Government > Regional Government (0.46)
- Health & Medicine (1.00)
- Information Technology > Services (0.66)
- Marketing (0.87)
- Transportation > Passenger (0.54)