"Ban the Box" Helps Ex-Cons Find Work, but May Hurt Prospects for Law-Abiding Blacks

Mother Jones

As a bipartisan consensus on the need for criminal justice reform has solidified in recent years, one of the changes advocates have pushed for is "banning the box"--that is, removing from job applications the box people must check if they have had a felony conviction. Ban-the-box laws don't prevent employers from asking applicants about a criminal record, but rather delay the questioning until later in the process, after an applicant has made it past that first hurdle. The idea is that applicants with criminal records can then get an honest opportunity for consideration, as opposed to being eliminated from the get-go. Ban the box has also been touted by civil rights groups as a way to reduce unemployment among young black men (who disproportionately have criminal records) and thereby to lessen the racial employment gap. Twenty-three states have passed ban-the-box laws that apply to public employers, while nine also apply the policy to private employers.


Fast Threshold Tests for Detecting Discrimination

arXiv.org Machine Learning

Threshold tests have recently been proposed as a useful method for detecting bias in lending, hiring, and policing decisions. For example, in the case of credit extensions, these tests aim to estimate the bar for granting loans to white and minority applicants, with a higher inferred threshold for minorities indicative of discrimination. This technique, however, requires fitting a complex Bayesian latent variable model for which inference is often computationally challenging. Here we develop a method for fitting threshold tests that is two orders of magnitude faster than the existing approach, reducing computation from hours to minutes. To achieve these performance gains, we introduce and analyze a flexible family of probability distributions on the interval [0, 1] -- which we call discriminant distributions -- that is computationally efficient to work with. We demonstrate our technique by analyzing 2.7 million police stops of pedestrians in New York City.
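The abstract's core idea can be illustrated with a toy forward model. This is not the paper's Bayesian estimator or its discriminant distributions; it is a minimal sketch, assuming Beta-distributed latent risk, of the two quantities a threshold test observes per group (action rate and hit rate) and of how a lower threshold for one group shows up in those observables. All names and parameter values here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def observed_rates(alpha, beta, threshold, n=200_000):
    """Toy forward model behind a threshold test.

    Each individual in a group carries a latent risk drawn from a
    Beta(alpha, beta) distribution. The decision-maker acts (e.g. stops
    or searches) whenever that risk exceeds the group's threshold. We
    return the action rate and the hit rate (fraction of actions that
    succeed), the two quantities a threshold test gets to observe.
    """
    risk = rng.beta(alpha, beta, size=n)
    acted = risk > threshold
    action_rate = acted.mean()
    # Each action succeeds with probability equal to the latent risk.
    hits = rng.random(n) < risk
    hit_rate = hits[acted].mean()
    return action_rate, hit_rate

# Same risk distribution for both groups, but a lower threshold applied
# to group B -- the signature of discrimination a threshold test infers.
rate_a, hits_a = observed_rates(2, 5, threshold=0.5)
rate_b, hits_b = observed_rates(2, 5, threshold=0.3)
```

Under the lower threshold, group B is acted on more often and with a lower hit rate. Inverting this mapping, from observed rates back to latent thresholds, is the computationally hard inference problem the paper's method speeds up.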


Do women face age discrimination in the job market? Absolutely. Here's proof

Los Angeles Times

An aging population, coupled with low employment rates among Americans older than 62, poses severe challenges to the long-term sustainability of Social Security. Numerous reforms have been proposed to extend Americans' working lives, including raising the retirement age. Such reforms may be unlikely to gain traction -- not because people are so eager to retire, but because age discrimination sharply limits job opportunities. After decades of debate, most labor economists today accept that discrimination has played a role in limiting job market opportunities for minorities and women. There's been a steady buildup of evidence that is hard to refute.


Fair Inference on Outcomes

arXiv.org Machine Learning / AAAI Conferences

In this paper, we consider the problem of fair statistical inference involving outcome variables. Examples include classification and regression problems, and estimating treatment effects in randomized trials or observational data. The issue of fairness arises in such problems where some covariates or treatments are "sensitive," in the sense of having the potential to create discrimination. We argue that the presence of discrimination can be formalized in a sensible way as the presence of an effect of a sensitive covariate on the outcome along certain causal pathways, a view that generalizes (Pearl, 2009). A fair outcome model can then be learned by solving a constrained optimization problem. We discuss a number of complications that arise in classical statistical inference due to this view and provide workarounds based on recent work in causal and semi-parametric inference.
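The "constrained optimization" framing can be made concrete with a deliberately simplified sketch. This is not the authors' path-specific causal method: it swaps in a cruder fairness criterion (penalizing the gap in mean predicted score between groups, a demographic-parity-style constraint) on synthetic data, purely to show what learning an outcome model subject to a fairness penalty looks like. All variable names and constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic data: a sensitive covariate `a` shifts a legitimate feature
# `x`, and the outcome `y` depends on both, so an unconstrained model's
# predicted scores differ sharply across the two groups.
n = 4000
a = rng.integers(0, 2, n)
x = rng.normal(a * 1.5, 1.0, n)
y = (rng.random(n) < sigmoid(1.2 * x + 1.0 * a - 1.0)).astype(float)
X = np.column_stack([x, a, np.ones(n)])
m1, m0 = a == 1, a == 0

def fit(lam, steps=3000, lr=0.1):
    """Logistic regression whose loss adds lam * gap**2, where gap is
    the difference in mean predicted score between groups. lam = 0
    recovers ordinary (unconstrained) maximum likelihood."""
    w = np.zeros(3)
    for _ in range(steps):
        p = sigmoid(X @ w)
        grad_ll = X.T @ (p - y) / n            # log-loss gradient
        gap = p[m1].mean() - p[m0].mean()
        s = p * (1 - p)                        # d sigmoid / d logit
        dgap = (X[m1] * s[m1, None]).mean(0) - (X[m0] * s[m0, None]).mean(0)
        w -= lr * (grad_ll + lam * 2 * gap * dgap)
    p = sigmoid(X @ w)
    return w, p[m1].mean() - p[m0].mean()

_, gap_free = fit(lam=0.0)   # unconstrained fit: large score gap
_, gap_fair = fit(lam=5.0)   # penalized fit: gap driven toward zero
```

The paper's contribution is a far more discriminating constraint: it penalizes only the effect of the sensitive covariate along specific causal pathways, rather than the overall score gap, which the toy penalty above cannot distinguish from legitimate differences transmitted through `x`.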