The Social Cost of Strategic Classification
Smitha Milli, John Miller, Anca D. Dragan, Moritz Hardt
As machine learning increasingly supports consequential decision making, its vulnerability to manipulation and gaming is of growing concern. When individuals learn to adapt their behavior to the specifics of a statistical decision rule, its original predictive power deteriorates. This widely observed empirical phenomenon, known as Campbell's Law or Goodhart's Law, is often summarized as: "Once a measure becomes a target, it ceases to be a good measure" [25]. Institutions using machine learning to make high-stakes decisions naturally wish to make their classifiers robust to strategic behavior. A growing line of work has sought algorithms that achieve higher utility for the institution in settings where we anticipate a strategic response from the classified individuals [10, 5, 14]. Broadly speaking, the resulting solution concepts correspond to more conservative decision boundaries that increase robustness to some form of covariate shift.
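The dynamic described above can be illustrated with a small sketch. This is a stylized one-dimensional toy model (my assumption, not the paper's exact formulation): each individual has a true score `x`, gains utility 1 from a positive classification, and pays a linear cost per unit of feature change; the institution classifies positively above a threshold. Raising the threshold is the "more conservative decision boundary" the abstract refers to.

```python
def best_response(x, threshold, cost_per_unit=0.5):
    """Strategic agent: move up to the threshold iff the gain (1) exceeds the cost.

    Hypothetical linear-cost model, chosen only for illustration.
    """
    gap = threshold - x
    if gap <= 0:
        return x                       # already classified positive
    if cost_per_unit * gap <= 1:       # gaming is worth the cost
        return threshold
    return x                           # too expensive to game

def institution_accuracy(xs, labels, threshold, strategic=False):
    """Fraction of correct decisions at a given threshold."""
    correct = 0
    for x, y in zip(xs, labels):
        feat = best_response(x, threshold) if strategic else x
        pred = 1 if feat >= threshold else 0
        correct += (pred == y)
    return correct / len(xs)

# Toy population: the true label is 1 iff x >= 2.
xs = [0.0, 1.0, 1.8, 2.0, 3.0, 4.0]
labels = [0, 0, 0, 1, 1, 1]

naive = institution_accuracy(xs, labels, threshold=2.0, strategic=False)
gamed = institution_accuracy(xs, labels, threshold=2.0, strategic=True)
conservative = institution_accuracy(xs, labels, threshold=4.0, strategic=True)
print(naive, gamed, conservative)  # 1.0 0.5 1.0
```

In this toy example, the non-strategic classifier at threshold 2 is perfect, but once individuals best-respond, every negative can cheaply game past the boundary and accuracy collapses to 0.5. A conservative threshold of 4 restores accuracy because only true positives can afford the move, though they now bear the cost of that move, which is the social cost the paper's title points to.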
Aug-25-2018