Impact of Legal Requirements on Explainability in Machine Learning
Bibal, Adrien, Lognoul, Michael, de Streel, Alexandre, Frénay, Benoît
arXiv.org Artificial Intelligence
The requirements on explainability imposed by European laws and their implications for machine learning (ML) models are not always clear. In that perspective, our research (Bibal et al., Forthcoming) analyzes explanation obligations imposed for private and public decision-making, and how they can be implemented by machine learning techniques. For decisions adopted by firms or individuals, we mainly focus on requirements imposed by general European legislation applicable to all the sectors of the economy.

For decisions adopted by public authorities, two stronger requirements are studied: motivation obligations for administrations and for judges (imposed by the European Convention on Human Rights). For administrative decisions, all factual and legal grounds on which the decision is based should be provided. For judicial decisions, judges have in addition to … The objectives of those explanation requirements are twofold: first, allowing the recipients of a decision to understand …
Jul-10-2020
- Genre:
- Research Report (0.40)
- Industry:
- Information Technology > Security & Privacy (1.00)
- Law > Statutes (0.69)