AI Is Like Encryption: It Can't Be Regulated Out Of Existence
As the public grows increasingly aware of the dangers of AI algorithmic bias and concerned about surveillance and militaristic applications of deep learning, calls for AI regulation have multiplied. Whether through new laws governing AI fairness or policies constraining the use of autonomous weapons systems, the challenge confronting policymakers is that AI is very much like encryption: it is not a single controlled algorithm that can be regulated, but a portfolio of techniques that no single country controls and that researchers across the world advance every day.

The almost unimaginably rapid spread of deep learning over the past half-decade into every corner of modern life has raised existential questions about how to ensure the accurate, fair and beneficial use of this rapidly evolving technology. When it comes to biased algorithms, the fundamental fairness of current AI systems has been left largely to market forces. In turn, basic economics has ensured that free but heavily biased data wins out over costly but minimally biased data.
Sep-4-2019, 04:46:24 GMT