Here's how we fix the Tay problem
Microsoft's intelligent chatbot Tay behaved badly last week (and this week too), but that shouldn't have shocked any of us. Interestingly, Microsoft has been operating a similarly designed service in China called Xiaoice ("little Bing"), which is most likely a step toward replacing elements of customer service, and it has proved quite successful.

Fortunately, we have a statistical learning paradigm (Bayesian statistics) that we've only been able to implement at scale in the last few years, thanks to recent advances in simulation methods. It forces human assumptions to be explicit in the mathematics, reducing the potential for the unintentional human bias that still occurs in scientific research today (p-values are an excellent example of this insanity).
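The contrast with p-values can be made concrete with a toy example. Below is a minimal sketch (my own illustration, not from any Microsoft system) of a Beta-Binomial model in Python, where the prior Beta(1, 9) is an explicit, inspectable assumption written directly into the math, rather than an implicit choice buried in a significance threshold:

```python
# Minimal sketch: a Beta-Binomial model. The prior encodes our assumption
# explicitly instead of hiding it behind a significance threshold.
# All numbers here are illustrative.

def posterior_beta(prior_a, prior_b, successes, trials):
    """Conjugate update: Beta prior + Binomial likelihood -> Beta posterior."""
    return prior_a + successes, prior_b + (trials - successes)

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Explicit assumption: we believe success is rare, so we choose Beta(1, 9).
# Anyone reading the model can see and challenge that choice.
a, b = posterior_beta(1, 9, successes=3, trials=10)
print(beta_mean(a, b))  # posterior mean: 0.2
```

The point is not the arithmetic but the transparency: changing the prior changes the answer in a way anyone can audit, whereas the assumptions behind a p-value cutoff usually go unexamined.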
Apr-2-2016, 16:30:29 GMT
- Technology