Facebook claims its new chatbot beats Google's as the best in the world
Blender's ability comes from the immense scale of its training data. It was first trained on 1.5 billion publicly available Reddit conversations to give it a foundation for generating responses in a dialogue. It was then fine-tuned with additional data sets for each of three skills: conversations that contained some kind of emotion, to teach it empathy (if a user says "I got a promotion," for example, it can say, "Congratulations!"); information-dense conversations with an expert, to teach it knowledge; and conversations between people with distinct personas, to teach it personality. The resulting model is 3.6 times larger than Google's chatbot Meena, which was announced in January. It is so big that it can't fit on a single device and must run across two computing chips instead. At the time, Google proclaimed that Meena was the best chatbot in the world.
Apr-30-2020, 02:33:29 GMT