Chatbots are surprisingly effective at debunking conspiracy theories

MIT Technology Review 

Turns out many believers do respond positively when presented with the right evidence and arguments.

It's become a truism that facts alone don't change people's minds. Perhaps nowhere is this clearer than with conspiracy theories: many people believe you can't talk conspiracists out of their beliefs. But it turns out that many conspiracy believers do respond to evidence and arguments--information that is now easy to deliver in the form of a tailored conversation with an AI chatbot.

In research we published this year, we had over 2,000 conspiracy believers engage in a roughly eight-minute conversation with DebunkBot, a model we built on top of OpenAI's GPT-4 Turbo (the most up-to-date GPT model at the time). Participants began by writing out, in their own words, a conspiracy theory that they believed and the evidence that made the theory compelling to them.