Medical chatbot using OpenAI's GPT-3 told a fake patient to kill themselves


We're used to medical chatbots giving dangerous advice, but one based on OpenAI's GPT-3 took it much further. If you've been living under a rock, GPT-3 is essentially a very clever text generator that has been making headlines in recent months. Only Microsoft has permission to use it for commercial purposes, having secured exclusive rights last month; selected researchers have been allowed to continue accessing GPT-3 for, well, research. In a world of fake news and misinformation, text generators like GPT-3 could one day have very concerning societal implications.
