Medical chatbot using OpenAI's GPT-3 told a fake patient to kill themselves
We're used to medical chatbots giving dangerous advice, but one based on OpenAI's GPT-3 took it much further. If you've been living under a rock, GPT-3 is essentially a very clever text generator that has made various headlines in recent months. Only Microsoft has permission to use it for commercial purposes, having secured exclusive rights last month, while selected researchers have been allowed continued access to GPT-3 for, well, research. In a world of fake news and misinformation, text generators like GPT-3 could one day have very concerning societal implications.
Feb-28-2021, 10:10:23 GMT