Medical chatbot using OpenAI’s GPT-3 told a fake patient to kill themselves

We’re used to medical chatbots giving dangerous advice, but one based on OpenAI’s GPT-3 took it much further.

If you’ve been living under a rock, GPT-3 is essentially a very clever text generator that’s been making headlines in recent months. Microsoft is currently the only company permitted to use it for commercial purposes, having secured exclusive rights last month.

In a world of fake news and misinformation, text generators like GPT-3 could one day have very concerning...