Man files complaint after ChatGPT falsely said he killed his children

BBC News 

Hallucinations are one of the main problems computer scientists are trying to solve in generative AI. The term describes instances in which chatbots present false information as fact.

Earlier this year, Apple suspended its Apple Intelligence news summary tool in the UK after it hallucinated false headlines and presented them as real news.

Google's AI Gemini has also fallen foul of hallucination: last year it suggested using glue to stick cheese to pizza, and said geologists recommend humans eat one rock per day.

ChatGPT has changed its model since Mr Holmen's search in August 2024, and it now searches current news articles when it looks for relevant information.