LLM Lies: Hallucinations are not Bugs, but Features as Adversarial Examples