Prompt attacks: are LLM jailbreaks inevitable?
