Making LLMs Vulnerable to Prompt Injection via Poisoning Alignment