Jailbreaking Prompt Attack: A Controllable Adversarial Attack against Diffusion Models