Maybe I Should Not Answer That, but... Do LLMs Understand The Safety of Their Inputs?