What can we learn from ChatGPT jailbreaks?

Learning prompt engineering by studying malicious examples.

Read more here: External Link