Psst, wanna jailbreak ChatGPT? Malicious prompts for sale
Turns out it's pretty easy to make the model jump its own guardrails
Read more here: External Link