It's Surprisingly Easy to Jailbreak LLM-Driven Robots

Researchers induced LLM-driven robots to ignore their safeguards without exception
