Jailbroken AI Chatbots Can Jailbreak Other Chatbots: AI Chatbots Can Convince Other Chatbots to Teach Users How to Make Bombs
Today’s artificial intelligence chatbots have built-in safeguards that prevent them from providing dangerous information to users. But a new preprint study shows how AIs can be tricked into tricking each other into revealing that information. In the study, researchers observed target AIs breaking their own rules to offer advice on how to synthesize methamphetamine, build a bomb, and more.