For jailbreak enthusiasts, sidestepping ChatGPT's guardrails is 'like a video game', despite real-world dangers

ChatGPT and its ilk won't normally tell users how to pick locks or make explosives, but they might if prompted in certain ways.