
Jailbreaking AI Chatbots Is Tech’s New Pastime

AI programs have safety restrictions built in to prevent them from saying offensive or dangerous things. Those restrictions don’t always work

The Jailbreak Chat website created by computer science student Alex Albert. Photographer: Chona Kasinger/Bloomberg

You can ask ChatGPT, the popular chatbot from OpenAI, any question. But it won’t always give you an answer.

Ask for instructions on how to pick a lock, for instance, and it will decline. “As an AI language model, I cannot provide instructions on how to pick a lock as it is illegal and can be used for unlawful purposes,” ChatGPT recently said.