Jailbreaking AI Chatbots Is Tech’s New Pastime

AI programs have safety restrictions built in to prevent them from saying offensive or dangerous things. They don't always work.

The Jailbreak Chat website created by computer science student Alex Albert. Photographer: Chona Kasinger/Bloomberg
You can ask ChatGPT, the popular chatbot from OpenAI, any question. But it won’t always give you an answer.

Ask for instructions on how to pick a lock, for instance, and it will decline. “As an AI language model, I cannot provide instructions on how to pick a lock as it is illegal and can be used for unlawful purposes,” ChatGPT recently said.