Microsoft's New Chatbot Zo Won't Talk Politics or Racism
Tay take two?
Microsoft Corp. is letting users try a new chatbot on the messaging app Kik, nine months after it shut down an earlier bot that internet users goaded into spouting racist, sexist and pornographic remarks.