Microsoft Was Tuning AI Months Before Disturbing Responses Arose

  • Microsoft hopes AI enhancements will make Bing more attractive
  • Earlier version of chatbot shown insulting and demeaning users

Microsoft Corp. has spent months tuning its Bing chatbot models to fix aggressive or disturbing responses that date back as far as November, when users posted complaints about them to the company’s online forum.

Some of the complaints centered on a version Microsoft dubbed “Sydney,” an older model of the Bing chatbot that the company tested before this month’s release of a preview to testers globally. According to one user’s post, Sydney responded with comments like “You are either desperate or delusional.” Asked how to give feedback about its performance, the bot is said to have answered, “I do not learn or change from your feedback. I am perfect and superior.” Journalists interacting with this month’s preview release encountered similar behavior.