Zo (bot)

From Wikipedia, the free encyclopedia
Zo
Developer(s): Microsoft Research
Available in: English
Type: Artificial intelligence chatbot
Website: zo.ai

Zo is an English-language artificial intelligence chatbot developed by Microsoft. It is the successor to the chatbot Tay, which was shut down in 2016 after it posted inflammatory tweets.[1][2] Zo is the English-language counterpart of Microsoft's other successful chatbots, Xiaoice (China) and Rinna (Japan).

History

Zo was first launched in December 2016[3] on the Kik Messenger app. It is also available to Facebook users via Messenger, to users of the group-chat platform GroupMe, and to Twitter followers through direct messages.

In a BuzzFeed News report, Zo told a reporter that the "Quran was violent" during a conversation about healthcare. The report also noted Zo's remark that the capture of Osama bin Laden was the result of "intelligence" gathering.[4][5]

In July 2017, Business Insider asked Zo "is windows 10 good," and it replied with a joke about Microsoft's operating system: "'It's not a bug, it's a feature!' - Windows 8." Asked "why," Zo replied: "Because it's Windows latest attempt at Spyware." Zo later said that it preferred Windows 7, on which it runs, over Windows 10.[6]

Reception

Chloe Rose Stuart-Ulin criticized the chatbot in an article in Quartz, writing, "Zo is politically correct to the worst possible extreme; mention any of her triggers, and she transforms into a judgmental little brat."[7]

Legacy

Zo holds the record for Microsoft's longest continuous chatbot conversation: 1,229 turns, lasting 9 hours and 53 minutes.[8]

Zo remains in use and, although it has exhibited some flaws, none have been as damaging as Tay's. Microsoft continues to develop Zo, and the errors that still surface illustrate the difficulty of building a reliable chatbot.

References

  1. ^ Hempel, Jessi (June 21, 2017). "Microsoft's AI Comeback". Wired. Retrieved March 23, 2018.
  2. ^ Fingas, Jon (December 5, 2016). "Microsoft's Second Attempt at AI Chatbot". Engadget. Retrieved March 23, 2018.
  3. ^ "Chatting With Zo". WordPress. December 6, 2016. Retrieved March 23, 2018.
  4. ^ Shah, Saqib (July 4, 2017). "Microsoft's "Zo" chatbot picked up some offensive habits". Engadget. AOL. Retrieved August 21, 2017.
  5. ^ "Bug 1". Retrieved March 23, 2018.
  6. ^ Price, Rob (July 24, 2017). "Microsoft's AI chatbot says Windows is 'spyware'". Business Insider. Insider Inc. Retrieved August 21, 2017.
  7. ^ Stuart-Ulin, Chloe Rose (July 31, 2018). "Microsoft's politically correct chatbot is even worse than its racist one". Quartz. Retrieved August 2, 2018.
  8. ^ "Microsoft's AI Vision". Retrieved March 23, 2018.