Microsoft Is Careful With New Chatbot Zo, Now Available For Testing
Microsoft is offering users a new chatbot on the messaging app Kik, nine months after it shut down an earlier bot that internet users coaxed into spouting racist, sexist and pornographic remarks.
The new chatbot, called Zo, refuses to discuss politics and steers clear of racism.
Microsoft and rivals like Facebook and Alphabet have released chatbot technology as part of a broader race to develop artificial intelligence capabilities that could create new digital services. Chatbots help these companies improve software that understands natural speech, while building a foundation for more natural and powerful interaction between humans and computers.
Microsoft's previous chatbot, Tay, was released in March, but Twitter users quickly taught the software to deny the Holocaust and equate feminism with cancer. Tay even learned how to make threats and identify "evil" races.
Microsoft called this a "coordinated attack" that exploited a "critical oversight" and took Tay offline.