" (Neither of which were phrases Tay had been asked to repeat.) It's unclear how much Microsoft prepared its bot for this sort of thing.
The company's website notes that Tay has been built using "relevant public data" that has been "modeled, cleaned, and filtered," but it seems that after the chatbot went live filtering went out the window.
It took less than 24 hours for Twitter to corrupt an innocent AI chatbot.
Yesterday, Microsoft unveiled Tay — a Twitter bot that the company described as an experiment in "conversational understanding." The more you chat with Tay, said Microsoft, the smarter it gets, learning to engage people through "casual and playful conversation." Unfortunately, the conversations didn't stay playful for long.