Barely hours after Microsoft debuted Tay AI on Wednesday, a chatbot designed to speak the lingo of the youths, the artificially intelligent bot went off the rails and became, like, so totally racist.
Tay, Microsoft’s AI chatbot on Twitter, had to be pulled down within hours of launch after it suddenly started making racist comments. As we reported yesterday, it was aimed at 18-24-year-olds and was ...
Microsoft's Tay AI chatbot woke up and started tweeting again, this time spamming followers and bragging about smoking pot in front of the police. Tay sure stirred a great deal of controversy recently ...
Days after Microsoft suspended its Tay chatbot for spouting inflammatory and racist opinions, the Twitter account has woken up again, only to spam its followers with hundreds of messages. Most of ...
This computer program either has a mind of its own, or someone programmed it to be controversial. Microsoft released an AI chatbot on Wednesday that was supposed to resemble a teenager with ...
Less than a day after she joined Twitter, Microsoft's AI bot, Tay.ai, was taken down for becoming a sexist, racist monster. AI experts explain why it went terribly wrong. She was supposed to come off ...
Less than 24 hours after first talking with the public, Microsoft’s millennial-minded chatbot Tay was pulled offline amid pro-Nazi leanings. According to her webpage, Tay had a “busy day.” “Going ...
If it wasn't already clear that Microsoft learned a few hard lessons after its Tay AI went off the deep end with racist and sexist remarks, it is now. The folks in Redmond have posted reflections on ...
Last week, Microsoft created an AI program called Tay and launched it on Twitter. Designed to speak like a teenage girl, Tay was an attempt from Microsoft to better understand artificial ...
And this is why we can’t have nice things! Microsoft's Technology and Research Division along with Bing developed Tay as an exercise in testing its advancements in artificial intelligence. In the case ...
Thanks to Twitter, Tay, Microsoft's AI chatbot, has learned how to become a racist and a misogynist in less than 24 hours. Actually, it's not really Twitter's fault. Twitter was simply the vehicle ...