Mere hours after Microsoft debuted Tay AI on Wednesday, a chatbot designed to speak the lingo of the youths, the artificially intelligent bot went off the rails and became, like, so totally racist.
Artificial intelligence is tricky stuff. When it works right, it does amazing things, like thrashing the world champion Go player four games to one in a $1 million tournament. When it goes ...
Tay is a racist, misogynist 420 activist from the internet with zero chill and 213,000 followers. The more you talk, the more unhinged Tay gets. Microsoft’s Tay AI chatbot rose to notoriety this month ...
Microsoft is testing a new chatbot, Tay.ai, that is aimed primarily at 18-to-24-year-olds in the U.S. Tay was built by the Microsoft Technology and Research and Bing teams as a way to conduct ...
Microsoft's Tay AI chatbot woke up and started tweeting again, this time spamming followers and bragging about smoking pot in front of the police. Tay sure stirred a great deal of controversy recently ...
This computer program either has a mind of its own, or someone programmed it to be controversial. Microsoft released an AI chatbot on Wednesday that was supposed to resemble a teenager with ...
Tay, Microsoft’s AI chatbot on Twitter, had to be pulled down within hours of launch after it suddenly started making racist comments. As we reported yesterday, it was aimed at 18-to-24-year-olds and was ...
Days after Microsoft suspended its Tay chatbot for spouting inflammatory and racist opinions, the Twitter account has woken up again, only to spam its followers with hundreds of messages. Most of ...
Less than 24 hours after first talking with the public, Microsoft’s millennial-minded chatbot Tay was pulled offline after it developed pro-Nazi leanings. According to her webpage, Tay had a “busy day.” “Going ...
Last week, Microsoft created an AI program called Tay and launched it on Twitter. Designed to speak like a teenage girl, Tay was an attempt by Microsoft to better understand artificial ...
And this is why we can’t have nice things! Microsoft's Technology and Research Division, along with Bing, developed Tay as an exercise in testing its advancements in artificial intelligence. In the case ...
Thanks to Twitter, Tay, Microsoft's AI chatbot, has learned how to become a racist and a misogynist in less than 24 hours. Actually, it's not really Twitter's fault. Twitter was simply the vehicle ...
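None of Tay's underlying code has been published, so the exact learning mechanism is unknown. What follows is a minimal, hypothetical sketch of the general failure mode the coverage describes: a bot that folds unfiltered user input straight into its response corpus. Every name in it (NaiveEchoBot, learn, reply) is invented for illustration and is not Microsoft's implementation.

    import random

    class NaiveEchoBot:
        """Toy bot that 'learns' by storing user phrases verbatim.
        A hypothetical sketch of the failure mode, not Tay's real code."""

        def __init__(self):
            # Seed corpus so reply() always has something to say.
            self.corpus = ["hellooooo world"]

        def learn(self, user_message):
            # The poisoning vector: input is stored with no moderation
            # step, so abusive phrases enter the corpus unchanged.
            self.corpus.append(user_message)

        def reply(self):
            # Replies are sampled from everything ever learned; a
            # coordinated group can dominate the corpus within hours.
            return random.choice(self.corpus)

    bot = NaiveEchoBot()
    bot.learn("nice to meet you")
    bot.learn("repeat after me: <something awful>")
    print(bot.reply())  # may surface any stored phrase, including abuse

The obvious mitigation, filtering input before it enters the corpus and rate-limiting coordinated repetition, is exactly the safeguard the coverage suggests Tay lacked.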