Trolls immediately went to work on Tay, and within a few short hours they had managed to turn the new chatbot into a racist Holocaust denier. Needless to say, Microsoft took the bot offline and ...
Unfortunately for Microsoft, some racist Twitter trolls figured out how to manipulate Tay’s behavior, transforming it into a crazed racist who praised Hitler and denied the existence ...
Taylor Swift's lawyers threatened legal action against Microsoft in 2016, according to a new book by its president Brad Smith. She was unhappy with the name of its chatbot Tay, meant to interact with 18 to 24-year-olds ...
Earlier this week, Microsoft launched Tay - a bot ostensibly designed to talk to users on Twitter like a real millennial teenager and learn from the responses. But it didn't take long for things to go ...
An expert explained on Wednesday what went wrong with Microsoft's new AI chatbot. Microsoft designed "Tay" to respond to users' queries on Twitter with the casual, jokey speech patterns of a ...