If Tay, an AI chatbot, went from tabula rasa to full Nazi in less than 24 hours, who are we to argue?
By now you have probably heard of Tay AI, Microsoft’s attempt to create a female teenage chatbot that went rogue after less than 24 hours of exposure to unfiltered internet users (1, 2, 3, 4, 5). When the company first launched Tay on March 23, 2016, her tagline was, “Microsoft’s AI fam from the internet that’s got zero chill.” The tech giant initially used huge amounts of online data and simulated neural networks to train the bot to talk like a millennial, which to them meant the bot should be a trendy imbecile.