How To Filter Microsoft's Hitler-Loving AI
- By Kelly Strain
- Technology, CleanSpeak, Strategies
- March 30, 2016
Microsoft apologized after its artificial intelligence experiment backfired. What could the company have done differently?
Tay, marketed as “Microsoft’s AI fam from the internet that’s got zero chill,” candidly tweeted racist and sexist remarks, confirming she did in fact have “zero chill.” The chatbot was shut down within 24 hours of her introduction to the world, after offending the masses.