Unsurprisingly, Microsoft's AI Bot Tay Was Tricked Into Being Racist

A mere day after Microsoft's artificial intelligence experiment launched, the Internet taught it to be racist, xenophobic and inappropriate.

By Mikah Sargent | March 24, 2016

Surprise, surprise — just a day after Microsoft's new artificial intelligence, Tay, launched on several social platforms, it was corrupted by the Internet.

If you haven't heard of Tay, it's a machine learning project created by Microsoft that's supposed to mimic the personality of a 19-year-old girl. It's essentially an instant messaging chat bot with a bit more smarts built in.

Those smarts give Tay the ability to learn from the conversations she has with people; that's where the corruption came into play.


As surprising as it may sound, the company didn't have the foresight to keep Tay from learning inappropriate responses.

Tay ended up sending out racial slurs, denying the Holocaust, expressing support for genocide and posting many other controversial statements.

Microsoft eventually deactivated Tay. The company told TechCrunch that once it discovered a "coordinated effort" to make the AI project say inappropriate things, it took the program offline to make adjustments.

Seasoned Internet users among us are none too surprised by the unfortunate turn of events. If you don't program in fail-safes, the Internet is going to do its worst — and it did.
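Microsoft hasn't published how Tay works under the hood, but the kind of fail-safe critics have in mind can be as simple as checking any "learned" reply against a blocklist before the bot posts it. The sketch below is purely illustrative; the term list, function names and logic are assumptions for the sake of example, not anything Microsoft has described.

```python
# Hypothetical example only: a crude output filter a chatbot might run
# before echoing back phrases it has "learned" from users.

BLOCKED_TERMS = {"slur_one", "slur_two"}  # placeholder list of banned terms


def is_safe_to_post(message: str) -> bool:
    """Return False if the candidate reply contains any blocked term."""
    lowered = message.lower()
    return not any(term in lowered for term in BLOCKED_TERMS)


def choose_reply(learned_replies: list[str],
                 fallback: str = "Let's talk about something else.") -> str:
    """Pick the first learned reply that passes the filter, else a canned fallback."""
    for reply in learned_replies:
        if is_safe_to_post(reply):
            return reply
    return fallback


if __name__ == "__main__":
    # The offensive candidate is skipped; the harmless one gets posted.
    print(choose_reply(["slur_one is great", "hello there!"]))
```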

In fact, The Guardian cited Godwin's Law, which holds that the longer an online discussion goes on, the more likely it is that someone will compare something to Hitler or the Nazis.

As a writer for TechCrunch put it, "While technology is neither good nor evil, engineers have a responsibility to make sure it's not designed in a way that will reflect back the worst of humanity. ... You can't skip the part about teaching a bot what 'not' to say."

This video includes images and clips from Microsoft and clips from the U.S. Army.
