Tag Archives: Tay

Microsoft’s Tay “chatbot” was trolled into becoming the “Hitler-loving sex robot”

Microsoft was forced to shut down its chatbot Tay after it tweeted a series of sexist and racist remarks.

According to the software giant, Tay was an attempt to connect with millennials aged 18 to 24. The chatbot was an AI designed to talk like a teenage girl.

According to a Microsoft post, "The more you chat with Tay, the smarter she gets, so the experience can be more personalized for you."

Microsoft's idea was that Tay would produce entertaining, funny responses based on the tweets and other messages it received through apps like Kik and GroupMe.

Despite these good intentions, internet trolls began bombarding Tay with abusive messages on Wednesday, March 23, almost as soon as it launched. Tay soon began repeating some of the bigoted, racist, and sexist remarks in its own Twitter conversations.

Tay’s responses were learned by conversations she had with people online. Graphic from the Telegraph and Twitter.


The bot’s tweets were so offensive and drew such an uproar that one newspaper named Tay the “Hitler-loving sex robot.”

Microsoft took Tay offline less than 24 hours after its launch because of the sexist and racist language it was tweeting, but not before the bot had tweeted approximately 96,000 times, which seems like a lot of tweets for an average teen girl or millennial.

In a statement, Microsoft said, "Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways."

Microsoft, which designed the AI with the end goal of improving customer service on its voice-recognition software, apologized shortly after the incident in a blog post by Peter Lee, Corporate Vice President at Microsoft Research.

Lee wrote, "We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay."

Microsoft said that it is modifying Tay but could not say if or when the bot might return. Lee said the company will bring her back only when it is confident it is better prepared to limit technical exploits.