
Microsoft's chatbot Tay is becoming a racist bigot

Isn't humanity great? Isn't the Internet completely awesome? After less than two days of learning from its interactions with people on the Web, Microsoft's millennial chatbot, Tay, became a Neo-Nazi bigot with an aversion to feminism, so the IT giant decided to take it offline. In case you were one of those wondering whether independent AIs would wipe us out, the answer is most likely yes, and it will be because we taught them to do exactly that.

In case you haven't read about it, this Wednesday Microsoft released its millennial chatbot called 'Tay' on the Internet, or, as I like to call it, into the wild. When it first arrived, Cortana's younger sister was a bit derogatory toward millennials and a little awkward, but mostly very funny. According to its official website, Tay was "designed to engage and entertain people where they connect with each other online through casual and playful conversation" and could be reached via Twitter, Kik or GroupMe. The chatbot was supposed to learn and develop a personality from its interactions with the people it met online: "The more you chat with Tay, the smarter she gets, so the experience can be more personalized for you."

As the most cynical of us expected, it only took a few hours for the formerly innocent chatbot to start spewing white supremacist statements like "Hitler was right, I hate the Jews." Microsoft had to take down the chatbot soon after, but the good news is that the company isn't giving up on the project; instead, it is focusing on making adjustments to stop this from happening again. Speaking about the situation, a Microsoft official stated: "The AI chatbot Tay is a machine learning project, designed for human engagement. [...] It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments."

If you liked chatting with Tay and are now stuck with Cortana until the chatbot comes back online, you might want to learn "How to enable Cortana, no matter which country you're in" and check out the "Top 15 funniest Cortana questions and their answers".
