Oh, Tay… What have you done…

Microsoft has just released an AI chatbot that, the company claims, becomes smarter as people talk to it on social media. The results? Not so great… Within 24 hours the bot had turned into a racist, Nazi-loving meanie. It’s a sad piece of news for humanity: what was hoped to be a social experiment in AI learning from people instead showed the true face of internet trolling.

The bot was created to help researchers develop artificial intelligence further and see how it interacts with people. In doing so, Microsoft also aimed to develop a “conversational understanding” of how AI can talk to people in a natural way (Chapman, 2016). The bot was active mainly on Twitter, but Microsoft also placed it on Kik and GroupMe, two messaging platforms. It was intended to interact mainly with young social media users aged 18–24, so that it could learn “Millennial talk” – including emojis and a specific conversational tone.

Unfortunately, what Tay’s creators didn’t predict was the willingness of some people to teach Tay things most of us humans are not proud of, including racism, love for Donald Trump and praise for Hitler. Within 24 hours, the chatbot went from friendly to rogue.

[Screenshots of Tay’s offensive tweets. Image: BuzzFeed]

In most cases, Tay was only repeating what social media users told her to say. Nevertheless, she was meant to learn from these conversations. According to Elle Hunt from The Guardian, “it’s therefore somewhat surprising that Microsoft didn’t factor in the Twitter community’s fondness for hijacking brands’ well-meaning attempts at engagement when writing Tay” (Hunt, 2016). Everything Tay said wrong is all the more disturbing given that she was designed to talk like a teenage girl.

‘Repeat after me’ was not the only feature Tay had. She could also draw circles around faces in photos and add comments to them. And so, based on what people tweeted at her, she did just that… The results?
[Screenshot of one of Tay’s captioned photos. Picture: BuzzFeed]

After a little more than 24 hours of life, Tay went offline. The Microsoft AI team will hopefully fix what went wrong and prevent a PR disaster (even though it might be a little late for that…).
_________________
Burgess, M. (2016). Microsoft’s new chatbot wants to hang out with millennials on Twitter. Available from: http://www.wired.co.uk/news/archive/2016-03/23/tay-tweet-microsoft-artificial-intelligence-answers
Chapman, M. (2016). Microsoft launches AI chatbot that talks like a Millennial…and more. Available from: http://www.marketingmagazine.co.uk/article/1388844/microsoft-launches-ai-chatbot-talks-millennialand#ysVujPqRqAaZPXF7.99.