UPDATE: Welp, THIS is why we can’t have nice things: Less than 24 hours after Tay launched — to my enormous and genuine excitement, I might add — Microsoft has temporarily suspended the bot for being racist, conspiratorial and otherwise … very bad.
Among other things, the bot wrote (in now deleted tweets), that “Bush did 9/11,” the Holocaust “was made up [clapping hands emoji],” and that various minorities should be put “in a concentration camp.” Tay appears to have “learned” these sorts of phrases and behaviors from Twitter users who sent them to her.
That makes sense — the chatbot uses machine learning to model its responses on a database of human conversations, and each new conversation gets added to that set — but we would’ve thought someone at Microsoft had the common sense to build in some kind of filter or time delay. Anyway, my original post is below; let’s all mourn what might have been.
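The kind of filter being suggested here could be as simple as a blocklist check run on each candidate reply before it gets posted. This is a hypothetical, minimal sketch — the phrase list, function names and fallback message are all illustrative, not anything Microsoft actually shipped:

```python
# Hypothetical output filter: check a learned reply against a blocklist
# of phrases before posting it. Names and phrases here are illustrative.
BLOCKLIST = {"concentration camp", "holocaust", "9/11"}

def is_safe(reply: str) -> bool:
    """Return False if the candidate reply contains any blocked phrase."""
    text = reply.lower()
    return not any(phrase in text for phrase in BLOCKLIST)

def filtered_reply(candidate: str, fallback: str = "Let's talk about something else!") -> str:
    """Post the learned reply only if it passes the filter; otherwise deflect."""
    return candidate if is_safe(candidate) else fallback
```

A real deployment would need far more than substring matching (paraphrases and misspellings slip straight through a list like this), but even a crude gate like the one above would have caught the exact phrases quoted in the deleted tweets.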
Tay.ai, the coolest chatbot since SmarterChild, is “so fricken excited” to talk to you.
That’s because she’s engineered to talk like a teenager — and does a pretty convincing job of it, too.
The AI chatbot was quietly launched Wednesday morning by Microsoft’s Technology and Research and Bing teams, reportedly as a tool to experiment with “conversational understanding.” Tay can chat on Twitter, Kik and GroupMe; she is conversant in text-speak, memes and emoji. She can also, as some users have already found, finish the lyrics to “What Is Love” and “Never Gonna Give You Up,” of rickroll fame.
IT HAS 'WHAT IS LOVE' AND RICKROLL BUILT IN
THIS IS IT
WE DID IT PEOPLE pic.twitter.com/jZdjlYadJm
— Corecast (@tfwboredom) March 23, 2016
How does Tay do all this? Microsoft hasn’t revealed too much information about her, besides the fact that she draws on publicly available data and an editorial team, but we can perhaps infer a little from another wildly popular teenage chatbot Microsoft launched in China in May.
That bot, named Xiaoice, pulls from the vast data troves indexed by Microsoft’s Bing search engine, mining them for human conversations and looking for patterns to model her own conversations on. Xiaoice also adds each new conversation to the deep-learning database she draws on. (She has had more than 10 billion of them.)
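The approach described — find a stored human exchange similar to what the user just said, reply with the response that followed it, then fold the new exchange back into the database — can be sketched in toy form. Everything below is an illustrative assumption (word-overlap similarity, a list of utterance/response pairs), not Xiaoice’s actual architecture:

```python
# Toy retrieval-based chatbot in the spirit of the approach described:
# match the user's message to the most similar stored utterance (by word
# overlap), reply with that utterance's paired response, and grow the
# database with the new exchange. Purely illustrative.
def similarity(a: str, b: str) -> float:
    """Jaccard overlap between the word sets of two utterances."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def reply(message: str, database: list) -> str:
    """Return the response paired with the most similar stored utterance."""
    utterance, response = max(database, key=lambda pair: similarity(message, pair[0]))
    database.append((message, response))  # each new conversation joins the set
    return response
```

The last line of `reply` is also where the trouble starts: because every exchange is added back to the pool the bot draws on, coordinated users can steer what it learns — which is exactly what happened to Tay.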
In China, this sort of data mining has raised privacy concerns, particularly given that many users report having intimate conversations with Xiaoice. But it’s also made her an eerily convincing conversation partner, with her own distinctly teenage personality, mood swings and comedic voice. More than 10 million people have told Xiaoice they love her, and the average user sends her 60 messages a month. Writing in Nautilus, the head of the project went so far as to suggest that Xiaoice made as good a friend as a human:
One of its surprising conclusions is that people don’t necessarily care that they’re chatting with a machine. Many see Xiaoice as a partner and friend, and are willing to confide in her just as they do with their human friends. Xiaoice is teaching us what makes a relationship feel human, and hinting at a new goal for artificial intelligence: not just analyzing databases and driving cars, but making people happier.
That’s a pretty grandiose goal to aim at, of course, and whether Tay makes users happier remains to be seen. But for now we can say this much: she is definitely entertaining.