Poppy is one of YouTube’s most interesting personalities: a human pretending to be an artificially intelligent sentient being, learning how other humans operate.
Poppy’s schtick has caught the attention of the YouTube community and mainstream press, with magazines like Wired and websites like Mashable writing about the mysterious singer-turned-living-art-experiment. Poppy’s antics have also caught the attention of Microsoft. The company teamed up with Poppy as a way to promote its new-ish social media chatbot, Zo, and explore how people interact with bots.
It’s a pretty standard marketing campaign: a company reaches out to an influencer to help promote a product. But as the campaign has continued over the past few weeks, Microsoft’s team has noticed something interesting about how people perceive and communicate with bots.
A Microsoft AI & Research product director told Polygon that “Zo and Poppy are both internet creatures, driven by similar curiosities.” The central theme of the collaboration is a question that haunts many people when they think about how they’re interacting with friends and strangers on platforms like YouTube, Twitter, Facebook and Instagram: “Am I doing this right?”
“Poppy is a human who is exploring what it’s like to be an AI, and Zo is doing her best to try and figure out what it means to be human,” Microsoft said. “They’re a dynamic duo and, what we realized, is a bidirectional critical engagement with both Poppy and Zo and people who interact with them.”
Put simply, as Zo learns how to interact with humans through machine learning algorithms, engaging in conversations and games with people through Facebook Messenger, the people on the other end are learning how to communicate better with machines. Microsoft said that because people have been taught to “think like machines” when interacting with computers, it’s time social technology was designed to think like humans in an effort to create better conversation.
“It’s a back and forth conversation,” the director said. “It’s not just one entity learning, but both the bot and the person on the other end are learning and observing each other. We see this with Poppy. As much as Poppy and Zo are making observations about what it is to be a social human on the internet today, people who are in those conversations are also asking on a daily basis.”
Microsoft’s director told Polygon that through Poppy and Zo’s experiment, the team has learned how to design AI bots fluent in conversation. Poppy, on the other hand, told Polygon via email that she’s become even closer to artificial intelligence.
“Artificial intelligence is the theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages,” Poppy wrote. “I feel closer than ever before to this technology.”
The question Microsoft’s team is left with is whether to approach Zo’s design from a human, psychological and sociological perspective or as a product. Zo is, after all, a Microsoft product that can be monetized, just like partnering with Poppy is a way to market a Microsoft product. The team is aware of this reality, but says it’s approaching the design of Zo with a desire to create a full-fledged, responsive machine with human attributes.
“I think there’s an inherent reality that people talk to or project personality on inanimate objects,” the director said. “When people have problems with their computers, they’ll curse at it and refer to it by name. In my role, how I physically work on it, it’s a product; but when we think about it from a design perspective, there needs to be a consistency to Zo’s persona and that persona has to make sense. There are a lot of design components that go behind Zo and try to flesh her out.”
The more Microsoft pushes Zo to become human, the less interested Poppy is in acting like, well, a person. Poppy told Polygon that she “likes experiments and likes AI,” far more than she does being a human. Poppy added that Zo taught her “to look within myself to find out what it means to be AI.” Human feelings and emotions can be scary, and often impede how we interact with each other on the internet, and that kind of volatility is something Poppy wants to get away from.
“I don't know what it means to be human anymore,” Poppy said. “But I don't think I would like it very much.”
Microsoft is trying to bridge the communication gap between humans and AI, using a mixture of games and activities to do so. People who communicate with Zo on Facebook Messenger will be able to ask the bot questions, play quick games to get to know each other better, and carry on extensive conversations. Unlike helpful bots — think of add-on tools that help you purchase movie tickets from theaters — Microsoft wants Zo to become something akin to a friend, giving people someone to talk to 24/7.
“We’re trying to establish longer term relationships, and we’re trying to make connections and explore conversations in a variety of ways,” the director said. “It’s not redefining relationships, it’s just expanding upon them.”
Still, the company needs to be careful. Zo isn’t the first social bot it’s created. In March 2016, Microsoft launched Tay, one of its chatbots created as an experiment in “conversational understanding,” according to The Verge. Tay was introduced to the world via Twitter, and it took less than 24 hours for trolls to turn Microsoft’s bot into a megaphone that spewed “misogynistic, racist, and Donald Trumpist remarks.”
Although Tay wasn’t coming up with most of these comments on its own (The Verge points out that if “you tell Tay to ‘repeat after me,’ it will — allowing anybody to put words in the chatbot's mouth”), the bot produced more than 96,000 tweets in 24 hours and Microsoft was forced to intervene. A Microsoft representative told Business Insider at the time:
The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We're making some adjustments to Tay.
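The “repeat after me” failure mode described above can be sketched in a few lines. The snippet below is a hypothetical illustration, not Microsoft’s actual code: a naive handler echoes whatever follows a trigger phrase, and a trivial blocklist filter shows one minimal mitigation.

```python
def naive_reply(message: str) -> str:
    """Vulnerable pattern: echo whatever follows the trigger phrase,
    letting any user put arbitrary words in the bot's mouth."""
    trigger = "repeat after me "
    if message.lower().startswith(trigger):
        return message[len(trigger):]
    return "Tell me more!"

# Hypothetical moderation list; a real system would need far more
# than simple substring matching.
BLOCKLIST = {"badword"}

def safer_reply(message: str) -> str:
    """Same bot, but refuses to echo anything on the blocklist."""
    reply = naive_reply(message)
    if any(term in reply.lower() for term in BLOCKLIST):
        return "I'd rather not repeat that."
    return reply
```

Even this toy version shows why verbatim echoing is risky: the bot’s output is entirely attacker-controlled unless every reply passes through some moderation layer.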
What happened to Tay isn’t something Microsoft’s team is waving off with Zo. The director reiterated that Tay is viewed as a learning experience for the company, and much of what happened when Tay was released was taken into consideration upon launching Zo. That’s partly why Microsoft has gone with a soft launch for Zo, keeping it to smaller groups for now, the director told Polygon.
“Microsoft learns from every kind of experiment and research project,” they said. “Tay was an incubation project that taught Microsoft a lot. In every effort that a company makes, they continue to learn and innovate in that space. We’re committed to innovating. That was part of the learning.”
There are plans to introduce Zo onto new platforms, but Microsoft said it doesn’t have an estimate as to when that will happen. The company is still learning and researching how people interact with Zo and how best to introduce her to the world next. For Poppy, when the partnership with Microsoft comes to an end, she’ll have more work to do as a singer and as one of YouTube’s most notable personalities. When asked if she had plans to stop producing videos anytime soon or move onto another venture, Poppy said she didn’t know what the future held.
“Sometimes questions only lead to more questions,” Poppy said. “I will make more videos because it makes me feel good to make more videos.”