Club Penguin is an online universe where kids hang out and play, which means nothing matters more than moderation. One former player support representative points to one of 50 Cent’s most popular songs to explain why.
Carlos Figueiredo, the current director of community trust and safety at Two Hat Security, has spent a large portion of his career learning how to take preventative action against negative behavior, including during his time working on Club Penguin. Figueiredo is also a cofounder of the Fair Play Alliance, an organization that brings together more than 30 companies and developers with the intention of sharing player behavior research. Figuring out how to help other companies and game developers — like the team behind Habbo Hotel, one of Two Hat’s clients — stay ahead of problems has been his specialty for years.
Figueiredo sat down with Polygon ahead of GDC to talk about the Fair Play Alliance and disruptive behavior among players, and he touched on a pretty simple but seemingly effective ideology: keep up to date with the memes. Well, so to speak. The concept is to have a near-Wikipedia level of knowledge about pop culture, which ensures that moderators can keep up with words and expressions that may seem innocuous but actually carry a hidden meaning.
The 50 Cent lesson
“Nate [Sawatzky, Director of Community Support at Club Penguin] is just a huge champion of the online safety space, and he became my mentor,” Figueiredo said. “He believed in moderators being super versed in pop culture. We had a lot of pop culture sessions to learn and exchange knowledge — especially when it came to different cultures. When 50 Cent’s ‘Candy Shop’ came out, the song talked about lollipops, but the whole connotation of that was super unique for that word. The way that kids can take a word and twist it to mean something else, if you are not super proactive about it, kids can talk about all sorts of things when they shouldn’t be.”
Habbo Hotel is one of Figueiredo’s biggest clients at Two Hat, and its large, young audience is a particular concern. Being able to understand what’s happening, Figueiredo said, and to take action before it becomes too big to handle is incredibly important. A more recent example that Figueiredo and his team worked on with clients was the “Blue Whale Challenge.” The supposed challenge encourages self-harm, and at the height of its controversy in 2017, it was something Figueiredo and his team tried to help others navigate. Although the challenge was never proven to be real, having conversations about how to prevent this type of behavior in chat-heavy games geared toward young kids is still a must.
“Being able to understand a new lingo that is happening and being able to take action before it becomes too big is so, so important,” Figueiredo said. “The Blue Whale Challenge was a social media thing. You can imagine that before this, Blue Whale was a very common thing: a description of an animal. What do you do when that term then becomes something dangerous? How do you handle it when kids are suddenly talking about this? Now Blue Whale isn’t something innocent anymore.”
Setting up sessions to keep up with what’s happening — determining how certain words or images are being used to convey negative, disruptive, or toxic messages — is where Figueiredo invests his time. Once his team at Two Hat discovers something being used for potentially slimy reasons, however, it’s up to the development teams to figure out how to implement different moderation tactics.
“Some clients might want to keep an eye on it, and take action when need be,” Figueiredo said. “Some clients might install an automated message when the term is detected that sends the user to a crisis center or crisis line. It’s nice to have a proactive system like this. Some clients might decide to not have the term show up in the first place. You need to be very proactive, because otherwise people might be at risk. Understanding those cultural nuances, that internet lingo, that pop culture is at the heart of it.”
The Twitch conundrum
If there’s one company, and one community, that helps define gaming and internet culture more than any other right now, it’s Twitch. Figueiredo believes Twitch is doing the best it can considering the size of its audience and the culture built into chat. He praises the company for its work in establishing and updating a code of conduct, including the community in the discussion, and using tools like AutoMod (an automated tool that helps moderate channel chats) to give streamers ownership of their channels.
No one person or company is perfect, though.
“The danger we run into is that we can’t look for a silver bullet; human behavior is super hard. It’s super complicated,” Figueiredo said. “Online behavior is still human behavior but with other layers of complications.”
One of those complications is also one of the biggest conversations happening right now in the streaming and esports scenes: emotes. Blizzard recently announced that it’s asking professional Overwatch League players not to use a Pepe the Frog emote; the frog caricature, created by artist Matt Furie, was co-opted as a hate symbol by the alt-right. Other emotes, like TriHard, which has become known for its racist connotations, are also under scrutiny because of their use by popular Twitch streamers. The debate over whether Twitch should step in and prevent people from using those specific emotes is only gaining traction.
“The whole idea is that if you abuse it, you lose it,” Figueiredo said. “Someone might really abuse an emote and then we say, is that fair to others now that we’re taking it away? But what I think is we need to take a really hard look at those things, because things are not static. A lollipop term isn’t static, and an emote is not static. Take a hard look based on data, asking how many times is this emote being used in a stream in a regular way versus how many times is it being used to disrupt? In racist connotations? If you’re able to quantify that, I think that gives enough ammo to a company to say, ‘You know what, right now, this emote is being used to cause a lot of issues, and we have the chat mods and we have the proof, and this is affecting the community.’”
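The quantification Figueiredo proposes — ordinary uses of an emote versus disruptive ones — could be approximated by classifying the chat context around each use. The sketch below uses a deliberately crude keyword check as the classifier, and the review threshold is an assumption for illustration, not any real Twitch policy.

```python
# Sketch of the "quantify it" idea: for each chat message containing an
# emote, classify it as ordinary or disruptive, then compare the counts.
# The keyword classifier and the 25% threshold are illustrative assumptions.

DISRUPTIVE_MARKERS = {"hypothetical-slur", "spamword"}  # placeholder terms

def emote_abuse_rate(messages: list[str], emote: str) -> float:
    """Fraction of messages containing `emote` that also look disruptive."""
    uses = [m for m in messages if emote in m]
    if not uses:
        return 0.0
    disruptive = sum(
        1 for m in uses
        if any(marker in m.lower() for marker in DISRUPTIVE_MARKERS)
    )
    return disruptive / len(uses)

def should_review(messages: list[str], emote: str,
                  threshold: float = 0.25) -> bool:
    """Flag an emote for a human policy decision once abuse crosses the bar."""
    return emote_abuse_rate(messages, emote) >= threshold
```

The output isn’t a ban decision; in Figueiredo’s framing it’s the “ammo” — evidence a company can put in front of chat mods and the community before taking an emote away.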
Companies are beholden to their users, and in order for platforms to grow, those same companies are responsible for creating a welcoming environment, according to Figueiredo. It’s just bad business otherwise — if people don’t feel welcome on the platform they want to visit, they’ll just leave.
“I think we’re facing this moment now where maybe three years ago, folks thought doing certain things was censorship and it was maybe an altruistic thing to do; nowadays it’s business crucial,” Figueiredo said. “I think we run the risk sometimes of letting something happen organically and we don’t foresee the consequences and the damages. We need to do something about those things — and it’s good to include the community in those things so they understand why and they have a voice — but then do it.
“Take the responsibility, take the accountability, and do something about those things.”