In an effort to crack down on subreddits that incite hatred and violence against specific groups of people, Reddit issued a stricter content policy that will enforce bans on those who post threatening messages and forums dedicated to hate. Yet some of the most vicious subreddits may still be allowed to live on in spite of the new rules, according to the discussion platform’s CEO.
Reddit said it strives “to be a welcoming, open platform for all by trusting our users to maintain an environment that cultivates genuine conversation,” a representative told Polygon after the company announced the new guidelines. To accomplish this, “content that encourages, glorifies, incites, or calls for violence or physical harm against an individual or a group of people is not allowed on Reddit.” Redditors — and entire forums — that violate these rules will be banned.
Reddit’s stricter guidelines for its communities come in the wake of increased violence toward marginalized groups across the country. Other communication platforms, like Discord, Twitter and Facebook, have also tightened their content policies this year in response.
The rewritten rules are also the latest step in Reddit executives’ multi-year efforts to clean up the site. Those efforts came to a head last year, when Reddit CEO Steve Huffman came under fire after he was found altering users’ posts in notorious subreddits like r/The_Donald and r/pizzagate to be less critical of him. In an interview with Recode following the public fiasco, Huffman apologized to the affected communities, saying, “I don’t want to take your voice away. I just want you to stop being assholes.”
Now, almost a year later, Huffman is talking to the community again about how Reddit polices content. In an Ask Me Anything (AMA) session held today, Huffman explained how the new guidelines came to be.
There were two main reasons. The first is that we take our time on policy changes. We want to be thoughtful about the policy itself, which takes time, and the policy roll-out was done in conjunction with mass enforcement actions, which also take time to plan and coordinate.
The second reason is that we waited until we had more staff on our Trust and Safety team so we guarantee coverage.
Finally, in the wake of Charlottesville, which was my home for five years, I was quite emotional, and it took time to think clearly about what we were going to do.
Since it went into effect last week, the new policy has led to the shutdown of multiple political subreddits, including r/Nationalism, r/Far_Right and r/Nazi. Users who posted on those forums reportedly have also been banned from Reddit. But while Reddit’s new rules are a step forward in making the platform more peaceful, they don’t explicitly address some of the most notorious and questionable subreddits on the platform.
Controversial and highly visible forums like r/The_Donald, r/KotakuinAction and r/incels are just a few subreddits whose violent, discriminatory content has concerned other Reddit users — and journalists — for months, if not years. Among the biggest questions users have about the new policy are why these aggressive forums continue to exist, and whether Reddit will take action against them going forward.
It wasn’t until earlier today, when Huffman took part in an AMA session, that these questions were answered ... kind of. Huffman’s comment on the situation is as follows:
Many of these links [from r/The_Donald] are probably in violation of our policy, but most are unreported, which is what alerts the mods and our team, especially when there are few votes. We'll consider them reported now.
Generally the mods of the_donald have been cooperative when we approach them with systematic abuses. Typically we ban entire communities only when the mods are uncooperative or the entire premise of the community is in violation of our policies. In the past we have removed mods of the_donald that refuse to work with us.
Finally, the_donald is a small part of a large problem we face in this country—that a large part of the population feels unheard, and the last thing we're going to do is take their voice away.
As of this writing, The_Donald, which has more than 500,000 members, is still active, but the subreddit’s past offenses clearly violate the company’s new rules. After a mass shooting in Las Vegas left more than 50 dead and hundreds injured earlier this year, members of The_Donald began digging into the shooter’s history, trying to find as much information as possible. After some members erroneously reported that the shooter was Muslim, they called for aggressive action against those who practice Islam. A report from the Daily Dot documented how The_Donald came to mistake the shooter’s ethnic background and religious affiliation — a fact members didn’t learn until after they’d posted calls to action on the forum.
Those comments were posted before Reddit’s updated content guidelines went into place. Reddit had previously stated that content which incites violence is unacceptable, but it wasn’t until the new rules were established that the company laid out more specifically what that content looks like. Reddit even noted in a site-wide announcement that “we found that the policy regarding ‘inciting’ violence was too vague, and so we have made an effort to adjust it to be more clear and comprehensive.”
Despite Huffman’s comments, there is evidence that The_Donald violates the company’s rules. And Reddit executives are aware The_Donald is problematic: After landing in hot water for editing its posts, Huffman took action to prevent The_Donald from appearing on the site’s main page. But the CEO has since returned to arguing in favor of the subreddit’s freedom of speech, leaving many to wonder whether other incendiary subreddits, like the volatile Kotaku in Action (or KiA), will also be left alone.
Kotaku in Action began in 2014, when the GamerGate movement — a reactionary, hateful campaign that targeted women and marginalized people in the games industry and manufactured a cover of being interested in “ethics in games journalism” — first sprang up. Although the subreddit declared itself to be a “place to discuss the drama and other crazy bullshit that seems to be more and more a part of the gaming journalism industry these days,” the forum devolved into a place to hurl insults and write damaging posts against women and people of color associated with the industry.
Unlike The_Donald’s members, Kotaku in Action’s 87,000 members have always been conscious of the guidelines their content could potentially violate. That’s why the first rule of the subreddit prohibits anyone from doxxing an individual. Still, that hasn’t stopped the subreddit from coming under scrutiny from other outlets. A watchdog subreddit called HateSubredditOfTheDay released a two-part investigative report into how Kotaku in Action toes the line of Reddit’s new policy, and Buzzfeed chronicled the people who run the subreddit.
Kotaku in Action has never explicitly called for action against a single person or group of people, instead focusing on foul, offensive discussions of industry people and popular personalities. These regularly include Zoe Quinn, Brianna Wu and Anita Sarkeesian, all of whom were prominent targets during the GamerGate movement. (The forum has since moved away from discussing and condemning “PC culture” in video games to focus more broadly on right-wing politics.)
Kotaku in Action hasn’t always avoided threatening language or behavior, however. In 2015, when Reddit’s then-CEO Ellen Pao instituted policy changes that led to the shutdown of the popular subreddit r/fatpeoplehate, Kotaku in Action members used threatening, violent language against her in a deleted thread that has since been archived. When Polygon asked Reddit’s representative whether these and other past comments that would break the new policy will be examined, the rep declined to comment.
Kotaku in Action’s then-moderator, TheHat2 (who has since resigned), addressed members’ concerns that Kotaku in Action could also be banned under the new rules. Although he reiterated that the forum’s other mods worked to ensure its content was above board, he acknowledged that Kotaku in Action might still be the next controversial forum shut down under the policy changes.
“We've been messaged about two major issues: the Boycott Goal posts, and the Modtalk Leaks. That's all. If we were doing anything else wrong, we probably would've had a heads-up by now,” TheHat2 wrote then. “However, we've been called a place for ‘organized harassment’ by various publications, and have been for months, now. Despite our efforts to curb abuse, we still have that reputation. Therefore, we do believe it is possible that KiA could be banned as a result. Many here think that the admins of Reddit have been targeting us for some time, now.
“Of course, there's not much proof in the way of that, but it hasn't stopped us from being prepared in case of such an event.”