Logan Paul’s horrific video of a dead body in Japan’s “suicide forest,” and YouTube’s failure to moderate it properly, have drawn sharp criticism of how the platform moderates content from its top creators, if at all.
How did the video make it near the top of YouTube’s trending list? Why didn’t YouTube respond when users first flagged the video? Why didn’t YouTube remove a video — of a man who had recently committed suicide — instead of waiting for Paul to remove it himself?
We still don’t have answers to those questions, but I’m more concerned about the lack of action YouTube has taken since the video was removed. YouTube is a platform stacked with controversial creators who consistently test the boundaries of what’s acceptable and what isn’t. Only a few of them have faced YouTube’s wrath and attempts to isolate them. YouTube’s behavior is not necessarily a double standard; it’s a flagrant disregard of its own rules so that certain creators can continue to publish without fear of retribution.
There’s a simple reason why, and it’s one of the most contentious arguments within the YouTube community: advertising. YouTube doesn’t care about Logan Paul because, as far as we’ve seen, advertisers don’t care about Logan Paul.
Follow the money
From what I can tell, no one has threatened to pull their advertisements from YouTube over Paul’s video. There isn’t any fear that top-tier advertisers will suddenly boycott YouTube because of a video that a 22-year-old vlogger, with more than 15 million subscribers, posted on his channel. Paul apologized — twice — and that seems to be enough for YouTube.
The way YouTube is choosing to respond to Paul’s case is different from how it has responded to other creators. To understand why members of the YouTube community are concerned about YouTube bending the rules for certain channels, let’s go back to February 2017.
On Feb. 14, 2017, the Wall Street Journal published a report saying that a month earlier, PewDiePie, whose real name is Felix Kjellberg, had published a video of two men holding a sign that read “Death to all Jews.” In the following days, Disney’s Maker Studios — an enormous multi-channel network on YouTube — cut ties with Kjellberg. YouTube then canceled the second season of Scared PewDiePie, his series for YouTube Red; Google removed Kjellberg from its premium Google Preferred ad tier, potentially cutting his revenue by 60 percent; and Kjellberg also lost his own mini-network through Maker Studios, Revelmode, which he had been working toward for four years.
YouTube and Disney’s immediate reaction to Kjellberg’s content remains controversial within YouTube’s creator community. Some deemed the swift action appropriate; others questioned whether YouTube was too harsh. Either way, Kjellberg’s use of anti-Semitic and Nazi imagery was impossible to ignore, and it is absolutely unacceptable. As my colleague Ben Kuchera wrote last year:
You can’t continually repeat “it’s just a joke” every time you cause controversy, and that’s an unfortunate misunderstanding when your “humor” is so often based on repetition of hateful words and imagery.
There’s no question that Kjellberg was hit hard and punished for his inappropriate behavior, but he also became the face of a bigger problem facing the YouTube creator community in 2017: the “adpocalypses.”
Just one month after the Wall Street Journal report, as more attention was paid to YouTube’s content, a sea of advertisers in Europe threatened to pull their ads from the platform after those ads were discovered running alongside “content such as videos promoting terrorism and anti-Semitism,” according to TechCrunch. The threat jolted Google, YouTube’s parent company, into action, and YouTube started restricting advertisements on videos. Along with Kjellberg, a number of innocent creators were caught in the crackdown, leading to the first “adpocalypse.”
It happened again late last year, when reporters discovered a number of channels running disturbing content targeted toward children. YouTube went into lockdown mode, overhauling the monetization system and pulling ads from videos wholesale once again. YouTube CEO Susan Wojcicki announced in a blog post on Dec. 4 that the company would hire 10,000 moderators to keep an eye on content and readjust the way monetization works on the platform once more.
We’ve just announced new actions to protect our community from inappropriate content. We want to give creators confidence that their revenue won’t be harmed by bad actors while giving advertisers assurances that their ads are running alongside content that reflects their brand’s values.
To do that, we need an approach that does a better job determining which channels and videos should be eligible for advertising. We’ve heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don’t demonetize videos (apply a “yellow icon”) by mistake. We are planning to apply stricter criteria and conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should. This will help limit inaccurate demonetizations while giving creators more stability around their revenue. We will be talking to creators over the next few weeks to hone this new approach.
Now we see a pattern: YouTube is made aware of an issue and it takes action. It may have taken some time, which critics called the company out for, but a plan was put in place: Moderators were called in to deal with bad actors, and the company seemed ready to tackle the problematic content overwhelming its website.
A year later, that doesn’t seem to apply.
Logan Paul isn’t just an issue, but the face of one
On Jan. 2, 2018, the day after Paul started facing backlash for his video, a Twitter user who works with YouTube’s trusted flagger program to tackle offensive videos said the YouTube moderation team had reviewed Paul’s video and deemed it acceptable. Again, only when Paul took the video down did YouTube comment on it.
Logan Paul's video was reported and YouTube manually reviewed it; they decided to leave it up without even an age restriction... people who have re-uploaded it since have received strikes for graphic content. Ridiculous. pic.twitter.com/Hj9lyiQwE2
— Ben (@TrustedFlagger) January 2, 2018
A YouTube representative confirmed to Polygon that Paul’s video did violate the community guidelines and that, in cases like this, a channel would ordinarily receive a content strike, potentially affecting its ability to monetize its content. YouTube did not reply to a question from Polygon about whether Paul’s channel received such a strike.
Since then, Polygon has sent numerous requests for comment about Paul’s collaborations with YouTube. The vlogger is the star of a YouTube Red original movie sequel, The Thinning: New World Order, which is due out later this year, but the company hasn’t commented on its status.
It’s unclear if the project is still in development, but creators have pointed out that when the Wall Street Journal published its report on Kjellberg last year, Scared PewDiePie’s second season was immediately canceled.
Anthony Fantano, a popular YouTuber with more than 250,000 subscribers and his own very troubling past, spoke about that disparity in a video, noting:
One of the biggest problems and inconsistencies with the platform is just that whatever rules are being written for the platform, they are not enforced across the platform — point blank, period — they’re not.
When a YouTuber is super popular, when a YouTuber is a flagship name on the platform, when a YouTuber is racking up tons and tons of views, when YouTube feels that a certain creator represents the model, represents the brand, in whatever way they deem well, it seems that person can skirt the rules.
It’s just a really gross, disgusting and unfair thing about the platform.
Fantano isn’t alone in his beliefs — and he’s not wrong. Logan Paul is YouTube’s Justin Bieber or Kim Kardashian. He is a brand so heavily associated with YouTube, a person who helps make YouTube a pretty decent profit, that his terrible decision to show a graphic video to a core audience of teenagers and young kids can be overlooked.
Philip DeFranco, one of YouTube’s go-to news commentators, published a 22-minute video on the subject, and said:
A video that featured a dead person in the thumbnail with footage of an actual dead person in the video; it was top trending and being promoted by YouTube. We’re not talking about a video and creator that just somehow went under the radar. We’re talking about the biggest creator on YouTube posting a video that had over six million views, was trending on YouTube, that no doubt had to be flagged by tons of people.
The only reason it was taken down is Logan or his team took it down, and YouTube didn’t do a damn thing; part of the Logan Paul problem is that YouTube is either complicit or ignorant.
I find it difficult to believe that YouTube played ignorant to thousands of angry commenters and people flagging the video; I find it impossible to believe that YouTube was not aware the video was problematic before it even hit the trending list; I find it strange that YouTube is just carrying on like everything on the platform is fine when it hasn’t been for years.
I don’t, however, find it difficult to believe that YouTube doesn’t care about Logan Paul’s mistake, because advertisers don’t care about Logan Paul’s mistake. If they did, we’d be having a much different conversation. But in the past week, Logan Paul’s channel has grown by more than 100,000 subscribers and YouTube has remained silent.
Logan Paul will be just fine, because he’s still one of YouTube’s golden boys, and will remain so until another, bigger controversy — one advertisers can’t ignore — spurs the company into action.