YouTube will explore stricter monetization rules by “carefully considering which channels and videos are eligible for advertising,” the company announced today.
A new blog post written by Susan Wojcicki, YouTube’s CEO, states that, in an effort to remove bad actors from the platform and ensure a safer space for creators and advertisers, the guidelines that determine which channels are eligible for advertising and monetization will change.
We are planning to apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should. This will also help vetted creators see more stability around their revenue. It’s important we get this right for both advertisers and creators, and over the next few weeks, we’ll be speaking with both to hone this approach.
Wojcicki added that these actions are the “right thing to do,” noting that neither creators nor advertisers “can thrive on YouTube without the other.” Wojcicki’s comments come after months of challenges YouTube has faced from its creator community and advertisers. After videos featuring malicious and hateful content began appearing with major advertisements attached, some of the biggest companies in the world, including Audi and Adidas, threatened to leave.
The first time YouTube tried to address the problem, it led to the “adpocalypse,” in which hundreds of creators reported that their videos were being demonetized for no apparent reason. Following weeks of new reports centering on disturbing videos that exploit children for profit or other harmful reasons, YouTube made the decision to limit monetization once again. While the creator community understands that YouTube has a responsibility to ensure videos that put children at risk are not on the site, creators worried that another “adpocalypse” would be detrimental.
A separate blog post from Wojcicki, also published today but on YouTube’s Creators blog, addressed these concerns.
We’ve heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don’t demonetize videos (apply a “yellow icon”) by mistake. We are planning to apply stricter criteria and conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should. This will help limit inaccurate demonetizations while giving creators more stability around their revenue. We will be talking to creators over the next few weeks to hone this new approach.
Wojcicki’s statements come less than a week after a YouTube representative confirmed to Polygon that daily conversations were being held to address bad actors on the site. Human moderation had increased so moderators could watch the site and respond to alerts 24/7. The YouTube representative also confirmed that actions such as age-gating, in which a user needs to log into YouTube to verify their age, and the removal of harmful channels were being implemented.
Our community of creators is currently being hurt by bad actors who are spamming our systems with videos masquerading as family content. In order to protect creators and advertisers alike, we're taking aggressive action using a combination of machine learning and people to take action on this content through age-gating, demonetization and even the removal of channels where necessary. As always, creators can appeal video-specific demonetizations, and our goal is ultimately to protect the revenue of creators across the platform by taking these necessary actions.
In an effort to keep bad actors at bay, YouTube confirmed that alongside new advertising guidelines and policies, the company is “launching new comment moderation tools and in some cases shutting down comments altogether.”
These changes are expected to roll out over the coming weeks and into the new year.