YouTube needs more experts to help tackle dangerous content, says CEO

‘We need a lot more experts’

YouTube CEO Susan Wojcicki is calling for more experts to help the company’s moderation team deal with dangerous content on the platform.

Wojcicki spoke to Recode’s Kara Swisher and MSNBC about the problems YouTube faces in the wake of the events surrounding popular vlogger Logan Paul. Paul uploaded a video on Dec. 31 of his trip to the Japanese forest Aokigahara, which showed the body of a man who appeared to have recently committed suicide. YouTube faced massive criticism for its delayed handling of the video, including how long it took to remove the offensive content from the site and how late the company was in doling out punishment.

In light of Paul’s video — and the other substantial issues the company faced in 2017 — Wojcicki said YouTube needs to work with more “experts” alongside its moderation team to try to enforce preventative measures rather than reactive ones.

“Where I think we are going, and what we really learned over the course of this year, is we need a lot more experts,” Wojcicki said. “There’s a lot more nuances, so we follow every law, but a lot of time these issues are complicated, they’re nuanced. We need to go to the experts and get their feedback over what type of content should we be taking down and how do we redraw our policies to be able to do the right thing.”

Wojcicki’s sentiment implies the company wasn’t already relying on experts to help tackle moderation issues. YouTube’s kids content, one of the most disturbing problems the company dealt with in 2017, was highlighted by critics and experts as far back as 2015, but the company didn’t fully acknowledge it until late 2017. Wojcicki said that while YouTube does enforce community guidelines and address flagged content — a statement that other people with knowledge of YouTube’s moderation program have publicly disputed — the company will still rely on a combination of human and machine moderation to address dangerous videos.

“At the end of the day, it has to be humans and you have to have those machines,” Wojcicki said. “If we go to something like extremist content, we have 400 hours [of video] being uploaded every single minute to YouTube. We are now able to remove 98 percent of that violent extremism with machines, and half of that within two hours. So you look at that and say, ‘Could we achieve that with people alone?’ No, you need to have those machines.”

YouTube announced in December that it was working with a moderation team of 10,000 people to better tackle content concerns. The company followed up that announcement with news that it is introducing stricter criteria for YouTube’s Partner Program, the group of creators eligible to run advertising on their videos. Creators must now have a minimum of 1,000 subscribers and 4,000 hours of watch time in the past 12 months. Top creators who belong to the Google Preferred program, a select group who can earn top-tier advertising from some of Google’s most valued advertisers, will have each of their videos reviewed by a human moderator.

YouTube’s new creator policies will go into effect in February, and the company hopes they will address the platform’s numerous issues.
