When dealing with bad actors on YouTube, there’s “not a playbook for how open platforms operate at our scale,” according to CEO Susan Wojcicki.
During YouTube’s upfront event for advertisers tonight in New York City, Wojcicki spoke briefly about YouTube’s efforts to tackle major issues on the platform. Those issues include the spread of conspiracy content that YouTube’s safety team is trying to contain, disturbing children’s content, terrorist content, scams, and even problematic creators like Logan Paul.
“With openness comes challenges, as some have tried to take advantage of our services,” Wojcicki said. “It’s incredibly important to me and everyone at YouTube that we grow responsibly. There’s not a playbook for how open platforms operate at our scale.”
Wojcicki didn’t announce any new investments at this time, but highlighted measures the company has already started rolling out. YouTube still plans to hire an additional 10,000 employees to help with moderation and security. Those additional employees, along with the stricter threshold for creator monetization introduced in February, will hopefully help curb some of these problems, Wojcicki said. The combination of human review and machine learning algorithms remains the company’s go-to plan for the foreseeable future.
Automated flagging (videos removed without the help of human review) accounted for 75 percent of videos removed last quarter, according to the company’s recently released transparency report. The other 25 percent were handled by the Trust and Safety team, alongside the Trusted Flagger program. Wojcicki didn’t address other issues on the platform, like demonetization affecting creators, but said the company is thinking critically about everything affecting the community and platform.
“The way I think about it is, it’s critical that we’re on the right side of history,” said Wojcicki.