

Wikipedia won’t fix YouTube’s problems

But it’s not Wikipedia’s fault


YouTube CEO Susan Wojcicki wants to tackle the platform’s conspiracy theory problem by relying on Wikipedia to keep viewers informed.

The idea is simple enough: YouTube will identify videos about known conspiracy theories, and include a Wikipedia description that provides more information on the main topic that the conspiracy is concerned with.

“When there are videos that are focused around something that’s a conspiracy — and we’re using a list of well-known internet conspiracies from Wikipedia — then we will show a companion unit of information from Wikipedia showing that here is information about the event,” Wojcicki said at a South by Southwest talk yesterday, as reported by The Verge.

Wojcicki pointed to flat-earthers, people who believe that the Earth isn’t spherical — even though the ancient Greeks figured out the truth, and explorers Ferdinand Magellan and Juan Sebastián Elcano proved it centuries ago. It’s not up to YouTube to tell people what they can and can’t believe, Wojcicki said. But with small boxes sourced from Wikipedia, the video service will make an effort to keep people informed about what they’re watching. You can see examples below.

Wikipedia information cues as they appear on a YouTube video about the moon landing. Image: YouTube

Additional information from Wikipedia on a YouTube video about contrails. Image: YouTube

YouTube’s solution is plainly ridiculous. It’s also indicative of the company’s refusal to take a stand on political matters.

YouTube is offloading its responsibility to curb the distribution of dangerous content onto Wikipedia. The company’s executives don’t want to define YouTube as a media platform, which lets them disclaim ownership of the content that appears there — even though they also want YouTube to be viewed as a go-to place for news.

“If there is an important news event, we want to be delivering the right information,” Wojcicki said, as reported by BuzzFeed, before reiterating that “we are not a news organization.”

Even Wikipedia officials take issue with YouTube’s decision to source the site. Katherine Maher, the executive director of the Wikimedia Foundation, tweeted her own concerns about Wikipedia’s ability to keep pace with breaking news and about placing complete faith in Wikipedia — or any single information source. The foundation also noted that the arrangement is not a formal partnership.

“We don’t want you to blindly trust us,” Maher said. “Sure, we’re mostly accurate — but not always! We want you to read @Wikipedia with a critical eye. Check citations! Edit and correct inaccurate information! You can’t do that in a simple search result.”

YouTube’s answer to angry creators bemoaning an alleged content “purge,” and to criticism from academics and the media, is to display information from Wikipedia — a free encyclopedia whose articles can easily, instantly be edited by anyone in the world.

That’s not a good solution.

YouTube is still incentivizing falsehoods

YouTube is owned by Google — whose parent company, Alphabet, is currently valued at just under $800 billion, making it one of the wealthiest companies in the world — and the video platform has pledged to hire a team of more than 10,000 people to deal with persistent content moderation problems. It’s also a company that doesn’t want to take the next steps in correcting systemic issues. That’s why someone like Alex Jones, the despicable media personality who published conspiracy theory videos claiming that the Sandy Hook school shooting didn’t happen, can game the system for his own benefit.

Jones, like many other conspiracy theorists and propagandists on YouTube, learned how to take advantage of Google’s algorithms to ensure that his content is seen and shared. The trick is simple: Take a popular news topic — a school shooting, for example — then add a term like “crisis actor” to it, and publish content about it. Jonathan Albright, research director at the Tow Center for Digital Journalism at the Columbia Journalism School, wrote about this growing trend in late February, just after the shooting at Marjory Stoneman Douglas High School in Parkland, Florida.

“Mass shooting, false flag, and crisis actor conspiracy videos on YouTube are a well-established, if not flourishing genre,” said Albright.

That’s how a video accusing one Parkland high school shooting survivor, David Hogg, of being a crisis actor ended up at the top of YouTube’s trending page. Albright told BuzzFeed that YouTube is “algorithmically and financially incentivizing the creation of this type of content at the expense of truth.” It’s what makes human moderation — not just machine-learning algorithms, which can be easily gamed — so necessary.

“Journalists and affected parties (parents, survivors, first responders, etc.) are not only fighting the content on YouTube, they are fighting its algorithms — first at the ‘trending’ level and then again at the search and recommendation levels,” Albright said.

On one side of the problem is YouTube’s algorithm, which the company admits is flawed. On the other side is YouTube’s refusal to take a stance on the dangerous content being spread around its platform. The company still permits monetization on videos published by Jones’ InfoWars media company. One of them, “Must See Video! UK Gov’t Coddles Terrorists,” has two ads running: one at the beginning and another at the end.

YouTube executives aren’t taking a position on the content being watched by billions of people on their platform because they don’t have to. They want to appear unbiased, which is understandable when political conversations get involved. Robert Kyncl, YouTube’s chief business officer, expressed this exact sentiment in an earlier interview.

“We have four freedoms under which YouTube operates: freedom of expression, freedom of opportunity, freedom to belong and freedom of information,” Kyncl said, reiterating comments he made to YouTuber Casey Neistat in February. “They truly become our North Star during difficult times. [...] Our message is that we absolutely are leaning in to freedom of information and freedom of expression, subject to our community guidelines.

“We don’t intend to be on one side or another.”

There’s a difference, however, between someone talking about their conservative beliefs and someone spreading malignant lies under the pretense of news. This is where the company now finds itself: not wanting to take a stance, and relying on user-edited Wikipedia articles to try to outweigh its riskiest content.

It’s a YouTube problem, not a Wikipedia problem

People are quick to point to Wikipedia’s notorious “edit wars” as a reason that the online encyclopedia shouldn’t be trusted. I disagree.

I don’t think that Wikipedia articles about Sandy Hook or the Sept. 11 attacks are suddenly going to become a breeding ground for conspiracy theorists. Subjects like the moon landing and President John F. Kennedy’s assassination will probably have pretty concise and reliable information, but that doesn’t matter to conspiracy theorists on YouTube — or the people watching the videos.

Note how the descriptions will appear on the video pages. An excerpt of the Wikipedia article appears directly under the video, just above the title. It looks like it can’t be minimized or closed, but Polygon has reached out to YouTube for more information. It doesn’t matter that a Wikipedia article explaining that the moon landing was real appears underneath a video; simply by watching, viewers are still incentivizing those creators to keep producing misleading content.

Max Read, an editor at New York magazine’s Select All, tweeted about this after Wojcicki’s panel. Read doesn’t think people will engage in edit wars as a way of telling off YouTube executives, but does believe creators won’t think twice about continuing to make harmful content when YouTube continues to put ads on it and share it through its recommendation algorithm.

“YouTube has a problem with misinformation (and propaganda, and unwashed guys yelling about atheism, and videos of spider-man operating on elsa) because there are specific platform (and consequently financial) incentives for that content,” said Read.

YouTube executives aren’t trying to fix their problem with conspiracy videos, or the disturbing trend of hateful content, because it translates to easy views and monetization. They’re stalling discussions about how to handle content, looking to put the responsibility on anyone else but their own company.

YouTube is quickly becoming a dangerous platform that’s too big to handle, too big to moderate and increasingly looking too big to fix. Sen. Mark Warner (D-Va.), vice chair of the Senate Intelligence Committee, spoke at SXSW about the issues that platforms like YouTube, Twitter, Reddit and Facebook face regarding dangerous content made for the sole purpose of monetization and spreading lies.

Warner said that if companies don’t start dealing with these issues, they’ll see changes to the Communications Decency Act, according to BuzzFeed reporter Ryan Mac. Section 230 of the Communications Decency Act says, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

That’s why Wojcicki repeatedly says YouTube is not a media company. To an extent, it isn’t. But that doesn’t give YouTube license to let dangerous content thrive on its platform just because the company doesn’t want to ruffle the feathers of prominent figures. YouTube’s executives keep feeding us excuses and apologies, and it’s exhausting. Wikipedia entries are not a solution, and it’s time YouTube started looking into permanent fixes for its scariest problem.
