No one really knows what YouTube is, and Google seems to be pretty happy allowing advertising dollars to define what is and isn’t allowed on the platform. YouTube isn’t a television channel, nor is it an editorial outlet with rigorously enforced rules or standards. While the troublemakers get the headlines, the utility and value of what YouTube offers are often completely overlooked in the conversation about what Google does or doesn’t need to do to clean up its own act.
It’s this tension that makes using YouTube in a responsible manner so ridiculously difficult. YouTube is probably one of the best educational tools created in the last century. In the past month I’ve used it to learn where to find the battery in my car — in the wheel well, oddly enough — and to teach myself the basics of film editing. I used YouTube to learn how to get the most out of an inexpensive drum machine I picked up a few days ago. Want to learn how to produce music in general? YouTube has you covered.
I’ve tackled plumbing projects and hung pictures using tricks and techniques I learned on YouTube. And many of the videos in question were made by smaller channels — by people who are sharing things they’ve learned or taught themselves out of a passion for their subject, not as a way to rack up views and make millions of dollars. No matter your hobby or project, YouTube has a video that will help you along. I’m a huge fan, despite the negative press the platform has gotten in the past year.
But that bad publicity isn’t a case of the media being hostile to YouTube or influencers or anyone else. Google’s algorithm for related content seems to edge you into hostile territory, no matter what you watch or search for.
I don’t know how watching a few Star Wars videos has led YouTube to believe it should promote videos of people “owning” feminists in my feed. I don’t know why Google thinks that, for every idea I’m trying to learn more about, YouTube should show me an equal number of videos of people viciously attacking it.
No matter what you search for, YouTube seems happy to promote a video where some angry guy is explaining why women are ruining that particular thing. I have no idea why YouTube appears to prioritize videos about people reacting to a trailer ahead of the trailer itself, even when I search for it using specific words and the name of the studio. The algorithm is willing to shrug while suggesting that all the videos it’s offering are equally valuable. It’s a rabbit hole of extremism, and Google seems to highlight videos that move you from a reasonable position to one of anger or hatred. It’s bizarre.
“I wanted to write something about one of [Donald Trump’s] rallies, so I watched it a few times on YouTube,” techno-sociologist Zeynep Tufekci said during a TED presentation about social media last September. “YouTube started recommending to me and autoplaying to me white supremacist videos in increasing order of extremism. If I watched one, it served up one even more extreme and autoplayed that one, too.”
“Now, YouTube’s algorithm is proprietary, but here’s what I think is going on,” she continued. “The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they’re more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads.”
The algorithm is, to put it bluntly, complete garbage: it treats all attention as equal, and it seems to be easily manipulated by people who want to serve you hateful or willfully misleading content. We’ve moved from the six degrees of Kevin Bacon to figuring out how quickly YouTube can take us from a video about making music to something that’s racially or politically inflammatory.
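To make that complaint concrete, here is a minimal, purely hypothetical sketch (not Google’s actual code; every name in it is invented) of a ranker that optimizes only for predicted watch time. To such a ranker, a patient tutorial and an outrage rant are interchangeable; whichever is predicted to hold attention longer wins.

```python
# Toy illustration only: a made-up ranker that scores videos purely by
# predicted watch time, with no notion of whether content is helpful,
# hateful, or misleading. Nothing here reflects YouTube's real system.
from dataclasses import dataclass
from typing import List


@dataclass
class Video:
    title: str
    predicted_watch_minutes: float  # hypothetical engagement estimate


def rank_by_attention(candidates: List[Video]) -> List[Video]:
    """Sort candidates by predicted watch time alone, highest first."""
    return sorted(candidates, key=lambda v: v.predicted_watch_minutes, reverse=True)


if __name__ == "__main__":
    candidates = [
        Video("How to replace your car battery", 4.0),
        Video("40-minute rant about who is ruining cars", 22.0),
    ]
    for video in rank_by_attention(candidates):
        print(video.title)
    # The rant prints first: attention, not usefulness, is the only signal
    # this toy ranker optimizes.
```

Whether or not the real system resembles this sketch, the incentive it captures (keep people watching, whatever the content) is the one Tufekci describes above.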
There is gold on YouTube, and thoughtful use of the platform can make it an amazing tool to teach yourself or others skills or concepts that come across well in video. YouTube contains huge swaths of information, but no one at Google seems particularly concerned with policing the algorithm so you can watch the things that are helpful without being sucked into extremism. A tutorial is given the same weight as a polemic — or possibly even less weight, since people will sit and watch rants more readily than they’ll watch someone explain how to do something. And attention, not the videos themselves, is the service’s product.
Influencers and YouTube personalities make videos about themselves and for each other to try to keep that attention bubble from bursting. But from the outside, it looks like a scene made impenetrable by its own inward-facing nature. Content creators want the value of a huge audience without any responsibility attached, and they often want Google to sell ads against their videos without complaint, no matter the content.
Many of these personalities fooled themselves into thinking they were going independent, when really they joined the gig economy to work for Google — without any promise of reward, benefits or a say in how their content could be used or whether it would be monetized at all. The Paul brothers, for all their bluster, are more or less incredibly well-paid Uber drivers. And they’re getting all the press, while the rules seem to be bent in their favor.
For most of us, however, the stars of YouTube are the smaller channels releasing helpful content that teaches us how to learn or do something but makes very little money. And that’s a shame. Google could turn YouTube into the democratizing force for education and sharing of ideas that it seemed destined to become in its early days — if it paid attention to its algorithm or valued user experience over advertising dollars, at least in some limited way.
I want to learn how to fly a helicopter, for god’s sake, and I have no clue why that keeps leading me to videos about how women shouldn’t be Ghostbusters. But here we are.