
YouTube Kids has been a problem since 2015 — why did it take this long to address?

‘It’s not a safe place for children to explore’


Over the past few weeks, a series of reports has revealed that YouTube is plagued by problems with children’s content. The company has since ramped up moderation to fight the wave of inappropriate videos, but this isn’t the first time YouTube has been in this position.

When did it start?

On Feb. 23, 2015, YouTube announced YouTube Kids, a stand-alone app built for children and child-appropriate entertainment. The idea was to make YouTube a safer platform for parents, who didn’t want their children using the main site unsupervised. The initial blog post about the Kids app mentions that “parents can rest a little easier knowing that videos in the YouTube Kids app are narrowed down to content appropriate for kids.”

The app also included parental controls, such as the ability to remove the search option so that children can only access “just the pre-selected videos available on the home screen.” The Kids app, according to Shimrit Ben-Yair, product manager for the YouTube Kids group, marked the “first step toward reimagining YouTube for families.”

Less than three months later, in May 2015, the Campaign for a Commercial-Free Childhood, a coalition of children’s and consumer advocacy groups, complained to the Federal Trade Commission (FTC) about content they called “not only ... disturbing for young children to view, but potentially harmful.”

By using popular characters like Frozen’s Elsa and Spider-Man, YouTubers are able to lure children into offensive videos featuring their favorite characters. While at first these videos seem normal, they soon show those same Disney princesses and superheroes participating in lewd or violent acts. YouTube’s search algorithm makes it easy for children to fall into gruesome playlist traps full of this kind of content, as uploaders choose titles and thumbnails designed to slip past YouTube’s filters and make the videos appear safe for kids.

The complaint lists a number of issues that Josh Golin, the campaign’s director, and other advocates discovered in the YouTube Kids app early on. These include:

  • Explicit sexual language presented amidst cartoon animation;
  • A profanity-laced parody of the film Casino featuring Bert and Ernie from Sesame Street;
  • Graphic adult discussions about family violence, pornography and child suicide;
  • Jokes about pedophilia and drug use;
  • Modeling of unsafe behaviors such as playing with lit matches.

A YouTube representative told the San Jose Mercury News after the complaint was filed that, when the company was working on the YouTube Kids app, it “consulted with numerous partners and child advocacy and privacy groups,” adding that YouTube is “always open to feedback on ways to improve the app.”

But not much changed. A report from The Guardian in June 2016 pointed out that the third most popular channel on YouTube at the time was “Webs & Tiaras,” which featured content targeted at children. The channel, according to reports, starred an assortment of adults in superhero costumes or princess attire performing more mature acts.

The channel’s content was questionable, but mostly understood to be acceptable. Bad actors who wanted to piggyback off the channel’s success began posting similar videos laced with sexual imagery and disturbing themes. In 2016, Phil Ranta, a spokesperson for the channel, told The Verge it wasn’t surprising this was happening.

“I think it’s natural that when something is as big as this [new genre] is, and they see people making millions of dollars a year, they will try almost everything: go cleaner, adding dialog, go sexier, or crazier,” Ranta said. “They kind of just need to exhaust those measures before they realize if they can stay in this game.”

One channel in particular, Webs & Tiaras — Toy Monster, used the Webs & Tiaras name to spread disturbing content under a trending association. Many of those videos have been deleted, but some remain on YouTube. The original Webs & Tiaras channel had been removed before this article was written.

YouTube is finally addressing the issue on its main site, making changes to the way it moderates content. YouTube CEO Susan Wojcicki said the company was expanding its moderation corps to more than 10,000 contractors in 2018, focusing them on “content that might violate our policies.”

“Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content,” she wrote in a recent blog post.

Almost three years after the app launched, YouTube has responded to a number of concerns raised by parents and media critics about content, both on the main site and in the stand-alone Kids app, that they find disturbing, violent or obscene. The company issued the following statement to Polygon:

Content that misleads or endangers children is unacceptable to us. We have clear policies against these videos and we enforce them aggressively. We use a combination of machine learning, algorithms and community flagging to determine content in the YouTube Kids app. The YouTube team is made up of parents who care deeply about this, and are committed to making the app better every day.

But questions regarding content on its independent Kids app — including whether future content will be curated, or if the app will feature fewer videos to allow for more human-led moderation — remain mostly unanswered.

A YouTube representative told Polygon that only five-thousandths (0.005) of a percent of content on YouTube Kids is considered disturbing and against the company’s policies, adding that once content is reported, the company takes strict action to remove videos and, in serious cases, entire channels from the app.

“YouTube is marketing this as a safe place for children to explore but it’s not a safe place for children to explore,” Campaign for a Commercial-Free Childhood director Josh Golin told Polygon. “We were really the first ones to raise this issue, and this is going back two-and-a-half years. What we’ve found in the interim [is] — it’s like Whac-A-Mole — so we pointed out videos as we saw them. Every video we named in our complaint came off [the app], but of course there were more.

“It’s a terrible way to build an app for children,” he said.

In the complaint filed with the FTC on May 19, 2015, Golin pointed out how YouTube’s search algorithm could be exploited even within a supposedly safe environment for children.

As users of YouTube Kids search for material, the app begins to recommend similar videos, as the “Recommended” function on YouTube Kids is apparently based on “Search” history. When we were conducting our review, YouTube Kids actually began recommending videos about wine tasting on its app for preschoolers, as the screen shot below indicates. Thus, the more inappropriate videos children search for, the more inappropriate videos they will be shown via the app’s “Recommended” function.

Golin and his colleagues weren’t the only ones who noticed that content targeted at children on the YouTube Kids app and the main site was problematic. In the past couple of years, multiple parent groups have sprung up on Facebook with the intent of learning how to navigate, flag and curate a safe experience for their children.

The mission statement of one Facebook group, “Parents Navigating YouTube,” states the group was created to:

Help white list YouTube content that is safe for our kids to watch without parental guidance. This will cover adult content but also objectionable content like white supremacy, sexism and things of that nature. We will also discuss problematic YouTube content so that we know what's out there and can be prepared to discuss it with our kids.

With little support from YouTube, these groups often act as volunteer watchdogs, going through the worst of YouTube and flagging it. A YouTube representative told Polygon that despite its own machine-learning algorithm being improved daily — learning what content is unacceptable for children — the team does rely on flags from parents to help address problematic videos and channels.

At a time when parents were working together on creating a new, improved system to keep an eye on what children see, YouTube was focused on other aspects of the app. The company chose to highlight other areas of the YouTube Kids app in 2016, including the fact that it could be viewed on the Apple TV and was compatible with YouTube Red.

It wasn’t until this year that YouTube began to invest heavily in preventative measures.

First and foremost, moderation.

Changes being made

Although YouTube has maintained that the problem lies with bad actors exploiting its main site, not the YouTube Kids app specifically, it’s still impossible for YouTube to guarantee that the Kids app is 100 percent safe. The company told USA Today that parents who want to be sure their children aren’t stumbling upon disturbing content should switch off the “recommended videos” option.

“Sometimes your child may find content in the app that you may not want them to watch,” a YouTube representative told USA Today.

Malik Ducard, YouTube’s global head of family and children’s content, told The New York Times these types of videos were “the extreme needle in the haystack,” and pointed to the algorithm’s machine learning and a lack of oversight as reasons these videos may have slipped through. Ducard also said YouTube Kids did not serve a curated experience, meaning parents were responsible for controlling what their children watch.

The app’s restricted setting gives parents the ability to disable the search function and prevent additional videos from the main site from flooding the app; once it’s on, it’s difficult for children to switch back to a more open, free-roaming setting. YouTube also makes it clear once parents sign up for the app that, despite the company’s best efforts, disturbing content created by bad actors may appear.

From YouTube’s perspective, perfect moderation is impossible: 400 hours of video are uploaded to the platform every minute.

But for Golin, current efforts aren’t enough. Golin told Polygon that it’s irresponsible for YouTube to treat its algorithm as a “big fishing net,” assuming that the algorithm will catch every bad video meant to exploit it.

“The entire premise of YouTube Kids’ app is wrong if you’re worried about child safety,” Golin said. “You can’t have an app that has millions and millions of videos on it, but that’s okay. Children don’t need an app with millions and millions of videos on it. They don’t need 20,000 videos of eggs on one app. What they need is a place where content has been vetted and safe.

“From a child standpoint, the problem is not fixable,” Golin said. “The YouTube model has created something which is so vast, but there are 400 hours of content uploaded every minute. It’s simply too big. People have been raising these issues for years; just visit any parenting forum and they’ve been talking about the fake Peppa Pig videos. It was only after the Medium piece went viral that YouTube started to take any proactive steps. To be clear, they took steps because advertisers were concerned, not parents.”

YouTube already went through an “adpocalypse,” in which big advertisers pulled out of the platform after finding their ads attached to videos filled with hateful content. So the company wants to avoid anything that would cause others to leave.

Part of YouTube’s plan is to increase human moderation and tweak its algorithm, “training machine-learning technology across other challenging content areas, including child safety and hate speech.” YouTube will also cut down on channels that receive monetization and advertisements attached to these videos. Since YouTube Kids also includes ads — many of which, Golin says, aren’t child appropriate — this will affect channels and videos on the platform.

What’s next for YouTube Kids?

YouTube Kids is a moneymaker; YouTube wouldn’t tell Polygon how much, exactly, but it does sell ads against the videos. Whatever it is, Golin thinks it’s enough that YouTube has no incentive to change its app. YouTube declined to comment when asked whether the company was going to curate its content and restrict the number of videos going forward.

“Their goal is to make money and unless there is enough of an outcry and there’s continued pressure on them, we’re going to see the same problems,” Golin said. “I don’t know that the problems are fixable. It would be great if YouTube came to the realization that these problems were fixable and made it clear [that if the company is not curating content] this is for adults who want to watch videos. I don’t have a lot of faith that they will get there on their own.”

A YouTube representative told Polygon that, despite reports, the majority of the problem lies on the main site, which the company will spend a large portion of its time addressing in the coming year. The representative said that expanded changes to its current policy were put in place to discourage inappropriate content targeting families on the main app and site; these changes, the representative confirmed, are supposed to ensure that age-gated content (flagged for an audience of 18 and older) doesn’t appear on YouTube Kids.

A YouTube representative also confirmed that content flagged on the main YouTube site is not supposed to appear on the Kids app. If a video does make its way to the app, a secondary screening takes place, and a team is in place at all times to moderate new videos flagged on the app.

Parent groups and watchdog organizations, like ElsaGate on Reddit and Discord, which keep an eye on nefarious channels and videos getting through YouTube’s system, are still raising questions about what’s next. A YouTube representative could not provide any more details at the time of writing.

With critics calling out YouTube for its slow response, among them News Corp CEO Robert Thomson, who called YouTube a “toxic waste dump,” the question now is how quickly the problem will be managed after the new year. Ducard and Wojcicki said the company is “working on ways to more effectively and proactively prevent this type of situation from occurring.”
