On Sunday, two of Pokémon Go’s biggest personalities woke up to find their YouTube channels gone. Both Brandon “Mystic7” Martyn and Nick Oyzon of Trainer Tips received content strikes from the platform due to what YouTube called “sexual content,” a violation that resulted in the termination of their accounts. Neither channel contained sexual content, and both were quickly returned to service.
“Waking up to find out that my channel has been terminated for violating guidelines and no information beyond that from @TeamYouTube,” Oyzon tweeted Sunday morning. “Would be great if someone could tell me what my family friendly Pokémon channel did wrong.”
Oyzon couldn’t get back into his Gmail account at that time for more information about the flag; Google also closed that down. But Martyn, who received nearly identical emails along with his channel ban, took to his other channel to explain why YouTube had flagged him.
In a video about losing his channel, Martyn explained that YouTube first issued him a warning about a single, seemingly innocuous video. This confused him, as his content is typically appropriate for all ages.
“But then it happened again to a separate video,” he said, “[and] ‘sexual content’ is what I got these strikes for.” He was accompanied on screen by the emails that YouTube sent him. One of them reads:
As you may know, our Community Guidelines describe which content we allow — and don’t allow — on YouTube. Your video ‘HIGHEST CP POKEMON YET IN POKEMON GO! Wild Dragonite! How Much CP Will it Be?’ was flagged to us for review. Upon review, we’ve determined that it violates our guidelines and we’ve removed it from YouTube.
Not only did YouTube claim that Martyn’s videos included “nudity or sexually provocative content,” but also that the offending content explicitly involved minors. Neither statement was true.
Martyn and Oyzon contacted YouTube to argue against their terminations, and YouTube reinstated their accounts before the end of the day. A representative for the service told Polygon that the YouTubers were taken down “mistakenly.”
“With the massive volume of videos on our site, sometimes we make the wrong call,” a YouTube rep said in a statement. “When it’s brought to our attention that a video or channel has been removed mistakenly, we act quickly to reinstate it. We give uploaders the ability to appeal these decisions and we will re-review the videos.”
What led YouTube to make such a wrong call in the first place, terminating two major users whose content primarily focuses on a family-friendly game? It has to do with the titles of their videos.
Each of the videos involved included the term “CP” in the title. While CP is a common term in the Pokémon Go community, referring to a Pokémon’s “combat power,” it has a completely different meaning in other circles: CP is also common shorthand for “child pornography,” which YouTube has been working to stop from spreading across the platform with its algorithmic reporting systems. Those efforts have been ramping up since disturbing events like a YouTuber’s arrest on child pornography charges in 2017 and widespread reporting on the seedy underbelly of kids’ YouTube content overall.
In a video postmortem uploaded after his channel was reinstated, Oyzon called out YouTube for how the reporting algorithm lacks context. He described it as a machine-learned security system that needs better controls.
“If you’re this algorithm, and you see a video with ‘how to get’ and ‘CP’ in the title, well then, yeah, maybe the algorithm thinks it’s done a good job and that my channel needs to be shut down entirely,” Oyzon said.
He hammered home that “at no point does it seem like any human watched the video or read the title of the video or skimmed through the video to verify what this algorithm proposed or claimed,” referring to his reported video, a 2016 tutorial on catching high-CP Pokémon. “They just left it to the AI, the algorithm, to decide.”
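The failure mode Oyzon describes can be sketched in a few lines. YouTube has not published its moderation logic, so this is purely an illustrative toy, with a hypothetical blocklist and hypothetical context words: a filter that matches tokens without any surrounding context will flag “CP” in a Pokémon Go title, while even a crude context check would not.

```python
# Illustrative sketch only; YouTube's actual moderation system is not public.
# Shows how a context-free keyword match misfires on Pokémon Go titles,
# where "CP" means "combat power," not anything illicit.

FLAGGED_TERMS = {"cp"}  # hypothetical blocklist entry


def tokenize(title: str) -> set[str]:
    """Lowercase the title and split it into word tokens."""
    cleaned = title.lower().replace("?", " ").replace("!", " ")
    return set(cleaned.split())


def naive_flag(title: str) -> bool:
    """Flag a title if any blocklisted token appears, ignoring all context."""
    return bool(tokenize(title) & FLAGGED_TERMS)


def context_aware_flag(title: str) -> bool:
    """Suppress the flag when obvious gaming-context words appear in the title."""
    safe_context = {"pokemon", "pokémon", "combat", "power", "dragonite"}  # hypothetical
    tokens = tokenize(title)
    return bool(tokens & FLAGGED_TERMS) and not (tokens & safe_context)


title = ("HIGHEST CP POKEMON YET IN POKEMON GO! "
         "Wild Dragonite! How Much CP Will it Be?")
print(naive_flag(title))          # True: "cp" matches, no context considered
print(context_aware_flag(title))  # False: "pokemon" context suppresses the flag
```

Real systems use machine-learned classifiers rather than blocklists, but the underlying point is the same: a signal like “CP” plus “how to get” means nothing without the surrounding context of the channel and the video itself.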
Right after YouTube brought his channel back, Oyzon tweeted a similar message:
In case anyone at @TeamYouTube is taking notes on today's mishap, CP stands for Combat Points. I'm on board with fighting back against inappropriate content, but your algorithm needs a lesson in CONTEXT.— Nick // Trainer Tips (@trnrtips) February 17, 2019
Also, just to reiterate, MANUAL REVIEW BY A HUMAN BEFORE TERMINATION pic.twitter.com/qHLP5GGe9J
Oyzon’s postmortem video also goes into detail about the recent proliferation of sexually exploitative content featuring kids on YouTube. He linked to an in-depth video explainer on the subject, created by user MattsWhatItIs, which has accrued nearly a million views since it went live yesterday.
According to its community guidelines, YouTube uses “a combination of automated systems and human flagging” in order to identify and remove inappropriate content involving minors. Since 2017, the platform’s increased efforts against sexually explicit material featuring minors, or pedophiles interacting with videos featuring or uploaded by children, have left its content blockers on high alert.
Perhaps that system just needs more fine-tuning.
Either way, following their terminations, Oyzon and Martyn both took aim at YouTube for threatening the careers of legitimate content creators over trigger-happy, automated attempts at curbing a more serious problem.
“In the process of fighting back against [sexually explicit content], channels like mine and creators whose entire livelihoods rely on uploading content to YouTube have had their channels terminated, because some sick people are watching videos of children,” said Oyzon.
Martyn sees this incident as just the latest in a pattern of YouTube’s trials and errors in protecting its users — without harming creators.
“They’ve been really not doing good, and this is a beautiful example of that,” he said in his video. “This is people’s livelihoods. People survive off of YouTube — this is our full-time job.”