On the basement level of the Anaheim Convention Center, sequestered from the creators, screaming fans and vlogging kids above, parents sit in a quieter, air-conditioned lounge set aside just for them. They may not be taking in VidCon’s festivities, but they’re all talking about YouTube.
Some parents sit on couches, chatting with friends and newcomers to the area, while others set up mini-offices with laptops and large coffee mugs. In one corner of the lounge, parents receive back and neck massages. It’s tranquil: the exact opposite of the energetic chaos ruling VidCon. These are parents tuned in to the YouTubers that their kids, many of them teenagers, devote their time to watching. They know and understand YouTube, and that means acknowledging that YouTube isn’t perfect.
“I am concerned, but I’m also a pretty active parent,” Wade, one dad who was using his time at VidCon to work while his 15-year-old daughter went to panels, told Polygon. “I’m watching what she watches and we talk about it. The thing I’m more concerned about isn’t what she’s watching, but she does watch a lot of content. I’m more concerned about how much time she’s on the computer watching YouTube. She keeps up her grades and everything like that, but I think in the last three months she’s figured out she has to curb her usage sometimes.”
Wade is like a lot of other parents in the room. He knows that YouTube has content issues. He keeps up with the news. Reports over the past year have found that YouTube’s algorithm can promote disturbing content targeting children, conspiracy theories and terrorist videos, among other problematic subject matter. Wade said he trusts his daughter to be vigilant of what she’s watching and, if something does pop up, they talk about it.
Other parents echoed a similar parenting style in interviews with Polygon.
“I was thinking about this exact topic this morning as we were heading over to VidCon,” Becky, a mother of two, said. “I’m a big news watcher. You hear, ‘Oh, you have to control what your kids watch.’ That’s really hard to do. Especially teens. You can say to them, ‘I want to see everything you’re watching.’ I’d be spending more hours in a day trying to figure all of this out.”
Becky is well aware of YouTube’s algorithm problems, but said she needs to trust that her children, like her 15-year-old daughter who came to VidCon, know right from wrong. Becky said she could exert a little more control when her kids were younger, but at some point, they understood the platform better than she and her husband ever could. It comes down to trusting that her kids will bring up disturbing content, and ask questions about what they’re watching, if they encounter it. Becky also likes to encourage her daughter to support positive creators.
“I’m hoping the ones she’s attracted to are doing appropriate things,” Becky said. “She’s not attracted to the YouTubers who are doing bad things.”
Some parents, however, like Julie, want YouTube itself to do a better job of addressing safety concerns.
“I do have concerns, I really do,” Julie said. “I don’t want her to stumble onto videos that are destructive and negative — and I know it’s going to happen. We try to talk about this stuff and she knows what she’s supposed to stay away from. I’m not sure what YouTube could be doing better, but there’s probably something.”
Many of the parents Polygon spoke to noted that their children are teenagers and they can’t control what their kids watch. They did, however, speak to general concerns that YouTube is a breeding ground for any type of content, adding that it’s worrying not to know what their kids may stumble upon.
It’s a subject that YouTube has addressed in the past. CEO Susan Wojcicki told a group of brands and journalists gathered at Radio City Music Hall last month that, “With openness comes challenges, as some have tried to take advantage of our services.”
“It’s incredibly important to me and everyone at YouTube that we grow responsibly,” Wojcicki said. “There’s not a playbook for how open platforms operate at our scale.”
The company is in the process of hiring 10,000 additional employees to build out its moderation and safety teams in an attempt to help fix these problems.