For about the past year, Ron Kerbs’ company has been working on a monitoring device that listens to online multiplayer chat and tries to offer a helpful heads-up to parents worried about their kids’ online behavior, or their safety. Kerbs understands he’s trying to thread a small needle: parents won’t find his solution, called Kidas, useful if it pings them every time someone says a bad word. So right now, there are some amusing false positives.
A couple of weeks ago, he says, a Kidas device in a test home picked up a troubling conversation among adolescents. No swears, but taunts and threats were exchanged. When this happens, a human analyst looks over the incident and determines whether the parents should be notified.
“The child was actually playing Bully,” Kerbs laughed. “We were basically alerted to a situation that happened in the game itself.” The player was in an Xbox Live party chat while playing the single-player game, and Kidas mistook the game’s scripted dialogue for a real confrontation online.
These are the kinds of misfires you’re bound to get when you’re building something as ambitious as Kerbs’ idea: a tool that helps keep kids safe and happy online, without wagging a finger at them or violating their privacy. It’s shown enough success, over a tight-knit beta in fewer than 100 Xbox households, to draw interest and support from big startup incubators, the latest being Comcast NBCUniversal’s Lift Labs. (Disclosure: NBCUniversal is a significant investor in Vox Media, which owns Polygon.)
Kidas, based in Philadelphia, is a 10-person startup taking up a cause that bigger players have embraced in smaller measures over the past year. Electronic Arts, this summer, introduced a new set of standards for both itself and its players, meant to create healthier online communities in its huge multiplayer games. The new PlayStation 5 has a feature that will record chat dialogue, to help moderators examine complaints of verbal abuse or other code-of-conduct violations. And just one week ago, Microsoft, Sony, and Nintendo co-signed a pledge to make video gaming safer, especially for their younger players.
Kidas’ technology is mainly for parents, specifically parents who don’t feel those companies’ statements are enough, or that appeals to the better angels of our online nature will do the job. Kerbs came to the idea for Kidas after his service in the Israeli military, where he developed threat detection solutions for the country’s Central Intelligence Unit. Originally, Kerbs told Polygon, Kidas intended to monitor social media traffic in general. But that was just too large a scope to manage.
“We decided to switch from social media, to gaming, [where] voice conversations are very hard to monitor,” Kerbs said. “Based on the current technology, we knew from our past experience, that we have the tech skills to do that. We decided to target something that no one else has done before, and monitor those voice conversations in a very accurate way and alert parents about those cases.”
“When Ron first came into our program, Kidas wasn’t just about gaming,” said Danielle Cohn, Lift Labs’ vice president. “It was more of a general product. What he discovered was that there was a real need in gaming. […] Sometimes, when you create a product, and your startup, you’re trying to do too much at once. I think Ron is being smart about this, where he’s starting in gaming. And if he can go really deep in gaming, and perfect that and test it there, then it can have a wider use.”
When asked what makes Kidas go, Kerbs mentioned concepts like neural networks and machine learning. Obviously, it would need to rely on some very sophisticated and ever-changing code to differentiate between kids innocently trash-talking and something that would definitely bring a parent into the room — threats of violence, demeaning language, or solicitations from a predator.
Importantly, he doesn’t intend for the device to be a set-it-and-forget-it nanny in the playroom. Kidas as a product assumes an involved parent is having conversations with their child. And Kidas itself gives them much more than a chat log of keywords tripping its sensors. “We’re not going to alert parents every time someone is flaming someone else in a chat,” Kerbs said. “We are going to alert parents about repeated bullying, and cases in which their child is even targeted in bullying, and [if it’s] affecting him or her psychologically, and even changing their esteem because of that.”
Kerbs wouldn’t say how many households are in Kidas’ ongoing beta (only that there is a lengthy wait list). But the number can’t be large, given the depth of interaction Kidas relies on to shape its alerts and notices so that parents get genuinely useful information. Cohn, for Lift Labs’ part, sees Kidas accomplishing that goal already. “What I’ve been most impressed with is the feedback that he’s received from some of the parents who have tried this,” she said. “It’s still in its early days, but it’s certainly an exciting space that is also going to bring some peace of mind.”
Kidas at present has some obvious limitations — it only works with an Xbox One or Xbox Series X, for starters. The dongle solution, even if it is built with off-the-shelf hardware, is also necessary because Kerbs and Kidas don’t have access to the closed environment of Microsoft’s Xbox Live network. Ideally, he says, one day they will, and Kidas will be a piece of software that runs on the console itself.
Just as important, Kerbs said, is presenting Kidas as a helper, not as something whose presence tells either the parent or the child that video gaming is, on its face, a bad or dangerous thing.
“Our goal is not, tell [parents], ‘OK, he shouldn’t play Xbox anymore, or he shouldn’t play PlayStation anymore,’” said Kerbs, himself a fan who grew up loving video games. “Our goal is to provide them with tools to talk about this topic with their child, make him or her understand. ‘What are the right tactics to deal with those people?’”