Back in 1996, at age 10, I played a computer game at a friend’s house called Spycraft: The Great Game. In the game, you play as a CIA operative investigating an assassination plot; to mislead a suspect during an interrogation, you have the option to doctor a photograph. The process blew my 10-year-old mind — so much so that I’ve remembered all these years how powerful that minigame felt. Although it was blurry and pixelated, the photo editor that appeared in Spycraft was a bit like what Adobe Photoshop would one day become. In 1996, it felt like the stuff of high-tech espionage and trickery. In 2023, it’s utterly mundane. It isn’t difficult or expensive to alter a photograph — not anymore. Anyone can do it, and as a result, we have all come to accept that we cannot trust any image we see.
Deepfake technology has already proven that we can’t trust video or audio recordings, either. And the prevalence of generative artificial intelligence has only made creating such deepfakes easier. We all need to get used to this new reality — and fast.
Genna Bain, the wife of the late YouTuber John “TotalBiscuit” Bain, posted on Twitter last week about a new concern she faces thanks to advancements in AI tech: “Today was fun. Being faced with making a choice of scrubbing all of my late husband’s lifetime of content from the internet. Apparently people think it’s okay to use his library to train voice AIs to promote their social commentary and political views.” In response, she received sympathy and pleas from her husband’s fans to preserve his online legacy.
But here’s the problem. There’s no practical way that Genna Bain, or anyone else in her position, could adequately prevent anyone from creating a deepfake video or audio clip of John Bain. Only a few minutes of audio are necessary to train an AI to mimic a voice; for a video deepfake, you mainly need footage showing multiple facial expressions and angles. So, if you wanted to prevent yourself from ever appearing in a deepfake, you’d need to delete every single visual and auditory record of your existence, which for anyone who uses a smartphone is so close to impossible that it may as well be. That’s even more true for a public figure like Bain, who guested on shows and podcasts that his wife doesn’t necessarily have the ability to remove, and whose face and voice have also already been saved forever on the hard drives of his fans around the world.
In the 1990s and 2000s, Photoshop made it possible for people to paste celebrities’ faces onto other people’s naked bodies, and in 2018, the public learned about how AI tech could be used to make video pornography that appeared to depict celebrities. Since then, the tech has only become more and more accessible. Googling “free deepfake app” will deliver tons of options for editing software. In 2023, this tech is probably still used to make porn of celebrities, just as it was back in 2018, but people nowadays are also using it to make celebrities say goofy shit. The internet has always run on porn, but it also runs on memes, so this tracks.
If you become famous enough, you will be dehumanized and objectified in this fashion, and your own fans will be surprised and confused if you push back against it. You also won’t be able to stop it. But this isn’t an article where I try to convince people to feel sorry for famous people. (That is also a losing battle, albeit one that I try to fight sometimes.) This is, instead, an article where I try to convince people not to trust the video and audio that they see and hear.
It took a long time for all of us to get used to the existence of Photoshop. Seeing an image that has been faked in a clever way can still mislead an intelligent, reasonable person into believing something that’s not true. It’s human nature to want to believe in something that looks real; after all, seeing is believing, right? All that said, I lived through the rise of Photoshop, and that’s why I believe we will adjust to this, too.
I don’t know what the future looks like, or what types of regulations we’ll need in order to address this situation. But I do know one thing: It is already here. We live in a world where this type of fakery is not only possible, but ludicrously easy. We now have to accept that it’s here and move forward into a reality where our skepticism expands to include even more types of trickery.
But hey, at least the memes are going to be great.