Ellie looks at herself in the mirror in The Last of Us Part 2. Image: Naughty Dog/Sony Interactive Entertainment via Polygon

The Last of Us 2 epitomizes one of gaming’s longest debates

Oh no, I broke the blood oath and wrote about ‘ludonarrative dissonance’

Chris Plante co-founded Polygon in 2012 and is now editor-in-chief. He co-hosts The Besties, is a board member of the Frida Cinema, and created NYU’s first games journalism course.

The Last of Us Part 2 punctuates one of the longest, strangest debates in video games: the 13-year discussion of ludonarrative dissonance.

The term references the disconnect between what players do in a video game (from the Latin ludus, meaning “play” or “game”) and the story that the game tells (narrative). People were discussing this idea under different terms long before the phrase exploded in 2007, thanks in part to an oft-cited blog post in which game designer Clint Hocking coined it. After that, big-budget video games collectively calcified around its central dilemma. This was the year BioShock and Uncharted debuted. Critics cited games like these as evidence that the medium was “growing up,” while also acknowledging that ludonarrative dissonance was the messy side effect of this flat-footed quest for maturity.

The phrase became a buzzword, appearing in game developer panels and sparking debate on video game writer listservs. Like so many academic terms, it spiraled onto social media, losing its context and becoming a quick insult for violent games that aspired to be high art but fell short. But the core dilemma — How do game makers marry story and play? Should they even try? — never went away.

Big-budget video game studios of the late 2000s wanted to tell serious, adult, and human stories. You know, the types of stories that appear in award-winning films and books. But they were still making games with the dominant “verb” of that generation and this one: shoot.

A huge PC gaming workstation in The Last of Us Part 2. Image: Naughty Dog/Sony Interactive Entertainment via Polygon

In video games, shooting stuff has been a beloved dopamine hit for nearly four decades. Point at something, pull the trigger, and watch that something explode, dematerialize, or ragdoll down a flight of stairs. Shoot and kill. Cause and effect reduced to its simplest form.

Early 3D first-person shooters, from Doom and Rise of the Triad to Unreal and GoldenEye 007, found tremendous success. Because so many of the best game designers were making shooters, the genre rapidly improved, locking AAA video games into a self-perpetuating loop: shooters became the most polished games, so they sold better, so publishers greenlit more and better shooters, which sold better still.

In the 1990s and 2000s, game publishers built all sorts of shooters. First-person. Third-person. Shoot-’em-ups. Shooters with campaigns and multiplayer. Hell, even puzzle games got guns. But by 2007, critics began to express something like shooter fatigue. That year, a majority of gaming publications awarded game of the year not to BioShock, Portal, Call of Duty 4: Modern Warfare, or Mass Effect. They gave the honor to Super Mario Galaxy.

Despite the unprecedented success of the shooter genre, the creators of shooters seemed similarly burned out. They started telling serious stories about complicated heroes and heroines, stories that ignored the fact that the protagonist had slaughtered hundreds of people along the way.

And that’s why, in 2007, game critics could not stop talking about ludonarrative dissonance.

Nathan Drake holding an assault rifle in Uncharted: Drake’s Fortune. Image: Naughty Dog/Sony Interactive Entertainment via Polygon

The Last of Us Part 2’s studio helped launch the debate

Uncharted: Drake’s Fortune, created by Naughty Dog — the same studio that went on to develop The Last of Us Part 2 — became one of the poster children of this bizarre moment, in which games aspired to bigger things while still bearing the violent albatross of the shooter genre. Its protagonist, Nathan Drake, is a lovable, goofy treasure hunter. Except that in the very first moment we meet him, he reveals his other talent: cold-blooded killing.

Drake and his companion, journalist Elena Fisher, uncover a treasure in the middle of the ocean, only to be immediately surrounded by a fleet of pirate boats. Elena suggests they contact the authorities. Drake explains they’re searching for treasure illegally. So he pulls out his old friend: a big-ass handgun. He hands a bonus gun to Elena, who has never handled a gun but, coincidentally, is a great shot.

The first time we control Drake, it’s to slaughter a couple dozen humans, setting the tone for the entire series. We encounter a dissonance between the story — fun treasure hunters — and the gameplay — white guy who travels to foreign lands and wholesale slaughters dozens, if not hundreds, of humans who stand in his way. (This actually aligns with the real history of treasure hunting, but the game never digs into that.)

The contrast between Drake the treasure hunter and Drake the serial killer was so stark that it became something more than a punchline. It was a word of warning. For a beat, game creators across the spectrum, from indie to AAA, appeared to have correctly diagnosed the problem. To tell adult stories, they’d need more and better verbs. The action would need to better align with the story.

After hundreds of blogs, Twitter threads, and essays published in what remained of game magazines at the time, critics tacitly agreed to never mention the words “ludonarrative dissonance” again, but here I am, breaking the blood oath. The challenge of ludonarrative dissonance never went away; it just shifted from something critics discussed into a riddle many developers are still attempting to solve.

Some indie game makers cut violent actions from their games altogether, leading to a spate of “walking simulators” like Dear Esther and Proteus, first-person games more interested in the space around the player rather than what they do inside that space. Designers who had worked on the BioShock series left to create Gone Home and The Blackout Club, a pair of games that retained the tension and mystery of their AAA predecessors, while showing the ways in which stories could bloom when guns got cut from the equation.

But for AAA studios, the allure of violence and its financial security was irresistible. At the end of the day, publishers decide which games get greenlit, and they answer to a board that expects profits. Guns make money.

Soldiers walk past a giant pair of eyes in Spec Ops: The Line. Image: Yager Development/2K Games

Should violent video games narratively justify their obsession with violence?

In the early 2010s, big-budget games, unable to move the action closer to the story, moved the story closer to the action. In other words, game designers made “mature” games about violence. Games like Spec Ops: The Line forced us to commit battlefield atrocities, like dropping white phosphorus on civilians, and then wagged their finger at us for ... playing the game they designed? After the credits rolled, we could play a multiplayer mode that let us commit all the murder we wanted with none of the cutscene-induced guilt. Indie games took their shot at this too, most notably the Hotline Miami series.

Some of these games did a fine job highlighting the medium’s fetishization of violence. Plenty of others mistook moral ambiguity for profundity. Big-budget video game storytelling was largely treading water by this point, being produced, in part, by designers who wanted to create art but were paid to make hyper-realistic machine guns — and also in part by people who just wanted to make badass kill animations and not worry about a big message. As games grew, so did teams, and suddenly squads of hundreds (even thousands) of people were creating games, many of them with conflicting ideas of what those games should be.

As a result, these self-aware violent video games still never fully aligned the action with the narrative. Which is to say, despite all the hand-wringing, these games were first and foremost “fun,” the gameplay still emphasizing the pleasure of pointing at a target and spewing hot lead.

Ellie kneels over a dead body in The Last of Us Part 2. Image: Naughty Dog/Sony Interactive Entertainment via Polygon

The Last of Us Part 2 is the culmination of this decade of big-budget games interrogating dissonance. The developers at Naughty Dog, the creators of Uncharted, have finally bridged the gap between story and action, dragging the story kicking and screaming and gurgling on its own blood to align with what you actually do in their games: kill people. The result is surreal, an expensive narrative experiment depicting what would actually happen if a real human being behaved like a video game character.

You play as Ellie, a young woman on a quest for revenge in a post-apocalyptic Seattle. The creators imagine a dystopian America in which survivors have divided into warring factions, each convinced its side is the good guys, each willing to commit horrendous acts of violence to protect itself. As Ellie eviscerates dozens of humans who cry for the help of a friend or beg for mercy, the story reveals these people aren’t as bad as Ellie once thought — that their motives are just as valid and complicated as her own.

Ellie can’t change. Not because this is Greek tragedy. It really isn’t. I say that as a compliment! Storytelling has dramatically improved since Aristotle scribbled down the Poetics, and The Last of Us Part 2’s writers begin the game with a handful of appealing threads about generational divides (made literal by the gap between those who lived for decades before the apocalypse and those who were just kids when the world changed) and the choice to build a family in a time of unknowable danger. These stories are the stories we need right now, and for a moment, it seems Ellie might just grow up and live a life that isn’t centered around heavy weaponry. But whenever The Last of Us Part 2 starts to be about something bigger, that thread is flattened by its relentless, suffocating violence.

So no, Ellie can’t change. She can’t change because AAA games can’t change. Let’s say Ellie learns her lesson: that violence begets violence, and that to save the world and herself, she must put down the gun. What would she even do? Literally, what would a AAA game allow her to do? AAA game design is built and marketed around killing. So I suppose Ellie would shift from killing humans to something morally simpler, such as killing the zombie-like baddies that lurch about her world — which, while less morally mucky, is no less predictable.

An abandoned PlayStation Vita lies on a cement floor in The Last of Us Part 2. Image: Naughty Dog/Sony Interactive Entertainment via Polygon

Thirteen years ago, critics and designers imagined games would no longer have ludonarrative dissonance, that the stories video games want to tell would align with the actions they demand we commit. But if this is the result, then you know what? I’m cool with dissonance. I’ll take violent games that strive for fun and don’t pitch any greater meaning over violent games that seek to justify their violence. I don’t need more stories asking me why I love to kill things in video games, because the answer is simple: It’s what publishers sell me. What I want most, and what The Last of Us Part 2 attempts to be in brief moments, are games without violence. Do the creators truly believe their story captures how people would behave, that we’re all a catastrophe away from forming tribal murder squads? Or do we keep getting stories like this because it’s what video games, as we understand them, allow? Until we have an abundance of AAA games that don’t hinge on violence, we can’t know for certain.

The Last of Us Part 2 suggests violence is inevitable. Sadly, that appears to be true in AAA video games.