
Cory Doctorow on his drive to inspire positive futures

‘Being hopeful means believing that when things break down, we can rebuild them.’

The cover of Cory Doctorow’s novel Homeland Image: Tor Books

If you buy something from a Polygon link, Vox Media may earn a commission. See our ethics statement.

What does the future hold? In our series “Imagining the Next Future,” Polygon explores the new era of science fiction — in movies, books, TV, games, and beyond — to see how storytellers and innovators are imagining the next 10, 20, 50, or 100 years during a moment of extreme uncertainty. Follow along as we deep dive into the great unknown.

Polygon’s Futures series has been a long process of considering our real and imaginary possibilities. From the ways VR has failed to take off to the ways fantasy sex robots have taken off instead, we’ve considered both real-world technology and the fiction that helps us imagine where it’s going. For a final word on the subject, we turned to author Cory Doctorow, whose work has always lived in the junction between real tech and the very real options it represents.

Doctorow has been one of the most relentless and inventive authors turning actual technology into near-future fiction, particularly with the 2008 novel Little Brother, about a teenage revolutionary named Marcus who starts an underground technological movement after the Department of Homeland Security kidnaps and tortures him following a terrorist attack he had nothing to do with. A sequel, Homeland, followed in 2013, and the brand-new novel Attack Surface joins the series from a different point of view — that of a young government hacker working against Marcus and believing she’s protecting America in the process. I spoke to Doctorow via phone to ask him the question that’s been preoccupying us since we first started discussing Futures week: What goes into imagining a positive but plausible future right now?

This interview has been edited for concision and clarity.

The cover of Cory Doctorow’s novel Attack Surface Image: Tor Books

We’ve been talking a lot at Polygon about whether it’s possible for science fiction to model a positive future. Your earliest science fiction books felt utopian, but your recent books, especially the Little Brother series, are much more cynical and concerned about America. Has the way you think about technology and the possibility for a positive future changed since your early books?

I don’t know, Down and Out in the Magic Kingdom is definitely a complicated utopia, because it supposes that a non-monetary mechanism for allocating resources would just become money again, right? That it would just turn into another unequal rich-get-richer society. So it is, in some ways, a critique of the utopian idea of reputation economics.

Walkaway is about utopianism, in the sense that it’s a book in which crises are weathered. One of the things I recognized when I went out on tour with that book and started talking to people about it is that utopianism is not the assumption that nothing will go wrong. Being an engineer who builds a system on the assumption that it won’t break doesn’t make you an optimist, it makes you an asshole. That’s the thing that makes you decide we don’t need lifeboats for the Titanic.

Instead, being hopeful and utopian means believing that when things break down, we can rebuild them. One of the things we’re living through right now is people acting as though we have lost, as a species, the ability to weather big global crises, like we want to build the pyramids with Egyptian technology or something. Like it’s the practice of a lost civilization that we will never recover. To be an optimist, or to be utopian, is to believe that we can rise to challenges.

Not that challenges will be vanquished once and for all — even if you built a stable system where everything worked well, that system would be subjected to exogenous shocks. Imagine we had a really, really good world where we were enacting the Green New Deal, and then the pandemic came along. The resilience we would have gotten from a more pluralistic society that was confronting disasters would have been good, but we couldn’t have just motored along with the same systems, configured the same way that they were. After the pandemic struck, we still would have had to rise to the challenge. There would have been fear, death, all those things. To be utopian is to believe that when the machine breaks, it will roll to a graceful stop, and we’ll get it started again, instead of believing every machine we’ve built is destined to tear itself apart and shower us with shrapnel when it breaks down.

How do you feel about the more classic idea of utopian fiction? Do you think it’s possible to model positive outcomes in science fiction and still have a dramatic story?

Oh, totally! I’m actually writing a much more utopian novel right now, a book called The Lost Cause, that’s set after a successful Green New Deal. And it’s about truth and reconciliation with white nationalist militias. Any society that is your utopia will be someone else’s dystopia. And one of the things we keep learning is that old grudges are very hard to settle, and old grievances don’t really go away.

I just got a note from my kids’ school, letting Armenian kids know, “If you need some time off for counseling to talk about the war with Azerbaijan, we’re here for you.” I grew up going to Purim parties, where they would commemorate the story of the Jews’ expulsion from Persia 5,000 years ago. And when they would mention the name of the vizier who masterminded this, we would all swing a noisemaker to drown out his name. This is a grudge that is 5,000 years old. So there is no reliable, enduring mechanism that we have found to resolve old grievances. What we can do is find ways to peacefully coexist despite our grievances.

But any disequilibrium risks new flare-ups. We thought Germans figured it out after World War II, with the prohibition on Nazi symbols, a ban on the publication of Mein Kampf for 50 years, and all of that stuff. And then as the austerity crisis hit Europe, and xenophobia started to rise, Nazis started marching in Bavaria again. The most dystopian thing about this novel I’m writing now is my own ambivalence about whether we ever will have reconciliation.

I have educated liberal friends from Toronto who are second-generation Turkish immigrants who deny the Armenian genocide. They got it in their mothers’ milk, and you can’t even talk to them about it. All the trauma we’re going through now, all the divisions, all the unforgivable, unforgettable things that have been said and done, are specters that will haunt us, and that we will struggle with for a long time to come, even after we resolve the conflicts they represent. We’ll never — or maybe we’ll never be shut of them. I don’t want to say never, because this is a question I’m wrestling with. Maybe we can.

Science fiction is one of our most imaginative routes for picturing and popularizing alternate futures. Is there a way for it to help?

The cover of Cory Doctorow’s Little Brother Image: Tor Books

Totally. I just wrote a column about this. Daniel Dennett talks about “intuition pumps,” which is when you have a little thought experiment that helps you think about what you should or would do if something were to happen. And then when it happens, you’ve got a framework or story for negotiating it and for living it. Fiction is an intuition pump. Fiction is a thought experiment, for good and ill.

One of the things about pulp writers like me is that our stories turn on plot, right? You can get a lot of plot out of some pretty terrible theories of human action. If your version of “man vs. nature” is actually “man vs. nature vs. man,” where the tsunami blows your house down and then your neighbors come over to eat you, then you get a twofer. You get so much plot to work with. But the actual experience of people who’ve lived through disasters is that they’re the times when we rise to the occasion. Disasters are humanity’s best moments, when we sacrifice ourselves for others.

And yet we all have this intuition that when the crisis strikes, your neighbor is coming over to take your stuff. When the pandemic hit and I walked down my local shopping street for an evening constitutional, I saw lines stretching around the block for the gun stores. I was just flabbergasted, and aghast. Clearly these guys don’t think they’re such great marksmen that they’re going to shoot the virus particles, right? So what are the guns for? The gun is to shoot their neighbors. That’s the only thing you would buy a gun for in a pandemic, the only thing a gun is good for in a pandemic.

So these guys have the conviction that their neighbors are coming for them, and that they need to strap up so they’re ready. But if you ask them, “Are you coming for your neighbors?” They’re like, “No, no, I’m one of the good guys.” What an amazing piece of incredible luck it would be if 99.99% of people were sociopaths, barely held in check by the constraints of society, but you and everyone you know are just kind of flawed vessels who sometimes get it right and sometimes get it wrong. Like, that would be the most incredible non-representative sample of society for you to find yourself in.

So can speculative fiction be weaponized to help people see each other’s humanity? Is it irresponsible to feed that paranoia? I’m thinking of books like Stephen King’s The Stand, which help popularize the idea that if society breaks down, the murderous gangs will immediately take over.

I don’t want to fault King for writing The Stand. The reason we write those storylines is that they’re so much fun to write. They’re cracking yarns. They give us what Brian Aldiss called “cozy catastrophe” stories. Like Day of the Triffids, where you and your pals turn out to be the only good ones, and everyone else is a CHUD, and you find a farm and board up the windows. It’s a zombie story, basically. That’s the structure of every zombie story.

I’m more interested in stories where the conflict comes from people of good will, acting in good faith, who nevertheless cannot agree with each other about what they should do. People who strongly disagree with each other, and think one of them is doing more harm than good. Those conflicts are far more intense, because you have to reckon with people who share your goals, and still cannot agree with you about how to achieve them, and still think you’re wrong and worse than wrong, that you’re a danger. The way I shorthand this is, the only thing worse than losing an argument at Christmas dinner with your family is winning it, because then you just never speak to them again. In Attack Surface, in Walkaway, in The Lost Cause, I’m really trying to find stories of conflict between people who want the same things, but disagree so thoroughly about how to get it that they end up as enemies.

The Little Brother series in particular has always seemed educational — you go into detail about how security systems work and how to get around them, how to protect yourself and your privacy online and in the real world, how to approach and understand technology. Are you trying to give people the tools to make their own utopias?

A hundred percent. There’s a great white paper by this guy, Michael Weinberg, who used to be counsel at Public Knowledge, a pressure group in DC. He was writing about copyrights and 3D printing, and the paper had such a good title: “It Will Be Awesome If They Don’t Screw It Up.” The Little Brother novels and my other works are about the promise and the peril. They’re never just about one, right? They’re never Unabomber manifestos. They are calls for us to seize the means of technology, to seize the means of computation, and put it to work for the common good, instead of as a tool of reaction and control.

To the extent that it’s worked, there have been lots of technologists and human-rights workers and cyber-lawyers and other people who’ve approached me since Little Brother and Homeland came out, to tell me that the reason they got involved in the field was that Little Brother inspired them with the liberatory potential of technology, and frightened them about what would happen if it were to be subverted or denied. That is a very humbling honor to have received, that there are people out there whose work is infused with an ethical posture that came in part from my work.

But it’s also obviously the case that there are a lot of technologists who don’t let those considerations stay their hands. Someone is building bossware and stalkerware. Someone is building ad tech, and someone is building all the surveillance tools used by dictators, and so on. And the one thing I’m pretty sure is true about all of them is that the thing that inspired them to get involved with technology involved living through the wonderful stuff technology gives you, just the sheer pleasure of being able to crisply articulate what you want a computer to do, and then have it do that perfectly over and over and over again.

Or maybe plugging themselves into a network and reaching all the way around the world to find a community of people, or even just that one person, who shares your interests in that way that is so exciting and invigorating, to have found your people somewhere around the world. And yet the people who have experienced this tremendous benefit now spend their days and nights figuring out how to deny that benefit to others.

With Attack Surface, I’m hoping to reach not just the young people who grew up after reading Little Brother and are now adults trying to figure out what to do with their lives. I’m also hoping to reach some of those technologists, and to explicitly tell a story of redemption, about coming back from a series of compromises, each of which seemed reasonable enough in the moment, but which taken together, cause you to wake up one morning and realize you don’t recognize the person in the mirror anymore. We had 20,000 Googlers walk off the job last year. We’ve had people at big technology companies refuse to build drone technology, or censor search engines for China, or work on facial recognition tools for ICE. And we need more of our technologists to be having those moments.

If you think about other professions, like medicine, there are a lot of terrible things that doctors have done, and far more terrible things that doctors could do. But the thing that keeps us from having a Tuskegee every year, a crisis or scandal on that scale every year, is not merely the laws that prohibit it. It’s the normative discussion about what it means to be a doctor, what it means to be in service. And I wanted to address technologists who have let themselves be taken away from the ethical excitement that brought them there in the first place, the miracle of being able to empower yourself and others, and I wanted to bring them back to it, and say, “Look, there is an ethics to this stuff. And you knew it, and you know it now, and you don’t feel good about it. And here’s what you can do. Here’s how you can start to confront the moral debt you have accumulated by making these compromises one at a time.”

Your work has always been grounded in real-world technology and in life on Earth. Why has that been so much more of a focus for you than, say, the far future, or aliens in space?

The cover of Cory Doctorow’s novel Homeland Image: Tor Books

I want to frame these remarks by stating the obvious fact that stories about aliens are just stories about Earth. They’re allegories, and they’re always going to be, whether it’s Gene Roddenberry saying, “Let’s do Wagon Train in space,” or any of the other ways in which there are both obvious and subtle allegories between science fiction and what’s going on in the world.

But to the extent that my work differs from other techno-thrillers and science fiction, I think it’s because I try to deal with computers as they are, rather than as narrative conveniences. So much fiction — and to be frank, law — treats computers as empty vessels we can project our desires and fears into, without looking for the real capabilities and limitations of computers.

Little Brother started after I saw a stupid movie with my wife. I was ranting all the way home about how this movie was all about computers, and all of the details about computers were not just wrong, they were stupid. There were better, smarter ways you could use a computer in that story to make plots fall out of it, ways that were really engaging and interesting. I think about it in the context of Moby Dick. You could write a version of Moby Dick where you pretend that you throw one harpoon at a whale and it dies, and the rest of the story is about something else. Maybe that story would be good, maybe it wouldn’t. But what makes Moby Dick what it is is that Herman Melville super nerds out on the plot possibilities and the lived reality of this highly technical endeavor.

One of the things about Little Brother and the other books in its vein and its series is, they’ve had an enduring life. Little Brother came out in 2008, and people read it today as a contemporary story. It’s about online technology, but it almost predates social media. There is no social media in Little Brother. But people still read the book and pay attention to it and talk about it today, because of the underlying facts that make the Little Brother stories: The computers act as they act in computer science. The underlying theory of computing has been making very slow advances since the time of Alan Turing, and it shows no sign of accelerating. We get better at the engineering, but not at the theory. And computers are becoming more important to us every day, and we’re failing to come to grips with that in any meaningful way. Those facts make for enduring fiction. They also make for an urgent political and social situation.

This is where my activism and my fiction writing intersect, because there’s a real consequence to being terribly wrong about how computers work, in our policy and in our discourse, as computers become more and more central to what we do. We’re paying the price in all kinds of ways. Overnight, about seven months ago, we went from a world in which everything we did involved the internet to a world in which everything we did required the internet. And the policy we built up by not treating computers as they are — we’re going to go bankrupt on that debt. We’re in a bad place with it.