I am a zillennial. Born on the border of Millennial and Gen Z; too early to get TikTok, too late to qualify for a pandemic stimulus check. And look, someone has to say it: Computer labs are antiquated.
My earliest years in school involved moving to and from computer labs throughout the week, my class following the painted lines of our elementary school and struggling to remain quiet the way children do. But by the time I graduated high school in 2016, the role of the computer lab had diminished, its uses relegated to standardized testing and Adobe or Autodesk subscriptions. The history of the computer lab is tied up in gaming as much as it is personal computing, though, and that transition had an impact on not just what we play, but how we play.
I used my first school computer in kindergarten, and individual desktops would remain a standard feature in my classrooms. What I did on that first desktop I mostly can’t remember. There were educational games and a lot of coloring (I loved that). Then, in the first few numbered grades of elementary school, computer games became something we played after school. World of Sand, a sort of chemistry sim where pixelated elements pour in from the top of the screen and interact with one another in the petri dish of your desktop window, was a favorite. Space Invaders was a mainstay, and in fifth grade we would take turns playing Cube Runner and other Flash games during lunch in our classroom, which by then had three desktops in the back.
Smartphones weren’t quite a thing yet, mind you. My fifth grade teacher had a newly released iPhone, so new it had no model number appended, but my classmates and I had flip-top and sliding-screen cell phones, if any devices at all. What more of us did have were handheld game consoles, the Nintendo DS a popular pastime on summer camp transit. This was the era of Diamond and Pearl, of sharing a Mario Kart DS cartridge via DS Download Play, of GameSharks at Walmart. Still, the DS wasn’t exactly common. It was expensive enough that only kids who already played games had one, and unlike home consoles, handhelds were less likely to be shared across normative lines of who was considered a “player” or, as it may be, “gamer.”
In the computer lab, though, anyone could play. And when I first went to middle school in 2009, I often found myself in the library when there was nothing else to do. Kids congregated around tables with their friends; others sat in a corner of the library where a row of computers wrapped around itself. Games, and the unblocked sites you could find them on, spread by sight and by word of mouth. I want to emphasize that this was a communal space to game, where people saw what you played and that you gamed. These were almost all browser-based games, mostly puzzles and platformers. The more tech-savvy of us, though, figured out how to set up multiplayer matches over LAN in games like the Flash-based Scorched Earth clone Tanks and the unofficial 3D light cycle game Armagetron Advanced.
With nostalgia (and ignoring every other part of middle school), this is an almost idyllic memory of play. A time before any of us knew what ludonarrative dissonance was, when Minecraft was an incipient sandbox, and “the cloud” was hard to wrap your head around. But by seventh grade, I stopped going to the library in the morning.
At some point in sixth grade, I got an iPod Touch. A few of my friends had them at the time, first- and second-generation devices with accelerometers, headphone jacks, and 8 GB of storage. These became portable windows to the internet and pocket gaming machines we could play while waiting in a hallway in the morning, for five minutes before class, for the final bell to ring, or on the bus on a field trip.
Many of my classmates’ families had desktops of varying degrees of sophistication at home, but my house didn’t have Wi-Fi until I was in eighth grade, which meant I could only download Crazy Penguin Catapult or the classic indie shmup Space Deadbeef at home over a wired connection to my family’s desktop. Early Wi-Fi connections at school, in the public library, or at the mall weren’t going to cut it.
Eventually the iPod Touch became common, and kids at my middle school even started to bring iPhones to class as well. And because of their multipurpose nature, these devices were taken up by people who, as in the computer lab, weren’t already playing games. While Apple and an endless number of industry ex-execs with mobile publisher startups positioned the device as a threat to big publishers like Sony and Nintendo, that competition never really materialized. In part, that’s because this was never about “gamers,” but all the other people who play games anyway. And not unlike the arrival of consoles in the home after the decline of arcades, the iPod Touch saw gaming move again from a communal environment — the computer lab — to an individual experience on school buses or algebra class desktops, stuffed inconspicuously into backpacks.
The iPod Touch, described by Kotaku’s Ari Notis as a “herald” of the mobile gaming revolution, presaged the explosion of the medium over the past decade. But it was ultimately a transitory device. Over the following years the platform would be quickly replaced by the even more multipurpose smartphone, and in my school we frequented computer labs less and less often too. Home computers were just an expectation for homework by the time I entered high school, and my state even made virtual school classes a required part of our curriculum.
As the computer lab was lost, so too were many of its games. Websites forgotten, servers shut down, tech unsupported. The same would happen to the iPod Touch’s library when the App Store dropped support for 32-bit apps. In May 2022, Apple even announced it would stop production of the device itself. If the iPod Touch helped to kill the computer lab, or at least its cultural impact on gaming, we’re now a generation removed from that schism. But there is an addendum to the computer lab’s role in both education and gaming.
You may have figured out I grew up in an affluent area of the United States. iPhones for middle schoolers in 2010 probably gave that away. The adoption of new technologies at school and in the home is greatly determined by wealth, and I was surrounded by it in the south Florida suburb my parents moved to in the ’90s.
In college I volunteered with a youth literacy program in Parramore, a historically segregated neighborhood of Orlando, Florida. I worked with elementary and middle schoolers in the neighborhood’s only school, serving the youngest kids who still count as Gen Z. This was the time of the Nintendo Switch’s rise and of Fortnite, when for many people a laptop may have replaced both a desktop and even a TV at home.
I worked across the street from their school, and I visited the cafeteria after class one day to pick up our students. Orlando’s public school system could not be more different from my home’s. I’m not sure what their computer labs looked like, if their school still has or ever had them, but many of the students had school-issued touchscreen laptops on which they did homework, messaged each other, and of course played games (or watched people play games, as it were). Huddled around cafeteria tables, the students found in that tech access to play that wasn’t available to them elsewhere. And while these were still more individualized experiences, the students could now easily play with each other online, stay in touch at home, and perhaps even find virtual connection in comment sections and chats.
In that sense, the spirit of the computer lab lives on.