The Rising Case for a Videogame Canon

Recognizing the cultural needs that arise in Brendan Keogh’s book “The Videogame Industry Does Not Exist”

In the quest for media literacy, common ground is crucial. The concept of being “widely read” is a direct response to the impossibility of reading even a fraction of every book ever written. It allows space for intellectual nuance; a conversation between two erudite individuals with disparate personal-reading catalogues is possible when both parties are familiar with a wide range of different books, even though no one is required or expected to have deep knowledge of every single genre or author. Some commonality is usually a given, though: picking up on a Shakespeare quote or acknowledging an allusion to the length of Gravity’s Rainbow helps smooth a tête-à-tête. All media landscapes differ, but being “widely read” in videogames has had shifting connotations over the fifty-ish-year lifetime of the medium.

From about 1985 until 1995, there were 678 videogames licensed and released for the NES in North America. The Atari 2600 had 470 official releases in the years it was officially on the market. A quick scan of the Nintendo Switch digital storefront in September of 2023 brings up a list of nearly 5000; a similar check in Apple’s App Store shows about half a million games. Steam, the de facto PC game repository, mirrors the Switch with around 5000, while the more curated retro-experience gog claims slightly more than the lifetime of the NES, at 684 vintage games retrieved from oblivion (gog at its inception stood for “good old games” but has since ventured into modern releases).

glad they still have the twitter bird, even in sept 2023

These numbers may not seem significant at first blush, but they illustrate how at one point during my lifetime it was conceivable—if videogames were your passion or your main hobby—to play nearly every game that was released. That’s not to say it would have been easy nor, at the time, desirable: how many kids wanted to spend a weekend playing Lunar Pool or digital pinball? Most of us who are now in our late 30s and early 40s—the NES generation—didn’t have that kind of access, anyway; videogames were marketed as toys, were the purview of children—children without finances, or income, or access to commerce—and there was no record available of everything that had come out beyond the newest Toys R Us catalogue. The much more likely outcome, at least in my personal experience, was that of the possible 700 NES games in all of existence, you might find available for purchase or rent only a smattering. You could clear out a whole rental section or borrow a game from a friend (who was typically limited to knowledge of and access to the same pool of options as you) and, content with seemingly full knowledge of the videogames available to you, think, “Yes, I did play everything I could.” The videogames you personally knew were the videogames that existed, give or take a few chance encounters with an also-rare game-review magazine.

As chronological gaming series prove, playing all NES games was doable. Certainly, videogames nearer to their inception were more of a monoculture, akin to broadcast TV, than the splintered microinterests offered by cable and heightened in the streaming era: the relative dearth of games; their overall brevity (how many dozens of games could you fit in the time it takes to play Baldur’s Gate 3?); and a slower release pace (as well as often misleading boxart and marketing materials) funneled a lot of videogame people into playing “popular” games regardless of taste. Genre preferences still hadn’t quite solidified: if you think every 3D action-adventure in the 2020s is loaded down with RPG character-progression elements, try watching through Chrontendo, where “RPG character progression” was tacked onto every other 2D side-scrolling adventure game that made the jump from arcade to NES. Everything is a Rygar-like. While it is hard to know how many people played something like Rygar in the 1980s and 90s, there are a few ways to guess: the “attach rate” is the total number of videogames sold for a system divided by the number of console systems sold. It is a quick and dirty way to average things out, but it doesn’t give specifics or account for the rental market—if you knew the global NES attach rate was 8, then the concept of people playing 700 NES games would seem unfathomable. But attach rate is an average—outliers abound—and so we must look to the cultural footprint as well as the paper trail.
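
To make that back-of-the-envelope arithmetic concrete, here is a rough sketch (the 62-million figure for lifetime NES console sales is a commonly cited outside estimate, not a number from this piece):

```latex
% Attach rate: the average number of games sold per console sold.
\text{attach rate} = \frac{\text{total games sold}}{\text{total consoles sold}}

% Assuming ~62 million NES consoles sold and a global attach rate of 8:
\text{total games sold} \approx 8 \times 62{,}000{,}000 \approx 5 \times 10^{8}
% Half a billion cartridges moved, yet the average owner bought only
% 8 of the ~700 licensed titles -- hence the need to weigh rentals,
% borrowing, and cultural footprint alongside the sales paper trail.
```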

It is from this arid landscape that the poisoned flower of “keeping current” with videogames has blossomed: now, in a climate of superabundance, when a new massive game is released every week, the expectation or desire to try everything has turned a pleasurable hobby into a checklist-filled grindset. The sheer number of videogames, their availability, and the inflated time commitment to play them have all swelled. The ability to play everything—even everything within a favored genre—has been obliterated. Additionally, the expectation that you can and should “beat” a game or else relegate it to a swollen backlog—tagged and sorted and posted online—has expanded in tandem with the rising sense that there might be a “cost-per-hours-of-fun” equation driving consumer behavior. Most videogame platforms drip-feed dopamine to players with system-wide literal checklists of achievements and trophies that trigger bells and chimes after performing certain in-game actions, a stimulus/response drawn straight from Pavlov. That little ding can tether certain types of people to a particular videogame long after the enjoyment has withered. Finally, one must remember that over the past fifty years, thousands of games have been released. A 30-year-old today was born the year Final Fantasy 6 was released. There is simply not enough time to play everything. And this is only considering large-scale, commercial videogame releases: itch.io, the indie game hub, clocks in at over 800,000 options currently.
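
To spell out that consumer calculus (a hypothetical illustration of the heuristic named above, not a formula drawn from any platform or from the book):

```latex
% The "cost-per-hours-of-fun" heuristic: price amortized over playtime.
\text{cost per hour} = \frac{\text{purchase price}}{\text{hours played}}

% Hypothetical numbers: a $70 release played for 100 hours works out to
% $0.70/hour, while a brilliant 6-hour game at the same price "costs"
% $11.67/hour -- a framing that rewards bloat and punishes brevity.
```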

Do people have time nowadays to play FF6 enough to keep Cid alive and stop Celes’ suicide attempt? I know I did!

Being able to point to a specific year or hardware generation as the moment the world was flooded with more videogames than someone could play in a lifetime matters less than the fact that it has happened. This embarrassment of riches is a cultural shift, throwing everyone into a library to rival Borges’s. What videogames one played used to be left to random draw, mediated by rental stores and holiday gifts. Now, each individual has near-perfect control in deciding which videogames will shape their understanding of the medium—the problem is one of filtering, a glut of noise that has infected most of our cultural forms and must be parsed to find worthwhile experiences. For videogames, with their ascendant cultural capital and large required time investment, figuring out what to play is crucial. That is why videogames need a canon for players, a common and easily accessible point of entry that gives basic literacy upon which to build. Being “widely read” with respect to videogames is no longer a reasonable option.

What this canon should be is, in my opinion, the most pressing concern in modern media, second only to the question of who should be allowed to decide it. Abdicating responsibility will leave things as they are now, with platform and rights holders wringing out as much individual value as possible with no respect for the good of the artform (a true tragedy of the commons, as some might say). Another option is to leave the foundation of a cultural introduction to the history of videogames to the hobbyist chronologists and weekend historians, the youtube deep-divers and r/nostalgia, to decree and declaim to the rest of the population which games matter. Physical access matters, as well—does videogame history only belong to the MiSTer builders or the Anbernic havers, those who live within driving distance of The Strong Museum of Play or have spent thousands of dollars amassing a personal collection? The suddenly infamous 87%—the finding that only about 1 in 10 classic games (from a sample of about 4000 split over 4 specific ecosystems: games from before 1985; games from the C64; games from the Game Boy; and games from the PlayStation 2) remain available for purchase—proves there is a real barrier to even approaching the history of videogame culture.

While I don’t want to dilute the stand-alone importance of videogames as cultural artifacts, it might assist the mind to compare them momentarily to another medium: imagine knowing that there are about 37 Shakespearean plays and that people used to be able to go to a theater and see any one they wanted. Over the course of about 30 years, the options steadily decreased until Hamlet, Romeo & Juliet, Macbeth, and Othello were the only ones commercially available. Perhaps your parents or an uncle muttered constantly about owing someone “their pound of flesh,” or you had a friend who was really into 10 Things I Hate About You and so spent a lot of time on reddit reading pixelated scans of tattered The Taming of the Shrew scripts. Is culture worse off for not having context for The Merchant of Venice? Would most people be content with 10 Things and not care that it had roots in Shakespeare (do Shrew fans care that it seems to have its roots in I Suppositi?)? If the only way to know Shakespeare wrote other plays was by watching the Chron-Shakespeare youtube channel, does that lessen the impact plays can have on broader culture? I would say, “Yes. Yes it does.”

I saw this one in person

Would love to hear the marketing meeting about why they used R&J quotes. “WELL at least it’s SHAKESPEARE,” I suppose.

So I ask again: Who will be in charge of access to foundational, canonical games? Who gets to decide which memories of videogames are allowed to shape our culture? Let’s look at what leaving it to the “marketplace” has done: by virtue of allowing platform holders to define what videogame history is, you can play Mario 3. And Jumping Flash! came back as early as the PS3 era. But good luck finding the NES Wolverine outside eBay or emulation (please don’t play Wolverine, the NES game that taught me as a child that not all games were good). As the aging minority of videogame hobbyists who lived through an era where they could and did play a wide swath of videogames is replaced by a younger generation with an unlimited amount of uncurated content to pull from, the creation of an accessible, historical videogame canon is left up to…no one. The platform holders chase profits and the whims of the marketplace through artificial scarcity and draconian digital rights management while also dealing with real physical constraints like server load (remember: there is no cloud, only someone else’s computer) and the cost of bandwidth. For your Switch, you can rent around 100 NES games for $20 a year. If you’re looking for something particular—say, Nuts & Milk, because you want to compare the first third-party game (one not made by the platform owner, i.e. not developed and released by Nintendo) released on an 8-bit Nintendo system to something made by a team with intimate access to the hardware—well, emulation exists.

The question of who controls the history of videogames is so crucial that its shadow touches all research into the field. The Videogame Industry Does Not Exist: Why We Should Think Beyond Commercial Game Production, a new book from The MIT Press Platform Studies series, is indirectly concerned with the videogame canon. Canonicity becomes another avenue from which to answer the main question of the book—who or what is considered a videogame maker? As an academic foundation upon which to rebuild and then circumscribe what one thinks it means to create videogames, Does Not Exist helps explain why videogames are culturally relevant:

Perhaps the most significant contribution of Bourdieu’s body of work is a more sociologically robust articulation of how the dominant classes reproduce their own dominance not simply through the concentration of economic wealth but through the ability to define broader social and cultural practices and tastes in such a way that they also grow their own concentration of cultural and social wealth, while suppressing such wealth in the dominated classes.

Does Not Exist is a fascinating text that shines an oblique light—through class examination (who gets to call themselves a game developer?) and industrial extraction (when is “independent” simply a stylistic label, like “action-adventure”?)—on the wider structural history that surrounds the culture of videogames. Even if one doesn’t particularly care about the plight of fringe indie developers as their counter-cultural messaging is co-opted and folded back into the machine of neoliberal commodification to add legitimacy to a be-suited monolith of international conglomerated businessmen, it is important to recognize that that standard cultural spiral applies to the videogame scene just as it does to books, music, poetry—you know, art. By virtue of following the same extractive track as all other cultural forms of expression, the argument of whether videogames are art—more like functional software, or more like expressive creation?—seems to have been definitively answered. Certainly, videogames have functional elements embedded within them, but one could also say the same about the paint used for Starry Night; it might well have been applied to weatherproof a fencepost.

Over and over, the book makes its main argument clearly, and well:

One of the key takeaways from this book, I hope, will be that alternative and noncommercial modes of videogame production are not the fringe of the videogame field but its foundation.

In doing so, it must prove any number of ancillary points, intentionally or not, which it also does with aplomb. A reader with any interest in videogames will get so much more from Does Not Exist than can be expected: even someone who never—or, in my case, briefly and without much fanfare—touched the development side of videogames will find that their interest in playing games is still a crucial element in the continued propagation of the “global games industry” hegemonic myth. For example, in considering the import of my plea for a videogame canon, one must consider whether to position it toward production or consumption:

Film [in the academic recruitment poster] was depicted as first and foremost a creative craft that required honing particular skills with particular tools; videogames were depicted as a consumer product that the student consumed. Neither the poster nor the prospective students had a comprehension of videogames as crafted works requiring, as films require, specific tools, tastes, and design methodologies. The poster for the film degree was targeting prospective students interested in making films. The poster for the game degree was targeting prospective students interested in playing videogames.

The “most taught” list for Film Studies is topped by 1929’s Man with a Movie Camera (which I haven’t heard of, let alone seen), while the literary canon, though constantly in flux, is directed not at aspiring novelists but at the general citizenry. Videogames seem to have a game design canon, dependent on genre: Mario level 1-1 and its use of negative space and positioning is constantly heralded as the best onboarding for the burgeoning population of non-fluent proto-gamers in the mid-eighties. I remember some swear-filled, rant-like praise (as was the style at the time) for Mega Man X’s intro level at the dawn of the youtube docu-pinion era. GDC, the Game Developers Conference, puts most of its talks online. Much of this is level design for moving through a 2D plane; how do videogames account for the variability between a player who has been controlling a 3D camera since 1998 and someone who just decided to pick up Final Fantasy 16 because it looked pretty? Is there any expectation of videogame fluency, and if not, should there be?

What do we lose when every game needs to position itself and its tutorial like it might be someone’s very first experience with a controller? On the other side, what do we lose when there are wonderful mechanical control experiences buried in subpar games—are those lost forever, never to be repeated? Videogames have a preternatural lust for novelty. Some things can’t be followed up for technical or legal reasons: Namco had a (questionable) patent on loading-screen minigames, but by the time it expired the technical need for loading screens had substantially decreased. For many other concepts—intricate checkpointing; obscure yet incredibly deep lore running in the background; RPG character progression (hi again, Rygar)—videogames might run into, “Well, why would I make a game like Dark Souls when players could just play Dark Souls?” On the other hand, give someone raised on Death’s Gambit and Hollow Knight a foundational experience like Castlevania: Symphony of the Night and—if they don’t have any interest in or understanding of the cultural field of videogames—it might seem lackluster rather than the progenitor of a new type of gameplay experience.

I still think it’s fun that a Castlevania is the first Metroidvania.

Soulvania?

A beautiful moment on TikTok shines light on the abyss into which games must constantly gaze: the creator was reading Treasure Island—a foundational text in the literary canon—and reminded the viewer that Long John Silver and the book’s protagonist are basically buddies at the start of the book. Long John is positioned as the ship’s cook, and he’s a friendly enough sort. Granted, he has a peg-leg, a tricorn hat, a parrot on his shoulder, a cutlass, and a big pirate coat. Maybe an eyepatch, I don’t recall. And the reason I don’t recall is because all of the standardized iconography that screams “PIRATE” to me right now is drawn from Long John Silver. He was functionally the first pirate. Before Treasure Island, peg-legs didn’t imply anything. Parrots on the shoulder, big hat, big coat? Could mean anything. When the story of Treasure Island was new, it was a startling revelation that this salty seadog was the bad dude. The image didn’t give away the twist. He is the blueprint for pirates, so much so that modern audiences see the cover and know what’s up. We read the name “Long John Silver” and are instantly aware, “Hey, yeah, that’s a pirate.” Not so in the 1880s. Imagine, try to imagine, reading Treasure Island and not knowing Long John Silver was a pirate. Sincerely. Try reading Lord of the Rings and being surprised the first time Gandalf wasn’t just an old man with a walking stick.

A pirate’s pirate.

Same vibes

This is the level of differentiation that games must overcome daily. I look at the new Super Mario Bros. Wonder and I am not confused about what genre it will be or how it will play—I have the weight of cultural history to see a peg-leg and think “pirate.” Sometimes that’s a positive—I like books about pirates!—and sometimes it’s not—Treasure Island is kind of boring to read when you know the twist so deeply from the very beginning that you can’t see there was even supposed to be a twist until someone points it out. Are you creating the trope of your genre so well that other, later games can crib it so casually that someone who hasn’t played your game will think it unoriginal? This is what it means to be foundational. “Run” and “jump” in Mario are what Long John Silver’s “pirate” is to earlier forms of the buccaneer—sure, prior games existed with those mechanics, but Mario codified what it meant to run and jump in a 2D space.

New creative fringes are what push all artful mediums forward—as they continue to be folded back into the machine, the fresh frontiers become the stodgy centers, and the cycle begins again. Does Not Exist offers compelling first-hand research and contemporary interviews proving that this standard lifecycle of culture exists within videogame creation as much as in something like modern art. It also shows the damage caused when the world believes that faceless corporate machines create beloved artifacts:

If I were to note that most musicians or painters or actors do not make enough money to live from their creative work, no reader would blink twice. Yet, in the cultural field of videogame production, the decades of aggressive formalization mean that the largest corporations and most successful (and lucky) indie millionaires cast long shadows that obscure the broader field of cultural activity where economic capital circulates in much smaller and unreliable quantities. The notion of “successfully” making videogames without it being your full-time job is a difficult thing to imagine.

Given the response to the 2023 Hollywood WGA and SAG-AFTRA strikes, many readers would, apparently, blink twice at the idea that most actors cannot make a living in their creative field alone. Regardless, dozens of interviews with game creators and a perspective ensconced within formalized videogame academia make Does Not Exist rigorous and insightful, with clear conclusions drawn from details that feel safe to trust. There is a clear understanding of modern economic realities, with two of my favorite quotes encompassing more than videogame creation:

Where once one undertook a hobby to be productive beyond their formal employment, in the precarious hustle of today’s portfolio careers, one more often than not strives to turn what would otherwise be a hobby into a form of employment in lieu of any other alternatives.

A position such as this, coupled with the factual presentation and intense original fieldwork, lends credence to the author’s argument—make no mistake, Does Not Exist is trying to convince you of something—that independent small-scale development is the foundation and the vanguard of videogames. The book keeps its focus not on the artifacts but on the people making them:

The language of entrepreneurism reframes structural precarity as self-chosen adventure, obscuring both the conditions and motivations of gamework.

Why people make games and why people play games have been untethered for a long time—understanding the field helps one realize the artificiality of the schism and that it was drawn to extract surplus value from hobbyists across the world (think mods, user-generated content, successful crowdfunding signaling to publishers that a game might be lucrative). Acknowledging the motivations of gamemakers is a required aspect of creating a true and navigable canon for gameplayers. This cycle will remain recursive—as the field widens and inspiration is drawn from more disparate sources, fewer videogame players will have had the gameplay experiences that allow them to engage with modern game creators—unless a clear foundation of videogames is established to allow a wider audience to engage with the medium with fewer hurdles. When there were 700 NES games, a game developer might synthesize something new built upon experiences learned from other videogames that almost anyone else could have played—with options like 800,000 itch.io games out there, no single developer will ever have played more than a fraction, and non-dedicated hobbyists even fewer. How, then, to pull inspiration from infinity? Even more, how to apply that experience without creating an expert’s dilemma, where the only players who can parse a complex game are ones who have decades of experience decoding the nuances of this particularized medium?

Too spooky

Other creative fields deal with this by splintering into genres—mystery writers read widely among mystery novels, but no fan of mysteries is expected to read every one ever written. Videogames, though, have a bifurcated nature: genre can differ in both gameplay and narrative—does one like all 3D Souls-likes, or does the eldritch-horror setting of Bloodborne not work for the high-fantasy lover in you? To create and define a core canon of videogames—“classics” in the sense of The House of Mirth or The Odyssey, not bestsellers—is a daunting task. To even recognize it is important—to know you aren’t supposed to know Long John Silver is a pirate—requires reading books like Does Not Exist.

If you have any interest in understanding what makes videogames culturally relevant beyond “they make loads of fat cash,” a more complete understanding of the conditions under which they are commonly made is required. I felt a flash of recognition as early as page two, when I noted that I had opened my review of The Lure of Pokémon with a paragraph eerily similar to the introduction of Does Not Exist:

It’s become common practice, to the point of cliché, to begin any piece of writing about videogame production by proclaiming that “the videogame industry” generates over a hundred billion dollars every year—perhaps more than any other entertainment sector, depending on what revenue sources you include or omit. This total generated revenue has become a reactionary shorthand in journalism articles, policy documents, academic research, and trade association reports for the economic and cultural significance of videogames. Once relegated to the sidelines of cultural relevancy at best, actively excluded at worst, champions of the videogame medium now hold up the revenue generated by the top videogame firms as proof that videogames do, in fact, matter.

Compare with mine:

I have read a lot of mid-aughts game design books and a common theme is the proviso in the opening pages about how, “Video games have overtaken film as the highest-earning entertainment media landscape.” “In a capitalist society—where money is the only points system—videogames now have the high score.” “See, mom, we’re not wasting our time!” It’s a little bit sad, a little bit defensive, and as a posture it has largely fallen away from cultural videogame criticism. The need to make sure videogames are taken “seriously” based on economic bookkeeping isn’t how Lure opens, so even as one of the older videogame theory books I’ve read, it feels remarkably modern.

I heard the co-director of the Videogame History Foundation once say that anyone even remotely connected to videogames has interviewed Ralph Baer, creator of the Magnavox Odyssey. This is true (this is me with the “Father of Videogames” 10 years ago).

Does Not Exist is modern and on point. It answers questions that would have been challenging to even know existed without first reading the book. Videogame makers and videogame players are in dire need of a sea change in how to approach the infinitely wide and ever-expanding options of new and historic videogames: art museums offer curated shows based around themes or particular artists; music has albums as well as playlists and festivals to expand breadth while maintaining similarly genred aesthetics. Videogames offer marketplace-based algorithmic recommendations and semianonymous message boards to uncover underrepresented titles. Having a way to bring neophytes into videogames, or enthusiasts deeper into the scene, without unnecessary personal labor or covetous gatekeeping offers benefits to the field as a whole.

Understanding why and how people make videogames is a crucial step to understanding why they matter in the first place, and why their history and cultural relevancy need to be addressed now. The world still has people in it who were around at the dawn of this medium; it is past time to build a canon that lets everyone else in on the game.