The first time a video game made me cry, I was sixteen years old, sitting cross-legged on my bedroom floor, surrounded by empty Mountain Dew cans and Doritos bags. It was 1997, and I’d just witnessed Aerith’s death scene in Final Fantasy VII. I remember sitting there in shock, controller limp in my hands, as that haunting music played and the materia bounced down the steps of the Forgotten City. I hadn’t expected to feel actual grief for a collection of polygons that vaguely resembled a human being. Games weren’t supposed to do that. They were supposed to be fun diversions—tests of reflexes and problem-solving, not emotional journeys that could punch you in the gut and leave you reeling.
That moment represented something profound about the narrative evolution the 90s brought to the medium. Just a few years earlier, most game stories were little more than flimsy pretexts for gameplay. The princess is in another castle. Aliens are invading. Bad guys stole the thing—go get it back. They existed primarily in instruction manuals or brief text crawls, context rather than content. But something remarkable happened during that decade: storytelling moved from the periphery to the core of the gaming experience, transforming how we played and why we played.
I’d grown up on 8-bit NES games where narrative was often an afterthought. Take The Legend of Zelda, which I poured hundreds of hours into as a kid. Its story consisted of a few cryptic lines from old men in caves and a brief text introduction. The manual contained more plot than the game itself. We accepted this limitation because the technology simply couldn’t support more. Or look at Contra—aliens are bad, shoot everything, the end. Nobody complained about the lack of character development for Bill and Lance because we weren’t playing for the story; we were playing for the challenge and the fun.
Early 90s games began pushing those boundaries, but were still heavily constrained by storage limitations. I remember being blown away by the storytelling in games like Secret of Mana and Final Fantasy II (IV in Japan) on the SNES, which managed to create compelling narratives despite minimal text and no voice acting. These games suggested that stories could be more than just setup; they could provide emotional context for gameplay and make victories feel more meaningful.
Then came the CD-ROM revolution. Suddenly, games weren’t limited to a few kilobytes of text. A single CD could hold hundreds of times more data than a cartridge, opening possibilities that previously seemed like science fiction. Full motion video (FMV), which had been technically possible but severely constrained, now became a central storytelling tool. Games like Wing Commander III and The 7th Guest used actual video footage of actors to tell stories that more closely resembled movies than traditional games.
I still remember the first time I played Phantasmagoria at my friend Tom’s house. His parents had bought an absurdly expensive multimedia PC, and we spent an entire weekend immersed in that horror story told through live-action video. Was it cheesy by modern standards? Absolutely. The acting was community theater quality at best, and the integration between gameplay and video was clunky. But it felt like witnessing the future—games that could tell stories with the production values and emotional impact of film while still giving players agency within that narrative.
The improvement in voice acting over the course of the decade was equally transformative. Early CD games often featured performances that ranged from awkward to genuinely painful—Resident Evil’s infamous “Jill sandwich” line comes to mind. But by the late 90s, developers were hiring professional actors and directors, treating game dialogue with the same seriousness as film or television production. The original Metal Gear Solid stands as perhaps the best example of this evolution, with David Hayter’s gravelly Snake and the rest of that excellent cast bringing Kojima’s cinematic vision to life.
I vividly remember playing Metal Gear Solid for the first time, alone in my bedroom with the lights off, completely absorbed in not just the stealth gameplay but the twisting conspiracy narrative that unfolded between those gameplay segments. Cutscene design reached a new peak with MGS, which used the in-game engine to create cinematic sequences that didn’t break the immersion by switching to pre-rendered videos. The codec conversations, the fourth-wall-breaking Psycho Mantis fight, the torture sequence—these weren’t just clever gameplay moments but storytelling innovations that could only exist in an interactive medium.
The emotional storytelling techniques that JRPGs developed during this period had a particularly strong impact on me. Final Fantasy VI (released as III in North America) was the first game that showed me how digital characters could have genuine narrative arcs and psychological depth. The opera scene, Cyan watching his family depart on the phantom train, Celes’ suicide attempt on the solitary island—these moments weren’t just sad because something bad happened, but because we had come to understand these characters through both explicit storytelling and gameplay mechanics that reinforced their personalities.
When Final Fantasy VII arrived, it built on those foundations with its flawed, complex protagonist Cloud Strife—a character who begins as the stereotypical stoic mercenary but is gradually revealed to be something far more complicated and broken. The character writing in FFVII showed that games could tackle themes of identity, false memory, environmental disaster, and corporate exploitation with surprising nuance. Yes, you were still running around bashing monsters with an oversized sword, but you were doing it within a narrative context that gave those actions meaning beyond simply earning XP and gil.
The silent protagonist debate raged throughout this era, with different developers taking opposing approaches to player immersion. Half-Life’s Gordon Freeman never uttered a word, allowing players to project themselves into his HEV suit without the dissonance of hearing a voice that wasn’t their own. Meanwhile, games like Legacy of Kain: Soul Reaver featured protagonists with distinct personalities and extensive dialogue, asking players to inhabit a specific character rather than insert themselves into the story. Both approaches proved viable, showing that storytelling in games wasn’t converging on a single “correct” method but exploring diverse paths to player engagement.
Adventure games, which had been pioneering game storytelling since the text adventure days, reached their narrative pinnacle in this era. LucasArts titles like Grim Fandango and Full Throttle weren’t just showcases for clever puzzles but fully realized worlds with distinctive characters, themes, and visual styles. Tim Schafer’s writing in particular demonstrated that games could be genuinely funny, poignant, and thematically rich simultaneously. Grim Fandango’s noir-inspired journey through the Land of the Dead showed that game stories could draw from literary and film traditions while creating something uniquely suited to interactive storytelling.
I remember playing through Grim Fandango during my freshman year of college, alt-tabbing between the game and my term papers, completely captivated by Manny Calavera’s journey. Here was a game dealing with afterlife bureaucracy, corrupt soul-trafficking, and redemption—hardly the typical video game subject matter—with humor, style, and genuine emotional resonance. I found myself thinking about its story and characters long after I’d solved its final puzzle, the way one might reflect on a good novel or film.
The advantages of interactive over passive narrative became increasingly apparent as 90s games explored storytelling possibilities unique to the medium. While some games attempted to simply emulate film techniques, the most innovative recognized that player agency could enhance rather than detract from narrative impact. Planescape: Torment asked philosophical questions about identity and mortality but allowed players to shape the answers through their choices. Fallout provided a bleak post-apocalyptic setting but let players determine what kind of person they would be in that moral wasteland.
Some of my most profound gaming moments came from this intersection of narrative and choice. In Fallout, I initially played as a typical hero, helping settlements and fighting raiders. On a second playthrough, influenced by a particularly cynical period in my life, I created a selfish character who looked out only for himself. The game accommodated both approaches while subtly commenting on them through NPC reactions and ending slides. This wasn’t just a choose-your-own-adventure book with better graphics; it was a narrative system responding dynamically to player expression.
This narrative evolution wasn’t limited to RPGs and adventure games. Action titles like Legacy of Kain: Soul Reaver used impressive voice acting and cutscene direction to tell gothic tales of betrayal and revenge. Racing games like Need for Speed: High Stakes introduced narrative frameworks to give context to competitions. Even fighting games like Soul Calibur created elaborate character backstories and interrelationships to make each battle feel like part of a larger story rather than an isolated contest.
Horror games perhaps benefited most dramatically from the increased storytelling focus. Resident Evil’s B-movie dialogue has been much mocked, but its atmospheric environmental storytelling—telling the tale of the Spencer Mansion’s fall through notes, visual clues, and level design—created a sense of place and history that made the scares more effective. Silent Hill went further, using psychological horror and symbolism to create a narrative experience that was unsettling on a deeper level than simple jump scares could achieve.
The first time I played through Silent Hill remains one of my most vivid gaming memories. I was nineteen, playing in my darkened dorm room with a roommate who didn’t game but became completely invested in the unfolding story. We’d take turns theorizing about the town’s mysteries, the nature of the fog world and otherworld, and the significance of certain symbols. The game became a collaborative interpretive experience more akin to analyzing a David Lynch film than traditional gameplay. That it could support this depth of analysis while still functioning as an effective horror game demonstrated how far video game storytelling had evolved.
Not every storytelling experiment during this era succeeded. The CD-ROM boom led to some truly painful examples of FMV games where the interactive elements felt completely disconnected from the narrative. Many games struggled to balance player agency with storytelling needs, resulting in ludonarrative dissonance—moments where gameplay and story seemed at odds with each other. And some games simply collapsed under the weight of their narrative ambitions, becoming interactive movies with occasional button prompts rather than fully realized games.
But even the failures were instructive, part of a medium discovering its unique storytelling capabilities through experimentation. The 90s represented gaming’s awkward adolescence—sometimes clumsy, occasionally brilliant, constantly trying on new identities and approaches to see what fit. This experimentation laid the groundwork for everything that followed, from the moral choices of Mass Effect to the environmental storytelling of Dark Souls to the narrative innovation of games like Disco Elysium.
For players like me who grew up during this transformation, the impact was profound. Games changed from being primarily skill tests or puzzle challenges to being vehicles for emotional experiences and thought-provoking narratives. They became capable of moments like Aerith’s death that could genuinely move us, or philosophical questions like those posed in Planescape: Torment that might linger in our minds for years afterward.
I still remember staying up until 3 AM to finish Xenogears, completely absorbed in its complex tale of religion, psychology, and memory despite the game’s notorious second-disc collapse into narrated sequences due to budget constraints. Or being genuinely unsettled by System Shock 2’s audio logs documenting the gradual transformation of the Von Braun’s crew. Or feeling a complex mix of triumph and melancholy at Final Fantasy Tactics’ morally ambiguous conclusion.
These weren’t just fun gaming memories but formative narrative experiences that shaped my understanding of storytelling itself. They demonstrated possibilities unique to interactive media—the way player choice could create investment in outcomes, how mechanical systems could reinforce thematic elements, how the pacing of discovery could be controlled yet still feel organic.
The 90s taught us that video games didn’t need to choose between being good games and telling good stories—the best examples found ways to make gameplay and narrative complement and enhance each other. They showed that interactivity wasn’t an obstacle to storytelling but an opportunity to explore new narrative approaches impossible in passive media. And perhaps most importantly, they established that games could provoke the full spectrum of emotional and intellectual responses that we expect from mature art forms.
That awkward teenager sitting on his bedroom floor, processing unexpected grief for a digital character made of crude polygons, was experiencing something unique to this evolving medium—a story that couldn’t have been told the same way in a book or film because it relied on dozens of hours of interactive context. The princess wasn’t just in another castle anymore; she was a fully realized character whose fate mattered because we had actively participated in her journey rather than merely observing it.
The storytelling evolution of 90s games transformed not just how games were made but why many of us played them. Games became not just tests of skill or puzzle-solving exercises but vehicles for exploring ideas, experiencing emotions, and considering perspectives outside our own. For all their technical limitations and occasional narrative clumsiness, these games expanded our understanding of what the medium could achieve and paved the way for the rich narrative experiences we take for granted today.
And sometimes, when I hear a few notes from Aerith’s theme or see a collection of blocky polygons that somehow still register immediately as Cloud or Lara Croft, I’m transported back to that formative decade when games discovered their voice—sometimes breaking and squeaking with adolescent awkwardness, but increasingly capable of telling stories that could move us, challenge us, and stay with us long after we put down the controller.