Well done, everyone. Between the festival of catharsis that was Wolfenstein 2: The New Colossus and Call Of Duty’s return to World War 2, we’ve officially put shooting virtual fascists back on the menu. Jolly good show.
Still hungry to grind some more into kibble? Here are a couple of free treats to keep you going until Wolfenstein 3. Granted, the goose-stepping villains in Doom mod Shadow of The Wool Ball and its sequel Rise Of The Wool Ball are cats, but they’re pretty villainous, and unless you’re happy with the Planet Of The Adorable Hedgehogs being ground up into kitty litter, you’re going to have to shoot a few.
Writers of videogame histories often think in terms of individuals and periods—great innovators and clear-cut ‘epochs’ in design, typically bookended by technological advances. Events or people who contradict those accounts have a tendency to get written out of the tale. According to one popular version of the medium’s evolution, the first-person shooter was formally established in 1992 with id Software’s Wolfenstein 3D, a lean, thuggish exploration of a texture-mapped Nazi citadel, and popularised in 1993 by heavy metal odyssey Doom, which sold a then-ludicrous million copies worldwide at release. The company’s later shooter, Quake, meanwhile, is often held up as the first ‘true’ 3D polygonal shooter.
id was founded in 1991 by former employees of software company Softdisk, and its contribution to what we now call the FPS is undoubtedly immense. Between them, Wolfenstein 3D and Doom brought a distinct tempo, savagery and bloodlust to first-person gaming, and programmer John Carmack’s engine technology would power many a landmark FPS in the decade following Doom’s release. But we shouldn’t view that contribution too narrowly, as simply one step along the road to a game such as Call of Duty: World War II. And nor should we neglect the games—before, during and after id’s breakthrough—that took many of the same concepts and techniques in different and equally valuable directions.
To think about the shooter’s origins is to think about labyrinths. Among the earliest pioneers of first-person videogaming is 1973’s Maze, a game cobbled together by high school students Greg Thompson, Steve Colley and Howard Palmer during a NASA work-study program, using Imlac PDS-1 and PDS-4 minicomputers. The three had been carrying out research into computational fluid dynamics for future spacecraft designs, an early show of what would become a problematic relationship between the commercial games business and the US military-industrial complex. Initially a single-plane, 16x32 tile wireframe environment for one player in which you’d turn by 90-degree increments, Maze grew to include shooting, support for a second player via serial cable, a corner-peeking functionality and indicators for which way the other player is facing.
After completing his spell at NASA, Thompson took the game with him to the Massachusetts Institute of Technology. With access to a more powerful mainframe, and the aid of David Lebling—who would go on to create the legendary text adventure Zork and found Infocom—he added eight-player support over the US defence department-run ARPANET, a map editor, projectile graphics, scoreboards, a spectator mode and ‘bots with dynamic difficulty’, all features that would resurface in mass-market shooters many years later. Maze War was so popular on campus—and consumed so much computing power—that the MIT authorities created a ‘daemon’ program to find and shut down sessions. In one of its later forms, the maze extended along the vertical axis and players could fly, shoot and take cover in any direction.
If Maze War sounds like a fully-featured FPS in hindsight, it’s important to note that the category ‘first-person shooter’ is of much more recent inception—according to a 2014 study by the academic Carl Therrien, it only entered popular discussion around videogames in the late ’90s. Many studios, including id, preferred terms and slogans like ‘3-D adventure’, ‘virtual reality’ and ‘the feeling of being there’ when describing games that are played from a first-person viewpoint. Nor was the perspective exclusively, or even predominantly, associated with on-foot gunplay. There were racing games, such as Atari’s arcade offering Night Driver, which treated the player to a dashboard view of a road made up of shifting white rectangles. There were cockpit simulators such as 1974’s Spasim (often granted dual honours with Maze War as the first-person shooter’s oldest ancestor), a 32-player space combat game in which unofficial approximations of Star Trek vessels waged war at a mighty one frame per second.
There were dungeon-crawlers such as Richard Garriott’s Akalabeth in 1979, which combined a top-down world map with first-person dungeon segments featuring coloured wireframe graphics. Maze War spawned a number of sequels and imitators, attractively billed as ‘rat’s-eye view’ experiences by a 1981 issue of Computer & Video Games magazine. The first-person shooter genre as we understand it today arose from the artistic friction between these approaches, shaping and being shaped by them in turn.
Naturally, methodologies shifted as new technology became available. Among Maze War’s more intriguing descendants is Paul Allen Edelstein’s WayOut, released for the Atari 8-bit in 1982. It made use of a rendering technique known as ray casting, whereby a 3D environment is generated from a 2D layout by sending out beams from the player avatar’s eyeball and drawing a pixel where they intersect with an object’s coordinates. Where light in reality bounces off many surfaces before entering the eye, ray casting simulates a ray’s collision with an object only once. While incapable of nuanced effects such as refraction, it was also much less resource intensive than other 3D projection techniques, which allowed for faster performance on the hardware of the day. If WayOut was a potent demonstration of ray casting’s utility, it is also worth remembering for its eccentric, non-combat premise. You play a clown trapped in a maze with a spinning, sinister ‘Cleptangle’ that will steal your map and compass on contact. A wind blows through the level, its direction indicated by floating fireflies. This interferes with movement, but also helps you get your bearings should you lose your map.
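The ray casting idea described above is simple enough to sketch in a few lines. What follows is a minimal illustration only—a naive fixed-step ray marcher over an invented 2D tile map, rather than the optimised grid-traversal algorithms shipped in games like WayOut or Wolfenstein 3D; the map, step size and field of view are all made up for the example:

```python
import math

# Walls are '#' cells in a 2D layout; everything is viewed from inside it.
MAP = [
    "#####",
    "#...#",
    "#.#.#",
    "#...#",
    "#####",
]

def cast_ray(px, py, angle, step=0.01, max_dist=20.0):
    """March along the ray from (px, py) until it enters a wall cell;
    return the distance travelled. One collision, no bounces."""
    dx, dy = math.cos(angle), math.sin(angle)
    dist = 0.0
    while dist < max_dist:
        x, y = px + dx * dist, py + dy * dist
        if MAP[int(y)][int(x)] == "#":
            return dist
        dist += step
    return max_dist

def render_distances(px, py, facing, fov=math.pi / 3, columns=10):
    """One ray per screen column, fanned across the field of view.
    Each distance would set the height of the wall slice drawn there."""
    return [
        cast_ray(px, py, facing - fov / 2 + fov * c / (columns - 1))
        for c in range(columns)
    ]
```

Because each screen column needs only one ray and one wall-hit test, the cost scales with screen width rather than scene complexity—which is why the technique ran acceptably on early-’80s home hardware.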
Cockpit simulations were especially popular during the ’80s, beginning with Atari and Ed Rotberg’s arcade game Battlezone, a tank sim featuring wireframe vector graphics that came with a novel ‘periscope’ viewfinder (the US Army would later try, and fail, to convert the game into a Bradley tank training simulation). In 1987, Incentive Software released Driller: Space Station Oblivion—the first game to run on its proprietary Freescape engine, which allowed for complex 3D environments dotted with simple geometric objects. The game assigned a sizeable chunk of the display to your offworld rover’s dashboard, a fat slab of buttons and indicators. In part, the prevalence of cockpit games reflected the influence of Star Wars, with its lavishly realised starfighter dashboard displays. But it also arose from attempts to make often-unwieldy simulation technology more convincing by representing players at the helm of a lumbering vehicle. Among id’s subsequent achievements was to narrow the gap between the player’s body and that of the avatar, thus helping to open a space in which ‘first-person’ denotes not merely a perspective but a narrative in which the player is protagonist.
id’s career as a first-person developer began with Hovertank 3D in 1991. A cockpit sim brought to life with ray casting and animated 2D sprites, it had players searching for civilians to rescue and tentacular UFOs to blow up. It was followed by Catacomb 3-D—id’s first crack at a first-person character-led action game, with a visible avatar hand and portrait. Catacomb also featured texture maps, flat images attached to surfaces to create the illusion of cracked stone walls and dripping moss. In this respect, id had been strongly influenced by Blue Sky Productions’ breathtaking Ultima Underworld: The Stygian Abyss, often cited as the first ‘immersive simulation’, which offered 3D, texture-mapped environments featuring sloped surfaces, rudimentary real-time physics and the ability to look up and down.
Wolfenstein 3D and Doom—both developed after John Carmack glimpsed Ultima in action at a 1990 expo—can be considered combative responses to Ultima’s representation of the possibilities of first-person 3D, eschewing the latter’s more complex geometry and gigantic array of variables in favour of pace and immediacy. Though busier with ornaments than Catacomb 3-D’s levels, Wolfenstein’s environments are designed to run at speed—designer John Romero once planned to let players carry and hide bodies, but dropped the idea to avoid bogging players down. Where Ultima set out to make players feel like part of its world via deep, consistent systems and a wealth of lore, Wolfenstein dealt in simpler, visceral effects—the sag of your avatar’s body when you take a step forward, the gore spraying from the pixelated torso of a slain Nazi. If the game pushed violence and politically charged imagery to the fore—somewhat to the distress of its publisher, Apogee—it also harkened back to the maze games of previous decades, with secret rooms to discover behind sliding partitions.
This emphasis on the avatar’s bodily presence would set the tone for many subsequent shooters—notably Call of Duty, with its blood spatter damage filter—as would id’s sense that player participation should take priority over narrative elements. When it came to Doom, there was disagreement between Carmack, Romero and id’s creative director Tom Hall over how much plot and backstory to weave into the game. Hall had planned something akin to Ultima, with large, naturalistic levels built around a hub area and a multitude of arcane props. “Story in a game is like story in a porn movie,” was John Carmack’s infamous rebuttal. “It’s expected to be there, but it’s not important.” Hall eventually resigned in 1993. In his absence, the team stripped out a number of more fanciful weapons, turned many plot items required for progression into generic keycards, and cleaned up certain environments to allow for speedier navigation.
Loaded with taboo imagery, ultra-moddable thanks to id’s decision to store game data such as level assets separately from engine data as ‘WAD’ files, and equipped with four-player multiplayer to boot, Doom was a phenomenal success. Such was its impact that before ‘FPS’ became an accepted term, many in the development community used ‘Doom clone’ as shorthand for any first-person game involving gunplay. No game can claim to define a genre for long, however, and id’s work would attract plenty of imitators and rivals in the years to come.
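That separation of game data from engine data is concrete enough to sketch. A WAD file opens with a 12-byte header (an ‘IWAD’ or ‘PWAD’ signature, a lump count, and the offset of a directory), followed by raw lump data and a directory of 16-byte entries giving each lump’s offset, size and 8-character name. The parser below follows that on-disk layout; `build_wad` is just a helper invented here so the example can test itself against an in-memory file:

```python
import struct

def parse_wad(data):
    """Read a WAD image into {lump_name: bytes}."""
    sig, num_lumps, dir_ofs = struct.unpack_from("<4sii", data, 0)
    assert sig in (b"IWAD", b"PWAD"), "not a WAD file"
    lumps = {}
    for i in range(num_lumps):
        # Each directory entry: int32 offset, int32 size, 8-byte padded name.
        ofs, size, raw_name = struct.unpack_from("<ii8s", data, dir_ofs + 16 * i)
        lumps[raw_name.rstrip(b"\0").decode("ascii")] = data[ofs:ofs + size]
    return sig.decode("ascii"), lumps

def build_wad(lumps, sig=b"PWAD"):
    """Assemble a WAD image from {name: bytes} — a test helper, not part
    of the format itself."""
    body, directory, ofs = b"", b"", 12
    for name, payload in lumps.items():
        directory += struct.pack("<ii8s", ofs, len(payload), name.encode("ascii"))
        body += payload
        ofs += len(payload)
    header = struct.pack("<4sii", sig, len(lumps), 12 + len(body))
    return header + body + directory
```

Because a ‘PWAD’ (patch WAD) simply overrides lumps by name at load time, modders could ship a few kilobytes of replacement levels or sprites without touching the engine or the original data—the mechanism behind Doom’s mod explosion.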
Four months before Doom’s arrival, a fledgling Chicago studio founded by Alex Seropian and Jason Jones released Pathways Into Darkness, a Wolfenstein homage with a pinch of Ultima-style item puzzling. It thrust players into the boots of a soldier fighting through a pyramid in order to nuke a sleeping god before it can bring about the apocalypse. One of the few Mac exclusives available at the time, Pathways was hailed for its colourful hand-drawn art and menacing atmosphere. It deserves mention today for the ability to commune with the ghosts of other explorers using special crystals and elusive keywords—an engaging, melancholy approach to textual backstory. The developer, Bungie, would build on this concept during work on two of the 21st century’s best-known FPS series, Halo and Destiny.
Before Halo and Destiny there was 1994’s Marathon, the series often billed as the Mac’s answer to Doom. A suspenseful sci-fi offering set aboard a hijacked colony ship, it was a more complex game than id’s offering—adding free look with the mouse and a range of terrain dynamics, such as low gravity and airless chambers. It was also a more convoluted work of fiction, which relied on players scouring its open-ended levels for narrative artefacts. In place of the souls of the slain, Marathon offered computer terminals through which you converse with various sentient AIs about the wider universe.
The game’s reach was limited by its choice of platform, but it attracted a dedicated community thanks to its elusive narrative backdrop and infectious eight-player, ten-map multiplayer. 1995’s Marathon 2: Durandal added co-operative play while 1996’s Marathon Infinity introduced a ‘Forge’ level editor, two features that would become central to the studio’s projects. Just as significant, however, was Bungie’s work in the emerging real-time tactics genre. Conceived by Jason Jones in a bid to stand apart from id Software, the top-down Myth games equipped Bungie with a feel for how different unit types and variables might react together. This would yield fruit in the shape of Halo’s famous combat sandboxes.
Its sheer brilliance aside, Doom’s pre-eminence during the ’90s owes much to id’s embrace of the modding community, with players able to create their own maps using the developer’s own editing tools (and thus, squeeze many hours of enjoyment out of the free shareware version). Fan concoctions ranged from Batman and Alien-themed conversions to trashy oddities like The Sky May Be, in which zombiemen moonwalk and the legendary BFG-9000 has a chance of conferring immortality on its target. Many up-and-coming designers cut their teeth on Doom mods, and other studios were eager to license it for commercial use. Among them was Raven Software, founded by Steve and Brian Raffel, which created the fantasy-themed shooters Shadowcaster, Hexen and Heretic using their own bespoke versions of John Carmack’s engine technology. The two companies were at one point based just down the road from each other, and formed an enduring bond—id would eventually hand Raven the keys to the Doom and Quake franchises.
Raven’s games were eclipsed, however, by the noxious excess of 3D Realms’ Duke Nukem 3D, a celebration of B-movie tropes that occasionally resembles a postmodern satire, and occasionally the aimless, chauvinist doodlings of a 13-year-old boy. Duke Nukem 3D is an intensely antisocial game, its levels grimy parodies of real-world locales, such as movie theatres and stripclubs, guarded by porcine coppers and strewn with the corpses of cinema idols like Indiana Jones and Luke Skywalker. While technically accomplished and formally inventive—it introduced jet packs, shrink rays, animated props such as arcade cabinets, physically impossible layouts and a protagonist who provides audible commentary throughout—the game is remembered today mostly for its jiggling softcore imagery. In years to come, shooter developers would spend as much time dispelling the notoriety Duke Nukem generated as they would profiting from his example.
Doom’s success also won the regard of franchise owners in other media. Maryland-based Bethesda—flush from the success of its eye-catchingly vast roleplaying effort, The Elder Scrolls: Arena—released a Terminator adaptation in 1995, endowed with lavish polygonal models. In hindsight, the game’s vast, cluttered wasteland feels almost like groundwork for the studio’s later first-person Fallout titles. In the same year, the venerable adventure game studio LucasArts shipped Dark Forces, the first Star Wars-themed FPS, inspired (and perhaps, annoyed) by the appearance of Death Star mods for Doom. LucasArts had designed a number of historical cockpit-based simulations during the late ’80s and early ’90s, but Dark Forces was a straight riff on id Software’s work. The developer’s impressive Jedi engine allowed for vertical looking, environments busy with ambient details such as ships landing on flight decks, a range of effects such as atmospheric haze, and the ability to stack chambers on top of one another.
By the mid-’90s, developers had begun to shift from so-called ‘pseudo-3D’ techniques such as ray casting to fully-polygonal worlds, capitalising on the spread of 3D hardware acceleration and the arrival of the first mass-market graphics processing units. Released for the Mega Drive’s 32X add-on in 1994, Sega’s lumbering Metal Head is often touted as the first ‘true’ 3D shooter. Pitching large, plausibly animated mechs against one another in texture-mapped urban environments, it was a handsome creation let down by repetitive missions. There was also Parallax Software’s Descent, released in the same year—an unlikely but gripping hybrid of flight sim and dungeon crawler with 360-degree movement. But the game now regarded as a byword for polygonal 3D blasting wasn’t, to begin with, a shooter at all.
John Romero had intended Quake to be a hybrid of Sega AM2’s arcade title Virtua Fighter and a Western roleplaying fantasy. Conceived back in 1991 and named for a Dungeons & Dragons character, the game would have alternated between first-person exploration and third-person side-on brawling. Romero envisioned circling dragons, a hammer massive enough to send shockwaves through the earth, and events that trigger when players look in their direction, such as glowing eyes appearing in a cave mouth. By the time John Carmack neared completion of an ambitious 3D engine in 1995, however, other id Software employees were exhausted and reluctant to depart too drastically from the Doom formula. There was also tension between the two founders over Romero’s supposedly inconsistent work ethic and Carmack’s view that the studio’s engine technology took precedence over its games. Romero ultimately resigned himself to a reimagining of Doom in polygonal 3D—and resigned from id Software itself after finishing the game.
As Big Robot’s Jim Rossignol has noted in a 2011 retrospective, something of this failure lingers in Quake as it stands. Though cut from the same coalface as Doom—it offered fast, brutal gunplay, levels made up of corridors and arenas, and a multitude of secret areas—the game’s aesthetic and fiction are curiously divided, at once crustily medieval and high tech. You can expect banks of computer monitors and teleporters, but also broadswords and monsters ripped from the pages of Lovecraft. In hindsight, it plays like a representation of the tipping point from avant-garde into profitable convention, the point at which the chimerical possibilities of 3D action solidified into the features expected of a modern first-person shooter.
In at least one respect, though, Quake was transformative—it introduced a thrilling element of verticality, with players dashing through the air above opponents rather than simply strafing or corner-camping. This quality proved an asset in the emerging field of online multiplayer: by the late ’90s, Ethernet connections and modems had become ubiquitous and internet usage was rocketing. Quake’s multiplayer was initially designed for high-bandwidth, low-latency local area networks—the client waited for the server’s verdict before showing players the result of an action, so over a typical modem connection every movement lagged by a full round trip and play felt jerky. id swiftly released an update, titled QuakeWorld, which added client-side prediction. The result can be held up as the original esports shooter—software company Intergraph sponsored a US-wide tournament, Red Annihilation, in May 1997, which attracted around 2,000 participants.
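Client-side prediction can be sketched as a toy model—this is an illustration of the general technique, not id’s actual netcode, and every name and number below is invented. The client applies each input to its local position immediately instead of waiting a round trip; when an authoritative server state arrives for an older input, it rewinds to that state and replays the still-unacknowledged inputs on top:

```python
def apply_input(pos, move):
    # A deliberately trivial movement rule for the sketch.
    return pos + move

class PredictingClient:
    def __init__(self):
        self.pos = 0.0
        self.pending = []  # (sequence, move) pairs not yet acked by the server

    def local_move(self, seq, move):
        """Apply the input locally at once — no waiting for the server."""
        self.pending.append((seq, move))
        self.pos = apply_input(self.pos, move)

    def on_server_state(self, acked_seq, server_pos):
        """Rewind to the authoritative position and replay unacked inputs."""
        self.pending = [(s, m) for s, m in self.pending if s > acked_seq]
        pos = server_pos
        for _, move in self.pending:
            pos = apply_input(pos, move)
        self.pos = pos
```

When the server agrees with the client, the replay lands exactly where the local simulation already was and the correction is invisible; when it disagrees (you were actually hit, or blocked), the position snaps to the corrected value—the characteristic ‘rubber-banding’ of laggy online shooters.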
As with Doom, Quake’s modding tools made it an attractive platform for amateur developers—its community gave the world Team Fortress, which would later flower into a standalone shooter, along with early specimens of machinima, including an epic known as The Seal of Nehahra. Its greatest descendant, however, would prove to be a shooter from a developer founded by Microsoft alumni Gabe Newell and Mike Harrington.
Created using a modified version of the Quake engine, Valve Software’s 1998 epic Half-Life remains extraordinary for how it reconciles the abstractions of game design with narrative tactics redolent of a novel (the game’s tale of secret government research and alien invasion was, in fact, written by a novelist, Marc Laidlaw). Its achievement versus earlier shooters can be summed up as the creation of temporal unity: almost everything is experienced in real time from the lead character’s perspective, with no arbitrary level breaks. In place of cutscenes, Valve weaves its tale through in-game dialogue and scripted events such as enemies smashing through doors—a tactic that both gives the player some control over the tempo and avoids jerking you out of the world. The game also sells the impression of a larger, unseen universe not via gobbets of textual backstory, but through the detail, responsiveness and consistency of its environment. The intro sees Gordon Freeman riding a monorail through Black Mesa, gleaning information about the location and his character from PA announcements and the sight of other employees at work. Following a disastrous experiment, you’re asked to backtrack through the same areas, now fallen into chaos.
Half-Life created a blueprint many FPS campaign developers would adopt in the new millennium. In particular, its seamless, naturalistic design would guide studios looking to explore realistic settings, such as the ‘World War’ periods. But it also introduced a note of unreality in the shape of Gordon Freeman’s murky reflection, the besuited G-Man—a personification of the game designer who sits a little outside Half-Life’s fiction. Together with the all-seeing, omnipresent AI manipulators of Marathon and the acclaimed cyberpunk RPG System Shock, the G-Man betrays a genre becoming increasingly aware of itself, and eager to turn its own structural constraints into a source of drama.
One of the greatest influences on first-person shooters at the turn of the millennium wasn’t a game, but a film: Steven Spielberg’s World War 2 epic, Saving Private Ryan. The movie’s thunderous portrayal of the D-Day landings would find echoes decades later in videogames like Killzone and Titanfall. Spielberg himself also has a robust association with game development: he co-founded DreamWorks Interactive with Microsoft in 1995 to work on adaptations of movies like Small Soldiers. Seeking a way to teach younger people about the war after wrapping up production on Saving Private Ryan, Spielberg asked DWI to develop a shooter, Medal of Honor, for Sony’s trendy new PlayStation platform.
Launched in 1999 to strong sales, the game was a watershed moment in several respects. On the one hand, its more earnest, grounded approach opened the genre up to players put off by the lurid sci-fi or pulp comic settings of games like Doom and Wolfenstein. On the other, it facilitated tense discussions about the right of videogame developers to depict such events, and the possibility that violent games spark violent behaviour. Medal of Honor released a few months after the Columbine massacre in Colorado, an atrocity that gave rise to a moral panic over videogame violence. Fearful of a backlash, DreamWorks Interactive removed all blood from the game before launch. It also attracted a heated reaction from the US Congressional Medal of Honor Society, and its president voiced his concerns to Spielberg in person. The game’s release, in spite of all this, created a precedent for other studios to comment openly on history and society.
The close of the ’90s also saw the release of the gorgeous Unreal, sparking a decade-long rivalry between creator Epic MegaGames and id Software. Conceived as a sort of ‘magic carpet’ experience where you fly through caverns dotted with robots, the game evolved into a bona fide Quake killer, running on a proprietary technology capable of 16-bit colour and ambient effects, such as volumetric fog. Like Quake, the game was designed to be modded easily and extensively. Also like Quake, its multiplayer left something to be desired at launch. Epic released a deathmatch-oriented standalone expansion, Unreal Tournament, in 1999, narrowly ahead of the arrival of id’s Quake III: Arena. A brace of colourful alternate fire options aside, it was notable for including both more competitive ‘hardcore’ and relatively playful ‘theme’ maps, such as levels floating in Earth’s orbit. The franchise found a dedicated following online, but the bedrock of Epic’s business would prove to be founder Tim Sweeney’s Unreal Engine, a highly modular entity designed for continual improvement. It would power games as diverse as Ion Storm’s legendary immersive sim Deus Ex and EA’s adaptations of the Harry Potter movies.
Where Quake and Unreal Tournament dealt in cartoon bazookas and evaporating torsos, another 1999 release, Counter-Strike, set its sights on military realism. A Half-Life mod created by attic developers Minh Le and Jess Cliffe, it saw teams of terrorists and counter-terrorists struggling to arm or defuse bombs and rescue or maintain custody of VIPs, customising their loadouts with currency earned at the end of each round. The mod wasn’t a landmark success to begin with, but Valve’s designers knew a killer formula when they smelled it and scooped up Le and Cliffe along with the intellectual property rights in 2000. Counter-Strike became an enduring phenomenon, buoyed up by thousands of user-created maps (including David Johnston’s legendary Middle Eastern levels Dust and Dust 2) and a community as resistant to fundamental rule changes as any diehard fan of football. Perhaps the definitive esport shooter, its objective-based modes and tactics-driven design are integral to the DNA of competitive multiplayer today.
2000 was also the year that Microsoft acquired Bungie, thereby depriving Apple’s Mac of one of its more coveted games, a science fiction odyssey called Halo. The game had begun life as an open world exploration affair, running on Bungie’s Myth engine, and something of that luxuriant scale remains in the completed Halo: Combat Evolved, which was an enormous hit when it launched on Microsoft’s first Xbox console in 2001. Halo’s environments were bright, rangy and colourful, where other shooters were claustrophobic and dingy, and they were lent an intense overarching unity by the silhouette of the Halo ringworld itself, stretching up through each skybox. Its crowded encounters were far more open-ended than in most competitors, woven around delightful AI variables like Grunt footsoldiers kamikaze-rushing the player after you kill their leader. Its weapons retained something of Quake and Unreal’s excess—overcharging an energy pistol to strip an opponent’s shield in one go would become a standard multiplayer tactic—but its blend of finite player health and recharging overshields imposed a more studied, back-and-forth rhythm on firefights. Halo also showed off Bungie’s knack for world-building: the fascination of its wider universe would help cement its status as Microsoft’s flagship series.
Halo would be eclipsed, however, by another World War 2 shooter, created using id Software’s Quake III engine by Infinity Ward—a studio founded by veterans of Medal of Honor: Allied Assault with startup money from Activision. Released in 2003, Call of Duty was among the first shooters to let players aim down a weapon’s sights—a gambit that created a sense of fearful claustrophobia, narrowing your attention to the gun roaring in your hands, even as the game’s sprawling levels and battalions of AI troopers courted comparison with Allied Assault. It was a little overshadowed by Medal of Honor on PC, but Call of Duty’s popularity caught the eye of Microsoft, who asked Activision to develop an Xbox 360 port of the sequel. With Halo 3 still a couple of years away, Call of Duty 2 was a bestseller at the console’s 2005 launch. Mindful of the risks of hanging an entire series on a single developer, Activision brought on Spider-Man studio Treyarch to design Call of Duty 3 using the second game’s engine, giving Infinity Ward an extra year at the coalface. It was the beginning of a yearly alternation that, together with the franchise’s all-year-round multiplayer appeal, would allow Call of Duty to bury competitors and exert an out-sized influence on the genre at large.
Among Infinity Ward’s more ferocious competitors was a multiplayer-centric WW2 game created by Swedish developer DICE. Battlefield 1942 saw up to 64 players tussling for capture points on enormous, open maps. Where Call of Duty’s own multiplayer came to prioritise pace and lone wolf virtuosity, Battlefield emphasised squad composition, the canny use of strategic resources such as vehicles, and above all, depth of simulation. The developer’s Refractor engine allowed for such crude feats of real-time physics as using TNT to launch a jeep across a bay onto an aircraft carrier’s deck. Though never quite a trendsetter in the increasingly lucrative console market, in large part due to its anaemic campaign options, Battlefield’s scale and freedom were a tonic for armchair generals weary of vanilla deathmatch.
Crytek’s Far Cry had a similar appeal. It began life as a glorified tech demo, the catchily titled X-Isle: Dinosaur Island, but flowered with Ubisoft’s backing into the first open world FPS in the current sense of the term. Where other shooters taught players to keep pushing forward, Far Cry allowed you to run amok in a vast tropical environment, using the undergrowth for cover while tracking unsuspecting soldiers through your binoculars. The series would go on to enjoy a symbiotic relationship with Ubisoft’s third-person Assassin’s Creed games, each experimenting with new ways to structure and diversify an open world.
If Far Cry was one of 2004’s highlights, it and every other game that year was utterly dwarfed by Valve’s Half-Life 2. While not as transformative in terms of storytelling craft as its predecessor, the new game’s post-alien invasion dystopia was a work of unprecedented delicacy. Where older shooters looked to B-movies for inspiration, Half-Life 2’s incompletely terraformed city compares to mid-20th century Communist eastern Europe (the game’s art director, Viktor Antonov, hails from Bulgaria)—at once grand and ground down, alternating steely megaliths with trash-strewn riverbeds and grubby prisons. Its principal opponents aren’t bug-eyed monsters but masked enforcers wielding batons and carbines, their presence given away by indecipherable radio chatter. It’s also, for all its linearity, a celebration of player agency, handing you a Gravity Gun that allows you to pluck and hurl sawblades at enemies, solve slightly goofy seesaw puzzles and pile up objects at whim. The game was widely imitated, within the first-person shooter genre and without, but arguably its greatest legacy is Steam, Valve’s now-globe-straddling desktop games store. It’s hard to imagine players embracing the clunky 2004 version of Steam quite so readily, were it not required to play Half-Life 2.
If Valve’s offering set the standard for FPS design (in terms of its campaign, at least) it was Call of Duty that swallowed up most of the limelight during the ’00s, the critical year being 2007. Weary of World War 2 and conscious of the need to differentiate its offering from Treyarch’s, Infinity Ward decided to transport the series to the present day. The result, Call of Duty 4: Modern Warfare, unlocked a brand-new vocabulary for the first-person shooter. It traded the mud and everyman heroics of WW2 experiences for a slick, cheerfully amoral celebration of western military hardware and urban combat tactics—arming the player with laser sights, ghillie suits, Stinger launchers and drones. It also courted topicality where games like Medal of Honor had tried to distance themselves from the headlines—one level sees you living out the final moments of a country’s deposed president, while another puts you at the controls of an AC-130 gunship, in scenes familiar from news footage of the Iraq War. But what it is mostly remembered for today is the multiplayer. Infinity Ward’s decision to introduce a levelling and unlocks system derived from roleplaying games is the most influential sea change in shooter design during the past decade. Its notion of an online career, whereby players kept plugging away for small rewards rather than just enjoyment, also helped popularise the emerging concept of the game as ‘service’.
Call of Duty 4 wasn’t the only game to do a little genre-splicing in 2007. Irrational’s BioShock began life as a spiritual follow-up to the System Shock series—its creative director, the soon-to-be-famous Ken Levine, was a designer on System Shock 2—but over time it became more of a shooter than an immersive simulation or RPG. It casts the player as an airplane crash survivor exploring a disintegrating undersea ‘utopia’ created by a renegade industrialist, in a thinly disguised meditation on the philosophy of Ayn Rand. The game’s combat, which married chunky period firearms with pseudo-magical powers or ‘Plasmids’, would prove its weakest element. More intriguing was the universe of cruelty and hubris it sketched, a labyrinth of leaking glass tunnels and domed Art Deco plazas.
Building on Half-Life 2’s example, Irrational left much of Rapture’s backstory for players to discover in the form of audio diaries, graffiti and random bric-a-brac. Its environmental storytelling would attract legions of imitators across several genres, from Raven Software’s unfairly overlooked 2010 shooter Singularity through body-horror masterpiece Dead Space to so-called ‘walking simulators’ like Gone Home. It also formed part of an ongoing conversation about games as a means of rousing empathy or exploring moral quandaries. BioShock’s signature characters are the Little Sisters, mutated little girls who collect genetic material from corpses under the eye of their powerful guardians, the Big Daddies. Having disposed of the latter, you can either spare Little Sisters or kill them to harvest their ‘ADAM’, a resource you can use to upgrade your own powers.
The late ’00s saw the rise of the open world shooter, with Crytek’s fearsome Crysis swaddling the player in power armour in order to battle aliens on yet another overgrown island wilderness. The game was sold as an exercise in technological masochism, its detail, lighting and plethora of effects ‘melting’ all but the most expensive PC hardware. But its real trump card was the ability to enhance your Nanosuit’s agility, strength or endurance on the fly by drawing power from a finite reservoir, making for an engaging risk-reward system. It was soon eclipsed, however, by the Far Cry series, which Crytek had by now sold to Ubisoft. That’s both in spite of and thanks to Far Cry 2, an astonishing, bruising shooter stretched across 50 kilometres of African brush. Drawing on his experiences with the Splinter Cell games, designer Clint Hocking set out to create a brutal, Heart of Darkness-esque sandbox in which players fought malaria, self-propagating fire and bullets simultaneously. The results were arresting, but also frustrating, thanks to a patchy narrative, alternately dim or eagle-eyed AI and an unfair enemy respawning system.
2012’s widely acclaimed Far Cry 3 removed much of the frustration, and a little of the sophistication. It opened out the terrain, fine-tuned the AI to be more predictable, and put capturing enemy outposts—each a potted stealth-combat puzzle, inspired by the Borgia towers in Assassin’s Creed 2—at the heart of exploring the map. It also created a combo system, with players chaining melee executions into ranged takedowns, reflecting a growing interest across the industry in fluid first-person animations, epitomised by DICE’s 2008 parkour game Mirror’s Edge. Less positively, it traded the second game’s understated, callous portrayal of a perpetual civil war for a farcical story about whiny, kidnapped backpackers wrestling with the definition of insanity.
Players unconvinced by Far Cry or Crysis had a number of rival open world shooters to choose from. One of them was the Stalker series, inaugurated by Ukrainian developer GSC Game World in 2007, in which scavengers pick their way through radioactive ruins while keeping a lookout for monstrous creatures and invisible, fatal anomalies. Stalker’s supporting systems were remarkable—at one point, the AI was allegedly capable of completing the game by itself—but its punishing survival simulation ethic limited its audience. Gearbox’s roleplaying shooter Borderlands took a friendlier, trashier tack. Released in 2009, it saw you touring an anarchic, comic book-style planet as one of four classes, hoovering up procedurally generated (often borderline-unusable) weapons. Part of Borderlands’s success, the novelty of its arsenal aside, was its humour—a rare quality in an often po-faced genre.
The turn of the decade saw a number of long-running FPS series beginning to lose momentum. Most obviously, the Medal of Honor series underwent an abortive attempt at reinvention in 2010, with publisher EA looking to fill gaps in the schedule between Battlefield instalments. In jumping forward from WW2 to present-day Afghanistan, the once-proud series merely left itself open to unflattering comparisons with 2009’s Call of Duty: Modern Warfare 2. id Software’s properties were also at a low ebb. Though an accomplished horror experience, 2004’s Doom 3 lost out to Half-Life 2, while Quake had all but evaporated following Quake 4’s muted reception in 2005. Raven Software’s 2009 Wolfenstein reboot doubled down on the paranormal aspects of the series’ backstory, to mixed effect. Following a similarly lukewarm response to Singularity, parent company Activision retasked the studio to help out with the Call of Duty series. RAGE—id’s only new IP during these years save mobile game Orcs & Elves—proved a visual extravaganza and a gratifyingly hefty, Mad Max-ish shooter, but all too often felt like it was playing second fiddle to its own graphics technology. id’s old foe Epic, meanwhile, was increasingly dedicated to the third-person Gears of War series and its flourishing Unreal Engine business.
Call of Duty continued to reign supreme, though it attracted increasingly stiff competition from EA’s Battlefield—a franchise often (and a little unfairly) pitched as a freeform ‘thinking man’s shooter’, more respectful of player agency than the linear, attrition-driven Call of Duty. After experimenting with a lighter, buddy-comedy vibe in the Bad Company spin-offs, DICE amped up the grandeur with Battlefield 3, a multiple-perspective tale of abducted nuclear weapons set partly in Iran (the bestselling instalment until DICE’s journey into WW1 with Battlefield 1). The series had become famous for its Frostbite engine technology, which amongst other things allowed for real-time terrain destruction in multiplayer: participants could do everything from blasting out spyholes in walls to levelling buildings.
Call of Duty’s greatest existential threats, however, were a mixture of internal discord and external market pressures. In March 2010, Activision—now by far the industry’s largest publisher, following a mega-merger with Vivendi and its subsidiary Blizzard—fired Infinity Ward cofounders Jason West and Vince Zampella over alleged insubordination. A few weeks later, West and Zampella announced the foundation of new studio Respawn Entertainment. A wave of lawsuits and countersuits followed, alongside a mass exodus of staff from Infinity Ward to Respawn. Activision was forced to call upon the recently founded Sledgehammer Games to help the depleted Infinity Ward finish Modern Warfare 3.
While the series weathered this crisis—thanks largely to Treyarch’s pop-savvy, hallucinogen-crazed Black Ops subfranchise—Activision and other publishers also had to manage a problem of budget versus expectation. Scripted corridor campaigns in the Half-Life vein were proving increasingly expensive, thanks largely to the cost of HD art assets, and telemetry showed that players spent the bulk of their time in multiplayer. However, attempts to remove singleplayer from the package led to an outcry. Among the teams that struggled with this problem was Respawn, whose EA-published debut Titanfall pioneered the concept of campaign multiplayer, with narrative elements, such as picture-in-picture cinematics, dropped into rounds of team deathmatches. The game was enthusiastically received—a mixture of towering mech combat and nimble parkour duelling, it restored something of Quake and Unreal Tournament’s agility to a genre that had become bogged down in cover combat. Its audience tailed off swiftly, however—many first-person shooter enthusiasts found the mechs-and-pilots premise to be more of a novelty than a game-changing fixture, though the larger problem was perhaps that, on consoles, Titanfall was exclusive to the Xbox brand.
Other shooter developers ‘rediscovered’ mobility during this decade—Call of Duty: Advanced Warfare and Black Ops III dabbled at length with powered exosuits, while Halo 5: Guardians added boost slides, double-jumps and ground-pounds to Master Chief’s moveset. But the game that brought it all together was 2014’s Destiny, the work of erstwhile Halo developer Bungie, now free from producing games solely for Microsoft. It’s a mixture of MMO-style looting and Titanfall-esque acrobatics, all bundled up in an aesthetic that is reminiscent of the ’70s space race and classic sci-fi book cover illustrations. Destiny is in some ways quite a soulless game: it’s as grindy as Borderlands and far less self-deprecating, but its ruined, yet sumptuous, solar system environments have an irresistible mystique. It also feels tremendous in the hands, with some beautifully judged weapon designs and class abilities.
With last year’s Call of Duty: Infinite Warfare tracking far behind Black Ops III, Destiny has become one of Activision’s two flagship shooters. The other is Blizzard’s joyful arena shooter Overwatch, released in 2016. Overwatch is a lovely game to end on because it is essentially an interactive genre history, a celebration of its triumphs, foibles and even failures. It doesn’t merely reach out to weapons, gadgets and abilities from other shooters, but also their quirks, exploits and the antics of their communities—Quake’s rocket jumping, aimbots from Counter-Strike and internet edgelords in general. Its heroes are love letters to 30-odd years of genre history. Pro-gaming celeb turned mech pilot D.Va is both a potted Titanfall and a parody of the noxious ‘gamer girl’ stereotype, for instance. Soldier 76, meanwhile, is Call of Duty man. Even as it pays tribute, however, Overwatch also points to the future—be it in the effortless way it folds in concepts from fighting games and MOBAs, or in how it extends the FPS cast-list well beyond the muscular, dudebro protagonists beloved of so many rivals. It speaks to the enormous range of concepts that make up the modern FPS, for all its myriad hang-ups—a genre that has always been about so much more than firing a gun.
Last week, PlayerUnknown’s Battlegrounds overtook Dota 2’s record for concurrent players on Steam. The previous record-holder, Dota 2, while admittedly made by one of the world’s biggest and most powerful games companies, began as a Warcraft mod. These days, we barely blink an eye at the idea that a game can come from nowhere and, through word of mouth, clever concepts, a bit of cool technology like Portal’s… well, portals… or simply by hooking into some reservoir of good feeling, accomplish more than any marketing budget can dream of. Minecraft is this generation’s Lego. Undertale is one of its most beloved RPGs.
Indeed, the world of indie development is now so important that it’s hard to remember that it’s only really a decade or so old. That’s not to say that there weren’t indie games before then, as we’ll see, but it was only really with the launch of Steam on PC and services like Xbox Live Arcade that the systems were in place both to get games in front of a mainstream audience, and to provide the necessary ecosystem for players to quickly and confidently pay for new games.
The massive success of indie games on Steam has of course come with attendant pitfalls. The early access program gave small studios the ability to beta test their games with player numbers they could not otherwise reach, and gave players the ability to take part in shaping games. However, a lack of guidelines left players and developers with very different expectations, as was seen in the reaction to a paid expansion being released for Ark: Survival Evolved while it was still in early access. Steam Greenlight made it easier for indie games to get on Steam but became a popularity contest that was easily gamed, leading Valve to replace it with Steam Direct.
All this is largely taken for granted these days, with the big challenge for modern indie games being to stand out. Simply getting onto Steam back then could set a studio up for life. These days the market is full to bursting, with most new releases disappearing from sight almost at once.
In both cases though, it’s a world away from how the market began.
The exact definition of ‘indie’ has never exactly been cut and dried. To some, it’s an aesthetic, best summed up by the classic bedroom coder. To others, it’s a more commercial distinction, of working without a publisher. To others still, it’s ultimately about the work, with an indie game standing out for not being the kind of thing you get from a commercial company, rather than for who exactly made it.
There are many definitions to play with, and few hard lines to draw. The poster-children of ’90s shareware, id Software (who you may know courtesy of a little game called Doom), began working under contract for a company called Softdisk, cranking out games like Dangerous Dave in the Haunted Mansion, Hovertank 3D, and Catacomb 3D, before moving on to make games with/for shareware giant Apogee.
In the very early days of gaming, just about everybody was indie to some extent. In 1979 Richard Garriott set out on his path to buying a castle and going into space by selling copies of his first RPG, Akalabeth, in ziploc bags at his local computer store (one of those copies then ended up in the hands of California Pacific, who offered Garriott a publishing deal). Sierra On-Line began in 1980 as just husband and wife team Ken and Roberta Williams, making simple adventure games like Mystery House that nevertheless pushed the boundaries of what people expected from games at the time—like having graphics—before booming to become one of the biggest and most important companies in gaming history.
Companies could emerge from almost anything. Gremlin Interactive began as a computer store called Just Micro, while DMA Design, originally Acme Software, which would make its name with Lemmings and much later become Grand Theft Auto creator Rockstar, began from its founders meeting up at a computer club in Dundee and ultimately signing with Psygnosis. Whole genres were created from a single game, such as Football Manager in 1982.
The speed of all this took many by surprise, with Balance of Power creator Chris Crawford saying in 1984, "We have pretty much passed the period where hobbyists could put together a game that would have commercial prospect. It’s much more difficult to break in, much less stay in. If you want to do a game, do it for fun, but don’t try to do game designs to make any money. The odds are so much against the individual that I would hate to wish that heartbreak on anyone."
But of course, people continued. The PC was largely left out of much of it, however, due to the relatively high cost of disks and the general perception that it wasn’t a gaming machine. In the UK, the main indie scene in the ’80s was on cassette-based 8-bit systems like the ZX Spectrum, with publishers happily accepting almost any old tat, recording it to a tape, sticking it in a box, and selling it for a few pounds at newsagents, game stores, and anywhere else that would take copies. They were cheap, sometimes cheerful, and allowed for endearing weirdness like 1985’s Don’t Buy This—a compilation of the five worst games sent to publisher Firebird.
It would be many years before most indie PC games could get that kind of placement. Instead, there was shareware. The concept dates back to the 1970s, though it was popularized by PC-Write creator Bob Wallace in 1982. Rather than having a central distributor like a regular published game, users were encouraged to copy software and pass it along. If they liked it, they’d then send the creator a check to unlock the full thing or get more of it.
In the case of Apogee Software, and indeed what became known as the Apogee model, a game might have three parts. The first one would be free, and free to share, the other two commercial and only for registered purchasers to enjoy. (Not that anyone really listened, as the vast, vast numbers of pirated copies of Doom probably shows better than anything.)
The beauty of the system was that anyone could distribute these games, with the rule being that while you weren’t allowed to sell the shareware version, you could charge for materials. That meant games could appear on magazine cover disks and later CDs. They could be on any university server or dial-up BBS or services like Compuserve and AOL. If you wanted a relatively full choice however, you often needed to send off for them. Whole companies were set up to sell just the trial versions, sending out printed catalogues of their stock and charging by the disk.
By the mid-90s of course the popularity of CD had rendered this relatively pointless, with ‘1000 Games!’ CDs available in supermarkets and bookstores and anywhere else there might be an audience, rarely mentioning the part about them being glorified demos. Much like on Steam today, at this point most smaller games got lost. Still, as a player, it was an almost inexhaustible feast.
As crazy as sending off a check to get a game might seem, it worked. In a few cases, registered shareware games even made the jump to boxed products in stores, though that was relatively rare. Either way, shareware was hardly a license to print money for most, but it supported many a developer throughout the '90s and made others their fortunes. Epic MegaGames began with the text-based RPG ZZT before becoming the company that made Unreal. Duke Nukem began as a very simple 2D side-scroller, notable mostly for oddities like the main character wearing pink and just wanting to save the world so that he could get back to watching Oprah, but nevertheless blossomed into Duke Nukem 3D before publicly wilting into Duke Nukem Forever.
And there were many more stars too, regularly appearing in new games or simply popular ones that kept showing up, like Skunny the squirrel and his awful platforming (and ultimately karting adventures), Last Half of Darkness, and Hugo’s House of Horrors, much beloved by magazine and compilation editors for its extremely pretty first screen, and never mind that it was all made of clip art and every other room in the game was barely MS Paint-level scribbles.
Shareware's big draw for players was, inevitably, free games. The downside of the Apogee model and others that erred on the generous side was that a whole episode was often enough—especially as that’s where the developer’s best work tended to be. Compare for instance the deservedly beloved shareware episode of Commander Keen: Goodbye, Galaxy! where you run around a beautiful, varied planet, with the dull space adventure of its commercial sequel. Not every game could be Wolfenstein 3D and promise a fight with Robot Hitler if you paid.
Less cynically though, shareware gave many genres their home. The PC was typically seen as a business machine, with its commercial successes often adventures, RPGs and other slower and more cerebral offerings. There were platformers and beat-em-ups and similar, but they were usually poor conversions from other platforms at best, with few worth taking a risk on.
Shareware removed that risk factor for customers, while letting developers show off. The original Commander Keen, while simplistic to modern eyes, was proof that the PC could do console-style scrolling, even if it wouldn’t be until 1994’s Jazz Jackrabbit that anyone could seriously claim to be doing convincing 16-bit console-style arcade action and visuals. (Even then it wasn’t a very strong claim, but luckily by this point the PC had Doom and so didn’t care.)
This led to a flurry of games you really couldn’t get elsewhere, or that were in very short supply on the shelves, from vertical shooters like Major Stryker, Raptor, and Tyrian, to fighting games like One Must Fall, to quirky top-down RPGs like God of Thunder, and racing games like Wacky Wheels. It offered a great split. When you wanted a deep, polished experience, you had the commercial game market. For action fun, there was shareware, not least because when we did get big games like Street Fighter II, they tended to stink. Shareware supported the industry through much of the '90s.
By the mid-90s though, there was a problem. Commercial games began rapidly outstripping what bedroom teams could do, both in terms of technology and complexity of content. While there were engines available, they were mostly poor quality, with nothing like Unity on the market and the likes of Quake and Unreal costing far too much for anyone but other companies to license.
If you wanted to play with that kind of technology, you were looking at making mods instead. This was the era that gave us the likes of Team Fortress (1996) and Defense of the Ancients (2003), but also where the indie scene became largely forgotten. This wasn't helped by the fact that indie had essentially no place on consoles at all, despite a few nods over the years like Sony’s Yaroze console, a development PlayStation aimed at hobbyists released in 1997. The PC saw its own push towards home development with tools like Blitz Basic/BlitzMAX (2000) and Dark Basic (also 2000), with the goal of inspiring a new generation of bedroom coders. However, despite selling reasonably well, none of them gained much traction or saw many releases.
The indie scene as a whole ceased to be a big player in the market—which isn’t to say that it vanished. Introversion’s Uplink for instance was a big hit in 2001. Jeff Vogel’s Spiderweb Software started releasing old-school RPGs like Exile and Geneforge in 1995. PopCap began in 2000, becoming the giant of casual games like Bejeweled, Peggle, Bookworm Adventures, Plants Vs. Zombies, and Chuzzle—not bad for a company that was originally called ‘Sexy Action Cool’ and planned to make its debut with a strip poker game.
And of course, there are other notable exceptions, such as Jeff Minter, who never stopped making his psychedelic shooters both for himself and others. However, it wasn’t until Steam nailed digital distribution in 2004 that the market had a chance to explode and offer developers a real shot at going it alone.
Steam wasn’t the first digital distribution system, and at its launch it wasn’t even popular, with Valve forcing it on players for both Half-Life 2 and Counter-Strike. However, it was the first major attempt that nailed the details, like being able to download your games on any computer you owned rather than having them locked to just one, and being able to do so perpetually, rather than simply for a year, as was the case with most of the competition.
The results spoke for themselves. Back when Valve was far pickier, and a publisher’s backing was a distinct advantage in getting onto the system, any developer who managed to get onto Steam effectively received a license to print money. Farther afield, though games not on Steam were at a distinct disadvantage, the legitimisation of digital distribution as a concept certainly raised most boats.
And with all this came something just as important: the indie game ecosystem. With money to be made and developers flocking to indie for all sorts of reasons (being tired of the big companies, wanting to make a go of an independent project) it became viable to create tools and systems to support the scene: Game Maker, for instance, along with Unity and Flash. Today, would-be indie developers have the tools to go head-to-head with even the biggest studios, albeit typically on a smaller scale, as well as explore more cost-effective options like pixel art and procedural 3D, while services like Kickstarter and Fig offer a way of seeking funding without immediately selling out.
This also opened the definition of ‘indie’ even further, with companies seriously able to consider going it alone, without a publisher. Not everyone could be Double Fine, raising $3.5 million for Broken Age, but many have had huge successes—Pillars of Eternity pulling just under $4 million, the Bard’s Tale getting $1.5 million and, at the height of Kickstarter fever, even Leisure Suit Larry creator Al Lowe managing to raise $650,000 for a remake of the first game.
It’s at this point that the word 'indie' really catches on. Again, it’s not that it was never used, but until this point the scene wasn’t big and important enough to warrant a position as basically a shadow industry in its own right. The release of Cave Story in 2004 was where people really started talking in those terms, with Indie Game: The Movie in 2012 cementing this, highlighting three of the most successful titles of the time—Braid, Fez and Super Meat Boy.
Microsoft embracing the scene via Xbox Live Indie Games played its part, as did its XNA development system and its attempts to make a big deal out of indie launches during the "Indie Game Uprising" events between 2010 and 2012.
Elsewhere, the IGF (Independent Games Festival), launched in 1999, was also going from strength to strength, drawing more attention to the likes of Darwinia, Monaco and Crayon Physics Deluxe. We also saw more overtly indie-friendly portals like itch.io, and the Humble Indie Bundle, offering new marketplaces and ways of selling games—even if many later bundles proved a dead end.
Perhaps most excitingly, it’s now that we start to see whole genres and styles largely associated with the indie market either flourish or come into existence, not least the ‘walking simulator’—games primarily about exploring a space and a story through environmental detail and voiceover. The first big name here was Dear Esther, a free mod released in 2008 and later remade in 2012, with later examples including Gone Home, Firewatch, and Everybody’s Gone to the Rapture.
There’s also the pixel-art aesthetic of games like VVVVVV, Super Meat Boy, and the original Spelunky, and for many old-school gamers, a return to brutal old-school difficulty. And somehow I doubt we need to say much about Minecraft. (It’s been quite popular, and influential.) Classic point-and-click adventures also saw a resurgence outside of Germany, largely spearheaded by the Adventure Game Studio creation engine and the success of Wadjet Eye Games’ The Blackwell Legacy, Gemini Rue, Technobabylon, and the upcoming Unavowed.
But it’s of course reductive to pick specific genres. The joy of indie games is that as long as the money can be raised somehow, a passionate team can take on more or less whatever they like, free of publisher interference or received wisdom, allowing for arty games like Limbo and Bastion (distributed by Warner Bros, but only as a publishing partner), throwbacks to lost genres like Legend of Grimrock, exploratory pieces like The Stanley Parable and The Beginner’s Guide, or completely new concepts like Superhot, where time only moves when you do, and the ferociously complex Kerbal Space Program, where difficulty really is a matter of rocket science.
The downside is that, as ever, it’s not enough to simply make a game. An indie title buoyed by word of mouth can sell millions, but far more are doomed to languish largely unplayed and undiscussed in the depths of Steam’s increasing piles or other services’ far less traveled shelves. The initial gold rush is very much over. Still, plenty of gold remains. It’s impossible to predict what game will be the next Spelunky, the next Minecraft, the next Undertale, or the next Super Meat Boy, but absolutely no risk at all to bet that whatever it is, it’s already on its way.
Wolfenstein 3D is primarily a game about shooting Nazis. Occasionally you'll shoot dogs too, but they're Nazi dogs. Occasionally you'll steal treasure, but it's Nazi treasure. Nazis are a common foe in videogames because they're unambiguously bad, thus triggering no moral quandaries among those who digitally shoot them.
But what if you could, uh, try to talk them out of being Nazis instead? Rather than fight fire with fire, what if we had an Earl Grey and a chinwag? Dialogue 3D—a "hack" of the original Wolfenstein 3D by Ramsey Nasser—offers one fairly convincing answer: You'd probably not have much luck.
The game comes amid much online debate relating to how people who are not Nazis should treat those who are. Some suggest having a discussion, whereas others are happy to punch them from here to next Sunday. I think it's fairly clear which side of that conundrum Nasser inhabits, and he makes a pretty strong case—albeit via a shallow 1990s videogame.
The game is free, only 7MB, and thoroughly unenjoyable to play. But you might as well give it a shot anyway.
Have You Played? is an endless stream of game retrospectives. One a day, every day of the year, perhaps for all time.
I’ve been trying to remember whether I had some sense when I first played Wolfenstein 3D that I really, truly was playing the future of videogames. It did seem landmark, but back then, age 12 or 13, every new game seemed landmark to me – each was a brand new experience, both because I was so young and because so were videogames.
Originally published in August 2016, this feature makes a return for Food Week.
I don't remember which game we were playing, but it was the kind of Japanese RPG that listed everything you needed to know about its characters down the side of the screen. Magic points, coins, food, all summed up with helpful numbers. Only one of them was abbreviated: HP.
“What does HP stand for in this game?” I asked my friend, an expert on JRPGs.
“Health pineapples,” he confidently replied. “You have to knock all the pineapples off before you can hurt someone.”
HP, whether it stands for hit points, health power, or indeed health pineapples, is one of many mechanics to come to video games via the original tabletop roleplaying game Dungeons & Dragons. However, the idea of representing the amount of punishment a character can take with a discrete number of points is much older than D&D. And while we might all know what the abbreviation means, it turns out that what hit points are meant to represent isn't quite so obvious.
D&D's co-creator Dave Arneson once explained that the earliest version of the game didn't have hit points. The rules had evolved from wargames he and fellow D&D inventor Gary Gygax played, in which a single successful attack was all it took for a soldier to die.
That changed when they started experimenting with having players control individual heroes rather than entire armies, as players identified with them much more strongly. As Arneson put it, "They didn't care if they could kill a monster in one blow, but they didn't want the monster to kill them in one blow."
Arneson had previously made his own rules for a naval wargame set during the Civil War called Ironclads, and together with Gygax had collaborated on a Napoleonic naval game called Don't Give Up The Ship! Both games had a mechanic that allowed for ships to take multiple hits before being sunk, which they'd borrowed from the wargaming rules designed by author Fletcher Pratt in the 1930s. They borrowed those rules again for D&D.
In his book about the history of simulation games, Jon Peterson explains why hit points were such an important idea: "Hit points introduce uncertainty and variance […] In Dungeons & Dragons, even when the prospects of a hit are near certain, the damage dice provide another potential survival mechanism via endurance, another way of forestalling death and increasing the drama of combat."
Like D&D, video game combat discovered a new sense of drama with hit points. Early arcade games like 1978's Space Invaders typically killed players with a single successful enemy contact, using multiple lives to prolong the experience. Replacing that with the ability to survive a set number of hits before dying added a finer-grained rise in tension. It removes the frustration of being reset to the start of a level every time a player is so much as brushed by an enemy, and as the number of hit points remaining falls, your anxiety rises in direct correlation.
Being on your last life may make you cautious, but there's a smoother transition with hit points. You gradually shift between playing more carefully as you approach half-health, biting your metaphorical nails as it dwindles below that, and sinking into erratic risk-taking when only a sliver of life remains.
Video games inspired by D&D were the first to copy hit points, as far back as 1975, in games coded for the PLATO system developed by the University of Illinois. One of them, DND, was also the first game to have bosses, who could have hundreds or even thousands of what it called Hits.
The first official adaptations of D&D to PC were the Gold Box series begun by SSI in 1988. They followed the rules of what was then called Advanced Dungeons & Dragons closely, which meant beginning characters had very few hit points. Playing around a table there's always the option to fudge dice rolls to prevent deaths from feeling too arbitrary, but the computer was never so forgiving and players got used to reloading frequently.
Games that weren't licensed had no such problem. The first Ultima began players with a tidy 150 hit points, and the second with 400. Important non-player characters like Lord British had totals so high that killing him became a challenge in its own right, and by Ultima III players were luring Lord British to the beach so they could attack him with cannon fire, as if he were one of the naval ships hit points came from.
Arcade games tended not to represent hit points numerically, however. Memorably, in the platformer Ghosts 'N Goblins (ported to the Commodore 64 in 1986) Sir Arthur lost his armor on taking damage, continuing to fight in his underwear.
One of the first games to represent hit points with the now-familiar life bar was a 1985 dungeon crawler by Namco, whose Vitality meter changed from blue to red as you took damage from its bats, snakes, and cave sharks. While red life bars would go on to become standard, other ways of visualizing hit points have been tried with varying degrees of success.
One 1983 ZX Spectrum/BBC Micro game had a slowly depleting roast chicken that tracked your starvation, and dinosaur fighter Primal Rage used veins leading to a heart that exploded at the moment of defeat.
Other games have tried to make their life bar a part of the game world, as in the first-person Jurassic Park game Trespasser, where it's a heart tattoo on the protagonist's breast you have to look down at to check. In sci-fi horror game Dead Space the life bar is represented by lights on the back of your armor, which would be very useful if you had a doctor standing directly behind you. Each of these visualizations is just a way of integrating a hit-point counter into the world, but in doing so they free the player from having to correlate a number with something that should feel real and immediate. They're all still the same old hit points, under the surface.
MIDI Maze, a 1987 first-person shooter on the Atari ST, was an early example of both the deathmatch shooter and the idea of representing hit points visually. Each player was a floating smiley face, like a three-dimensional Pac-Man, and an icon of that face at the top of the screen became sadder as they took damage. Later shooters like Wolfenstein 3D and Doom would copy this idea, their protagonists' faces growing more bruised and bloody as they absorbed bullet after bullet.
MIDI Maze is an early example of another change in the way hit points worked, as it also had regenerating health. It wasn't the first, however. The action-RPG Hydlide, released on Japanese home computers like the PC-88 in 1984, gave players back hit points when they stood still. Where other games had food and first-aid kits that functioned as magically as the healing potions in fantasy RPGs, regenerating health, though no more realistic, at least took health items out of the game world. It made healing an abstraction, like hit points themselves, rather than requiring players to assume Johnny Medkit has wandered the world ahead of them scattering healing items like seeds.
It was Halo: Combat Evolved that popularized regenerating health, which is ironic because it didn't really have it. Halo's hero Master Chief wears an energy shield that regenerates after a short interval without taking damage, but once that's gone he has a traditional life bar that can only be refilled with medkits.
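Halo's two-layer model is simple to express in code. The sketch below uses made-up numbers rather than Halo's real tuning; the point is the structure: damage drains the shield first, the shield refills after a short interval without damage, and the health underneath never regenerates on its own.

```python
class HaloStyleHealth:
    """A regenerating shield layered over a fixed pool of hit points."""

    def __init__(self, shield=75, health=45, recharge_delay=4.0):
        self.max_shield = shield
        self.shield = shield
        self.health = health            # only refilled by medkits
        self.recharge_delay = recharge_delay  # seconds without damage
        self.since_hit = 0.0

    def take_damage(self, amount):
        self.since_hit = 0.0
        absorbed = min(self.shield, amount)
        self.shield -= absorbed
        self.health -= amount - absorbed  # overflow damage hits the body

    def tick(self, dt, recharge_rate=25.0):
        """Advance time; only the shield refills, never the health."""
        self.since_hit += dt
        if self.since_hit >= self.recharge_delay:
            self.shield = min(self.max_shield, self.shield + recharge_rate * dt)
```

Because only the outer layer comes back, the player is encouraged to duck into cover constantly while still feeling the slow, permanent attrition of the inner life bar.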
However, the recharging energy shield was what gave Halo its famous "30 seconds of fun that happened over and over and over and over again," as designer Jaime Griesemer put it, letting players pop out of cover to shoot aliens and then duck back to recharge and reload, and that's what had a lasting impact.
The idea was copied and modified by plenty of other games. Call Of Duty has become the flag-bearer for regenerating health, taking the blame for its propagation though it wasn't introduced until the second game in the series. Even in the mid-2000s as it was first becoming widespread, regenerating health was criticized by old-school shooter fans for removing some of the drama and tension that hit points represent. It's still enraging comment sections today.
Three games released in 2005 and 2006 all tinkered with ways of making regenerating health retain the sense of rising tension that hit points were first introduced to create. Condemned: Criminal Origins, Prey, and F.E.A.R. all set a floor on automatic healing so that if you take enough damage to fall below around 25% of your hit points you can't regenerate back above that line. It models a difference between taking a serious wound and the kind of graze action heroes can just walk off, and adds grit to more serious games.
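That floor is a small twist on standard regeneration. Here's a rough sketch, using the roughly 25% threshold mentioned above and otherwise invented numbers: once you've dropped to or below the floor, healing can bring you back up to it but never past it.

```python
def regenerate(current_hp, max_hp, rate, dt, floor_fraction=0.25):
    """Regenerate health, but never back above the floor once below it.

    A player at or below floor_fraction of max HP is 'seriously wounded':
    regeneration can only bring them back up to the floor, modeling the
    difference between a graze and a wound that won't just close up.
    """
    floor = max_hp * floor_fraction
    cap = floor if current_hp <= floor else max_hp
    return min(cap, current_hp + rate * dt)
```

A lightly grazed hero heals back to full, while one who has taken a beating is permanently hobbled until they find real medical attention, which is the grit those three games were after.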
When the Just Cause games toy with this, only letting you regenerate a percentage of the most recent damage you take, it can seem at odds with their over-the-top action.
Horror games have also tweaked the way they use hit points to suit the genre. Zombie game Left 4 Dead slows you down the more you're hurt, making it harder to run away from the infected as if you're a movie character being worn down by the chase. In Silent Hill 4: The Room you regain health in your apartment, but when that safe space becomes tainted it stops healing you, a mechanical sign of its corruption that ensures you feel the same dread the character would.
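Mechanics like Left 4 Dead's slowdown are really just hit points mapped onto another stat. A hypothetical linear mapping, not the game's actual formula, might look like this:

```python
def move_speed(hp, max_hp, base_speed=220.0, min_fraction=0.5):
    """Scale movement speed with remaining health.

    At full health you run at base_speed; as hp falls, speed drops
    linearly toward min_fraction of it, so the badly hurt hobble
    along like a horror-movie victim being run down.
    """
    fraction = max(0.0, min(1.0, hp / max_hp))
    return base_speed * (min_fraction + (1.0 - min_fraction) * fraction)
```

The numbers (220 units, a 50% floor) are placeholders; the idea is that every lost hit point is felt immediately in the controls, not just read off a bar.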
Still, across all of these games, what hit points represent isn't entirely clear. Are they purely the injuries you endure, as the suffering face of Doomguy suggests? If that's true why is it so easy to get hit points back, whether through healing items or regeneration or drinking Fallout's irradiated toilet water?
In The Lord of the Rings Online hit points are replaced by morale, which explains why singing a jaunty tune helps top it up. In the Assassin's Creed games it's synchronization, a representation of how accurately your digital simulation is recreating historical events although that raises the question of why being hurt during events where your historical analogue was also hurt doesn't improve synchronization.
Even in D&D it's unclear what hit points really are. In the Dungeon Master's Guide for Advanced Dungeons & Dragons 1st Edition, Gary Gygax wrote that hit points reflect "both the actual physical ability of the character to withstand damage as indicated by constitution bonuses and a commensurate increase in such areas as skill in combat and similar life-or-death situations, the sixth sense which warns the individual of some otherwise unforeseen events, sheer luck, and the fantastic provisions of magical protections and/or divine protection."
(Charmingly, the rules then went on to explain that Rasputin would have been able to survive for so long because he had more than 14 hit points.)
Constitution, skill, sixth sense, luck, magic, and divine protection are a lot of things to bundle into one number, and they raise further questions about why, for instance, poisoned attacks cause extra damage to your "sixth sense". When asked at conventions what hit points really are, Gygax was dismissive, giving different answers each time. Sometimes he said hit points represent the way swashbuckling movie heroes survive so many fights, or that they were an entirely meaningless number, nothing more than a way of making the game's combat more enjoyable for players.
That second answer is perhaps the best explanation. Given that hit points started out as a way of simulating the ability of a ship's hull to weather cannon-fire, it's only natural that there's going to be some vagueness and necessary abstraction when we apply that same concept to our video game heroes. They may as well be health pineapples, after all.
No one wants to end up in jail, but there’s something fascinating about life in the clink. There have been some great fictional prisons in literature and cinema—and video games too. The following hoosegows are some of the toughest, most brutal, and hardest to escape from in gaming. Some horrible prisons, both new and old, have made their way onto this list since we first wrote it.
From freezing Russian labor camps to max-security space-jails, these are the scariest imaginary prisons on PC.
B.J. Blazkowicz had to shoot an awful lot of Nazis to escape from the labyrinthine Castle Wolfenstein. As prisons go, Wolfenstein does offer some perks: ample access to weaponry, secret Nazi treasure, and delicious, hearty meals. On the downside, the dogs aren't very friendly and there's a giant Nazi with two machine guns standing between you and the exit. If you take too many bullets, you'll have to resort to eating dog food. Yuck.
Batman famously has one of the best rogues' galleries in comics, and his nemeses inevitably end up in Arkham, Gotham's prison for the criminally insane. 2009's brilliant Arkham Asylum makes the prison itself the star, imagining it as a densely interconnected 3D playground in the vein of Super Metroid. As Batman gains new bits of equipment he opens up new ways to explore and unlocks new shortcuts. In the end, Arkham Asylum has some great depictions of Batman's villains and the dark knight's abilities, but mastering the asylum is the true joy.
The Souls series has some of the toughest prisons in gaming. Dark Souls starts you off in one, the Undead Asylum, which is guarded by an overweight demon that ruins newcomers on the reg. Dark Souls 2 has the Lost Bastille, a prison made entirely of cold grey stone, patrolled by undead knights and exploding mummies, and wraps with a boss battle against three nimble suits of armor. But Dark Souls 3’s Irithyll Dungeon is the prison-iest of all (most prison-y?). It glows a sickly green and greets you with the Jailers, spooky robed guards that lower your max health just by looking your way. Explore the cells and eventually you’ll run into the wretches, grotesque human-dragon hybrids, botched experiments of the Lothric family. Deeper in you’ll find giants taken prisoner, massive sewer rats looking for a snack, a downright mean basilisk ambush, some items that let out a loud scream when picked up to alert nearby enemies, a gluttonous humanoid with an enlarged hand for a head called—what else—the Monstrosity of Sin, and some sewer centipedes. Don't Google them.
It’s an awful place that folds over on itself in a disorienting search for one key after another, delaying your escape just beyond its rows and rows of thick iron bars. Get in, save Siegward, and never return.
Protagonist Vito Scaletta gets busted for selling stolen ration stamps and ends up in the clink. This is an act break of sorts, separating the game’s 1940s and 1950s chapters. The slow walk through the gates, being yelled at by jeering prisoners, is straight out of The Shawshank Redemption. You pass the time by punching people and scrubbing toilets, before emerging into a terrifying world of quiffs and rock and roll.
JC Denton defects from UNATCO and becomes a wanted man. He’s captured and wakes up in a mysterious underground cell. With the help of a creepy AI calling itself Daedalus he manages to escape, only to discover that the sinister prison facility is located below UNATCO’s Liberty Island headquarters. Most people who mess with Majestic 12 end up dead, but JC uses his nano-powers to break out and flee to Hong Kong.
Butcher Bay is a space-prison for the galaxy’s toughest, gruffest space-bastards. Escape From Butcher Bay sees the titular Riddick, played by Vin Diesel, breaking out of this maximum security sci-fi prison by stabbing, choking, shooting, and sneaking past its small army of guards. But, even though escape is his top priority, he still finds the time to enter bare-knuckle boxing matches and shiv other prisoners.
“It used to be a high security prison,” says Alyx Vance, gravely. “It’s something much worse now.” She always was good at introductions. Nova Prospekt is an old prison that the Combine have converted into a facility for processing any ‘anti-citizen’ who fights against their tyranny. ‘Processing’ meaning being turned into a hideous half-machine monster. A grim place indeed, but no match for Gordon’s gravity gun.
The Suffering is a mostly forgotten 2004 shooter from Midway, set on the twisted Carnate Island off the coast of Maryland. The penitentiary itself, where you're on death row, is just the beginning—the whole island has a dark history, including an insane asylum and a whole lot of executions. Hell breaks loose immediately when an earthquake calls up hordes of twisted monsters, who proceed to wreak havoc on the prison. It all may sound like standard horror fare, but The Suffering stood out thanks to some fabulously creepy designs by Stan Winston Studios. Those are monsters we would not like to be trapped on an island with.
Probably the toughest prison on the list, Vorkuta is a grim Russian labour camp and one of the most memorable levels in Black Ops. With help from Viktor ‘Gary Oldman’ Reznov, your fellow prisoners, a mini-gun called the Death Machine, and giant slingshots loaded with explosives, you battle to freedom and destroy half the prison for good measure. Shame about that rubbish vehicle section at the end.
The prison ship Purgatory, operated by the Blue Suns mercenary company, is where unstable biotic Jack finds herself. Commander Shepard, hunting for the galaxy’s baddest asses, flies there in order to recruit her. Before it was a prison, the ship was used to transport animals, which explains the tiny cages masquerading as cells. It’s not all bad, though: if it gets crowded, the Blue Suns will dump you on a nearby planet.
This desert prison used to be a peaceful coal mining town, but now it’s a hellish jail. Cloud and co. are dumped here after a misunderstanding, and have to earn their freedom by entering, and winning, a chocobo race in the Gold Saucer theme park that looms over the prison. As far as I know, this is the only time in gaming history where you escape from jail by riding a giant chicken. Hopefully it’s not the last.
Coldridge Prison: not a very nice name. Why not Warmridge Prison? Dishonored protagonist Corvo Attano is sent here after being wrongly accused of murdering the Empress he was charged to protect. It’s an imposing building—designed by the same guy who dreamed up Nova Prospekt, Viktor Antonov—and serves as the game’s tutorial. Murderous inmates, brutal guards, and rats are among this foul place’s residents.
This Alaskan military base isn’t technically a prison, but Solid Snake finds himself imprisoned in a cell there during the first MGS. There are a few ways to escape, but my favourite is spilling a bottle of ketchup and lying down next to it. The idiot guard thinks you’ve killed yourself and rushes in to help, giving you a window to break out.
Only slightly harder to endure than listening to the band Bastille, this famous French prison was notorious for its brutal treatment of prisoners. It’s here that the foppish hero Arno Dorian learns how to fight, and ultimately becomes an assassin. After the French Revolution it was demolished and replaced with a monument, but it will live forever in the decidedly average Assassin’s Creed Unity. C’est la vie.
Hell's Prison, posted on Reddit, is just one of thousands of devious, depressing prisons concocted by Prison Architect players. There's probably a harsher prison lurking on a hard drive somewhere, but Hell's Prison is a good example of how totalitarian Prison Architect lets you be as a warden.
"At any given time about 90-100 prisoners are in the initial stages of starvation and taking damage," reads the description. "The entire prison is one giant infirmary so that doctors automatically tend to them. Prisoners who are close to death are brought to the medical beds by the guards. I have yet to lose a prisoner to starvation."
Prison Architect's Steam Workshop is also full of fantastic creations and recreations, like Alcatraz. Now that's a tough prison.
One of the most famous video game prisons, this is where you start your adventure in Oblivion. You don’t know what your crime was or how you ended up there—you’re supposed to fill in the blanks—but a fateful encounter with the Emperor of Tamriel leads to your escape and transformation into a hero. You can return later and take the opportunity to teach gobshite Valen Dreth some manners.
The last we heard of a movie based on Wolfenstein was five years ago, when the Writers Guild strike gummed up the works. But you can't keep a good bad idea down, because distributor Panorama Media and producer Samuel Hadida have announced that Castle Wolfenstein is back on-track.
Canadian director Roger Avary, who won an Academy Award for co-writing Pulp Fiction, is slated to direct the film. His other work hasn't been as highly regarded, consisting of writing credits for Silent Hill and directing credit for Killing Zoe, which was executive produced by Quentin Tarantino. Avary has been attached to direct the film from the beginning, but by the time the WGA strike ended in February 2008 he was facing a charge for vehicular manslaughter, to which he later pleaded guilty.
The movie will follow two lead characters, a US Army Captain and British Special Agent on a top-secret mission to Castle Wolfenstein. Hitler himself is paying a visit to the titular castle to debut a new weapon to his Nazi buddies. The two strapping lads have to fight Hitler's SS Paranormal Division, and presumably destroy the weapon so that Hitler can't take over "ze vurld."
The announcement claims the film will be an action-adventure reminiscent of Captain America and Inglourious Basterds. It seems a little counter-intuitive to acknowledge the existence of Basterds, though. At that point, why would a Wolfenstein movie need to happen at all?