Wolfenstein 3D

This article was originally published in two parts across PC Gamer issues 309 and 310.

Writers of videogame histories often think in terms of individuals and periods—great innovators and clear-cut ‘epochs’ in design, typically bookended by technological advances. Events or people who contradict those accounts have a tendency to get written out of the tale. According to one popular version of the medium’s evolution, the first-person shooter was formally established in 1992 with id Software’s Wolfenstein 3D, a lean, thuggish exploration of a texture-mapped Nazi citadel, and popularised in 1993 by heavy metal odyssey Doom, which sold a then-ludicrous million copies worldwide at release. The company’s later shooter, Quake, meanwhile, is often held up as the first ‘true’ 3D polygonal shooter.

id Software was founded in 1991 by former employees of software company Softdisk, and its contributions to what we now call the FPS are undoubtedly immense. Between them, Wolfenstein 3D and Doom brought a distinct tempo, savagery and bloodlust to first-person gaming, and programmer John Carmack’s engine technology would power many a landmark FPS in the decade following Doom’s release. But we shouldn’t view that contribution too narrowly, as simply one step along the road to a game such as Call of Duty: WWII. Nor should we neglect the games—before, during and after id’s breakthrough—that took many of the same concepts and techniques in different and equally valuable directions.

The beginning: Maze War, Spasim, WayOut

To think about the shooter’s origins is to think about labyrinths. Among the earliest pioneers of first-person videogaming is 1973’s Maze, a game cobbled together by high school students Greg Thompson, Steve Colley and Howard Palmer during a NASA work-study program, using Imlac PDS-1 and PDS-4 minicomputers. The three had been carrying out research into computational fluid dynamics for future spacecraft designs, an early show of what would become a problematic relationship between the commercial games business and the US military-industrial complex. Initially a single-plane, 16x32 tile wireframe environment for one player in which you’d turn by 90-degree increments, Maze grew to include shooting, support for a second player via serial cable, corner-peeking functionality and indicators showing which way the other player was facing.

After completing his spell at NASA, Thompson took the game with him to the Massachusetts Institute of Technology. With access to a more powerful mainframe, and the aid of David Lebling—who would go on to create the legendary text adventure Zork and found Infocom—he added eight-player support over the US defence department-run ARPANET, a map editor, projectile graphics, scoreboards, a spectator mode and ‘bots with dynamic difficulty’, all features that would resurface in mass-market shooters many years later. Maze War was very popular on campus—it consumed so much computing power that the MIT authorities created a ‘daemon’ program to find and shut down sessions. In one of its later forms, the maze extended along the vertical axis and players could fly, shoot and take cover in any direction.

Maze War

If Maze War sounds like a fully-featured FPS in hindsight, it’s important to note that the category ‘first-person shooter’ is of much more recent inception—according to a 2014 study by the academic Carl Therrien, it only entered popular discussion around videogames in the late ’90s. Many studios, including id, preferred terms and slogans like ‘3-D adventure’, ‘virtual reality’ and ‘the feeling of being there’ when describing games that are played from a first-person viewpoint. Nor was the perspective exclusively, or even predominantly, associated with on-foot gunplay. There were racing games, such as Atari’s 8-bit arcade offering Night Driver, which treated the player to a dashboard view of a road made up of shifting white rectangles. There were cockpit simulators such as 1974’s Spasim (often granted dual honours with Maze War as the first-person shooter’s oldest ancestor), a 32-player space combat game in which unofficial approximations of Star Trek vessels waged war at a mighty one frame per second.

There were dungeon-crawlers such as Richard Garriott’s Akalabeth in 1979, which combined a top-down world map with first-person dungeon segments featuring coloured wireframe graphics. Maze War spawned a number of sequels and imitators, attractively billed as ‘rat’s-eye view’ experiences by a 1981 issue of Computer & Video Games magazine. The first-person shooter genre as we understand it today arose from the artistic friction between these approaches, shaping and being shaped by them in turn.

Naturally, methodologies shifted as new technology became available. Among Maze War’s more intriguing descendants is Paul Allen Edelstein’s WayOut, released for the Atari 8-bit in 1982. It made use of a rendering technique known as ray casting, whereby a 3D environment is generated from a 2D layout by sending out beams from the player avatar’s eyeball and drawing a pixel where they intersect with an object’s coordinates. Where light in reality bounces off many surfaces before entering the eye, ray casting simulates a ray’s collision with an object only once. While incapable of nuanced effects such as refraction, it was also much less resource intensive than other 3D projection techniques, which allowed for faster performance on the hardware of the day. If WayOut was a potent demonstration of ray casting’s utility, it is also worth remembering for its eccentric, non-combat premise. You play a clown trapped in a maze with a spinning, sinister ‘Cleptangle’ that will steal your map and compass on contact. A wind blows through the level, its direction indicated by floating fireflies. This interferes with movement, but also helps you get your bearings should you lose your map.
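The technique is simple enough to sketch in a few lines. Below is a minimal, illustrative ray caster in Python, assuming a tiny invented tile map, player position and screen size (this is not WayOut’s or any period game’s actual code). One ray is marched per screen column, and the height of the wall slice drawn in that column is derived from the corrected hit distance, so nearer walls appear taller.

```python
import math

# Minimal, illustrative ray caster. '#' tiles are walls; the player
# stands at a fractional position inside the grid.
MAP = [
    "#####",
    "#...#",
    "#.#.#",
    "#...#",
    "#####",
]

def cast_column(px, py, angle, step=0.01, max_dist=20.0):
    """March a single ray outward; return the distance to the first wall."""
    dist = 0.0
    while dist < max_dist:
        dist += step
        x = px + math.cos(angle) * dist
        y = py + math.sin(angle) * dist
        if MAP[int(y)][int(x)] == "#":
            return dist
    return max_dist

def render(px, py, facing, fov=math.pi / 3, columns=40, screen_h=20):
    """Return one wall-slice height per screen column (nearer = taller)."""
    heights = []
    for col in range(columns):
        ray_angle = facing - fov / 2 + fov * col / columns
        dist = cast_column(px, py, ray_angle)
        # Project onto the view axis to avoid 'fisheye' distortion.
        dist *= math.cos(ray_angle - facing)
        heights.append(min(screen_h, int(screen_h / max(dist, 0.1))))
    return heights

slices = render(1.5, 1.5, 0.0)  # player in the open area, facing east
```

The key economy is visible here: because the world is really a 2D grid, only one ray per screen column is needed, rather than one per pixel as in full ray tracing.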

Battlezone

Cockpit simulations were especially popular during the ’80s, beginning with Atari and Ed Rotberg’s arcade game Battlezone, a tank sim featuring wireframe vector graphics that came with a novel ‘periscope’ viewfinder (the US Army would later try, and fail, to convert the game into a Bradley tank training simulation). In 1987, Incentive Software released Driller: Space Station Oblivion—the first game to run on its proprietary Freescape engine, which allowed for complex 3D environments dotted with simple geometric objects. The game assigned a sizeable chunk of the display to your offworld rover’s dashboard, a fat slab of buttons and indicators. In part, the prevalence of cockpit games reflected the influence of Star Wars, with its lavishly realised starfighter dashboard displays. But it also arose from attempts to make often-unwieldy simulation technology more convincing by representing players at the helm of a lumbering vehicle. Among id’s subsequent achievements was to narrow the gap between the player’s body and that of the avatar, thus helping to open a space in which ‘first-person’ denotes not merely a perspective but a narrative in which the player is protagonist.

Into the '90s: Wolfenstein 3D, Doom, Duke Nukem 3D

id’s career as a first-person developer began with Hovertank 3D in 1991. A cockpit sim brought to life with ray casting and animated 2D sprites, it sent players searching for civilians to rescue and tentacular UFOs to blow up. It was followed by Catacomb 3-D—id’s first crack at a first-person character-led action game, with a visible avatar hand and portrait. Catacomb also featured texture maps, flat images attached to surfaces to create the illusion of cracked stone walls and dripping moss. In this respect, id had been strongly influenced by Blue Sky Productions’ breathtaking Ultima Underworld: The Stygian Abyss, often cited as the first ‘immersive simulation’, which offered 3D, texture-mapped environments featuring sloped surfaces, rudimentary real-time physics and the ability to look up and down.

Wolfenstein 3D and Doom—both developed after John Carmack glimpsed Ultima in action at a 1990 expo—can be considered combative responses to Ultima’s representation of the possibilities of first-person 3D, eschewing the latter’s more complex geometry and gigantic array of variables in favour of pace and immediacy. Though busier with ornaments than Catacomb 3-D’s levels, Wolfenstein’s environments are designed to run at speed—designer John Romero once planned to let players carry and hide bodies, but dropped the idea to avoid bogging players down. Where Ultima set out to make players feel like part of its world via deep, consistent systems and a wealth of lore, Wolfenstein dealt in simpler, visceral effects—the sag of your avatar’s body when you take a step forward, the gore spraying from the pixelated torso of a slain Nazi. If the game pushed violence and politically charged imagery to the fore—somewhat to the distress of its publisher, Apogee—it also harkened back to the maze games of previous decades, with secret rooms to discover behind sliding partitions.

Wolfenstein

This emphasis on the avatar’s bodily presence would set the tone for many subsequent shooters—notably Call of Duty, with its blood spatter damage filter—as would id’s sense that player participation should take priority over narrative elements. When it came to Doom, there was disagreement between Carmack, Romero and id’s creative director Tom Hall over how much plot and backstory to weave into the game. Hall had planned something akin to Ultima, with large, naturalistic levels built around a hub area and a multitude of arcane props. “Story in a game is like story in a porn movie,” was John Carmack’s infamous rebuttal. “It’s expected to be there, but it’s not important.” Hall eventually resigned in 1993. In his absence, the team stripped out a number of more fanciful weapons, turned many plot items required for progression into generic keycards, and cleaned up certain environments to allow for speedier navigation. 

Loaded with taboo imagery, ultra-moddable thanks to id’s decision to store game data such as level assets separately from engine data as ‘WAD’ files, and equipped with four-player multiplayer to boot, Doom was a phenomenal success. Such was its impact that before ‘FPS’ became an accepted term, many in the development community used ‘Doom clone’ as shorthand for any first-person game involving gunplay. No game can claim to define a genre for long, however, and id’s work would attract plenty of imitators and rivals in the years to come. 
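The WAD (‘Where’s All the Data’) container that made this moddability possible has a famously simple layout: a 12-byte header, the raw lump data, then a directory of 16-byte entries. The Python sketch below builds and reads a toy PWAD in memory; the lump names and payloads are invented for the demo, but the header and directory layout follow the documented format.

```python
import struct

# A WAD is a 12-byte header (an "IWAD"/"PWAD" tag, a lump count and a
# directory offset), followed by raw lump data, then a directory of
# 16-byte entries: lump offset, lump size and an 8-byte padded name.

def build_wad(lumps):
    """Assemble a toy PWAD in memory from (name, bytes) pairs."""
    data, directory, offset = b"", b"", 12
    for name, body in lumps:
        directory += struct.pack("<ii8s", offset, len(body),
                                 name.encode("ascii").ljust(8, b"\0"))
        data += body
        offset += len(body)
    header = struct.pack("<4sii", b"PWAD", len(lumps), offset)
    return header + data + directory

def read_directory(wad):
    """Parse the header, then yield (name, payload) for every lump."""
    tag, numlumps, infotableofs = struct.unpack_from("<4sii", wad, 0)
    assert tag in (b"IWAD", b"PWAD"), "not a WAD file"
    for i in range(numlumps):
        filepos, size, raw = struct.unpack_from("<ii8s", wad,
                                                infotableofs + 16 * i)
        yield raw.rstrip(b"\0").decode("ascii"), wad[filepos:filepos + size]

wad = build_wad([("E1M1", b"mapdata"), ("DEMO1", b"demodata")])
lumps = dict(read_directory(wad))
```

Because a ‘PWAD’ (patch WAD) simply overrides lumps from the main ‘IWAD’ by name, fans could redistribute small files containing only their new levels and sprites, which is much of why the modding scene spread so quickly.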

Four months before Doom’s arrival, a fledgling Chicago studio founded by Alex Seropian and Jason Jones released Pathways Into Darkness, a Wolfenstein homage with a pinch of Ultima-style item puzzling. It thrust players into the boots of a soldier fighting through a pyramid in order to nuke a sleeping god before it can bring about the apocalypse. One of the few Mac exclusives available at the time, Pathways was hailed for its colourful hand-drawn art and menacing atmosphere. It deserves mention today for the ability to commune with the ghosts of other explorers using special crystals and elusive keywords—an engaging, melancholy approach to textual backstory. The developer, Bungie, would build on this concept during work on two of the 21st century’s best-known FPS series, Halo and Destiny.

Marathon

Before Halo and Destiny there was 1994’s Marathon, the first in a series often billed as the Mac’s answer to Doom. A suspenseful sci-fi offering set aboard a hijacked colony ship, it was a more complex game than id’s offering—adding free look with the mouse and a range of terrain dynamics, such as low gravity and airless chambers. It was also a more convoluted work of fiction, which relied on players scouring its open-ended levels for narrative artefacts. In place of the souls of the slain, Marathon offered computer terminals through which you converse with various sentient AIs about the wider universe.

Its sheer brilliance aside, Doom’s pre-eminence during the ’90s owes much to id’s embrace of the modding community, with players able to create their own maps using the developer’s own editing tools.

The game’s reach was limited by its choice of platform, but it attracted a dedicated community thanks to its elusive narrative backdrop and infectious eight-player, ten-map multiplayer. 1995’s Marathon 2: Durandal added co-operative play while 1996’s Marathon Infinity introduced a ‘Forge’ level editor, two features that would become central to the studio’s projects. Just as significant, however, was Bungie’s work in the emerging real-time tactics genre. Conceived by Jason Jones in a bid to stand apart from id Software, the top-down Myth games equipped Bungie with a feel for how different unit types and variables might react together. This would yield fruit in the shape of Halo’s famous combat sandboxes. 

Its sheer brilliance aside, Doom’s pre-eminence during the ’90s owes much to id’s embrace of the modding community, with players able to create their own maps using the developer’s own editing tools (and thus, squeeze many hours of enjoyment out of the free shareware version). Fan concoctions ranged from Batman and Alien-themed conversions to trashy oddities like The Sky May Be, in which zombiemen moonwalk and the legendary BFG-9000 has a chance of conferring immortality on its target. Many up-and-coming designers cut their teeth on Doom mods, and other studios were eager to license it for commercial use. Among them was Raven Software, founded by Steve and Brian Raffel, which created the fantasy-themed shooters Shadowcaster, Hexen and Heretic using their own bespoke versions of John Carmack’s engine technology. The two companies were at one point based just down the road from each other, and formed an enduring bond—id would eventually hand Raven the keys to the Doom and Quake franchises. 

Raven’s games were eclipsed, however, by the noxious excess of 3D Realms’ Duke Nukem 3D, a celebration of B-movie tropes that occasionally resembles a postmodern satire, and occasionally the aimless, chauvinist doodlings of a 13-year-old boy. Duke Nukem 3D is an intensely antisocial game, its levels grimy parodies of real-world locales, such as movie theatres and stripclubs, guarded by porcine coppers and strewn with the corpses of cinema idols like Indiana Jones and Luke Skywalker. While technically accomplished and formally inventive—it introduced jet packs, shrink rays, animated props such as arcade cabinets, physically impossible layouts and a protagonist who provides audible commentary throughout—the game is remembered today mostly for its jiggling softcore imagery. In years to come, shooter developers would spend as much time dispelling the notoriety Duke Nukem generated as they would profiting from his example.

Duke Nukem

Doom’s success also won the regard of franchise owners in other media. Maryland-based Bethesda—flush from the success of its eye-catchingly vast roleplaying effort, The Elder Scrolls: Arena—released a Terminator adaptation in 1995, endowed with lavish polygonal models. In hindsight, the game’s vast, cluttered wasteland feels almost like groundwork for the studio’s later first-person Fallout titles. In the same year, the venerable adventure game studio LucasArts shipped Dark Forces, the first Star Wars-themed FPS, inspired (and perhaps annoyed) by the appearance of Death Star mods for Doom. LucasArts had designed a number of historical cockpit-based simulations during the late ’80s and early ’90s, but Dark Forces was a straight riff on id Software’s work. The developer’s impressive Jedi engine allowed for vertical looking, environments busy with ambient details such as ships landing on flight decks, a range of effects such as atmospheric haze, and the ability to stack chambers on top of one another.

Going 3D: Metal Head, Descent, Quake

By the mid-’90s, developers had begun to shift from so-called ‘pseudo-3D’ techniques such as ray casting to fully-polygonal worlds, capitalising on the spread of 3D hardware acceleration and the arrival of the first mass-market graphics processing units. Released for the Mega Drive’s 32X add-on in 1995, Sega’s lumbering Metal Head is often touted as the first ‘true’ 3D shooter. Pitching large, plausibly animated mechs against one another in texture-mapped urban environments, it was a handsome creation let down by repetitive missions. There was also Parallax Software’s Descent, released in the same year—an unlikely but gripping hybrid of flight sim and dungeon crawler with 360-degree movement. But the game now regarded as a byword for polygonal 3D blasting wasn’t, to begin with, a shooter at all.

John Romero had intended Quake to be a hybrid of Sega AM2’s arcade title Virtua Fighter and a Western roleplaying fantasy. Conceived back in 1991 and named for a Dungeons & Dragons character, the game would have alternated between first-person exploration and third-person, side-on brawling. Romero envisioned circling dragons, a hammer massive enough to send shockwaves through the earth, and events that trigger when players look in their direction, such as glowing eyes appearing in a cave mouth. By the time John Carmack neared completion of an ambitious 3D engine in 1995, however, other id Software employees were exhausted and reluctant to depart too drastically from the Doom formula. There was also tension between the two founders over Romero’s supposedly inconsistent work ethic and Carmack’s view that the studio’s engine technology took precedence over its games. Romero ultimately resigned himself to a reimagining of Doom in polygonal 3D—and resigned from id Software itself after finishing the game.

Quake

As Big Robot’s Jim Rossignol noted in a 2011 retrospective, something of this failure lingers in Quake as it stands. Though cut from the same cloth as Doom—it offered fast, brutal gunplay, levels made up of corridors and arenas, and a multitude of secret areas—the game’s aesthetic and fiction are curiously divided, at once crustily medieval and high tech. You can expect banks of computer monitors and teleporters, but also broadswords and monsters ripped from the pages of Lovecraft. In hindsight, it plays like a representation of the tipping point from avant-garde into profitable convention, the point at which the chimerical possibilities of 3D action solidified into the features expected of a modern first-person shooter.

In at least one respect, though, Quake was transformative—it introduced a thrilling element of verticality, with players dashing through the air above opponents rather than simply strafing or corner-camping. This quality proved an asset in the emerging field of online multiplayer: by the late ’90s, Ethernet connections and modems had become ubiquitous and internet usage was rocketing. Quake’s multiplayer was initially designed for high bandwidth, low latency local area networks—it would check with a server before showing players the result of an action, which led to jerky performance online when there was a build-up of server requests. id swiftly released an update, titled QuakeWorld, which added client-side prediction. The result can be held up as the original esports shooter—software company Intergraph sponsored a US-wide tournament, Red Annihilation, in May 1997, which attracted around 2,000 participants. 
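Client-side prediction of the QuakeWorld sort can be sketched abstractly: the client simulates its own movement commands immediately instead of waiting a round trip, keeps a buffer of commands the server hasn’t yet acknowledged, and when an authoritative snapshot arrives it rewinds to the server’s state and replays the outstanding commands. The Python below is a deliberately simplified, one-dimensional illustration of that idea, not id’s actual netcode.

```python
# Client-side prediction sketch (illustrative only).

def simulate(pos, cmd):
    """Shared movement rules -- must be identical on client and server."""
    return pos + cmd  # 1-D position; cmd is a velocity step

class PredictingClient:
    def __init__(self):
        self.pos = 0.0
        self.pending = []   # (sequence, cmd) pairs not yet acknowledged
        self.sequence = 0

    def send_command(self, cmd):
        self.sequence += 1
        self.pending.append((self.sequence, cmd))
        self.pos = simulate(self.pos, cmd)  # predict instantly...
        return self.sequence, cmd           # ...and send to the server

    def on_snapshot(self, acked_sequence, server_pos):
        # Drop acknowledged commands, then replay the remainder on top
        # of the server's authoritative position ('reconciliation').
        self.pending = [(s, c) for s, c in self.pending
                        if s > acked_sequence]
        pos = server_pos
        for _, cmd in self.pending:
            pos = simulate(pos, cmd)
        self.pos = pos

client = PredictingClient()
for step in [1.0, 1.0, 1.0]:
    client.send_command(step)
# The server has only processed the first command so far:
client.on_snapshot(acked_sequence=1, server_pos=1.0)
```

As long as client and server agree on the movement rules, the replay lands the client exactly where the server will eventually say it is, so the player feels zero input latency; disagreements surface only as occasional corrective snaps.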

As with Doom, Quake’s modding tools made it an attractive platform for amateur developers—its community gave the world Team Fortress, which would later flower into a standalone shooter, along with early specimens of machinima, including an epic known as The Seal of Nehahra. Its greatest descendant, however, would prove to be a shooter from a developer founded by Microsoft alumni Gabe Newell and Mike Harrington.

Created using a modified version of the Quake engine, Valve Software’s 1998 epic Half-Life remains extraordinary for how it reconciles the abstractions of game design with narrative tactics redolent of a novel (the game’s tale of secret government research and alien invasion was, in fact, written by a novelist, Marc Laidlaw). Its achievement versus earlier shooters can be summed up as the creation of temporal unity: almost everything is experienced in real time from the lead character’s perspective, with no arbitrary level breaks. In place of cutscenes, Valve weaves its tale through in-game dialogue and scripted events such as enemies smashing through doors—a tactic that both gives the player some control over the tempo and avoids jerking you out of the world. The game also sells the impression of a larger, unseen universe not via gobbets of textual backstory, but through the detail, responsiveness and consistency of its environment. The intro sees Gordon Freeman riding a monorail through Black Mesa, gleaning information about the location and your character from PA announcements and the sight of other employees at work. Following a disastrous experiment, you’re asked to backtrack through the same areas, now fallen into chaos.

Half-Life created a blueprint many FPS campaign developers would adopt in the new millennium. In particular, its seamless, naturalistic design would guide studios looking to explore realistic settings, such as the ‘World War’ periods. But it also introduced a note of unreality in the shape of Gordon Freeman’s murky reflection, the besuited G-Man—a personification of the game designer who sits a little outside Half-Life’s fiction. Together with the all-seeing, omnipresent AI manipulators of Marathon and the acclaimed cyberpunk RPG System Shock, the G-Man betrays a genre becoming increasingly aware of itself, and eager to turn its own structural constraints into a source of drama.

A new millennium: Unreal, Counter-Strike, Call of Duty

One of the greatest influences on first-person shooters at the turn of the millennium wasn’t a game, but a film: Steven Spielberg’s World War 2 epic, Saving Private Ryan. The movie’s thunderous portrayal of the D-Day landings would find echoes for years afterwards in videogames like Killzone and Titanfall. Spielberg himself also has a robust association with game development: he co-founded DreamWorks Interactive with Microsoft in 1995 to work on adaptations of movies like Small Soldiers. Seeking a way to teach younger people about the war after wrapping up production on Saving Private Ryan, Spielberg asked DWI to develop a shooter, Medal of Honor, for Sony’s trendy new PlayStation platform.

Launched in 1999 to strong sales, the game was a watershed moment in several respects. On the one hand, its more earnest, grounded approach opened the genre up to players put off by the lurid sci-fi or pulp comic settings of games like Doom and Wolfenstein. On the other, it provoked tense discussions about the right of videogame developers to depict such events, and the possibility that violent games spark violent behaviour. Medal of Honor was released a few months after the Columbine massacre in Colorado, an atrocity that gave rise to a moral panic over videogame violence. Fearful of a backlash, DreamWorks Interactive removed all blood from the game before launch. It also attracted a heated reaction from the US Congressional Medal of Honor Society, and its president voiced his concerns to Spielberg in person. The game’s release, in spite of all this, created a precedent for other studios to comment openly on history and society.

The close of the ’90s also saw the release of the gorgeous Unreal, sparking a decade-long rivalry between creator Epic MegaGames and id Software. Conceived as a sort of ‘magic carpet’ experience where you fly through caverns dotted with robots, the game evolved into a bona fide Quake killer, running on a proprietary technology capable of 16-bit colour and ambient effects, such as volumetric fog. Like Quake, the game was designed to be modded easily and extensively. Also like Quake, its multiplayer left something to be desired at launch. Epic released a deathmatch-oriented standalone expansion, Unreal Tournament, in 1999, narrowly ahead of the arrival of id’s Quake III Arena. A brace of colourful alternate fire options aside, it was notable for including both more competitive ‘hardcore’ and relatively playful ‘theme’ maps, such as levels floating in Earth’s orbit. The franchise found a dedicated following online, but the bedrock of Epic’s business would prove to be founder Tim Sweeney’s Unreal Engine, a highly modular entity designed for continual improvement. It would power games as diverse as Ion Storm’s legendary immersive sim Deus Ex and EA’s adaptations of the Harry Potter movies.

Where Quake and Unreal Tournament dealt in cartoon bazookas and evaporating torsos, another 1999 release, Counter-Strike, set its sights on military realism. A Half-Life mod created by attic developers Minh Le and Jess Cliffe, it saw teams of terrorists and counter-terrorists struggling to arm or defuse bombs and rescue or maintain custody of VIPs, customising their loadouts with currency earned at the end of each round. The mod wasn’t a landmark success to begin with, but Valve’s designers knew a killer formula when they smelled it and scooped up Le and Cliffe along with the intellectual property rights in 2000. Counter-Strike became an enduring phenomenon, buoyed up by thousands of user-created maps (including David Johnston’s legendary Middle Eastern levels Dust and Dust 2) and a community as resistant to fundamental rule changes as any diehard fan of football. Perhaps the definitive esport shooter, its objective-based modes and tactics-driven design are integral to the DNA of competitive multiplayer today. 

2000 was also the year that Microsoft acquired Bungie, thereby depriving Apple’s Mac of one of its more coveted games, a science fiction odyssey called Halo. The game had begun life as an open world exploration affair, running on Bungie’s Myth engine, and something of that luxuriant scale remains in the completed Halo: Combat Evolved, which was an enormous hit when it launched on Microsoft’s first Xbox console in 2001. Halo’s environments were bright, rangy and colourful, where other shooters were claustrophobic and dingy, and they were lent an intense overarching unity by the silhouette of the Halo ringworld itself, stretching up through each skybox. Its crowded encounters were far more open-ended than in most competitors, woven around delightful AI variables like Grunt footsoldiers kamikaze-rushing the player after you kill their leader. Its weapons retained something of Quake and Unreal’s excess—overcharging an energy pistol to strip an opponent’s shield in one go would become a standard multiplayer tactic—but its blend of finite player health and recharging overshields imposed a more studied, back-and-forth rhythm on firefights. Halo also showed off Bungie’s knack for world-building: the fascination of its wider universe would help cement its status as Microsoft’s flagship series. 

Halo would be eclipsed, however, by another World War 2 shooter, created using id Software’s Quake III engine by Infinity Ward—a studio founded by veterans of Medal of Honor: Allied Assault with startup money from Activision. Released in 2003, Call of Duty was among the first shooters to let players aim down a weapon’s sights—a gambit that created a sense of fearful claustrophobia, narrowing your attention to the gun roaring in your hands, even as the game’s sprawling levels and battalions of AI troopers courted comparison with Allied Assault. It was a little overshadowed by Medal of Honor on PC, but Call of Duty’s popularity caught the eye of Microsoft, who asked Activision to develop an Xbox 360 port of the sequel. With Halo 3 still a couple of years away, Call of Duty 2 was a bestseller at the console’s 2005 launch. Mindful of the risks of hanging an entire series on a single developer, Activision brought on Spider-Man studio Treyarch to design Call of Duty 3 using the second game’s engine, giving Infinity Ward an extra year at the coalface. It was the beginning of a yearly alternation that, together with the franchise’s all-year-round multiplayer appeal, would allow Call of Duty to bury competitors and exert an out-sized influence on the genre at large.

Battlefield 1942

Among Infinity Ward’s more ferocious competitors was a multiplayer-centric WW2 game created by Swedish developer DICE. Battlefield 1942 saw up to 64 players tussling for capture points on enormous, open maps. Where Call of Duty’s own multiplayer came to prioritise pace and lone wolf virtuosity, Battlefield emphasised squad composition, the canny use of strategic resources such as vehicles, and above all, depth of simulation. The developer’s Refractor engine allowed for such crude feats of real-time physics as using TNT to launch a jeep across a bay onto an aircraft carrier’s deck. Though never quite a trendsetter in the increasingly lucrative console market, in large part due to its anaemic campaign options, Battlefield’s scale and freedom were a tonic for armchair generals weary of vanilla deathmatch.

Crytek’s Far Cry had a similar appeal. It began life as a glorified tech demo, the catchily titled X-Isle: Dinosaur Island, but flowered with Ubisoft’s backing into the first open world FPS in the current sense of the term. Where other shooters taught players to keep pushing forward, Far Cry allowed you to run amok in a vast tropical environment, using the undergrowth for cover while tracking unsuspecting soldiers through your binoculars. The series would go on to enjoy a symbiotic relationship with Ubisoft’s third-person Assassin’s Creed games, each experimenting with new ways to structure and diversify an open world.

Half-Life 2, CoD4: Modern Warfare, Bioshock, Crysis

If Far Cry was one of 2004’s highlights, it and every other game that year was utterly dwarfed by Valve’s Half-Life 2. While not as transformative in terms of storytelling craft as its predecessor, the new game’s post-alien invasion dystopia was a work of unprecedented delicacy. Where older shooters looked to B-movies for inspiration, Half-Life 2’s incompletely terraformed city recalls mid-20th century Communist eastern Europe (the game’s art director, Viktor Antonov, hails from Bulgaria)—at once grand and ground down, alternating steely megaliths with trash-strewn riverbeds and grubby prisons. Its principal opponents aren’t bug-eyed monsters but masked enforcers wielding batons and carbines, their presence given away by indecipherable radio chatter. It’s also, for all its linearity, a celebration of player agency, handing you a Gravity Gun that allows you to pluck and hurl sawblades at enemies, solve slightly goofy seesaw puzzles and pile up objects at whim. The game was widely imitated, within the first-person shooter genre and without, but arguably its greatest legacy is Steam, Valve’s now-globe-straddling desktop games store. It’s hard to imagine players embracing the clunky 2004 version of Steam quite so readily, were it not required to play Half-Life 2.

Call of Duty

If Valve’s offering set the standard for FPS design (in terms of its campaign, at least) it was Call of Duty that swallowed up most of the limelight during the ’00s, the critical year being 2007. Weary of World War 2 and conscious of the need to differentiate its offering from Treyarch’s, Infinity Ward decided to transport the series to the present day. The result, Call of Duty 4: Modern Warfare, unlocked a brand-new vocabulary for the first-person shooter. It traded the mud and everyman heroics of WW2 experiences for a slick, cheerfully amoral celebration of western military hardware and urban combat tactics—arming the player with laser sights, ghillie suits, Stinger launchers and drones. It also courted topicality where games like Medal of Honor had tried to distance themselves from the headlines—one level sees you living out the final moments of a country’s deposed president, while another puts you at the controls of an AC-130 gunship, in scenes familiar from news footage of the Iraq War. But what it is mostly remembered for today is the multiplayer. Infinity Ward’s decision to introduce a levelling and unlocks system derived from roleplaying games is the most influential sea change in shooter design during the past decade. Its notion of an online career, whereby players kept plugging away for small rewards rather than just enjoyment, also helped popularise the emerging concept of the game as ‘service’.

BioShock’s combat, which married chunky period firearms with pseudo-magical powers or ‘Plasmids’, would prove its weakest element. More intriguing was the universe of cruelty and hubris it sketched.

Call of Duty 4 wasn’t the only game to do a little genre-splicing in 2007. Irrational’s BioShock began life as a spiritual follow-up to the System Shock series—its creative director, the soon-to-be-famous Ken Levine, was a designer on System Shock 2—but over time it became more of a shooter than an immersive simulation or RPG. It casts the player as an airplane crash survivor exploring a disintegrating undersea ‘utopia’ created by a renegade industrialist, in a thinly disguised meditation on the philosophy of Ayn Rand. The game’s combat, which married chunky period firearms with pseudo-magical powers or ‘Plasmids’, would prove its weakest element. More intriguing was the universe of cruelty and hubris it sketched, a labyrinth of leaking glass tunnels and domed Art Deco plazas.

Building on Half-Life 2’s example, Irrational left much of Rapture’s backstory for players to discover in the form of audio diaries, graffiti and random bric-a-brac. Its environmental storytelling would attract legions of imitators across several genres, from Raven Software’s unfairly overlooked 2010 shooter Singularity through body-horror masterpiece Dead Space to so-called ‘walking simulators’ like Gone Home. It also formed part of an ongoing conversation about games as a means of rousing empathy or exploring moral quandaries. BioShock’s signature characters are the Little Sisters, mutated little girls who collect genetic material from corpses under the eye of their powerful guardians, the Big Daddies. Having disposed of the latter, you can either spare Little Sisters or kill them to harvest their ‘ADAM’, a resource you can use to upgrade your own powers. 

The late ’00s saw the rise of the open world shooter, with Crytek’s fearsome Crysis swaddling the player in power armour in order to battle aliens on yet another overgrown island wilderness. The game was sold as an exercise in technological masochism, its detail, lighting and plethora of effects ‘melting’ all but the most expensive PC hardware. But its real trump card was the ability to enhance your Nanosuit’s agility, strength or endurance on the fly by drawing power from a finite reservoir, making it an engaging risk-reward system. It was soon eclipsed, however, by the Far Cry series, which Crytek had by now sold to Ubisoft. That’s both in spite of and thanks to Far Cry 2, an astonishing, bruising shooter stretched across 50 kilometres of African brush. Drawing on his experiences with the Splinter Cell games, designer Clint Hocking set out to create a brutal, Heart of Darkness-esque sandbox in which players fought malaria, self-propagating fire and bullets simultaneously. The results were arresting, but also frustrating, thanks to a patchy narrative, alternately dim or eagle-eyed AI and an unfair enemy respawning system.

Far Cry 2

2012’s widely acclaimed Far Cry 3 removed much of the frustration, and a little of the sophistication. It opened out the terrain, fine-tuned the AI to be more predictable, and put capturing enemy outposts—each a potted stealth-combat puzzle, inspired by the Borgia towers of Assassin’s Creed: Brotherhood—at the heart of exploring the map. It also created a combo system, with players chaining melee executions into ranged takedowns, reflecting a growing interest across the industry in fluid first-person animations, epitomised by DICE’s 2008 parkour game Mirror’s Edge. Less positively, it traded the second game’s understated, callous portrayal of a perpetual civil war for a farcical story about whiny, kidnapped backpackers wrestling with the definition of insanity.

Players unconvinced by Far Cry or Crysis had a number of rival open world shooters to choose from. One of them was the Stalker series, inaugurated by Ukrainian developer GSC Game World in 2007, in which scavengers pick their way through radioactive ruins while keeping a lookout for monstrous creatures and invisible, fatal anomalies. Stalker’s supporting systems were remarkable—at one point, the AI was allegedly capable of completing the game by itself—but its punishing survival simulation ethic limited its audience. Gearbox’s roleplaying shooter Borderlands took a friendlier, trashier tack. Released in 2009, it saw you touring an anarchic, comic book-style planet as one of four classes, hoovering up procedurally generated (often borderline-unusable) weapons. Part of Borderlands’s success, the novelty of its arsenal aside, was its humour—a rare quality in an often po-faced genre.

The turn of the decade saw a number of long-running FPS series beginning to lose momentum. Most obviously, the Medal of Honor series underwent an abortive attempt at reinvention in 2010, with publisher EA looking to fill gaps in the schedule between Battlefield instalments. In jumping forward from WW2 to present-day Afghanistan, the once-proud series merely left itself open to unflattering comparisons with 2009’s Call of Duty: Modern Warfare 2. id Software’s properties were also at a low ebb. Though an accomplished horror experience, 2004’s Doom 3 lost out to Half-Life 2, while Quake had all but evaporated following Quake 4’s muted reception in 2005. Raven Software’s 2009 Wolfenstein reboot doubled down on the paranormal aspects of the series’ backstory, to mixed effect. Following a similarly lukewarm response to Singularity, parent company Activision retasked the studio to help out with the Call of Duty series. RAGE—id’s only new IP during these years save mobile game Orcs & Elves—proved a visual extravaganza and a gratifyingly hefty, Mad Max-ish shooter, but all too often felt like it was playing second fiddle to its own graphics technology. id’s old foe Epic, meanwhile, was increasingly dedicated to the third-person Gears of War series and its flourishing Unreal Engine business.

Battlefield 3

Call of Duty continued to reign supreme, though it attracted increasingly stiff competition from EA’s Battlefield—a franchise increasingly (and a little unfairly) pitched as a freeform ‘thinking man’s shooter’, more respectful of player agency than the linear, attrition-driven Call of Duty. After experimenting with a lighter, buddy-comedy vibe in the Bad Company spin-offs, DICE amped up the grandeur with Battlefield 3, a multiple-perspective tale of abducted nuclear weapons set partly in Iran (the bestselling instalment until DICE’s journey into WW1 with Battlefield 1). The series had become famous for its Frostbite engine technology, which amongst other things allowed for real-time terrain destruction in multiplayer: participants could do everything from blasting out spyholes in walls to levelling buildings.

The modern era: Titanfall, Destiny, Overwatch

Call of Duty’s greatest existential threats, however, were a mixture of internal discord and external market pressures. In March 2010, Activision—now by far the industry’s largest publisher, following a mega-merger with Vivendi and its subsidiary Blizzard—fired Infinity Ward cofounders Jason West and Vince Zampella over alleged insubordination. A few weeks later, West and Zampella announced the foundation of new studio Respawn Entertainment. A wave of lawsuits and countersuits followed, alongside a mass exodus of staff from Infinity Ward to Respawn. Activision was forced to call upon the recently founded Sledgehammer Games to help the depleted Infinity Ward finish Modern Warfare 3. 

While the series weathered this crisis—thanks largely to Treyarch’s pop-savvy, hallucinogen-crazed Black Ops subfranchise—Activision and other publishers also had to manage a problem of budget versus expectation. Scripted corridor campaigns in the Half-Life vein were proving increasingly expensive, thanks largely to the cost of HD art assets, and telemetry showed that players spent the bulk of their time in multiplayer. However, attempts to remove singleplayer from the package led to an outcry. Among the teams that struggled with this problem was Respawn, whose EA-published debut Titanfall pioneered the concept of campaign multiplayer, with narrative elements, such as picture-in-picture cinematics, dropped into rounds of team deathmatches. The game was enthusiastically received—a mixture of towering mech combat and nimble parkour duelling, it restored something of Quake and Unreal Tournament’s agility to a genre that had become bogged down in cover combat. Its audience tailed off swiftly, however—many first-person shooter enthusiasts found the mechs-and-pilots premise to be more of a novelty than a game-changing fixture, though the larger problem was perhaps that, on consoles, Titanfall was exclusive to the Xbox brand.

Destiny

Other shooter developers ‘rediscovered’ mobility during this decade—Call of Duty: Advanced Warfare and Black Ops III dabbled at length with powered exosuits, while Halo 5: Guardians added boost slides, double-jumps and ground-pounds to Master Chief’s moveset. But the game that brought it all together was 2014’s Destiny, the work of erstwhile Halo developer Bungie, now free from producing games solely for Microsoft. It’s a mixture of MMO-style looting and Titanfall-esque acrobatics, all bundled up in an aesthetic that is reminiscent of the ’70s space race and classic sci-fi book cover illustrations. Destiny is in some ways quite a soulless game: it’s as grindy as Borderlands and far less self-deprecating, but its ruined, yet sumptuous, solar system environments have an irresistible mystique. It also feels tremendous in the hands, with some beautifully judged weapon designs and class abilities. 

With last year’s Call of Duty: Infinite Warfare tracking far behind Black Ops III, Destiny has become one of Activision’s two flagship shooters. The other is Blizzard’s joyful arena shooter Overwatch, released in 2016. Overwatch is a lovely game to end on because it is essentially an interactive genre history, a celebration of its triumphs, foibles and even failures. It doesn’t merely reach out to weapons, gadgets and abilities from other shooters, but also their quirks, exploits and the antics of their communities—Quake’s rocket jumping, aimbots from Counter-Strike and internet edgelords in general. Its heroes are love letters to 30-odd years of genre history. Pro-gaming celeb turned mech pilot D.Va is both a potted Titanfall and a parody of the noxious ‘gamer girl’ stereotype, for instance. Soldier 76, meanwhile, is Call of Duty man. Even as it pays tribute, however, Overwatch also points to the future—be it in the effortless way it folds in concepts from fighting games and MOBAs, or in how it extends the FPS cast-list well beyond the muscular, dudebro protagonists beloved of so many rivals. It speaks to the enormous range of concepts that make up the modern FPS, for all its myriad hang-ups—a genre that has always been about so much more than firing a gun.

Quake

It looks like Quake has just got its own version of Brutal Doom, the ultra-violent (and ultra-popular) fan-made mod for id Software's 1993 FPS behemoth Doom. Qore adds heaps of guts and gore to Quake, and it looks bloody brilliant.

You can dismember enemies, decapitate them, and generally smash their bodies into hundreds of red, squishy pieces. If you use the lightning gun you can actually electrify bits of innards, and if an enemy dies from an explosion there's a chance that their severed limbs will flame as they fly through the air. Flesh wounds will show up more than ever, too: basically, your screen will be plastered with red.

Qore adds a scary chainsaw that pins enemies in place if they have low health, cutting through them until they go splat. You can saw dead enemies to dig out extra bits of health and ammo as well. Nice.

You can grab Qore here, where you can also watch the mod in action. Creator DaisyFlower says they will continue to work on the mod, adding new enemy attacks among other things.

Technically, Qore is a mod for DarkPlaces, another fan project that improves the original Quake in pretty much every area. You'll need to download DarkPlaces here to run Qore.

Hat tip, DSOGaming.

Wolfenstein 3D

Last week, PlayerUnknown’s Battlegrounds smashed the Steam peak player records. The previous record-holder, Dota 2, while admittedly made by one of the world’s biggest and most powerful games companies, began as a Warcraft mod. These days, we barely blink an eye at the idea that a game can come from nowhere and—through word-of-mouth, clever concepts, a bit of cool technology like Portal’s… well, portals… or simply by hooking into some reservoir of good feeling—accomplish more than any marketing budget can dream of. Minecraft is this generation’s Lego. Undertale is one of its most beloved RPGs.

Indeed, the world of indie development is now so important that it’s hard to remember that it’s only really a decade or so old. That’s not to say that there weren’t indie games before then, as we’ll see, but it was only really with the launch of Steam on PC and services like Xbox Live Arcade that the systems were in place both to get games in front of a mainstream audience, and to provide the ecosystem players needed to quickly and confidently pay for new games.

In 1979 Richard Garriott set out on his path to buying a castle and going into space by selling copies of his first RPG, Akalabeth, in ziploc bags at his local computer store

The massive success of indie games on Steam has of course come with attendant pitfalls. The early access program gave small studios the ability to beta test their games with player numbers they could not otherwise reach, and gave players the ability to take part in shaping games. However, a lack of guidelines left players and developers with very different expectations, as seen in the reaction to a paid expansion being released for Ark: Survival Evolved while the game was still in early access. Steam Greenlight made it easier for indie games to get on Steam but became a popularity contest that was easily gamed, leading Valve to replace it with Steam Direct.

All this is largely taken for granted these days, with the big challenge for modern indie games being to stand out. Simply getting onto Steam back then could set a studio up for life. These days the market is full to bursting, with most new releases disappearing from sight almost at once.

In both cases though, it’s a world away from how the market began.

Back to the start

The exact definition of ‘indie’ has never been cut-and-dried. To some, it’s an aesthetic, best summed up by the classic bedroom coder. To others, it’s a more commercial distinction: working without a publisher. To others still, it’s ultimately about the work, with an indie game standing out for not being the kind of thing you get from a commercial company, rather than for who made it.

There are many definitions to play with, and few hard lines to draw. The poster-children of ’90s shareware, id Software (who you may know courtesy of a little game called Doom), began working under contract for a company called Softdisk, cranking out games like Dangerous Dave in the Haunted Mansion, Hovertank 3D, and Catacomb 3D, before moving on to make games with/for shareware giant Apogee.

In the very early days of gaming, just about everybody was indie to some extent. In 1979 Richard Garriott set out on his path to buying a castle and going into space by selling copies of his first RPG, Akalabeth, in ziploc bags at his local computer store (one of those copies then ended up in the hands of California Pacific, who offered Garriott a publishing deal). Sierra On-Line began in 1980 as just husband and wife team Ken and Roberta Williams, making simple adventure games like Mystery House that nevertheless pushed the boundaries of what people expected from games at the time—like having graphics—before booming to become one of the biggest and most important companies in gaming history.

What do you do if you don’t have the money for big boxes? Ziploc bags are your friends.

Companies could emerge from almost anything. Gremlin Interactive began as a computer store called Just Micro, while DMA Design (originally Acme Software)—which would make its name with Lemmings and much later become Grand Theft Auto creator Rockstar—began with its founders meeting up at a computer club in Dundee and ultimately signing with Psygnosis. Whole genres were created from a single game, such as Football Manager in 1982.

The speed of all this took many by surprise, with Balance of Power creator Chris Crawford saying in 1984, "We have pretty much passed the period where hobbyists could put together a game that would have commercial prospect. It’s much more difficult to break in, much less stay in. If you want to do a game, do it for fun, but don’t try to do game designs to make any money. The odds are so much against the individual that I would hate to wish that heartbreak on anyone."

The shareware revolution

But of course, people continued. The PC was largely left out of much of it, however, due to the relatively high cost of disks and the general perception that it wasn’t a gaming machine. In the UK, the main indie scene in the ’80s was on cassette-based 8-bit systems like the ZX Spectrum, with publishers happily accepting almost any old tat, recording it to a tape, sticking it in a box, and selling it for a few pounds at newsagents, game stores, and anywhere else that would take copies. They were cheap, sometimes cheerful, and allowed for endearing weirdness like 1985’s Don’t Buy This—a compilation of the five worst games sent to publisher Firebird.

It would be many years before most indie PC games could get that kind of placement. Instead, there was shareware. The concept dates back to the 1970s, though it was popularised by PC-Write creator Bob Wallace in 1982. Rather than having a central distributor like a regular published game, users were encouraged to copy software and pass it along. If they liked it, they’d then send the creator a check to unlock the full thing or get more of it.

In the case of Apogee Software, and indeed what became known as the Apogee model, a game might have three parts. The first one would be free, and free to share, the other two commercial and only for registered purchasers to enjoy. (Not that anyone really listened, as the vast, vast numbers of pirated copies of Doom probably shows better than anything.)

The beauty of the system was that anyone could distribute these games, with the rule being that while you weren’t allowed to sell the shareware version, you could charge for materials. That meant games could appear on magazine cover disks and later CDs. They could be on any university server or dial-up BBS or services like Compuserve and AOL. If you wanted a relatively full choice however, you often needed to send off for them. Whole companies were set up to sell just the trial versions, sending out printed catalogues of their stock and charging by the disk. 

By the mid-90s, of course, the popularity of CDs had rendered this relatively pointless, with ‘1000 Games!’ CDs available in supermarkets and bookstores and anywhere else there might be an audience, rarely mentioning the part about them being glorified demos. Much like on Steam today, at this point most smaller games got lost. Still, as a player, it was an almost inexhaustible feast.

Not every game could be Wolfenstein 3D and promise a fight with Robot Hitler if you paid

As crazy as sending off a check to get a game might seem, it worked. In a few cases, registered shareware games even made the jump to boxed products in stores, though that was relatively rare. Either way, shareware was hardly a license to print money for most, but it supported many a developer throughout the '90s and made others their fortunes. Epic MegaGames began with the text-mode adventure ZZT before becoming the company that made Unreal. Duke Nukem began as a very simple 2D side-scroller, notable mostly for oddities like the main character wearing pink and just wanting to save the world so that he could get back to watching Oprah, but nevertheless blossomed into Duke Nukem 3D before publicly wilting into Duke Nukem Forever.

And there were many more stars too—recurring characters and games that simply kept showing up—like Skunny the squirrel and his awful platforming (and ultimately karting) adventures, Last Half of Darkness, and Hugo’s House of Horrors, much beloved by magazine and compilation editors for its extremely pretty first screen, never mind that it was all made of clip art and every other room in the game was barely MS Paint-level scribbles.

The alternative industry

Shareware's big draw for players was, inevitably, free games. The downside of the Apogee model and others that erred on the generous side was that a whole episode was often enough—especially as that’s where the developer’s best work tended to be. Compare, for instance, the deservedly beloved shareware episode of Commander Keen: Goodbye, Galaxy!, where you run around a beautiful, varied planet, with the dull space adventure of its commercial sequel. Not every game could be Wolfenstein 3D and promise a fight with Robot Hitler if you paid.

Less cynically though, shareware gave many genres their home. The PC was typically seen as a business machine, with its commercial successes often adventures, RPGs and other slower and more cerebral offerings. There were platformers and beat-em-ups and similar, but they were usually poor conversions from other platforms at best, with few worth taking a risk on. 

If the PC ever had a mascot platformer, it was Commander Keen. The shareware version of Goodbye, Galaxy! was his finest hour.

Shareware removed that risk factor for customers, while letting developers show off. The original Commander Keen, while simplistic to modern eyes, was proof that the PC could do console-style scrolling, even if it wouldn’t be until 1994’s Jazz Jackrabbit that anyone could seriously claim to be doing convincing 16-bit console-style arcade action and visuals. (Even then it wasn’t a very strong claim, but luckily by this point the PC had Doom and so didn’t care.)

This led to a flurry of games you really couldn’t get elsewhere, or that were in very short supply on the shelves, from vertical shooters like Major Stryker, Raptor, and Tyrian, to fighting games like One Must Fall, to quirky top-down RPGs like God of Thunder, and racing games like Wacky Wheels. It offered a great split. When you wanted a deep, polished experience, you had the commercial game market. For action fun, there was shareware, not least because when we did get big games like Street Fighter II, they tended to stink. Shareware supported the industry through much of the '90s.

The high cost of indie

By the mid-90s though, there was a problem. Commercial games began rapidly outstripping what bedroom teams could do, both in terms of technology and complexity of content. While there were engines available, they were mostly poor quality, with nothing like Unity on the market and the likes of Quake and Unreal costing far too much for anyone but other companies to license.

If you wanted to play with that kind of technology, you were looking at making mods instead. This was the era that gave us the likes of Team Fortress (1996) and Defense of the Ancients (2003), but also where the indie scene became largely forgotten. This wasn't helped by the fact that indie had essentially no place on consoles at all, despite a few nods over the years like Sony’s Net Yaroze, a development PlayStation aimed at hobbyists released in 1997. The PC saw its own push towards home development with tools like Blitz Basic/BlitzMax (2000) and Dark Basic (also 2000), with the goal of inspiring a new generation of bedroom coders. However, despite selling reasonably well, none of them gained much traction or saw many releases.

Jeff Vogel’s Spiderweb Software has been making RPGs since the '90s. They look simple, but fans keep coming back for their depth.

The indie scene as a whole ceased to be a big player in the market—which isn’t to say that it vanished. Introversion’s Uplink for instance was a big hit in 2001. Jeff Vogel’s Spiderweb Software started releasing old-school RPGs like Exile and Geneforge in 1995. PopCap began in 2000, becoming the giant of casual games like Bejeweled, Peggle, Bookworm Adventures, Plants Vs. Zombies, and Chuzzle—not bad for a company that was originally called ‘Sexy Action Cool’ and planned to make its debut with a strip poker game. 

And of course, there are other notable exceptions, such as Jeff Minter, who never stopped making his psychedelic shooters both for himself and others. However, it wasn’t until 2004, when Steam nailed digital distribution, that the market had a chance to explode and offer a real prospect of going it alone.

The turning point

Steam wasn’t the first digital distribution system, and at its launch it wasn’t even popular, with Valve forcing it on players for both Half-Life 2 and Counter-Strike. However, it was the first major attempt that nailed the details, like being able to download your games on any computer you owned rather than having them locked to just one, and being able to do so perpetually, rather than simply for a year, as was the case with most of the competition. 

The results spoke for themselves. Back when Valve was a lot pickier, and being backed by a publisher was a distinct advantage in getting onto the system, any developer who managed to get onto Steam effectively received a license to print money. Further afield, though games not on Steam were at a distinct disadvantage, the legitimisation of digital distribution as a concept certainly raised most boats.

And with all this came something just as important: the indie game ecosystem. With money to be made and developers flocking to indie for all sorts of reasons (being tired of the big companies, wanting to make a go of an independent project), it became viable to create tools and systems to support the scene—Game Maker, for instance, and Unity and Flash. Today, would-be indie developers have the tools to go head-to-head with even the biggest studios, albeit typically on a smaller scale, as well as explore more cost-effective options like pixel art and procedural 3D, while services like Kickstarter and Fig offer a way of seeking funding without immediately selling out.

This also opened up the definition of ‘indie’ even further, with companies seriously able to consider going it alone, without a publisher. Not everyone could be Double Fine, raising $3.5 million for Broken Age, but many have had huge successes—Pillars of Eternity pulling in just under $4 million, The Bard’s Tale getting $1.5 million and, at the height of Kickstarter fever, even Leisure Suit Larry creator Al Lowe managing to raise $650,000 for a remake of the first game.

Cave Story was one of the first games to get people talking about indie releases, beyond Flash games and the like.

It’s at this point that the word 'indie' really catches on. Again, it’s not that it was never used, but until this point the scene wasn’t big and important enough to warrant a position as basically a shadow industry in its own right. The release of Cave Story in 2004 was where people really started talking in those terms, with Indie Game: The Movie in 2012 cementing this, highlighting three of the most successful titles of the time—Braid, Fez and Super Meat Boy. 

Microsoft embracing the scene via Xbox Live Indie Games played its part, as did its XNA development system and its attempts to make a big deal out of indie launches during the "Indie Game Uprising" events between 2010 and 2012.

Elsewhere, the IGF (Independent Games Festival), launched in 1999, was also going from strength to strength, drawing more attention to the likes of Darwinia, Monaco and Crayon Physics Deluxe. We also saw more overtly indie-friendly portals like itch.io, and the Humble Indie Bundle, offering new marketplaces and ways of selling games—even if many later bundles proved a dead-end.

Perhaps most excitingly, it’s now that we start to see whole genres and styles largely associated with the indie market either flourish or come into existence, not least the ‘walking simulator’—games primarily about exploring a space and a story through environmental detail and voiceover. The first big name here was Dear Esther, a free mod released in 2008 and later remade in 2012, with later examples including Gone Home, Firewatch, and Everybody’s Gone to the Rapture.

Braid helped prove that indie games could be artistic works of love, equal to any commercial release.

There’s also the pixel-art aesthetic of games like VVVVVV, Super Meat Boy, and the original Spelunky, and for many old-school gamers, a return to brutal old-school difficulty. And somehow I doubt we need to say much about Minecraft. (It’s been quite popular, and influential.) Classic point-and-click adventures also saw a resurgence outside of Germany, largely spearheaded by the Adventure Game Studio creation engine and the success of Wadjet Eye Games’ The Blackwell Legacy, Gemini Rue, Technobabylon, and the upcoming Unavowed.

But it’s of course reductive to pick specific genres. The joy of indie games is that as long as the money can be raised somehow, a passionate team can take on more or less whatever they like, free of publisher interference or perceived wisdom, allowing for arty games like Limbo and Bastion (distributed by Warner Bros, but only as a publishing partner), throwbacks to lost genres like Legend of Grimrock, exploratory pieces like The Stanley Parable and The Beginner’s Guide, or completely new concepts like Superhot, where time only moves when you do, and the ferociously complex Kerbal Space Program, where difficulty really is a matter of rocket science.

The downside is that, as ever, it’s not enough to simply make a game. An indie title buoyed by word of mouth can sell millions, but far more are doomed to languish largely unplayed and undiscussed in the depths of Steam’s ever-growing piles or other services’ far less travelled shelves. The initial gold rush is very much over. Still, plenty of gold remains. It’s impossible to predict what game will be the next Spelunky, the next Minecraft, the next Undertale, or the next Super Meat Boy, but it’s absolutely no risk at all to bet that whatever it is, it’s already on its way.

Call of Duty (2003)

I have never seen a more tragic comments section than the one from a few weeks back when we asked our readers to share their most tragic save file disasters. Over 200 of you shared stories of despair and woe as hard drives crashed, Uplay cloud saves glitched, or a simple misclick spelled doom for countless hours of gaming.

We've collected the saddest, most heartbreaking stories below so that you can wallow in their misery. And if you didn't get a chance to contribute your own story, do so in the comments.

The lost library 

This one hits hard because the emotional loss is so apparent. It's one thing to fall in love with your Morrowind character and your adventures together, but Bear's story of losing his entire library of collected books in Morrowind because of a virus really stings.

Commenter: Bear

My first Morrowind character. I had made an Argonian and enjoyed the wonders that the game had to offer, discovered mods a number of hours in, got myself a few decent ones, joined House Telvanni to appreciate the irony of being an Argonian and of Telvanni, and progressed very little on the main questline but became deeply infatuated with the world.

I kept telling myself, I'll do the main quest later, and something would come up. When the "something" was the Thieves Guild, I became captivated with in-game theft, and I claimed a home that was empty after I'd murdered the owner as my loot den.

I use the word loot loosely. I was only interested in one type of item to steal: books. I ventured back and forth across the continent stealing every book I could manage, piling stacks of books as high as I could manage in my den of ill-gotten goods, occasionally tossing other stolen things on the floor, but my pride were the hundreds of books stacked taller than my Argonian. The small room would take a good ten minutes to load because of the sheer amount of books. I'd take detours while exploring just to raid places looking for books. Even if I got one book, I was pleased to be able to add it to my collection.

This was the first time I'd pumped so many hours into any game, ever. It was probably 2003 or 2004, and I had a PC that was rough around the edges at best. It was passed to me by my father as a reject for his own uses, no doubt in hopes that it would get my 12 or 13-year-old behind off the family PC with minimal trouble, and it worked. Until my young self made an uneducated choice on the internet and picked up a particularly nasty virus while trying to download some free graphics editing software. The PC wouldn't boot. My father refused to help me fix it (apparently he regretted giving me my own PC, because my internet usage had increased rapidly) and I couldn't figure it out.

My father finally just reformatted the hard drive, and when I went to restart Morrowind, my hundreds of hours and couple of years of gameplay were lost. I'd just lost the one thing that had so easily let me escape the troubles of being a bullied, friendless kid.

Lost in space 

Not all of these stories have to do with losing a save file entirely. Some deal with the existential horror of being trapped in one location, never able to escape. Of course, that horror becomes a lot more tangible when there's a giant xenomorph rapping at your chamber door.

Commenter: Bob McCow

Alien: Isolation is a bit mean with the saving system. You have to find what looks like a retro telephone booth and dial a number, making sure that Mr. Alien is not about to skewer you with his tongue or show you his six freaky fingers. You can only go back two save points, so you have to be very careful.

After a month spent hiding in lockers and wetting myself, I'd progressed through the game painfully slowly. I was escaping from the nest and it looked like I was finally getting Amanda off Sevastopol for good. I only had to take a lift up to a safer level. Sadly, I dropped a gun while being chased by the Alien and it got wedged in the door in a very glitchy way. The glitch meant that although I could take the lift, the next level wouldn't load. I was stuck! I couldn't go back to a save early enough to avoid the glitching gun. I haven't had the courage to replay the entire game to get to that point, so I'll never know if Amanda made it.

She's left forever in that lift with the Alien banging on the door outside.

It's not you, it's Witcher 3 

Listen, people make mistakes. Sometimes those mistakes can hurt us, but I'm not sure if I'd ever end a relationship over a lost game save. But I guess The Witcher 3 isn't your average game.

Commenter: Piantino

Some time ago, my ex-girlfriend wanted to play The Witcher 3, so I shared it from my Steam library with her. One day she played it on my PC, and when I sat down to play I realized that The Witcher 3 uses the same save slots when you share from your Steam library, and she had saved her game over mine. I lost my lvl 55 Geralt, my witcher gear and swords—everything. My time spent hunting treasures around Ard Skellig counted for nothing. I broke up with her some time ago and I use this story to explain why she is now my ex hahahaha.

Happy birthday 

I'm sure parents are equally as responsible for deleted saves as failing hardware. But there's something especially tragic when it all happens because they were trying to do something nice for you.

Commenter: Robáird Mac An TSaoir

In the late nineties, my dad surprised me for my birthday with some PC upgrades: a new monitor, bigger hard drive, and new graphics card. Of course, he'd wiped my old hard drive. Ten years of save files, writing, gig upon gig of films and music, all gone.

Commenter Grom Hellscream sums up the tragedy perfectly:

"Happy birthday, son. I formatted your entire childhood."

Groundhog Day 

If you've ever saved immediately before your demise only to find that you're now stuck replaying your death over and over, you can sympathise with Berty Bennish's story.

Commenter: Berty Bennish

I was playing the first Call of Duty back when it first came out. I would regularly save my games, but in this instance my last save was a couple of levels before the incident. It was the daylight Sainte-Mère-Église level. After destroying the tank that comes out of the wall, I ran round the corner heading towards where you get in the car. I killed a couple of guys and ran a bit further. The game decides to autosave right as a German soldier pops round the corner and blasts me in the head. Instant death.

Game loads.

Instant death.

Game loads.

Instant death.

and so on…

What's hilarious about this particular story is that another one of our commenters had nearly the exact same problem.

Commenter: ImpatientPedant

When I was playing Call of Duty, way back in the day, there was a tank section. I hadn't saved for the entirety of the (rather long) mission, and contrived to save at the exact moment a shell was fired in my direction, a shell which would wipe me out.

Every time I tried to reload, the shell would fire and I would die. Over and over. I was shattered.

If a psychologist interviews me years from now and asks me why my dreams often have intermittent flashes of light, this is 100 percent the reason. Poor old toddler me.

Sorry, Mom 

Parents have unwittingly destroyed thousands of hours invested in games, but Zach Fathaigh's story flips the script. I'm assuming his mother had a hard time looking at him for a few days after.

Commenter: Zach Fathaigh

1996's The Realm is a fun proto-MMO that my mom was obsessed with. You get four or five character slots, I can't remember which. My mom let me have one of those slots (thank you, Mom). My older brother asked me what the game was like and I wanted to show him how fun it was to start a new character. So I looked at the list and saw Mom's two really badass characters, my character, and a level 1 naked character. I deleted that one to make room for my brother's character.

The deleted character was a mule with hundreds of hours worth of loot. I forgot about this incident entirely until my mom reminded me of it over the weekend.

Sorry, Mom.

Double whammy

We've all had hardware fail. Picking up and starting a game from the ashes of an old save is awful. Having to do it twice? No thanks.

Commenter: Kyosho

Christmas of 1999, I get the one game I really wanted under the tree. That big, ugly (beautiful?) orange and purple box. Planescape: Torment. From Christmas day until just before New Years, I put about 25-ish hours into the game. I was really into it. Then my hard drive crashed. I was devastated. I had the computer fixed within a week, but it took me another month or two to work up the nerve to start the game over from scratch. I did it, though. Even made some slightly different choices. It was a bit tedious to read ALL that text again, but after a good 15 hours or so, I got back to where I'd been. Played another 20-ish hours and... BAM, another hard drive crash.

Here's a tip, kids: Don't skimp on your power supply when building a PC. Mine killed two hard drives before I knew the cause. Anyway, to say it was soul-crushing would be an understatement. I haven't beaten Planescape: Torment to this day. I've tried going back to it, but I end up losing interest before I ever get back to where I was. Best RPG of all time? Maybe. It's too painful for me to ever know.

Tower of Trials 

Speaking of hard drive failures, I can't stress enough how important it is to back up important projects. We had countless stories about people losing game saves, but entire games? Seriously, don't wind up like Matt.

Commenter: Matt Pruitt

I once made an entire game in RPG Maker VX Ace. It was called the Tower of Trials. It was short and used only the assets the engine provided. It had some random elements, little story, and was intended for short runs of about 30-40 minutes. I worked on it for two years, starting on my old laptop and eventually finishing it on my first PC. It was my own little project and only a few of my friends played it. Then I discovered why people told me not to buy cheap HDDs. My hard drive crapped out on me and two years of work were lost. The oldest version of the game, on my old laptop, had only three floors of the tower completed. Needless to say, my current rig is running on a Samsung SSD.

Harry Potter and the Computer Thief 

It's one thing to lose a save file, but to lose the ability to play a game altogether? Now that's tragic.

Commenter: dxdy

Back in elementary school, around 2001, I really liked Harry Potter. Neither I nor my parents could afford a PC or anything to play modern games (we had an Atari 130 XE, though), so I was very happy when someone left Philosopher's Stone installed in the school's computer lab.

I could only play video games for a limited time after classes, so I only made it to Herbology Class over the course of several months. The game felt amazing to me, probably because I was reading Harry Potter books around the same time.

Once I went to school as usual, but after arriving I noticed it was completely deserted. Normally, entire halls would be filled with sounds of children playing but there was not a single soul in sight. I went upstairs. After walking around for a minute, I was spotted by the principal's assistant who rushed me to the cafeteria.

When we arrived, I saw that all the students were crammed inside. I quickly learned from classmates that the school had been robbed overnight. The robbers broke a window and stole a boombox, a whole bunch of chocolate bars from the school's kiosk, and every single PC from the lab. I lost not only the save file I'd worked on for what felt like an eternity; I lost the ability to play my beloved game at all.

These were just a few of the great stories our commenters told us. For the rest, be sure to check out the comment thread from last week.

Some comments were edited for grammar and clarity.

Half-Life 2

Every year PC Gamer's editors and contributors vote on a list of the 100 best PC games to play right now, and every year our Top 100 list is contentious. A game is always too low, and another too high, and another unbelievably missing. Such is the inevitable fate of any List Of Things In A Certain Order.

But this year, we decided it would be fun to transform the heated comment threads under our list into a list of their own—the Readers' Top 100. Last week, I asked you to pick your top two games from our Top 100 list, and suggest two games to add. I then compiled the votes (1,445 of them), weighting the write-ins more heavily than the picks from our list, given that it's much more likely that 50 people would choose the same game from a list of 100 than all write in the same game.

My totally unscientific method does cause a few problems, namely: how much more do you weight the write-in votes? A multiplier of three produced the most interesting list in this case, though next year I may ditch that tactic altogether and take write-ins only. The danger is that a write-in-only list might be more easily swayed by organized campaigns (though that certainly happened anyway), and for this first attempt, I wanted to include a baseline to build off of just in case the suggestions were too scattered, or too homogeneous.
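For anyone who wants to replicate the math at home, the weighting described above boils down to simple arithmetic. Here's a minimal sketch of that kind of tally in Python (the function name and sample votes are invented for illustration; this is not the actual spreadsheet formula):

```python
from collections import Counter

def tally(pick_votes, write_in_votes, multiplier=3):
    """Combine picks from the published list with write-in votes,
    weighting write-ins more heavily to offset the head start that
    games already on the list enjoy."""
    scores = Counter(pick_votes)        # each list pick counts once
    for game in write_in_votes:
        scores[game] += multiplier      # each write-in counts triple
    return [game for game, _ in scores.most_common()]

ranking = tally(
    pick_votes=["Half-Life 2", "The Witcher 3: Wild Hunt", "Half-Life 2"],
    write_in_votes=["Borderlands 2"],
)
# A single write-in (3 points) outranks a single list pick (1 point).
```

Changing `multiplier` is the knob discussed above: at 1, write-ins and picks count equally; the higher it goes, the more the final order is driven by games that weren't on the original list.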

It worked out pretty well despite the uneven, improvised methodology—but do think of it as a fun exercise and not a perfect representation of PC gamers' tastes. Caveats out of the way, check out the list below. (Games that aren't on our Top 100 list are in bold.)

The PC Gamer Readers' Top 100

  1. The Witcher 3: Wild Hunt
  2. Half-Life 2 
  3. The Elder Scrolls V: Skyrim 
  4. Dark Souls 
  5. Borderlands 2 
  6. Fallout: New Vegas 
  7. Mass Effect 2  
  8. Doom (2016) 
  9. BioShock 
  10. Doom 2 
  11. Fallout 2 
  12. Deus Ex 
  13. Portal 2 
  14. Life is Strange 
  15. Starcraft 
  16. Baldur's Gate 2: Shadows of Amn 
  17. Grand Theft Auto 5 
  18. League of Legends 
  19. Diablo 2 
  20. XCOM 2 
  21. Fallout 4 
  22. Dragon Age: Origins 
  23. The Elder Scrolls III: Morrowind 
  24. PlayerUnknown's Battlegrounds 
  25. Bioshock Infinite 
  26. Overwatch 
  27. Command & Conquer: Red Alert 2 
  28. World of Warcraft 
  29. Rimworld 
  30. Path of Exile 
  31. Planescape: Torment
  32. Fallout 
  33. Dishonored 2 
  34. Crysis 
  35. Stellaris 
  36. Crusader Kings 2 
  37. Metal Gear Solid V: The Phantom Pain 
  38. Dishonored 
  39. Half-Life 
  40. Warcraft 3 
  41. Quake 
  42. Factorio 
  43. Prey 
  44. SOMA 
  45. Fallout 3
  46. TIE Fighter 
  47. Elite Dangerous 
  48. Rocket League 
  49. Civilization 5 
  50. Heroes of Might and Magic 3 
  51. Starcraft 2 
  52. Nier: Automata 
  53. Stalker: Call of Pripyat 
  54. Wolfenstein: The New Order 
  55. Minecraft 
  56. System Shock 2 
  57. The Elder Scrolls IV: Oblivion 
  58. Psychonauts 
  59. Divinity: Original Sin Enhanced Edition 
  60. Knights of the Old Republic 
  61. Age of Empires 2 
  62. Thief 2 
  63. Endless Legend 
  64. Vampire: The Masquerade – Bloodlines 
  65. Titanfall 2 
  66. Warframe 
  67. The Secret of Monkey Island  
  68. Kerbal Space Program 
  69. Europa Universalis IV 
  70. Hotline Miami  
  71. Payday 2 
  72. Battlefield 1 
  73. Dota 2 
  74. Total War: Warhammer 
  75. Mass Effect 3 
  76. Batman Arkham City 
  77. Rainbow Six Siege 
  78. FTL 
  79. Stardew Valley 
  80. Counter-Strike: Global Offensive 
  81. The Talos Principle 
  82. Tyranny 
  83. Civilization 6 
  84. Undertale 
  85. Knights of the Old Republic 2 
  86. Team Fortress 2 
  87. The Witness 
  88. Thief Gold 
  89. Arma 3 
  90. Dying Light 
  91. Alien: Isolation 
  92. Hyper Light Drifter 
  93. Planet Coaster 
  94. Jagged Alliance 2 
  95. Call of Duty 2 
  96. Transistor
  97. Mass Effect 
  98. Freespace 2 
  99. 7 Days to Die 
  100. Ultima Online

For reference, the top 10 games on our list this year were: The Witcher 3: Wild Hunt, Dark Souls, Dishonored 2, XCOM 2, Portal 2, Metal Gear Solid V: The Phantom Pain, Mass Effect 2, Alien: Isolation, Doom (2016), and Spelunky. If you want a condensed sense of how our tastes differ from those surveyed, here are a few observations:

We like Spelunky a lot more than everyone else. It was in our top 10, but didn't even make it into the Readers' Top 100.

While Half-Life 2 has lost some stock in our minds, it hasn't in everyone's. It was 11th on our list, but 2nd on the Readers' list.

Everyone agrees that The Witcher 3 is great. It was first on both of our lists.

Skyrim is still chugging along. It was 26th on our list, but came in third in reader voting.

Borderlands 2 wasn't on our list, but came in 5th. Did Borderlands fans come out en masse, or are we just weird for not putting it on our list?

14th place is pretty impressive for Life is Strange. Rimworld ranked pretty high, too. Either these games are more popular than we realized, or the survey happened to be circulated among their biggest fans. Probably a mix of both.

League of Legends fans showed up to challenge our preference for Dota 2. It came in at 18, while Dota 2 was knocked down to 73. Justice?

If you'd like to compare the lists directly, I've put them side by side in a spreadsheet. Thank you to all 1,445 people who responded to the survey! Feel free to suggest new ways to compile this list in the comments, and I'll take them into consideration next year. My skill with Excel spreadsheet formulas is at least double what it was last week, a cursed power that will only have grown by next year.

The Elder Scrolls V: Skyrim

During the climax of Star Wars: The Force Awakens, the Millennium Falcon crashes through a dense pine forest and skids toward the edge of an icy cliffside. The trees splinter and pancake as the space jalopy slides to a halt, and our heroes emerge unscathed, leaving a shower of debris and snow in their wake.

But the trees are not what they seem. The forests in The Force Awakens, PlayerUnknown’s Battlegrounds, and an episode of Sesame Street are all hiding something. They’re family. 

A significant amount of Mother Nature as represented in games and film starts life at SpeedTree, a small middleware company out of Lexington, South Carolina. Stranger still is that co-founder Chris King credits Bethesda Game Studios director Todd Howard for the company's success and the eventual deforestation of the Star Wars pines.

Todd Howard and the timber in The Elder Scrolls IV: Oblivion are also indirectly responsible for the alien forests on Pandora in James Cameron’s Avatar. You can blame Howard for the best trees in videogames and even in a rival series, The Witcher 3, with its vast windblown forests. You can also blame him for 3D models of the White House and surrounding shrubbery, likely tossed around the US Secret Service’s network to this day. 

You can allegedly blame him for CG pubic hair in 50 Shades of Grey (but don’t; King has no idea whether SpeedTree’s software was used as a dynamic genital hair generator. There are better solutions, I’m told. Still, there’s enough ambiguity for hope).

But Todd Howard didn’t make the trees or the alleged pubes or the technology behind them. Howard just gave a small company a big job that helped it keep pace in an industry overgrown with new possibilities.

A seed is planted 

SpeedTree is a middleware solution for videogame developers and, more recently, filmmakers who need to make realistic trees en masse, quickly. That doesn’t mean SpeedTree is as simple as a copy and paste engine, or that it spits out photorealistic trees within perfectly simulated ecosystems. It’s easier to think of SpeedTree as more of a specialized tree canvas, a tool used to generate whole forests of trees that look similar but aren’t carbon copies of one another. 

SpeedTree’s tools also include massive libraries of textures for rendering vegetation, and animation tools that simulate wind moving through branches and leaves. Beyond the time and energy saved making so many trees, SpeedTree’s vegetation is also designed to conserve as many system resources as possible. 

For a simple understanding of how it works, the story of their success, and where SpeedTree is being used today, I talked to King over the phone and through a few scattered email exchanges. He’s a bright, generous character, the kind of person that blends a simple answer into an extended anecdote before coming full circle into a question about me. I get the impression the trees are in kind hands. 

“It’s all procedural, meaning we use curves and numbers as input to an algorithm that will generate this tree, and as you change those curves and numbers, so changes the tree,” King tells me. “But once you get a definition, you can just hit randomize, one button, and it’ll make a tree that looks similar to the last, but not exactly like it. And you keep hitting randomize and it’ll keep producing different random variations, so you can populate a unique forest with trees very quickly.”

But it’s important to clear up a common fallacy, he says. “It's a misconception that Speedtree generates these procedural trees on the fly, when in fact, it is an offline procedural tool. The procedural part is there, but it's just a tool that the artist uses to create the shape.” 

Simply put, every time you play The Witcher 3, SpeedTree isn’t procedurally generating a new forest. While a game is in development, artists, level designers, and programmers work together to generate a wholly unique forest using SpeedTree’s procedural tools, then touch up and arrange the final result by hand. Once the trees are in place, SpeedTree is also capable of handling “all the loading, culling, level-of-detail, lighting, and wind effects for entire forests” in real-time.
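The workflow King describes, fixed parameters plus a randomize button, is the classic seeded-procedural pattern: the parameters define the species, the seed picks the individual. A toy Python sketch of the idea (purely illustrative; this is not SpeedTree's actual algorithm, and every name here is invented):

```python
import random

def grow_tree(height, branch_chance, seed):
    """Generate a per-segment branch count for a trunk of `height` segments.
    Same parameters plus same seed yields the identical tree; a new seed
    gives a similar but distinct variation of the same "species"."""
    rng = random.Random(seed)  # a local RNG keeps each tree reproducible
    return [sum(rng.random() < branch_chance for _ in range(4))
            for _ in range(height)]

# Hitting "randomize" is just drawing a new seed:
forest = [grow_tree(height=6, branch_chance=0.5, seed=s) for s in range(5)]
# Every tree shares the species' overall look (the parameters)
# but differs in detail (the seed).
```

Because the randomness is keyed to a seed, an artist can regenerate the exact same tree later, which is what makes an offline procedural tool practical: the output is reproducible first, then hand-tuned.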

The Witcher 3 has some of the best trees in videogames.

Without the ability to summon a forest of wholly unique trees at will and render them in a live environment, developers have to build their forests entirely from scratch. This can be done by spending endless hours making small tweaks to branch structure from the same starting model, or by building an in-house solution to generate their own trees – albeit likely one with fewer features. Now, when peering out into the verdant, swaying canopies of Toussaint, the time saved and focused elsewhere is plain to see.  

Before setting the stage for Geralt, SpeedTree was a loose assembly of good ideas without testimony. Around the turn of the millennium, videogames with massive forests of distinct timber weren’t a common technical ambition, often reined in by console hardware. For the most part, trees were placed by hand. And off the tail of The Elder Scrolls III: Morrowind—whose development predates SpeedTree and itself contains many a thicket—Bethesda likely wasn’t equipped to create a larger space with more realistic trees capable of running on modern PCs and emerging console hardware. They would need help, and that help was just sprouting.

Counting rings 

Unlike most modern middleware and niche software companies, SpeedTree’s origins aren’t the result of surgical, chartered market research and Silicon Valley capital juicing. King inadvertently started SpeedTree with co-founder Michael Sechrest, the result of modest ambitions steered by restlessness and passion. 

Sechrest and King met as graduate students in the Computer Engineering department at the University of South Carolina in the late 90s. “We were deeply involved with real-time visualization on those huge old Silicon Graphics Unix systems, doing visualizations for the Office of Naval Research and the Department of Energy.”

Coinciding with the release of the original Nvidia GeForce GPU in 1999, King and Sechrest wanted to start a business and bring their experience creating detailed visualizations to the PC, which was finally capable of rendering their work in real time. The pair formed Interactive Data Visualization (IDV), Inc., a name King considers “a great name for a company that does engineering visualizations, not so great for vegetation in videogames and films!”

Here's what a modern SpeedTree tech demo looks like in UE4. 

By 2001, the two found steady business creating pre-rendered 3D visuals and animations for a handful of companies, building architectural renders and flyovers day after day. It was good work, but rendering trees for video and stills didn’t challenge their programming skills the way videogames would. For King and Sechrest, it was one thing to model a scene and let a powerful computer render it overnight, and another to generate a forest and render it live in the same environment as a little Geralt and hundreds of invisible systems, without setting the average household PC on fire.

But during discussions about one particular flyover, the client’s lead architect was very particular about the trees, requesting that each species be recognizable, a specific height, and visibly wind-blown.

Without software capable of achieving such effects, King and Sechrest made their own, which became the very first version of SpeedTree. “It consisted of a very rudimentary version of the SpeedTree Modeler and a small 3ds Max plug-in,” King says. “It worked well enough. In 2002, we released it as SpeedTreeMAX.”

It was a useful tool for creating more detailed trees, and in creating a new program entirely, their coding know-how was stretched a bit. In time, King and Sechrest saw an opportunity in SpeedTreeMAX. The two wondered whether, properly recycled, the SpeedTree Modeler could be used to make assets for videogames, and if so, whether anyone would need them.

To check, they posted an article on OpenGL.org to introduce developers to SpeedTree’s tech and gauge interest in a version that produces trees for real-time rendering engines. In a stroke of luck, Sebastien Domine of Nvidia read the article and downloaded the included demo, which demonstrated how SpeedTree could be used in a real-time setting. Domine was impressed by the technology, but he also figured a real-time SpeedTree would be an excellent promotional tool for a new Nvidia GPU. He contacted King and Sechrest, they struck a deal, and they started work on SpeedTree’s first true real-time demo.

From a clichéd tech demo to Star Wars, SpeedTree has come a long way.

In true videogame form, the demo showed a mech launching and dodging missiles amidst breeze-blown trees, and was put up for download on Nvidia’s developer portal to a fervent reception. Chris tells me that “over the next three months, more than 40,000 people downloaded the demo, including Todd Howard of Bethesda.” Videogames were growing more capable with better technology, and the task of filling bigger, more advanced worlds with realistic, varied trees was growing bigger too. And King and Sechrest now had a major game developer on board to showcase their tech. 

“The only problem was that we'd never sold middleware,” says King. “We had no idea how it should be priced. Amazingly, Todd helped us determine what we both agreed was a fair price and then and there he became our first games client.”

King continues as I imagine an ambling Linklater-directed montage, Howard, King, and Sechrest huddled over a pile of paperwork and tree textures, pinning them to the wall and nodding as revelations dawn one after the other. Emails and phone calls don’t make for a good montage.

“Fast forward to March 2006. Oblivion is released for the Xbox 360. It was a monster hit and the SpeedTree logo appeared on the back of the box,” says King. “The perception was that SpeedTree may not be efficient enough to run effectively on a 360 or PS3. Because Oblivion demonstrated in grand fashion that SpeedTree was ready for primetime on any platform, it’s probably the single most important title we’ve ever been a part of. We will always be grateful to Todd.”

At the time, an open world game at Oblivion’s scale running on a console was unheard of. And while the final Xbox 360 version is widely considered a buggy, nearly unplayable mess, SpeedTree managed to render trees across 16 square miles of land on hardware that fell short of even an average gaming PC of 2006. It worked, and that was enough to draw industry-wide attention to SpeedTree.

The exchange went both ways, too. Would Oblivion have captured imaginations like it did with boring, repetitive forests? It doesn’t look great by today’s standards, but at the time, streaming in diverse stretches of vegetation on the Xbox 360’s relatively meager specs was a technical breakthrough, proving that open world games could work on lesser hardware at high fidelity. It set a trend, and games with even denser forests followed suit; The Witcher 3, for one, would look very different (or wouldn’t exist at all) if not for SpeedTree.

Branching out 

A snippet from White House Down featuring SpeedTree. 

By today’s metrics, Oblivion is an old game. Now we get thousands of new games every year, a good chunk of which use SpeedTree, and some of which want a little more than a procedural tree tool. What about modeling the life of a tree? When I ask King whether simulation of arboreal biology is a priority for them, I’m told no in the best way possible: an extended anecdote from the left-est of fields. 

“One of our competitors was based in Europe, and the US Secret Service, believe it or not, is a customer of ours. They put together simulations of the White House grounds, or at least they were talking to us about it.”

It turns out that spending resources on creating systems that consider the laws of nature would only sequester SpeedTree to an even smaller niche. 

“So they called the European company and told them what the problem was, and they said, ‘Well, our stuff is botanically accurate and Southern Magnolias don’t grow 80 feet high, so our software won’t do that.’” King gets a little louder: “And they called us and we were like, ‘Shit, we’ll make it 8,000 feet high, we don’t care. It makes no difference to us, we’re not bound by these botanical rules. What do you want it to be? Do you want it to be White-House-shaped? We can do that as well.’”

The Secret Service weren’t even the first clients to request detailed models of the White House lawn. King says it’s requested so often that it’s become a running gag in the company. The number of games and movies that feature the White House as a setting is surprisingly high, and technology that can emulate its vegetation is in equal demand. Filming there is prohibited, so the more often SpeedTree is the requested tool, the easier it is for future film crews to turn to them.

From Oblivion to the Secret Service, SpeedTree has since branched out into generating vegetation in general. King and his cohorts are also currently prepping the full release of SpeedTree 8, which is a big leap forward in physically based rendering (PBR) tech, and showcases their new material scanning hardware and software used for an ever-growing database of megatextures and models. To create such a deep library, King even employed a specialist whose job is to literally scan as many plants as possible. Imagine that, strolling about the planet looking for pretty plants to immortalize every day. Noble. 

Prior to placement in a game or film, and removed from the discerning touch of an artist, the trees and shrubs and cacti look serene, eerie. They’re nature’s aesthetic devoid of context, cleaner and healthier than any plant will ever be, the Stepford wives of common ferns. They look real, and to a developer staring up at the mountain of work involved in creating their own assets, SpeedTree’s prices are a much more appealing alternative. Still, as useful and advanced as SpeedTree’s tech might seem, it has plenty of room to grow.

“You could say we’re currently near the center of the spectrum of digital fidelity in tree models. We look good from the distance players typically view trees in a game situation—on or near the ground and maybe 10 to 100 feet away,” says King. “So, in ten years, hopefully we’ll be zooming from space to a vein on a leaf seamlessly!” With more games like Star Citizen and Dual Universe looking for the same scale and adaptability, King knows they have their work cut out for them.

SpeedTree was even used in the 2015 Oscar winner for Best Picture, Spotlight.

In the meantime, with every wave of new games, we get a firsthand look at how the tech is changing. King can’t disclose everywhere you’ll see SpeedTree used next, but Destiny 2, like the original, will make use of it. He also tells me that Industrial Light & Magic has more or less standardized on SpeedTree, and that barring any breakthroughs in vegetation generation it’ll be used in most of the upcoming Star Wars films.

Ubisoft also brokered a deal with SpeedTree in 2016, agreeing to use it as their standard vegetation modeling tool. Given the size of Assassin’s Creed: Origins’ map, SpeedTree makes perfect sense. Since Avatar, SpeedTree has been used in nearly 100 films and well over 1,000 games, making it the industry standard. It's everywhere. 

Rogue One’s beachside battle on Scarif, arguably the series’ best battle ever, used SpeedTree. Whenever you slap a new player into a tree in Absolver or make a hasty getaway through the forest from an irate honey badger in Far Cry 4, SpeedTree is there. So the next time you take cover behind a big hardwood in PUBG, remember who generated its bulletproof bark, and in turn, set the table for this chicken dinner and many more to come. 

Quake

You may recall that in Doom and Doom 2, multiplayer matches took place in standard campaign maps. In other words, deathmatch had no maps designed especially for PvP skirmishes. That seems unthinkable now, when multiplayer map design is a fine art of its own (though plenty of level creators ended up making dedicated deathmatch maps for Doom anyway).

With Quake, id Software started adding multiplayer maps of their own, and in a recent interview with PCGamesN, Tim Willits made the claim that it was his idea. Explaining how he wanted to use remaining map fragments from single-player levels to adapt for multiplayer, Willits claimed his idea was roundly mocked.

"They [John Romero and John Carmack] both said that was the stupidest idea they'd ever heard. Why would you make a map you only play multiplayer when you can play multiplayer in single-player maps? So I said 'No, no, no, let me see what I can do.' And that's how multiplayer maps were started. True story." 

But is it true? Apparently not, according to other id Software veterans including Romero, Tom Hall and American McGee. Romero wrote a lengthy blogpost on the matter, specifically denying that the exchange between Willits, himself and Carmack ever happened.

"This never happened (Carmack verified this to ShackNews)," Romero writes. "In fact, we had been playing multiplayer-only maps in DOOM for years already. There had been hundreds of maps that the DOOM mapping community had made only for deathmatch by that time. DWANGO was a multiplayer-only service that had many multiplayer-only maps that are legendary today. 

"American McGee even released a multiplayer-only map in November 1994 named IDMAP01. The incredible DOOM community invented the idea of designing maps only for multiplayer mode, and they deserve the credit. The game owes so much to them."

It's worth reading all of Romero's post for the nitty-gritty, where he also discredits Willits' claim that he had designed the first episode of Quake (it was a collaboration, with Willits designing less than half of the maps). He also points out that other FPS games, such as Rise of the Triad, had featured bundled multiplayer-only maps before Quake did.

Whatever the case, American McGee denied Willits' claims on Twitter, and Carmack confirmed to ShackNews that he doesn't remember the conversation happening. We'll update this story when (or if) Willits responds.

Quake

There is a high school reunion backstage at QuakeCon. The silver pots of catered food delivered by the towering Gaylord Texan above keep everyone buoyant, and occasionally a good samaritan wanders in with a short pyramid of Domino’s pizzas. The casters are hard at work on the corner of the stage, and the on-deck circle is filled with whirring computers hardwired to LAN cable for any enterprising team looking to get a few more reps in before showtime. For the most part, the Quakers are relaxed. There is laughter and shit-talk, and enveloping bear-hugs offered between friends who haven’t seen each other in far too long.

In recent years, fans of the mercurial Quake franchise haven’t had much reason to play outside of id Software’s yearly love letter to the franchise, but the upper echelon of the scene remains sturdy. Tim “DaHanG” Fogarty and Andrew “id_” Trulli are both in their late-20s and play for Team Liquid’s Overwatch squad—but they’ve each taken a respite from that game to form a (slightly impromptu) team for this year’s Quake Champions tournament. The lithe Shane “Rapha” Hendrixson is here—since 2008 he’s traded titles in the 1v1 dueling bracket against Alexey “Cypher” Yanushevsky. He’s entering this year’s show defending championships from both 2015 and 2016. 

I spot Sander “Vo0” Kaasjager sequestered away from the rest of the crowd, playing endless deathmatches to keep himself frosty. In his jersey and trademark gamer grimace, he doesn’t look much different from the man who famously lost to Johnathan “Fatal1ty” Wendel in the grand finals of the 2005 Cyberathlete World Tour in what was then the biggest prize pool in the history of competitive gaming. Together, they represent the first generation of esports—the first men who dared to make a living playing video games. The world has passed them by, but they’re not leaving without a fight.

“I started playing Quake in 2001, I’ve known some of these guys for 10 years,” says id_ backstage with a tub of lunch in his hands. “Quake has a longstanding community for over a decade, and those players will always come out of the woodwork to compete. Not just for money, but for the pride and the title, that’s something that Quakers live for.”

Quake reborn

The QuakeCon tournament, which previously focused on minor bounties in stale Quake Live brackets, now features a million-dollar Champions prizepool.

For the past seven years, the Quake game du jour was Quake Live, the still-active browser-based incarnation of the legendary Quake III: Arena. It served as the franchise’s testament and tomb. There hadn’t been a new Quake game since 2005’s middling Quake 4, and as the esports industry hit its tipping point, id Software instead chose to focus on its single-player ambitions with Rage and the long-gestating Doom reboot. The cadre of Quake pros still showed up to QuakeCon every year to reignite old rivalries, but there wasn’t much to play for beyond that.

However, the mood is different this year. For the first time in forever, QuakeCon is headlined by its namesake game. The free-to-play Quake Champions is on the horizon, and the QuakeCon tournament, which previously focused on minor bounties in stale Quake Live brackets, now features a million-dollar Champions prizepool. You could consider it a commencement ceremony for an esports initiative that aims to make Quake a crucial fixture in the scene again. Already, Bethesda has announced two DreamHack Quake Champions tournaments before the end of the year, and both are paying out decent prize money. The marketing here is transparent—at this point it’s harder to find a game company that’s not doubling down on esports—but the circumstances are unique given the heritage that was already present. These Quake players would’ve gathered here anyway, but now, they get to be professionals again.

Rapha fits the bill of the long-suffering FPS pro perfectly. He’s an incredible duelist who can track down railgun headshots with his eyes closed, but he hasn’t been able to find a game that fits his skillset since the Quake scene dried up during his prime. He had a brief affair with Ubisoft’s dead-on-arrival ShootMania, and he tried and failed to find his groove on the Team Liquid Overwatch team. But that was it. He was doomed to a purgatory of yearly Quake Live matches against the same tired competition he faced as a college kid. The Quake Champions announcement changed everything. He can finally go back home.

"It’s very painful to lose in 1v1 sometimes, because it wasn’t the game you lost to, it’s your opponent."

James "2GD" Harding

“It’s amazing for me. I’m just excited for the opportunity to play in multiple tournaments again,” he says. “I really liked Overwatch but it feels like a lot of the skills there are confining. … I gave it my all, but Quake is just my game.”

Rapha isn’t the only one. Id_ tells me he’d consider making a full-time comeback if the Champions scene stays healthy. Anton “Cooller” Singov inked a deal with esports giant Na’Vi to return to his roots. Alexey “Cypher” Yanushevsky did the same after logging time with both Counter-Strike and Overwatch. Quake legends around the world are watching Bethesda put their money where their mouth is, and are graciously taking the opportunity to see if they've still got what it takes.

Art of the duel

It’s hard to articulate exactly what these pros find in Quake that they can’t in other FPSes, but one thing is certainly clear: there’s no true 1v1ing in Overwatch. If you’re familiar with those old CPL derbies you know what I’m talking about—two players coasting the circumference of an arena, stacking green armor, weapons, and health in hopes of winning a frantic, five-second engagement. The 1v1 format tested your twitchiness, but it also evaluated how well you could read and react to your opponent, a perfect marriage of mindgames and rocket launchers. It’s a unique and rewarding style of play that’s been missing in our era of role-based skirmishes for quite some time. If you grew up on whip-around nailgun blasts, Soldier 76’s auto-aim might seem a little cheap. “It’s just you and the other guy. There’s no other factors. It’s just who can play more consistent, and who can outsmart the other guy,” says Rapha.

“It’s incredibly personal,” says James “2GD” Harding, another former Quake pro and someone who’s been around esports for a long time. “[In 1v1] all of your intelligence and all of your dexterity is being challenged by the best players in the world. It challenges you so much that you can never really master it, but you can try to be the best at certain things. Like, maybe you try to win a tournament by being the best at aiming, or win a tournament by being the smartest player, or being the most aggressive player. It’s very painful to lose in 1v1 sometimes, because it wasn’t the game you lost to, it’s your opponent.”

"I think in some ways we’re hoping to be replaced."

James "2GD" Harding

Bethesda values the format enough to earmark $330,000 of the QuakeCon prizepool for the 1v1 bracket alone. Tim Willits has called Quake Champions’ dueling the “secret weapon” to the company’s esports plan, reckoning that it’s the one thing Champions has that other games don’t. It remains to be seen if Quake can cross over like it did in the ‘90s and early 2000s, but in the meantime it’s wonderful to watch the veterans get a run at something they used to obsess over. The QuakeCon tournament was full of great matches: in 2017 we had the pleasure of watching high-stakes sets between Cooller and Rapha, DaHanG and Noctis, Av3k and Vo0. These men have wives and kids, and they were still launching themselves off their feet in acrobatic rocket jumps. No matter what happens from here, we at least had the chance to watch the founding fathers of pro gaming live the dream one last time.

But maybe that’s also the one thing holding Quake Champions back. Esports, like any other competitive field, needs a trickle of new blood to survive. Running back the same posse of professionals under brighter lights and a felicitous bankroll doesn’t bode well for the future. “I think in some ways we’re hoping to be replaced,” says 2GD, noting that the average age of the players at Dota 2’s The International landed somewhere around 21.

That might sound like a strange thing to say, but then again, everyone at QuakeCon was there for the same reason. They love and fear for Quake, and while they’re happy to play a brand new game for a significant wad of cash, their primary concern is the continued prosperity of their favorite game. They won’t fall on their sword, but they’ll happily welcome the next generation if they earn it.

New blood

That wish was granted on the third day of the tournament. Team 2z were completely anonymous when they walked through the doors of the Gaylord Texan. Their Twitter account sports a scant 199 followers. They are unsponsored, unsanctioned, and reachable by a blasé Gmail address answered directly by the players. Mostly, they’re in their early 20s and late teens, green as grass, and stacked up against a combined century of Quake experience in the other teams.

Quake is fast, brutal and ridiculously hard to become good at.

Nikita "Clawz" Marchinsky

And yet, they pulled off a clean sweep of every Quake Champions match at the show. 2z took home the team-based Sacrifice tournament with definitive wins over Team Liquid and the prodigious NOTTOFAST, and the 19-year-old Nikita "Clawz" Marchinsky flat-out embarrassed Vo0 in the 1v1 championship with an icy 3-0 blow-out. They were, by far, the least famous players entering the weekend, and they exited as the undisputed best in the world.

“For me personally it was very special to compete against all the legends I grew up watching and idolizing. I think we were very underestimated LAN-wise before this event because all of them have so much more experience than us,” says Clawz, a few days after his victory. “It felt even more like that in the 1v1 tournament, where any predictions containing me among the top three were made fun of by the old legends. It felt amazing to prove them wrong and to show the world what I'm capable of.”

All four members of the 2z squad are excited about the upcoming Dreamhack tournaments: eager to defend their first-place status and clearly aware of the targets on their back painted by a legion of veterans. But they didn’t get to the top with any trickery or cheese, they’re simply outstanding FPS players who outworked their opponents in the film room and on the ladder. 

Frankly, I was surprised that they decided to choose Quake. You get the sense that 2z could easily excel at Overwatch, or Counter-Strike, or any other FPS with a healthier, more established scene than Champions. One of the players, Kyle “Silentcap” Mooren, has a history with Quake III and Quake Live, but the others are arriving without any rosy nostalgia. It speaks to the game’s legacy that they still found their home here.

“I've played some Overwatch and a bit of CS:GO as well, and as much as I enjoyed them, none of them are quite like Quake,” says Clawz. “Quake is fast, brutal and ridiculously hard to become good at.”

“I like to keep this tradition, I mean to play the first and the very best, hardest shooter in the world,” says Alexander “Latrommi” Dolgov. 

QuakeCon is a high school reunion. They came across oceans to eat catered cheeseburgers, to reignite old rivalries, to remember how things were. There’s a brand new game, a lot of money, a lot of hope, and for the first time in a decade, they’re losing. For the first time in a decade, that’s the best news they could possibly get.

The Elder Scrolls V: Skyrim

We're still a ways out from VR becoming Mancubus-mainstream.

I’m not an athletic man, but in Doom VFR, the upcoming VR version of 2016’s hit reincarnation of id’s classic shooter, I am Death Himself. As a possessed soldier tries to shoot me, I point a teleport marker and watch as everything slows down. I duck his bullets, teleport close, and shoot him in the face with a shotgun.

After a few shots, a Mancubus starts to flash, indicating it's ready to be torn asunder. I point my cursor into the center of the Mancubus and release the trigger, teleporting myself into the center of the obese demon. The Mancubus explodes and his immense body sags to the floor, the corpse so large that I have to step out of the empty shell to move on with the fight. Clearly, VR is cool.

But even for a relatively young technology, VR has seen very few major game releases, which makes Bethesda’s all-in approach a confounding surprise. By bringing Skyrim, Fallout, and Doom to VR, the company is in a good position to change the pace. In attaching some of the biggest names in gaming to VR, Bethesda is trying to set itself up as a leader in VR development, build its own internal expertise, and help jumpstart a technology that has grown in fits and starts.

“If you believe something is going to be big, you can't just say, ‘oh, this will be big in six or eight years, so let's ignore it until it's big, and then we'll jump on the bandwagon,’” Doom VFR executive producer Marty Stratton tells me at QuakeCon this year. “There's so much to be learned, and there's so many opportunities to be leaders. We want to be technical innovators, so when we see something we believe in, we want to be at the forefront.”

Of the three VR prototypes heading for release this fall, Doom is by far the best. The closed-in spaces of the UAC’s corridors look sleek and sci-fi inside a VR headset, and teleporting up and down hallways to blast imps and shotgun cacodemons feels spectacular, fast, and smooth. Time slows for teleportation or switching weapons, so I always felt able to react faster and be more badass than my weak, fleshy mortal body could ever allow. No matter which weapons I used, pulling the trigger of the controller felt as natural as aiming down the sights.

Shooting for the Skyrim

Sadly, all of these reasons are exactly why Skyrim VR, which is coming to PSVR first in November and PC sometime in 2018, is the weakest of the bunch. The wide-open vistas of Skyrim look pixelated and low-res running on the PlayStation headset, and the 180-degree motion detection of the PSVR meant that I had to constantly use physical buttons on my controller to rotate myself and change direction.

Descending one of a Skyrim dungeon’s many spiral staircases was dizzying as I used a teleport button to hop down a step or two, then tap-tap-tap-tap-tap to rotate my body, teleport, then rotate again.

Instead of feeling bold as the Dovahkiin, legendary hero of Skyrim, I felt like a child playing Fruit Ninja on a Nintendo Wii.

None of this compares to how disappointed I was when I first heard the call of an enemy bandit. I readied my sword only to find myself waving a wand in space, awkward and unsure if I was even making contact. I wiggled it around a few times and the bad guy fell over. Both of us looked embarrassed about the whole thing. Instead of feeling bold as the Dovahkiin, legendary hero of Skyrim, I felt like a child playing Fruit Ninja on a Nintendo Wii.

“Everybody says that,” says Pete Hines, VP of Marketing for Bethesda, when I complain about how the melee weapons feel. “The problem is that when you do this [he pulls a trigger on a gun], you don't detect the funkiness of the action because it's on a predestined path.” The gun behaves like a gun, in other words, and you’re pulling a trigger just like you’d pull a real-life trigger. “But when the sword swings however you move it, you notice it a lot more, now it does feel more like Fruit Ninja.”

Even though Skyrim will be coming to VR mostly unchanged, with all its quests, dialog, and NPCs in place, Hines expects that players will change how they play the game based on what feels good. In particular, he expects more players to become mages. “Because of the nature of VR and magic, it works exactly like you'd expect it to, because you're not missing the feedback.” Dual-wielding magic, in particular, works better in VR. Being able to move your hands independently gives you the chance to shoot fireballs in two directions at once, or hold a shield against one enemy while you shoot lightning at another.

Waiting for the Fallout

I enjoyed Fallout VR more than Skyrim VR at least, because the world running on a PC looked a lot better and my ability to turn in a full circle was unrestricted. Though I had the option of attacking raiders with a baseball bat, Fruit Ninja Skyrim style, I could easily ignore it since Fallout has a huge array of guns that feel good to use. Launching a mini-nuke at a Deathclaw and watching the blast in VR was every bit as fun as it sounds.

Still, it wasn’t quite right, and I’m not sure if the trade-offs are worth the momentary wow-factor of stepping into VR. Can I explore the Commonwealth Wasteland for hundreds of hours in this gear, or will I always start to feel green like an irradiated ghoul after half an hour?

These are limitations that come with taking an existing game and bringing it into VR. Problems with movement and feedback just aren’t solved yet, and these problems wouldn’t exist if a game was built for VR from the beginning. I can’t help but compare my time with Skyrim VR with Lone Echo, an incredible game that was built to showcase everything VR can do right now, and nothing that it can’t.

But for Bethesda, getting experience is worth it even if the end product isn’t quite perfect. They’ll put out the best version of their games that can exist in VR right now, and they’ll gain valuable internal experience with VR design. I can see how it’s a win-win for Bethesda to take a risk with these experiments, but I’m not as confident that these experiments offer much to long-time fans of these games desperate for a familiar experience in their seldom-used VR gear.

Skyrim VR is only scheduled to come to PSVR in November, with a PC release possibly coming in 2018. Fallout VR is coming to PC on December 12, and Doom VFR is coming to both platforms on December 1.

Borderlands 2

Damage dealing is the most selfish of gaming roles. It's not about mending the wounds of your buddies or taunting off the bullies attempting to harm them in the first place; it's about the ecstasy of climbing to the peaks of damage meters and watching ever-larger numbers splash the screen. If gaming in general is a power fantasy, a strong DPS build is the wet dream.

And sometimes it gets out of hand. Some builds are so powerful that developers pull the plug, worried that they're damaging the game itself. This is a celebration of those builds that have achieved or come near those marks: the most famous, powerful, interesting builds that disrupt a game's conventions as violently as the power chords of an Amon Amarth riff at a Chopin recital.

We know we've barely scratched the surface of great builds here—almost every game has one! If there's one you think we should have included, tell us about it!

Skyrim: Sneak Archer with Slow Time (2011)

Subtlety isn't really the first thing that comes to mind upon a first glance at Skyrim—after all, it's largely a game about some rando shouting dragons to death. But for years nothing struck fear into the hearts of giant flying reptiles and creepy Reachmen quite like Skyrim's Sneaky Archer build. It's still quite beastly, but YouTuber ESO describes it in its most broken form.

The build's cornerstone was the Slow Time shout, which you could extend by 20 percent with an Amulet of Talos and up to 40 percent by visiting a Shrine of Talos. Slap some Fortify Alteration enchantments on your ring and swig a Fortify Alteration potion, and you could push that over a minute. That's longer than the shout's cooldown. Pick up the Quiet Casting perk in the Illusion line, and you could wipe out a whole band of Stormcloaks before they even knew you were there.
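Those bonuses stack against the base duration, which is why the numbers balloon so quickly. Here's a rough back-of-the-envelope sketch of the arithmetic, assuming a 16-second base duration for the full shout and a hypothetical combined Fortify Alteration bonus of 175 percent—both illustrative figures rather than exact in-game values, and treating the bonuses as simple multipliers:

```python
# Rough Slow Time duration stacking. The base duration and the
# Fortify Alteration total are illustrative assumptions, not
# verified game values.

def slow_time_duration(base=16.0, talos_bonus=0.40, fortify_alteration=1.75):
    """Estimate the extended Slow Time duration in seconds, treating
    the Talos bonus and Fortify Alteration as multiplicative."""
    return base * (1 + talos_bonus) * (1 + fortify_alteration)

print(f"{slow_time_duration():.1f}s")  # ~61.6s, past the one-minute mark
```

In the actual game the various Fortify Alteration sources add together before applying, so treat this as ballpark arithmetic rather than a simulator—the point is simply that stacked percentage bonuses push the duration past the shout's cooldown.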

Combine that with plenty of points in the Sneak line, Fortify Archery enchantments on every bit of gear, and paralysis or fear enchantments on your weapons, and you might be tempted to ask the fur-clad denizens to worship you in place of Talos. Unfortunately, Bethesda killed the fun with last year's Special Edition. Slow Time now slows down time for you as well. But even without it a Sneak Archer remains a force to be reckoned with.

The Witcher 3: Alchemy and Combat build

Most crazy damage builds feel as though they're breaking with the lore of their parent games, but The Witcher 3's combination alchemy and combat builds tap into the very essence of what it means to work in Geralt's profession. You're a badass swordsman thanks to 36 points in Combat, and 38 points in the Alchemy line see you chugging potions and decoctions along with making sure you're using the right oil for the right monster.

YouTuber Ditronus detailed the best incarnation of this monster setup, which focuses on stacking everything that gives you both critical hit chance and critical hit damage. Ditronus claims he can land hits that strike for 120,000 damage with the build at level 80 on New Game+.

The tools? Pick up the steel Belhaven Blade for its crit potential and the Excalibur-like Aerondight silver sword for its damage multiplier. Use some other crit-focused gear along with two key pieces of the Blood and Wine expansion's alchemy-focused Manticore set and use consumables such as the Ekhidna Decoction. Toss in a few key mutagens and frequently use the "Whirl" sword technique, and Geralt becomes the spinning avatar of Death herself. It's bewitching.

World of Warcraft: Paladin Reckoning Bomb (2005) 

World of Warcraft has seen some crazy damage builds over the course of its 13-year history, but none has reached the legendary status of the Paladin class' "Reckoning bomb" of WoW's first "vanilla" years. It wasn't officially a damage setup, but rather an exploit of the Reckoning talent from the tank line that some Retribution (damage) Paladins would pick up. Originally, Reckoning gave you a charge for a free attack whenever you were the victim of a critical hit, and in 2005, you could stack these charges to infinity and unleash them all at once with your next attack. Build enough stacks, and you could one-shot other players in PvP.

So how imba was this? In May 2005 a Paladin named Karmerr from the guild PiaS (Poop in a Shoe, if you must know) got his rogue friend Sindri to attack him for three whole hours while he was sitting down, guaranteeing critical hits, until the stacks reached a staggering 1,816. Their mission? The Alliance server-first kill of Lord Kazzak, one of WoW's first 40-man raid bosses. Knowing their plan was controversial, PiaS asked a Blizzard Game Master for permission, and the GM claimed it couldn't be done. But Karmerr popped his invincibility bubble and, boom, killed Kazzak in a single shot. Alone.

The devastation was so intense that it locked up Karmerr's PC for 10 seconds, and the system was so unprepared for the 1,816 attacks that it could only register the blow in second-long ticks until Kazzak died. PiaS posted a video, and within 24 hours Blizzard nerfed Reckoning from a 100 percent chance to a mere 10 percent chance and limited the stacks to five.

Diablo 2: Hammerdin 

Overpowered Paladins are something of a Blizzard tradition. One of the most infamous overpowered DPS builds of all time is the so-called "Hammerdin" from Diablo 2. The heart of the build was the Blessed Hammer skill, which shot out spinning, floating hammers that smacked any monsters foolish enough to get near.

Hammerdins had been around in various incarnations for months, but the build came into its own in 2003 with the introduction of synergies. With Blessed Aim, Paladins could increase their attack rating, and with Vigor they could boost their speed, stamina, and recovery. Damage, too, got a boost with Concentration Aura. But nothing made the build so broken as the Enigma Runeword, which sent the Paladin immediately teleporting into huddles of nearby enemies and clobbering them for up to 20,000 points of damage each. Watching it in action looks a little like watching a bugged game.

The catch? Almost everything needed to complete the build was outrageously expensive. But as the obscene number of Hammerdins dominating Diablo 2 came to show, that was never much of a deterrent.

Borderlands 2: Salvador

Forget specific builds for a second: Borderlands 2's Salvador is kind of broken by default. His class ability—"Gunzerking"—lets him fire off two weapons and reap their benefits at once, all while taking less damage and constantly regenerating ammo. Nor does this kind of destructive divinity come with any real challenge. If you've got the right weapons equipped, all you really need to do is sit still and fire away. Dragon's Dogma had the right of it: divinity can get kind of boring.

The right weapons push this already preposterous setup to absurdity. Pick up the Grog Nozzle pistol from the Tiny Tina's Assault on Dragon's Keep DLC, which heals Salvador for 65 percent of all the damage he deals out. In the other hand, equip a double-penetrating Unkempt Harold pistol, which hits enemies as though seven bullets had hit them twice. Then pick up the Yippie Ki Yay talent, which extends Gunzerking's duration by 3 seconds for each kill, and "Get Some," which reduces Gunzerking's cooldown after each kill, and you'll always be firing, always be healing, all the time.

Elder Scrolls Online: Magicka Dragonknight Vampire (2014)

Ever wanted to be a raid boss in an MMORPG? You could in 2014, not long after the launch of Elder Scrolls Online, if you were a magicka-focused Dragonknight who'd become a vampire. You were both DPS and tank, able to take on dozens of players in PvP at once and kill most of them as well.

The root of the problem was the vampire tree's morphed "ultimate" ability, Devouring Swarm, which sent a swarm of bats down on everyone else in melee range while also healing you for every enemy hit. But every class who became a vampire had access to that.

Dragonknights, though, could also use their Dark Talons skill to root all those players in melee range while roasting them with fire damage at the same time. A huge magicka pool made it even deadlier. Then a passive ability called Battle Roar factored in, allowing the Dragonknight to replenish health, magicka, and stamina based on the cost of casting Devouring Swarm. And it gets crazier. If you were wearing the Akaviri Dragonguard Set, you enjoyed a 15 percent deduction in ultimate ability costs, essentially allowing you to spam Devouring Swarm.

This was already hellish with regular Dragonknights, but players who earned the "Emperor" title in PvP might as well have been Daedric lords. Being the current emperor granted buffs like 200 percent ultimate and resource generation, with predictably absurd one-versus-many results.

The easy way to stop this nonsense was always just to stay out of range (although the DK's charge ability from the Sword and Shield line complicated that). Within a month, though, ZeniMax Online nerfed it to hell.

Diablo 3: Inarius Necromancer 

How do you bring interest in your four-year-old dungeon crawler back from the dead? With a Necromancer class, obviously! At least that's what Blizzard Entertainment was apparently thinking when it introduced the class to Diablo 3 in June 2017.

That makes the Necromancer the "youngest" entry on this list, but it's no less deserving of the honor. The damage Necromancers have been dishing out this summer is so crazy that the "best" broken builds change every few weeks. Not long ago the top dog was the Bones of Rathma build, which basically let the Necromancer kick back while an army of skeletons and undead mages did all the hard work. 

Nowadays it's the Grace of Inarius build (which YouTuber Rhykker calls the "Bonestorm" build), which centers on the set's six-piece "Bone Armor" bonus that smacks enemies who get too close with 750 percent weapon damage and boosts the damage they take from the Necro by 2,750 percent. Then the Necro goes around whacking everything with his Cursed Scythe skill with the help of another bonus that reduces his damage taken. Choose the right complementary skills and weapons, and you'll soon be tromping through level 107 Greater Rifts as easily as a katana slicing through yarn. Getting the set pieces will take a bit of grinding, of course, but it's worth it for the payoff. Until, you know, Blizzard nerfs it.
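To get a feel for why those percentages are so silly, here's a hypothetical back-of-the-envelope calculation; the weapon damage figure is made up, and the sketch assumes the two bonuses simply multiply, which is a simplification of Diablo 3's real damage formula:

```python
# Illustrative Bonestorm arithmetic. Assumes straight multiplication,
# a simplification of Diablo 3's actual damage model; the weapon
# damage input is a made-up figure.

def bone_armor_tick(weapon_damage, skill_pct=7.5, taken_bonus_pct=27.5):
    """One Bone Armor hit: 750% weapon damage, amplified by the set's
    +2,750% increased-damage-taken debuff on the target."""
    return weapon_damage * skill_pct * (1 + taken_bonus_pct)

print(bone_armor_tick(1_000))  # a 1,000-damage weapon ticks for 213,750
```

Even under these simplified assumptions, a modest weapon turns into six-figure ticks against anything standing inside the bone cloud, which is roughly the experience the build delivers in practice.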

...