This is the latest in the series of articles about the art technology of games, in collaboration with the particularly handsome Dead End Thrills.
When Paul Weir gave a talk at GDC 2011 about GRAMPS, the generative audio system he designed for Eidos Montreal’s Thief, the games press took notice. Not so much of the contents, though, or indeed the subject, just Thief. Here, finally, was a chance to get something on this oh so secretive game. Maybe, while prattling on about ‘sounds’ and stuff, he’d toss them a headline or two, get ‘em some clicks. Suspecting as much, Weir recommended to his audience that anyone just there for Thief nooz should probably leave the room. Some people did.
We can often seem deaf to game audio in the same way we’re blind to animation. Maybe it’s because the best examples of both are so natural and chameleonic that they blend into a game’s broader objectives. Maybe it has to be Halo ostentatious or Amon Tobin trendy just to prick up our ears; or make the screen flash pretty colours. Or maybe Brian Eno has to be involved, as we’ll come to in a minute. (more…)
Players from around the world started poking at the cube. There was a stupendous prize inside that would take players, working in unison, ages to unlock. Presumably to alleviate boredom, Curiosity gamers tried to make their mark, by chipping out the shape of some letters or a simple picture.
According to an official time stamp on Twitter, in the wee hours of November 5, some players had actually already gotten into the game. They were cracking blocks. And on day -1, what did they etch into that massive blank block?
Why are people doing this? Oh, this called for an investigative report, of course! This article, which is all about the results of that investigation, is, obviously, NSFW...
After a struggle with the tech folks over a fault in the SimCity press servers, the lost city of Fahey's Folly—aka the SimCity on the Edge of Forever—was found once again. In celebration, I unleashed red-hot dino fury.
Since I'll be using my own Origin account and retail servers for our upcoming SimCity review, Fahey's Folly didn't have long to live anyway. I think destruction by a Spore-looking Godzilla stand-in is what the city would have wanted.
The scaly bugger doesn't do all that much damage anyway—this video is actually the third of three dino attacks. Thanks to the GlassBox engine, any destroyed buildings are quickly replaced once the rubble has been bulldozed. Natural disasters just aren't what they used to be.
EA assures me that the bug that kept me from my city was an issue with the private servers the press preview is being held on, and should not affect retail customers, so hopefully no one should ever again have to lose something they love, get it back, and then kill it.
At least not in SimCity.
But being a PC gamer... as far as I've been able to tell, that happens on purpose. That's something you declare. It's no accident. It's an effort, a conscious act.
I once was a PC gamer. Then I stopped, for years. Soon, I'll start again. I'm ready.
For most of my gaming life, the no-stress ease of the gaming console suited me well. I drive automatic transmission after all, not stick. I don't have any desire to lift the hood of a car. Tinkering is barely a pleasure; maintenance is something to pay others for. I recently installed a ceiling fan and only shocked myself slightly. That was enough home improvement for me.
The PC gamer, I've observed, is the person who will lift the hood of a car. They tinker. They fix. They expect things to not run perfectly and they assume the responsibility to make them run better. The console gamer waits for a patch. The PC gamer finds one. Or makes one.
I was a PC gamer in 1985, when, despite my complaints, my parents bought a Commodore 64. I still recall my bizarre reaction: I complained to my mother that using a computer was "cheating". Strange, I know, but that's how I first came to think of computers. To me, they were shortcut creators. That was their power. We did word processing through Bank Street Writer and practiced typing with a game that involved a wizard whose spell-casts I can still hear in my mind today.
We got a lot of games. Snooper Troops stands out, as does Test Drive and a batch of Accolade adventures. I played Spy Hunter off of a cartridge and Impossible Mission off of a floppy disc. My favorite game was LucasArts' Labyrinth, a text adventure that turned into a graphical adventure based on the Jim Henson movie. But here's the perfect PC gaming twist: We made some games. Basic stuff. My brother and I typed in programming code from Run magazine. I have no idea what we typed in, and I'm sure we never intentionally deviated from the code listed in the magazine. Nevertheless, that was as under-the-hood as I'd ever get.
I liked playing games on a computer, partially because that was the only machine we had games on. We'd had an Odyssey 2, not an Atari, but neglected it by the time the C64 arrived. Maybe it broke. I don't remember. We'd eventually get a Nintendo Entertainment System, later than any of my friends did, and soon we'd have an IBM PC, too (maybe a 286; probably a 386). My brother preferred the computer; I glided toward Mario and Nintendo. He played Microsoft Flight Simulator. A lot. Downstairs, my C64 pulled me back in because we got a modem for it. I logged into a service called Quantum Link, the proto-AOL that included Club Caribe, a LucasArts-looking graphical chat room with avatars and palm trees and whatnot. The gaming diet in my home was typical. We got Tetris on the NES and we got SimCity on the PC.
SimCity became an obsession. SimCity produced the worst gaming purchase decision of my life, when my mother gave me the choice of getting SimCity CD or SimCity 2000. The latter was an actual sequel; a complex improvement over the original. But I asked for CD, a re-packaging of the original game, but with live-action cut-scenes added in (click on the one in this article, if you dare). Thanks, PC gaming: you were making me feel stupid even back then.
We got Myst, of course, and I think I solved all of it. Took notes, even.
Our IBM 486 begat a Pentium 1? 2? I don't recall. One of those went to college with me, along with a copy of SimTower and some helicopter sim. This was 1994, and it was the year I learned about minimum specs and started loathing PC gaming. SimTower only ran well when my tower was one story high. Add more floors and the game started to chug. The helicopter game was smooth during take-off, but not the moment a missile was fired at me. I'd brought a Super Nintendo with me to college as well. It did not cause me these kinds of problems. Yoshi's Island just worked and only slowed down when you hit the fuzzy enemies that were programmed to make it seem like Yoshi had suddenly become drunk.
PC gaming began to piss me off. My computer was sort of new and already couldn't run new games well. I think we bought me a new computer. Soon enough, it was lagging as well, and soon enough I was buying my last PC game. I used my computer in college to write term papers. I used my Super Nintendo as a trade-in for a Nintendo 64. I didn't hear about any PC games that were as cool as Super Mario 64 back then, and, as it's been chronicled, I totally missed Doom.
In my later years in college I worked part-time at a magazine. The art director there was the first person I met who loved Macs. Somehow that led to me goofing off at work sometimes, playing Spaceward Ho!. This art director guy, Ken, raved about a Doom-like game for the Mac called Marathon. I had no interest. I had GoldenEye on my N64. I didn't need any other first-person shooter. The thing I liked about Macs, from the way my friend at work described them, is that they seemed airtight. They seemed hassle-free. They seemed, more or less, like consoles. So when I went to grad school right after college, I got my first laptop. It was a Mac. So was my second, and nary a game ever ran on those machines. Gaming was for my N64, then for my GameCube, my Game Boy Advance, my PlayStation 2, or, briefly, for the Dreamcast I borrowed from a friend.
I'd often hear that PC gaming was better, but when I'd do the math, I'd realize it was also ridiculously more expensive. So I lived without it. I lived without Civilization and without Quake. I lived without Baldur's Gate and without Fallout. I never played Half-Life, never touched Deus Ex. When I only owned a GameCube, I knew enough that I was missing things to spring for a PS2 and then an Xbox, but PC gaming lived on the other side of a wall I could not afford to surmount, not with the fear that as soon as I bought a PC for today's games, I'd discover it couldn't run tomorrow's.
In my second long-term job after grad school, I made a friend who loved Fallout and who pressed a dual copy of Fallout/Fallout 2 into my hands. I don't remember which computer I tried it on. I think I had a Windows-based tower PC at the time, not for games but for word-processing, checking e-mail and using the web. I loaded the first game, liked it, but got stuck. I'd made my hero too mediocre. He couldn't talk his way past some mean guards. Couldn't fight them, either. I backed away from PC gaming again. I moved in with my girlfriend and she went out for a full Saturday once. During that Saturday I went from level 1 to level 12 as a Tauren Druid in World of Warcraft. That was the first and last MMO I'd ever played. Too much work.
My fear of PC gaming persisted. Being a PC gamer would demand too much, I had decided. Too much money. Too much time. Too much work. It was ridiculous to me that just about nothing could run Crysis. I could sleep at night without having played Doom 3. But from time to time I'd hear about a new PC game that must have reminded me of the top-down fun of those old SimCitys. A guy named Peter Molyneux concocted the likes of Black & White and The Movies, sims about being god or a movie mogul, respectively. Games like this started arriving in my mailbox from game publishers who wanted to catch the eye of someone who was now a game reporter. I had no computer that could run things well. Loading a PC game, for me, was like making a new friend, waving to them and then watching them have a heart attack.
PC gaming frustrated me, because I could not make sense of it. Search engines never produced the right solutions to my technical woes. I had the wrong drivers or the wrong graphics card. I didn't know. Maybe more RAM would help, or maybe my processor just sucked. I didn't want to guess if the game I was buying was going to work. I didn't want to always feel that, even if it did, it could run better if only my machine was different. I got an Xbox 360. It didn't give me these headaches.
People began talking about a game called Spore. It was only going to be on PC. I was a game reporter by the time it was close to release. I interviewed people who were making it. So I bought a gaming laptop—yes, a laptop, an acquiescence to the New York city-dweller's lack of space. It ran Spore—the first video game I spent $1500 to play—just fine. Too bad the game wasn't that good.
As I made a name for myself as a video game reporter, the good people behind the Independent Games Festival invited me to judge indie games. They'd send me half-made works of wonderfully imaginative creators. After years of playing console games—after years of never having touched a mod—the first batch of indie games I downloaded to try on my gaming laptop were the rawest games I'd seen in decades. The rawest games I'd seen since those ones my brother and I typed into our C64. I struggled to get some of these games to run. Some were bad; some just badly made. But they fascinated me. I played Braid this way, more than a year before it came out. I played batches of physics games and shooters, some weird adventure games and other creations that were more abstract.
For several years, I only played PC games once a year, when it was time to judge games for the IGF. That was an uptick in my rate of PC gaming. It gave me one thrilling week out of 52 when I was gaming on the frontier rather than on the safe terrain of the Xbox 360, the Wii and the PlayStation 3.
A few years back, I had the sense not to tell the folks at Valve Software that the Steam press account that they gave me—the account that would unlock, for free, the majority of games available on Steam—was something I couldn't really use. I could play some indie games, but I couldn't run a lot of other games from Steam. Or, maybe I could. I wouldn't. I'd fancied myself a gaming omnivore, diving into games on any console or handheld, but I'd made a dietary exception for PC gaming. That was just too much. And, I must admit, I did not mind the signs of PC gaming's decline, because I knew it would leave me fewer gaming platforms to worry about. (To make matters worse/slightly-better, I did use my Steam account to redeem Bejeweled 3, which I liked very much. Defcon, too.)
My gaming laptop is now obsolete. It hasn't been able to run any new games of note in a few years. I've stuck to consoles, handhelds and iOS devices for my gaming since then. In these last few years, I fully lost my ability to call myself a PC gamer, something this chronology shows I let slip away, bit by bit.
I can no longer ignore it, and I now feel as if I am missing an extraordinarily exciting section of gaming. I won't ignore it any longer.
Two years ago, I should have been playing a lot of Minecraft and The Witcher. One year ago, I should have tried Amnesia. This year, I should have been playing DayZ. I couldn't play Star Wars: The Old Republic. I can't play League of Legends. I could Bootcamp my MacBook Air, but I don't think that would be the right way to dive into the world of the so very many fascinating indie games being made for the PC.
My computer gaming diet can't simply consist of the oddly captivating FarmVille 2 that runs now in my browser. I need to try FTL. I need to be ready for Cube World. I need to play a Paradox game, at long last. And I need to be ready, appropriately enough, for a new SimCity.
There is a cardboard box at my feet right now. In it is a brand-new gaming PC. It's a laptop, space still being tight. But it's my ticket back. I don't know if I ever really was one, considering all the classics I missed, but I'm ready, at least to make the effort. I will be a PC gamer. It's finally important to me.
Yoo-hoo, PC gamers, there's a juicy sale going on over at EA's digital distribution shop Origin.
In the Origin sale, you can pick up Mass Effect 1 and Dead Space 1 for the rock-bottom price of £3.
Dead Space 2 will only set you back £7.50, Mass Effect 2 £10.
Dragon Age 2 and Mirror's Edge are £5.
The remaining deals worth noting are Bulletstorm for £7.50 and Spore for £6.
This Origin deal runs for "a limited time only".
Increasingly nebulous mega-brain Will Wright has finally revealed what the hell he’s up to next. He’s spent his post-Spore years working at an outfit he calls StupidFunClub.
He’s working on adapting a short story about a karmic computer, by sci-fi writer and technology ponderer Bruce Sterling, and he reckons he can get it turned around within a year.
Will Wright is making a video game inspired by a short story by science fiction author Bruce Sterling.
The game, which The Sims creator hopes to have up and running in a year, riffs off of the Sterling short story Maneki Neko.
"He describes a karmic computer that's keeping a balance of payments between different people, and causing them to interact with each other in interesting ways to improve their lives even though they're strangers," Wright told Eurogamer in a new interview conducted at E3 in Los Angeles.
"They earn karmic points that are redeemed by having somebody else help them."
At the Game Developers Conference in March Wright announced he had begun work on his first new game projects since 2008's evolution sim Spore.
Wright told Eurogamer that the Sterling-inspired game he's working on is likely set for launch on tablets, smart phones and social networks such as Facebook.
"The rate of change is increasing almost exponentially right now, which means I don't think it makes sense to go through even a three or four year development cycle any more," he said.
"Unless you can get something to market within a year, at least an initial version within a year, you're hosed.
"So that's the new model for development, which has totally changed my thinking. Almost any project I want to work on is going to be something I can at least get some version out there in about a year and then iterate from there."
But that's not all. Wright is working on other games, "one or two" of which are intended for home consoles.
"But most of our work is going to be everything else: PC, tablet, Facebook and mobile."
Wright left EA in 2009 to run entertainment think tank StupidFunClub. It's already launched a user-generated TV show, and plans are in place to manufacture toys.
If you've ever seen Will Wright, the big brain behind games like The Sims, Spore and SimCity, deliver one of his humorous, hypnotizing talks, you're probably going to want to settle in and watch this one too. If you've never had the good fortune to be assaulted by Will Wright's smarts, well, you're in luck.
Earlier this year, Wright spoke at the Summit on Science, Entertainment, and Education, tying together a ton of ideas, anecdotes and thoughts on games, toys, science and a bazillion other things and how they might relate to education. He'll talk about video games and his own experience in development in this roughly 20-minute talk, but he'll also blow your mind in other ways.
If you've got the time to spare to hear Will do his thing, watch the above recording of his Summit speech.