AMD's dropped an almost unexpected Christmas present into our laps this morning: the launch of the company's latest flagship graphics card, the Radeon HD7970. As well as stealing the 'fastest single chip graphics card' title back from NVIDIA for the time being, the HD7970 is the first card manufactured on a microscopic 28nm process, and the first to use the all-new 'Graphics Core Next' (GCN) architecture.
But what does that mean, and is it any good for gaming?
For the time being, at least, the issue is a little moot. AMD's festive gift turns out to be more like getting vouchers than an actual present. The Radeon HD7970 isn't expected to be available for sale until after January 9th, and priced at around £450 it's not exactly a new year bargain either.
Still, it is one of the more interesting tech developments of the last few months. The HD7970 is the first card that's compatible with the next version of DirectX, 11.1, which will be added into Windows next year. And GCN is quite a big departure for the company.
So far, all of AMD's post-DirectX 10, unified shader cards have used a technique based around combining computations into 'Very Long Instruction Words' (VLIW), executed by clusters of four or five processing cores. With GCN, however, the company has broken completely from this design and adopted one more similar to that used by rival NVIDIA: each processing core is a more or less autonomous unit capable of running a single instruction at a time. This scalar architecture isn't necessarily better for gaming graphics operations, but it is arguably more efficient for the other stuff today's graphics cards are supposed to do, like physics and GPGPU supercomputing.
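To see why AMD might want to move away from VLIW, consider what the compiler has to do in that design: pack several *independent* operations into each instruction word, leaving slots empty whenever operations depend on one another. The toy Python sketch below (everything in it is invented for illustration; real GPU instruction scheduling is far more involved) shows how a dependent chain of operations wastes most of a VLIW bundle, while a scalar design simply issues one operation per core per cycle regardless:

```python
def vliw_schedule(ops, width=4):
    """Greedily pack independent ops into VLIW bundles of `width` slots.
    Each op is (destination, set_of_source_registers)."""
    bundles, current = [], []
    for dest, srcs in ops:
        writes = {d for d, _ in current}
        # Start a new bundle if this one is full, or if the op reads
        # a value written earlier in the same bundle (a dependency).
        if len(current) == width or srcs & writes:
            bundles.append(current)
            current = []
        current.append((dest, srcs))
    if current:
        bundles.append(current)
    return bundles

# A dependent chain a->b->c->d can't be packed: one op per bundle,
# so three of the four slots sit idle every cycle.
chain = [("a", set()), ("b", {"a"}), ("c", {"b"}), ("d", {"c"})]
print(len(vliw_schedule(chain)))   # 4 bundles for 4 ops

# Four fully independent ops fill a single bundle.
indep = [("a", set()), ("b", set()), ("c", set()), ("d", set())]
print(len(vliw_schedule(indep)))   # 1 bundle
```

GPGPU workloads tend to look more like the dependent chain than the independent batch, which is one reason a scalar, one-instruction-per-core design can end up using its hardware more efficiently.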
There are still a lot of shader cores though. The Radeon HD7970 boasts 2048 versus the older HD6970's 1536, and runs them at a massive 925MHz. Coupled with 3GB of GDDR5 memory clocked at an effective 5.5GHz over a 384-bit interface, the new card packs a truly phenomenal amount of processing power into its 4.31 billion transistors.
So what is GCN capable of? In terms of new features, the biggest difference is support for something AMD is calling 'Partially Resident Textures' (PRT), which is a hardware implementation of the megatexturing technique pioneered by id Software. This reduces the amount of memory bandwidth used up by transferring textures around the system, as it allows games designers to create very large, high-resolution images which the GPU then divides into smaller tiles to work with as and when required.
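The idea behind PRT can be sketched in a few lines of Python. This is a toy model of on-demand tiling, not AMD's actual implementation (the class name, tile size and eviction policy here are all invented for illustration): a huge virtual texture is split into fixed-size tiles, and only the tiles actually sampled get loaded into a small resident pool.

```python
from collections import OrderedDict

TILE = 128  # tile edge in texels (illustrative value, not AMD's)

class PartiallyResidentTexture:
    """Toy model of PRT: a huge virtual texture where only the
    tiles actually sampled are kept in a small resident pool."""

    def __init__(self, width, height, pool_tiles):
        self.width, self.height = width, height
        self.pool_tiles = pool_tiles    # max tiles resident at once
        self.resident = OrderedDict()   # (tx, ty) -> tile data, in LRU order
        self.loads = 0                  # count of tiles streamed in

    def _load_tile(self, tx, ty):
        # Stand-in for streaming the tile from disk or system memory.
        self.loads += 1
        return f"tile({tx},{ty})"

    def sample(self, x, y):
        key = (x // TILE, y // TILE)
        if key in self.resident:
            self.resident.move_to_end(key)         # mark most recently used
        else:
            if len(self.resident) >= self.pool_tiles:
                self.resident.popitem(last=False)  # evict least recently used
            self.resident[key] = self._load_tile(*key)
        return self.resident[key]

# A 16k x 16k virtual texture, but only 4 tiles ever held in memory.
tex = PartiallyResidentTexture(16384, 16384, pool_tiles=4)
tex.sample(0, 0)      # miss: tile streamed in
tex.sample(10, 10)    # hit: same tile, no transfer
tex.sample(4000, 0)   # miss: a different tile
print(tex.loads)      # 2 loads despite 3 samples
```

The bandwidth saving comes from exactly this effect: nearby samples hit the same resident tile, so the full-size texture never has to cross the bus.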
Performance-wise, the last-minute rush to get this announcement out before Christmas means we've not had a card in the office yet, so we can't say definitively whether or not AMD's new architecture works. But reviews from around the web suggest that the HD7970 is between 15 and 30% quicker than previous cards.
Here's what the tech sites are saying.
Our colleagues at TechRadar seem vaguely unimpressed with the HD7970, but largely because recent price cuts have made NVIDIA's GTX580 almost £100 cheaper.
Anandtech, meanwhile, is impressed by the performance of the HD7970, especially its lead at the highest resolutions. But again the worry is about price - specifically the relatively good value of the dual-chip HD6990.
Tom's Hardware hedges its bets and calls its benchmark fest a preview, but says it prefers the HD7970 to a dual-chip card for the price. The hope here is that non-reference designs are a bit quieter than the stock samples.
HardOCP really likes the HD7970, very enthusiastically so and with exclamation marks. There are two apparent reasons: firstly, it draws no more power than the older Radeon series, and secondly, it's a single chip card capable of powering a three-screen set-up (says the reviewer).
What everyone agrees on is that this is a big shift in strategy for AMD, which in recent years has built cards that compete more on value than out-and-out performance, backed by really good Crossfire scaling. The Radeon HD7970 is a throwback to the old days of monstrous cards and monstrous performance - but at a monstrous price. If you're prepared to pay it, it looks like a winner.
We'll be able to confirm or deny that when we get hardware in the new year.