I recently came back from a two (2) week vacation with my son. The games industry is typically on a hiatus of sorts during the Summer months, and even more so in this console-transition year. This means that gaming news is also sparse during this time. However, you cannot have console launches without console fan-boy flame wars, and there were some tidbits reported in the last weeks that have stirred the fan-boy pot. Specifically, Microsoft announced that it has increased the clock speed of the Xbox One’s GPU by 53MHz, to 853MHz. The second tidbit came from an unofficial but trusted source, and has to do with how much memory these new consoles will allocate for their Operating System overhead and media and social media/game streaming functions. Digital Foundry and Eurogamer reported that game developers could only access 4.5GB of the PS4’s 8GB of GDDR5 memory. This led to flame wars between Xbox One and PS4 fans over which console has the more available memory for actual games and gaming functions. Xbox One fans contended that, since the Xbox One only reserves 3GB of its 8GB pool of DDR3 memory for the Operating System and whatnot, the larger amount of memory available to developers on the system made up for its less powerful GPU (the PS4’s GPU has 1.5x as many GCN Compute Units as the Xbox One’s). Then QuakeCon happened, and John Carmack, ever the voice of reason in hardware-related matters, threw in his preliminary impression that both the Xbox One and PS4 were “about the same” in terms of graphical prowess.
To me this is the proverbial “tempest in a teapot”. When it comes to the Playstation 4 and Xbox One, I don’t care about OS memory overhead, or a slightly more powerful GPU. The GPU is fine on both machines — the AMD GCN (Radeon 8xxx) architecture is a known quantity by now. The GPUs inside these new consoles are not cutting edge by any means, but they are fine, definitely good enough. What I really care about, and what worries me about these new consoles, is the AMD “Jaguar” CPU cores they both share.
As part of my usual “retail therapy” after my Summer vacation, I recently picked up a 2013 Google Nexus 7 (which will be the subject of a future post) and an Acer Aspire V5-122 11-inch touchscreen notebook (I can’t call it an Ultrabook, since that’s an Intel trademark, but it’s the same form-factor.) What’s notable about this inexpensive little notebook, besides its low price for a touchscreen-equipped model, is that it’s based on AMD’s Jaguar APU technology; it’s probably the first product on the market to use it. I’ve been playing around with it over the last couple of days, enough to get a handle on what the Jaguar processor cores can and cannot do. My take on the architecture powering the Xbox One and the Playstation 4? The short version: the GPU is good enough, blazingly fast for what these consoles will require of it, gaming-wise. Unfortunately, the CPU cores are pitifully weak.
In my testing, I put the Acer Aspire V5-122 through the usual tasks a user puts a computer through… word processing, spreadsheets, gaming, web browsing, email and social media, and video playback. Throughout it all, the computer felt slower than my other machines (which run on processors ranging from a Core i3 to my fastest, a Core i7 gaming desktop.) I could feel it struggle under the load when I ran World of Warcraft alongside Firefox (with a maximum of three (3) tabs open) and then switched to the Windows 8 Modern interface to access the Windows 8 Store. Windows 8 apps took longer to open. Copying files took a lot longer than on an AMD Piledriver-based desktop, not to mention a Core i5 or Core i7 machine. It could play back a 1080p video file just fine, thanks to the fast GPU, but the Jaguar CPU cores were maxed out in doing so. This machine could not match my slowest gaming rig, much less the newest Intel Haswell models or even AMD’s own Steamroller A10 APUs. This is not the microprocessor you would want in a next-generation gaming machine.
I can understand why this particular chip was chosen for the Xbox One and Playstation 4: it’s all about cost. Neither Microsoft nor Sony is in a position to lose billions on R&D for custom console chipsets anymore, nor to lose billions more selling gaming hardware at a loss. Microsoft and Sony will not lose money on these machines. But the tradeoffs are dangerous. Unlike the Xbox 360 and the PS3, which were the most powerful gaming hardware ever released at their time, these machines cannot match the performance of the average current gaming PC. AMD cannot do miracles with what Microsoft and Sony were willing to spend: these Jaguar cores have less than one fifth the performance of an Intel Core i7, and less than a quarter the performance of an Intel Core i5. They’re not even half as fast as an AMD FX processor. That’s the reason there are eight (8) of them in the new consoles. Unless game developers get really good at multi-threading, there’s no way they’ll get good performance out of these CPU cores. And we don’t know how many of them will actually be available for running games.
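To make the multi-threading burden concrete, here is a minimal, hypothetical sketch (ordinary Python, not console SDK code): a workload that one fast core could handle serially has to be carved into independent chunks before eight slow cores can share it. The eight-worker split mirrors the consoles’ core count; everything else is illustrative.

```python
from multiprocessing import Pool

# Hypothetical sketch: summing a range is the stand-in workload. A single
# fast core could just run sum(range(n)); eight weak cores only help if
# the work is first split into independent chunks.

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=8):
    # carve [0, n) into one chunk per worker core
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs any remainder
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum(1_000_000) == sum(range(1_000_000)))  # True
```

The catch, of course, is that real game workloads (simulation, AI, streaming) rarely decompose this cleanly, which is exactly why weak-but-numerous cores are a gamble.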
And another worrisome detail about the AMD Jaguar processor, one that hasn’t been previously discussed: Jaguar CPU cores come in clusters or modules of four cores each. The Playstation 4 and Xbox One will have two (2) of these modules each, for a total of eight (8) CPU cores. But communication between the two modules is not, er, ideal. Expect games to only run on one module, and thus have four (4) cores available, with the remaining four (4) cores used for the OS and other, non-game tasks.
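In ordinary OS terms, that scheduling constraint amounts to pinning a game’s threads to one module so they never pay the cross-module penalty. The following is a hypothetical, Linux-only sketch using Python’s `os.sched_setaffinity`; the idea that cores 0–3 correspond to the first Jaguar module is my assumption, not a documented console mapping.

```python
import os
import threading

# Hypothetical sketch: keep all game threads on cores 0-3 -- one assumed
# four-core Jaguar module -- leaving the other module to the OS.
MODULE_0 = {0, 1, 2, 3}

def pin_to_module(cores):
    # os.sched_setaffinity restricts which cores the scheduler may use;
    # it only exists on Linux, so guard it and intersect with the cores
    # this machine actually has.
    if hasattr(os, "sched_setaffinity"):
        allowed = os.sched_getaffinity(0) & cores
        if allowed:
            os.sched_setaffinity(0, allowed)

def game_task(results, i):
    # stand-in for per-frame game work (AI, audio, particle updates)
    results[i] = sum(range(50_000))

pin_to_module(MODULE_0)
results = [0] * 4
threads = [threading.Thread(target=game_task, args=(results, i))
           for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Four worker threads on four cores is the likely practical ceiling if games really are confined to one module.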
What about next-gen physics, game A.I., etc.? Sure, part of those computations could be offloaded to the Playstation 4’s and Xbox One’s GPU, but there’s not much room to grow there… those GPUs are already middle-of-the-road when compared with current gaming PC models. Developers wanting to push the envelope in those areas had better stick with making their games for PCs.
A prime example is Sony’s own Planetside 2. Yesterday, the devs for the Sony Online Entertainment title expressed their frustration at the difficulties they were facing in getting their game to run at acceptable speeds on Playstation 4 hardware. They are essentially rewriting their game to optimize it for multithreaded performance (and, incidentally, said that AMD gaming PC owners would notice a nice performance boost when the PC version gets the updated code in a patch.)
The Playstation 3 and Xbox 360 had plenty of headroom when they were released in 2005/2006. Each year games looked better and better, as game developers learned the intricacies of their architecture, and implemented new techniques and programming tricks to take advantage of them. “The Last of Us” is easily the most graphically impressive game I’ve ever seen, and it’s running on seven (7) year old PowerPC hardware. The Playstation 4 and Xbox One are having trouble running current PC ports right out of the gate. Food for thought… and worry, there.