
  (Source: Heavy)
Brilliant game developer says Microsoft cloud computing does not appear to give the Xbox One a significant edge

Gaming legend and id Software Technical Director/cofounder John Carmack, at his QuakeCon 2013 keynote, offered up his thoughts on Microsoft Corp.'s (MSFT) Xbox One and Sony Corp.'s (TYO:6758) PlayStation 4, which will be waging war this holiday season for dominance in the eighth generation of home video game consoles.

He led off by saying that he hadn't done enough head-to-head benchmarks to make direct comparisons between the consoles.  But based on his experience (Mr. Carmack remains actively involved in programming top id Software titles and pioneering new graphics techniques), he says he believes they're "very close" in capabilities and that "they're both very good".

In some ways this comparison is bad news for Microsoft, as it calls into question the company's claims that its console is five times as powerful when connected to the cloud as when processing offline (if anyone would fully leverage the Xbox One's potential, it would likely be Mr. Carmack).


Also bad news for Microsoft is Mr. Carmack's dour assessment of the Kinect sensor.  Aside from concerns regarding "always on" U.S. National Security Agency (NSA) spying (Microsoft reportedly has a major voluntary data-sharing agreement with the NSA), Mr. Carmack offers criticism of the controls themselves, stating, "[The Kinect 2 is] kind of like a zero-button mouse with a lot of latency on it."

The PS4 appears to enjoy a moderate lead in preorders over the Xbox One.

You can watch Mr. Carmack's full keynote below, via YouTube:

For the uninitiated, you may ask why you should listen to Mr. Carmack.  Well, he coded a hardware-optimized build of Wolfenstein 3D for the iPhone in four days, when it was estimated to take a full team of programmers two months to perform a basic (unoptimized) port.

Source: Kotaku


RE: Not actually true, though, is it?
By Mint on 8/3/2013 1:08:06 PM , Rating: 3
over time it will become clear that the better hardware matters
Not when the difference is as marginal as Carmack is insinuating.

Shader performance is only one aspect of graphics. As an example, consider that the 8800GT had 75% more stream processors than the 9600GT, and look at the results:

RE: Not actually true, though, is it?
By Mitch101 on 8/3/2013 1:25:31 PM , Rating: 2
Also take into consideration that it's all DirectX from Microsoft, with some 20 years of optimization behind it. I doubt Sony has anything nearly as efficient or well documented as DirectX, especially on the x86 platform. I'd be confident Microsoft has a performance advantage here if we were able to compare DirectX to Sony's development offering.

RE: Not actually true, though, is it?
By lexluthermiester on 8/4/2013 2:07:01 AM , Rating: 4
So you're saying OpenGL, which has been around at least as long as DirectX, means nothing, eh? Riiight....

RE: Not actually true, though, is it?
By Mitch101 on 8/4/2013 11:33:49 AM , Rating: 3
There was a time when OpenGL was beating DirectX, and you can probably thank Carmack for pointing that out. But what has not happened is OpenGL getting nearly as much love from NVIDIA/ATI/AMD/Intel. Having drivers and having optimized, refined drivers are two different animals. DirectX gets all the attention from the GPU developers, especially on the development front, while OpenGL is treated like the bastard stepchild; ask anyone who does Linux how they feel about graphics driver support. It's not that it's bad, it's quite excellent, but the reality is DirectX is faster, giving Microsoft a performance optimization boost over OpenGL. The last line of the article about sums it up: on AMD, which is what is in these boxes, DirectX is about 20% faster than OpenGL.

26/02/2013 A look at OpenGL and Direct3D performance with Unigine
Hence, either because the OpenGL implementations are generally slow or because streaming assets is slower with OpenGL, rendering with OpenGL is significantly slower than rendering with Direct3D 11, even with nicely crafted code like I expect Unigine to be. However, it appears that Intel's OpenGL implementation is the one performing best, or should I say the least badly.

the performance difference between Direct3D 11 and the OpenGL implementation is roughly 10% for Intel, against about 20% for AMD and 30% for NVIDIA.

By nafhan on 8/5/2013 6:27:45 PM , Rating: 2
Actually, an article showing a performance advantage in a single benchmark doesn't prove much of anything beyond a performance advantage in a single non-game benchmark. This is why review sites tend to put big disclaimers around this type of testing.

Also, OpenGL gets plenty of love from ATI (PS4/Wii), PowerVR (many Android and Apple devices), and Nvidia (many Android devices and their own game console). Don't take this as hate on DirectX - it has definitely evolved into a solid platform.

RE: Not actually true, though, is it?
By FITCamaro on 8/5/2013 1:08:56 PM , Rating: 4
You're also talking about a newer, faster-clocked version of the same cores with just fewer compute units. In this case the GPUs are 100% the same, just with fewer units.

Now, yes, 50% more compute units does not translate into 50% faster performance.

Here is a good kind of "benchmark" to show the kind of real world differences between the chips:

RE: Not actually true, though, is it?
By someguy123 on 8/5/2013 7:24:51 PM , Rating: 3
Performance difference is pretty small here. In instances of high framerate you get about a 10-frame difference, and in low framerate about 2~4. Overall a 19% to 24% difference according to Digital Foundry. That doesn't take into account the cache/memory differences in the two consoles either: less DRAM throughput on the Xbox but very fast low-latency ESRAM, while the PS4 has high GDDR5 throughput but no fast cache outside of the CPU L2.

I'm inclined to believe Carmack here since he was able to deliver pretty similar performance on the ps3 even with a less than stellar memory config compared to the 360 for Rage, so clearly he isn't one to lose performance for the sake of simplicity or fanboyism.
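The percentage figures in the comment above follow from the frame deltas it cites: a fixed frame gap matters proportionally more at low framerates. A minimal sketch of that arithmetic, using illustrative framerate numbers (not measured data) that land in the 19~24% range Digital Foundry is quoted as reporting:

```python
def pct_advantage(base_fps, delta_fps):
    """Percent advantage of the faster console over the slower one,
    given the slower console's framerate and the frame delta."""
    return 100.0 * delta_fps / base_fps

# High-framerate scene: ~10-frame gap on a hypothetical 50 fps baseline.
high = pct_advantage(50, 10)   # 20.0%

# Low-framerate scene: ~4-frame gap on a hypothetical 17 fps baseline.
low = pct_advantage(17, 4)     # ~23.5%

print(f"high-fps scene: {high:.1f}%, low-fps scene: {low:.1f}%")
```

The takeaway is that a "10-frame difference" and a "2~4 frame difference" can represent a similar relative gap once the baseline framerate is accounted for.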

RE: Not actually true, though, is it?
By Reclaimer77 on 8/5/2013 8:44:50 PM , Rating: 2
Overall a 19% to 24% difference according to digital foundry

And you call that "pretty small"?? In an era where people fork over hundreds of dollars for 10 more FPS in a video card, I would say that's pretty significant.

By someguy123 on 8/5/2013 10:10:40 PM , Rating: 2
In an era where people fork over hundreds of dollars for 10 more FPS in a video card, I would say that's pretty significant.

I don't know about that. The 7870 GHz Edition is $209 and the 7850 is $159 on Newegg (both from Sapphire). That's about 15~25% more performance for fifty dollars more. Go up to the 7950 and you start looking at 40~70% more, depending on resolution, for a hundred fifty more.
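The price comparison above can be made concrete with a little perf-per-dollar arithmetic. A small sketch, using the prices from the comment and an assumed 20% performance gain (the midpoint of the quoted 15~25% range):

```python
def price_increase_pct(cheap, expensive):
    """How much more the pricier card costs, as a percentage."""
    return 100.0 * (expensive - cheap) / cheap

def perf_per_dollar(relative_perf, price):
    """Relative performance units bought per dollar."""
    return relative_perf / price

# Prices cited in the comment: 7850 at $159, 7870 GHz at $209.
jump = price_increase_pct(159, 209)          # ~31.4% more money

# Assumed relative performance: 7850 = 1.00, 7870 GHz = 1.20.
ppd_7850 = perf_per_dollar(1.00, 159)
ppd_7870 = perf_per_dollar(1.20, 209)

print(f"price jump: {jump:.1f}%")
print(f"7850 perf/$: {ppd_7850:.5f}, 7870 perf/$: {ppd_7870:.5f}")
```

Under these assumptions the $50 step-up costs ~31% more money for ~20% more performance, so the cheaper card actually wins on perf-per-dollar, which is the usual pattern as you move up a GPU product stack.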

