
(Image source: Heavy)
Brilliant game developer says Microsoft cloud computing does not appear to give the Xbox One a significant edge

Gaming legend and id Software Technical Director/cofounder John Carmack, at his QuakeCon 2013 keynote, offered up his thoughts on Microsoft Corp.'s (MSFT) Xbox One and Sony Corp.'s (TYO:6758) PlayStation 4, which will be waging war this holiday season for dominance in the eighth generation of home video game consoles.

He led off by saying that he hadn't done enough head-to-head benchmarks to make direct comparisons between the consoles.  But based on his experience (Mr. Carmack remains actively involved in programming top id Software titles and pioneering new graphics techniques), he believes they're "very close" in capabilities and that "they're both very good".

In some ways this comparison is bad news for Microsoft, as it calls into question the company's claim that its console is five times as powerful when connected to the cloud as when processing offline (if anyone could fully leverage the Xbox One's potential, it would likely be Mr. Carmack).


Also bad news for Microsoft is Mr. Carmack's dour assessment of the Kinect sensor.  Aside from concerns regarding "always on" U.S. National Security Agency (NSA) spying (Microsoft reportedly has a major voluntary data-sharing agreement with the NSA), Mr. Carmack criticizes the controls themselves, stating, "[The Kinect 2 is] kind of like a zero-button mouse with a lot of latency on it."

The PS4 appears to enjoy a moderate lead in preorders over the Xbox One.

You can watch Mr. Carmack's full keynote below, via YouTube:

For the uninitiated, you may ask why one should listen to Mr. Carmack.  Well, he coded a hardware-optimized build of Wolfenstein 3D for the iPhone in four days, when it was estimated that a full team of programmers would need two months to perform a basic (unoptimized) port.

Source: Kotaku


RE: Not actually true, though, is it?
By Mint on 8/3/2013 6:46:15 PM , Rating: 4
"Overall as an average no, but they will slow you down more than anything when they first come into view."

That's completely wrong. When things first come into view, that depends on the bandwidth/latency from the storage media to RAM.

Current consoles have only 512MB of memory. That's why they lag in this area: they can't preload enough into memory. It has nothing to do with slow RAM. Both the PS4 and XB1 have 8GB of RAM and Blu-ray drives. They'll be equal at loading objects into view.
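Quick arithmetic shows why storage, not RAM, dominates when new assets stream in. The throughput figures below are ballpark assumptions (roughly a 6x Blu-ray drive and a laptop-class HDD), not official console specs:

```python
# Rough time to stream a level's worth of assets into RAM from
# different media. Rates are ballpark assumptions, not console specs.
GB = 1e9
media = {
    "Blu-ray (~27 MB/s)": 27e6,
    "HDD (~100 MB/s)":    100e6,
}
payload = 4 * GB  # hypothetical asset payload for one level

times = {name: payload / rate for name, rate in media.items()}
for name, secs in times.items():
    print(f"{name}: {secs:.0f} s to load {payload / GB:.0f} GB")
```

Even the faster medium takes tens of seconds for a few gigabytes, while any modern RAM could absorb the same data in well under a second, so the storage link is the bottleneck either way.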
"The embedded cache of the Xbox won't be enough to compensate for the huge bandwidth gap."

It definitely will be:

I've done my share of graphics programming and worked at ATI at one point as well. If you can offload all framebuffer traffic away from memory, you save well over half of bandwidth needs, and sometimes 90%+ in the most bandwidth-heavy rendering operation: alpha blending.
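A back-of-envelope sketch illustrates why framebuffer traffic dominates during alpha blending. The scene parameters are assumed for illustration (1080p, 60 fps, RGBA8, four full-screen blended layers), not taken from any profiling data:

```python
# Back-of-envelope framebuffer bandwidth for alpha blending.
# Assumed scenario: 1080p at 60 fps, 4 bytes/pixel RGBA8,
# 4 layers of full-screen blended overdraw.
WIDTH, HEIGHT, FPS = 1920, 1080, 60
BYTES_PER_PIXEL = 4
BLEND_LAYERS = 4

pixels_per_frame = WIDTH * HEIGHT
# Each blended layer must READ the destination pixel and WRITE it back.
fb_bytes_per_frame = pixels_per_frame * BLEND_LAYERS * 2 * BYTES_PER_PIXEL
# Each layer also samples roughly one texel per pixel from textures.
tex_bytes_per_frame = pixels_per_frame * BLEND_LAYERS * BYTES_PER_PIXEL

fb_gbps = fb_bytes_per_frame * FPS / 1e9
tex_gbps = tex_bytes_per_frame * FPS / 1e9
total = fb_gbps + tex_gbps
print(f"framebuffer traffic: {fb_gbps:.1f} GB/s "
      f"({fb_gbps / total:.0%} of {total:.1f} GB/s total)")
```

In this toy scenario the read-modify-write framebuffer traffic is two thirds of all memory traffic, so moving it into eDRAM/SRAM frees the majority of external bandwidth, consistent with the "well over half" claim above.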

RE: Not actually true, though, is it?
By retrospooty on 8/3/2013 7:08:58 PM , Rating: 2
Have you worked on FPS games with massive high-res textures, or smaller games with low-res textures designed to fit into low-end systems? Everything I have ever read on this says video RAM bandwidth is a big problem for these large textures, and that is one of the main reasons it needs to get faster with every generation. I don't have personal experience with it, just what I have read over the years. As far as the systems being equal, I certainly hope they are. But regardless of what Carmack said (and, by the way, he did mention that he hadn't had a chance to actually test anything), most devs and hardware engineers are quite sure they are not, including Microsoft, which just bumped up the speed to try to make up for being behind. Either way, we'll have to wait and see when real production systems hit the market.

RE: Not actually true, though, is it?
By Mint on 8/4/2013 7:43:26 PM , Rating: 3
"everything I have ever read on this is that video RAM bandwidth is a big problem for these large textures"

Show me where you read this, because it's just plain wrong.

I haven't coded FPS engines myself, but at ATI I have done advanced performance analysis of AAA games, looking at hardware counters on the chip to determine where the bottlenecks are.

The vast majority of cases where texture bandwidth becomes significant is when you have thrashing (rare, usually due to developer laziness) or when you're doing framebuffer post-processing, which is perfect for eDRAM/SRAM in consoles.

By Strunf on 8/5/2013 10:08:20 AM , Rating: 1
I have read that with heavily modded Skyrim, a certain card (I forget which one) that had part of its memory on much lower bandwidth than the rest suffered a performance drop the moment Skyrim spilled into that lower-bandwidth memory.

Anyway, your theory could easily be tested by underclocking the memory of a graphics card: if what you say is true, then up to a certain point there should be no performance drop from underclocking the memory.

From what I've read, overclocking your memory increases performance, and that wouldn't be the case if the bottleneck were elsewhere.
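The proposed underclocking experiment can be modeled as a simple roofline: frame rate is the minimum of a compute cap and a bandwidth cap, so it stays flat until memory bandwidth drops below the point where it becomes the limiter. All numbers here are purely illustrative, not measured from any real card:

```python
# Toy roofline model of the memory-underclocking experiment.
# COMPUTE_FPS_CAP and BYTES_PER_FRAME are illustrative assumptions.
COMPUTE_FPS_CAP = 60.0   # frame rate when compute-bound
BYTES_PER_FRAME = 2e9    # hypothetical memory traffic per frame

def fps_at(mem_bandwidth_gbps):
    """Frame rate as the lesser of the compute and bandwidth limits."""
    bw_fps = mem_bandwidth_gbps * 1e9 / BYTES_PER_FRAME
    return min(COMPUTE_FPS_CAP, bw_fps)

for bw in (200, 160, 140, 120, 100, 80):
    print(f"{bw:3d} GB/s -> {fps_at(bw):.0f} fps")
```

In this model the frame rate holds at 60 fps all the way down to 120 GB/s, then falls linearly, which is exactly the flat-then-drop signature the experiment would look for.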

By goatfajitas on 8/5/2013 12:07:45 PM , Rating: 1
That is because they aren't using very high resolution textures. They are dumbing it down to the current console level, which is very low-res textures that don't slow anything down.


Copyright 2016 DailyTech LLC.