Brilliant game developer says Microsoft cloud computing does not appear to give the Xbox One a significant edge

Gaming legend and id Software Technical Director/cofounder John Carmack, at his QuakeCon 2013 keynote, offered up his thoughts on Microsoft Corp.'s (MSFT) Xbox One and Sony Corp.'s (TYO:6758) PlayStation 4, which will be waging war this holiday season for dominance in the eighth generation of home video game consoles.

He led off by saying that he hadn't done enough head-to-head benchmarks to make direct comparisons between the consoles.  But based on his experience (Mr. Carmack remains actively involved in programming top id Software titles and pioneering new graphics techniques), he says he believes they're "very close" in capabilities and that "they're both very good".

In some ways this comparison is bad news for Microsoft, as it calls into question the company's claims that its console is five times as powerful when connected to the cloud as when processing offline (if anyone would fully leverage the Xbox One's full potential, it would likely be Mr. Carmack).


Also bad news for Microsoft is Mr. Carmack's dour assessment of the Kinect sensor.  Aside from concerns regarding "always on" U.S. National Security Agency (NSA) spying (Microsoft reportedly has a major voluntary data-sharing agreement with the NSA), Mr. Carmack offers criticism of the controller itself, stating, "[The Kinect 2 is] kind of like a zero-button mouse with a lot of latency on it."

The PS4 appears to enjoy a moderate lead in preorders over the Xbox One.

You can watch Mr. Carmack's full keynote below, via YouTube:


For the uninitiated, you may ask why you should listen to Mr. Carmack.  Well, he coded a hardware-optimized build of Wolfenstein 3D for the iPhone in 4 days, when it was estimated to take a full team of programmers two months to perform a basic (unoptimized) port.

Source: Kotaku



Comments

Not actually true, though, is it?
By piroroadkill on 8/3/2013 5:49:40 AM , Rating: 5
The PS4 simply has a more powerful GPU. Whatever your opinion on the massive extra RAM bandwidth, it has a more powerful GPU.

He's clearly saying it because he doesn't want to burn bridges with Microsoft in case they're the ones handing him cheques later on.

But they're basically extremely similar consoles. Except the PS4 has a significantly more powerful GPU.




RE: Not actually true, though, is it?
By Azethoth on 8/3/2013 6:02:10 AM , Rating: 2
This is the reality: companies are gonna build to the slightly lower XBone specs and not bother using the full PS4 specs except for exclusives.

So yeah, "they are basically the same".

On the other hand, the Wii U is already on the scrapheap of history and performance.


RE: Not actually true, though, is it?
By BZDTemp on 8/3/2013 6:32:34 AM , Rating: 4
Correction - SOME companies will build for the slowest platform. However others will keep making exclusive games for their preferred platform and over time it will become clear that the better hardware matters.

Still, much more important are the ideas put into the games, and that is where Sony's focus on the PS4 being a gaming machine, rather than a Trojan box for your living room, matters. Just look at the games announced for the two platforms, or for that matter how much more interesting and diverse the games for PlayStation are in the current generation.


RE: Not actually true, though, is it?
By Mint on 8/3/2013 1:08:06 PM , Rating: 3
quote:
over time it will become clear that the better hardware matters
Not when the difference is as marginal as Carmack is insinuating.

Shader performance is only one aspect of graphics. As an example, consider that the 8800GT had 75% more stream processors than the 9600GT, and look at the results:
http://techreport.com/review/14168/nvidia-geforce-...


RE: Not actually true, though, is it?
By Mitch101 on 8/3/2013 1:25:31 PM , Rating: 2
Also take into consideration it's all DirectX from Microsoft, with somewhere around 20 years of optimization behind it. I doubt Sony has anything nearly as efficient or as well documented as DirectX, especially on the x86 platform. I would be absolutely positive that Microsoft has a performance advantage here if we were able to compare DirectX to Sony's development offering.


RE: Not actually true, though, is it?
By lexluthermiester on 8/4/2013 2:07:01 AM , Rating: 4
So you're saying OpenGL, which has been around at least as long as DirectX, means nothing, eh? Riiight....


RE: Not actually true, though, is it?
By Mitch101 on 8/4/2013 11:33:49 AM , Rating: 3
There was a time when OpenGL was exceeding DirectX, and you can probably thank Carmack for pointing that out, but OpenGL doesn't get nearly as much love from NVIDIA/ATI/AMD/Intel these days. Having drivers and having optimized, refined drivers are two different animals. DirectX gets all the attention from the GPU developers, especially on the development front, while OpenGL is treated like the bastard stepchild; ask anyone who does Linux how they feel about graphics driver support. It's not that it's bad, it's quite excellent, but the reality is DirectX is faster, giving Microsoft a performance optimization boost over OpenGL. The last line of the article below about sums it up: on AMD, which is what is in these boxes, Direct3D is about 20% faster than OpenGL.

26/02/2013 A look at OpenGL and Direct3D performance with Unigine
http://www.g-truc.net/post-0547.html
Hence, either because the OpenGL implementations are generally slow or because streaming assets is slower with OpenGL, rendering with OpenGL is significantly slower than rendering with Direct3D 11 even with nicely crafted code like I expect Unigine to be. However, it appears that Intel OpenGL implementation is the one performing best or should I say the less badly.

the performance differences with Direct3D 11 and OpenGL implementation is roughly 10% for Intel against about 20% for AMD and 30% for NVIDIA.


By nafhan on 8/5/2013 6:27:45 PM , Rating: 2
Actually, an article showing a performance advantage in a single benchmark doesn't prove much of anything beyond a performance advantage in a single non-game benchmark. This is why review sites tend to put big disclaimers around this type of testing.

Also, OpenGL gets plenty of love from ATI (PS4/Wii), PowerVR (many Android and Apple devices), and Nvidia (many Android devices and their own game console). Don't take this as hate on DirectX - it has definitely evolved into a solid platform.


RE: Not actually true, though, is it?
By FITCamaro on 8/5/2013 1:08:56 PM , Rating: 4
You're also talking about a newer, faster-clocked version of the same cores with just fewer compute units. In this case the GPUs are 100% the same architecture, just with fewer units.

Now, yes, 50% more compute units does not translate into 50% faster performance.

Here is a good kind of "benchmark" to show the kind of real world differences between the chips:
http://www.youtube.com/watch?feature=player_embedd...


RE: Not actually true, though, is it?
By someguy123 on 8/5/2013 7:24:51 PM , Rating: 3
The performance difference is pretty small here. In instances of high framerate you get about a 10 frame difference, and in low framerate about 2~4. Overall a 19% to 24% difference according to digital foundry. That doesn't take into account the cache/memory differences in the two consoles either: less DRAM throughput on the Xbox but very fast, low-latency ESRAM, while the PS4 has high GDDR5 throughput but no fast cache outside of CPU L2.
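(Rough arithmetic on why a 10-frame gap and a 2~4-frame gap land in roughly the same ballpark percentage-wise; the baseline framerates here are example numbers I picked, not Digital Foundry's measurements:)

    # A fixed percentage gap shows up as a large absolute frame gap at high
    # framerates and a small one at low framerates.
    for base_fps, gap in ((50, 10), (15, 3)):
        print(f"{gap} fps gap on a {base_fps} fps baseline = {gap / base_fps:.0%}")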

I'm inclined to believe Carmack here since he was able to deliver pretty similar performance on the ps3 even with a less than stellar memory config compared to the 360 for Rage, so clearly he isn't one to lose performance for the sake of simplicity or fanboyism.


RE: Not actually true, though, is it?
By Reclaimer77 on 8/5/2013 8:44:50 PM , Rating: 2
quote:
Overall a 19% to 24% difference according to digital foundry


And you call that "pretty small"?? In an era where people fork over hundreds of dollars for 10 more FPS in a video card, I would say that's pretty significant.


By someguy123 on 8/5/2013 10:10:40 PM , Rating: 2
quote:
In an era where people fork over hundreds of dollars for 10 more FPS in a video card, I would say that's pretty significant.


I don't know about that. The 7870 GHz is $209 and the 7850 is $159 on Newegg (both from Sapphire). That's about 15~25% more performance for fifty dollars more. Go up to the 7950 and you start looking at 40~70% territory, depending on resolution, for a hundred fifty more.


By Azethoth on 8/3/2013 8:32:55 PM , Rating: 2
You say that you are correcting me, but then you just repeat what I said: some titles geared to the slowest box that ship for both, and some titles exclusive to a box.

The ideas behind games and the execution are more important, as you say. However, it is rather premature to claim that Sony will have more diverse and interesting stuff. Happy to argue the point in 10 years' time though.


RE: Not actually true, though, is it?
By NellyFromMA on 8/5/2013 2:53:46 PM , Rating: 2
You mean like it became clear that the PS3 was the better console because it had superior hardware? Oh, wait...


By SPOOFE on 8/5/2013 6:02:16 PM , Rating: 2
To be fair, the things that Cell was "better" at didn't include "gaming".

Sony clearly learned their lesson about trying to wow the audience with overly-fancy-shmancy techno-whizbang. Cell was a lot of R&D that didn't pay off for them, and I believe they went with a much more conventional, essentially off-the-shelf solution this time around for that very reason.


RE: Not actually true, though, is it?
By retrospooty on 8/3/2013 9:49:24 AM , Rating: 1
The GPUs are quite similar; the difference is the massive gap in memory bandwidth. Today's video cards all use extremely fast RAM to quickly handle the loads that regular system RAM cannot. The PS4 went with 100% high-speed RAM: all 8GB is high-speed, video-card-class RAM. The XBO uses normal, slower system RAM and, to compensate a bit for that, has some embedded ESRAM on the chip. This will compensate for most smaller items...

The main effect of this is large high-res textures. If you look at the same game on PC vs console (XB360 and PS3) and see that the console is blurry and nothing looks sharp, even a square wall, it is because the textures are FAR lower quality. On the new consoles, the PS4 will handle large high-res textures fast and fluidly and the XBO will not. Hopefully, coders will simply put in a higher-res version of the same textures on PS4, rather than code for the slowest. In other words, we could have the exact same game with higher-res textures on the PS4... in theory. I don't know that they will actually do that, just that they could (and should).
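To put rough numbers on that gap, here's a quick sketch of how peak bandwidth follows from bus width and per-pin data rate. The speeds are the commonly reported spec-sheet figures, so treat them as my assumptions rather than anything from this article:

    # Peak bandwidth (GB/s) = (bus width in bits / 8) * per-pin data rate (Gbps).
    # Assumed figures: PS4 GDDR5 at 5.5 Gbps, XBO DDR3-2133, both on 256-bit buses.
    def peak_bandwidth_gb_s(bus_bits, gbps_per_pin):
        return bus_bits / 8 * gbps_per_pin

    print(f"PS4 GDDR5: {peak_bandwidth_gb_s(256, 5.5):.0f} GB/s")    # ~176 GB/s
    print(f"XBO DDR3 : {peak_bandwidth_gb_s(256, 2.133):.1f} GB/s")  # ~68 GB/s
    # The XBO's 32MB of embedded ESRAM adds extra on-chip bandwidth on top of the DDR3.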


RE: Not actually true, though, is it?
By Mint on 8/3/2013 12:41:19 PM , Rating: 4
Textures aren't a major source of bandwidth consumption, though, because they can use lossy compression and more often than not undergo magnification (spreading texel bits over several pixels). The framebuffer can only use lossless compression, and needs far more data per pixel, especially during alpha blending.

That's why eDRAM/SRAM can make such a big impact. 1080p, 2xAA, R9G9B9E5 HDR can fit into 32MB, so the most frequently accessed data stays in high-speed memory. Many games are doing deferred rendering with post-processed AA, and that allows 1080p with 12 bytes per pixel (on top of Z) to live entirely in the ESRAM.
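Quick back-of-envelope check of those figures (assuming 32-bit R9G9B9E5 color and a 32-bit depth buffer, which is my reading of the formats, not something stated above):

    PIXELS = 1920 * 1080
    ESRAM  = 32 * 1024 * 1024  # 32 MiB of embedded memory

    forward  = PIXELS * 2 * (4 + 4)  # 2xAA: 4-byte color + 4-byte depth per sample
    deferred = PIXELS * (12 + 4)     # 12-byte G-buffer + 4-byte depth per pixel

    for name, size in (("1080p 2xAA forward", forward), ("1080p deferred", deferred)):
        print(f"{name}: {size / 2**20:.1f} MiB, fits in ESRAM: {size <= ESRAM}")
    # Both come out around 31.6 MiB, i.e. just inside the 32MB of on-chip memory.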

And in case anyone is going to mention 4k support, you'd have to really dumb down the graphics load per pixel for these GPUs to handle that niche market, so it's a meaningless point.


By Mitch101 on 8/3/2013 1:34:44 PM , Rating: 2
The only way I see either of them getting away with 4K is through sorcery, not actual rendering. It would have to be done with something like the way the Faroudja chipset upscaled content: by comparing the line above and below and rendering the in-between line with what it believes the data would be. Really, post-processed pseudo-4K. They may be able to stream 4K, but not game in true 4K without sacrificing a lot.


RE: Not actually true, though, is it?
By retrospooty on 8/3/2013 1:49:33 PM , Rating: 2
Overall, as an average, no, but they will slow you down more than anything when they first come into view. That is the main reason you need such fast RAM on video cards, and that is where the current consoles severely lack. I am glad the PS4 has fast GDDR RAM. The embedded cache of the Xbox won't be enough to compensate for the huge bandwidth gap. It may not be too far behind, but it will be behind. We will have to wait and see by exactly how much.


RE: Not actually true, though, is it?
By Mint on 8/3/2013 6:46:15 PM , Rating: 4
quote:
Overall as an average no, but they will slow you down more than anything when they first come into view.
That's completely wrong. When things first come into view, that depends on bandwidth/latency from the storage media to RAM.

Current consoles have only 512MB of memory. That's why they lack in this area: they can't preload enough into memory. It has nothing to do with slow RAM. Both the PS4 and XB1 have 8GB of RAM and Blu-ray drives. They'll be equal at loading objects into view.
quote:
The embedded cache of the Xbox won't be enough to compensate for the huge bandwidth gap.
It definitely will be:
http://www.eurogamer.net/articles/digitalfoundry-x...

I've done my share of graphics programming and worked at ATI at one point as well. If you can offload all framebuffer traffic away from memory, you save well over half of bandwidth needs, and sometimes 90%+ in the most bandwidth-heavy rendering operation: alpha blending.
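For a feel of why blending is so bandwidth-hungry, here's an illustrative sketch; the overdraw factors and per-fragment byte count are assumptions picked for the example, not measured console figures:

    # Each alpha-blended fragment does a color read + color write (4 + 4 bytes)
    # plus a depth read (4 bytes); traffic scales linearly with overdraw.
    PIXELS, FPS, BYTES_PER_FRAGMENT = 1920 * 1080, 60, 12

    def blend_traffic_gb_s(overdraw):
        return PIXELS * FPS * overdraw * BYTES_PER_FRAGMENT / 1e9

    for overdraw in (1, 4, 16, 32):  # heavy particle effects push overdraw way up
        print(f"overdraw {overdraw:>2}x: {blend_traffic_gb_s(overdraw):5.1f} GB/s")
    # At high overdraw this alone can rival a DDR3 bus, which is exactly the
    # traffic that on-chip eDRAM/ESRAM is meant to absorb.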


RE: Not actually true, though, is it?
By retrospooty on 8/3/2013 7:08:58 PM , Rating: 2
Have you worked on FPS games with massive high-res textures, or smaller games with low-res textures designed to fit into low-end systems? Everything I have ever read on this says that video RAM bandwidth is a big problem for these large textures, and that is one of the main reasons why it needs to be faster and faster with every generation. I don't have personal experience with it, just what I have read over the years. As far as the systems being equal, I certainly hope they are. But regardless of what Carmack said (and by the way, he did mention that he didn't have a chance to actually test anything at all), most devs and h/w engineers are quite sure they are not, including Microsoft, who just bumped up the speed to try and make up for being behind. Either way, we'll have to wait and see when real production systems hit the market.


RE: Not actually true, though, is it?
By Mint on 8/4/2013 7:43:26 PM , Rating: 3
quote:
everything I have ever read on this is that video RAM bandwidth is a big problem for these large textures
Show me where you have read this, because it's just plain wrong.

I haven't coded FPS engines myself, but at ATI I have done advanced performance analysis of AAA games, looking at hardware counters on the chip to determine where the bottlenecks are.

The vast majority of cases where texture bandwidth becomes significant is when you have thrashing (rare, usually due to developer laziness) or when you're doing framebuffer post-processing, which is perfect for eDRAM/SRAM in consoles.


By Strunf on 8/5/2013 10:08:20 AM , Rating: 1
I have read that when running Skyrim with many mods, a certain card (I forget which one) that had part of its memory on much lower bandwidth than the rest showed a performance drop the moment Skyrim used that lower-bandwidth memory.

Anyway, your theory could easily be tested by underclocking the memory of a graphics card. If what you say is true, then up to a certain point there should be no performance drop from underclocking the memory.

From what I've read, overclocking your memory will increase performance, and that wouldn't be the case if the bottleneck were elsewhere.


By goatfajitas on 8/5/2013 12:07:45 PM , Rating: 1
That is because they aren't using very high-resolution textures. They are dumbing it down to the current console level, which is very low-res textures that don't slow anything down.


By FITCamaro on 8/5/2013 1:18:05 PM , Rating: 3
GDDR5 has more bandwidth. DDR3 has lower latencies for quick access. There are tradeoffs on both sides.


RE: Not actually true, though, is it?
By B3an on 8/3/2013 10:34:54 AM , Rating: 4
quote:
He's clearly saying it because he doesn't want to burn bridges with Microsoft in case they're the ones handing him cheques later on.

BS. Carmack is always honest about this stuff. If you knew anything about him you'd know that.

I'm sure he's also taking into account Microsoft's dev tools and DX features, which he has mentioned on many occasions are much better (even though his games are OpenGL, he admits DX is now better).


By Mitch101 on 8/3/2013 11:54:25 AM , Rating: 3
So true. Carmack is one of the few developers to have supported OpenGL early on, if anything bucking Microsoft.

Who are we to believe: John Carmack, a true developer of game engines, or a bunch of people on a blog?

The PC is the most dominant DirectX game development platform of all time, and the XBONE is a perfectly optimized DirectX platform. Sony would have to bump up specs to match how well optimized DirectX is. How many decades has Sony had to try and optimize a development platform for x86 processors?


RE: Not actually true, though, is it?
By piroroadkill on 8/3/2013 3:39:53 PM , Rating: 2
Really? You're going to do this?

I know a lot about Carmack. I've read a book about him, seen interviews with him, I have a lot of respect for him.

Dev tools do NOT come into this. We're talking about whether the platforms are neck-and-neck in power terms.


By SPOOFE on 8/4/2013 4:46:53 PM , Rating: 4
quote:
We're talking about whether the platforms are neck-and-neck in power terms.

Actually, it looks a lot more like you're accusing someone of being paid to say things he doesn't believe. That's a hell of an uphill battle for you.


RE: Not actually true, though, is it?
By stm1185 on 8/3/2013 12:42:15 PM , Rating: 2
Let's put these weak-sauce console GPUs in perspective, so the PS4 fanboys can understand how gimped it really is. The now year-old GPU in my computer, which has the same architecture as the PS4 and Xbox One, has 2048 cores at 1100MHz instead of 768 cores at 852MHz or 1152 cores at 800MHz. Both console GPUs are low end.
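For context, converting those core counts and clocks into theoretical peak throughput (a rough sketch assuming GCN's 2 FLOPs per core per clock, using the clocks quoted above):

    def peak_gflops(cores, mhz):
        return cores * 2 * mhz / 1000  # 2 FLOPs (one FMA) per core per clock

    print(f"Desktop GPU (2048 @ 1100MHz): {peak_gflops(2048, 1100):.0f} GFLOPS")  # ~4506
    print(f"PS4 GPU     (1152 @  800MHz): {peak_gflops(1152, 800):.0f} GFLOPS")   # ~1843
    print(f"XBO GPU     ( 768 @  852MHz): {peak_gflops(768, 852):.0f} GFLOPS")    # ~1309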

Yet do games look massively better on my PC than the current consoles? No, not really.

Multiplatform games will end up looking roughly the same on each platform as they do now. First party games might look better on the PS4, but then MS does have cloud compute, and their own studios might be able to utilize it properly.

But the TRUE reason to buy one console over another, or at all, is the exclusive games.

Halo vs Killzone, Forza vs GT... every other argument is fanboy nonsense.


RE: Not actually true, though, is it?
By Reclaimer77 on 8/4/2013 2:46:49 PM , Rating: 2
Argh, again?

How many times does this have to be explained? Yes, PCs have superior GPUs and hardware. Why? Because the PC is very inefficient at rendering graphics, so it needs that hardware. Mostly due to massive software overhead, because of compatibility concerns: PC hardware has to work with thousands of different hardware/software combinations, meaning it's always poorly optimized for the task at hand.

Yes, we've been fortunate that we can throw tons of silicon and horsepower at the problem. But the problem still remains.

Consoles don't need as powerful a GPU or hardware, because they are purpose-built closed systems. The PS4, for example, is far more efficient at rendering graphics than our PC's.

I'm a PC "fanboi", but I just had to call you out here. The PS4 is certainly not gimped in delivering what it's designed to do.


By boeush on 8/4/2013 9:59:54 PM , Rating: 1
PCs aren't nearly as inefficient as you're trying to pretend. Compatibility concerns are largely a non-issue, because only the drivers relevant to actually installed hardware are loaded; most of the compatibility-related bloat manifests only on the mass storage footprint of the OS, not on any runtime performance. And each hardware vendor provides specifically optimized drivers for their respective hardware, so your claim of "poorly optimized" is BS. It is particularly BS in the case of graphics drivers and graphics hardware, as everyone knows how hard NVIDIA and AMD (and recently, Intel) work to squeeze every last flop of performance out of their GPUs.

The real issue (if any) with PC performance is that people load their PCs up with tons of concurrently running processes, which can have severe impacts on available memory and CPU/GPU cycles, and which can result in uneven performance/framerates as the background tasks vary their demands on the system. PCs are multitasking machines; consoles are largely intended for single-tasking.

Another thing with consoles vs. PCs is the power envelope. PCs can easily drain nearly 1 KW of power between their high-end CPUs and multi-GPU configurations, not to mention the rest of the hardware. Consoles have to be compact without needing a ridiculous cooling system. That's the main reason they're so much weaker than PCs.

And in terms of how truly weak these "next-gen" consoles really are, consider that they're already more than a year behind PCs in performance, and they haven't yet even formally launched. Then consider that they will continue to ship in their upcoming weak-ass configuration, for the next 5 years or so at a minimum -- while PCs will continue to roughly double their performance every two years in the meantime. Hell, within three or four years, I wouldn't be surprised if there are cell phone SOCs (built on ~10 nm processes) with more powerful graphics than these consoles.

In the meantime, the world will have moved on to 4k and other high-resolution multi-screen setups for gaming and media consumption...

The sad part is, these weak-ass consoles will tend to hold back the state of the art in gaming for the rest of the tech landscape -- so the much more powerful PCs will see their super-duper hardware basically going to waste in the meantime. It may even come to a point where people start seeing the power of PCs as unnecessary in a practical sense; to a large extent it's already happening with the shift to mobile/tablet/phablet systems. It's not because PCs are somehow weak or inefficient; it's because there's a dearth of software to actually harness all that power.


By retrospooty on 8/5/2013 12:11:52 AM , Rating: 2
"Yet do games look massively better on my PC then the current consoles? No, not really"

Then you're doing it wrong. Even the old games like COD4 look immensely better on a PC. Current consoles textures are atrocious. Fortunately both new ones should be much better at that.


By wallijonn on 8/5/2013 1:14:43 PM , Rating: 2
quote:
the TRUE reason to buy one console over another, or at all, is the exclusive games.


That may be the ultimate reason, but the XB1's all-seeing-and-hearing eye may turn off many.

Last gen the PS3 wouldn't play many games beyond 720p, and games could be saved to the XB360. Those were major advantages for the XB360.

As far as exclusives go chances are that one can live without them when the majority of games are on both consoles. But if EA decides not to make games for the PS4 because they won't implement DRM, I'm buying a PS4.


RE: Not actually true, though, is it?
By Arsynic on 8/5/2013 10:21:27 AM , Rating: 1
The PS3 supposedly had a GPU that was 2X the power of X360. How did that turn out? Turns out that Sony and Nvidia lied about RSX specs. Even with the "power of the Cell", PS3 barely keeps up with X360.

Moral of the story: these systems are more than one or two cherry-picked components to compare. For example, MS keeps talking about keeping the GPU well fed via the ESRAM. How does GPU efficiency affect this comparison?

In the real world, XONE launch games look just as good (or even better in some cases) as PS4 launch games. This shouldn't be the case if the PS4 is 50% faster with more mature dev tools (according to select devs). And don't give me the "untapped power" BS. Both console GPUs are almost two years old and devs have experience working with them unlike previous generations where both consoles had new architectures.

In the real world however, we're seeing PS4 games running sub-1080p at 30 fps where the same genre on XONE is running in 1080p at 60 fps.


By retrospooty on 8/5/2013 11:19:34 AM , Rating: 3
"In the real world, XONE launch games look just as good (or even better in some cases) as PS4 launch games."

"In the real world however, we're seeing PS4 games running sub-1080p at 30 fps where the same genre on XONE is running in 1080p at 60 fps."

Where is this real world you are referring to? Neither is released. If you are referring to pre-release testing, show your links.


"This week I got an iPhone. This weekend I got four chargers so I can keep it charged everywhere I go and a land line so I can actually make phone calls." -- Facebook CEO Mark Zuckerberg














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki