


New Kinect-free edition unlocks memory bandwidth reserved for Microsoft's camera sensor

Given Microsoft Corp.'s (MSFT) supply struggles, sales struggles, and privacy concerns regarding the Xbox One's "Kinect" video and audio sensor, it's perhaps unsurprising that Microsoft decided to at last offer a cheaper ($399) SKU of its console without the sensor.  Now that decision has brought about an interesting twist -- Xbox One users who ditch the Kinect may be able to enjoy better graphics performance.
 
It turns out that Microsoft's always-on sensor reserves as much as 10 percent of its graphics hardware's memory bandwidth and processing resources for watching its user(s).  A Microsoft spokesperson told Eurogamer:

Yes, the additional resources allow access to up to 10 per cent additional GPU performance.  We're committed to giving developers new tools and flexibility to make their Xbox One games even better by giving them the option to use the GPU reserve in whatever way is best for them and their games.

According to Eurogamer and The Verge, Microsoft will for the first time allow game developers to use these freed-up resources to enable special "Kinect-free" enhanced graphics settings.  While that's good news for Xbox One gamers, it also adds an interesting nuance to an already heated debate.
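As a rough illustration (my own back-of-the-envelope figures, not from the article or Microsoft): if a game were entirely GPU-bound, reclaiming the full 10 percent reserve would translate to at most a proportional framerate bump.

```python
# Hypothetical sketch: best-case effect of reclaiming the "up to 10 per cent"
# GPU reserve, assuming an idealized, perfectly GPU-bound game.
reserve = 0.10
base_fps = 30.0

best_case_fps = base_fps * (1 + reserve)        # 33.0 fps
base_frame_ms = 1000.0 / base_fps               # ~33.3 ms per frame
best_frame_ms = base_frame_ms / (1 + reserve)   # ~30.3 ms per frame

print(f"{best_case_fps:.1f} fps, {best_frame_ms:.1f} ms/frame")
```

In practice games are rarely purely GPU-bound, so real gains would land somewhere below this ceiling.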

Xbox One Kinect

Microsoft's Xbox One has trailed its rival Sony Corp.'s (TYO:6758) console after committing a number of missteps, including initially planning to use DRM to ban used games and threatening to make users' consoles unplayable offline.  While it later walked back each of these controversial policies, the console has continued to struggle through a number of controversies.  To add insult to injury, Sony recently announced that its rival offering, the PlayStation 4 (PS4), was already profitable.
 
Nonetheless, it has moved millions of units and isn't too far behind Sony's PS4 in lifetime sales.  Microsoft has remained very committed to its console, releasing a major update in April.
 
The future of the Xbox One should be elucidated at next week's 2014 E3 (Electronic Entertainment Expo), which is being held in Los Angeles, Calif.  Stephen Elop, the new executive vice president of Microsoft Devices (which makes the Xbox), is expected to present his vision for the future of the console along with new Xbox chief Phil Spencer at Microsoft's Monday, June 9 press event.

Sources: Eurogamer, via The Verge, Microsoft






RE: Options
By Motoman on 6/5/2014 11:23:56 AM , Rating: 1
It's nice that they did it...but it seems like a poor engineering decision in the first place to have the Kinect use (let alone reserve) memory bandwidth that the video rendering depends on. Granted that the fluidity of graphics rendering is kind of important on a console.

I would be interested to know why they thought that was a good idea, rather than segregating the video away from other such stuff.


RE: Options
By Mitch101 on 6/5/2014 11:30:04 AM , Rating: 2
It's still not fully optimized. There's plenty more power to squeeze out of it. Look at the DirectX 12 enhancements, where a single core can cause a bottleneck, and how they plan on fixing it. They claim a doubling of performance, but don't mistake that for double the framerate; they also claim that's not just marketing talk, and that there is much more performance they will get over time. I'll wait and hope they explain more, and hopefully demo it, at E3.


RE: Options
By Motoman on 6/5/2014 11:34:17 AM , Rating: 1
I'm not sure that you're going to "optimize" hardware once it's in the field already...

...but the developers can probably continue to write better code, and make use of stuff like AMD Mantle for example.

Either way...it still seems odd that they'd set up the memory allocation for Kinect that way in the first place. Regardless of whatever else was going on.


RE: Options
By Mitch101 on 6/5/2014 11:57:05 AM , Rating: 1
Graphics cards companies make new drivers not just to fix bugs but to improve performance.

DirectX 12 is two things: much improved CPU parallelism and more bare-metal enhancements.

It's also worth mentioning that AMD's Mantle shows much bigger performance increases on lesser hardware than on top-end hardware. The CPU/GPU in the One would stand to gain something. Let's say it's another 10% -- is that not worth doing?


RE: Options
By Motoman on 6/5/2014 6:26:43 PM , Rating: 1
So...you're just going to repeat what I just said for fun?


RE: Options
By karimtemple on 6/5/2014 7:31:31 PM , Rating: 2
None of this has anything to do with Xbox One performance. In fact the most likely scenario is that DX12 is based on the Xbox One in the first place.


RE: Options
By Alexvrb on 6/5/2014 11:53:04 PM , Rating: 3
Agreed, although it would be more accurate to say that the XB1 low-level API is a prequel to and subset of DX12. Plus I'm sure there are more optimizations to be done, including both PC and console platform-specific optimizations. Either way it seems like the larger impact is going to be on PC, where currently the only low-level API in town is Mantle.

On a semi-related note, I always felt they should have added dedicated hardware (whether part of the APU or external) to help offload Kinect functions. Then again, the types of games that utilize Kinect fully don't tend to be as focused on graphical performance. Simple voice commands and the like probably don't cost much performance, so pretty much any game could add that for basically free if the developers want it.


RE: Options
By karimtemple on 6/6/2014 1:00:51 AM , Rating: 1
It would be really expensive to do that, no matter how little or how much power the part has. The Xbox One already deviated from the shelf enough as it is, which is why it's so expensive. Doing so even more would've been out of the question. I'm sure someone already came up with that idea and was shot down immediately.

What they really should've done is dropped the price by $50 and not done any of that idiotic stuff in the beginning where they dicked around with everyone's good will. They honestly made a lot of right moves as far as the design and strategy, they just had a really stupid price point and really bad PR. They should never have done a non-Kinect version, either.


RE: Options
By Alexvrb on 6/7/2014 7:00:03 PM , Rating: 2
Really expensive? I don't think so. They could have gone one of several routes, including dropping in an extremely cheap off the shelf low power ARM chip with sufficient chops. It could handle any OS/background needs as well. 10% or more of the XB1's GPU wouldn't be hard to muster.

Then again, if power was the bigger question, they could have handled the majority of the Kinect's workload with a custom ASIC. Either way, it would have been worth it to enable a consistent experience for developers regardless of whether they used the Kinect and to what extent.

I think they underestimated Sony's performance targets, overestimated the GDDR5 supply issues (although to be fair they actually helped Sony by not fighting over GDDR5 supply), and underestimated developers' laziness (regarding eSRAM and other logic).


RE: Options
By karimtemple on 6/8/2014 11:20:16 AM , Rating: 2
Oh yeah, just drop a custom ASIC in there. Not expensive at all. NBD. lol.

And ARM isn't all that powerful off the shelf. You won't be seeing Watch Dogs on Android.

Another problem with your hypothesis is that it isn't 10% of the GPU, just 10% memory throughput. Ultimately it really doesn't make a huge impact.


RE: Options
By Alexvrb on 6/8/2014 3:18:00 PM , Rating: 3
When you're talking about volume sales, in the millions? On a modern process? No, a tiny ASIC doesn't cost much. After a die shrink they could perhaps even bring it onboard the APU. But that was just one of several options, I think the ARM route would have been the cheapest and easiest.

Read the article again.
quote:
reserves as much as 10 percent of its graphics hardware's memory bandwidth and processing resources

There are computations being done on the Kinect's output, you know. It says "up to" so we don't know exactly how much it borrows from the GPU, but it certainly isn't just memory bandwidth. 10% is the stated upper limit, so I'm going with that. 10% is quite doable on a modern ARM SoC. Both in terms of bandwidth and compute.

The Kinect doesn't reserve the eSRAM, so you're talking 1/10th of the main RAM bandwidth, or roughly 7GB/sec. Piece of cake. Assuming it can potentially consume an equal 10% of compute power (probably overkill), you're looking at roughly ~130 GFLOPS. That could be done even by a 4-cluster first-gen Rogue, the design of which has been out for quite a while. Actually, even before that there were SGX554 designs that could have fit the bill, since you could scale those to 16 cores.
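The arithmetic above can be sanity-checked against commonly cited Xbox One specs; the ~68.3 GB/s DDR3 bandwidth and ~1.31 TFLOPS GPU figures below are assumptions pulled from public spec sheets, not from this thread.

```python
# Sanity check of the ~7 GB/s and ~130 GFLOPS estimates, using
# commonly cited Xbox One specs (assumed: 68.3 GB/s DDR3, 1.31 TFLOPS GPU).
DDR3_BANDWIDTH_GBPS = 68.3
GPU_TFLOPS = 1.31
RESERVE = 0.10  # "up to 10 percent" per the article

freed_bandwidth = DDR3_BANDWIDTH_GBPS * RESERVE   # ~6.8 GB/s
freed_compute = GPU_TFLOPS * 1000 * RESERVE       # ~131 GFLOPS

print(f"~{freed_bandwidth:.1f} GB/s, ~{freed_compute:.0f} GFLOPS")
```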


RE: Options
By hughlle on 6/5/2014 12:12:26 PM , Rating: 2
More like optimize software for the hardware. Not an alien concept.

As to the allocation. Here's a question that I don't personally know the answer to, given that I dislike consoles. Did the Xbox One suffer from visible graphics and performance issues prior to this whole revelation? If not, you'd therefore assume they allowed such an allocation for the Kinect because there was still more than enough to go around.


RE: Options
By inighthawki on 6/5/2014 12:16:28 PM , Rating: 1
It just means that games will have to lower their quality or resolution to obtain the performance they want.


RE: Options
By Makaveli on 6/5/2014 6:15:50 PM , Rating: 2
Last time I checked, Mantle was a PC-only API, and it's not the same API that is used on the Xbox One. While I agree with your point, I think Mantle is a bad example.


RE: Options
By Motoman on 6/5/2014 6:26:04 PM , Rating: 2
You need to check again.


RE: Options
By Manch on 6/5/2014 6:31:44 PM , Rating: 2
It is a PC API only*. The caveat to that is it's designed to give developers of PC games the same level of access to the hardware as the console APIs. That makes it a lot easier to port games between the platforms. So "Mantle" is the PC version of what the consoles use. Of course MS will not say what is used on the XBONE is Mantle, because they're pushing DirectX. DX12 is supposed to incorporate some of the things the console API/Mantle does.

An analogy for this would be some of the Mazda and Ford cars. It says Mazda, but pull that dipstick, pop that starter off, or remove the engine cover and it has Ford printed on it.

Either way it works out for PC gamers. Now you will not need insanely better hardware to get a slight bump in graphics performance. Ports should be less sloppy as well. Hopefully.


RE: Options
By Mitch101 on 6/6/2014 8:24:47 AM , Rating: 2
That further proves there is more performance to come from both systems from low level API enhancements.

For the Xbox One:
Kinect disabled in some games = 10%
Low-level API enhancements = ? Let's just say 10% to be conservative, but if you look at Mantle on AMD low-end chips it's more like 30%
Better multiprocessor distribution, which Microsoft says will be huge. No single core holding it back. But again let's play it conservative and say another 10%
Developers optimizing code to squeeze out every bit of performance, maybe another 10%.

It all adds up.
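As a sketch of how those hypothetical gains could combine (assuming, unrealistically, that each one is independent -- real optimizations overlap and rarely compose cleanly):

```python
# Additive vs. multiplicative stacking of four hypothetical ~10% gains
# (Kinect reserve, low-level API, multicore distribution, dev tuning).
gains = [0.10, 0.10, 0.10, 0.10]

additive = sum(gains)          # naive "it all adds up": 40%
compounded = 1.0
for g in gains:
    compounded *= 1.0 + g      # compounding independent speedups
compounded -= 1.0              # 1.1**4 - 1, roughly 46.4%

print(f"additive +{additive:.0%}, compounded +{compounded:.1%}")
```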

I'll say this: the next time a new console comes out, I'm going to wait a year before purchasing, to let the game base build up and for developers and manufacturers to work out the kinks. I suspect the next game consoles will need to do 4K video; that's tough today to make cost-effective.


RE: Options
By Motoman on 6/6/2014 10:05:07 AM , Rating: 2
Yes...which I have pointed out all along.

The only difference is that I made it clear that nothing is happening to the *hardware* itself. All improvements will be made through software, and possibly firmware.

It's amazing how many people seem to lack basic reading comprehension.


RE: Options
By Mitch101 on 6/6/2014 10:43:17 AM , Rating: 2
It read as if you were implying they couldn't because of hardware limitations.

Text is often taken out of context. Sorry if that occurred in your message.


RE: Options
By Motoman on 6/8/2014 2:17:12 PM , Rating: 2
quote:
I'm not sure that you're going to "optimize" hardware once it's in the field already......but the developers can probably continue to write better code, and make use of stuff like AMD Mantle for example.


quote:
It read as if you were implying they couldn't because of hardware limitations. Text is often taken out of context. Sorry if that occurred in your message.


How, exactly, can one read my OP above and come to the conclusion that I'm saying that you *can't* optimize existing systems in the field by making better software and using things like Mantle?

...granted that that's *exactly* what I said.


RE: Options
By Reclaimer77 on 6/6/2014 10:26:16 AM , Rating: 2
As others have pointed out, disabling Kinect only frees up 10% more resources. That almost certainly won't translate directly into a 10% performance increase.

Also the scenario you're depicting is highly unrealistic. A 40%+ performance boost across the board from optimizations? I don't see this happening for either console.

quote:
I suspect the next game consoles will need to do 4K video; that's tough today to make cost-effective.


Agree. But even the next gen consoles might have problems delivering acceptable framerates for 4k gaming (native).

I can forgive the current generation. After all, they started designing the performance parameters for these back in 2009-ish, when 4K wasn't even on anyone's radar.


RE: Options
By Mitch101 on 6/6/2014 11:04:41 AM , Rating: 1
Sony certainly did better out of the gate by having the extra performance on tap.

Microsoft's XBONE hardware is certainly capable of doing 1080p, but they need to fix a few bottlenecks that currently exist but shouldn't, which is why only certain titles do it today.

I still say both are an AMD chip generation early, or should have gone discrete graphics, because I think both consoles will start to show their age in 3 years. To me there is no excuse for not having 1080p at 60 FPS on day one, with room to spare, so developers don't have to decide what eye candy to sacrifice down the road. What you wind up with is that the PC versions will have additional eye candy over the console versions. It's the same game and gameplay; one just subjectively will look better.


RE: Options
By retrospooty on 6/6/2014 11:41:12 AM , Rating: 3
"I still say both are an AMD chip generation early or should have gone discrete graphics because I think both consoles will start to show their age in 3 years."

It was a cost thing, but I agree. That would have been better, but with a console you always have to place a flag in the ground at some point, say "here is where we are," and build it. There will always be faster stuff coming.

"Sony certainly did better out of the gate by having the extra performance on tap."

Can I just point out, in a friendly, non-aggressive way, that you have attacked many people in the past who said that exact same thing, myself and Reclaimer included. I mean, you went off the rails over the same claim.


RE: Options
By Reclaimer77 on 6/6/2014 12:05:21 PM , Rating: 3
quote:
Sony certainly did better out of the gate by having the extra performance on tap.


/brain explode...

Wow, so like, it's okay for you to say this. But when me or retro does, we "hate Microsoft".

Ugh, anyway, moving on...

quote:
because I think both consoles will start to show their age in 3 years.


Agree. However maybe one of the advantages of going with off-the-shelf PC parts is a MUCH faster refresh cycle. No longer will Sony and Microsoft spend billions and years and years coming up with custom console hardware solutions that also cause massive developer headaches.


Copyright 2014 DailyTech LLC.