
New Kinect-free edition unlocks memory bandwidth reserved for Microsoft's camera sensor

Given Microsoft Corp.'s (MSFT) struggles with supply and sales, plus privacy concerns regarding the Xbox One's "Kinect" video and audio sensor, it's perhaps unsurprising that Microsoft decided at last to offer a cheaper ($399) SKU of its console without the sensor.  Now that decision has brought about an interesting twist -- Xbox One users who ditch the Kinect may be able to enjoy better graphics performance.
It turns out that Microsoft's always-on sensor reserves as much as 10 percent of its graphics hardware's memory bandwidth and processing resources for watching its user(s).  Comments a Microsoft spokesperson to Eurogamer:

Yes, the additional resources allow access to up to 10 per cent additional GPU performance.  We're committed to giving developers new tools and flexibility to make their Xbox One games even better by giving them the option to use the GPU reserve in whatever way is best for them and their games.

According to Eurogamer and The Verge, Microsoft will for the first time be allowing game developers to use these freed up resources to enable special "Kinect-free" enhanced graphics settings.  While that's good news for Xbox One gamers, it also adds an interesting extra nuance to an already heated debate.

Xbox One Kinect

Microsoft's Xbox One has trailed its rival Sony Corp.'s (TYO:6758) console after committing a number of missteps, including initially planning to use DRM to ban used games and threatening to make users' consoles unplayable offline.  While it later walked back each of these controversial policies, the console has continued to struggle through a number of controversies.  To add insult to injury, Sony recently announced that its rival offering, the PlayStation 4 (PS4), was already profitable.
Nonetheless, the Xbox One has moved millions of units and isn't too far behind Sony's PS4 in lifetime sales.  Microsoft has remained very committed to its console, releasing a major update in April.
The future of the Xbox One should be elucidated at next week's 2014 E3 (Electronic Entertainment Expo), which is being held in Los Angeles, Calif.  Stephen Elop, the new executive vice president of Microsoft Devices (which makes the Xbox), is expected to present his vision for the future of the console along with new Xbox chief Phil Spencer at Microsoft's Monday, June 9 press event.

Sources: Eurogamer, via The Verge, Microsoft


RE: Options
By karimtemple on 6/5/2014 7:31:31 PM , Rating: 2
None of this has anything to do with Xbox One performance. In fact the most likely scenario is that DX12 is based on the Xbox One in the first place.

RE: Options
By Alexvrb on 6/5/2014 11:53:04 PM , Rating: 3
Agreed, although it would be more accurate to say that the XB1 low-level API is a precursor to and subset of DX12. Plus I'm sure there are more optimizations to be done, including both PC and console platform-specific optimizations. Either way it seems like the larger impact is going to be on PC, where currently the only low-level API in town is Mantle.

On a semi-related note, I always felt they should have added dedicated hardware (whether part of the APU or external) to help offload Kinect functions. Then again, the types of games that utilize Kinect fully don't tend to be as focused on graphical performance. Simple voice commands and the like probably don't cost much performance, so pretty much any game could add that for basically free if the developers want it.

RE: Options
By karimtemple on 6/6/2014 1:00:51 AM , Rating: 1
It would be really expensive to do that, no matter how little or how much power the part has. The Xbox One already deviated from the shelf enough as it is, which is why it's so expensive. Doing so even more would've been out of the question. I'm sure someone already came up with that idea and was shot down immediately.

What they really should've done is dropped the price by $50 and not done any of that idiotic stuff in the beginning where they dicked around with everyone's good will. They honestly made a lot of right moves as far as the design and strategy, they just had a really stupid price point and really bad PR. They should never have done a non-Kinect version, either.

RE: Options
By Alexvrb on 6/7/2014 7:00:03 PM , Rating: 2
Really expensive? I don't think so. They could have gone one of several routes, including dropping in an extremely cheap, off-the-shelf, low-power ARM chip with sufficient chops. It could handle any OS/background needs as well. 10% or more of the XB1's GPU wouldn't be hard to muster.

Then again, if power was the bigger question, they could have handled the majority of the Kinect's workload with a custom ASIC. Either way, it would have been worth it to enable a consistent experience for developers regardless of whether they used the Kinect and to what extent.

I think they underestimated Sony's performance targets, overestimated the GDDR5 supply issues (although to be fair they actually helped Sony by not fighting over GDDR5 supply), and underestimated developers' laziness (regarding eSRAM and other logic).

RE: Options
By karimtemple on 6/8/2014 11:20:16 AM , Rating: 2
Oh yeah, just drop a custom ASIC in there. Not expensive at all. NBD. lol.

And ARM isn't all that powerful off the shelf. You won't be seeing Watch Dogs on Android.

Another problem with your hypothesis is that it isn't 10% of the GPU, just 10% memory throughput. Ultimately it really doesn't make a huge impact.

RE: Options
By Alexvrb on 6/8/2014 3:18:00 PM , Rating: 3
When you're talking about volume sales, in the millions? On a modern process? No, a tiny ASIC doesn't cost much. After a die shrink they could perhaps even bring it onboard the APU. But that was just one of several options, I think the ARM route would have been the cheapest and easiest.

Read the article again: "reserves as much as 10 percent of its graphics hardware's memory bandwidth and processing resources."

There are computations being done on the Kinect's output, you know. It says "up to" so we don't know exactly how much it borrows from the GPU, but it certainly isn't just memory bandwidth. 10% is the stated upper limit, so I'm going with that. 10% is quite doable on a modern ARM SoC. Both in terms of bandwidth and compute.

The Kinect doesn't reserve the eSRAM, so you're talking 1/10th of the main RAM bandwidth, or roughly 7GB/sec. Piece of cake. Assuming it can potentially consume an equal 10% of compute power (probably overkill), you're looking at roughly 130 GFLOPS. Could be done even by a 4 cluster first gen Rogue, the design of which has been out for quite a while. Actually even before that there were SGX554 designs that could have fit the bill since you could scale those to 16 cores.
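For the curious, the commenter's back-of-envelope numbers can be sanity-checked against the Xbox One's widely published launch specs -- assumed here to be 68.3 GB/sec of main DDR3 memory bandwidth and a 1.31 TFLOPS GPU. This is just a sketch of the arithmetic above, not an official accounting of what the Kinect reserve actually consumes:

```python
# Sanity check of the "10% reserve" figures discussed above.
# Assumed launch specs (not from the article itself):
DDR3_BANDWIDTH_GBS = 68.3   # Xbox One main memory bandwidth, GB/s
GPU_TFLOPS = 1.31           # Xbox One peak GPU compute, TFLOPS
RESERVE = 0.10              # the "up to 10 per cent" Kinect reserve

# 10% of main RAM bandwidth -- the commenter's "roughly 7 GB/sec"
reserved_bandwidth_gbs = DDR3_BANDWIDTH_GBS * RESERVE

# 10% of peak compute -- the commenter's "roughly 130 GFLOPS"
reserved_compute_gflops = GPU_TFLOPS * 1000 * RESERVE

print(f"Reserved bandwidth: {reserved_bandwidth_gbs:.1f} GB/s")
print(f"Reserved compute:   {reserved_compute_gflops:.0f} GFLOPS")
```

Both figures land right where the thread puts them (about 6.8 GB/s and 131 GFLOPS), which is why a modest mobile GPU of the era could plausibly have absorbed the Kinect's workload.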

"Young lady, in this house we obey the laws of thermodynamics!" -- Homer Simpson
