
New Kinect-free edition unlocks memory bandwidth reserved for Microsoft's camera sensor

Given Microsoft Corp.'s (MSFT) supply struggles, sales struggles, and privacy concerns regarding the Xbox One's "Kinect" video and audio sensor, it's perhaps unsurprising that Microsoft decided to at last offer a cheaper ($399) SKU of its console without the sensor.  Now that decision has brought about an interesting twist -- Xbox One users who ditch the Kinect may be able to enjoy better graphics performance.
It turns out that Microsoft's always-on sensor reserves as much as 10 percent of its graphics hardware's memory bandwidth and processing resources for watching its user(s).  A Microsoft spokesperson told Eurogamer:

Yes, the additional resources allow access to up to 10 per cent additional GPU performance.  We're committed to giving developers new tools and flexibility to make their Xbox One games even better by giving them the option to use the GPU reserve in whatever way is best for them and their games.

According to Eurogamer and The Verge, Microsoft will for the first time allow game developers to use these freed-up resources to enable special "Kinect-free" enhanced graphics settings.  While that's good news for Xbox One gamers, it also adds an interesting extra nuance to an already heated debate.
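As a rough illustration of what reclaiming that reserve buys, consider a GPU-bound title: if GPU throughput rises by 10 percent, frame time shrinks proportionally. The numbers below are hypothetical, not from Microsoft or Eurogamer:

```python
def uplifted_fps(base_fps: float, gpu_uplift: float = 0.10) -> float:
    """Frame rate of a GPU-bound game after a throughput gain.

    Frame time drops in proportion to the throughput gain:
    t' = t / (1 + uplift)  =>  fps' = fps * (1 + uplift)
    """
    return base_fps * (1 + gpu_uplift)

# A game pegged at 30 fps could, in the best case, reach ~33 fps --
# or spend the same headroom on resolution or effects instead.
print(f"{uplifted_fps(30.0):.1f} fps")
```

In practice the gain would be smaller, since games are rarely bound purely by the GPU resources the Kinect reserved.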

[Image: Xbox One Kinect]

Microsoft's Xbox One has trailed its rival Sony Corp.'s (TYO:6758) console after committing a number of missteps, including initially planning to use DRM to ban used games and threatening to make users' consoles unplayable offline.  While it later reversed each of these controversial policies, the console has continued to struggle through a number of controversies.  To add insult to injury, Sony recently announced that its rival offering, the PlayStation 4 (PS4), was already profitable.
Nonetheless, the Xbox One has moved millions of units and isn't too far behind Sony's PS4 in lifetime sales.  Microsoft has remained very committed to its console, releasing a major update in April.
The future of the Xbox One should be elucidated at next week's 2014 E3 (Electronic Entertainment Expo), which is being held in Los Angeles, Calif.  Stephen Elop, the new executive vice president of Microsoft Devices (which makes the Xbox), is expected to present his vision for the future of the console along with new Xbox chief Phil Spencer at Microsoft's Monday, June 9 press event.

Sources: Eurogamer, via The Verge, Microsoft


RE: Options
By Alexvrb on 6/7/2014 7:00:03 PM , Rating: 2
Really expensive? I don't think so. They could have gone one of several routes, including dropping in an extremely cheap off the shelf low power ARM chip with sufficient chops. It could handle any OS/background needs as well. 10% or more of the XB1's GPU wouldn't be hard to muster.

Then again, if power was the bigger question, they could have handled the majority of the Kinect's workload with a custom ASIC. Either way, it would have been worth it to enable a consistent experience for developers regardless of whether they used the Kinect and to what extent.

I think they underestimated Sony's performance targets, overestimated the GDDR5 supply issues (although to be fair they actually helped Sony by not fighting over GDDR5 supply), and underestimated developers' laziness (regarding eSRAM and other logic).

RE: Options
By karimtemple on 6/8/2014 11:20:16 AM , Rating: 2
Oh yeah, just drop a custom ASIC in there. Not expensive at all. NBD. lol.

And ARM isn't all that powerful off the shelf. You won't be seeing Watch Dogs on Android.

Another problem with your hypothesis is that it isn't 10% of the GPU, just 10% memory throughput. Ultimately it really doesn't make a huge impact.

RE: Options
By Alexvrb on 6/8/2014 3:18:00 PM , Rating: 3
When you're talking about volume sales, in the millions? On a modern process? No, a tiny ASIC doesn't cost much. After a die shrink they could perhaps even bring it onboard the APU. But that was just one of several options, I think the ARM route would have been the cheapest and easiest.

Read the article again: "reserves as much as 10 percent of its graphics hardware's memory bandwidth and processing resources"

There are computations being done on the Kinect's output, you know. It says "up to" so we don't know exactly how much it borrows from the GPU, but it certainly isn't just memory bandwidth. 10% is the stated upper limit, so I'm going with that. 10% is quite doable on a modern ARM SoC. Both in terms of bandwidth and compute.

The Kinect doesn't reserve the eSRAM, so you're talking 1/10th of the main RAM bandwidth, or roughly 7GB/sec. Piece of cake. Assuming it can potentially consume an equal 10% of compute power (probably overkill), you're looking at roughly 130 GFLOPS. Could be done even by a 4-cluster first-gen Rogue, the design of which has been out for quite a while. Actually even before that there were SGX554 designs that could have fit the bill, since you could scale those to 16 cores.
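The commenter's arithmetic holds up under commonly cited Xbox One specs (assumed here, not stated in the article: ~68.3 GB/s of DDR3 main-memory bandwidth, and a GPU with 768 shader cores at 853 MHz doing 2 FLOPs per core per cycle):

```python
# Back-of-envelope check of the "10 percent reserve" figures.
# Spec values below are commonly cited Xbox One numbers, assumed
# for illustration -- they do not appear in the article itself.
MAIN_RAM_BW_GBPS = 68.3              # DDR3 main memory bandwidth
GPU_PEAK_GFLOPS = 768 * 0.853 * 2    # 768 shaders x 853 MHz x 2 FLOPs ~ 1310

RESERVE = 0.10  # the "up to 10 per cent" figure from the article

bw_reserved = MAIN_RAM_BW_GBPS * RESERVE      # ~6.8 GB/s, i.e. "roughly 7GB/sec"
compute_reserved = GPU_PEAK_GFLOPS * RESERVE  # ~131 GFLOPS, i.e. "roughly 130"

print(f"Bandwidth reserved: {bw_reserved:.1f} GB/s")
print(f"Compute reserved:   {compute_reserved:.0f} GFLOPS")
```

Both outputs match the figures in the comment, which is why a mobile-class GPU of the era could plausibly have covered the Kinect's workload.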

"I'm an Internet expert too. It's all right to wire the industrial zone only, but there are many problems if other regions of the North are wired." -- North Korean Supreme Commander Kim Jong-il
