
4K resolution hardware will officially be called Ultra HD

The Consumer Electronics Association (CEA) has announced the official name for the next-generation 4K high-definition display technology: "Ultra High-Definition," or "Ultra HD." According to the CEA, the name is intended to convey the new format's superiority over conventional HDTV.

The CEA Board of Industry Leaders voted unanimously this week to recommend the name for the new next-generation HD resolution. Along with agreeing on a name, the CEA also outlined minimum performance characteristics to help consumers and retailers understand the benefits of the new technology, which is set to begin rolling out this fall.

“Ultra HD is the next natural step forward in display technologies, offering consumers an incredibly immersive viewing experience with outstanding new levels of picture quality,” said Gary Shapiro, president and CEO, CEA. “This new terminology and the recommended attributes will help consumers navigate the marketplace to find the TV that best meets their needs.”

The core characteristics the CEA agreed on include a minimum display resolution of at least 8 million active pixels, with at least 3,840 horizontal pixels and at least 2,160 vertical pixels.

To meet the minimum requirements, a display must also have an aspect ratio of at least 16:9. Devices meeting the specification will additionally be required to have at least one digital input capable of carrying and presenting native 4K-format video at 3840 x 2160 resolution without relying on upconversion.
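
As a rough check of the arithmetic behind those minimums, a short Python sketch (the thresholds come from the article; the function itself is purely illustrative, not any official CEA tool) shows that the 3840 x 2160 baseline clears the 8-million-pixel floor at exactly 16:9:

    # Sketch: testing a display mode against the minimum Ultra HD attributes above.
    MIN_ACTIVE_PIXELS = 8_000_000
    MIN_HORIZONTAL = 3840
    MIN_VERTICAL = 2160
    MIN_ASPECT = 16 / 9

    def meets_ultra_hd_minimum(width, height):
        return (width * height >= MIN_ACTIVE_PIXELS
                and width >= MIN_HORIZONTAL
                and height >= MIN_VERTICAL
                and width / height >= MIN_ASPECT)

    print(meets_ultra_hd_minimum(3840, 2160))  # True: 8,294,400 active pixels at 16:9
    print(meets_ultra_hd_minimum(1920, 1080))  # False: conventional 1080p HDTV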

Source: CE.org






RE: PC Monitors
By chemist1 on 10/20/2012 9:24:57 PM, Rating: 2
I'd say you're pretty close to being right, for most TV viewers, when it comes purely to the effect of the pixel count (that 1080p provides sufficient pixel density). Though for those with a home theater, a little more resolution would help. I have a 98" diagonal screen; to achieve the THX-recommended HDTV viewing angle of 36 degrees (for an immersive experience), I sit 11' away. At that distance, obtaining a resolution of 1 arc minute would require a 13% higher pixel density (in each dimension) than what HDTV currently provides (plus slightly improved optics on my projector). And for those who want larger viewing angles, for an even more immersive movie experience, THX recommends 40 degrees as the ideal, which corresponds to human horizontal vision; however here resolution is more noticeably compromised with 1080p.
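
To make the geometry in that paragraph concrete, here is a short Python sketch using only the figures the comment cites (98" diagonal, 16:9, THX's 36-degree angle); the trigonometry is standard, but the script is just an illustration:

    import math

    DIAG_IN = 98.0                      # screen diagonal from the comment
    ASPECT_W, ASPECT_H = 16.0, 9.0
    VIEW_ANGLE_DEG = 36.0               # THX-recommended HDTV viewing angle

    width_in = DIAG_IN * ASPECT_W / math.hypot(ASPECT_W, ASPECT_H)          # ~85.4"
    distance_in = (width_in / 2) / math.tan(math.radians(VIEW_ANGLE_DEG / 2))
    print(distance_in / 12)             # ~10.9 ft, i.e. the ~11' seating distance

    # At 36 degrees the picture spans 36 * 60 = 2160 arcminutes horizontally.
    # One pixel per arcminute needs 2160 horizontal pixels; 1080p supplies 1920.
    print((VIEW_ANGLE_DEG * 60) / 1920) # ~1.125, the ~13% higher density mentioned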

But the real problem is that even mere 1080p (never mind 4K) is not adequately provided by the current HDTV standard. First, to fit on Blu-Ray, there's compression of ~50:1 to 100:1 vs. a raw 1080p camera feed. The compression is done in a very sophisticated way, and much of the image is static from frame to frame and doesn't need to be reproduced each time, but it is nevertheless not visually lossless. Further, HDTV doesn't give us a full color space (relative to what the eye can distinguish), and provides only 8 bits of color depth, which is easily distinguishable from 10 bits. Additionally, because of chroma subsampling, we don't get full 1080p resolution of the individual colors -- on my projector, I can see the difference between a 1080p 4:4:4 image from a JPEG vs. a 1080p 4:2:0 image from a Blu-Ray. So the 1080p standard is in need of updating, irrespective of the issue of pixel density. For more information, see: http://hometheaterreview.com/high-definition-we-ha...
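
For a rough sense of where a 50:1 to 100:1 figure could come from, a quick Python sketch; the 24 fps, 8-bit 4:4:4 source and the Blu-ray video bitrates below are my own illustrative assumptions, not numbers from the comment (a 10-bit camera feed would push the ratios higher still):

    # Raw 1080p data rate vs. assumed Blu-ray video bitrates.
    raw_bps = 1920 * 1080 * 3 * 8 * 24          # 8-bit 4:4:4 at 24 fps, ~1.19 Gbps
    for bluray_mbps in (15, 25, 40):
        ratio = raw_bps / (bluray_mbps * 1e6)
        print(f"{bluray_mbps} Mbps -> {ratio:.0f}:1")
    # 15 Mbps -> 80:1, 25 Mbps -> 48:1, 40 Mbps -> 30:1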

That's not to say they should have done things differently -- given the bandwidth limits of Blu-Ray, significant compromises had to be made. My point is rather that, looking forward to the "Ultra HD" standard, there's a lot more to it than just more pixels -- my understanding is that it has a full color space and a higher color bit depth, both of which are needed (not sure what they are doing about chroma subsampling). But to deliver this without suffering even more compression loss than we have now with Blu-Ray, they are going to have to significantly increase the pipeline and further improve the compression schemes. If they don't, Ultra HD will not live up to its potential -- thus some feel we'd get better quality from Ultra HD if less of the bandwidth went to increased pixel count and more went to other areas. Myself, I'd like to see both the increased pixels/bit depth/color space and less compression loss, but it remains to be seen how well they can deliver that.
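
To see why the pipeline has to grow by much more than the 4x pixel count alone, here is a back-of-the-envelope Python sketch; the 10-bit, 4:4:4 target is an assumption for illustration, since the comment notes the final Ultra HD choices aren't certain:

    # Relative raw data per frame: pixels * average samples per pixel * bit depth.
    # 4:2:0 subsampling averages 1.5 samples per pixel; 4:4:4 carries the full 3.
    def raw_bits_per_frame(w, h, bit_depth, samples_per_pixel):
        return w * h * samples_per_pixel * bit_depth

    hd  = raw_bits_per_frame(1920, 1080, 8, 1.5)    # today's 1080p, 8-bit, 4:2:0
    uhd = raw_bits_per_frame(3840, 2160, 10, 3.0)   # assumed 2160p, 10-bit, 4:4:4
    print(uhd / hd)                                 # 10.0 -- about an order of magnitude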

Finally, note that Blu-Ray is the best a consumer can get -- most 1080p sources have less bandwidth than Blu-Ray, and are thus even more compromised. So the other huge challenge with Ultra HD will be how to get the non-optimal sources (cable, streaming) up to snuff.


"My sex life is pretty good" -- Steve Jobs' random musings during the 2010 D8 conference










botimage
Copyright 2016 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki