
4K resolution hardware will officially be called Ultra HD

The Consumer Electronics Association (CEA) has announced the official name for the next-generation 4K high-definition display technology: "Ultra High-Definition" or "Ultra HD." According to the CEA, the name is intended to convey the new format's superiority over conventional HDTV.

The CEA Board of Industry Leaders voted unanimously this week to recommend the name for the next-generation HD resolution. Along with agreeing on a name, the CEA also outlined minimum performance characteristics to help consumers and retailers understand the benefits of the new technology, which is set to begin rolling out this fall.

“Ultra HD is the next natural step forward in display technologies, offering consumers an incredibly immersive viewing experience with outstanding new levels of picture quality,” said Gary Shapiro, president and CEO, CEA. “This new terminology and the recommended attributes will help consumers navigate the marketplace to find the TV that best meets their needs.”

The core characteristics that the CEA agreed on include a minimum display resolution of 8 million active pixels, with at least 3840 horizontal and at least 2160 vertical pixels.

To meet the minimum requirements, a display must have an aspect ratio of at least 16:9. Devices meeting the specification will also be required to have at least one digital input capable of carrying and presenting native 4K-format video at 3840 x 2160 resolution without relying on upconversion.
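For illustration, the minimums above are easy to express as a quick check. This is only a sketch in Python (the function and its name are hypothetical, not an official CEA conformance tool):

    # Hypothetical helper encoding the Ultra HD minimums described above
    def is_ultra_hd(h_pixels, v_pixels, has_native_4k_input):
        return (h_pixels >= 3840                       # at least 3840 horizontal pixels
                and v_pixels >= 2160                   # at least 2160 vertical pixels
                and h_pixels * v_pixels >= 8_000_000   # at least 8 million active pixels
                and h_pixels / v_pixels >= 16 / 9      # aspect ratio of at least 16:9
                and has_native_4k_input)               # native 4K input, no upconversion

    print(is_ultra_hd(3840, 2160, True))   # True  (8,294,400 active pixels)
    print(is_ultra_hd(1920, 1080, True))   # False (ordinary 1080p HDTV)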

Source: CE.org



Comments

RE: PC Monitors
By rdhood on 10/19/2012 11:13:15 AM , Rating: 1
quote:
4K TVs won't be mainstream, I'm betting, for another 5-10 years.


I think you are right, though your timeline is a little far out there.

If this unfolds a lot like 1080p & 3D, then 4K TVs will come first... THEN 4k content will begin to trickle in.... THEN 4k TVs will drop in price and start to become mainstream as people anticipate the arrival of much more 4k content, THEN the 4k content arrives in huge amounts, and somewhere along the line xboxXXX will support 4k.

1080p sets were introduced around 2005, and it was another three years before enough 1080p content became available to finally push up sales of 1080p sets. The same type of thing is happening with 3D, though most people view 3D as a "want" rather than a "need".

5 years for 4k seems about right.


RE: PC Monitors
By PrinceGaz on 10/19/2012 12:47:46 PM , Rating: 1
The jump from 480 (NTSC) or 576 (PAL) lines at 720 columns up to 1920x1080 Full HD was a major improvement everyone could see, assuming decent eyesight.

I'm not convinced that 4K is really going to make any difference to most people's perception of the quality of what they are watching; it is likely to go the same way as 3D TV -- something nice to have, but not worth paying extra for. As such, adoption is likely to be very slow, slower even than that of 3D TVs, because you will also need a whole new BD player/cable/satellite box to provide the 4K video. I wouldn't be surprised if 4K or Ultra HD BD discs are also more expensive than normal BD discs, which are still more expensive than DVDs.

I'd love "retina"-resolution PC monitors to become readily available at affordable prices though.


RE: PC Monitors
By bsd228 on 10/19/2012 2:52:35 PM , Rating: 2
The just-announced GoPro Hero3 will record 4K (albeit only at 15 fps), and that's a $399 SKU. The initial driver for 4K use will be SLR owners who have been having to display their 18, 21, 24, or 36 MP pictures at 2 MP resolution -- now they'd have ~8 MP of screen.
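The megapixel arithmetic checks out (a quick sketch):

    # Screen pixel counts in megapixels
    for name, w, h in [("1080p", 1920, 1080), ("4K UHD", 3840, 2160)]:
        print(name, round(w * h / 1e6, 2), "MP")
    # 1080p  ~2.07 MP -- where those 18-36 MP photos get shown today
    # 4K UHD ~8.29 MP -- the "~8 MP of screen" mentioned above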

No, the question becomes what they will name 8K screens in 15 years... Super Ultra HD? 8K may be the endgame for the technology as we know it now -- it was demoed at the Olympics covering the swimming events, and people described it as damn near reality. As nice as current HD is with a good signal, you still wouldn't mistake it for the real thing.


RE: PC Monitors
By titanmiller on 10/22/2012 6:18:30 PM , Rating: 2
I assume it will be UHD 4K and UHD 8K.


RE: PC Monitors
By Solandri on 10/19/12, Rating: 0
RE: PC Monitors
By Shadowself on 10/19/2012 4:18:14 PM , Rating: 5
You're clearly a person who knows nothing about resolution and what is perceived by a person (or even a non-human digital sensor, for that matter) in the real world. You've read somewhere the brain-dead simple explanation that the *average* person's "pixel" resolution (based upon an individual rod or cone) is approximately one arc minute. You've bought into the lame excuse for reality that anything at a higher resolution than that one arc minute is 100% useless.

In a single word: Bullshit.

There are many, many levels of perception that the average person "sees" that go well beyond the basic one-arc-minute-per-pixel fallacy. Look up vernier resolution as just one of *many* examples of higher perceived resolution. Some levels of perception by people are as much as 10 to 30 times better than that one arc minute per pixel.

Therefore the "you have to be X feet away to see Y resolution" is off by at least a factor of 10. Before you lose *any* perceived effect/benefit of a 4K screen that is 50" diagonal you have to be over 16 feet away.

The argument really comes down to, "Is the additional perceived resolution/clarity of the imagery worth it to ME?" That is a personal response based upon each individual's eyesight, values, and willingness to pay for it. But to say that you have to be two feet away from a 50" screen to see any benefit of 4K is absolutely not true.
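For anyone who wants to run the numbers in this back-and-forth themselves, here is a rough sketch (Python, simple trigonometry, 16:9 assumed) of the baseline distance at which a single pixel subtends one arc minute; scale it by whatever acuity factor you find credible:

    import math

    def one_arcmin_distance(diag_in, h_pixels, aspect=(16, 9)):
        # Distance (inches) at which one pixel subtends one arc minute
        w, h = aspect
        width_in = diag_in * w / math.hypot(w, h)   # screen width from diagonal
        pitch_in = width_in / h_pixels              # width of a single pixel
        return pitch_in / math.tan(math.radians(1 / 60))

    for name, px in [("1080p", 1920), ("4K", 3840)]:
        d = one_arcmin_distance(50, px)
        print(name, round(d), "in =", round(d / 12, 1), "ft")
    # 1080p: ~78 in (6.5 ft); 4K: ~39 in (3.3 ft) on a 50-inch screen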

And this is true on a non-human, non-personal level too. The classic example is the GOES satellites. Their sensors take a picture of the Earth at 1 km resolution at nadir (straight down). However, they routinely image the bridge going across Lake Pontchartrain. Clearly that bridge is nowhere near one kilometer wide (in reality it is under 0.1 km wide!), yet it shows up in GOES imagery! And the resolution of the GOES sensor is worse than 1 km per pixel at that latitude!

If the dumb GOES sensor can image a < 0.1 km object with > 1 km pixels then it must be clear (pun intended!) that the human mind with human vision at one arc minute per "pixel" in our vision sensors can see much, much better than that.


RE: PC Monitors
By Shadowself on 10/19/2012 6:21:11 PM , Rating: 2
I guess rating this way down is because reality is more difficult to understand than simple, wrong charts!

Too many people would rather believe the simple -- and wrong -- writings by pundits posting crap on the 'net, repeated by lots and lots of people, than pay attention to what's really there.

If enough people repeat it then it must be fact, right?

How many of the people posting and re-posting and re-re-posting the inaccurate information have actually designed real-time optical sensors for satellites? How many of these people have designed optical sensors and imagery interpretation systems for the military? How many of them have done research into optical imagery perception, including such things as edge effects and the super-resolution effects of motion imagery (among others)?

People can continue to rate these posts down, but people need to know what reality is. It is not what is most often being posted and re-posted across the 'net.


RE: PC Monitors
By macca007 on 10/19/2012 8:05:13 PM , Rating: 3
Thank you, spot on!
The same applies to this 30 fps bullshit; I am getting sick of the argument that you don't need, or can't tell, the difference above 30 FPS. Bullshit I can't.
If anyone out there is a gamer, especially of FPS games, I bet they will notice the difference playing on a 120Hz monitor compared to a 60Hz screen, let alone 30 fps. The more you play or use a PC, the more you start picking up the slightest details, be it resolution or speed.


RE: PC Monitors
By Solandri on 10/20/2012 3:26:53 AM , Rating: 2
Looks like the videophiles are out in force. I don't deny there will be a market for 4k TVs, just like there's a market for $5000 stereo components. I'm just saying for the typical TV buyer they'll be useless. Two of the major TV stations in LA broadcast in 1080i, the other two broadcast in 720p. I've challenged many people to tell me which two are 1080 and which are 720. Only one has been able to say correctly, and he admitted he was just guessing.

The 1 arc-minute resolution figure is derived from the Rayleigh limit. At a typical pupil diameter of 3 mm, the minimum separation before two objects blend into one is about 1/100th of a degree, or roughly two-thirds of an arc-minute. Even if you had a perfect retina, you can't distinguish objects separated by less than this. It's the optical limit of your eye's lens.
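Those figures are easy to reproduce (a sketch assuming 550 nm green light):

    import math

    wavelength = 550e-9   # green light, metres (assumed typical)
    pupil = 3e-3          # pupil diameter, metres

    theta = 1.22 * wavelength / pupil      # Rayleigh criterion, radians
    print(math.degrees(theta))             # ~0.0128 degrees
    print(math.degrees(theta) * 60)        # ~0.77 arc-minutes
    # Without the 1.22 factor (theta = lambda / D) it comes to ~0.63
    # arc-minutes, matching the two-thirds figure quoted above.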

Having binocular vision as well as being able to scan helps improve slightly over this. But we're talking about levels of detail where you'd have to strain to see differences. This is something that's relevant if you're examining scientific or crime scene photos. Not something you'll be doing while watching TV. Feel free to think you can see 10x higher resolution than this. Next time you're using a telescope try magnifying 10x higher than the theoretical max magnification and see if it really helps.

quote:
Their sensors take a picture of the Earth at 1 km resolution at nadir (straight down). However, they routinely image the bridge going across Lake Pontchartrain. Clearly that bridge is no where near one kilometer wide (in reality it is under 0.1 km wide!) yet it shows up in GOES imagery! And the resolution of the GOES sensor is worse than 1 km per pixel at that latitude!

... If that's the basis for your misguided beliefs, I don't even know what to say. Please read up on some sampling theory.

Having a resolving limit of 1 km doesn't mean you can't detect objects smaller than 1 km. It means you can't distinguish between objects smaller than 1 km. That is, if you imaged a 300m ship with the same reflectance characteristics as the bridge (same brightness and color), you wouldn't be able to distinguish it from the bridge. For the purposes of the GOES sensor, all objects 1km or smaller with the same reflectivity are the same.

In TV terms relevant to this discussion: at a 6-foot viewing distance, a pixel smudge on a 50" 1080p screen would be indistinguishable from 2x2 pixels on a 50" 4K screen. The additional detail is there on the 4K screen. It's just that, like the GOES sensor, your eye cannot tell the 1080p smudge pixel apart from the sharper 2x2 pixels on the 4K screen.
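That detection-versus-resolution distinction can be shown with a toy model (assumed reflectance values, Python):

    # One 1 km GOES pixel modelled as the average of ten 0.1 km ground cells
    water = [0.05] * 10    # dark lake reflectance
    scene = list(water)
    scene[4] = 0.60        # a 0.1 km bright bridge in one cell

    mean = lambda cells: sum(cells) / len(cells)
    print(mean(water))     # 0.05  -- pixel over open water
    print(mean(scene))     # 0.105 -- pixel containing the bridge is brighter

    # The bridge is detected (the pixel brightens) but not resolved: any
    # sub-pixel object with the same area-times-reflectance product would
    # produce exactly the same pixel value.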


RE: PC Monitors
By DarkUltra on 10/22/2012 4:25:45 PM , Rating: 2
It's not just about the number of pixels and cone cells, but about pixel blending and aliasing. What we need is a proper "blind" test where the viewers do not know what resolution they are looking at.

Like this one, but with pixels instead of framerate:
http://www.youtube.com/watch?v=a2IF9ZPwgDM


RE: PC Monitors
By chemist1 on 10/20/2012 9:24:57 PM , Rating: 2
I'd say you're pretty close to being right, for most TV viewers, when it comes purely to the effect of the pixel count (that 1080p provides sufficient pixel density). Though for those with a home theater, a little more resolution would help. I have a 98" diagonal screen; to achieve the THX-recommended HDTV viewing angle of 36 degrees (for an immersive experience), I sit 11' away. At that distance, obtaining a resolution of 1 arc minute would require a 13% higher pixel density (in each dimension) than what HDTV currently provides (plus slightly improved optics on my projector). And for those who want larger viewing angles, for an even more immersive movie experience, THX recommends 40 degrees as the ideal, which corresponds to human horizontal vision; however here resolution is more noticeably compromised with 1080p.
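Those viewing-geometry numbers can be reproduced with a little trigonometry (a sketch; the ~16% it prints for the density shortfall is in the same ballpark as the 13% above, the difference presumably coming from slightly different assumptions):

    import math

    diag = 98.0                               # screen diagonal, inches
    width = diag * 16 / math.hypot(16, 9)     # ~85.4 in for a 16:9 screen

    # Seating distance for the THX-recommended 36-degree viewing angle
    distance = (width / 2) / math.tan(math.radians(36 / 2))
    print(round(distance / 12, 1))            # ~11.0 ft

    # 1080p pixel pitch vs. the span of one arc minute at that distance
    pitch = width / 1920                                 # ~0.0445 in
    arcmin = distance * math.tan(math.radians(1 / 60))   # ~0.0382 in
    print(round(pitch / arcmin - 1, 2))                  # ~0.16 shortfall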

But the real problem is that even mere 1080p (never mind 4K) is not adequately delivered by the current HDTV standard. First, to fit on Blu-ray, the video is compressed by roughly 50:1 to 100:1 versus a raw 1080p camera feed. The compression is very sophisticated, and much of the image is static from frame to frame and doesn't need to be re-encoded each time, but it is nevertheless not visually lossless. Further, HDTV doesn't give us a full color space (relative to what the eye can distinguish), and provides only 8 bits of color depth, which is easily distinguishable from 10 bits. Additionally, because of chroma subsampling, we don't get full 1080p resolution in the individual color channels -- on my projector, I can see the difference between a 1080p 4:4:4 image from a JPEG and a 1080p 4:2:0 image from a Blu-ray. So the 1080p standard is in need of updating, irrespective of the issue of pixel density. For more information, you can read: http://hometheaterreview.com/high-definition-we-ha...
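Rough numbers behind the compression and chroma-subsampling claims (a sketch; actual feed formats and bitrates vary by disc and encoder):

    # Compression vs. a raw 8-bit RGB 1080p feed at an assumed ~40 Mbps
    # Blu-ray video bitrate
    raw_bps = 1920 * 1080 * 24 * 24     # 24 bits/pixel at 24 fps ~ 1.19 Gbps
    print(round(raw_bps / 40e6))        # ~30:1 at 24 fps; ~75:1 at 60 fps,
                                        # approaching the 50:1-100:1 quoted

    # 4:2:0 keeps full-resolution luma but quarter-resolution chroma planes,
    # i.e. half the samples of 4:4:4
    print((1 + 0.25 + 0.25) / 3)        # 0.5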

That's not to say they should have done things differently -- given the bandwidth limits of Blu-ray, significant compromises had to be made. My point is rather that, looking forward to the "Ultra HD" standard, there's a lot more to it than just more pixels -- my understanding is that it has a full color space and a higher color bit depth, both of which are needed (I'm not sure what they are doing about chroma subsampling). But to deliver all this without suffering even more compression loss than we have now with Blu-ray, they are going to have to significantly widen the pipeline and further improve the compression schemes. If they don't, Ultra HD will not live up to its potential -- thus some feel we'd get better quality if less of the bandwidth were devoted to increased pixel count and more to these other areas. Myself, I'd like to see increased pixels, bit depth, and color space, plus less compression loss, but it remains to be seen how well they can deliver all that.

Finally, note that Blu-Ray is the best a consumer can get -- most 1080p sources have less bandwidth than Blu-Ray, and are thus even more compromised. So the other huge challenge with Ultra HD will be how to get the non-optimal sources (cable, streaming) up to snuff.


"If you mod me down, I will become more insightful than you can possibly imagine." -- Slashdot











botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki