4K resolution hardware will officially be called Ultra HD

The Consumer Electronics Association (CEA) has announced the official name for the next generation of 4K high-definition display technology: "Ultra High-Definition," or "Ultra HD." According to the CEA, the name is intended to convey the new format's superiority over conventional HDTV.

The CEA Board of Industry Leaders voted unanimously this week to recommend the name for the next-generation HD resolution. Along with agreeing on a name, the CEA also outlined minimum performance characteristics to help consumers and retailers understand the benefits of the new technology, set to begin rolling out this fall.

“Ultra HD is the next natural step forward in display technologies, offering consumers an incredibly immersive viewing experience with outstanding new levels of picture quality,” said Gary Shapiro, president and CEO, CEA. “This new terminology and the recommended attributes will help consumers navigate the marketplace to find the TV that best meets their needs.”

The core characteristics that the CEA agreed on include a minimum display resolution of at least 8 million active pixels, with at least 3840 horizontal pixels and at least 2160 vertical pixels.

To meet the minimum specification, a display must have an aspect ratio of at least 16:9. Devices meeting the specification will also be required to have at least one digital input capable of carrying and presenting native 4K-format video at 3840 x 2160 resolution without relying on upconversion.

Source: CE.org



Comments

PC Monitors
By Lazarus52980 on 10/19/2012 9:26:12 AM , Rating: 2
I wonder what impact this will have on the PC monitor market? Now that I think about it... I wonder if this will impact the Xbox 720 and PS4. This could actually have a good trickle down effect on PC gaming as a whole.




RE: PC Monitors
By mcnabney on 10/19/12, Rating: -1
RE: PC Monitors
By Vinny141 on 10/19/2012 9:48:45 AM , Rating: 5
I don't think Xbox "screwed up".

4K TVs won't be mainstream, I'm betting, for another 5-10 years.

Xbox 360 did pretty well without a Blu-ray drive.


RE: PC Monitors
By kattanna on 10/19/2012 11:08:15 AM , Rating: 5
quote:
4K TVs wont be mainstream, i'm betting, for another 5-10 years.


You still have a LOT of people who don't even have a 1080p TV right now.. and they are relatively cheap. Oh, also remember that the first 1080p TVs came out way back in 2005.

These new 4K TVs will be WOW expensive for years.. and everything will have to be upconverted from 1080 for a long time.

Then think of current cable/sat TV services and their existing compression to feed you 1080i/p; they will be very hard pressed to handle such feeds.


RE: PC Monitors
By inperfectdarkness on 10/20/2012 2:56:35 AM , Rating: 2
This. There will be so very little media to compel consumers to upgrade to "ultra-HD" that it will take years before it becomes a mainstream standard. Heck, I don't think 1080p is quite standard yet, since you can still find some TVs that don't support it.

Furthermore, games have stagnated at being programmed for 1080p (or lower, and being upscaled) for several years now. As of this post, I know of no game that even comes close to being programmed for 8Mp rendering. 2560x1600 has HALF the display resolution of this standard... and it's still twice the resolution of 1080p. Final Fantasy: The Spirits Within is probably the highest-definition source material already available for >HD viewing, and it supposedly only has twice the HD resolution as well, putting it on par with WQXGA.

I do have hopes that pushing the resolution envelope will eventually spread across the entire tech spectrum and improve the quality of media offered -- but I also know that without some mandated push by the government, it will trickle down very, very slowly. HD was somewhat easier to push because the switch to digital broadcasting was mandated. Pushing Ultra-HD isn't going to be nearly as easy -- especially with virtually no media currently being produced at that magnitude of resolution.


RE: PC Monitors
By rdhood on 10/19/2012 11:13:15 AM , Rating: 1
quote:
4K TVs wont be mainstream, i'm betting, for another 5-10 years.


I think you are right, though your timeline is a little far out there.

If this unfolds a lot like 1080p & 3D, then 4K TVs will come first... THEN 4k content will begin to trickle in.... THEN 4k TVs will drop in price and start to become mainstream as people anticipate the arrival of much more 4k content, THEN the 4k content arrives in huge amounts, and somewhere along the line xboxXXX will support 4k.

1080p sets were introduced around 2005. It took another 3 years before enough 1080p content became available to finally push up sales of 1080p sets. The same type of thing is happening with 3D, though most people view 3D as a "want" rather than a "need".

5 years for 4k seems about right.


RE: PC Monitors
By PrinceGaz on 10/19/2012 12:47:46 PM , Rating: 1
The jump from 480 (NTSC) or 576 (PAL) lines at 720 columns up to 1920x1080 full HD was a major improvement everyone could see, assuming decent eyesight.

I'm not convinced that 4K is really going to make any difference to most people's perception of the quality of what they are watching, and it is likely to go the same way as 3D TVs -- something nice to have, but not worth paying extra for. As such, adoption is likely to be very slow, slower even than that of 3D TVs, because you will also need a whole new BD player/cable/satellite box to provide the 4K video. I wouldn't be surprised if 4K or UltraHD BD discs are also more expensive than normal BD discs, which are still more expensive than DVDs.

I'd love "retina"-resolution PC monitors to become readily available at affordable prices though.


RE: PC Monitors
By bsd228 on 10/19/2012 2:52:35 PM , Rating: 2
The just-announced GoPro Hero3 will record 4K (albeit only at 15fps), and that's a $399 SKU. The initial driver for 4K use will be SLR owners who have been having to display their 18, 21, 24, or 36MP pics at 2MP resolution -- now they'll have ~8MP of screen.

Now the question becomes what they will name 8K screens in 15 years... super ultra HD? 8K may be the end game for technology as we know it now -- they demoed it at the Olympics covering the swimming events, and people described it as damn near reality. As nice as current HD is with a good signal, you still wouldn't mistake it for looking at the real thing.


RE: PC Monitors
By titanmiller on 10/22/2012 6:18:30 PM , Rating: 2
I assume it will be UHD 4K and UHD 8K.


RE: PC Monitors
By Solandri on 10/19/12, Rating: 0
RE: PC Monitors
By Shadowself on 10/19/2012 4:18:14 PM , Rating: 5
You're clearly a person who knows nothing about resolution and what is perceived by a person (or even a non human digital sensor for that matter) in the real world. You've read somewhere the brain dead simple explanation that the *average* person's "pixel" resolution (based upon an individual rod or cone resolution) is approximately one arc minute. You've bought into the lame excuse for reality that anything at a higher resolution than that one arc minute is 100% useless.

In a single word: Bullshit.

There are many, many levels of perception that the average person "sees" that go well beyond that of the basic one-arc-minute-per-pixel fallacy. Look up vernier resolution as just one of *many* examples of higher perceived resolution. Some levels of perception by people are as much as 10 to 30 times better than that one arc minute per pixel.

Therefore the "you have to be X feet away to see Y resolution" is off by at least a factor of 10. Before you lose *any* perceived effect/benefit of a 4K screen that is 50" diagonal you have to be over 16 feet away.

The argument really comes down to, "Is the additional perceived resolution/clarity of the imagery worth it to ME?" That is a personal response based upon each individual's eyesight, values, and willingness to pay for it. But to say that you have to be two feet away from a 50" screen to see any benefit of 4K is absolutely not true.

And this is true on a non-human/non-personal level too. The classic example is the GOES satellites. Their sensors take a picture of the Earth at 1 km resolution at nadir (straight down). However, they routinely image the bridge going across Lake Pontchartrain. Clearly that bridge is nowhere near one kilometer wide (in reality it is under 0.1 km wide!) yet it shows up in GOES imagery! And the resolution of the GOES sensor is worse than 1 km per pixel at that latitude!

If the dumb GOES sensor can image a < 0.1 km object with > 1 km pixels then it must be clear (pun intended!) that the human mind with human vision at one arc minute per "pixel" in our vision sensors can see much, much better than that.


RE: PC Monitors
By Shadowself on 10/19/2012 6:21:11 PM , Rating: 2
I guess rating this way down is because reality is more difficult to understand than simple, wrong charts!

Too many people would rather believe the simple -- and wrong -- writings by pundits out there posting crap on the 'net, with lots and lots of people repeating it, rather than pay attention to what's really there.

If enough people repeat it then it must be fact, right?

How many of the people posting and re-posting and re-re-posting the inaccurate information have actually designed real-time optical sensors for satellites? How many of these people have designed optical sensors and imagery interpretation systems for the military? How many of these people have done research into optical imagery perception, including such things as edge effects and super-resolution effects of motion imagery (among others)?

People can continue to rate these posts down, but people need to know what reality is. It is not what is most often being posted and re-posted across the 'net.


RE: PC Monitors
By macca007 on 10/19/2012 8:05:13 PM , Rating: 3
Thank you, spot on!
The same applies to this 30fps bullshit. I am getting sick of the argument that you don't need, or can't tell, the difference over 30fps. BULLSHIT I can't.
Any gamer out there, especially with FPS games, will notice the difference playing on a 120Hz monitor compared to a 60Hz screen, let alone 30fps. The more you play or use a PC, the more you start picking up the slightest details, be it resolution or speed.


RE: PC Monitors
By Solandri on 10/20/2012 3:26:53 AM , Rating: 2
Looks like the videophiles are out in force. I don't deny there will be a market for 4K TVs, just like there's a market for $5000 stereo components. I'm just saying that for the typical TV buyer they'll be useless. Two of the major TV stations in LA broadcast in 1080i, the other two in 720p. I've challenged many people to tell me which two are 1080 and which are 720. Only one has answered correctly, and he admitted he was just guessing.

The 1 arc-minute of resolution is derived from the Rayleigh limit. At a typical pupil diameter of 3 mm, the minimum separation before two objects blend into one is about 1/100 of a degree, or about two-thirds of an arc-minute. Even if you had a perfect retina, you couldn't distinguish objects separated by less than this. It's the optical limit of your eye's lens.

Having binocular vision as well as being able to scan helps improve slightly over this. But we're talking about levels of detail where you'd have to strain to see differences. This is something that's relevant if you're examining scientific or crime scene photos. Not something you'll be doing while watching TV. Feel free to think you can see 10x higher resolution than this. Next time you're using a telescope try magnifying 10x higher than the theoretical max magnification and see if it really helps.

quote:
Their sensors take a picture of the Earth at 1 km resolution at nadir (straight down). However, they routinely image the bridge going across Lake Pontchartrain. Clearly that bridge is no where near one kilometer wide (in reality it is under 0.1 km wide!) yet it shows up in GOES imagery! And the resolution of the GOES sensor is worse than 1 km per pixel at that latitude!

... If that's the basis for your misguided beliefs, I don't even know what to say. Please read up on some sampling theory.

Having a resolving limit of 1 km doesn't mean you can't detect objects smaller than 1 km. It means you can't distinguish between objects smaller than 1 km. That is, if you imaged a 300m ship with the same reflectance characteristics as the bridge (same brightness and color), you wouldn't be able to distinguish it from the bridge. For the purposes of the GOES sensor, all objects 1km or smaller with the same reflectivity are the same.

In TV terms relevant to this discussion, at a 6 foot viewing distance, a pixel smudge on a 50" 1080p screen would be indistinguishable from 2x2 pixels on a 50" 4k screen. The additional detail is there on the 4k screen. It's just that like the GOES sensor, your eye cannot tell the 1080p smudge pixel apart from the sharper 2x2 pixels on the 4k screen.


RE: PC Monitors
By DarkUltra on 10/22/2012 4:25:45 PM , Rating: 2
It's not just about the number of pixels and cone cells, but about pixel blending and aliasing. What we need is a proper "blind" test where the viewers do not know what resolution they are looking at.

Like this one, but with pixels instead of framerate:
http://www.youtube.com/watch?v=a2IF9ZPwgDM


RE: PC Monitors
By chemist1 on 10/20/2012 9:24:57 PM , Rating: 2
I'd say you're pretty close to being right, for most TV viewers, when it comes purely to the effect of the pixel count (that 1080p provides sufficient pixel density). Though for those with a home theater, a little more resolution would help. I have a 98" diagonal screen; to achieve the THX-recommended HDTV viewing angle of 36 degrees (for an immersive experience), I sit 11' away. At that distance, obtaining a resolution of 1 arc minute would require a 13% higher pixel density (in each dimension) than what HDTV currently provides (plus slightly improved optics on my projector). And for those who want larger viewing angles, for an even more immersive movie experience, THX recommends 40 degrees as the ideal, which corresponds to human horizontal vision; however here resolution is more noticeably compromised with 1080p.

But the real problem is that even mere 1080p (never mind 4K) is not adequately provided by the current HDTV standard. First, to fit on Blu-Ray, there's compression of ~50:1 to 100:1 vs. a raw 1080p camera feed. This compression is done in a very sophisticated way, and much of the image is static from frame to frame and doesn't need to be reproduced each time, but this compression is nevertheless not visually lossless. Further, HDTV doesn't give us a full color space (relative to what the eye can distinguish), and provides only 8 bits of color depth, which is easily distinguishable from 10 bits. Additionally, because of chroma subsampling, we don't get full 1080p resolution of the individual colors -- on my projector, I can see the difference between a 1080p 4:4:4 image from a jpeg, vs a 1080p 4:2:0 image from a Blu-Ray. So the 1080p standard is in need of updating, irrespective of the issue of pixel density. For more information on this, you can read: http://hometheaterreview.com/high-definition-we-ha...

That's not to say they should have done things differently -- given the bandwidth limits of Blu-Ray, significant compromises had to be made. My point is rather that, looking forward to the "Ultra HD" standard, there's a lot more to it than just more pixels -- my understanding is that it has a full color space and a higher color bit depth, both of which are needed (not sure what they are doing about chroma subsampling). But to deliver this without suffering even more compression loss than what we have now with Blu-Ray, they are going to have to significantly increase the pipeline and further improve the compression schemes. If they don't, then Ultra HD will not live up to its potential -- thus some feel we'd get better quality from Ultra HD if we devoted less bandwidth to increased pixel count vs. other areas. Myself, I'd like to see both increased pixels/bit depth/color space, and less compression loss, but it remains to be seen how well they can deliver that.

Finally, note that Blu-Ray is the best a consumer can get -- most 1080p sources have less bandwidth than Blu-Ray, and are thus even more compromised. So the other huge challenge with Ultra HD will be how to get the non-optimal sources (cable, streaming) up to snuff.


RE: PC Monitors
By Digimonkey on 10/19/2012 10:02:54 AM , Rating: 2
quote:
it looks like Microsoft has really screwed up on the 720. We already know the GPU in the 720 is equivalent to a $50 GPU you can buy NOW.


Your source for this information is where?


RE: PC Monitors
By Kefner on 10/19/2012 10:23:55 AM , Rating: 2
I could have sworn I heard the new Xbox was going to support 4K and multiple screens. For the life of me I don't remember where I read that, but will look for it when I get home this evening (at work now, don't have time to search), unless someone here has seen it and can link to it.


RE: PC Monitors
By Milliamp on 10/19/2012 12:10:27 PM , Rating: 2
I think multiple screens is a really good idea. They could make a cheaper version that supports 1 screen and split screen and a more expensive version that supports 2 player games on 2 full sized screens.

They can't boost the hardware specs in a refresh but they could probably add support for a 2nd display.


RE: PC Monitors
By Milliamp on 10/19/2012 12:36:43 PM , Rating: 2
Great, a new version of Blu-ray already. Assuming an average 1080p stream is 5 Mb/s, 4K is going to need 20 Mb/s of throughput (not bandwidth).

I guess if there is one positive side to 4K (UHD), it is that there really isn't much/any need for anything after that for home use.


RE: PC Monitors
By kattanna on 10/19/2012 12:48:53 PM , Rating: 2
quote:
I guess if there is one positive side to 4k (UHD) it is that there really isn't much/any need for anything after that for home use.


i heard that from people when 1080 first came out..

LOL


RE: PC Monitors
By geddarkstorm on 10/19/2012 1:01:09 PM , Rating: 3
Why do we buy the mountain? Because it's there.


RE: PC Monitors
By Milliamp on 10/19/2012 2:45:15 PM , Rating: 2
Yes, but with 4K (UHD?) you have to be within 5 feet of an 80 inch screen, or 3 feet of a 60 inch screen, to notice, based on this article: http://carltonbale.com/does-4k-resolution-matter/

With that said, I find the calibration of that diagram pretty questionable. Based on it, you would have to be 15 feet from a 50" TV before the "full benefit" of 480p is visible. I think for many people these are living-room distances, and there is a huge difference between 480p and HD even if your eyesight is wonky.

I personally don't think I would notice much difference between 1080p and 4K on my 50" TV, but if I replace it with an 80" TV, or start using one as a PC monitor, I might.

If TV manufacturers need something else to set a goal for instead of racing to the bottom on pricing, I say go for it. Maybe by the time my kid is old enough to play games he will have a couple of 80" 4K TVs in his game room for his friends to play photo-realistic games on. This doesn't serve much purpose for average use, but it moves us a little closer to that.


RE: PC Monitors
By MrTeal on 10/19/2012 3:24:44 PM , Rating: 2
quote:
With that said I find the calibration of that diagram pretty questionable. Based on that you have to be 15 feet on a 50" TV before the "full benefit" of 480p is visible. I think for many people these are living room distances and there is a huge difference between 480p and HD even if your eye sight is wonky.


There's a big difference, but I think the benefit of going to 1080p was exaggerated for most people by all the other changes that were made at the same time. You're changing a lot of variables when you go from an SDTV signal over analog component cables to a tube TV, to a 1080p signal over digital cabling to a 1080p TV.

Try watching a 1080p Youtube video (particularly an animated one that was rendered at hi res like http://www.youtube.com/watch?v=XSGBVzeBUbk) at full resolution, and then change your monitor's resolution to VGA (640x480). The difference you'll see just from the resolution increase isn't nearly as large as you see going from something like SDTV sports to the HD version.

4K is going to be pretty underwhelming, IMO. People will be limited in the size of TVs they can use by their room, and the benefit of a 4K display on a 40-50" TV won't be a huge leap from 1080p without other features.


RE: PC Monitors
By Shadowself on 10/19/2012 4:35:04 PM , Rating: 2
The Blu-ray spec is that the video imagery is capped at 40 Mbps with audio capped at 8 Mbps.

The average over a movie with high-action sequences (think of things like the recent Avengers movie) in 1080p is likely to be < 20 Mbps (honestly, I haven't checked). However, high-action sequences with lots and lots happening in each frame will usually smack up against that 40 Mbps limit for several seconds. For these kinds of movies the post-production guys generating the Blu-ray versions spend a lot of time living within that cap while keeping the imagery free of apparent blocking.

If they implement H.265 rather than H.264 and go to this "4K" Ultra HD standard (as distinct from the Digital Cinema 4K standard), then they'll still need double the bandwidth (H.265 proponents claim that it requires, on average, half the bandwidth of H.264).

Therefore the "next generation of Blu-ray" [or whatever they end up calling it] will need a video bandwidth cap of at least 80 Mbps, with total storage of 100 GB. If they don't go with H.265 but stay with H.264, you need to double that -- 160 Mbps for video and 200 GB for the disc size.

And just so you know, there have been 200 GB working units in labs for over four years, and 1 TB working units for almost as long. They just haven't been commercialized because there really was no consumer need for them.

And to just stop any lingering arguments before they start... No, your satellite or cable provider does not deliver Blu-ray quality imagery. The imagery they provide is much more highly compressed. And yes, some people will claim they cannot tell the difference, but that is a personal level thing. It is not that there is no difference or that many people cannot tell the difference.


RE: PC Monitors
By Silver2k7 on 10/20/2012 2:59:49 AM , Rating: 2
There are recordable Blu-ray discs with 100GB and 128GB available right now, with 3 & 4 layers.

The question is whether those will be used as the new standard for the next generation of home video players, or whether there will be something else.


RE: PC Monitors
By Reflex on 10/19/2012 1:30:34 PM , Rating: 2
Yeah, ever since Microsoft announced the specs to the next Xbox I have felt much the same....er...wait a second, they haven't announced any new device or specs, have they? You are just talking out of your ass, aren't you?

Never mind then...


RE: PC Monitors
By aju on 10/19/2012 1:48:15 PM , Rating: 1
Most likely the PS4's support is similar to that found in current home theater receivers such as the Onkyo HT-RC460, which supports upscaling to 4K. Chances are that the game is rendered in 1080p by a similar $50 GPU and then upscaled to 4K. After all, it would be tough for a current high-end video card from Nvidia or AMD to pull off decent frame rates powering over 8.2 million pixels, and 4K has at a minimum 8,294,400 pixels. For example, the EVGA GeForce GTX 680 Classified can only manage a minimum frame rate of 29.7 fps in Crysis Warhead at 5760x1200, and that is only 6.9 million pixels. Throw in another 1.4 million pixels and you're asking for very poor performance. That card currently costs $559.00 on Newegg. There is no way Sony would put a $500.00 GPU in a PS4, and that is what it would take, even a year from now, to support true 4K rendering of games. Chances are it will just be a cheap upscaler chip. If that is the case, why pay $24,999.99 for a Sony XBR-84X900 display if you are not really seeing a true 4K rendered game?


RE: PC Monitors
By titanmiller on 10/22/2012 6:23:23 PM , Rating: 2
I personally would rather Microsoft and Sony hold off 24 months so that they can properly implement 4K rendering.


RE: PC Monitors
By BZDTemp on 10/19/2012 4:36:22 PM , Rating: 2
There is a chance the next-gen consoles will offer 4K. Though the current ones may be able to display 1080p, most games run at a lower res and are then upscaled, so even if 4K is supported it may be more in name than in actual resolution.

Anyway, I'm really hoping this will mean 4K monitors at 30" or thereabouts.

Considering I had a 21" (20" viewable) monitor giving me 2048x1536 over a decade ago, and that 30" 2560x1600* has been the max for some years now, it is high time we see a real improvement.

* Not counting the specialist monitors that cost $10,000 and more.


RE: PC Monitors
By BansheeX on 10/19/2012 4:58:15 PM , Rating: 2
What's ridiculous about your comment, and the desire for 4K gaming in general, is that 1080p games don't look like 1080p recordings of reality. That tells you that resolution isn't the problem; it's polygons and physics. We're still a long way from games looking real, and driving the resolution higher is simply going to take horsepower away from where it should be going.

Oh and I'll take a 1080p OLED over a 4k LCD any day of the week no hesitation. This is just like the 3D novelty all over again. To hell with LCD, we need a real improvement in display quality.


RE: PC Monitors
By BZDTemp on 10/20/2012 1:51:06 PM , Rating: 2
I think you need to cool down a bit.

When asking for 4K I in a way suggested that other improvements are also desirable. In fact, I actually raised the point that 4K support could be in name only with regard to the new consoles.

As for your 1080p OLED over a 4K LCD, all I can say is you should wait until the choice is possible before making up your mind. Right now your statement is just meaningless name dropping.


Finally!
By retrospooty on 10/19/2012 9:26:42 AM , Rating: 1
Hopefully now that it's official, manufacturers will get past this latest 3D push and move on to higher-res TVs. We have had 3D since the 1950s; it's neat for a while, but people just don't want it. It's irritating. Also irritating is the fact that by the end of the year we will be getting the same 1920x1080 res on our 5 inch phones as we do on our 80 inch TV sets. Way past time for a res bump.




RE: Finally!
By theapparition on 10/19/2012 10:12:03 AM , Rating: 5
I disagree. Monitors and handheld devices are used close to the face, where it's possible to resolve much higher resolution. Even so, 1920x1080 on a 5" screen is overkill. It's really only useful in that no interpolation or conversion from HD needs to be performed.

No one sits close enough to an 80" TV to see the bump in higher resolution. We've already gone through enough consumer confusion upgrading from analog to digital. We don't need another broadcast standard so quickly.

The next question is bandwidth. Broadcast bandwidth is already fixed, so UltraHD broadcasts would have to be even more compressed than current HD ones. Higher definition with lossier compression is not an improvement.

I'm a spec junky as much as the next guy, but 4K and UltraHD are a bit unnecessary right now.


RE: Finally!
By retrospooty on 10/19/2012 10:34:51 AM , Rating: 2
" Even so, 1920x1080 on a 5" screen is overkill."

Totally agree it's overkill on anything phone-related, but not on large-screen TVs. They aren't just for video anymore. I use my 42 inch 1080p as a second monitor and it's way noticeable, and I can see lines on my 55 inch 1080p. I agree it's not "necessary" -- TV itself isn't "necessary" -- but a res bump in large-screen TVs is long overdue. It was delayed many times over; I am just happy to see it coming.

Picture buying a mid-range 32-42 inch screen with 3840x2160 res and plugging your PC into it. NICE!


RE: Finally!
By theapparition on 10/19/2012 12:42:18 PM , Rating: 3
Ok, I get that argument. But to be quite honest, you can already get very large format monitors supporting those insane resolutions. Just be prepared to fork over a lot of coin to do so. What you're doing, very few other people do, too. And unless you move a chair to sit up close, the seating position is still the same and the resolved resolution isn't too terrible.

I have a few computers hooked up to the home TVs also, and agree it's a bit grainy, but nothing too bad. I would consider a 4K or higher TV if available. But only if the price is right.

But this UltraHD spec is talking about broadcast. It may be a chicken-and-egg scenario, where unless there's a standard no one is going to make a TV that supports it. But at the end of the day, I see this failing miserably just because it's too soon. The dust hasn't even settled on HDTV -- a good portion of the US populace hasn't even upgraded yet -- and we're already talking about a new standard.


RE: Finally!
By retrospooty on 10/20/2012 9:57:22 PM , Rating: 2
That's exactly it... until it's mainstream the price will be high. Once it's standard, it will drop regularly. So this is a good thing.


RE: Finally!
By Guspaz on 10/19/2012 10:59:47 AM , Rating: 3
I have an 80" screen with a 2K (1080p) projector on it, seated with my head 8 feet away. According to graphs indicating resolvability, even I and my too-close-to-giant-screen setup would see no benefit from anything above 2.5K (1440p).

Let's keep in mind that the vast majority of cinemas out there, with their hundred foot screens, are 2K. Only a few are 4K. And audiences generally can't tell the difference.

4K offers no benefit whatsoever to the average consumer; it's just a moneymaking scheme. I mean, yes, it's technologically superior, but it provides an identical experience at a higher cost.

100GB or 128GB quad-layer discs, on the other hand, offer tangible benefits. Fewer discs per pack means less space on shelves, or the ability to cram more stuff and bonus features onto a single disc.


RE: Finally!
By retrospooty on 10/19/2012 11:49:58 AM , Rating: 2
Plug it in to your PC and tell me how it looks.


RE: Finally!
By Azethoth on 10/19/2012 5:42:10 PM , Rating: 2
My local theater recently switched to a 4K projector. I can assure you I can tell a difference -- not with the picture, though, but with flicker. The old one had nasty flicker in white areas. Snow scenes were horrid, for example; it may as well have been a room with fluorescent lighting on the fritz.

The 4k system bumped frame rate or something because the white areas are now totally flicker free.

As for the picture itself, I used to be able to concentrate on pixels (I sit in the 2nd or 3rd row) and then see them. But unless you are doing that, your brain just seamlessly blends it all. When watching a movie you are too busy watching to stare at a particular patch of pixels.


RE: Finally!
By Milliamp on 10/19/2012 12:14:32 PM , Rating: 2
Now kids will have a real excuse for sitting too close to the TV: "But mom, I can't fully appreciate the resolution from the couch"


At least 16 x 9
By Jeremy87 on 10/19/2012 9:39:16 AM , Rating: 3
What does that even mean? "At least" in which direction?
So 3840x2160 is UHD, but only one of 3840x2400 and 5040x2160 is? And in that case, which one?




RE: At least 16 x 9
By mcnabney on 10/19/2012 9:46:45 AM , Rating: 2
I think the last line has a typo. It should be 3840x2160 NOT 2840x2160.


RE: At least 16 x 9
By DanNeely on 10/19/2012 9:48:46 AM , Rating: 2
I think it's safe to assume movies aren't going to go back to narrow aspect ratios; so if 16:9 is the minimum 5040x2160 would count but 3840x2400 wouldn't.


RE: At least 16 x 9
By Jeremy87 on 10/19/2012 9:52:43 AM , Rating: 2
But the allowed 3840x2160 is contained within 3840x2400.
How can something bigger than UHD not meet the minimum requirements for UHD?


RE: At least 16 x 9
By DanNeely on 10/19/2012 11:37:21 AM , Rating: 2
Because they define minimum horizontal resolution, minimum vertical resolution, and a minimum aspect ratio. Starting at the minimum for all three, you can't increase only the height without dropping below the minimum aspect ratio.


RE: At least 16 x 9
By Ammohunt on 10/19/2012 11:27:50 AM , Rating: 2
It means you have to buy a new TV yet again. HD -> HD3D -> UHD


RE: At least 16 x 9
By MZperX on 10/19/2012 11:56:39 AM , Rating: 3
I think the answer is simple. The term "at least" is meaningless in the context of aspect ratio. It's not a "more or less" characteristic like for instance battery life where more is clearly always better. Which aspect ratio is better depends on what the display is used for, and even when that's defined, the answer is highly subjective. For this reason, people have argued and will continue to argue until blue in the face about the "best" aspect ratio.


RE: At least 16 x 9
By Shadowself on 10/19/2012 6:09:52 PM , Rating: 2
This is probably to cover the case of the 4K Digital Cinema standard, which is 4096 x 2160. Such a screen would still qualify to be called UHD under these rules. My guess (purely a guess) is that since the 4K Digital Cinema standard has been around for a few years, these guys wanted their UHD definition to be "backwards compatible" and cover that older standard too.


In other news...
By djcameron on 10/19/2012 10:36:08 AM , Rating: 5
In other news, aging news anchors, everywhere, groan at the thought of another layer of spackle.




RE: In other news...
By Shadowself on 10/19/2012 6:27:28 PM , Rating: 2
I'd give this a 6 if I could.

This is by far the best comment (and truest) in this entire thread.


RE: In other news...
By futrtrubl on 10/20/2012 5:13:05 AM , Rating: 2
I read it as ageing nude actors. Still valid ;']


By DanNeely on 10/19/2012 9:51:05 AM , Rating: 2
... that by setting quad HD as the minimum pixel count for horizontal and vertical resolution, when they hammer 21:9 screens down our throats they'll have to do it by boosting horizontal pixel counts, not by cutting vertical ones.




By Guspaz on 10/19/2012 11:06:46 AM , Rating: 2
Movies are rarely wider than 1.85 (37:20), so why would they try to hammer 21:9 (2.333) down our throats when there's virtually no content for it?

16:9 makes sense since significant amounts of content are in that aspect ratio.


By Shadowself on 10/19/2012 6:25:51 PM , Rating: 2
They were rare, but some of the old ultra-widescreen movies were as high as 2.35:1. I don't know anyone doing that format anymore.


By DanNeely on 10/19/2012 7:39:09 PM , Rating: 2
Because the TV industry has become financially dependent on getting consumers to buy the Next Big Thing every few years to keep sales up. Sooner or later wider is going to work its way to the top of the list instead of being an oddball niche product (e.g. the 2560x1080 TV someone launched a year or two ago), and the TV industry drags the computer screen industry around like a dog on a leash.

The use of diagonal sizing will make this more attractive to the panel makers than the average bogo-upgrade, because Joe Sixpack won't realize that a 60" 2.33:1 TV is actually a smaller screen (by area) than a 58" 16:10 TV (in area, the 60" widescreen is equivalent to only a 53" 16:10 set). As a result they can charge the same price per diagonal inch -- the only number Joe will look at -- and boost their profit margins, since they're getting more screens out of the same amount of glass/substrate.


Next up
By spamreader1 on 10/19/2012 10:41:31 AM , Rating: 5
Ludicrous HD




RE: Next up
By TSS on 10/20/2012 8:57:44 AM , Rating: 2
M-m-m-monster HD.


mainstream first
By Vinny141 on 10/19/2012 9:52:53 AM , Rating: 2
Great, we just need the TV/movie content to match this now.

I'm betting 5-10 years minimum before this is mainstream; closer to ten.

How long did it take HD TVs to go mainstream with prices/content/Blu-ray, etc.?




RE: mainstream first
By retrospooty on 10/19/2012 10:37:40 AM , Rating: 2
Yes, but if it doesn't start, that end will never come. At least it's starting. Way too late, if you ask me. The industry was pushing this 3-4 years ago and then steered toward the waste-of-time 3D TVs we have seen. Now that they are finally getting that people just don't like 3D long-term, they are back to pushing higher res.


4K is old already, 8K is here!
By Blood1 on 10/19/2012 10:17:29 AM , Rating: 2
By geddarkstorm on 10/19/2012 1:03:25 PM , Rating: 2
I admit, that second link does look pretty dang good. If one had a TV that size.


Source Content?
By dashrendar on 10/19/2012 11:38:43 AM , Rating: 2
Most of you are talking about video games and PCs. What about actual TV broadcasting and movies? What drove HD to become mainstream when it came out: the availability of TV sets, or the content that was viewed on them?

Will content providers have issues with the increased resolution in terms of bandwidth? They already compress the hell out of these video streams.




RE: Source Content?
By Pneumothorax on 10/19/2012 12:00:25 PM , Rating: 2
Very valid question. They're going to have to either: 1. compress video even more than the already HORRID rates (yes, I'm talking to you, TWC) -- even at 1080i the artifacts on broadcast HDTV over cable are nasty; 2. lay down new fiber/copper lines; or 3. severely reduce the number of available channels.

Now that more USA cable companies like to impose bandwidth/download restrictions, a 4K stream would have you hitting a 50GB monthly limit in no time at all.


Consumer adoption?
By danjw1 on 10/19/2012 11:44:06 AM , Rating: 2
Who actually thinks that any but a few want this? Most people still haven't even switched to Blu-ray. Why do these crazy people think they are going to be able to sell these sets to any but a few who have more money than sense?




RE: Consumer adoption?
By dlmartin53 on 10/20/2012 5:09:41 PM , Rating: 2
Very true. It is like 3D: every once in a while they roll it back out and try to sell it to the masses, but the masses don't see why they need it. If we had not stopped analog transmission and eliminated 4x3, it would still be the norm.

People on sites like this like technology; the rest of the world does not have a clue what we're talking about, or even care.
They buy it if the ads for it are compelling enough, but most have it because we took away the old stuff. I am all for higher resolution, but the manufacturers are only looking for the next cash cow down the road. If they don't get it ready, you can't push it on the masses.
Several comments on the PS3 and Xbox-whatever all point to how far the software lags behind our tech. Programmers are still tweaking game console output, consoles that are dinosaurs compared to the current PCs an enthusiast can put together.
And the transmission quality of our 1080p still sucks. You only get really good quality on physical media. I love all this stuff, but I don't yet do Blu-ray. I would like to, but most of the content I watch is lousy compressed digital satellite or cable, artifacts and all. And I find very few in my family who even seem to mind it, or even know what they're seeing.
That is why 4K is 5 to 10 years off: we care, but the vast majority do not have a clue or want the next step up.


Monster Cables may make a comeback!
By JediJeb on 10/19/2012 3:13:07 PM , Rating: 4
Just think: Now Monster Cables will be needed to handle all that extra bandwidth these TVs will require. You absolutely have to have the new "thick as your arm" Monster Cable or the picture on your new TV will simply look the same as the one on your old TV. Don't be the laughing stock of the neighborhood, make your new UHD TV perform better than all the others with our new HyperUltraBandwidth Monster Cables!




By bupkus on 10/19/2012 4:35:42 PM , Rating: 2
The presumption is that businesses get a tax break when investing in R&D because it reduces the risk of staying out front with advancing technology and business methods.

When consumers make more money they either save or buy bigger and better shit. Some actually do save and even use that money to advance their own education.

So if I am to argue in favor of the middle class, on what do I base my arguments? Can I argue that the middle class needs super fuckin' high-res TVs? It's beyond embarrassing. That kind of resolution may be great for communicating medical imaging, but should we consumers lust for this kind of crap? And before you give me the argument of how hard you work and that it's none of my business -- which, by the way, I agree with -- let me compare this to the obese citizen who eats 2x his normal intake; with 90" TVs, at least, you don't show your overindulgence when wearing a bathing suit.

So... I know I don't need or even want that. Should I think you should follow my lead? Hell no. When you lose your job and need to sell your shit at almost give-away prices, I'll be there pickin' it up.

Okay, what's my point?
Hey, this is DT; I don't need a point.

P.S. Deleted




By Shadowself on 10/19/2012 6:44:42 PM , Rating: 2
Of course no one needs a 4K television. Hell, my life would probably go on just fine with a 720p or 480i television or even the 525 line analog TV (NTSC's original format) -- or maybe even no television at all (at least I would not be watching all those political ads!).

4K or 8K monitors for things like medical x-rays? Yes, they're probably needed there. Believe it or not, there are limited cases where 300k x 300k sensors are needed.


Can someone explain it to me?
By inperfectdarkness on 10/20/2012 3:17:57 AM , Rating: 2
Why is an 8Mp picture described as "4k"? Where do they get 4,000 pixel horizontal resolution from a format that's 3840x2160? Why not call it 8Mp or 2160p?

Better yet, why not just go with WQUXGA and tell the 16x9 guys to shove it?




By delphinus100 on 10/20/2012 2:08:42 PM , Rating: 2
Yeah, just try to do marketing with 'WQUXGA...'


Sure I buy a 4K TV
By Nutzo on 10/19/2012 3:52:42 PM , Rating: 1
Just what I need: a new 4K TV so I can watch scaled-up 480i content, overly compressed streaming video, or bandwidth-compressed 1080i TV from my cable company, full of major compression artifacts.

No thanks. Guess I'll wait until they bring out Super-Ultra-HD TVs. Maybe by then we'll actually have decent 1080p content.




RE: Sure I buy a 4K TV
By delphinus100 on 10/20/2012 2:06:31 PM , Rating: 2
Display technologies running way ahead of media storage or distribution bandwidth?

I'm not surprised. Think of it as the anti-bottleneck...


By Gungel on 10/19/2012 9:55:52 AM , Rating: 2
I just read a story this morning about the 4K OLED TVs from Samsung and LG that should have launched this month. Unfortunately, they've been delayed for another 6 months.




Of course they do... .
By damianrobertjones on 10/19/2012 11:16:45 AM , Rating: 2
We've now passed the wave of HD, and 3D is nearly old hat, so how else do they start the next range of updates to maximise profit for the industry? Welcome... 4K.




Ultra High-Definition
By Fastyle on 10/19/2012 11:25:30 AM , Rating: 2
When I went to film school back in the late 90s, before we had HD, the industry was establishing the standards. I remember discussing the then-"new format" and how one manufacturer (I believe it was Mitsubishi or Hitachi) was developing a format with 3000 lines of resolution; it was ultimately rejected due to all the work/money that had already been put into 1080!




Relevance
By crazy1 on 10/19/2012 1:12:57 PM , Rating: 2
4K displays will become relevant as more touchscreens are integrated into the home. Imagine your coffee table being a giant tablet computer. Imagine putting a 40 inch TV in your kitchen for entertainment and cooking instructions. Increasing displays to 4K will be useful in these applications, and more.




Super Mega Hyper Ultra High-Definition
By log on 10/19/2012 7:00:33 PM , Rating: 2
I never really got this thing with the names. Why not just call it 4K?

What are they going to call it when 8k comes around?




So what?
By TechIsGr8 on 10/22/2012 12:35:43 PM , Rating: 2
The source material barely gets to 1080p, with all the help from compression schemes that render a "hi-def" image something less than stellar. So now Comcast, Dish, AT&T, TW, and others will have to figure out how to compress even more, to fill out one of these new sets? I don't see it happening any time soon.




planned obsolescence
By superstition on 10/22/2012 3:06:21 PM , Rating: 2
Consumerism is speeding up. People have become so socialized to dispose of electronics that even the iPad 1 is now "obsolete" according to Apple and won't receive any software updates.

1080p broadcasts are hard to find and yet we're supposed to abandon our current HD sets in favor of yet another resolution increase in short order (compare to how long SD was around).

This isn't good for the environment. And, like it or not, every one of you lives in that environment. Every product standard that demands the replacement of a previous product needs to be looked at with a cost/benefit ratio that includes environmental impact. But, people act like mice on wheels and they spin that wheel really quickly to get the latest in unnecessary tech.

Just think of how clear the moles will be on people's faces with these new 4K sets!




"Paying an extra $500 for a computer in this environment -- same piece of hardware -- paying $500 more to get a logo on it? I think that's a more challenging proposition for the average person than it used to be." -- Steve Ballmer











botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki