
Sony's 4K TVs get priced

Sony has announced official pricing and availability for its new XBR 4K Ultra HD LED TVs. The line includes the XBR-55X900A and the XBR-65X900A, with screen sizes of 55 inches and 65 inches, respectively. While some other manufacturers have offered Ultra HD television sets at prices ranging all the way up to $20,000 or more, Sony is offering comparatively “reasonable” prices.

The 55-inch TV will sell for $4,995, with the 65-inch version going for $6,999. Both TVs will be available for pre-order on April 21, but the final shipping date is unannounced. Along with pricing and launch information for the TVs, Sony also unveiled its 4K media player, the FMP-X1. The device will deliver movies and video shorts in 4K resolution for $699, and will be available later this summer.

55" XBR-55X900A
The media streamer itself will come bundled with 10 feature-length films and users will be given access to a fee-based distribution service offering a library of titles from Sony Pictures Entertainment and other production studios. The films that are included with the purchase include Bad Teacher, Battle: Los Angeles, The Bridge on the River Kwai, The Karate Kid (2010), Salt, Taxi Driver, That's My Boy, The Amazing Spider-Man, The Other Guys and Total Recall (2012).

FMP-X1 4K Media Player

Sources: Sony [1], [2]



Dear computer monitors
By BRB29 on 4/8/2013 10:06:07 AM , Rating: 5
Please catch up soon. I'm sick of 1080p monitors when I have it on my phone.

RE: Dear computer monitors
By safcman84 on 4/8/13, Rating: 0
RE: Dear computer monitors
By BRB29 on 4/8/13, Rating: 0
RE: Dear computer monitors
By lagomorpha on 4/8/13, Rating: 0
RE: Dear computer monitors
By BRB29 on 4/8/2013 10:52:36 AM , Rating: 2
what motherboard? it was done on a Radeon GPU. Cost should be more than negated by several die shrinks.

There are people running 5760 x 1080 already and that's much higher than 4k.

There are 2 options
1. Faster memory
2. wider bus

Manufacturers will figure out which one is more cost effective. Don't worry, they've made billions off you from years of price fixing already.

RE: Dear computer monitors
By BRB29 on 4/8/2013 10:54:49 AM , Rating: 2
I was wrong. 4K UHD is 3840x2160.

RE: Dear computer monitors
By karimtemple on 4/9/13, Rating: 0
RE: Dear computer monitors
By lagomorpha on 4/8/2013 11:00:47 AM , Rating: 1
Until stacked memory becomes available, making the bus wider means running more traces on the PCB which is something that does not get much cheaper as transistors get smaller.

RE: Dear computer monitors
By BRB29 on 4/8/2013 11:27:49 AM , Rating: 1
Stacked memory is used to lower cost and increase memory density; it was not really meant to increase bandwidth. The chips sit on top of each other, which hurts their ability to shed heat because of the lower surface-area-to-air ratio. It's not meant for high-performance applications yet.

RE: Dear computer monitors
By inighthawki on 4/8/2013 11:31:57 AM , Rating: 2
He is referring to the new feature coming on NVIDIA graphics cards in Volta. Please look it up.

RE: Dear computer monitors
By mcnabney on 4/8/2013 12:07:53 PM , Rating: 2
Not quite. Three 1080p monitors is a bit over 6 megapixels. 4K will push that up to over 8. Close, but not exactly the same. Now three 2560x1600 displays - that is 12 megapixels!

RE: Dear computer monitors
By zephyrprime on 4/9/2013 12:02:53 PM , Rating: 2
It doesn't have anything to do with die shrinks. Traces are run on the circuit board. Costs for that tend to remain the same over time. Having more traces will require more layers on the PCB which increases costs. Really what they should do is mount the memory onto the GPU package (not die).

Most people have integrated video so they do not have enough power to handle 4K. With gpu clocks speed stuck around 1ghz, it will take quite a while before gpu's are powerful enough to run 4K.

RE: Dear computer monitors
By FITCamaro on 4/8/2013 10:56:10 AM , Rating: 3
In reality, very few gamers game on more than one monitor, even if they have 2-3 monitors. You don't design for only a small percentage of the marketplace.

I'm still running a 1680x1050 panel. Why? Because it works and lets me run all the details better than if I were at 1080p. But I also run an HDMI cable to my 50" plasma for when I want to game on the big screen in 1080p.

RE: Dear computer monitors
By BRB29 on 4/8/2013 11:09:17 AM , Rating: 2
Yes, I know very few gamers do, but the fact is that it's possible and has been done by at least a few thousand people. Does 1680x1050 give you more detail than 1920x1080? Or do you mean the zoomed-in effect it has because of the lower res? If that's the case, run a bigger monitor at the higher res.

This is a new standard and I welcome it. Is it overkill for our current mainstream hardware? Yes, a little. But the point is, I want to buy a TV/monitor built to a standard that will last me 5+ years. I can buy my other hardware to keep up with games, but I don't feel like buying a new TV every other year.

The iPad 4 can run Infinity Blade II beautifully at its high Retina resolution while sipping power. The hardware driving these displays has 20x the power envelope to work in; they will figure it out.

RE: Dear computer monitors
By FITCamaro on 4/8/2013 11:25:30 AM , Rating: 2
Fewer pixels = easier to run the game at the highest detail settings.

RE: Dear computer monitors
By BRB29 on 4/8/2013 11:36:40 AM , Rating: 3
Then you have to run AA to get rid of the jaggies. You take a hit in performance.

Do you not want to enjoy high details and no detectable jaggies? The fact that your GPU has to blur out diagonal lines to make it look better on a low res screen should tell you that screens are really the bottleneck in graphics.

I don't care how high your graphics settings are on a low-res screen; it will still look bad. The difference between 480p and 1080p was night and day, but 1080p is still not good enough for a 24" monitor, let alone a 50" TV. 4K and 8K are the near future.

Maybe at 8k res, it will stay around for 20+ years before we start getting into holographics.

RE: Dear computer monitors
By mcnabney on 4/8/2013 12:13:37 PM , Rating: 2
8K will never go anywhere besides IMAX-like screens. The viewing angle required to see that much detail almost completely fills your vision: over 100 degrees.

Do you sit in the front row at movie theaters? Of course you don't. People prefer a video image that is between 30-60 degrees. Going beyond 60 is uncomfortable and difficult for your eyes and head to move enough to see everything. Your eyes would also have to constantly change focus since the sides of the screen would be much farther away than the center.

RE: Dear computer monitors
By kmmatney on 4/8/2013 12:20:29 PM , Rating: 1
"I don't care how high your graphics settings are on a low res screen, it will still look bad."

Sorry - it doesn't look bad. You'd hardly tell the difference in most games, and it plays much smoother.

As someone who has 1920 x 1200, 1920 x 1080, and 1680 x 1050 gaming machines at the house, I agree that 1680 x 1050 is a very good resolution, and for a given video card it can "play" much better than a higher-res screen. It's really a nice sweet spot. The larger pixels do not look bad at all while playing games, and the aspect ratio of a 22" 1680 monitor makes it almost the same size as a 23" 1080p display.

My personal monitor is a 1920 x 1200 Soyo Topaz, but I often have to dial the settings down to 1680 x 1050 to get the frame rate I want. For a budget gamer, a 22" 1680 x 1050 monitor is a perfect match.

RE: Dear computer monitors
By inighthawki on 4/8/2013 5:44:34 PM , Rating: 2
I agree. For a long time I had a 1680x1050 monitor and it was perfectly fine. In fact, most of the time I never turned on AA because I rarely, if ever, noticed the jaggies while playing. Sure, if I stop what I'm doing and just stare at the screen trying to "appreciate the image quality," I'll notice. But when actively playing the game, it's typically unnoticeable in most scenarios.

I game now on a 1920x1200 display, which is great, but jaggies still don't bother me. I think AA was a serious problem back in the 1024x768-and-lower days, but these days anyone who complains that 1080p produces too many jaggies either has a huge monitor with poor DPI or is just being spoiled and nitpicking. I would bet money that 99% of people wouldn't even notice.

It's the same concept with audiophiles. There's a couple percent of people that will spend hundreds or thousands on a good pair of headphones or sound setup, but the $50 headphones at newegg on sale are good enough for almost everyone else.

RE: Dear computer monitors
By FITCamaro on 4/8/2013 12:25:40 PM , Rating: 2
Only if I care enough about the jaggies. Which I don't. Besides most modern cards can run 4x AA with hardly any performance penalty.

I have no problem playing games on my 22" monitor. And with my system, it runs just fine on my TV too.

I figure my 7850 should last another 2-3 years.

RE: Dear computer monitors
By MrBlastman on 4/8/2013 12:45:59 PM , Rating: 4
Do you not want to enjoy high details and no detectable jaggies? The fact that your GPU has to blur out diagonal lines to make it look better on a low res screen should tell you that screens are really the bottleneck in graphics.

I still play Atari 2600 games. Jaggies? What jaggies. :)

We have bigger things to worry about with games these days than graphics. Gameplay needs to take priority right now as there is a huge lack of it in most releases.

RE: Dear computer monitors
By TakinYourPoints on 4/9/2013 1:16:57 AM , Rating: 2
Then you have to run AA to get rid of the jaggies. You take a hit in performance.

Jaggies are the product of pixel density, not resolution.

RE: Dear computer monitors
By BRB29 on 4/9/2013 10:25:36 AM , Rating: 2
Pixel density is resolution divided by surface area.

On a low-res screen you will always get jaggies in 3D games unless you turn up AA. The only way you won't see them is to stand far enough away, and then it's like playing on your phone.

Either way, you come to the conclusion that resolution must increase for the bigger screens like monitors and TVs.

I agree with you that the difference between 1080p and 2320p is undetectable to my eyes on a 4" smartphone. But I can definitely see the jaggies even on my 1080p 23" and 27" monitors, even sitting 3-4 ft away.
AA and all its derivatives are pretty much a band-aid for one of the low-res issues.
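The density point is easy to make concrete. A quick sketch of pixels per inch for the same 1080p resolution at two panel sizes (the 5" phone size here is just illustrative):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# The same 1920x1080 resolution at two illustrative panel sizes:
phone = ppi(1920, 1080, 5.0)      # a hypothetical 5" 1080p phone, ~440 PPI
monitor = ppi(1920, 1080, 23.0)   # a 23" 1080p desktop monitor, ~96 PPI
print(f'5" phone: {phone:.0f} PPI, 23" monitor: {monitor:.0f} PPI')
```

Same resolution, roughly 4.6x the density on the phone, which is why jaggies vanish there but not on the desktop.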

RE: Dear computer monitors
By random2 on 4/9/2013 3:57:22 AM , Rating: 2
Maybe people will finally catch on and start sitting at the recommended distances for watching HD content. It always amazes me when I see people with an HD projector and a screen of, say, 80 inches, with the seating 25 feet away. Or with their chairs 15 feet from their 46-inch TVs.

RE: Dear computer monitors
By BRB29 on 4/9/2013 10:33:01 AM , Rating: 2
Your logic is flawed. People didn't buy bigger TVs, 3D TVs, and elaborate sound systems to sit further from the set; they bought them to feel more immersed in the media. By your logic, I should just use my laptop and put it closer to my face instead of buying an 80-inch TV.

I welcome 8k res so I can enjoy my 80" TV more. I prefer 7.1 surround sound. I look forward to a better home theater experience.

If you pay money for a bigger screen just to sit further away, it defeats the purpose of buying the bigger screen. I would rather just sit closer, since my eyes have an easier time focusing on things 6 feet away than 25 feet away.

RE: Dear computer monitors
By inighthawki on 4/8/2013 11:30:10 AM , Rating: 2
The Ipad4 can run Infinity Blade II beautifully at its retina high res and sips power. The hardware driving these have 20x the power envelope to work in, they will figure it out.

Not only does it not "sip" power, but a 4K display has over 2x as many pixels as a Retina display.

RE: Dear computer monitors
By BRB29 on 4/8/2013 11:38:53 AM , Rating: 1
You completely missed the point. I think you just see ipad and started hating.

RE: Dear computer monitors
By inighthawki on 4/8/2013 5:34:22 PM , Rating: 2
I'm not even sure how you got that from my statement. I was simply stating that although the ipad can run that at retina quality, that's still less than half the resolution of a 4k display. What does that have to do with hating on the ipad? I'm so confused how you jumped to that conclusion.

RE: Dear computer monitors
By mcnabney on 4/8/2013 12:17:09 PM , Rating: 2
Actual data

720p is about 1MP
1080p is about 2MP
iPad Retina is about 3MP
2560x1600 is about 4MP
4K is about 8MP
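Those round numbers check out. A quick sketch (the iPad Retina panel here is taken to be 2048x1536, and "4K" as UHD 3840x2160):

```python
# Width x height for each format listed above
resolutions = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "iPad Retina": (2048, 1536),   # 3rd/4th-gen iPad panel
    "2560x1600": (2560, 1600),
    "4K UHD": (3840, 2160),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MP")
```

This prints roughly 0.9, 2.1, 3.1, 4.1, and 8.3 MP, matching the figures quoted.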

RE: Dear computer monitors
By BRB29 on 4/8/2013 1:24:53 PM , Rating: 1
OMG, the point is: if a little chip in the iPad can push 3MP at under 8W, then a graphics card running at 70-250 watts can easily push a 4K screen without much problem. 70-250W is what a mid-to-high-end desktop discrete graphics card will use.

There are GPUs already capable of pushing the latest titles at 3x1080p or 2x1600p.

You clearly did not even read the previous posts.

RE: Dear computer monitors
By EnzoFX on 4/8/2013 3:03:51 PM , Rating: 2
Ugh. I think it's pretty clear that mass production of cheap components is what wins. This is why most people only have 1080p these days and probably a TN panel on their monitors. It's like we're in a lame middle point, or maybe even saturation, at least when it comes to gaming. As far as monitors go, if you're talking the same 30" or less sizes, what's the great benefit for games? I mean of course there's a difference, I'm just saying the real driving force for 4k will be big TV's, industrial applications for monitors, or people that just want more things to fit on their screen. Why does everything have to be measured around games? They are hardly the target market for everything.

I guess I think of games as not being a big deal at all. Why would you buy a 30" 2560x1600 res monitor, just to watch 1080p movies at 5' away? Similarly, a 4k monitor, just for games that will definitely not run natively at that res. You speak of the iPad Retina handling high res, and sure that will translate to 4K gaming, but at that res, you do lose detail and complexity. Either that or you design around a lower res, and upscale. I doubt any complex (not casual) game runs natively 1080p on either the 360 or PS3.

RE: Dear computer monitors
By EnzoFX on 4/8/2013 3:09:15 PM , Rating: 2
To put it another way: every time I upgraded monitors, to bigger sizes and higher resolutions, it was to gain a benefit in productivity/desktop use. It most definitely was not driven by wanting games to look better. The best way to make games look better is a heftier GPU. I see these being adopted by business/industrial applications quite easily, and thank goodness, that will drive the cost down for us. I do not see prices coming down because of gamers, lol. Games first have to be designed for 4K, and that won't come until quite a while after it has been widely adopted, as was the case with 1080p.

RE: Dear computer monitors
By Reclaimer77 on 4/8/2013 10:37:21 PM , Rating: 2
OMG, the point is if a little chip on the ipad can push 3MP at under 8W, then a graphics card running 70-250 watts can easily push a 4K screen without much problem.

Man you are just coming off really poorly here. There's so much wrong with this comparison, I don't even know where to start.

And no, only like the top 1% of PC's in the world could run 4k games at acceptable framerates.

You clearly did not even read the previous posts.

I did, they were wrong too. Embarrassingly so.

By TakinYourPoints on 4/9/2013 1:15:01 AM , Rating: 2
OMG, the point is if a little chip on the ipad can push 3MP at under 8W, then a graphics card running 70-250 watts can easily push a 4K screen without much problem. 70-250W is what a desktop mid-high end discrete graphics card will use.

There are a GPUs that are already capable of pushing the latest titles on 3X1080p or 2x1600p

The problem is the difference in display size. You can cut a LOT of corners when optimizing for a 10" screen and get away with it.

For example, 2K Games is porting X-Com Enemy Unknown to the iPad:

It looks to be completely faithful to the Windows version. However, they're going to cut a lot of corners to make a game that is about 15GB on Windows fit within 2GB on the iPad. Texture detail, mesh detail, audio compression, all sorts of things. In the video it looks great on the 10" or 8" iPad screen, but if you use AirPlay or HDMI to output to a TV it just won't look the same as the PC version.

The same applies to something like Real Racing HD. It looks amazing on the iPad: anti-aliasing, 60fps, higher resolution than most desktop monitors, etc. Display it on a 24" or 60" screen and the compromises made for mobile are clear.

Valve recently got DOTA 2 running on an iPad and they'll probably release it in a year (I think just to spectate games, not to play with). I GUARANTEE that if you blew it up to a desktop monitor or HDTV size, the differences in image quality would be huge.

Mobile devices can crank out amazing visuals at ridiculously high resolutions but they cut as many corners as possible in order to get there, and limited display size is how they do it.

You can't compare the requirements of a desktop GPU with a mobile GPU, no matter what resolution we're talking about.

RE: Dear computer monitors
By CZroe on 4/8/2013 3:04:05 PM , Rating: 3
I hope you're kidding because very, very, VERY few console games actually render at 1080p.

RE: Dear computer monitors
By random2 on 4/9/2013 4:08:07 AM , Rating: 2
It's why us PC gamers are such smug bastards. :-)

By TakinYourPoints on 4/9/2013 5:41:20 AM , Rating: 2
Off the top of my head there is Critter Crunch and the PixelJunk games on the PS3 that render at 1080p. Very simple. I can't think of any others, otherwise you're looking at lots of low res games that are upscaled.

Even the 360 UI doesn't display in 1080p.

RE: Dear computer monitors
By zephyrprime on 4/9/2013 12:00:24 PM , Rating: 2
360 and ps3 do not actually render at 1080. They usually render at lower resolutions and then upscale to 1080 because they lack the power for real 1080.

RE: Dear computer monitors
By euler007 on 4/8/2013 11:15:25 AM , Rating: 2
If 4k was standard for desktop monitors, every one would need a dual GPU set up.

BS. A single GTX 680 can push 56 fps with 4X AA in BF3 at 5760X1080.

RE: Dear computer monitors
By euler007 on 4/8/2013 11:18:34 AM , Rating: 2
Misread the chart, make that 60fps with AA off.

RE: Dear computer monitors
By B3an on 4/8/2013 11:55:01 AM , Rating: 2
You don't even need to run games at 4K anyway; you could just run something like 2560x1600 on a 4K monitor. Because the PPI is so high, you wouldn't get any LCD scaling issues, as the pixels would be too small to see from even a couple of feet away.

RE: Dear computer monitors
By B3an on 4/8/2013 11:45:49 AM , Rating: 2
Total ****ing BS.

Any remotely modern GPU is easily capable of driving 4K, and both DisplayPort 1.2 and HDMI 1.4 can handle the connection bandwidth. Maybe gaming would be an issue, but newsflash: no one plays games 100% of the time, or even at all.

And another super obvious thing is that 4k will just provide better image quality for everything. I can see the pixels on my 2560x1600 monitor from quite a distance but 4k would be perfect.
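On the connection-bandwidth point, a rough back-of-the-envelope check (raw 24-bit pixel data only, ignoring blanking intervals and 8b/10b encoding overhead, so real link requirements are somewhat higher):

```python
def raw_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in Gbit/s, ignoring blanking
    and link-encoding overhead (real link budgets must be higher)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"4K UHD @ 30 Hz: {raw_gbps(3840, 2160, 30):.1f} Gbps")
print(f"4K UHD @ 60 Hz: {raw_gbps(3840, 2160, 60):.1f} Gbps")
```

That is roughly 6 Gbps at 30 Hz and 12 Gbps at 60 Hz, which is why HDMI 1.4 in practice tops out around 4K at 30 Hz while DisplayPort 1.2, with about 17 Gbps of effective bandwidth, can carry 4K at 60 Hz.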

RE: Dear computer monitors
By B3an on 4/8/2013 11:49:51 AM , Rating: 2
I'd also like to add that I can run two 2560x1600 monitors plus another 1080p TV off of ONE graphics card from 2011. Combined, that's more pixels than 4K.

But it's a fact that pretty much any card with DP 1.2 or HDMI 1.4 will run a 4k monitor.

RE: Dear computer monitors
By random2 on 4/9/2013 3:45:03 AM , Rating: 2
The LCD panel industry has been in a race to the bottom for almost a decade, undercutting one another and trying to produce the cheapest passable desktop solution for a public that rarely, if ever, buys on quality of colour reproduction, clarity, or viewing angles when it comes to monitors.

Consumers need to be educated to appreciate quality panels. Producing and distributing 4K panels at very reasonable prices is part of that education. Those who appreciate the best they can get will buy them. There is a reason we are not all sitting in front of 10" monochrome monitors. This is just the next step.

RE: Dear computer monitors
By zephyrprime on 4/9/2013 12:07:59 PM , Rating: 2
And durability. They use cheap ass capacitors in those things that always die quickly.

RE: Dear computer monitors
By MrBlastman on 4/8/2013 11:56:16 AM , Rating: 2
I can wait. Framerate > Resolution in gaming.

More pixels = more upgrades more often.

RE: Dear computer monitors
By mcnabney on 4/8/2013 12:21:29 PM , Rating: 2
Actually, if you drop the AA (because 4K has over 8 million pixels) even moderate GPUs should be able to handle it. I suspect that the AA that is part of the Ultra settings provides a massive hit to performance, but might not even be visibly beneficial due to the tiny pixel size. AA is great on displays that have easily visible pixels, but unnecessary when there are no pixels to appear 'jaggy'.

RE: Dear computer monitors
By MrBlastman on 4/8/2013 12:43:00 PM , Rating: 2
Well I'm not worried about handling it now, I'm worried about being able to handle it a year or two later after upgrading. The bad economy means things have to last longer than ever before.

I'll admit that it'd be a nice screen. :) Stuff would look sharp on it. Especially sims.

RE: Dear computer monitors
By inighthawki on 4/8/2013 3:55:03 PM , Rating: 2
The hit from AA will be nothing compared to the extra pixels. AA works by executing the pixel shader multiple times over edge pixels, and the number of extra samples on edge pixels will be nowhere close to 4x the resolution in shader cost. I would be far more interested in using a 4K monitor for better desktop productivity while doing a perfect 2x linear scale of 1080p in both directions, meaning no blurring effect from scaling the backbuffer to the monitor resolution.

RE: Dear computer monitors
By Argon18 on 4/8/2013 12:08:22 PM , Rating: 2
What's the point? the further you sit from the display, the fewer pixels you can discern. Sitting on my living room couch, which is ~10 feet from the TV, I cannot discern individual 1080p pixels and I've got perfect 20/20 vision.

Moving to even greater pixel density on a television serves no purpose, other than to drive up the cost. There is no benefit to the consumer, except possibly on very very large tv's (80+ inches).

High pixel density works on a phone or tablet, because it's 18" from your face. When the screen is 10 feet away, it's a different matter altogether.

RE: Dear computer monitors
By FITCamaro on 4/8/2013 12:28:42 PM , Rating: 2
I'll just let the anal people buy what they want. I'm perfectly fine with 1080p and will be for years. Next time I upgrade my TV will be when this one dies or 3D TVs that don't require glasses come out.

RE: Dear computer monitors
By Nortel on 4/9/2013 11:17:25 AM , Rating: 2
News flash: with 4K you can sit closer to your TV.

By inperfectdarkness on 4/8/2013 9:31:30 PM , Rating: 1
HELL F**KING YES. How many years do we have to bitch about it before manufacturers wake up and pull their heads out of their collective a**es?

They are selling these at useless sizes
By mcnabney on 4/8/2013 9:57:25 AM , Rating: 1
A 4K HDTV at 55 or even 65 inches is useless, since a typical installation won't let you see any sharper a picture in normal use than 1080p already provides.

Why? Because the ideal viewing distance for that 55" display is about 4' and for the 65" display about 5'. How many typical living room arrangements put the sofa that close to the TV? None.
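Those distance figures follow from the common one-arc-minute rule of thumb (disputed further down the thread): the distance at which one pixel subtends one arc minute of visual angle. A sketch of the math, assuming 16:9 panels; it lands close to, though slightly under, the figures quoted:

```python
import math

def ideal_distance_ft(diagonal_in, horizontal_px, aspect=(16, 9)):
    """Distance at which one pixel subtends one arc minute of visual angle."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)   # panel width from diagonal
    pixel_pitch_in = width_in / horizontal_px          # size of one pixel
    return pixel_pitch_in / math.tan(math.radians(1 / 60)) / 12

print(f'55" 4K: {ideal_distance_ft(55, 3840):.1f} ft')
print(f'65" 4K: {ideal_distance_ft(65, 3840):.1f} ft')
```

This gives roughly 3.6 ft for the 55" set and 4.2 ft for the 65" set.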

4K should be either much larger (75"+) for the living room or much smaller (30-40") for the desktop to be used by a computer.

RE: They are selling these at useless sizes
By quiksilvr on 4/8/2013 10:00:18 AM , Rating: 2
Agreed. It's just like having a 1080p 32" TV when you're sitting over six feet away. You can't tell 720p from 1080p at that distance and are just wasting money and electricity on pixels that don't benefit you.

By FITCamaro on 4/8/2013 10:02:05 AM , Rating: 2
Heh. Beat me.

RE: They are selling these at useless sizes
By lagomorpha on 4/8/2013 10:05:13 AM , Rating: 2
Do more smaller pixels actually draw more electricity than fewer larger pixels?

By mcnabney on 4/8/2013 10:14:22 AM , Rating: 2
Having to turn more pixels on/off will slightly increase power. Also, having a tighter grid will require a brighter backlight (LED) to provide the same number of nits since less light gets through. It will be an increase, but not that big.

HOWEVER, playing games at a higher resolution will require a MUCH more powerful (and power-hungry) GPU.

RE: They are selling these at useless sizes
By FITCamaro on 4/8/2013 10:01:33 AM , Rating: 2
A 32" 1080p TV is also largely unnecessary at standard viewing distances; 720p is fine. That didn't stop 720p TVs from being largely phased out. It's all about people being able to say "I have a 4K TV." Never mind that they don't know what that really means, or that it will likely make standard-definition content look even worse than it does on 1080p TVs.

By mcnabney on 4/8/2013 10:20:05 AM , Rating: 1
However, that 32" screen at 4K would look fantastic attached to my workstation! If they can sell it for $1,000, they will OWN the high-end display industry (assuming IPS or equivalent technology is used).

What is sad is that IBM released a 4K monitor over a decade ago. So much for progress.

By marvdmartian on 4/9/2013 7:58:27 AM , Rating: 1
THIS. It's more about bragging rights than anything else, with most people.

I've got a 1080p set, and will likely stick with that until one of two things happens:
1. it DIES
2. they come out with holographic, life-sized, 3D, HD projection. Think of R2-D2 doing the Princess Leia projection, only full-sized and in HD quality. THAT will be worth upgrading to!!

RE: They are selling these at useless sizes
By bug77 on 4/8/2013 10:38:52 AM , Rating: 2
Not an issue when there's no content available. By the time there is, this TV will be obsolete anyway.

RE: They are selling these at useless sizes
By mcnabney on 4/8/2013 10:47:44 AM , Rating: 2
Why would it be obsolete? Unless 4K 3D, a very high frame rate, or a new connector becomes dominant, I don't see why it wouldn't be just as useful in 10 or even 20 years. 4K is a resolution plateau: it allows a 60-degree viewing angle, which is huge and truthfully just too big for many people. Home use of 8K is just never going to happen unless people start turning their entire wall into a display.

RE: They are selling these at useless sizes
By bug77 on 4/8/2013 11:48:22 AM , Rating: 2
Because by then, TVs will offer better contrast, more uniform backlighting, lower power usage, better signal processing. You name it, there is so much to improve on current LCDs...

RE: They are selling these at useless sizes
By mcnabney on 4/8/2013 12:28:23 PM , Rating: 2
Regular consumers don't drop thousands of dollars to replace a display in order to save a few bucks a year on power or to increase contrast a tiny bit. They do it to buy important features. For example - OLED (when it finally arrives) will provide great color and black levels, but people won't buy it for that reason alone. Most sales will be people with lots of disposable money that want a TV that is half an inch thick that they can hang on the wall. So really, the next killer TV function after 4K is going to be super slim and not anything directly related to the image. Look at how horribly 3D has done in the home setting.

By inighthawki on 4/9/2013 11:18:24 AM , Rating: 2
I've wanted an oled tv for years. I couldn't care less about 4k res, but I'd shell out thousands for an oled tv.

By Shadowself on 4/8/2013 11:00:00 AM , Rating: 4
I hate how so many people have bought into the "you can't see any difference in anything better than one arc minute" crap.

This concept assumes the only thing that goes into resolution is the angular resolution of the individual rods and cones of the human eye, along with the standard "20/20" vision benchmark. It takes into account NONE of the other things that go into true vision and what people can perceive in their sight.

There have been countless studies over the years showing that many, many things go into visual acuity, ranging from edge effects (you can see a long straight line that is much, much narrower than one line of rods/cones could possibly resolve under the "one arc minute rule"), to vernier effects (you can see the offset between two lines that are offset by far less than the "one arc minute rule" allows), to angular effects (you can see differences in the rotation of regular objects to a much finer degree than the "one arc minute rule" would allow). The list of additional factors that go into the full range of perceived visual acuity is quite long.

Some studies have shown that reproducible, perceived resolution is more than 10 times better than that dumb one arc minute rule.

So if you want to pay for a display that you cannot distinguish from a form of continuous media (e.g., a painting) then even an 8K display is likely not high enough resolution.

The only real question is, "What resolution is worth it to you?" If you do not care about resolutions above 720p (and don't think you, personally, can see anything better than 720p) then don't bother buying 1080p displays or anything higher. If you're like a friend of mine with 20/10 vision and who can actually see the flicker in a standard TV set, a higher resolution and higher refresh rate is definitely worth it.

However, just don't be so naive as to run around telling the world that anything better than one arc minute cannot be seen. It's just not fact.

Also, just as an aside, I hope people really stop misusing the "4K" nomenclature. The term 4K is a Digital Cinema standard. It is 4096x2160. This is quite different from Ultra HD which is 3840x2160.

RE: They are selling these at useless sizes
By euler007 on 4/8/2013 11:04:18 AM , Rating: 2
And then Apple sells a 2048x1536 display that people hold 18 inches from their face, and people sing the praises of the high-DPI display.

I for one would love a 4K 24-inch monitor; I definitely CAN see the pixels on my 1920x1200 monitor.

By TakinYourPoints on 4/9/2013 1:24:05 AM , Rating: 2
I have a 27" 2560x1440 monitor sitting next to a 24" 1920x1200. The 27" has only about 15% higher pixel density than the 24", but the difference is massive: things are much sharper and cleaner.

The 15" Retina MacBook Pro is a look at the future. Fonts are ridiculously good on that thing, totally accurate, no need for anti-aliasing or smoothing; they're just how they should look. Same with images, UI, etc. Crisp and clean.

Bring on the ultra-high pixel density desktop monitors, I want it! :)

By lagomorpha on 4/8/2013 10:03:04 AM , Rating: 2
So how many nVidia Titans do I need to run games comfortably at this resolution?

RE: Hmm
By mcnabney on 4/8/2013 10:10:44 AM , Rating: 2
Current hardware already supports 4K. If you're curious about performance, look at the cards driving multi-display games: setting up four 1080p displays is just making your own 4K, so you could run benchmarks right now. I imagine current $400 hardware could handle pretty much any game, with perhaps a few requiring the detail to be dialed back a bit. 2560x1600 (4 megapixels) has been standard for a while now, and 4K is just 8 megapixels.

Now the next generation of consoles WILL NOT do games at 4K. PS4 will play movies at 4K, but not games. Advantage : PC

RE: Hmm
By lagomorpha on 4/8/2013 10:16:06 AM , Rating: 2
4k = 4,096 × 2,304 = 9437184 pixels
1080p x4 = 1920 x 1080 x 4 = 8294400 pixels

It looks like you'll need at least a pair of 7970s or 680s...

RE: Hmm
By deathwombat on 4/8/2013 10:28:42 AM , Rating: 2
Um, nope. 4K UHD is defined as 3840x2160, so it's exactly 4x the pixels of 1080p. 8K UHD is defined as 7680x4320, or exactly 16x the pixels of 1080p.

RE: Hmm
By mcnabney on 4/8/2013 10:39:07 AM , Rating: 2
Yep, the standard is going to be QuadHD and not true 4K (which is even more letterboxy).

I can live with that. A direct scaling of existing 1080p content is a huge plus. Only the true movie-tech nerds would put that extra width to good use. Technically, a 4K display would have black bars on the sides when viewing ALL Blu-Ray. That isn't a good thing.
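The "direct scaling" advantage mentioned above comes from UHD being exactly double 1080p in each dimension: every source pixel maps to a clean 2x2 block, with no interpolation needed. A minimal sketch using NumPy (the function name and test frame are mine):

```python
import numpy as np

def upscale_2x(frame):
    """Integer 2x upscale: each source pixel becomes a 2x2 block.
    Because UHD is exactly double 1080p in each dimension, this
    mapping needs no interpolation or filtering."""
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

frame_1080p = np.zeros((1080, 1920), dtype=np.uint8)
print(upscale_2x(frame_1080p).shape)  # (2160, 3840)
```

A real TV scaler would typically filter rather than block-replicate, but the point stands: the integer ratio is what makes 1080p content map cleanly onto a UHD panel.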

RE: Hmm
By mcnabney on 4/8/2013 10:35:22 AM , Rating: 2
What are you talking about?

A single 7970 did just fine.

RE: Hmm
By lagomorpha on 4/8/2013 10:54:37 AM , Rating: 2
A single 7970 did just fine.

22.95 fps average dropping down to 14 fps minimum?

This must be some strange usage of the word 'fine' I was not previously aware of.

RE: Hmm
By euler007 on 4/8/2013 11:16:15 AM , Rating: 2
RE: Hmm
By FITCamaro on 4/8/2013 11:31:40 AM , Rating: 2
For current games built around technology that has to largely run on current-gen consoles, sure. Next-generation games that are designed to run on the next-generation consoles? We'll see. Yes, those new consoles are still only designed around 1080p, but they're a pretty big leap forward from current consoles. Developers have achieved pretty amazing things with the current consoles; they'll do the same with the next ones, which means games that push PCs harder too. With the next Xbox and the PS4 both having 8GB of RAM, that means probably at least 4GB of RAM being used in games, if not 6-7GB. A heck of a lot better than the < 512MB currently.

Sony's Design
By laviathan05 on 4/8/2013 10:06:49 AM , Rating: 2
What was the point of including the extra wide bezel on the side to incorporate speakers into the set?

Does Sony think that someone spending $5,000 on a TV is not going to have an external speaker set-up?

RE: Sony's Design
By lagomorpha on 4/8/2013 10:26:48 AM , Rating: 2
Probably adds so little to the cost that it doesn't hurt, and some stores will want to use it as a demonstration without adding speakers to it. Also there will probably be companies that use them for things like presentations where sound quality doesn't matter.

RE: Sony's Design
By PrinceGaz on 4/8/2013 10:33:28 AM , Rating: 2
Perhaps because you need to sit relatively close to a 4K display in relation to its physical size to actually benefit from the increased resolution; just to the side of the screen is probably an optimal position for the front L/R speakers.

RE: Sony's Design
By BRB29 on 4/8/2013 10:34:00 AM , Rating: 2
Their target market is tech-trendy people who have high incomes and most likely live in the city. Where I live, 5k is most often rent for a luxury apartment. At the same time, you don't need a whole speaker system for a small space. Most people use their TV speakers or a sound bar.

Some do have speaker systems but maybe use it once every other month. With most apartment complexes enforcing quiet hours, you really only get to use your speaker system on Saturday and Sunday during the day.

So yes, if the TV's speakers are any good, it will be a good thing. With limited free time, limited space, and barely being home, most people I know like all-in-one gadgets like these.

Decent launch price for 4K
By corduroygt on 4/8/2013 10:34:21 AM , Rating: 2
In 3 years it'll be competitively priced with high-end 55" LCD's, so around 1-1.5k.

RE: Decent launch price for 4K
By ShieTar on 4/9/2013 10:06:07 AM , Rating: 2
It's not really the launch price; the 4K Toshiba has been around for over a year now at a bit over €7,000. It also lacks a DisplayPort input, but does come with VGA for whatever reason. LG is also selling an 84" 4K screen for a very affordable €18,000. No DisplayPort either, so it remains completely unusable as a PC monitor.

I wouldn't bet on prices dropping below 2k too fast though. 30" PC monitors basically launched at 1.5k, and never really made it below 1k. It all depends on factory upgrade cycles more than any kind of R&D.

It's About Time!
By CaedenV on 4/8/2013 10:03:01 AM , Rating: 2
Sony has had such a long string of OK products that are way overpriced. It's about time that they come out with something good which has a much more appropriate pricing on it! They really need a win. While it looks like the PS4 is going to be good, they cannot float a whole company on a single product.

Personally I am really excited about this new 4K revolution. I never ended up purchasing an HDTV because my computer monitors could already do it, and (while lacking the contrast and massive screen size) they were 'good enough' for the point in life I was at. But now I look at my phone and am blown away at the crispness of text and images afforded by the high pixel density, and then I look at my monitors, which struggle to make even medium-sized text look particularly clear.
I just cannot wait for the resolution revolution to hit the desktop, and I would also love to have a screen that can make movies pop, so I will definitely be in the market for a 4K display of some sort. I just have a few other required purchases to make (fridge, water heater, roof... all the joys of home ownership), but by the time all that is done I should be able to afford a nice 4K TV for what I hope is $3K or less in about 3-4 years. Seeing the price go from $20,000 down to $5-8,000 in 6 months is very encouraging!

how much does 4k matter?
By Spoogie on 4/8/2013 3:25:05 PM , Rating: 2
What tv shows and movies are available in 4k? And how many can be upscaled to 4k either over the air, by cable, or by dvd?

By haukionkannel on 4/8/2013 3:51:43 PM , Rating: 2
How many have seen 4K in real use? Those who say that it is not possible to see pixels in 1080p are somewhat right, but I have to say that a 4K picture looks definitely much sharper than a 1080p picture! Human eyes can fix "problems" in the picture, so 1080p quality is not bad, but there definitely is a big difference in the quality of 4K screens!
Is the difference worth the price difference? Most probably not at this moment, but when these 4K screens become cheaper, I will definitely move to them. $5,000 is so much cheaper than those $20,000 models we got last year that some people can move even now.
If you cannot get a look at a real 4K monitor or projector, try comparing a normal and a Retina iDevice side by side; the difference between them is quite close to the difference between 1080p and 4K. A normal iDevice is OK, but the Retina version has a much sharper picture!

Surprising prices
By Phoque on 4/8/2013 6:32:43 PM , Rating: 2
Given the (I presume) low sales volume, and the fact that this is the XBR product line, I would have thought the TVs would be more expensive.

By random2 on 4/9/2013 3:59:05 AM , Rating: 2
Why oh why would they include built in speakers? Anyone capable of coughing up 6K for a TV will have the sound thing covered quite nicely I expect.

Blu-Ray upscaling
By cmart on 4/9/2013 10:08:08 AM , Rating: 2
I just wonder how Blu-Ray movies look upscaled to 4K. Or if that capability is even included.

By mutatio on 4/9/2013 12:15:33 PM , Rating: 2
I have to say I'm pretty impressed with the entry-level prices for these. My wife and I have loved our 1080p Sony NX810 ever since we got it. I'm not so sure about the speakers on the front plate, but I am otherwise stoked for these sets to start rolling out! If Apple can launch its TV this year and generate enough demand for 4K panels, we could see the prices dip even further. I do like Sony's TVs, but if Apple enters the market they're going to have so much leverage based on volume that smaller manufacturers will have a hard time competing just at the level of component pricing. I think it's reasonable to say that an Apple TV would have a pretty polished design, a la Jony Ive. I've said for some time now that if you give me 4K on a big-screen TV I'll start working in the living room rather than at my desk. A nice zero-gravity recliner with a keyboard and pointing device in front of a wall-mounted panel angled down toward me? -Ahhhh. No more back pain while working. :-)

4k tvs
By konnor on 4/21/2013 9:51:25 AM , Rating: 2
4k TVs are already dropping fast in price, down to $1,300 now. It's only a matter of time before all the big brands do the same:


Copyright 2016 DailyTech LLC.