



  (Source: goplok.com)
Sony announces that 3D games on the PS3 will not run at 1080p; 3D output will be capped at 720p.

According to Joystiq, PlayStation 3 games in 3D will have their HD resolution capped. While demonstrating the newest version of the system at the Develop Conference, Sony representative Simon Benson announced that games that run at 1080p resolution will be downscaled in 3D mode -- per eye -- to 720p.

Speaking to Joystiq, Benson stated that although the PS3 has the capability of displaying a 1080p image, a resolution higher than 720p has been restricted because Sony contends that a higher frame rate would impact the quality of viewing.

Blu-ray movies will retain the 1080p resolution.  Blu-rays run at 24 frames per second, but games run at 60 frames per second -- upping the resolution for games would compromise the smoothness of the frames.  While a "more cinematic game" could be equipped to handle the 1080p resolution at the cost of frames, Sony's current guidelines won't allow users to change settings, Benson said.

A true 1080p image consists of about 2 million individual pixels (1920 x 1080 = 2,073,600), more than twice the 921,600 shown in a 720p image. Benson added that even trained computer graphics artists could barely tell the difference between the resolutions.
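The pixel counts behind that comparison can be checked directly; a quick Python sketch (the helper function is illustrative, not anything Sony published):

```python
# Pixel-count comparison behind the "more than twice" claim.
# Illustrative sketch only -- not from Sony or Joystiq.
def pixels(width, height):
    """Total pixels in one frame at the given resolution."""
    return width * height

p1080 = pixels(1920, 1080)  # 2,073,600 pixels
p720 = pixels(1280, 720)    # 921,600 pixels
print(p1080 / p720)         # 2.25: 1080p has 2.25x the pixels of 720p
```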

On the Newbies Inc. website, Benson indicated that online gamers with a 3D TV may have a competitive advantage over those playing on HD sets.

"It all depends on the gamers to be honest. Initially we were slightly concerned about this because we were thinking, what if it makes it twice as easy or something like that."

He also stated that 3D can have the effect of making games more accessible for inexperienced players.

"I think what’s basically going to happen is that anyone who has stereoscopic 3D televisions and, for example, is playing a driving game, I would imagine you’re likely to find that the accessibility level is higher, that people would generally perform better on their first go. But I think at the high end with the hardcore gamers you’ll still see a [3D] advantage there, potentially, but the margins will be far smaller."




there is a diff?
By ixelion on 7/19/2010 2:49:20 PM , Rating: 2
I've never compared 720p to 1080p but if its anything like 1280x720 versus 1920x1080 LCD resolutions, then there is a HUGE difference.




RE: there is a diff?
By makius on 7/19/2010 3:07:21 PM , Rating: 2
quote:
I've never compared 720p to 1080p but if its anything like 1280x720 versus 1920x1080 LCD resolutions, then there is a HUGE difference.


That's exactly what 720p is. And yes there is a huge difference. I'm not sure what Benson was smoking when he said you can barely tell the difference between 720p and 1080p.

However, I can understand the reason they would limit the resolution for 3D gaming on these current gen consoles. The PS3 and 360 just don't have the hardware to properly render stereoscopic 3D gaming environments at high resolutions in real time. Although I still think 3D is just a giant gimmick and won't be wasting my time or money on it.


RE: there is a diff?
By mckirkus on 7/19/2010 3:13:06 PM , Rating: 3
It's easy to see when reading text but a bit harder to see when watching video, especially super compressed HD tv broadcasts. It's easy to spot the difference with a big tv and a blu-ray player.


RE: there is a diff?
By Mitch101 on 7/20/2010 8:42:10 AM , Rating: 3
The viewing distance and screen size play a huge role in this as well. If you have a 42" screen and sit back 15', I bet it would be difficult to tell the difference. That's probably the most likely reason people are still satisfied with DVD.


RE: there is a diff?
By tastyratz on 7/20/2010 2:23:09 PM , Rating: 1
More than a bet: it's beyond the perceived resolution of the human eye at that distance and screen size with 20/20 vision. Any differences you are seeing are in the SCALER and compression. Modern video delivery (read: cable/sat) compresses the signal so far beyond Blu-ray that you will see a huge difference there (think 4gb movie vs 40gb movie).
What people are usually seeing is that since modern flat panels are fixed-pixel displays, a 1080p image looks best on a 1080p tv, 720p on a 720p tv, etc.

Your average videophile couldn't pick out a 1080p vs 720p signal on a GOOD set from a normal viewing distance under a 50 inch display.

Hell, most people can't spot the difference between a 300dpi and a 150dpi 4x6 photo print at normal viewing distances.

Isn't the marketing placebo effect grand?
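The acuity arithmetic behind this claim can be sketched out (Python; the one-arcminute figure for 20/20 vision and the helper function are my own illustration, not from the post):

```python
import math

# Rough check of the acuity claim: 20/20 vision resolves about
# 1 arcminute; past the distance where a single pixel subtends
# less than that, extra pixels can't be seen.
ARCMIN_RAD = math.radians(1 / 60)

def max_useful_distance_ft(diag_in, rows):
    """Distance beyond which individual pixels can't be resolved."""
    height_in = diag_in * 9 / math.hypot(16, 9)  # 16:9 panel height
    pixel_pitch = height_in / rows               # inches per pixel
    return pixel_pitch / ARCMIN_RAD / 12         # inches -> feet

print(round(max_useful_distance_ft(42, 1080), 1))  # ~5.5 ft
print(round(max_useful_distance_ft(42, 720), 1))   # ~8.2 ft
```

On these numbers, at 15' from a 42" set both 720p and 1080p pixel grids are already past the eye's resolving limit, which is the poster's point.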


RE: there is a diff?
By tastyratz on 7/21/2010 8:59:11 AM , Rating: 2
aww a down vote.
But hey don't take my word on it, try cnet:
http://reviews.cnet.com/720p-vs-1080p-hdtv/
quote:
Whether you're dealing with 1080p/24 or standard 1080p/60, doesn't alter our overall views about 1080p TVs. We still believe that when you're dealing with TVs 50 inches and smaller, the added resolution has only a very minor impact on picture quality. In our tests, we put 720p (or 768p) sets next to 1080p sets, then feed them both the same source material, whether it's 1080i or 1080p, from the highest-quality Blu-ray player. We typically watch both sets for a while, with eyes darting back and forth between the two, looking for differences in the most-detailed sections, such as hair, textures of fabric, and grassy plains. Bottom line: It's almost always very difficult to see any difference--especially from farther than 8 feet away on a 50-inch TV.


or pc magazine:
http://blogs.pcmag.com/miller/2010/02/displays_doe...
quote:
Of the 64 participants, 59 percent said they preferred the 1080p set.

I would hardly call 59% an overwhelming majority. With only 2 sets to pick from, that's almost break-even. It goes to show, however, that the difference is not as dramatic as people make it out to be.

Note both of these tests involved a 1080p Blu-ray source signal, which means the 720p sets were scaling to begin with (which is usually where the perceived quality loss comes from on a lower-end tv)


RE: there is a diff?
By Alexvrb on 7/19/2010 8:47:05 PM , Rating: 2
Hmm, what about games that run in 2D mode in 720p (or less...)? Are they going to run in stereoscopic 480p? :P


RE: there is a diff?
By afkrotch on 7/19/2010 9:00:47 PM , Rating: 2
If it's a 720p game, upscaled to 1080p, then what is the difference? It's not like the console is going to let me increase my POV to make use of the higher resolution.


RE: there is a diff?
By someguy123 on 7/19/2010 9:18:32 PM , Rating: 2
Increased resolution isn't necessarily linked to increased viewable area in-game.

The benefit of higher resolution is clearer picture and reduced aliasing. On a smaller screen 720p runs fine, but on larger screens it's quite a bit more jagged (and/or blurry if you upscale) than 1080p.


RE: there is a diff?
By driver01z on 7/20/2010 1:06:10 PM , Rating: 2
For me the difference depends - typically yes, it is noticeable, because typically I'm playing a game and you notice more aliasing in 720p. If the game has good anti-aliasing though, then it's possible for me to not notice a difference.


RE: there is a diff?
By Marlonsm on 7/19/2010 3:12:42 PM , Rating: 2
quote:
if its anything like 1280x720 versus 1920x1080 LCD resolutions


Bingo!!!

That's the difference, maybe it's even worse because your shiny new 1080p 3D TV won't be running at its native resolution.


RE: there is a diff?
By quiksilvr on 7/19/2010 3:25:47 PM , Rating: 2
Or your shiny new 1080p 3DTV will upscale the already 60 frames per second 3D 720p signal for its 1080p screen resolution?

And this doesn't affect Blu ray movies. Since they are running at a much less hectic 24 frames per second, it will be 1080p.


RE: there is a diff?
By someguy123 on 7/19/2010 9:20:28 PM , Rating: 2
Upscaling is always worse than having the actual resolution.

This doesn't affect movies because movies aren't rendering 3D scenes in real-time.


RE: there is a diff?
By Xaussie on 7/19/10, Rating: -1
RE: there is a diff?
By hemmy on 7/19/2010 5:30:52 PM , Rating: 4
Blu Ray looks much much much much much much better than DVDs. I cringe whenever I have to watch movies in standard def now, it kills me.


RE: there is a diff?
By SPOOFE on 7/19/10, Rating: 0
RE: there is a diff?
By daniyarm on 7/19/2010 5:31:30 PM , Rating: 3
No offense, but I am a bit surprised that a "monitor/resolution junkie" thinks blu-ray is a waste of time. Blu-ray is not only movies, it's video also, try watching Planet Earth on BR vs DVD. And BR is also lossless audio. These new audio tracks would be enough to fill a whole DVD, so I wouldn't call BR a waste of anything.

Video lenses are close to an order of magnitude more expensive than professional SLR lenses that already outresolve even the best sensors on the market. The problem with video is that unlike photography it is usually used in much darker environments and lack of contrast and grain start to overtake resolution.

And image quality has nothing to do with distance (that's just our eyes not being sharp enough), but monitor quality. On my 24" NEC monitor that costs more than your average 32" HDTV the difference is night and day between DVD and BR. If they were able to produce large displays of the same quality (like in medical fields), the difference would be even more obvious.


RE: there is a diff?
By integr8d on 7/20/2010 12:37:13 AM , Rating: 2
"The problem with video is that unlike photography it is usually used in much darker environments and lack of contrast and grain start to overtake resolution."

Really? It's usually used in much darker environments?


RE: there is a diff?
By Xaussie on 7/20/2010 1:55:48 PM , Rating: 2
Most Blu-Ray movies I've seen are 48KHz/16 bit and IMHO they sound pretty ordinary compared to DVD which is 96KHz/24 bit (albeit with compression). Overall I feel that DVDs sound better than Blu-Ray discs and I was really disappointed about that because I was expecting great things from Blu-Ray (like the magic of listening to Brothers In Arms in 5.1 on SACD for example).

As for lenses I shoot with $2000+ Nikkors so I know what lenses can resolve and in most live action movies I'm not seeing anything close to that level of detail. And distance does come into it because the further away from the screen we sit the less detail we can resolve (Europeans anyway, Aboriginal Australians probably have good enough eyesight to justify 1080p at 20ft viewing distance).

A monitor is a different story because we sit right in front of it. I use a $2500 NEC 3090 Spectraview which is 2560 x 1600 and is internally calibrated. It doesn't get much better than that but it's complete overkill for watching video of any kind.


RE: there is a diff?
By JazzMang on 7/19/2010 5:56:55 PM , Rating: 2
And on what planet do Blu-rays look just like their DVD counterparts? That's a joke, right?


RE: there is a diff?
By acer905 on 7/19/2010 6:27:57 PM , Rating: 2
Perhaps a planet where the person owns a 32" flat CRT instead of an LCD or Plasma? When the native input is 480, does it really matter if you up the video to 720 or 1080?


RE: there is a diff?
By SPOOFE on 7/19/10, Rating: 0
RE: there is a diff?
By AmbroseAthan on 7/19/2010 6:54:38 PM , Rating: 2
While the things you mention do greatly matter, most productions (movie and TV) people wish to see in 1080p are normally recorded either on film or at digital resolutions higher than 1080p; 2K, 3K, 4K, 5K are all common now.

There is a very large shift occurring in theatres to 4K projectors over the next couple of years.

I bring this up to point out that these productions normally solved all of the issues you listed well before HD TV existed, and are now recording at ~4x 1080p levels (4K). While those factors matter more than resolution, they are also non-issues for most productions.


RE: there is a diff?
By someguy123 on 7/19/2010 9:22:16 PM , Rating: 2
And all of those things retain more detail with HD video.


RE: there is a diff?
By integr8d on 7/20/2010 12:44:31 AM , Rating: 2
He didn't say 'just like'... Obviously, it's not 'just like'. He made a good point about motion resolution (which a lot of people failed to pick up on). He also made a good point about color gamut (though I'd assume he meant to say color bit depth). All consumer U.S. video is output in the REC 709 color space.

After sitting in a DI suite with Joe Kane and watching upscaled DVD on a PS3 and having him acknowledge the amount of detail in a properly transferred DVD, I can tell you that DVD isn't bad.

Too often people get too wrapped up in resolution. They either forget or don't know that contrast and detail play an equally, if not more, important role than resolution. Watch Baraka upscaled through a real video processor and then watch '1080p' video off someone's handicam and tell me which looks better.


RE: there is a diff?
By Xaussie on 7/20/2010 2:04:48 PM , Rating: 2
Properly mastered DVD being a key point here. If everything I saw on DirecTV was of the calibre of something like "Horton Hears a Who" on DVD (which would even put some Blu-rays to shame), I'd have no interest in HD at all, but sadly most DVDs are not nearly that good, and off-air DirecTV can be appalling at times.

Also color bit depth has nothing to do with gamut. Perceptually we can only resolve about 6.5 bits in a normally lit environment. Bit depth is simply how many slices you divide the color space into, and too few results in banding. Gamut is how much of the color space you can display and the original LCD displays (regardless of bit depth) could only display about 72% of NTSC. Modern displays like my NEC 3090 pretty much cover all of NTSC and all of Adobe RGB (and do it while being driven by an 8 bit graphics card).
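The steps-per-channel arithmetic behind the banding point can be spelled out (illustrative only; the ~6.5-bit perceptual figure is the poster's claim, not an established constant):

```python
# Distinct intensity steps per color channel at a given bit depth.
# Too few steps across a smooth gradient shows up as banding.
def levels(bits):
    """Number of distinct steps a channel of `bits` bits encodes."""
    return 2 ** bits

print(levels(8))        # 256 steps in an 8-bit channel
print(round(2 ** 6.5))  # ~91 steps claimed to be perceptible
```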


RE: there is a diff?
By AnnihilatorX on 7/20/2010 6:17:06 AM , Rating: 2
I remember a sales girl selling LG monitors.

She claimed that people generally prefer their 3D LCD monitor that uses passive polarizer glasses -- which is interlaced and has half the resolution of 1080p -- over the one that uses active shutter glasses at full HD.

The reason given was along the lines of 'information overload'.


Confused Article
By mckirkus on 7/19/2010 3:11:49 PM , Rating: 2
They should point out that 1080p@60hz per eye isn't possible yet with HDMI. You need a dual link DVI cable to handle that much bandwidth. I think HDMI 1.4 can handle 1080i@60.

Also, this sentence doesn't make sense:
quote:
"a resolution higher than 720p has been restricted because Sony contends that a higher frame rate would impact the quality of viewing."

They're using frame rate and resolution interchangeably.




RE: Confused Article
By integr8d on 7/19/2010 4:35:04 PM , Rating: 2
1.4 has the bandwidth for 1080p/60 as well as 4Kx2K resolution.


RE: Confused Article
By mckirkus on 7/19/2010 5:43:19 PM , Rating: 2
True but this article is about 1080p 3D content. From Wikipedia:
quote:
"HDMI 1.4a requires that 3D displays support the frame packing 3D format at either 720p50 and 1080p24 or 720p60 and 1080p24, side-by-side horizontal at either 1080i50 or 1080i60, and top-and-bottom at either 720p50 and 1080p24 or 720p60 and 1080p24.[114]"


Note there is no mention of 1080p60 in there.
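A back-of-envelope sketch of why the mandatory formats pair 1080p with 24 Hz but allow 60 Hz only at 720p (raw 24-bit pixel data with blanking ignored; these figures are my own arithmetic, not from the HDMI spec):

```python
# Raw uncompressed data rate for a frame-packed 3D mode:
# one full frame per eye, 24 bits per pixel, blanking ignored.
def gbps(w, h, fps, eyes=2, bpp=24):
    """Gigabits per second of raw pixel data."""
    return w * h * fps * eyes * bpp / 1e9

print(round(gbps(1280, 720, 60), 2))   # 720p60 3D:  ~2.65 Gbps
print(round(gbps(1920, 1080, 24), 2))  # 1080p24 3D: ~2.39 Gbps
print(round(gbps(1920, 1080, 60), 2))  # 1080p60 3D: ~5.97 Gbps
```

The two mandatory modes land in the same ballpark; frame-packed 1080p60 needs roughly 2.5x as much, which is consistent with it being left out.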


RE: Confused Article
By integr8d on 7/20/2010 12:55:47 AM , Rating: 2
Copy that. Still reading papers on HDBaseT. Too many specs!


RE: Confused Article
By SPOOFE on 7/19/2010 4:35:05 PM , Rating: 2
quote:
They're using frame rate and resolution interchangeably.

Not necessarily; they're saying that with the finite bandwidth or processing power, there's a trade-off between frame rate and resolution, as upping the latter may impact the former.


RE: Confused Article
By mckirkus on 7/19/2010 5:47:50 PM , Rating: 2
No, they're confused. This should read:

quote:
"a resolution higher than 720p has been restricted because Sony contends that a lower frame rate would impact the quality of viewing."


RE: Confused Article
By SPOOFE on 7/19/2010 6:32:37 PM , Rating: 2
So it was just misspeaking. All of the other comments seem to quite clearly recognize the trade-off between resolution and framerate given static hardware.


RE: Confused Article
By zephyrxero on 7/19/2010 5:53:55 PM , Rating: 2
Guess what...the PS3 does NOT ship with an HDMI 1.4 port...so doesn't matter, they're limited to the original specs that max 1080p out at 24fps. Maybe new systems will come out with an updated port, but what about the millions already out there?


RE: Confused Article
By afkrotch on 7/19/2010 9:48:54 PM , Rating: 2
Wonder if they can push more in a different way. Like create an adapter to utilize the HDMI port and component connections.


RE: Confused Article
By Aloonatic on 7/20/2010 9:53:12 AM , Rating: 2
Out of interest, what is possible with the component video output that the original Xbox 360 was released with? There are still a few of them about in the wild, so I assume that MS would have to take that into account when deciding on specs like this if they want to produce 3D content too.


Who cares.
By Ristogod on 7/19/2010 2:40:25 PM , Rating: 4
Like many before have already stated, until they come up with a 3D interface that doesn't rely on wearing some stupid glasses, I'm not even considering replacing my very nice and still very new 55" LED LCD.




RE: Who cares.
By Phoque on 7/19/2010 4:57:02 PM , Rating: 3
I care. I understand though you're not interested; I wouldn't be either if I had just bought such a tv as yours. But I've only got a 32" and since I am planning to buy a bigger one after the coming holiday season (Boxing Day), why not go for 3D? I personally don't mind wearing the glasses. I don't mind either if there is only a handful of games that come out with the feature. I do hope that all 3D movies will come out on 3D Blu-ray though. I don't mind forking out the extra $500 either for the glasses + TV 3D support (likely to be less by the time I buy the stuff).


RE: Who cares.
By cmdrdredd on 7/19/2010 5:33:32 PM , Rating: 2
quote:
and since I am planning to buy a bigger one after the coming holiday season (Boxing Day), why not go for 3D? I personally don't mind wearing the glasses. I don't mind either if there is only a handful of games that come out with the feature. I do hope that all 3D movies will come out on 3D Blu-ray though. I don't mind forking out the extra $500 either for the glasses + TV 3D support (likely to be less by the time I buy the stuff).


That's all well and good, but there is no broadcast standard, everyone watching needs the $100-a-pop glasses, and they do induce headaches. Watching a 2-hour movie is one thing; playing an 8-10 hour marathon session of the latest game is quite another. Plus I wear glasses already, and it's extremely uncomfortable to wear 2 pairs of glasses (there are no prescription 3D glasses, and I can't wear contact lenses, so don't even mention it). I can also see games being made gimmicky just because they're 3D -- silly stuff done in games just to make some fancy effect you couldn't do in 2D. It will be a very small section of the market that actually uses 3D, so I doubt developers will dedicate real resources to it. That's why Nintendo didn't do 3D games before. They actually had a GameCube with a special LCD that could output 3D, but they realized that people would have to buy the LCD as well. So they waited until they could build a handheld system with a 3D display built in that doesn't require any glasses or extra peripheral.


RE: Who cares.
By Phoque on 7/19/2010 9:22:46 PM , Rating: 3
I wear glasses too. Every time I go to the cinema, I look at the movie in 3D and I forget about the two pairs of glasses I am wearing (and they're cheap 3D glasses). For me, depth adds to the experience of any good movie and would similarly add to the experience of any good game. But it has to be a good game to begin with, not some gimmick like you refer to.

We naturally see in 3D, so how could it not add something? Even if it means lowered detail, I believe I will feel more immersed with 3D perspective than with lots of eye candy I don't have much time to pay attention to in action games anyway (unless I stand still to admire it while a sniper is aiming for my head).

If it gives you a headache, I can understand it's a turn-off. Also, at this stage of my life, I don't play for 8 hours in a row, and I've never had a headache after watching several 3D movies, including the 2h42 long Avatar.

I agree though it probably won't catch on with that many people this gaming generation; that's why I hope movies will be sold in their 3D versions.


RE: Who cares.
By afkrotch on 7/19/2010 9:51:26 PM , Rating: 2
Many ppl won't mind wearing the glasses. Me, I dislike the flickering. Yes, there is a noticeable flicker with 3D, because of the way it's done.

Every other frame is meant for a separate eye, which is what the glasses are needed for. One lens will be black while the other is clear, and they swap between the two in sync with the tv. This creates a flickering effect, which is extremely annoying.

Until they can get rid of that completely, I won't bother.


Excuses...
By Marlonsm on 7/19/2010 3:22:34 PM , Rating: 2
First, they say the resolution is restricted because a higher frame rate would negatively impact the quality of viewing. (Makes lots of sense!!! Not)

Then, they say that 720p and 1080p are just too close to tell the difference. (Well, 1080p is just over twice the number of pixels, that's not a big difference, is it?)

And finally, they say that games in 3D could have an advantage. (Not completely wrong, but if they want gamers to have equal conditions, they shouldn't allow the PS3 to be used with different screens.)

Sony, just be honest and say that your hardware can't handle that.




RE: Excuses...
By afkrotch on 7/19/2010 9:56:39 PM , Rating: 2
Not seeing how it doesn't make sense. To get higher frame rates, the quality needs to be dropped. Yes, it's due to hardware limitations, but that doesn't make it a false statement.

You can't see the difference between 1080p and 720p when it's concerning movies. With a game, there is a difference, if the game makes use of the difference. If it's a 720p game upscaled, wtf is the difference? If you get a higher degree of POV with 1080p over 720p, then yes, there is a massive difference.


RE: Excuses...
By SlyNine on 7/20/2010 2:49:49 AM , Rating: 2
Actually, I CAN tell the difference between 720p and 1080p in movies. If you cannot, get a better screen or pay more attention. Or maybe that's a personal limitation others are not bound to.

I'm not sure noticing the difference between 720p and 1080p in 3D would be as easy, honestly. As each eye is getting a different reference, the brain has more to deal with and more information from which to build a complete picture of the scene. Since your brain doesn't have just one 2D image to worry about, the difference might be less noticeable, with the extra information making up for it. After all, the brain doesn't care about what it can't see; that's why you never notice your blind spot.


Quite obvious...
By Daniel8uk on 7/19/10, Rating: 0
RE: Quite obvious...
By EclipsedAurora on 7/20/2010 11:10:58 PM , Rating: 1
Actually, it has 2 root problems:

First, 3D needs double the frame rate of a traditional 2D display. A 1080p60 signal in 3D mode means two 1080p60 streams, one per eye -- together, that's 1080p at 120Hz. Even the fastest PC GPUs today can run into trouble rendering at such a high frame rate!

Second, even though the PS3 was over-engineered for 2006 hardware, its HDMI 1.3 port can handle 3D transmission at 1080i60 only. We need HDMI 1.4 in order to achieve 1080p60 3D transmission. Furthermore, as we can observe on Samsung 3D TVs, in practice interlaced flickering is very obvious when an image is displayed in 3D mode. Even the European Broadcasting Union recommends 720p over 1080i for stable and clear video transmission!


What's not being said
By SSDMaster on 7/19/10, Rating: -1
RE: What's not being said
By jnemesh on 7/19/10, Rating: 0
RE: What's not being said
By jbizzler on 7/19/10, Rating: 0
RE: What's not being said
By SSDMaster on 7/19/2010 3:47:50 PM , Rating: 1
I agree 720P with a higher frame rate would be better than 1080P in 3D... But you're telling me being able to discern between 1080P and 720P is something I don't "know"?

I don't get it.


RE: What's not being said
By Xaussie on 7/19/2010 4:00:50 PM , Rating: 3
Anyone with enough training can tell the difference between 720p and 1080p but the differences are slight. I am a trained graphic artist and professional photographer and to me the differences are very subtle. Stuttering framerates on the other hand are very annoying and seriously affect my ability to play certain games.

I for one will happily take 720p if the frame rate is smooth. Ratchet and Clank is one of the best looking games out there (stunning on my 52" Samsung) and it's only 720p.


RE: What's not being said
By sviola on 7/19/2010 3:23:14 PM , Rating: 2
I think the limitation is regarding the HDMI connection. If I am not wrong (and correct me if I am, please), HDMI 1.3 does not have the bandwidth to move 1080p 3D; that's why they created the HDMI 1.4 standard.


RE: What's not being said
By GuinnessKMF on 7/19/2010 3:25:00 PM , Rating: 2
Movies can be displayed at 1080p@24fps because they are temporally aliased, that is, they are 'blurred' over time. A single frame in a movie consists of all the light from 1/24th of a second blurred together; a single frame in a video game is a snapshot of the exact position of all elements at the time of rendering. Because our brains can notice these exact frames, kind of like a flipbook, we need more frames in order for our brains to interpret them as motion. So while 1/24th looks smooth for a movie, 1/60th looks smooth for a video game.

For a 3D scene, you need to take this snapshot twice per frame (once slightly shifted to the other eye's perspective), and both the hardware and the HDMI 1.3 cable are incapable of keeping up with the requirements. To say the Cell processor is powerful enough is not only incorrect, but assumes that developers are actually programming to properly utilize all of that power (most do just enough performance tuning to make it acceptable at the current level, and there isn't any leftover headroom to double the requirements).

Additionally movies don't have to worry about anything other than decoding what has already been recorded, they don't have to worry about all the other factors that go into a video game and could degrade performance (pathing, anti-aliasing, etc).


RE: What's not being said
By SSDMaster on 7/19/2010 3:53:02 PM , Rating: 1
quote:
To say the cell processor is powerful enough is not only incorrect, but makes assumptions that developers are actually programming to properly utilize all of that power


Not sure how you got that assumption, but the Cell processor is a long ways off from being fully utilized.

HDMI 1.3 vs 1.4 is just a higher quality spec. HDMI 1.0 over a very short distance could easily handle 1080P over 3D.


RE: What's not being said
By SPOOFE on 7/19/2010 4:28:06 PM , Rating: 2
quote:
the Cell processor is a long ways off from being fully utilized.

Tell that to the boys at Naughty Dog, they seem to have a different opinion on that matter.


RE: What's not being said
By cmdrdredd on 7/19/2010 5:27:41 PM , Rating: 2
quote:
Tell that to the boys at Naughty Dog, they seem to have a different opinion on that matter.


What IS being used to the fullest is the SPUs in the system. Uncharted 2 used all of them up but barely touched the CELL CPU because it's harder to code a game to use the CELL. It's more simplistic for the developer to offload various tasks to the multiple SPUs.


RE: What's not being said
By SPOOFE on 7/19/2010 6:57:13 PM , Rating: 3
quote:
What IS being used to the fullest is the SPUs in the system. Uncharted 2 used all of them up but barely touched the CELL CPU because it's harder to code a game to use the CELL.

The SPUs essentially ARE the Cell processor, as well as being the most difficult aspect to use for games. The other processing element is a derivative of IBM's Power architecture and is not at all exotic, mysterious, or difficult to utilize. It is completely conventional.

Note: IBM's Power architecture is also the basis for the X360's CPU.

quote:
It's more simplistic for the developer to offload various tasks to the multiple SPUs.

False; what you just described is the most difficult part of coding for the Cell. Multithreading is very difficult for games; asynchronous multithreading even more so. The Power-derived "main" core of the Cell is the easiest to code for, as it is the most robust and has the most developer tools and experience behind it. The SPUs are very, very difficult.


RE: What's not being said
By SlyNine on 7/20/2010 3:03:38 AM , Rating: 2
Do you mean the SPEs? All the Cell is, is one PPE and 8 SPEs, one disabled and one dedicated to the OS. In fact, the Cell is very weak at many things due to this design.

The tri-core PowerPC "Xenon" (built on the same core as the PPE in the PS3, but with 3 of them instead of 1) is more flexible and arguably has more usable power, even though it has less theoretical power. You seem to have been duped by hype, popular myth, and marketing. Until someone is able to get that kind of performance in a GAME, the Cell, for all its theoretical and Folding@home performance, is nothing great for gaming. It's a great science workstation for SOME types of work.

The "it's harder to code games for" line is a cop-out; until someone does so, you don't even know if it's possible.


RE: What's not being said
By SlyNine on 7/20/2010 3:10:47 AM , Rating: 2
Sorry meant for SPOOFE


RE: What's not being said
By SlyNine on 7/20/2010 3:12:55 AM , Rating: 2
Did it again Sorry spoofe, meant for SSDMaster


RE: What's not being said
By StevoLincolnite on 7/20/2010 12:19:32 AM , Rating: 2
quote:
Not sure how you got that assumption, but the Cell processor is a long ways off from being fully utilized.


The Cell is a "cheap" processor; it was designed so it wouldn't cost the Earth, whilst providing adequate performance. I mean, you can't expect a $1000 processor in a console, can you?

The Cell used its transistor budget well, but it's nowhere near the level of a 6-core Phenom or Core i7.


RE: What's not being said
By spazmedia on 7/19/2010 3:58:53 PM , Rating: 2
quote:

Movies can be displayed at 1080p@24fps because they are temporally aliased, that is, they are 'blurred' over time. A single frame in a movie consists of all the light from 1/24th of a second blurred together; a single frame in a video game is a snapshot of the exact position of all elements at the time of rendering. Because our brains can notice these exact frames, kind of like a flipbook, we need more frames in order for our brains to interpret them as motion. So while 1/24th looks smooth for a movie, 1/60th looks smooth for a video game.

Regarding 1/24th of a second for cinema, you are incorrect. When shooting a movie you can capture a frame at anywhere from 1/25th of a second (if shooting at 24FPS) to around 1/10000th of a second, depending on the camera and available light. What's more, you could easily simulate what you call temporal aliasing in a video game. In fact many games already do this, even on the PS3. This is one of the benefits of the Cell processor...


RE: What's not being said
By integr8d on 7/19/2010 4:44:30 PM , Rating: 2
I think you mean 1/48th of a second @24fps, 180degree shutter. With no shutter, you could capture at 1/24th, albeit with a lot of smearing (film) or blurring (imager). But who would do that?


RE: What's not being said
By SPOOFE on 7/19/2010 7:00:21 PM , Rating: 2
quote:
When shooting a movie you can capture a frame at anywhere from 1/25 second (if shooting at 24FPS) to around 1/10000 of a second depending on the camera and available light.

And if you compare 1/25 of a second to 1/10000 of a second, you'd see smoother results with the former as opposed to the latter when played back at 24fps.


RE: What's not being said
By GuinnessKMF on 7/19/2010 11:49:25 PM , Rating: 2
Sure you can simulate temporal aliasing in video games, but it requires a substantial amount of processing power; it's not trivial. It's actually easier to just output 60fps than to output nicely temporally aliased 24fps. The point is that the processing required to produce smooth 3D video games is greater than what's required to output smooth 3D video. (As for the shutter speed stuff, I didn't want to have to post an entire Wikipedia article to tell someone that you can't say "you can do it in movies, why can't you do it in video games".)


RE: What's not being said
By FormulaRedline on 7/19/2010 3:24:13 PM , Rating: 5
quote:
Considering how powerful the Cell is


The PS3 is four years old. The technology inside of it is probably closer to six. When is the last time you saw a gaming computer running with 256MB of RAM or a high end GPU with 256MB of memory? Those are the specs of a PS3. It's not a question of cable bandwidth, the machine just can't process or store that much information for a modern game with high polygon models and decent textures.

Asking it to run in stereo is asking for twice the graphics output. If it reduces back to 720p, then it gets over half the pixels back, making up for the difference.
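The pixel arithmetic in that claim can be checked directly (plain arithmetic, no assumptions beyond the standard resolutions):

```python
# Pixels per displayed frame: mono 1080p vs. stereo (two-eye) 720p.
px_1080p_mono = 1920 * 1080        # 2,073,600 pixels
px_720p_stereo = 1280 * 720 * 2    # 921,600 per eye -> 1,843,200 total

print(px_1080p_mono)    # 2073600
print(px_720p_stereo)   # 1843200
# Stereo 720p needs ~11% fewer pixels per frame than a single 1080p image.
```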

And yes, I completely agree with the other posters that anyone that can't tell the difference between 720 and 1080 needs to be relegated to SD tube televisions and a Wii.


RE: What's not being said
By SSDMaster on 7/19/10, Rating: -1
RE: What's not being said
By SPOOFE on 7/19/2010 4:30:21 PM , Rating: 2
His comments seemed to focus on RAM. Can you say, "bottleneck"?


RE: What's not being said
By SlyNine on 7/20/2010 3:09:14 AM , Rating: 1
Yes, theoretical Gflops. What happens when you apply real-world code that is actually useful to a GAME? That performance drops below that of my CPU. My Core i7 would wipe the floor with the Cell in most apps we use today, and for what it doesn't, my 5870 would continue the ass-whooping in the rest.


RE: What's not being said
By gamerk2 on 7/19/2010 3:46:43 PM , Rating: 2
Keep in mind, the actual image doesn't need to be rendered twice, just the output. The problem is more a limitation of HDMI than anything else; HDMI does NOT have the bandwidth for 1080p @ 120Hz.
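For reference, the raw numbers behind that bandwidth claim can be computed from the standard CEA-861 1080p timing (2200x1125 total pixels including blanking). This is a rough sketch that ignores audio and protocol overhead:

```python
# Raw data rate for 1080p at 120 Hz over a single HDMI TMDS link.
total_w, total_h = 2200, 1125   # 1920x1080 active area plus blanking
bits_per_pixel = 24             # 8-bit RGB

pixel_clock_hz = total_w * total_h * 120        # 120 Hz refresh
raw_gbps = pixel_clock_hz * bits_per_pixel / 1e9

print(pixel_clock_hz)        # 297000000 (a 297 MHz pixel clock)
print(round(raw_gbps, 2))    # 7.13 Gbps before TMDS encoding
```

With 8b/10b TMDS encoding that becomes roughly 8.9 Gbps on the wire; whether the HDMI revisions and receivers of the era could actually carry a 297 MHz pixel clock is exactly what this sub-thread is arguing about.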


RE: What's not being said
By SSDMaster on 7/19/2010 3:55:47 PM , Rating: 2
Yes it does! They already have it...


RE: What's not being said
By Phoque on 7/19/2010 5:23:52 PM , Rating: 1
quote:
And yes, I completely agree with the other posters that anyone that can't tell the difference between 720 and 1080 needs to be relegated to SD tube televisions and a Wii.


And I think you are an asshole, because you couldn't tell the difference yourself. The difference between 1080 and 720 is only about 2X.

The difference between SD and 720 is more like 3X, so you can easily discern it.

The main reason people are shocked at HD quality VS analog TV is that analog TV broadcast is 330x485 ( 6X factor compared to 720 !!! ).

http://hometheater.about.com/cs/television/a/aavid...
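For what it's worth, the ratios quoted above can be checked directly, taking the comment's own 330x485 figure for analog broadcast at face value:

```python
# Resolution ratios quoted in the comment above.
hd_1080 = 1920 * 1080    # 2,073,600 pixels
hd_720 = 1280 * 720      # 921,600 pixels
analog = 330 * 485       # 160,050 -- figure quoted above for analog TV

print(round(hd_1080 / hd_720, 2))   # 2.25, the "only 2X"
print(round(hd_720 / analog, 2))    # 5.76, the "6X factor"
```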


RE: What's not being said
By quiksilvr on 7/19/2010 3:28:06 PM , Rating: 2
Haters gotta hate.
Trollers gotta troll.

For games, it's going to be 720p at 60 frames per second (x2 because it's 3D).

For movies, it's going to be 1080p at 24 frames per second (x2 because it's 3D).

The high frame rate of games is the reason it's 720p.
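Those two modes actually move a comparable number of pixels per second, which is the trade-off in a nutshell. A quick check:

```python
# Pixels per second, counting both eyes.
game_3d = 1280 * 720 * 60 * 2     # 720p games at 60 fps in 3D
movie_3d = 1920 * 1080 * 24 * 2   # 1080p Blu-ray at 24 fps in 3D

print(game_3d)    # 110592000
print(movie_3d)   # 99532800
# 3D games at 720p60 push ~11% more pixels/sec than 3D film at 1080p24.
```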


RE: What's not being said
By integr8d on 7/19/2010 4:16:05 PM , Rating: 2
"Considering how powerful the Cell is..." In relation to this discussion, do you have metrics?

720p will be fine in 3D. The 3D effect tricks the mind into seeing resolution, where it is not. Your brain 'fills in the blanks' so to speak.

Add to that, most TVs don't render motion at 1080p. The Panasonic, like another poster mentioned, is one of the few exceptions. But that TV (I'm currently using one on a large budget 3D feature) suffers from horrible white balance issues in 3D. The TV overcompensates for the color shift in the glasses. So I've had to yank the handles all over the place to get it even remotely close to a REC 709 white point. Also, it seems to throw out half of its color depth in 3D. Possibly having to split it up for each eye. All in all, while not a digital cinema projector, it's not bad.

But back to the PS3. Most people won't notice the lower resolution. The material I'm viewing is 1080p, side-by-side (960x1080 left eye, 960x1080 right eye). It's not at all objectionable. On a still frame, you COULD nitpick. But the brain compensates so easily. And while watching a moving 3D image, it's a complete wash.

Best not to get caught up in marketing terms. They'll do anything to 'sell you'.


RE: What's not being said
By Phoque on 7/19/2010 4:45:22 PM , Rating: 2
You don't know what you're talking about. The Cell is not suited for graphical rendering. Rendering is a job best suited for the GPU.

"Not enough processing power?"

So yes, as you asked, it is a problem of processing power: GPU processing power.

"37"+ its easily discernible."

Sigh. MAYBE for a static image you peek at a foot away from the screen. Never for anything that moves the slightest.


RE: What's not being said
By afkrotch on 7/19/2010 9:59:51 PM , Rating: 1
1080p movies at 24 hz. Games push 60 hz. Try again.


"I f***ing cannot play Halo 2 multiplayer. I cannot do it." -- Bungie Technical Lead Chris Butcher

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki