


The latest PlayStation 3 SDK solves half of the 1080i upscaling equation

A sore spot for the PlayStation 3 is its lack of resolution scaling for native 720p games to 1080i. Such a feature is especially important to those with older HDTVs that do not support 720p, as the PS3 will downconvert games to 480p rather than upconverting to the preferred 1080i.

Unlike the Xbox 360, the PS3 does not feature a dedicated scaling chip, so it was believed that those without 720p support would have to either buy a new television or just live with a highly compromised 480p/i image. That all may soon change, if the latest SDK (software development kit) is any indication of where things may be headed.

According to an article on Beyond3D, the late January PlayStation 3 SDK update brings with it the ability for developers to access a hardware scaler – albeit one that’s limited to only horizontal scaling. For a 720p image to be upscaled to 1080p/i, both horizontal and vertical scaling are required.

The added scaling functionality, as it stands now, must be coded into every game by the developer. With only horizontal scaling available, a 1080p/i image must still be internally rendered by the PlayStation 3 at a vertical resolution of 1080. The horizontal resolution, however, can now be less than the standard 1920, reducing the computational cost on the graphics subsystem.

For example, developers can render at 960x1080, which requires only a slightly larger framebuffer footprint than 720p, and the image can then be either downscaled to 1280x720 or upscaled to 1920x1080 for output to the HDTV. As can be deduced, rendering at the rather non-standard 960x1080 sacrifices horizontal resolution in exchange for increased vertical resolution and compatibility with 1080i sets.
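The tradeoff is easy to quantify. A rough sketch, assuming a single 32-bit color buffer (4 bytes per pixel) and ignoring the depth/stencil buffers and multisampling a real game would add:

```python
# Rough framebuffer footprint comparison. Assumes a single 32-bit
# color buffer at 4 bytes per pixel; real games add depth/stencil
# buffers, multisampling, etc.
BYTES_PER_PIXEL = 4

def footprint_mb(width, height):
    return width * height * BYTES_PER_PIXEL / (1024 * 1024)

for w, h in [(1280, 720), (960, 1080), (1920, 1080)]:
    print(f"{w}x{h}: {footprint_mb(w, h):.2f} MB")
```

960x1080 carries only 12.5% more pixels than 1280x720, yet exactly half the pixels of a full 1920x1080 frame, which is why it is attractive as a 1080-line rendering target.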

It is unknown if there is any more scaling functionality waiting to be unlocked in future developer tools, and if there are, what Sony’s reasons are for not making them available today. For now, it appears that the horizontal scaling feature is the only patch for PS3’s scaling limitations.






At least, but....
By WayneG on 1/27/2007 6:41:56 AM , Rating: 2
Well, at least Sony are trying to increase the compatibility. It's just such a shame that they chose such a mismatch of hardware: 512MB total RAM for the whole system is such a bottleneck to games. Poor devs have to find the best way to exploit the graphics chip without strangling the rest of the system. Why didn't SCE think of this when they had an extra year to design the console?




RE: At least, but....
By StevoLincolnite on 1/27/2007 7:04:20 AM , Rating: 3
512MB of RAM is not a bottleneck for a console. The Nintendo 64, even back in its time when graphics looked amazing, only had 4MB of memory (8MB when upgraded - mind you, it was Rambus...).
The original Xbox had 64MB of RAM; to run the same games on a PC you'd need 256MB for a decent experience...
The Xbox 360 has 512MB of system memory (shared with the video card, so 256MB really...) running at 700MHz.
The PlayStation 3's system memory is only 256MB but it's running at 3.2GHz.
Remember, with a console there isn't an operating system which consumes 200MB or more of RAM on its own. Plus bandwidth and latencies are generally better than on a PC, and there are no drivers, etc. A console excels at one thing and that's gaming. Why do you think a console with much lower system specs can match a PC performance-wise? There's no bloat anywhere! A game which may need 512MB of RAM on a PC may only need 64MB on a console.


RE: At least, but....
By goku on 1/27/2007 7:46:22 AM , Rating: 5
Half true. While consoles do win in the sense that they have less bloat, your argument about the 512MB RAM requirement for PC vs. 64MB for the Xbox is flawed.

One thing you didn't consider, and A LOT of people don't, is that the original Xbox, PS2 and Dreamcast only had to render in 480i/p; as for the PS1 and N64, they only really had to render at 320x240 resolution. The resolutions of the older consoles are considerably less than that of the PC, and therefore you could get away with having much less video memory, and system memory for that matter.

Because you can use lower resolution textures at a lower output resolution, you don't need as much RAM. Now, whether 256MB of RAM is enough for 1080p, that's up for debate; personally, if they had gone with 512MB, I'd have been a LOT happier, as I don't believe 256MB allows for much expansion in future titles.

The reason I feel that 256MB isn't enough is that when you get into the realm of 1280x720 and 1920x1080, the potential for using higher resolution textures and/or more textures in a given environment grows tremendously, so having more memory would have allowed a lot more expansion and more detailed environments in later game titles. While you won't get more pixel-pushing power or shader power by having more memory, it would allow for much larger games to be loaded, and you'd still be able to get more detail without needing a more powerful graphics processor.
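The resolution jump goku describes is substantial when counted in raw pixels. A quick sketch (nominal output resolutions only; actual texture budgets depend on far more than this):

```python
# Pixel counts per frame: the jump from SD to HD output resolutions,
# which is what drives the appetite for bigger textures and more VRAM.
RESOLUTIONS = {"480p": (640, 480), "720p": (1280, 720), "1080p": (1920, 1080)}
base = 640 * 480

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:>7} pixels ({pixels / base:.2f}x vs 480p)")
```

A 1080p frame carries 6.75 times the pixels of a 480p frame, so textures that looked sharp on the older consoles simply run out of detail at HD resolutions.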


Now, the reverse is true as well: with little video memory and, say, an upgradable graphics processor, you could have lots more cool effects, but you wouldn't have the ability to build as lush and interactive an environment as you could with more VRAM.

Having more VRAM and system RAM allows for a more lush, interactive environment with more depth as well.

Now you're wondering: why do all these video card reviews show no performance improvement between the 256MB version of a video card and the 512MB version of the same card? The reviewer says it's because the card is too slow to take advantage of the extra memory, making the increase pointless.

Yes, the review is right: the card is too slow to take advantage of the extra memory due to its lack of processing power. However, despite this, having more VRAM (and for every increase in VRAM you need a matching increase in system memory) is still useful; it just isn't taken advantage of because of how PC games are made. See, a PC game is made to take advantage of the latest cards' processing power. That means the demands on pixel-pushing power (from things such as special effects), VRAM, system RAM and a host of other things all scale in a fairly linear fashion, with a bump in VRAM being needed about every three video card generations.

PC games have the mentality of total performance improvement; a console does not. So if a game is designed with every demand increasing and you throw in an older generation card with lots of VRAM, you'll get no performance improvement, because this 'new game' puts a demand on every part; improving one part only moves the bottleneck to everything else, making the upgrade moot, as the reviewers say. The PC adds new demands to every aspect of a game (special effects, interactivity, textures, physics, draw distance, etc.); a console cannot...

Now, a console is completely custom hardware, the same platform for everyone; the only thing that changes is the code that runs on it. A game can in fact take advantage of the extra memory in a console even if a PC game can't, because it's specialized hardware. Specialized hardware basically prevents any one component from bogging down the whole system, and optimizations are very possible on consoles, while not nearly as easy on a PC, since there is a vast number of different PC configurations. So on a console, you can keep the pixel-pushing requirement in check with the GPU by having fewer special effects (like HDR) than the PC port, and then set up the environment to take advantage of 512MB of VRAM and system RAM by having cans roll around on the ground or lots of trees scattered around.


Do you see how this makes sense? Trees, bushes, bodies, buildings and cans aren't very demanding on pixel-pushing power, yet some games won't work with an FX5200 card with 256MB while a 9800 Pro can tackle the same game with a measly 128MB of RAM; that's because the game increased the demands on the amount of VRAM, system RAM AND pixel-pushing power, effectively making that FX5200 worthless. Now, one aspect I never mentioned is how these new objects will affect the CPU. Since adding movable objects puts a demand on physics processing, you may not be able to add very many cans or dead bodies, but you will be able to have very clean and crisp textures, as you've got all of that system RAM and VRAM, right?

I hope this makes more sense for people interested in the balancing act that all console makers AND game writers have to do.


RE: At least, but....
By EclipsedAurora on 1/27/07, Rating: -1
RE: At least, but....
By sxr7171 on 1/27/2007 2:45:48 PM , Rating: 2
What the heck are you talking about? I can play games on my PC at 16x FSAA and never see a jaggie, while my stupid PS3 has jaggies all over the place, and yes, it is connected to an HDTV with internal post processing. I don't want my TV to mess with the signal it is given; I want the source to provide it right and the TV to maintain fidelity to that source.


BTW, a Dell 2407FPW, currently the cheapest 24" computer monitor, is around $700, more expensive than all but a few 26" LCD TVs. Maybe having a real high resolution like 1920x1200 is the reason (not some joke like 1366x768).


RE: At least, but....
By saratoga on 1/27/2007 8:19:01 PM , Rating: 3
1) Bullshit. Doom3 is an obvious example:

http://www.bluesnews.com/plans/313/

The UT series also uses it.

3) I don't know why you think this, but it's completely false (not to mention really bizarre). There's no difference between consoles and PCs: they use the same display technology and interfaces (which is why you can use a PC monitor as a TV or a TV as a PC monitor). Additionally, it's not actually possible to do FSAA after you've already committed your frame to the output buffer (since you've lost all the geometry info), so even if they used different displays, what you're describing would be impossible.

You could make a TV that blurred everything you sent to it. That would make it harder to see aliasing, but it's probably not what most people would want.


RE: At least, but....
By Tyler 86 on 1/28/2007 10:28:31 PM , Rating: 2
The FSAA comment is wrong; FSAA is not MSAA. FSAA is possible.
(...but to save you some reading time, no monitor or television performs it, because of the environment they are designed to operate in)

MSAA clearly uses geometry information to produce more accurate samples to reduce geometric jaggies, with the negative side effect of requiring transparency AA to handle antialiasing of textures.

Simple AA can be done per texture or per geometric shape (in regard to PC optimizations), but in its original conception, full-scene anti-aliasing was merely over-rendering (a large framebuffer) and down-scaling (to a smaller output buffer, or a second framebuffer); this meant large increases in scene render latency.

Instead of merely blurring, antialiasing can be applied additively (a pre-weighted average of N samples per frame) in such a fashion.

Many optimizations have come to pass since then, however. For example, the smart embedded RAM in the Xbox 360's GPU provides 'free' antialiasing in a vaguely similar additive manner.

A game rendered at 90 fps, sent at 90Hz to an output buffer, with each output frame antialiased as 3 weighted offset samples (90/3), displayed on a 30Hz monitor, is a fantastically simple way of achieving this.
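The weighted-sample accumulation described above can be sketched as a toy model (frames as flat lists of gray values; the equal 1/3 weights are an assumption for illustration, not something stated in the post):

```python
# Toy temporal accumulation: blend three 90Hz subframes into one
# 30Hz output frame as a weighted average (equal 1/3 weights assumed).
def accumulate(subframes, weights):
    out = [0.0] * len(subframes[0])
    for frame, weight in zip(subframes, weights):
        for i, gray in enumerate(frame):
            out[i] += weight * gray
    return out

# Three slightly offset renderings of the same 4-pixel scanline edge:
subframes = [[0, 255, 255, 0],
             [0, 255, 0, 0],
             [255, 255, 0, 0]]
blended = accumulate(subframes, [1 / 3, 1 / 3, 1 / 3])
# Edge pixels settle at intermediate gray levels instead of hard steps.
```

The jittered edge averages out to intermediate grays, which is the whole point: the staircase softens without any per-pixel geometry knowledge at display time.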

Now, in your favor, it's true that this sort of thing simply doesn't exist on today's televisions and monitors, since it can (and should) be done in the graphics processor.

The anti-aliasing and post-processing of modern monitors and televisions are designed around motion-based blurring, and contrast-based anti-aliasing and sharpening. ATi & nVidia TV cards have these features for streaming video enhancement.

So, basically, by buying a motion-blurring, contrast-sharpening and antialiasing television, you're wasting your money if you're not attempting to watch low-quality signals.

Monitors are designed with perfect signal quality in mind, whereas televisions intend to compensate... that's all.



RE: At least, but....
By Sid7039 on 1/27/2007 9:28:28 AM , Rating: 2
quote:
The Xbox 360 has 512MB of system memory (shared with the video card, so 256MB really...) running at 700MHz.
The PlayStation 3's system memory is only 256MB but it's running at 3.2GHz.


1. It's better not to talk about megahertz (700MHz vs 3.2GHz) but about system memory bandwidth, which is 22.4GBps for the X360 and 25.6GBps for the PS3 (which is not a big difference)

2. You have 512MB of shared memory in the X360, so you can use e.g. 128MB for graphics and the rest for other things, or vice versa; it's hard to say that because of sharing it's really 256MB
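For reference, those headline numbers fall straight out of effective-clock × bus-width arithmetic (bus widths as commonly cited for the two consoles; this is my worked example, not from the post):

```python
# Where the headline bandwidth figures come from: effective transfer
# rate times bus width in bytes. Assumed bus widths: 128-bit GDDR3 on
# the X360, 64-bit XDR on the PS3. GBps here means 10^9 bytes/second.
def bandwidth_gbps(effective_mt_per_s, bus_bits):
    return effective_mt_per_s * 1e6 * (bus_bits / 8) / 1e9

x360_gddr3 = bandwidth_gbps(1400, 128)  # 700MHz clock, double data rate
ps3_xdr = bandwidth_gbps(3200, 64)      # 3.2GHz effective XDR rate
print(x360_gddr3, ps3_xdr)              # 22.4 25.6
```

This is why quoting the 700MHz vs. 3.2GHz clocks alone is misleading: the wider 360 bus nearly cancels the PS3's higher transfer rate.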


RE: At least, but....
By EclipsedAurora on 1/27/2007 1:13:07 PM , Rating: 1
quote:
It's better not to talk about megahertz (700MHz vs 3.2GHz) but about system memory bandwidth, which is 22.4GBps for the X360 and 25.6GBps for the PS3 (which is not a big difference)
It's better not to talk about system memory bandwidth only this time. For the X360, that 22.4GBps bandwidth is shared between the CPU and GPU. Admittedly, it has a very limited 10MB of eDRAM for the GPU, but the utilization of that eDRAM is limited to only certain operations.

Meanwhile, the PS3's 25.6GBps XDR-RAM bandwidth is completely dedicated to the CPU, while another ~20GBps of VRAM bandwidth is dedicated to the GPU (RSX). As I said in a previous reply, it is also possible for the RSX to "TurboCache" memory from XDR to improve the available VRAM bandwidth and size.

In the most extreme case, the RSX can pull data directly from the local store inside Cell's SPEs, at a speed limited only by the bus connection between the RSX and Cell, yielding another ~35GBps of extra bandwidth! But again, like the eDRAM inside the X360, not all processing is suitable to be "shared" into those SPEs. However, it's true that many PS3 title developers are working around these to improve the throughput of the PS3!


RE: At least, but....
By Sid7039 on 1/27/2007 2:27:59 PM , Rating: 2
Although I still think that we shouldn't talk about megahertz when comparing X360 and PS3 memory performance, generally it looks like you are right :-)


RE: At least, but....
By saratoga on 1/27/2007 8:04:09 PM , Rating: 1
The DDR in the 360 will probably have much better latency than the XDR in the PS3 though, so it'll likely be able to make much better use of its memory bandwidth.

IMO they're pretty evenly matched, provided you can use the eDRAM on the 360 and that you don't need more than 256/256 on the PS3.


RE: At least, but....
By KernD on 1/28/2007 10:56:26 AM , Rating: 2
The eDRAM is used for render targets only: you render to eDRAM and resolve to a texture in video RAM (multi-sampling if you want), then you can present the texture, or use it for rendering some more.

All the system RAM is enough to store plenty of high-res textures, and you can do streaming with any of the 6 hardware threads you have.

Developing on the 360 is easy; you get great tools from Microsoft, like PIX. The only thing I hate is that you can't remote debug on a test kit, you need a devkit for that, and they cost way more, so we mostly have test kits at work.


RE: At least, but....
By DingieM on 1/29/2007 4:28:44 AM , Rating: 2
The Xbox 360's total RAM is one piece of 512MB, not split between the video subsystem and the main processor system. It's the PS3's memory architecture that is conventional, i.e. split between dedicated main RAM and video RAM, hardware-wise.
This means that the Xbox 360 CPU and GPU have full access to the whole "flat" 512MB. Even so, its GPU can read and write DIRECTLY to the cache.


For Future Games Only
By jtesoro on 1/27/2007 6:29:26 AM , Rating: 2
This kinda helps for upcoming games, but does nothing for those existing right now. Oh well, it might not be so bad considering the number of PS3 games on the market at this point.




RE: For Future Games Only
By hadifa on 1/27/2007 7:27:59 AM , Rating: 2
Unlike the previous generation of consoles, developers can patch and add to their games now. Look at Saints Row (Xbox 360) for example.


RE: For Future Games Only
By sxr7171 on 1/27/2007 2:49:11 PM , Rating: 3
All that means is that they can get sloppy with their coding and release an unfinished product knowing that they can just patch it later. Console gaming really should just be about buying the game and the game working right from day one, not about waiting for patches to get things working right.


Plausible explanation
By borowki on 1/27/2007 7:26:50 PM , Rating: 3
From the forum at Beyond3D:

quote:

Folks, "horizontal scaling" does not require what people conventionally think of as a scaler. VGA cards and videogame consoles have been doing "horizontal scaling" for decades now. That's how 720x480-, 640x480-, 512x448-sized buffers from the PS2 all had the same horizontal size (more or less) on your TV. To "horizontally scale" your video output, you merely have to adjust the ratio of the RAMDAC's pixel clock to the horizontal scan rate. These are fundamental registers in every video circuit.

So, sorry, news of this "horizontal scaling" says absolutely nothing about the presence of a usable scaler in the PS3.

Phat
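Phat's mechanism is easy to illustrate: with the scanline period fixed, dividing the rate at which the framebuffer is read out stretches each source pixel across more output positions. A toy sketch (a 3-pixel line standing in for a 960-wide scanline):

```python
# Horizontal "scaling" via the pixel clock: halving the rate at which
# the framebuffer is read out, while the scanline period stays fixed,
# repeats each source pixel across several output positions
# (nearest-neighbour stretching, no dedicated scaler chip needed).
def scanout(line, clock_divider):
    out = []
    for pixel in line:
        out.extend([pixel] * clock_divider)
    return out

line = [10, 20, 30]        # stand-in for a 960-pixel scanline
print(scanout(line, 2))    # [10, 10, 20, 20, 30, 30]
```

This is the sense in which analog video circuits have "scaled horizontally" for free all along: it is a timing trick, not a filtering operation.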




RE: Plausible explanation
By saratoga on 1/28/2007 5:17:46 PM , Rating: 2
I think that only applies to analog outs. If you're using digital out, there is no RAMDAC, so I think you'd still need a scaler. Though to be honest, I don't know much about analog displays.


RE: Plausible explanation
By psychobriggsy on 1/29/2007 9:35:05 AM , Rating: 2
That doesn't work for HDMI.

Most likely the 'hardware' scaler in the PS3 is some latent functionality in the southbridge (a variant of Toshiba's Cell Companion Chip) whereby it can read in some graphics data serially, and transmit out a stretched version of that data - that only requires a few small buffers rather than a full framebuffer (and inherent latency). Vertical scaling may come in time, but that will require more buffer space (e.g., to scale 4 lines of graphics data into 6 lines (720P -> 1080P) or 4->3 (720P -> 1080i)).

The question is about the scaling algorithm. Nearest neighbour? Bilinear? 960x1080 -> 1920x1080 could be done by simply doubling the pixels, but the scaling system also supports 1280x1080, 1440x1080 and 1600x1080, so the scaling must be a bit more advanced than that.
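The practical difference between those two filters can be sketched in 1D (toy gray values, and a clean 2x ratio for simplicity; the actual ratios above are non-integer, which is exactly why a plain pixel-doubler wouldn't suffice):

```python
# 2x horizontal upscale: the two simplest filters side by side.
def nearest_2x(line):
    # Each output pixel copies its nearest source pixel.
    return [line[i // 2] for i in range(len(line) * 2)]

def linear_2x(line):
    # Interleave original samples with midpoints of neighbours.
    out = []
    for i, gray in enumerate(line):
        nxt = line[min(i + 1, len(line) - 1)]  # clamp at the edge
        out.append(gray)
        out.append((gray + nxt) / 2)
    return out

src = [0, 100, 200]
print(nearest_2x(src))  # [0, 0, 100, 100, 200, 200]
print(linear_2x(src))   # [0, 50.0, 100, 150.0, 200, 200.0]
```

Nearest neighbour preserves hard edges but produces blocky doubling; linear interpolation smooths transitions at the cost of some sharpness, and supporting the intermediate widths requires fractional sample positions along these same lines.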

Note that games can now natively support 720p and 1080p with only a ~12.5% difference in rendered pixels / framebuffer memory for the latter (960x1080). A simple menu option and you won't have any more forced-480p games on old 1080i HDTVs.


Bad....
By achintya on 1/27/2007 6:42:20 AM , Rating: 2
quote:
As can be deduced, rendering at the rather non-standard 960x1080 sacrifices horizontal resolution in exchange for a increased vertical resolution and compatibility with 1080i sets.


This looks bad. If the PS3 renders games at 960x1080 and upscales to something like 1920x1080, that will mean a much decreased effective resolution for those games. This could become a big issue when competing with X360 games on graphical quality.

And current users will still have to suffer till the newer games start coming out, which I guess will take approximately a year or so minimum.




RE: Bad....
By koss on 1/27/07, Rating: 0
RE: Bad....
By tkSteveFOX on 1/29/2007 3:26:20 AM , Rating: 2
Keep in mind that the Xenos is more powerful than the RSX. As far as the Cell vs. the X360 CPU goes, the Cell is more powerful. This is Sony's trademark from the original PS to the PS2 and now the PS3: build a machine with a much more powerful CPU and a not-so-great GPU. And so far it's working for them, but who knows. The future suggests that you need a much more powerful GPU than CPU. After all, the highest-end GPU, if used as a CPU, would be 5-6 times more powerful than the current top CPU.


RE: Bad....
By Hawkido on 1/29/2007 4:54:43 PM , Rating: 2
quote:
If the PS3 renders games at 960x1080 and upscales to something like 1920x1080, that will mean a much decreased effective resolution for those games. This could become a big issue when competing with X360 games on graphical quality.


LOL, read between the interlaced lines, man. This is just yet another rendering resolution to support HDTVs without native 720p support. If your TV can display 1920x1080 then the PS3 will use the 1080p format, if available on disc; if not, then it will use the 720p format.

Remember, this fix is ONLY for the early adopters of HDTV who got screwed, because the early sets couldn't support 1080p due to the technology of the day, and 720p hadn't become recognized as a broadcast standard at the time.


Coding into games...
By daftrok on 1/27/2007 1:05:43 PM , Rating: 2
With at least 20 GB at their disposal, is it remotely possible to introduce a 1 GB patch for all the games to fix the scaling issues? I mean, it isn't that many games, and future games (with this knowledge NOW) can fix the problem and allow some form of scaling. I'm glad that they are trying SOMETHING, but I think those poor souls without 720p would breathe a lot easier if this becomes a reality. This issue (and the price) are the only two issues concerning the PS3 now. My hope for the future is 480i/480p/720p/1080i/1080p scaling for all DVDs, PS2 games, Blu-ray Disc games, PS3 games and PS1 games (if possible) sometime this year via HDMI and component. Keep your fingers crossed!




RE: Coding into games...
By daftrok on 1/27/2007 1:06:38 PM , Rating: 2
Blu Ray Disc MOVIES*


What a shame .....
By kilkennycat on 1/27/2007 3:08:06 PM , Rating: 2
... that Sony will have to upgrade the hardware so soon after shipping the PS3. Should result in some very unhappy "early adopters"... such is life... If Sony had studied the Xbox 360 hardware design and the installed base of HDTVs in the US more closely, appropriate changes to the PS3 hardware would have been made before shipping the product. Another symptom of Sony arrogance. The proposed software workaround will result in some nasty and perceptually obvious jaggies -- aberrations in horizontal detail are far more obvious than vertical ones. No software hack is going to solve the current upscaler problem in a satisfactory way.




RE: What a shame .....
By cplusplus on 1/27/2007 4:31:58 PM , Rating: 2
quote:
The proposed software workaround will result in some nasty and perceptually obvious jaggies -- aberrations in horizontal detail are perceptually far more obvious than vertical. No software hack is going to solve the current upscaler problem in a satisfactory way.


This is not a software workaround in the way that you seem to think it is. The processing time for doing this should be almost no different from the processing time currently spent downgrading 720p games to 480p. From reading the forums on Beyond3D, it looks as if what Sony is doing is adding functionality via the SDK to a chip that is already in the system right now, and the Cell is not computing this stuff directly. It also says there the exact opposite of what you said: that aberrations in vertical detail are more noticeable than horizontal.






RE: What a shame .....
By kilkennycat on 1/27/2007 3:14:48 PM , Rating: 2
... oops. Apologies for the double posting. I accidentally double hit the "Post Comment" box. Seems as if this triggers a double posting...

BTW: The after-post editing facility on DailyTech is non-existent. I refer the DailyTech folk to the people over at "The Tech Report" on how to handle all facets of this comment-posting business properly.


"DailyTech is the best kept secret on the Internet." -- Larry Barber
