
The war of words continues

It's not uncommon for companies to downplay features their products lack compared to the competition's. They will usually argue that customers are unlikely to use the feature, or that it isn't cost-effective to implement at a certain price point. Microsoft has been on the defensive with its XBOX 360 for quite some time: the company has defended the lack of a built-in next-generation DVD drive, downplayed the importance of HDMI for next-generation games, and is now saying that 1080p doesn't really matter in the grand scheme of things.

Andre Vrignaud, Microsoft's Director of Technical Strategy for XBOX Live, says that 1080p on the PlayStation 3 is mostly hype. He goes on to say that 99% of PS3 games released will be rendered at 720p, while the only titles with native 1080p support will be arcade ports or games that sacrifice in-game effects. Here's a snippet from Vrignaud's Ozymandias blog:

The PS3 has roughly the same pixel-pushing capabilities as the Xbox 360. Don't need to take my word for it, it'll be obvious soon enough over the next year. Even if this wasn't the case, consider we now live in a multi-platform development world, and that the current sweet spot developers are targeting is 720p due to the extremely similar system specifications. Simply put, a developer who is planning to release their game for both the Xbox 360 and the PS3 will aim for a common attainable ground. In fact, I'll stick my neck out and predict that you won't see any 1080"x" games for the PS3 this year.

Vrignaud goes on to cite Home Theater Magazine's Geoffrey Morrison as validation for his criticisms of 1080p on today's consoles. Since Vrignaud notes that 99% of PS3 games will render at 720p, that leaves movies as the main argument for 1080p. "In this case, the only difference between 1080i and 1080p is where the de-interlacing is done. If you send 1080i, the TV de-interlaces it to 1080p. If you send your TV the 1080p signal, the player is de-interlacing the signal. As long as your TV is de-interlacing the 1080i correctly, then there is no difference," said Morrison.


RE: De-interlacing
By Trisped on 8/16/2006 11:58:20 AM , Rating: 2
PC monitors don't do upconversion, but the computers do. Just take a small movie file, open it in Windows Media Player, set it to scale with the size of the window, then resize the window very small and very large. You will see some distortion, but not pixels that have each become blocks of 4-16 pixels.

The problem with upconverting is that the video processor must guess what the missing pixels are supposed to be. Some of the AnandTech video card reviews discuss this, and the older reviews have comparisons that clearly show the differences.
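To make the "guessing" concrete, here's a toy sketch (hypothetical Python, not from the thread) of upscaling a single scan line by linear interpolation: every second output sample is an invented in-between value, not data from the source.

```python
def upscale_line(line):
    """Upscale a 1-D scan line by inserting a linearly interpolated
    sample between each pair of real samples."""
    out = []
    for a, b in zip(line, line[1:]):
        out.append(a)            # real sample from the source
        out.append((a + b) / 2)  # guessed sample: interpolation, not data
    out.append(line[-1])
    return out

upscale_line([10, 20, 30, 40])
# → [10, 15.0, 20, 25.0, 30, 35.0, 40]
```

Real scalers use fancier filters (bicubic, lanczos), but the principle is the same: the missing pixels are estimated, never recovered.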

So what does this all mean for the PS3? If movies are recorded in 1080p, then the PS3 will have a better picture. It may not be a huge improvement over a properly deinterlaced 1080i signal, but it will be clearer and cleaner. However, since HD movies are already heavily compressed and designed to output both 720p and 1080p, it is possible that some corners were cut, so a 1080p signal may not be a true recorded 1080p signal. In that case the quality would probably fall somewhere between a true 1080p picture and an upscaled 1080i image.

What does this mean? If you have a display capable of receiving and displaying a 1080p signal, and you are buying movies formatted in 1080p, then you will see a benefit. If you are missing either of those, then you will find no advantage.

RE: De-interlacing
By DallasTexas on 8/16/06, Rating: -1
RE: De-interlacing
By masher2 on 8/16/2006 12:37:35 PM , Rating: 3
> "The problem with up converting is that the video processor must guess...if movies are recorded in 1080p, then PS3 will have a better picture. It may not be a huge improvement over an unconverted 1080i"

You're confusing upconversion with deinterlacing. A signal digitized properly can be deinterlaced with zero loss of image quality. Upconversion is a different matter entirely, and involves interpolation of missing pixels.

You can run into cases where a video stream has been improperly encoded (particularly when performing 3-2 pulldown to an NTSC signal), which will cause the video processor to deinterlace improperly and lose some image quality, but this isn't an issue in the general case.

RE: De-interlacing
By glennpratt on 8/16/2006 12:47:45 PM , Rating: 2
Deinterlacing is never perfect as far as I know. Matching odd and even fields from adjacent frames is extremely likely to produce artifacts, even if only occasionally or not noticeably.

RE: De-interlacing
By masher2 on 8/16/2006 1:21:52 PM , Rating: 3
Deinterlacing is perfect when you're reinterleaving material interlaced through 3-2 pulldown. You're just reversing the process, so you wind up with data identical to the original.

When you're dealing with material interlaced for broadcast via NTSC or otherwise (or material improperly marked as such), it's a whole different ballgame. In that case you're going to get interlacing artifacts galore, as the video processor has to guess which fields go with which frames.
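Both halves of this argument can be sketched in a few lines (a hypothetical Python toy, where a "frame" is just a list of scan lines): weaving the two fields of the same film frame back together is an exact reversal, while weaving fields from different frames produces the classic combing mismatch.

```python
def split_fields(frame):
    """Split a frame into its top (even lines) and bottom (odd lines) fields."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Reinterleave a top/bottom field pair back into a full frame."""
    frame = [None] * (len(top) + len(bottom))
    frame[0::2], frame[1::2] = top, bottom
    return frame

frame_a = ["a0", "a1", "a2", "a3"]
frame_b = ["b0", "b1", "b2", "b3"]

top_a, bottom_a = split_fields(frame_a)
top_b, bottom_b = split_fields(frame_b)

weave(top_a, bottom_a)  # → ["a0", "a1", "a2", "a3"]: lossless reversal
weave(top_a, bottom_b)  # → ["a0", "b1", "a2", "b3"]: combing artifact
```

The hard part in practice is exactly what the comment says: knowing which fields belong together, which is what 3-2 pulldown flags (when correct) tell the video processor.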

RE: De-interlacing
By leonowski on 8/16/2006 1:53:27 PM , Rating: 2
One thing to keep in mind about the 1080p NTSC standards:

Unlike 720p, 1080p does not have a 60 FPS mode. 1080p runs at 24 and 30. After de-interlacing, a 1080i signal should equal a 1080p signal. You shouldn't see any difference.

While I am all for improvements in technology (as we all are), I don't think Sony has anything better to offer than Microsoft's HDTV support. The buzzword "1080p" is used way too often by people who haven't seen all the formats.

RE: De-interlacing
By Trisped on 8/17/2006 2:54:14 PM , Rating: 2
A deinterlaced 1080i video is not the same as a native 1080p video. When the video is interlaced, half the data is thrown out. It is gone, never to be recovered from the signal itself. Yes, you could go back to the original, but that is not what you have now. For example, consider a video where half the scan lines are blue and the other half are yellow, with the colors swapping each frame (so the first frame is blue, yellow, blue, yellow; the second frame is yellow, blue, yellow, blue; and the third is the same as the first). An interlaced capture will only ever pull one of the colors. As a result, the deinterlacer will examine the picture, see that everything is the same color, and guess that that is what the picture is supposed to be. Then your flashing screen turns into a solid yellow or blue one. While this is an extreme example, the same thing happens to a lesser extent with normal TV images.
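The alternating-color example above can be simulated directly (a hypothetical Python toy; the simple "bob" line-doubling deinterlacer is an assumption for illustration). Because the field parity alternates in step with the color swap, every captured field is solid blue, and the yellow lines never make it through.

```python
COLORS = ["blue", "yellow"]
HEIGHT = 6

def progressive_frame(i):
    """Frame i: scan lines alternate blue/yellow, swapping each frame."""
    return [COLORS[(i + j) % 2] for j in range(HEIGHT)]

def interlaced_field(i):
    """Interlaced capture keeps only every other line, alternating the
    starting parity each frame (top field, bottom field, top field, ...)."""
    return progressive_frame(i)[i % 2::2]

def bob_deinterlace(field):
    """Rebuild a full frame by doubling each field line (line doubling)."""
    return [line for line in field for _ in range(2)]

interlaced_field(0)                    # → ["blue", "blue", "blue"]
bob_deinterlace(interlaced_field(0))   # → a solid blue frame, no yellow
```

Since every field the deinterlacer ever sees is uniformly blue, no algorithm, however clever, can restore the flashing: that half of the data was discarded at capture time.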

Oh, and we are not talking about over-the-air transmission of HDTV, we are talking about HD movie playback, so the 1080p30 format does not apply. Also, NTSC does not have a 1080 format; that is ATSC. Same people, different name.

"I want people to see my movies in the best formats possible. For [Paramount] to deny people who have Blu-ray sucks!" -- Movie Director Michael Bay
