A comparison of bitrates between FiOS and Comcast reveals signals compressed by up to 38%. (Source: AV Science Forum)

Red Hot Chili Peppers live: crisp and clean on one side, a blocky mess on the other.  (Source: DViCE/AV Science Forum)
Comcast tries to fit three HDTV channels in the space of two

HDTV aficionados with Comcast service might be in for a rude awakening: the nation’s largest cable provider seems to have ratcheted up the compression on its cable HDTV signals.

A thread at AV Science Forum, updated last Monday, details what appears to be a bitrate reduction of up to 38%, allowing Comcast to deliver more HDTV channels while using the same amount of bandwidth. A side effect, however, is that HDTV's pristine video is now jagged and muddy for Comcast customers, full of MPEG-style compression artifacts and stuttering motion:

For the most part, fine detail remains very good on static (non-moving) images with Comcast's added compression, but you do see reduced contrast, with more dithering artifacts (banding) between colors and objects. With some channels, it looks a bit like Comcast is taking a 24-bit image and reducing it to 18-20 bit. This tends to reduce the 'pop' effect in some images. The difference in 'pop' was quite noticeable on Food HD, despite the relatively small bitrate reduction.

The greatest differences are seen with movement. With slow movement on Comcast, the first thing you notice is added noise and a softer image, as fine detail is filtered from the picture signal. The greater the rate of movement, the more detail you lose and the more noise you see. With intense movement, you see more blocking and skipped frames. In VideoRedo, I noticed that a number of frames in the FiOS signal simply did not exist in the Comcast signal during motion intensive scenes. This may be responsible for the stutter and excessive motion blur seen with some video sequences on Comcast.
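The banding the first quoted paragraph describes is easy to reproduce. The sketch below (an illustration in Python with NumPy, not code from the forum thread) re-quantizes a smooth 8-bit gradient down to 6 bits per channel, the low end of the poster's "18-20 bit" estimate, and counts how many distinct levels survive:

    import numpy as np

    # Re-quantize an 8-bit channel to fewer bits, then scale back to 0-255.
    def quantize(channel, bits):
        levels = 2 ** bits
        step = 256 / levels
        return (np.floor(channel / step) * step).astype(np.uint8)

    # A smooth horizontal gradient: 256 distinct values at 8 bits per channel.
    gradient = np.tile(np.arange(256, dtype=np.uint8), (32, 1))

    # At 6 bits per channel ("18-bit" color) the gradient collapses into
    # 64 flat bands: the stair-stepping between colors described above.
    banded = quantize(gradient, 6)
    print(len(np.unique(gradient)), "levels ->", len(np.unique(banded)), "levels")

Run on a real frame instead of a synthetic gradient, the same quantization turns smooth skies and skin tones into the visible contour bands the poster reports.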

Still images comparing Verizon's FiOS HDTV service with Comcast's, captured at the same moment in the same broadcast, show Comcast's picture losing much of the legendary detail HDTV is so well known for. In a screenshot of the Red Hot Chili Peppers playing live in Milan, the Comcast image is stripped of nearly all fine-grained detail: lead singer Anthony Kiedis' textured wristband becomes flat and blocky, and the tattoo on his left arm is rendered pixelated and blurry.

Comcast did not respond to a request for comment.

The purpose of Comcast's increased compression is unclear; however, it appears the company is attempting to fit three HDTV video streams inside one QAM (Quadrature Amplitude Modulation, the format Comcast uses to broadcast digital cable) channel, as opposed to the previous two. In a bitrate comparison of each provider's broadcast of the same show, the Verizon signal measured 17.73 Mbps while the Comcast signal measured 13.21 Mbps, meaning the Verizon stream carried roughly 34% more data.
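Back-of-the-envelope arithmetic supports the three-per-QAM theory. The sketch below assumes the roughly 38.8 Mbps usable payload commonly cited for a 256-QAM cable channel (an assumption on our part, not a figure from the forum thread):

    # Assumed usable payload of one 256-QAM cable channel (~38.8 Mbps after
    # error-correction overhead); this figure is not from the forum thread.
    QAM_PAYLOAD_MBPS = 38.8

    for streams in (2, 3):
        per_stream = QAM_PAYLOAD_MBPS / streams
        print(f"{streams} HD streams per QAM channel: {per_stream:.2f} Mbps each")

    # The measured figures above: FiOS vs. Comcast on the same show.
    fios, comcast = 17.73, 13.21
    print(f"Comcast's stream is {(fios - comcast) / fios:.1%} smaller than FiOS's,")
    print(f"i.e. FiOS carries {fios / comcast - 1:.1%} more data.")

Under that assumption, a two-way split allows about 19.4 Mbps per stream while a three-way split allows about 12.9 Mbps; the measured 13.21 Mbps lands close to the three-way figure, consistent with the theory.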

According to Ken Fowler, the A/V buff known as "bfdtv" at AV Science Forum, Comcast's compression increase currently affects most customers who were not originally in Adelphia's cable system, which Comcast purchased in 2005. Further, the increased compression affects only national networks like A&E or HBO; local TV signals are rebroadcast at whatever bitrate they were originally sent.



Comments

Don't really care...
By mpjesse on 4/1/2008 10:57:08 PM, Rating: 2
Listen... if the choice is a higher quality HDTV picture and fewer channels OR a slightly lower quality HDTV picture and more channels, I'll take more channels any day. I'm soooo thankful to have Sci Fi in HD... and just in time for Battlestar!

I'm a Comcast customer with all the channels mentioned in the article. I'm watching everything on a 46-inch 1080i HDTV, and I could not tell that other channels had reduced quality when Comcast added Sci Fi and a number of other channels.

Admittedly I'm not a videophile, but I do know the difference between good and crap. By no stretch of the imagination are any of Comcast's HDTV signals crap. Reduced quality? Perhaps. But definitely not crap. Also consider that a lot of the content on any given HDTV channel is filmed or broadcast in 720p, not 1080i and definitely not 1080p. Most reality TV shows are filmed in 720p, and some big-budget material is done in 1080p but rebroadcast in 1080i. The point is this: I don't think a 34% bitrate reduction will be as discernible in the majority of HDTV shows out there. Certainly anything shot in 1080 and broadcast in 1080 will suffer, but that's probably around 50% or less of programming.

You guys want crap? Give Brighthouse a go. They've got more frikin signal drops than a satellite radio receiver.
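For context on the commenter's resolution point, a quick calculation (illustrative arithmetic only, not from the comment) shows 720p60 and 1080i push comparable raw pixel rates, so a bitrate cut lowers the bits-per-pixel budget by roughly the same proportion for either format:

    # Illustrative arithmetic only: raw pixel rates for 720p60 and 1080i
    # (1080i delivers ~60 fields, i.e. ~30 full frames, per second), and
    # the bits-per-pixel budget at each measured bitrate.
    formats = {
        "720p60": 1280 * 720 * 60,
        "1080i30": 1920 * 1080 * 30,
    }
    for name, pixels_per_second in formats.items():
        for provider, mbps in (("FiOS", 17.73), ("Comcast", 13.21)):
            bpp = mbps * 1e6 / pixels_per_second
            print(f"{name:>7} at the {provider} rate: {bpp:.2f} bits per pixel")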




RE: Don't really care...
By Belard on 4/2/2008 5:13:27 AM, Rating: 2
Uh, it does matter.

The screenshot at the beginning of this article shows a HORRIBLE picture - far worse than SD quality.

Digital artifacts are BAD enough on "digital" cable, and it's gonna suck when they pull my analog line in 2009 (perhaps they won't - I dunno). I have both digital and analog, and yes, my 500 digital channels show pixelation that I don't get on the 70-channel analog feed in my bedroom.

But what is shown above is pure crap; it's pointless to even bother watching. What's the point of having HD in 1080 if it's crap?

The future is 1080; 720 is the low end, scaled to fit...


"There is a single light of science, and to brighten it anywhere is to brighten it everywhere." -- Isaac Asimov











botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki