New compression standard could be in commercial products as early as next year

The Moving Picture Experts Group (MPEG) met recently to issue a draft international standard for a new video compression format offering twice the performance of current standards. The new format is called High Efficiency Video Coding, or HEVC. The new H.265 compression codec is roughly twice as effective as the current H.264/AVC standard.
 
“There’s a lot of industry interest in this because it means you can halve the bit rate and still achieve the same visual quality, or double the number of television channels with the same bandwidth, which will have an enormous impact on the industry,” says Per Fröjdh, Manager for Visual Technology at Ericsson Research, Group Function Technology, who organized the event as Chairman of the Swedish MPEG delegation.
 
H.265 could usher in ultra high definition television with significantly more clarity than the 1080p we have today. The new compression format will also significantly reduce the bandwidth required for streaming video on mobile networks where wireless spectrum is at a premium. The format will pave the way for wireless carriers to offer more video services within the confines of their available spectrum.
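
To make the "halve the bit rate, double the channels" claim concrete, here is a minimal back-of-the-envelope sketch; the per-channel and link bitrates are assumed illustrative values, not figures from the article:

```python
# Rough illustration of the "halve the bit rate, double the channels" claim.
# The channel bitrate and link capacity below are assumed values chosen
# for illustration, not figures from the article.

H264_CHANNEL_MBPS = 8.0                      # assumed bitrate of one HD channel in H.264
H265_CHANNEL_MBPS = H264_CHANNEL_MBPS / 2    # HEVC targets ~50% of the bitrate
PIPE_CAPACITY_MBPS = 40.0                    # assumed fixed link/transponder capacity

for name, rate in [("H.264", H264_CHANNEL_MBPS), ("H.265", H265_CHANNEL_MBPS)]:
    print(f"{name}: {int(PIPE_CAPACITY_MBPS // rate)} channels at {rate} Mbit/s")
# H.264: 5 channels at 8.0 Mbit/s
# H.265: 10 channels at 4.0 Mbit/s
```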
 
“Video accounts for the vast majority of all data sent over networks, and that proportion is increasing: by 2015, it is predicted to account for 90 percent of all network traffic,” Fröjdh says.
 
He believes that the HEVC format discussed during the meeting in Stockholm could find its way into commercial products as early as 2013.
 
“It will take time before it’s launched for a TV service, but adoption is much quicker in the mobile area, and we’ll probably see the first services for mobile use cases next year,” Fröjdh added.

Source: Ericsson



Comments



You know what would be even more helpful?
By inperfectdarkness on 8/16/2012 9:46:22 AM , Rating: 2
If we could get the MPAA to agree to a 30fps standard for all movies... thereby eliminating the need for 120Hz televisions as the lowest whole-number common multiple of 24fps and 30fps. This would allow 120Hz TVs (or even 60Hz TVs) to effectively present 3D images, rather than requiring 240Hz. It would also allow 240Hz TVs to potentially be "upgraded" to glasses-less 3D for a broad spectrum of viewers (technology that is currently estimated to require 340Hz displays).
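
The 120Hz figure is just the least common multiple of the two frame rates, which a couple of lines of Python can confirm:

```python
from math import gcd

def lcm(a: int, b: int) -> int:
    """Least common multiple: the lowest refresh rate that can show
    both frame rates using whole repeated frames."""
    return a * b // gcd(a, b)

print(lcm(24, 30))  # 120 -> why 120Hz panels suit mixed 24/30fps content
print(lcm(24, 60))  # 120
print(lcm(30, 60))  # 60  -> a 30fps-only world could live on 60Hz panels
```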




RE: You know what would be even more helpful?
By Nortel on 8/16/2012 10:04:09 AM , Rating: 2
I've wondered about this for the last 10 years. I've heard many snobs say 24fps looks more 'film-like', but it doesn't; it just looks more jerky. Who wouldn't want to see the smoothest possible video? Movies nowadays should be shot at 120fps to be equal to the devices playing them back; they have the technology to do so!


RE: You know what would be even more helpful?
By bug77 on 8/16/2012 10:09:25 AM , Rating: 1
Historically, 30fps was used in countries with 60Hz AC and 24fps in countries with 50Hz (it's about half the frequency). I suppose it eased some things back in the analog equipment age, but now...
As for video shot at 120fps, who would deliver that?


RE: You know what would be even more helpful?
By crimsonson on 8/16/2012 10:30:57 AM , Rating: 5
Incorrect. 24 is a film creation. It was chosen mostly for sound reasons. At 24 fps, audio playback from the soundtrack printed on the same strip of film was good enough, compared to 18 fps and earlier frame rates.
24 has nothing to do with 50 Hz.


RE: You know what would be even more helpful?
By Dorkyman on 8/16/2012 12:08:59 PM , Rating: 2
Ah, but there is a very powerful lobby that is happiest with film remaining at 24, because of the ease of showing such product in any 50Hz country (or more accurately, any country having adopted a 50Hz field rate for video).

The industry has already argued these matters exhaustively: 24, 30, 60. Back when the HDTV standards were hammered out in the late 1980s there were two camps--50 fields per second and 60. And that's where things remain.


RE: You know what would be even more helpful?
By amanojaku on 8/16/2012 12:55:33 PM , Rating: 2
Guys, you're talking about two different things:

1) The PAL standard has a frame rate of 50 fps interlaced (often listed as 25 fps, but not the same as 25p ), which synchronizes with the 50Hz electrical signal. 25p is the PAL equivalent of NTSC's 24p.

2) NTSC 24p is used in American film for a variety of reasons, and is not the same as PAL. Back in the day, video was hand-cranked, so you got slow, irregular frame rates. Estimates were 8-16 fps. When video was automated and sound was added, playback varied from theater to theater, usually between 12-24 fps. Sound was played on a record and had to be synchronized with the video.

It made sense to standardize a rate to synchronize video and audio, so a survey was done (I think by Western Electric) of all the movie theaters. It turned out the larger theaters used a higher frame rate. The larger the theater, the faster the audio needed to be played back. I think it has something to do with the speed at which sound travels: the further you get from the source, the more the sound distorts and loses synchronization with the video. Faster playback distorts less over the same distance. I don't have a record player, so I can't test this out. 24 fps was found to be adequate for most theaters.

Why did the smaller theaters go with a slower frame rate? The machines broke down less frequently at slower speeds.


By amanojaku on 8/16/2012 1:10:08 PM , Rating: 2
Ugh, 24p is not an NTSC standard, it is a film standard, sorry. NTSC is a 29.97 frame rate. Film transferred to NTSC runs at a 23.976 frame rate. Thank the flying spaghetti monster for digital and its whole numbers!


By crimsonson on 8/16/2012 2:16:24 PM , Rating: 3
quote:
1) The PAL standard has a frame rate of 50 fps interlaced (often listed as 25 fps, but not the same as 25p ), which synchronizes with the 50Hz electrical signal. 25p is the PAL equivalent of NTSC's 24p.


Some mistakes here.
There is no such thing as 50 fps interlaced, at least in the current broadcast standards. 50i refers to 25fps with interlaced scanning. SMPTE, in all their wisdom, decided to make the convention as follows: if an "i" follows the number, the number indicates FIELDS; if a "p" follows the number, it indicates FRAMES. 2 FIELDS equal 1 FRAME.

50i = 25 fps interlaced (aka PAL)
50p = 50 fps progressive
60i = 30 fps interlaced (more commonly, though not 100% accurately, known as 29.97 fps) - NTSC
60p = 60 fps
23.976 = 24 fps with the NTSC 1000/1001 slowdown applied. Basically, in order to maintain a more coherent and "simpler" relationship with 29.97 fps, 23.976 (aka 23.98) was created. This is "24p" for the BROADCAST video world.
24p (24fps) = true 24fps, originating from FILM production; it dates from the arrival of sync sound on film.

I don't blame anyone for confusing the matter, as engineers and SMPTE members themselves are often wrong and often cannot even explain why things are the way they are.

"NTSC" are often referred to frame rates compatible with the 60 Hz cycle and "PAL" for 50 Hz cycle. Though technically both terms refer to something more than frame rate.

Bonus point: the NTSC 1000/1001 factor was added to make color and B&W transmission compatible with each other.
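
To make those conventions concrete, here is a small sketch (the helper function is hypothetical, written just for this illustration):

```python
from fractions import Fraction

# Broadcast naming: "i" counts FIELDS, "p" counts FRAMES; 2 fields = 1 frame.
def frames_per_second(label: str) -> Fraction:
    """Hypothetical helper mapping labels like '50i' or '60p' to frame rates."""
    rate, scan = Fraction(label[:-1]), label[-1]
    return rate / 2 if scan == "i" else rate

print(frames_per_second("50i"))  # 25 (PAL)
print(frames_per_second("60i"))  # 30 (nominal NTSC)
print(frames_per_second("60p"))  # 60

# The NTSC 1000/1001 slowdown relates the nominal and actual rates:
print(float(30 * Fraction(1000, 1001)))  # 29.97002997...
print(float(24 * Fraction(1000, 1001)))  # 23.976023976...
```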


By Jeffk464 on 8/17/2012 9:41:35 AM , Rating: 2
By the way, OLED TVs don't have the same blurring issue with 24fps that LCD TVs do. My understanding is that they display motion pretty much the same as plasma. This might help solve the issue without going to a higher frame rate.


By zephyrprime on 8/16/2012 3:35:45 PM , Rating: 2
The argument will be settled ad hoc in the field rather than by useless committees. Pretty much everything is going 30fps, so I expect film will do so too, eventually.


RE: You know what would be even more helpful?
By augiem on 8/16/2012 2:25:38 PM , Rating: 3
quote:
Who wouldn't want to see the smoothest possible video?


I for one.

Have you ever watched a show in 120/240Hz interpolation? I almost fell down laughing watching Xmen 3 on a TV at BestBuy. It looked SO fake and tacky. Every CG effect went from being almost believable to suddenly looking like it's straight out of Maya's viewport. As a 3D artist myself, it was a little heartening, as the CG didn't seem as out of reach in complexity as it did before, but as a viewer it was distracting and horrendous. Watching Magneto prance around in his costume all of a sudden looked like someone dressed up in a bad $20 Halloween costume. Not to mention the whole show had that cheap '80s soap-opera video-camera feel.

It's amazing what an effect frame rate can have. The lower frame rates of movies helps to hide the flaws so you can concentrate on the action/story and not so much on the visual details.


By EnzoFX on 8/16/2012 3:08:03 PM , Rating: 2
I think a real enthusiast prefers 24p for this reason. People need to not get caught up in the numbers; sure, they sound better, but it's a whole different show.


By Jeffk464 on 8/16/2012 3:53:23 PM , Rating: 2
Yes, but is this caused by the interpolation computations, or would it look the same shot at 60fps and played back at 60fps?


RE: You know what would be even more helpful?
By someguy123 on 8/16/2012 5:31:47 PM , Rating: 2
That's just your own perception. If 60fps capture and delivery were the standard and we'd dropped to 24fps you'd be complaining about the substantial motion blur and general choppiness, especially when people convert high-framerate footage down to film standards while trying to increase visible detail in high-action scenes. It seems like you were watching completely interpolated video anyway. Televisions don't actually support 120Hz (just LED flicker), and I don't believe X-Men 3 was shot at 60fps.


By augiem on 8/17/2012 4:18:14 AM , Rating: 2
quote:
If 60fps capture and delivery were the standard and we'd dropped to 24fps you'd be complaining about the substantial motion blur and general choppiness


Untrue. I'm already very used to extremely high frame rates -- it's called gaming. Also, real life has no frame rate limit. The degradation in quality has nothing to do with being used to seeing 24 fps as the norm.

The higher frame rate makes it VERY easy to pick out tiny details from the image and motion. There is a reason most action films tend to have violent camera shake -- it distracts you from being able to see what's going on and helps make the VFX look more believable. It is absolutely intentional. The fidelity of CG VFX is nowhere near good enough to stand on its own yet without all the tricks of the trade (did I mention I have a degree in 3D graphics?) like added motion blur, camera shake, tons of particles, and lower framerates. The higher framerate allows you to more clearly see through all these smokescreens and see things as they really are. And frankly, the costumes and CG really ARE crappy looking.

Yes, Xmen was interpolated, but there's no reason to believe a movie shot natively at that frame rate would look substantially better. Interpolation can do a very good job at filling in the blanks between frames of video. If anything, the interpolated video is probably making it look better than it would natively by again, masking some of the visual flaws a little bit.


By augiem on 8/17/2012 4:41:45 AM , Rating: 2
quote:
Televisions don't actually support 120Hz (just LED flicker), and I don't believe X-Men 3 was shot at 60fps.


I'm not sure what you're getting at here. From the information I can find, the LCD panel does indeed refresh at 120Hz, but it does not accept a 120Hz signal.


By guffwd13 on 8/17/2012 9:16:41 AM , Rating: 2
It doesn't matter if it's interpolated or not. My cameras shoot home videos at 30fps, and when I play them back they look like home movies not because they aren't post-processed or refined, but solely because of the frame rate.

It all comes down to stylistic approaches. I want movies to appear for what they are: an escape into another world/story that involves a certain amount of craft and artistry, as would a painting or fine-art sculpture. In other words, it is another medium for presenting art. The faster the frame rate, the closer it is to what we see with our own eyes (up to 60 fps), the less fantastic it feels, and thus the more the separation between the real and the story disappears.

Newscasts can be filmed at 60 fps. They look silly, but nevertheless I don't care. I have a TV capable of 960Hz (yes, interpolated) that can accept 120Hz signals (it has to, because it's 3D, i.e. 60Hz per eye), and I still force it to 24 fps no matter what, because I can't stand what it looks like any faster. Not because someone just told me to.

I realize I'm not everyone, and everyone's opinion is equally valid (so long as they've seen the difference between the two - if they can't tell then don't tell others what the standard should be), but that's where I stand. If the 24fps standard is changed, I'm going to enjoy movies significantly less.


RE: You know what would be even more helpful?
By Silver2k7 on 8/17/2012 4:38:12 PM , Rating: 2
You have to separate something recorded at a high fps from something that is interpolated to a higher fps. There is a difference: in one scenario the in-between frames exist; in the other they are just faked =)

"Who wouldn't want to see the smoothest possible video?"

"I for one.

Have you ever watched a show in 120/240Hz interpolation? I almost fell down laughing watching Xmen 3 on a TV at BestBuy. It looked SO fake and tacky"


By augiem on 8/17/2012 5:55:30 PM , Rating: 2
I understand that. See my explanations in the posts above.


By augiem on 8/16/2012 2:30:48 PM , Rating: 2
Oh, and don't forget the added cost of having to render out the effects at 5x the framerate. That'd be a huge cost increase right there.


RE: You know what would be even more helpful?
By amanojaku on 8/16/2012 10:07:41 AM , Rating: 5
It seems like you haven't been paying attention to sales figures. No one gives a fig about 3D. I'll take the reduced bandwidth, thank you. I could give a crap about cable TV, but once this gets to PCs it will be beneficial in light of bandwidth caps on domestic and mobile broadband.


By Jeffk464 on 8/16/2012 3:46:58 PM , Rating: 2
I've got mixed feelings about it, I think it can be cool for some movies in the theater. But in the home theater, eh.


RE: You know what would be even more helpful?
By crimsonson on 8/16/2012 10:42:31 AM , Rating: 1
They will never do that. It goes "beyond" being a snob.

The aesthetic connotation of 24 fps is something very hard to replicate at 30 fps. We associate 24fps with cinematic work. Many have tried to shoot dramatic movies at 30fps, even at 60 fps; the effect is not the same. It is a subconscious effect for many viewers. You can simulate the difference now by watching HBO or Showtime with your TV's 120 Hz mode on and off. You will notice the difference.

Second, higher frame rates increase production costs. It was more so with film, but with today's digital process (where costs are calculated per frame) I am sure the price difference between frame rates will persist. Imagine the cost of Lord of the Rings with 25% higher costs in renders, compositing, labor, etc.

Third, distribution. Until standardized digital distribution arrives for theaters, it would be difficult to replace the film projectors that are in all movie theaters. There are major inroads happening now, but no local movie theater is going to pay to add and maintain new equipment unless compensated. Are we willing to pay a 25% price increase?
With 25% fewer frames, digital distribution via file or optical disc is also cheaper at 24 fps.

There was, and currently is, a push to go to 48fps (Peter Jackson). It solves the aesthetic argument but makes the economic argument worse, not better.

For the most part, this has nothing to do with the MPAA, BTW. Just simple artistic and economic issues.

Unless TV manufacturers offer to pay for the cost to go higher frame rates, your idea will fall on deaf ears.


By TakinYourPoints on 8/16/2012 12:12:13 PM , Rating: 2
One reason people have rejected higher frame rates in narrative work for so long, at least outside of found-footage films, is their increased realism. It is something that turns plausible fictional reality into sets, costumes, lights, etc.

It is the difference between watching hobbits and dwarves, and watching grown men play dress-up.


RE: You know what would be even more helpful?
By FaceMaster on 8/16/2012 4:05:15 PM , Rating: 2
For a tech site, a lot of people here seem against higher frame rates. 'It looks too real!' 'It gives away the flaws!' Then CGI and sets will have to get better. Revealing the flaws in the scene isn't something I have a problem with: HD over SD, colour over black and white. Film it at 60, then lop out every other frame for the movie enthusiasts. I'd take the 60 any day, thank you.


By Reclaimer77 on 8/16/2012 9:02:49 PM , Rating: 2
This isn't a "tech site" issue. We're not talking CPUs, where you ALWAYS want a faster one. 24fps gives a cinematic experience. The goal of movies isn't to make them as "real" as possible.

Why do you think they add motion blur and film grain effects in video games, despite their higher frame rate?

Look at people's reactions to seeing The Hobbit screened at 48FPS. Not good at all.

"It looked completely non-cinematic. The sets looked like sets. I've been on sets of movies on the scale of The Hobbit, and sets don't even look like sets when you're on them live... but these looked like sets. The other comparison I kept coming to, as I was watching the footage, was that it all looked like behind the scenes video. The magical illusion of cinema is stripped away completely."

nuff said


By TakinYourPoints on 8/16/2012 11:04:03 PM , Rating: 2
60Hz has been with us for decades, and for narrative work it looks ridiculous. It is the difference between believable fiction and behind-the-scenes footage. The difference between the actual cameras used for filming and the BTS cameras is smaller than it's ever been.

If you really like the look of cheesy realism where everything looks like sets and costumes and lights and makeup, go for it, but I'd have to question your taste in general. It's the same "smooth motion" nonsense that they build into some LCDs, so bad.


RE: You know what would be even more helpful?
By Jeffk464 on 8/16/2012 3:52:02 PM , Rating: 2
Yup, because people associate the motion blur with speed/action but this isn't how our eyes see motion in the real world.


By augiem on 8/17/2012 2:25:15 PM , Rating: 2
Really? Take a pencil and hold one end of it in front of you. Now start waving the pencil up and down. That's the biological equivalent of motion blur related to speed. This is not forced conditioning, it's nature.


By someguy123 on 8/16/2012 6:10:41 PM , Rating: 2
Many theaters have already installed new, upgraded equipment, like digital projectors, "high definition" digital projectors with higher framerate support like XD, and stereoscopic 3D projectors. Equipment swaps aren't holding anyone back, though they may increase ticket price.


RE: You know what would be even more helpful?
By Jeffk464 on 8/16/2012 2:37:07 PM , Rating: 2
"If we could get the MPAA to agree to a 30fps standard for all movies.."

Yes, 30fps is considered the baseline for smooth motion. 24fps causes big motion-blur effects, especially on LCD TVs. The newer 60fps format would be great, but it requires an extremely high bit rate, which in turn requires a new disc format and extreme internet bandwidth.
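
For scale, the uncompressed numbers (compression reduces these enormously, and doubling frame rate does not double compressed bitrate in practice, but the raw proportions are illustrative):

```python
# Uncompressed video bitrate: width x height x bits-per-pixel x fps.
def raw_mbps(w: int, h: int, fps: int, bpp: int = 24) -> float:
    return w * h * bpp * fps / 1e6

for fps in (24, 30, 60):
    print(f"1080p{fps}: {raw_mbps(1920, 1080, fps):,.0f} Mbit/s uncompressed")
# 1080p24: ~1,194 Mbit/s; 1080p30: ~1,493; 1080p60: ~2,986
```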


By Jeffk464 on 8/16/2012 3:56:38 PM , Rating: 2
30fps is considered the baseline for smooth motion in video games.

oops


Encoding times?
By aegisofrime on 8/16/2012 11:21:16 AM , Rating: 2
As someone who uses x264 on a daily basis, what I want to know is how this halving of bitrate affects encoding times. There's probably no point if encoding times were to double.




RE: Encoding times?
By int_21h on 8/16/2012 11:27:03 AM , Rating: 2
No doubt. This will be very valuable for things like Netflix and other streaming services, but the encoding/decoding cost may be prohibitive.


RE: Encoding times?
By titanmiller on 8/16/2012 2:05:02 PM , Rating: 2
Computing is dirt cheap nowadays.


RE: Encoding times?
By FaaR on 8/16/2012 11:32:19 AM , Rating: 3
You wouldn't use a codec that offers the same visual quality at half the bitrate because encoding (something you do only once) would take longer, even though playback - which is repeated many, many times - would benefit greatly?

That's ludicrous.

Besides, hardware encoding acceleration is becoming more and more common, and work will of course continue on this front as well as time progresses.


RE: Encoding times?
By SlyNine on 8/16/2012 11:39:12 AM , Rating: 2
Still haven't seen a good X264 encoder.


RE: Encoding times?
By SlyNine on 8/16/2012 11:39:33 AM , Rating: 2
hardware encoder that is.


RE: Encoding times?
By XZerg on 8/16/2012 2:29:13 PM , Rating: 2
That depends on how resource-intensive h.265 is to decode. If it's very intensive now, then eventually hardware will be fast enough, but that's what will slow down adoption even when the compression efficiency is so great.


RE: Encoding times?
By augiem on 8/16/2012 2:48:33 PM , Rating: 2
Well it just makes sense, doesn't it? When did you ever see a new standard in technology come out where the whole playing field was ready to go running with it? Standards are always developed with a mind toward the future, as the next step of evolution. Then when that standard is no longer sufficient for the growing needs, a new one is developed that has some room to grow into.


RE: Encoding times?
By Sivar on 8/16/2012 12:53:13 PM , Rating: 2
You may find this interesting:

http://x264dev.multimedia.cx/archives/360

"After a full 6 hours, 8 frames had encoded. Yes, at this rate, it would take a full two weeks to encode 10 seconds of HD video."
Note this was talking about an experimental, non-optimized, pre-h.265 encoder, but it gives an idea of the degree of possible performance changes.

Reducing encoded media size at a given quality is usually an exponential problem, as you have probably discovered using "placebo" mode for h.264, which is pretty bad.


RE: Encoding times?
By TakinYourPoints on 8/16/2012 1:28:37 PM , Rating: 2
You know what else took obscene amounts of time to encode for a little while? mp3s.

The first encoders I used in the 90s were slowwwwwwww, ridiculously slow, but they got faster. Encoder performance will increase, give it time.


RE: Encoding times?
By Sivar on 8/16/2012 3:41:39 PM , Rating: 2
True, encoders will get better and faster, just as they did with h.264.
I'm willing to bet that those first encoders would run quite fast on a modern processor, though. In fact, the Xing codec of the day was faster than LAME is now. Of course, it produced offensively awful sound quality.


RE: Encoding times?
By bupkus on 8/16/2012 2:18:47 PM , Rating: 2
quote:
"After a full 6 hours, 8 frames had encoded. Yes, at this rate, it would take a full two weeks to encode 10 seconds of HD video."

I'd be willing to contribute CPU cycles, à la protein folding.


RE: Encoding times?
By Guspaz on 8/16/2012 5:16:03 PM , Rating: 2
Holy out of context quote, Batman!

He was testing one of the proposal submissions (from Samsung/BBC) for h.265, back when various organizations were submitting their proposals for what the standard should include. Not only was this not the actual h.265 spec, but the encoders for the submissions were all tuned to spend ludicrously large amounts of time doing exhaustive analysis to provide optimal image quality (and score higher in "competition").

The actual h.265 draft is a mixture of aspects from various submissions, and while it will be significantly more processor-intensive to encode than h.264, it won't be anywhere near that bad.

This is like judging the performance of x264 based on the "placebo" preset (which does exhaustive motion searches), when nobody actually uses that in practice and it's something like ten times slower. Except, instead of ten times slower, try like 10,000x slower, since they tuned every aspect of their submission for quality over performance.


RE: Encoding times?
By Sivar on 8/16/2012 6:21:07 PM , Rating: 2
You are totally correct, and I did mention this:

quote:
Note this was talking about an experimental, non-optimized, pre-h.265 encoder


As an aside I do, in fact, use x264 placebo mode and have encoded over 1,000 hours of video doing so. I only get 1 - 3% smaller file sizes, but weeks usually pass between buying and watching a video, so I prefer the 30GB or so of total disk space savings.

Also of minor technical interest: placebo mode doesn't actually try every possible predictor combination ("exhaustive"); it does a Hadamard–Rademacher–Walsh comparison (a sum of transformed differences for each motion vector candidate) which has the same result but is much faster (though still slower than a dead slug stuck in frozen molasses).
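
For the curious, here is a toy sketch of that Hadamard-based cost (SATD, the sum of absolute transformed differences); it illustrates the idea only and is not x264's actual code:

```python
import numpy as np

# 4x4 Walsh-Hadamard matrix (entries +/-1).
H = np.array([[1,  1,  1,  1],
              [1, -1,  1, -1],
              [1,  1, -1, -1],
              [1, -1, -1,  1]])

def satd_4x4(block, prediction):
    """Sum of absolute transformed differences for one 4x4 block:
    transform the residual with the Hadamard matrix, then sum |coefficients|.
    Cheaper than a full DCT, but correlates better with coded cost than a
    plain sum of absolute differences."""
    residual = block.astype(int) - prediction.astype(int)
    transformed = H @ residual @ H.T
    return int(np.abs(transformed).sum()) // 2  # x264-style normalization

# Toy usage: compare two candidate predictions for the same source block.
rng = np.random.default_rng(0)
src = rng.integers(0, 256, (4, 4))
print(satd_4x4(src, src))                          # 0: perfect match
print(satd_4x4(src, rng.integers(0, 256, (4, 4)))) # larger cost for a bad match
```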


RE: Encoding times?
By Jeffk464 on 8/16/2012 3:50:23 PM , Rating: 2
It also means you need more CPU horsepower to decode/play the video. My E350 system can't handle Hulu or Netflix in HD, but does just fine on 1080p broadcast or Blu-ray.


RE: Encoding times?
By Guspaz on 8/16/2012 5:23:15 PM , Rating: 2
That is a factor of your hardware acceleration being broken, not CPU horsepower. There are known issues with Brazos (the E350 is a Zacate chip in the Brazos family) and Silverlight, which Netflix uses, even though Brazos supports VC-1 acceleration (the codec Silverlight uses).

Blu-rays, which are typically encoded with h.264 and sometimes VC-1, would work fine with hardware acceleration.

When using commercial video playback software on a modern computer, it's very rare that you're actually decoding anything in software. Heck, even with MPC-HC, it'll default to DXVA hardware acceleration if supported.


Consumer demand?
By danjw1 on 8/16/2012 11:19:55 AM , Rating: 2
While I know that the electronics manufacturers all want to push consumers to upgrade their hardware as often as possible, I don't know that consumers really want higher resolution. Heck, Blu-ray adoption is still way behind DVD. So why does the industry feel like consumers want higher resolutions? I doubt many consumers will be able to tell the difference.

Don't get me wrong, I like the additional compression capability. I just don't find the increased-resolution reasoning particularly compelling.




RE: Consumer demand?
By FITCamaro on 8/16/2012 1:50:14 PM , Rating: 2
Yeah I don't see the world moving beyond 1080p for at least a decade.


RE: Consumer demand?
By hubb1e on 8/16/2012 2:17:06 PM , Rating: 2
I don't even see mass adoption beyond 1080p in 20 years. I've got an 80" TV and the picture with 1080p is very good. Will people really go bigger than 80" on a regular basis? There was a big improvement in going from 480i to 1080i as a broadcast format: everyone could see the benefit once they put it in their house. But with 1080p we've hit the limit of human vision unless you're building a whole wall out of a TV.


RE: Consumer demand?
By augiem on 8/16/2012 2:55:43 PM , Rating: 2
While I'd love an 80" TV, 1080p is not anywhere close to sharp on a screen that big. If anything, it looks about like SDTV did on a 30" screen. My 24" monitor is just slightly higher res than your 80" TV. And of course we now have the MacBook with an even higher-res screen, but that's just extravagant, as I didn't see any benefit while using it on a panel that small.

I agree adoption of anything bigger like 4K would take forever, but to say there would be no benefit at screen sizes like yours isn't accurate.


RE: Consumer demand?
By Guspaz on 8/16/2012 5:46:31 PM , Rating: 2
I've got a 1080p projector on an 80" screen in my apartment. Your head sits about 8 feet from the screen. The resolution is more than sufficient for sharp detail. There might be a benefit to 4k video at that size/distance, but there comes a point of diminishing returns where the source material doesn't benefit.

Remember, the vast majority of movie theatres these days use 2K projectors (similar to 1080p) on ginormous screens, and nobody really notices the difference if they see something in the few theatres that use 4K projectors.


RE: Consumer demand?
By Moishe on 8/17/2012 11:07:56 AM , Rating: 2
You're wrong.

I have a 120" projection setup (DLP) and I have a 27" tube TV. The difference is significant.

1080p HD is a big improvement over SD and like the previous poster says, the sizes it enables are large enough that most users will not outgrow the options.


RE: Consumer demand?
By Netscorer on 8/16/2012 3:27:44 PM , Rating: 2
Damn you,

and I thought my 65'' TV was huge.

I totally agree with you. 1080P is more than enough for home presentation. I would even dare to say that 720P is absolutely fine for most video, maybe with sports being an exception. On my 65'' screen, sitting 8 feet away from the TV, I can never see any difference in resolution between a quality-encoded 720P stream and Blu-ray 1080P content. I am not saying there is no difference, as Blu-ray can present much better colors and contrast compared to a 720P encode 20 times smaller. But when it comes to resolution, for movies 720P resolves just fine. You don't want to see every wrinkle on the actor's face anyway.


RE: Consumer demand?
By Jeffk464 on 8/16/2012 3:47:51 PM , Rating: 2
I'm waiting for a $1,500 100" OLED TV. :)


RE: Consumer demand?
By Jeffk464 on 8/16/2012 4:00:48 PM , Rating: 2
I've heard that 4K has a stunningly good picture, so the human eye must be able to see the improvement. But yeah, I've heard it matters more past 75" screens.


RE: Consumer demand?
By Guspaz on 8/16/2012 5:33:33 PM , Rating: 3
This isn't about pushing resolutions higher; there's no arbitrary resolution limit on h.264 (the highest level tops out at 4k but that's just the highest resolution defined in a level, not a technical limitation). This is all about doing the same thing with less bandwidth, and that is something that IS at a premium.

Example: Say streaming 720p video to my iPad required 3.6 megabits per second, and I've got a 6GB data cap. That gets me 230 minutes of Netflix streaming per month.

But if I can suddenly get the same quality for half the bitrate, now my mobile plan enables me to stream 460 minutes of Netflix per month.
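
The arithmetic behind those figures, as a quick sketch (treating 6 GB as 6,000 MB; the quoted 230/460 minutes are rounded versions of the same numbers):

```python
# Minutes of streaming that fit under a data cap at a given bitrate.
def minutes_under_cap(cap_gb: float, mbps: float) -> float:
    cap_megabits = cap_gb * 1000 * 8   # GB -> megabits (decimal units)
    return cap_megabits / mbps / 60    # seconds of video, expressed in minutes

print(round(minutes_under_cap(6, 3.6)))  # ~222 min at the h.264 bitrate
print(round(minutes_under_cap(6, 1.8)))  # ~444 min at half the bitrate
```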

It's not just that, it's also about improving efficiencies for people like Netflix so that they can afford to lower prices, or license more content. And for places where you have to store video, like on a bluray, an updated bluray standard supporting h.265 could enable you to fit more content on a disc.

Current video Blu-Ray discs are 25 or 50 gigabytes, depending on if they're single or dual layer. Now imagine a new version of the bluray spec using BD-XL and h.265... BD-XL takes that up to 128 gigabytes per disc, and h.265 would raise that to an effective 256 gigabytes... Five times the capacity, which would certainly be more convenient. You could get an entire television season on a single bluray disc.

Another advantage on the network side of things is peer to peer video streaming like what BitTorrent Live is trying to do. The big limitation there in terms of the quality you can achieve is the average upstream capacity in the swarm, which dictates your target bitrate. But if you can suddenly get double the quality out of that bitrate, that's going to make it a much better experience.


RE: Consumer demand?
By someguy123 on 8/16/2012 5:59:17 PM , Rating: 2
Improved codec efficiency is mostly good for the industry. For consumers it means better video quality over the same amount of bandwidth, less data being eaten on their wireless plans, or higher fidelity. I don't think the argument that people "don't want" better looking video holds true at all considering current HD adoption. The only thing holding people back from HD was the price. 3D is a different story considering it actually reduces visual quality for the sake of cardboard cutouts.


RE: Consumer demand?
By Moishe on 8/17/2012 11:11:55 AM , Rating: 2
Agreed that more is better, almost always. What slows adoption is the cost benefit for consumers. As prices drop, more people will buy.

If they can encode video to half the size with the same quality, we'll all be better off.


The main use will be streaming video at 1080p
By hubb1e on 8/16/2012 11:51:12 AM , Rating: 2
The main use for this format will not be 3d, or higher resolution. It will be to serve 1080p streaming video on the next generation of streaming devices. And I'm not even talking about the PS4 or the Xbox 720. Since those devices are in the pipeline already, they probably won't have the fixed function hardware to decode the format unless the CPUs in those boxes are able to handle software decoding like they had to do for h.264.




By TakinYourPoints on 8/16/2012 12:22:02 PM , Rating: 2
Another potential advantage will be increased color accuracy and reduced dithering. The difference between Blu-ray and even a good 1080p stream is huge, but improved codecs can potentially close this gap. I'm afraid that companies may only use better codecs to shrink file sizes down, thus negating any potential image-quality improvement, but we'll see.


RE: The main use will be streaming video at 1080p
By FITCamaro on 8/16/2012 1:49:30 PM , Rating: 2
No reason why Microsoft and Sony couldn't push out a decoder onto the console. Just as they did with H.264.


RE: The main use will be streaming video at 1080p
By hubb1e on 8/16/2012 2:12:55 PM , Rating: 2
Sure there is. There is no guarantee that those consoles will have the CPU power to decode the format. It's likely they will, but we don't know how fast a computer needs to be to decode h.265. h.264 decoding pushed the limits of CPU power when it was adopted; I expect h.265 to do the same. Video cards and current fixed-function decoding chips won't be able to handle h.265. Blu-ray players won't have to use h.265 to meet requirements. I don't expect h.265 to catch on as fast as h.264 did, since h.264 is meeting the current requirements fairly well.


By Guspaz on 8/16/2012 5:44:04 PM , Rating: 2
The original goal was 3x the computational complexity of h.264, although some proposals were ten times that. I expect they fell somewhere in between.

I don't think next-gen consoles would have any trouble with the decoding. Even a PS3 should be able to decode it, although it might be limited to only videos encoded with tile or wavefront support (h.265 was designed with multi-threaded decoding in mind), or it might have to resort to some decoding trickery (decode different chunks of video at sub-real-time into a buffer on each SPE).

That's only really possible because the PS3 had a ton of dumb brute-force processing power, like a bunch of fast glorified DSPs, which are perfect for this sort of work, but terrible at other things (like, the SPEs have no branch predictors, so they're terrible at logic).

I don't expect the PS4 or the 360 to have anything like the Cell (that was largely a failed experiment), but the processing hardware in a console is typically a bit better than the fastest consumer processor hardware available when they come out (they're bleeding edge so they're slightly ahead of the curve). I would be surprised if they couldn't handle h.265 in hardware, at 1080p at least.


Patents ...
By ZorkZork on 8/16/2012 6:42:40 PM , Rating: 2
Let's just hope that this will not be yet another video standard where the patent trolls are waiting around the corner for approval.




RE: Patents ...
By BugblatterIII on 8/16/2012 6:48:19 PM , Rating: 2
Yes; the crap with H.264 caused another standards war between that and patent-free formats. If this new standard ends that then it's worthwhile even without the improved performance.


'Coating'?
By int_21h on 8/16/2012 10:47:44 AM , Rating: 3
Coating? Really? This is sad, and is deflating (due to the amount of ignorance).




Youtube
By titanmiller on 8/16/2012 2:01:31 PM , Rating: 2
Just think how much money and bandwidth this would save major video websites like YouTube. Halving their bandwidth would be huge!




Frame Rate
By Zhukov on 8/16/2012 7:28:40 PM , Rating: 2
Because of the "persistence of vision" phenomenon, 24 frames per second is enough to eliminate flicker for static images. But when portions of the image contain moving objects, those portions lose dynamic resolution (get blurry). For many scenes, like well-lit indoor soap-opera sets, this is often difficult for the viewer to detect. But camera panning is very easy to detect at low frame rates, because everything in the scene is moving.

Next time you watch a football game, pay attention when the cameraman pans across the audience on the opposite side of the field. Even at a very slow camera pan, the audience gets very blurry. The same effect happens in nature scenes like mountain landscapes. I hate these artifacts and have wished for a higher frame rate for decades. 60 frames per second produces much greater dynamic resolution than 30. Sports have a lot of action, which requires more dynamic resolution than soap operas, and that is why ESPN broadcasts in 720p60 instead of 1080i30. 720p60 also allows better slow-motion playback.
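
A rough sketch of the smear-per-frame arithmetic behind that; the pan speed and the 180-degree shutter below are assumed illustrative values, not broadcast standards:

```python
# Horizontal smear per frame during a steady camera pan.
# Assumes a 1920px-wide frame and a 180-degree shutter (exposure = half
# the frame interval); these are illustrative assumptions.
def smear_px(pan_px_per_s: float, fps: float, shutter_fraction: float = 0.5) -> float:
    exposure_s = shutter_fraction / fps   # how long each frame is exposed
    return pan_px_per_s * exposure_s      # how far the scene moves meanwhile

PAN = 960.0  # assumed pan speed: half the frame width per second
for fps in (24, 30, 60):
    print(f"{fps}fps: ~{smear_px(PAN, fps):.0f}px of blur per frame")
# 24fps: ~20px, 30fps: ~16px, 60fps: ~8px -> higher fps means sharper pans
```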




"You can bet that Sony built a long-term business plan about being successful in Japan and that business plan is crumbling." -- Peter Moore, 24 hours before his Microsoft resignation











botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki