

Available on all Radeon HD 4000 series video cards
Video enthusiasts and media corporations alike will be looking very closely at ATI video cards

ATI, the graphics division of AMD, has been working to spread its Stream GPGPU technology, which helps speed up applications by exploiting the parallel nature of its GPU products. Since ATI is a part of AMD, Stream has been designed to work alongside the CPU, instead of exclusively offloading all work onto the GPU. Signal processing, financial analysis, and protein modeling are just some of the fields where Stream can help.

Video transcoding is one of the most CPU-intensive tasks, especially when converting high definition video. ATI has been working on this problem for a while, and has released its ATI Video Converter, which can use Stream technology. UVD2 Fast Decode and GPU Scaling will be available for the entire Radeon HD 4000 series of graphics cards. GPU encoding in parallel stages will not be available on 45xx and 43xx cards due to bandwidth limitations, but all other 4000 series cards support it.

ATI is partnering with Independent Software Vendors (ISVs) to use Stream's transcoding framework. One of the first out is CyberLink, which is leveraging ATI's Unified Video Decoder (UVD) in its MediaShow Espresso video converter application. This allows it to quickly convert digital video files for use on portable devices like Apple's iPhone and Sony PSP. CyberLink has also optimized its PowerDirector 7 video editing software to take advantage of ATI Stream.

Tests conducted by ATI showed a reduction in transcoding time of at least 50% in most applications. For example, a 94-second 1920x1080 video encoded at 24 frames per second using H.264 took 131 seconds to transcode to a format suitable for the iPhone using just the CPU. It took just 46 seconds using a mid-range Radeon HD 4670 video card, reducing the time needed by almost 65%. Results would be even more dramatic using a higher-end Radeon 4870 or Radeon 4890 GPU. Additional GPUs through CrossFire would provide even greater performance improvements.
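The arithmetic behind those percentages is easy to verify. A quick worked calculation using the numbers quoted above (not part of the original article):

```python
# Transcode times for the 94-second 1080p H.264 clip quoted above.
cpu_seconds = 131  # CPU-only transcode
gpu_seconds = 46   # with a mid-range Radeon HD 4670 assisting

reduction = (cpu_seconds - gpu_seconds) / cpu_seconds
speedup = cpu_seconds / gpu_seconds

print(f"time saved: {reduction:.0%}")  # 65%
print(f"speedup:    {speedup:.1f}x")   # 2.8x
```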

Support is currently supplied by a hotfix to Catalyst 9.5, version 8.612.3 RC2 dated May 25. Full support will be included in Catalyst 9.6, which will be released in mid-June.



By aegisofrime on 5/28/2009 2:33:01 PM , Rating: 2
ATI should forget about all these half-assed transcoders and start work on a GPU-accelerated x264 encoder. x264 offers far more options and features than Badaboom, PowerDirector and MediaShow. Hell, MediaShow doesn't even let you choose custom resolutions and bitrates! Nobody doing serious encoding is going to use MediaShow for that.

I do a lot of video processing via Avisynth, and my workflow goes like this: Avisynth ---> hfyu ---> 2-pass x264 encoding. MediaShow and Badaboom do not support opening hfyu files, so they are useless to me.
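For anyone unfamiliar with the 2-pass step in that workflow, here is a minimal sketch of how the two x264 invocations might be assembled. The file names and bitrate are placeholders; `--pass`, `--bitrate`, and `--output` are standard x264 command-line options, and actually running the commands is left as a comment:

```python
def two_pass_x264(src, out, bitrate_kbps):
    """Build the command lines for a 2-pass x264 encode.
    src can be an Avisynth script or any input x264 accepts."""
    base = ["x264", "--bitrate", str(bitrate_kbps)]
    # Pass 1 only gathers statistics, so its video output is discarded.
    pass1 = base + ["--pass", "1", "--output", "/dev/null", src]
    # Pass 2 reuses the stats file to hit the target bitrate accurately.
    pass2 = base + ["--pass", "2", "--output", out, src]
    return pass1, pass2

p1, p2 = two_pass_x264("clip.avs", "clip.264", 1500)
print(" ".join(p1))
print(" ".join(p2))
# To actually run them: subprocess.run(p1, check=True), then the same for p2.
```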

RE: x264
By omnicronx on 5/28/2009 3:55:38 PM , Rating: 5
ATI has laid the groundwork, I don't see how you can expect anything else. In this case ATI can do nothing more than lend a helping hand to the guys who are continuing with x264 development.

On the other hand, I do agree with everything else you said. x264 is miles ahead of other codecs.

RE: x264
By omnicronx on 5/28/09, Rating: 0
RE: x264
By VitalyTheUnknown on 5/28/09, Rating: -1
RE: x264
By omnicronx on 5/28/2009 8:49:10 PM , Rating: 4
It's not the responsibility of ATI or Nvidia to implement the changes you are asking for. ATI is a video card company that is laying down a framework which anyone can take advantage of. ATI should not have to cater to certain pieces of technology; it is up to the x264 team to take advantage of this technology, not the other way around. From a development perspective it is a terrible idea to cater to a particular piece of technology, especially one that is not under your control.

I understand why you may not agree, but I think you are blaming the wrong party here.

As for your first comment, I obviously like to hear myself talk, thanks for asking.

RE: x264
By VitalyTheUnknown on 5/28/09, Rating: 0
RE: x264
By VitalyTheUnknown on 5/28/09, Rating: 0
RE: x264
By EricMartello on 5/29/2009 2:17:41 AM , Rating: 3
ATI is offering a proof of concept. Yes, their GPU encoder is optimized for speed over output quality so it's probably going to output relative crap...BUT...that is not to say that in the future it will be impossible to write an encoder that produces high quality output and uses the GPU to speed up its performance.

Neither ATI nor NVIDIA is HARD WIRING any kind of x264 codec into their hardware; it is software based, and eventually, with the right motivation, things like CUDA and ATI Stream can be likened to CPU extensions like MMX, SSE, 3DNow and such... where an encoder will use it if available and benefit from it. If it's not there, it will encode as normal.
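That MMX/SSE analogy is the classic runtime-dispatch pattern: probe for the accelerator, use it if present, fall back otherwise. A hypothetical sketch of the idea follows; `detect_gpu_backend` and both encoder paths are invented names for illustration, not a real ATI or NVIDIA API:

```python
def detect_gpu_backend():
    """Stand-in probe for a GPGPU runtime (Stream/CUDA). Returns a
    callable encoder if one is available, else None. Stubbed here
    to simulate a machine without the runtime installed."""
    return None

def encode_on_cpu(frames):
    # Plain CPU path: always available, used as the fallback.
    return [f"cpu-encoded:{f}" for f in frames]

def encode(frames):
    # Dispatch once, like an MMX/SSE-aware encoder: take the
    # accelerated path if present, otherwise fall back silently.
    gpu = detect_gpu_backend()
    backend = gpu if gpu is not None else encode_on_cpu
    return backend(frames)

print(encode(["frame0", "frame1"]))  # CPU fallback on this machine
```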

RE: x264
By aegisofrime on 5/29/2009 2:57:55 AM , Rating: 3
I'm not saying that it's ATI's obligation to write an ATI Stream version of x264. Sorry if I sounded that way. However when you think about it, it's very good business sense.

nVidia has constantly been talking about how great Badaboom is. However, as I said, Badaboom pales in comparison to x264 in terms of quality, configurability (I'm aware that's not a word, but it gets the meaning across), and features. To a gamer who also does a lot of video encoding, an ATI Stream x264 encoder might tip the scales in favor of buying an ATI card when he's considering an upgrade.

CUDA and ATI Stream are all about value-adding. What do I mean? Let's compare the 4890 and GTX 275. They perform similarly, and are priced in the same range as well. Other than brand loyalty, what's gonna make a gamer choose one over the other? Extra value provided by GPGPU applications!

RE: x264
By VitalyTheUnknown on 5/29/2009 4:41:29 AM , Rating: 1
If you look at the second picture (the new ATI Stream transcoding runtime) you will see that they apply UVD, which is part of ATI's Avivo HD. Since UVD is based on an ATI Xilleon video processor incorporated into the die of the GPU, it is a hardware-based solution; software accompanies it, of course, but you can't fix something with software which is fundamentally flawed.
Well, don't get me wrong, I hope that they succeed, and I'm waiting for the first genuine results on "perceived image quality" and "compression efficiency", but until now all their tinkering with MPEG-4 Part 10 compression has been fruitless.
No independent software developer can improve what ATI offers; it seems people here just don't get it.

RE: x264
By EricMartello on 5/29/2009 6:17:58 AM , Rating: 3
The Xilleon processor is not a codec, it is a decoder which is used to enhance MPEG-2 and H.264 video playback. UVD means "Unified Video Decoder" and it is being used to decode the video prior to processing and transcoding it (in the second pic of this article). The quality of the decode is going to be identical to that which you'd see on screen anyway, because the purpose of the Xilleon processor is to decode video.

ATI Stream Computing, on the other hand, is not tied into that specific video processor and is more like AMD's answer to NVIDIA's CUDA. The parallel encoding processes are programmable and controllable in software...if they are not, then ATI Stream is going to be DOA.

RE: x264
By VitalyTheUnknown on 5/29/2009 7:33:38 AM , Rating: 1
On your first point you are completely right, but I think you misunderstood me. What I meant to illustrate by UVD is that it is still part of their GPU engine; it is hardware based and can't be changed or substantially improved like some people said here, just like their encoder, if we believe ATI's claims that it is actually a genuine GPU encoder.
Their encoding algorithm is probably complex, but it is not what produces the quality that we expect from software encoders like the open source x264 library.
ATI's environment in this instance is a closed environment; no one has any routes or instruments to work with or improve ATI Stream computing.

RE: x264
By dardas on 5/29/2009 7:26:55 PM , Rating: 3
"ATIs claims that it is actually a genuine GPU encoder"

I'm sorry, but you are completely WRONG on that account.
ATI's "Stream", identical to "CUDA", is just a way to write software that uses the GPU for computations. What you do with it is up to you.

What differentiates the two is that NVIDIA chose to do everything on the GPU, while ATI decided to use a more balanced approach: some of the functions run on the GPU, others still use the CPU. This approach led them to use the already present UVD engine (like NV's PureVideo, but using dedicated transistors and not general shaders) to DECODE the original video. That frees up CPU resources tremendously when transcoding HD content, and doesn't require NV's shader power. The ENCODING is then distributed between the general purpose stream processors of the GPU and the CPU.

Hence, the quality of the encoder is totally up to the Stream programmer. The only constant will be the preliminary video decoding (which is of superb quality, anyway).
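The split described in this comment, with a fixed-function UVD decode feeding an encode stage spread across GPU stream processors and the CPU, can be pictured as a toy pipeline. All function names here are hypothetical illustrations, not ATI's actual API:

```python
def uvd_decode(bitstream):
    """Models the fixed-function hardware decode: not programmable,
    so its output is a constant regardless of the encoder used after."""
    return [f"raw:{pkt}" for pkt in bitstream]

def encode_frame(frame, unit):
    # The programmable half: each frame's encode work can land on
    # either the GPU's stream processors or the CPU.
    return f"{unit}:{frame}"

def transcode(bitstream):
    frames = uvd_decode(bitstream)
    units = ["gpu", "cpu"]  # naive round-robin split of encode work
    return [encode_frame(f, units[i % len(units)])
            for i, f in enumerate(frames)]

print(transcode(["pkt0", "pkt1", "pkt2"]))
```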

RE: x264
By VaultDweller on 5/29/2009 8:08:19 AM , Rating: 2
Any comments I've seen from the x264 developers indicate that they've looked into CUDA, but they don't think it is worthwhile to implement. Whatever led them to that conclusion is probably also true of Stream.

Maybe OpenCL will change that. I don't know, but I hope so.

By Zandros on 5/28/2009 1:33:00 PM , Rating: 5
Speed comparisons are worth nothing without proof of equal quality and file size.

RE: Yawn
By Doormat on 5/28/2009 1:51:27 PM , Rating: 5
This is basically what I came here to post. It's one thing to convert the same video to H.264, but it's another thing to do it at the same quality at a lower bitrate, or higher quality at the same bitrate. One of the problems that plagued CUDA until recently (2.2) was that you had to load everything into GPU memory to access it, and the x264 developers concluded at the time that it wasn't worth it to write a motion estimation engine for CUDA. It's possible they may revise this as more features get added to CUDA and it migrates to OpenCL, but as of right now, it doesn't seem likely.

I follow x264 (and Handbrake) development, and watch what types of improvements those guys can pump out. Just so far this year, x264 performance has been boosted substantially. So if they're using an old version of x264 to compare to, then they could be off by as much as 15-20% on a Core 2 processor.

RE: Yawn
By plonk420 on 5/28/2009 2:28:26 PM , Rating: 2
yeah, unless "CPU" is x264 (with settings cranked), and "GPU" is 95% of x264's quality ... sorry, not interested.

RE: Yawn
By omnicronx on 5/28/09, Rating: 0
RE: Yawn
By Darksider on 5/28/2009 1:56:46 PM , Rating: 1
Not to mention the fact that the nVidia card was not running Badaboom. I would like to see numbers with each of them using their 'special' transcoding programs.

Regardless, this is good news for ATI users with 4000-series cards.

RE: Yawn
By omnicronx on 5/28/2009 2:47:02 PM , Rating: 1
I assume they are using the same encoder, thus there is no reason the size/quality would be any different. CyberLink PowerDirector, for example, will produce the same file size and quality regardless of the method used to offload to different devices, as it still uses the same encoder regardless of where the stream is being processed. Where the actual process of transcoding is offloaded to is irrelevant.

RE: Yawn
By mindless1 on 5/28/2009 5:01:32 PM , Rating: 2
You've never seen encoders that had different subroutines, different adjustable settings that aren't necessarily accessible by the user? It wouldn't be surprising if certain encoder features were abandoned in the case of one or the other doing the processing if a large enough speed gain were reaped.

That doesn't necessarily mean the GPU is slower than the test implied with same quality and bitrate, it could be the opposite instead. Bit for bit comparison of the frames should provide some clues.

RE: Yawn
By knutjb on 5/28/2009 5:59:56 PM , Rating: 2
It doesn't matter how much "proof" ATI provides in a press release. Who believes their claims, or Nvidia's or Intel's, at face value? I'm just waiting to see a reputable source put it to a test. It might pan out.

By tyildirim on 5/28/2009 1:23:16 PM , Rating: 2
that's quick...hope to have it some time in my rig..

RE: wow..
By chmilz on 5/28/2009 1:27:05 PM , Rating: 5
ATI just patched in a kick to nVidia's balls.

RE: wow..
By FaceMaster on 5/28/09, Rating: -1
RE: wow..
By GodisanAtheist on 5/28/2009 10:53:54 PM , Rating: 2
Personally, price and games would be the deciding factor, with price being paramount.

If they are both the same price, move on down to games.

Both DX 10.1 and PhysX are supported by a handful of mutually exclusive (and mutually lackluster) titles, but if someone is dying to play some cryo-what-have-you or BattleForge or whatever... then it should make or break the buying decision.

As it stands the future of both Physx AND DX 10.1 is sealed, for their own respective reasons.

RE: wow..
By FaceMaster on 5/29/09, Rating: 0
RE: wow..
By FaceMaster on 5/29/2009 8:11:08 PM , Rating: 2
I mean ATI fanboys, just if anybody was wondering.

I have to ask, why is everybody here an ATI fanboy? I posted a long, reasonable comment saying how PhysX swayed my decision between two very similarly specced cards. Is it a sin to say that? Is it a crime to buy a car that offers acceleration at the cost of tattered cloth and broken windows NOW? (No, it isn't. It's just sensible.) It sounds to me like there are a lot of people on these forums with about as much sense as a devout Christian, instantly rating down anybody who dares speak against their beloved hero / graphics card in any sort of negative manner, even if it's justified.

I know that AMD has gone through a hard time, but that's no reason to support them at every opportunity. You're just as bad as any other fanboys.

Oh and before somebody shouts 'OH UR A FANBOIII', I have had 5 graphics cards altogether, and it's gone ATI, NVIDIA, ATI, NVIDIA.... you guessed it, ATI. Perhaps this gives me more reason and balance than you lot.

Not excited
By lemonadesoda on 5/28/2009 7:55:09 PM , Rating: 1
There is ANOTHER way to halve the transcoding time. Use an Intel QUAD. Use an i7 to halve it again. And Nehalem-EP dual to halve it again.

Much simpler.

RE: Not excited
By GodisanAtheist on 5/28/2009 10:45:59 PM , Rating: 4
And a helluva lot more expensive than a 4670...

Although how changing HSF, RAM, mobo, and the processor is simpler than slotting a videocard is a mystery to me.

Bizzaro universe, perhaps?

By Ammohunt on 5/29/2009 3:36:34 PM , Rating: 2
Until ATI/AMD improves their software/drivers, ATI has no place in any system I build.

RE: Garbage
By AnnihilatorX on 5/29/2009 3:57:34 PM , Rating: 2
What exactly is wrong with their drivers?
I have had zero issues so far with my HD4850 on Vista x64. I have never had a crash resulting from display drivers.
I owned Nvidia card previously.

The only thing is there's no automatic profile activation based on the game exe you run. But that's not an issue for me.

proprietary solution
By alpensiedler on 5/28/2009 1:50:15 PM , Rating: 3
Although this is a really nice app/library, what we need is for someone to implement transcoders in OpenCL so that we don't have to worry about what GPU we are transcoding on.

Additional GPUs.....
By flurazepam on 5/28/2009 1:30:28 PM , Rating: 2
Additional GPUs through CrossFire would provide even greater performance improvements.

Although not explicitly demonstrated, I both welcome this and would like to see it demonstrated. Right now Badaboom (for nVidia GPUs) cannot use both GPUs in tandem, only independently. It would be slick to be able to use as much of the system as possible to do this. Maybe this will entice nVidia to enable this as well.

Good show, ATI
By Ozziedogg on 5/28/2009 5:27:26 PM , Rating: 2
Roll on the GPGPU.

Roll on parallel coding and processing.

By CZroe on 5/29/2009 5:19:42 AM , Rating: 2
As someone who bought a notebook for the X1400 discrete graphics and the promise of AVIVO encoding, I feel very burned by ATI's promised encoding technologies. When they finally released it for "all X1000-series GPUs," it was a lie. Not all manufacturers supported Mobile Catalyst and the hacked Mobile Cat drivers available didn't include it.

Instead, I needed to download the same hacked AVIVO software that non-ATI customers used. Anandtech proved that it was not using the GPU as promised, but that it was still the fastest encoder for a Portable Media Center (exactly what I wanted to use it for). That means I could have just used the hacked software on my nVidia MCE desktop in the first place. Of course, ATI never moved forward and added GPU support or tied it into Windows Media Center for encoding your television shows and syncing to the PMC, and it turned out that the high speeds were hard to quantify without any input into the encoding process (bitrates and other settings).

Looks like Nvidia is failing
By FaceMaster on 5/29/2009 8:06:10 PM , Rating: 2
Nvidia's latest graphics cards have been inferior to AMD's sleek, efficient designs.

ATI has shown true innovation, knocking Nvidia off the top spot. Well done, AMD. I hope the 5000 series does the same thing.

So says ATI...
By Operandi on 5/28/09, Rating: -1
RE: So says ATI...
By ajfink on 5/28/2009 1:34:04 PM , Rating: 4
They used a mid-range CPU, but also a mid-range GPU. A 4890 or 4870X2 would clearly do better, as would an i7.

RE: So says ATI...
By Jansen on 5/28/2009 1:38:29 PM , Rating: 5
When I spoke to ATI, they said they specifically chose this setup to show a mid-range, affordable solution. It would work much faster with a Phenom II or Core i7.

They pointed out that it is basically free to everyone who bought a 4000 series Radeon.

RE: So says ATI...
By Chapbass on 5/28/2009 1:34:12 PM , Rating: 4
Wait wait, does everyone own an i7? Yeah, the difference isn't AS great in the higher end machines, but at the same time, those that already own ATI cards get a free boost to this, and corporations/media pros no longer need to buy an entire new machine to boost their performance by quite a huge number.

Definitely not worthless. Maybe they did it to exaggerate a little bit, but any time something gets roughly DOUBLE the performance, I don't think you can really just pass it off...

RE: So says ATI...
By omnicronx on 5/28/2009 2:50:11 PM , Rating: 2
Heh, the 2.7GHz X2 is not really from the X2 line. It is exactly the same as the Phenom B2 stepping with only two cores, i.e. it has L3 cache.


