



AMD compares noise reduction between its ATI Radeon HD 2600 XT and NVIDIA GeForce 8600 GT  (Source: AMD)

Visible ghosting on the NVIDIA offering with the noise reduction slider at 75%  (Source: AMD)
NVIDIA PureVideo HD causes ghosting in noise reduction tests, AMD alleges

Last week, AMD briefed members of the press regarding Silicon Optix’s HD HQV benchmark. AMD alleged its competitor, NVIDIA, cheated in the benchmark’s noise reduction section. The noise reduction test accounted for 25% of the overall score in HD HQV.

Avivo HD- and PureVideo HD-equipped offerings from both camps deliver perfect scores of 100 in the HD HQV benchmark. However, AMD alleges NVIDIA optimizes its drivers with an aggressive noise reduction algorithm that causes visible ghosting. AMD claims its own noise reduction algorithm reduces noise without leaving any visible ghosting, preserving picture detail.
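To see why a noise reduction filter can produce ghosting in the first place, consider a basic temporal filter that blends each frame with an accumulated history of earlier frames. Neither AMD nor NVIDIA documents its actual algorithm, so the Python sketch below is only a generic illustration; the temporal_denoise function and its strength parameter (standing in for the driver's noise reduction setting) are hypothetical.

import numpy as np

def temporal_denoise(frames, strength=0.75):
    """Generic temporal noise reduction sketch (not either vendor's code).

    Each output frame blends the current frame with an accumulated history
    of previous frames. A high strength averages away random noise, but it
    also keeps a fraction of every past frame in the output, so moving
    objects leave faint trails: the ghosting described above.
    """
    history = None
    denoised = []
    for frame in frames:
        frame = frame.astype(np.float32)
        if history is None:
            history = frame
        else:
            # strength = 0.75 loosely mimics a 75% slider setting:
            # 75% of the old history survives into this frame.
            history = strength * history + (1.0 - strength) * frame
        denoised.append(np.clip(history, 0, 255).astype(np.uint8))
    return denoised

With strength near zero the filter passes frames through almost untouched; pushed toward one, noise drops but the trailing edges of moving objects persist for many frames, which is the kind of ghosting AMD says it observed with the slider at 75%.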

AMD’s internal testing with NVIDIA ForceWare 163.11 beta drivers revealed the ghosting in both noise reduction test scenes. AMD provided a side-by-side comparison showing the visual ghosting.

DailyTech contacted NVIDIA for an official response. The company denies any cheating optimizations, saying the ForceWare 163.11 drivers are outdated and shipped with an overly aggressive default setting that caused the ghosting, according to Rick Allen, NVIDIA's Notebook and Multimedia PR manager.

The latest ForceWare 163.44 beta drivers have a less aggressive default setting. Users can also adjust the amount of noise reduction applied during post-processing with the 0-100-percent noise reduction slider, he added.

“As we openly told reviewers, using aggressive noise reduction settings may cause ghosting depending on the content played, so we recommend using moderate settings. We also recommend the improved 163.44 drivers released a few weeks ago, which reduce this effect,” Allen said.

NVIDIA also pushes back against the allegations, highlighting that the ForceWare driver allows users to adjust the noise reduction level themselves, or even disable the feature completely. AMD's noise reduction implementation, by contrast, is adaptive and applies varying levels of noise reduction depending on the content and what the driver deems necessary.

However, users cannot turn off AMD's noise reduction feature, which may be an issue in intentionally grainy movies such as 300, according to Allen.
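AMD has not published how its adaptive filter decides how much noise reduction to apply, but the general idea behind adaptive temporal filtering is to estimate per-pixel change between frames and back off the blending wherever the scene is moving. The sketch below illustrates only that technique; the function name and thresholds are hypothetical and are not AMD's implementation.

import numpy as np

def adaptive_denoise(prev_out, curr, max_strength=0.75, motion_thresh=12.0):
    """Adaptive temporal noise reduction sketch (hypothetical, not AMD's code).

    Still regions (small frame-to-frame difference) are blended strongly with
    the previous output to suppress noise; regions that changed a lot, which
    likely contain motion, are passed through almost untouched, so moving
    objects do not leave ghost trails.
    """
    prev_out = prev_out.astype(np.float32)
    curr = curr.astype(np.float32)

    # Per-pixel change between the previous output and the current frame.
    diff = np.abs(curr - prev_out)

    # Blend strength falls linearly from max_strength (no change at all)
    # to zero once the difference reaches motion_thresh.
    strength = max_strength * np.clip(1.0 - diff / motion_thresh, 0.0, 1.0)

    blended = strength * prev_out + (1.0 - strength) * curr
    return np.clip(blended, 0, 255).astype(np.uint8)

Because the strength collapses to zero on changing pixels, a filter built this way needs no user-facing off switch to avoid ghosting, though, as Allen notes, it also gives the viewer no way to preserve intentional film grain in still scenes.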


Comments



So whats the conclusion?
By SilthDraeth on 9/4/2007 6:44:40 PM , Rating: 2
And the last sentence was sort of vague.
quote:
However, users cannot turn off the noise reduction feature, which may be an issue in intentionally grainy movies such as 300, according to Allen.


Above that sentence it states users can disable the feature altogether for NVIDIA. So was that statement meant for AMD, or NVIDIA?

And what is the conclusion? Was the test where AMD and NVIDIA received a score of 100 run with the .11 drivers or the .44, and are the .44 drivers actually available, etc., etc.?




RE: So whats the conclusion?
By Makaveli on 9/4/2007 6:56:13 PM , Rating: 2
Users can disable it on NVIDIA's card, not ATI's. Yes, that statement was poorly worded. I'm not sure about your other question though.


RE: So whats the conclusion?
By Khato on 9/4/2007 6:58:20 PM , Rating: 1
Well, just read the paragraph above the last sentence and you have your answer.

As for a conclusion... Grats AMD, you can make your competitor's product look worse by messing with settings that you don't even give your customers. Why do they feel the need to do this? Well, just search for a review that uses Silicon Optix’s HD HQV benchmark and you'll have your answer.


RE: So whats the conclusion?
By Zstream on 9/4/2007 9:44:10 PM , Rating: 2
What are you talking about? You have no frigging idea of what the setup was. So please spare me the drama....


RE: So whats the conclusion?
By Lightning III on 9/4/07, Rating: -1
RE: So whats the conclusion?
By abhaxus on 9/4/2007 11:36:04 PM , Rating: 2
Where do you get that this is the 3rd generation of AMD cards with mandated hdcp? When I bought my 8800 GTS in March, a large part of my reason for going with that card was that finding a card from either brand with HDCP support was a chore. A few 1900GT cards had it and a few 1950 cards, and a few 7900GS and higher nvidia cards had it. But ALL of the 8800s had it at the time. So I went with an 8800.

I love my 8800 but the drivers drive me crazy (sorry for the pun). Flat panel scaling is still broken and some older games simply don't work. I've heard that AMD has similar issues with flat panel scaling but it will be at least a year before I buy a new video card so I will have to wait and see.


RE: So whats the conclusion?
By Lightning III on 9/5/2007 8:38:31 AM , Rating: 1
Let's see, it was all the way back with my X1600, or what was really the X1800 family. The next revision was the X1650 or the X1950 family of cards, and we are not counting the X1950 Pro, which was really a whole new chip in between.

So yeah, depending how you count 'em, three generations. I think my X850 Pro had it as well, but I don't think it was AIB-mandatory till the X1800 line.


RE: So whats the conclusion?
By Anh Huynh on 9/5/2007 10:50:15 AM , Rating: 2
The first AMD cards to support HDCP were the X1950XTX, X1900XT 256MB, X1950 Pro and X1650 XT. That was all one generation.

The X1900 lacked proper HDCP keys, even though the GPU was HDCP compatible.

It was a major issue: http://www.dailytech.com/article.aspx?newsid=851


By Lightning III on 9/5/2007 12:33:38 PM , Rating: 2
I got a HIS X1600 512MB Turbo that's HDCP compliant; that was the previous generation, I guess. I can scan the box for you when I get home. I was unsure about my old X850; maybe it was there on the built-by-ATI cards, but not for AIB partners.


RE: So whats the conclusion?
By Anh Huynh on 9/5/2007 12:03:54 AM , Rating: 2
I don't consider HDMI audio on the AMD solutions a necessary plus, considering it provides absolutely no benefit compared to the S/PDIF input solution.

It still doesn't output LPCM audio for HD DVDs and is still a high-definition-audio-based solution. At least with an S/PDIF input solution, you can choose your own sound card and take advantage of Dolby Digital Live encoding.

And the NVIDIA solutions support VC-1 decoding.


RE: So whats the conclusion?
By Lightning III on 9/5/2007 9:09:45 AM , Rating: 2
Really? Working with a micro-ATX or SFF system, the max PCI slot count is two, so the real choice is dual tuners or your X-Fi, or paying for the super-pricey PCI Express x1 cards.

I personally use the spdif out solution.

But not everybody starts out with the Corvette; most build an Escort first and then transition that way.

On a separate note, AMD includes their HDMI adapter. I don't see anything resembling that in the included bundle photos on Newegg; do you have to pay extra for that?

Also, I seem to remember NVIDIA boys having to pay extra for the PureVideo codecs.

Surely this is no longer true.


RE: So whats the conclusion?
By Anh Huynh on 9/5/2007 10:52:22 AM , Rating: 2
Some motherboards have onboard audio that features Dolby Digital Live encoding.

You have to pay for the PureVideo decoder itself, which allows you to decode DVD/MPEG2 content.

It's no different than having to pay for Cyberlink PowerDVD to watch DVD movies on AMD cards or take full advantage of the Avivo HD optimizations.

The DVI to HDMI adapter bundle is up to the partner.


RE: So whats the conclusion?
By Lightning III on 9/5/2007 1:35:50 PM , Rating: 2
I don't know; my PowerDVD came free with my 550 Pro, my HD-TV Wonder and my 650 Pro.

All at no extra cost.

And what I hope is a little extra compatibility, since my tuners and my video card are designed and manufactured by the same company.


RE: So whats the conclusion?
By elgoliath on 9/6/2007 5:02:38 PM , Rating: 2
If you think PowerDVD was free just because it wasn't listed on the invoice, I have something I'd like to sell you...

You paid for it, they just rolled the charge into the total price.


RE: So whats the conclusion?
By Lightning III on 9/5/2007 9:25:28 AM , Rating: 2
NVIDIA
VC1 support = software decode = cpu utilization = more heat or higher cpu requirement


RE: So whats the conclusion?
By Anh Huynh on 9/5/2007 10:54:18 AM , Rating: 2
Unless you know something NVIDIA doesn't, their PureVideo HD page specifically states VC-1 HD decode acceleration.

quote:
NVIDIA PureVideo HD technology delivers outstanding picture clarity, ultra-smooth video, vivid color, and precise image scaling for video and HD DVD and Blu-ray movies. PureVideo HD accelerates and enhances high-definition movies in H.264, VC-1, WMV, and MPEG-2 formats, delivering life-like images that have up to six times the detail of standard DVD movies. High definition post-processing features, including advanced de-interlacing, noise reduction, and edge enhancement, provide spectacular picture clarity at resolutions up to 1080p—the highest HD resolution available.


http://www.nvidia.com/page/purevideo_HD.html


By Lightning III on 9/5/2007 12:18:28 PM , Rating: 2
software acceleration is not hardware acceleration

quote:
while the 8600 GTS and 8800 GTS share roughly the same performance. The HD 2600 XT leads the pack with an incredibly low CPU overhead of just 5 percent. This is probably approaching the minimum overhead of AACS handling and disk accesses through PowerDVD, which is very impressive. At the same time, the savings with GPU bitstream decode are not as impressive under VC-1 as on H.264 on the high end.


or

quote:
VC-1 bitstream decoding doesn't have as large an impact as H.264 bitstream decoding. We would have to drop down to a significantly slower CPU in order for the difference to offer AMD an advantage


perhaps a 754 sempy or a socket A northwood

Most first-time HTPC attempts are spare-part rigs.

AMD's solution is more flexible for legacy parts or the new low power AMD BE processors

I just hope Mr Wilson got a pair of ULTRAS in the mail for this quote
quote:
For now, we're going to recommend that users interested in HTPC setups stick with the tools that can get the job done best no matter what the source material is. The only options for HD video intensive systems today are the Radeon HD 2600 and GeForce 8600 series cards. For its better handling of noise reduction (and especially the fact that it can be turned off) we recommend the 8600 GT/GTS above the other options in spite of the fact that the 2600 XT provided better CPU offloading.


Because he managed to ignore all the reasons in my previous posts in order to give the recommendation to NVIDIA, who has made a good first attempt at breaking into the ATI-dominated HTPC world.

And that's why AMD had to step forward and cry foul.


RE: So whats the conclusion?
By Thorburn on 9/5/2007 6:52:16 AM , Rating: 3
Points 2 and 3 are a little unfair.

First of all, NVIDIA uses an 80nm process for the G84 and G86 GPUs; the G80 (8800) is 90nm.

Additionally, testing has shown the 2600 XT and 8600 GTS to be roughly equal in terms of power consumption.

The PEG power connector on the 8600 GTS seems to be more a case of NVIDIA accounting for overclocking and being a little conservative than it being needed at stock.

For HTPCs, however, the HD range's mandatory HDCP support and more elegant HDMI solution make it a great choice, the lackluster 3D performance being less important in this market.


RE: So whats the conclusion?
By stevenplatt on 9/5/2007 8:05:06 AM , Rating: 3
I did a lot of research on which solution was better for HD playback, and ATI always came out on top. I am not an ATI fanboy, quite the contrary, but the ATI HD 2000 series cards are more feature-rich. There is a reason their production was delayed. I remember reading reviews where the 2600 XT would spank the 8600 GT in video quality. In numerous reviews NVIDIA magically produced new "beta" drivers that suddenly achieved perfect scores on the image quality tests. I don't doubt for a second that there was some driver optimizing. Besides, this is nothing new. Didn't NVIDIA optimize for 3DMark? In other news... my HD 2600 XT arrives tomorrow. Don't think I'm biased; I genuinely thought the image quality on the Radeon was that much better.


RE: So whats the conclusion?
By jrb531 on 9/5/2007 10:56:04 AM , Rating: 2
Nvidia cheat on tests? Never! ROTFL

Anyone remember the entire DX9 mess, in which NVIDIA produced those god-awful FX cards (good in DX8 but awful in DX9) and tried to "fix" the problem (the issue was that DX8 requires 16-bit precision and DX9 requires 24-bit, while the FX cards only had 16/32-bit, and 32-bit was very, very slow) by secretly dropping from 32-bit mode (DX9) to 16-bit (DX8) without telling you that the benchmark was not running DX9 code?

Nvidia cheat.... NEVER! *smiles*

-JB

P.S. And yes ATI is not perfect either but Nvidia still has issues with the 8xxx cards and until recently their Vista drivers were terrible!


RE: So whats the conclusion?
By Anh Huynh on 9/5/2007 10:59:39 AM , Rating: 2
AMD's Avivo HD has a slight edge in HD HQV. However, I personally find HD HQV a pointless benchmark, especially for 1080p content.

The problem is that post-processing is nice when applied to, say, television broadcasts, which are typically 1080i or 720p, where noise reduction and the like would be useful.

HD DVD and Blu-ray movies are already very well mastered and any noise or grain on the transfer is usually intentional and adds to the director's vision. Any post processing applied to an HD DVD or Blu-ray movie ruins it, IMHO.


RE: So whats the conclusion?
By swatX on 9/5/2007 8:14:34 PM , Rating: 2
Is this all AMD can do?

Complain and whine? Why not release a damn product first, then throw words at each other?


Monitor
By PrimarchLion on 9/4/2007 6:39:01 PM , Rating: 2
Does anyone know what kind of monitor is shown in that screenshot?




RE: Monitor
By ObscureCaucasian on 9/4/2007 6:42:14 PM , Rating: 2
Looks like a Dell of some sort.


RE: Monitor
By Anh Huynh on 9/4/2007 7:10:51 PM , Rating: 2
Dell 2407WFP if I recall correctly.


RE: Monitor
By TomZ on 9/4/2007 7:17:18 PM , Rating: 2
It does look just like my 2407WFP-HC - same exact control panel.


RE: Monitor
By TomZ on 9/4/2007 7:18:38 PM , Rating: 2
Correction, not "exactly" the same as the HC. The HC has a couple of extra icons associated with the - and + buttons. Probably this monitor is the older 2407WFP, non-HC.


RE: Monitor
By Tegeril on 9/4/2007 7:47:11 PM , Rating: 2
Yup, looks -identical- to the non HD 2407 I'm looking at it on :)


RE: Monitor
By Tegeril on 9/4/2007 7:48:40 PM , Rating: 2
Bah, non-HC.


RE: Monitor
By retrospooty on 9/4/2007 10:15:45 PM , Rating: 2
I can confirm that. I have the regular 2407 and its the same.


RE: Monitor
By idconstruct on 9/4/2007 11:50:02 PM , Rating: 2
w00t for our sexy 2407's! :P


And the difference is...?
By Goty on 9/4/07, Rating: 0
RE: And the difference is...?
By Hacp on 9/4/2007 9:20:24 PM , Rating: 2
Click on the picture, and look at the hood. You clearly see the vertical lines running across the hood of the Nvidia picture.


RE: And the difference is...?
By Goty on 9/5/2007 12:28:54 AM , Rating: 1
I can see the vertical lines in both samples, so that's not enough of a difference to matter for me.


RE: And the difference is...?
By vignyan on 9/5/2007 2:34:26 AM , Rating: 2
Agreed. I don't know why you were rated down... but I looked at the picture up close and there were no ghosting artifacts present in the picture. AMD's picture is prettier in the sense of proper lighting and color, but hey... they provided the pictures!


RE: And the difference is...?
By rdeegvainl on 9/5/2007 5:01:23 AM , Rating: 3
Wrong way to look at it. Look at the first picture of both NVIDIA and ATI, then look at the second picture of them both. Notice that in the NVIDIA one you can still see the car; that is the ghosting going on.


RE: And the difference is...?
By Goty on 9/8/2007 8:09:22 AM , Rating: 2
Aren't you the quick one? That picture was added later.


RE: And the difference is...?
By Anh Huynh on 9/5/2007 11:03:10 AM , Rating: 2
Look at the bottom right near the ground and look at the red line. Compare them side by side. The NVIDIA solution has a bigger line due to ghosting.

I've also added a second image that shows the ghosting better.


Fanboy vs Fanboy
By jrb531 on 9/5/2007 11:09:51 AM , Rating: 2
You all know this is going to boil down to...

1. I am an NVIDIA fanboy, so this means nothing and ATI is just trying to stir things up because they can't compete.

2. I am an ATI fanboy, so this is yet another example of NVIDIA cheating on drivers, which is why NVIDIA cards "appear" faster than ATI's.

The truth, as always, lies somewhere in the middle. IMHO, until we see real DX10 games (produced as DX10 from the very start), which cards are faster or look better means very little. After all, those NVIDIA "FX" cards were fast as all hell in DX8, but when real DX9 games hit, the truth came out.

After all, who pays $300+ for a video card just to watch movies? You could do it much cheaper, so in the end gaming is the reason you pay so much for a video card, and in this regard the jury is still out on "true" DX10 games.

-JB

P.S. Just for the sake of argument, what will happen if next year a bunch of real DX10 games come out and it turns out the ATI cards are better at DX10 but NVIDIA is better at DX9? Probable? No, but possible :)




RE: Fanboy vs Fanboy
By Lightning III on 9/5/2007 1:23:33 PM , Rating: 1
First, this is not about gaming.

This is about a $150-average card against a $179-average card, although for NVIDIA you would still have to purchase the HDMI adapter and the PureVideo codec, which should push it to around $200 on average.

Not about a $300+ card.

And it's not just to watch movies:

1. I like to surf on my 48-inch rear projection

2. Play movies

3. Record OTA (over-the-air) HD content (AMD 650 Pro required)

4. Download movies and TV from iTunes (non-cable video on demand)

5. A 750GB DVR

6. Timeslip features: pause and rewind TV

7. Universal memory card reader and FireWire connection (for slideshows or camcorder movies)

8. Interactive on-screen program guide (via SnapStream)

9. Looking to add HD DVD or Blu-ray when drives drop to a hundred dollars

10. Integrated DVD burning: if I can watch it, I can record it and burn it

11. Use it with my Rhapsody account, and I use it more than my high-dollar stereo

12. Savings: after building this I took back my Time Warner HD DVR and canceled all the premium packages, kept standard 2 through 77 plus HBO, and saved 70 dollars a month

13. And last but not least, as I can tell by showing it off to friends, they are most impressed by the scaling abilities at the porn sites; yes, it is also the ultimate porn machine

Perhaps one of these 13 reasons will appeal to you, and that is why this is a growing segment.

Regards


RE: Fanboy vs Fanboy
By elgoliath on 9/6/2007 5:01:00 PM , Rating: 2
All of your points are basically the same thing: showing PC content on your TV. So unless you want to do that, they are all moot.


RE: Fanboy vs Fanboy
By Lightning III on 9/7/2007 2:21:36 PM , Rating: 2
Hence the name HTPC, or home theater personal computer.

like uhh duhhh

I guess you're just another stupid gamer who thinks it's all about FPS.


RE: Fanboy vs Fanboy
By elgoliath on 9/10/2007 3:23:40 PM , Rating: 2
Jump to conclusions much? Reading comprehension for the win. As I said, unless you want to view PC content on your TV (which at this time is a small segment, but growing), this doesn't matter to you. I was just pointing out that your list was basically a bunch of bullets pointing to the same thing.

As far as your last comment goes: grow up. I never once mentioned anything related to gaming (those video cards are not sold as HTPC-only cards, but as gaming cards with HTPC functionality). You did mention porn though, so I guess I could call you a pervert.


That is there sample?
By gsellis on 9/4/2007 10:14:36 PM , Rating: 2
Bah. Yep, the NVIDIA pic is softer with a little more noise. Both are freaking noisy. The pic has some chromatic aberration and is just a poor sample.

While I use ATI almost exclusively, PLEASE.... benchmarking with beta drivers :(

(non-WHQL <> beta; WHQL = certified by MS testing procedures)




RE: That is there sample?
By GoodBytes on 9/4/2007 10:38:17 PM , Rating: 2
Heuu, they're like the same image.
If I look extremely closely, without reading which video card it is while examining it, I find the GeForce 8600 looks better.


RE: That is there sample?
By gsellis on 9/5/2007 9:49:29 AM , Rating: 2
I have an Acer AL2051W driven through a 1950 Pro at home, and they are definitely different.


Why does NVIDIA always recommend beta drivers?
By sc3252 on 9/4/2007 7:49:28 PM , Rating: 5
I have always noticed that when a major bug pops up that is sometimes show-stopping (not in this case), they recommend using their beta drivers until, down the road in six months, they decide to release a new driver. It's really annoying, and anyone who has to use a production system can't do this. So I complain, and then some stupid AnandTech commenter says, "Why don't you use the beta drivers?" Why the fuck would I want to use beta drivers? They are called beta for a reason; there are problems with them. Seriously, NVIDIA needs to actually release drivers, instead of consistently releasing beta drivers and telling people to use them.




By Roy2001 on 9/5/2007 12:55:58 AM , Rating: 1
My E6400's CPU usage is about 55% or even lower for HD/BD/x264 playback. Who cares about video card decoding?




By Lightning III on 9/5/2007 9:22:04 AM , Rating: 4
Heat.

Or:

Offloading CPU utilization could enable you to run a 45-watt BE processor for HD playback, maybe even run a 350-watt PSU for the whole system.

Or:

Anybody who has to put their HTPC in a poorly ventilated entertainment center.


Please, can AMD stop complaining............
By gudodayn on 9/4/2007 9:22:40 PM , Rating: 1
Please, can AMD stop complaining about what others are doing, work on their own stuff, and get K10 out on the market!!!!
Don't get me wrong, I use a 4200+ X2 and it has served me well; even now, OCed to 2.75GHz, it's still very capable.
I am hoping to upgrade soon though, so can AMD please, please, please get your K10 out on the market.
If it's better than the Conroe quads, then I'll get a K10. If not, I'll go with Intel quads.
Performance matters ~ not the brand!!




By SlyNine on 9/5/2007 5:25:58 AM , Rating: 3
I'm assuming that as an owner of a 4200+ (like me), it's more of a cost/performance issue, not a pure performance issue. I'll get whichever one gets me to the best performance level I can afford, not the one that happens to have the best $1000 CPU. If AMD doesn't quite match Intel in performance at the highest end, I'm sure they will try to match them in price. Let's just hope they can bring back some balance.


LOL-copter
By umeng2002 on 9/4/2007 8:36:54 PM , Rating: 3
I see the ghosting, but I love how a company ALWAYS washes out the color of the competition's screenshots. Marketing BS at its finest.

I wish these companies (NVIDIA and ATI/AMD) would spend their time making their products work as advertised instead of nitpicking each other's crappy drivers.




Sharpening and more differences
By edlight on 9/5/2007 2:17:43 AM , Rating: 3
NVIDIA shows lots of detail in the headlight lens, while AMD blocked up the highlights, so those details are blank in their screenshot. However, that gave AMD more contrast, so theirs is snappier and overall more attractive. It comes down to whether you like snap or detail. Hopefully both cards can be adjusted either way.

Nvidia has something AMD doesn't besides a noise removal slider -- a sharpening slider. It does amazing things to DVDs and streaming video.

NVIDIA is doing dumb things in their drivers' controls, and their user feedback page was still under construction last time I checked, so they evolve rather slowly. The last driver I tried had an issue with Chris-TV and I had to get rid of it. You can't be sure they will keep the desktop sharpening; they may only let you sharpen the overlay. The desktop sharpening sharpens up games that are too soft, such as Trainz.

The sharpening sways it to NVIDIA for me, until AMD comes up with its own. Hard to understand why they haven't; it's such a simple idea.
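For context on what a sharpening slider typically does: driver-level sharpening is commonly some form of unsharp masking, where a blurred copy of the image is subtracted from the original and the difference is added back, scaled by the slider. The Python sketch below (using SciPy's gaussian_filter) illustrates only that generic technique and is not NVIDIA's implementation; the unsharp_mask name and its parameters are hypothetical.

import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, amount=0.5, radius=1.5):
    """Classic unsharp-mask sharpening sketch (illustrative only).

    amount plays the role of a 0-100% sharpening slider (0.0 to 1.0 here):
    the difference between the image and a blurred copy is scaled and added
    back, which boosts edges and fine detail. Assumes a single-channel
    (grayscale) image; color frames would be filtered per channel.
    """
    img = image.astype(np.float32)
    blurred = gaussian_filter(img, sigma=radius)
    sharpened = img + amount * (img - blurred)
    return np.clip(sharpened, 0, 255).astype(np.uint8)

Pushed too far, the same trick exaggerates compression artifacts and noise, which is why it pairs naturally with the noise reduction controls discussed above.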




By beepbeep999 on 9/5/2007 9:16:38 AM , Rating: 1
I found this review from The Tech Report which mentioned the problem. It's funny, I didn't understand what he meant at the time, but I can see it now in the pictures. So NVIDIA sends drivers to the press to try to get high HD HQV scores in the review, then sticks us with crappy noise reduction in the general release? Freaking A$#4holes. Not the first time they've cheated on a benchmark, if anyone remembers the 3DMark saga.

Beep Beep

http://techreport.com/articles.x/12843/11

quote:
Also, even on the 8600 GTS, Nvidia's noise reduction filter isn't anywhere near ready for prime-time. This routine may produce a solid score in HQV, but it introduces visible color banding during HD movie playback. AMD's algorithms quite clearly perform better




"People Don't Respect Confidentiality in This Industry" -- Sony Computer Entertainment of America President and CEO Jack Tretton











botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki