
Call of Juarez (Source: PCPer, Ryan Shrout)

Lost Planet (Source: PCPer, Ryan Shrout)
AMD and NVIDIA are at it again, accusations abound

Marketing is an interesting game big companies like to play. It is the equivalent of two schoolchildren yelling, “My dad can beat up your dad,” back and forth without ever accomplishing anything.

AMD and NVIDIA are having the same fight, albeit on the DirectX 10 playground, with NVIDIA touting the GeForce 8800 GTS' performance prowess in Capcom's Lost Planet while AMD flexes its muscle in Techland's Call of Juarez demo.

Ryan Shrout has posted an editorial on the events leading up to AMD’s ATI Radeon HD 2900 XT launch and the post-launch events.

For the launch of the ATI Radeon HD 2900 XT, AMD issued a benchmark of Techland's Call of Juarez for use in the impending reviews. The benchmark, however, had an MSAA flaw that caused it to fail on NVIDIA hardware with MSAA enabled. NVIDIA became aware of the issue and worked with Techland to issue a patch a day before the release of AMD's ATI Radeon HD 2900 XT. According to sources obtained by Shrout:
NVIDIA: "NVIDIA has a long standing relationship with Techland and their publisher Ubisoft. In fact, the original European version Call Of Juarez that was launched in September 2006 is part of the "The Way Its Meant To Be Played" program.  As a result of the support Techland and Ubisoft receives for being part of the "The Way Its Meant To Be Played" program, NVIDIA discovered that the early build of the patch that was distributed to the press has an application bug that violates DirectX 10 specifications by mishandling MSAA buffers which causes DirectX 10 compliant hardware to crash.  Our DevTech team has worked with Techland to fix that and other bugs in the Call of Juarez code.

Benchmark testing for reviews should be performed with the latest build of the game, one that incorporates those DirectX 10 fixes.  Previous builds of the game should not be used for benchmarking with or without AA. For details on how to get the patch and for further information please contact Techland or Ubisoft."

The benchmark still ran on NVIDIA's DirectX 10 hardware; however, MSAA had to be disabled. Even without MSAA enabled, NVIDIA's GeForce 8800 GTS still lost to AMD's ATI Radeon HD 2900 XT.

Not one to accept a loss, NVIDIA connected reviewers with the PR company for Lost Planet's developer, giving them a copy a day before the demo was to go public. Lo and behold, NVIDIA hardware beat out AMD hardware in the demo.

AMD countered in an email sent to reviewers, claiming it had insufficient time to optimize for the benchmark. Shrout cites:
AMD: "Today Nvidia is expected to host new DirectX 10 content on in the form of a “Lost Planet” benchmark.  Before you begin testing, there are a few points I want to convey about “Lost Planet”.  “Lost Planet” is an Nvidia-sponsored title, and one that Nvidia has had a chance to look at and optimize their drivers for.  The developer has not made us aware of this new benchmark, and as such the ATI Radeon driver team has not had the opportunity explore how the benchmark uses our hardware and optimize in a similar fashion.  Over the next little while AMD will be looking at this, but in the meantime, please note that whatever performance you see will not be reflective of what gamers will experience in the final build of the game."
Ryan Shrout draws attention to NVIDIA's The Way It's Meant to Be Played and AMD's Get in the Game marketing campaigns and the implied accusations made by both companies.

"While neither company would really come out and say it, both NVIDIA and AMD are hoping that the idea will come across that the other 'cheated' in each particular DX10 benchmark scenario," Shrout said. "AMD kept the 'Juarez' demo from performing up to the level it could on NVIDIA hardware and NVIDIA kept the 'Lost Planet' demo from running properly on AMD hardware; or at least that's what the general reader might gather from all of this."

Both companies can make all the accusations they want, but in the end, the consumer is the one who loses. Until a reputable developer that isn't in bed with AMD or NVIDIA releases a DirectX 10 game or benchmark, consumers are left in the dark about which card will provide the superior DirectX 10 gaming experience.

Comments

factor in the Vista overhead?
By hellokeith on 5/24/2007 3:38:10 PM , Rating: 3
It's pretty obvious that both AMD & Nvidia Vista drivers are very immature, and then factor in the Vista overhead - what is it like 15-20% framerate penalty compared to XP DX9? - and you have a fairly meaningless DX10 battle.

Give both companies another 4-6 months, and all this will be forgotten.

RE: factor in the Vista overhead?
By Puddleglum1 on 5/24/2007 5:44:52 PM , Rating: 4
...and then factor in the Vista overhead - what is it like 15-20% framerate penalty compared to XP DX9?
That's not logical. The overhead of Vista should compare DX9 Vista to DX9 XP, which was shown to be minimal in the review.

From the review:
As you can see, the DX9 codepath currently runs faster than the DX10 version. At 1600x1200 the DX10 codepath ran 16% slower than DX9, both under Windows Vista with the GeForce 8800 Ultra in the snow demo, and 2% under the cave sequence.

RE: factor in the Vista overhead?
By hellokeith on 5/24/2007 7:12:39 PM , Rating: 2
Wow, you couldn't be more wrong.

Vista is noticeably slower in DX9 vs XP DX9.

By Puddleglum1 on 5/24/2007 8:08:13 PM , Rating: 2
That's a good link =)

It looks like, for Lost Planet, the difference is minimal, but that in other games it's a rather stark difference.

RE: factor in the Vista overhead?
By Zelvek on 5/25/2007 1:19:47 AM , Rating: 2
well in my personal experience I have had 2% to 6% frame loss in switching over to Vista, and given that my frames are still at outrageously high levels that no human can possibly see... frankly I don't give a damn.

And the article itself says the same thing
Really, there was only one instance where Vista was able to pick up a few more frames than XP - World of Warcraft at greater than 90fps, where the human eye can't even see the difference.

RE: factor in the Vista overhead?
By Goty on 5/30/2007 8:30:22 AM , Rating: 2
HardOCP is hardly a reliable source anymore. There are so many things wrong with their reviews and methods lately and they base their conclusions on such tenuous results that it's not even worth looking at.

Heh
By James Holden on 5/24/2007 1:58:28 PM , Rating: 5
It's fun to see these companies pissing at each other when the fact remains that maybe 1,000 people will even play either of these games.

I wonder which end of the fence Epic and Crysis end up on.

RE: Heh
By Rampage on 5/24/2007 5:06:04 PM , Rating: 5
And those are the only engines that really matter.

Regardless of either ATI or NV being in bed with Epic or Crytek.. He Who Dominates in their respective engines wins the DX10 crown as far as most of us are concerned.

They will be massively more popular than both Juarez and Lost Planet, and will power many games for years to come.

And no, 3dMark 2010 and its floptastic marks mean nothing.
Unreal Engine 3 and CryENGINE2 decide it all.

RE: Heh
By thartist on 5/26/2007 7:33:56 PM , Rating: 1
Thanks a lot for speaking my mind.

RE: Heh
By Tides on 5/30/2007 3:58:17 AM , Rating: 2
We all know the only engine that really matters is the Return to Zork engine.

Sad Sad World
By Mitch101 on 5/25/2007 1:28:23 PM , Rating: 2
It's a very sad world when we only have ATI and NVIDIA as a choice.

Word of silence for:

And any others I left out.

RE: Sad Sad World
By Vertigo101 on 5/25/2007 3:46:24 PM , Rating: 2
*Quietly bows head*

Good old Oak Technologies....

I remember when I got my first S3 VirgeDX card...boy it was junk.

And 3dFX, well, I wanted my Voodoo 5 6000!

And Number 9! Here I thought nobody else remembered them!

RE: Sad Sad World
By spluurfg on 5/27/2007 5:28:12 AM , Rating: 2
If I remember, only Matrox and 3DFX were ever real competitors to ATI/NVIDIA as far as 3D graphics went. I never knew Hercules or Diamond made their own graphics processors... I always saw them using NVIDIA chips. Don't forget about SiS =P

RE: Sad Sad World
By Axbattler on 5/29/2007 6:09:18 PM , Rating: 2
I remember Hercules from before 3D acceleration. Matrox was renowned for its 2D performance, but I don't think they were particularly ahead of S3 in performance in later years (and it took them so long to get OpenGL drivers right). For 3D, PowerVR was not a big player, with a troubled start (not much different from ATI back in the day, really), but they were starting to get things right by the time the Kyro II was released. Shame that they picked that time to withdraw.

3DMark07?
By therealnickdanger on 5/24/2007 2:18:39 PM , Rating: 2
Isn't this the whole point of 3DM? A somewhat generic benchmark that both hardware makers can optimize for all they want? Some argue that 3DM isn't a true enough representation of actual hardware performance... but games themselves can't be good measures either since they are often lopsided to favor specific hardware functions.

It's funny, the whole point of DirectX is to provide a platform of standards by which a game developer can design a game and hardware developer can design a card to achieve a uniform and efficient method to playing games. However, it now seems like hardware devs have to make individual drivers for every game.

Is this all because both software and hardware devs cut corners to reach compliance with DX?

RE: 3DMark07?
By StevoLincolnite on 5/25/2007 4:44:06 AM , Rating: 2
I find that 3DMark (for me) is only good at allowing me to see if I have overclocked too far or not enough, and the fact that it looks pretty and I want it as a screen saver.

RE: 3DMark07?
By peritusONE on 5/29/2007 7:22:35 PM , Rating: 2
I find that 3DMark (for me) is only good at allowing me to see if I have overclocked too far or not enough

That's what a lot of people fail to realize about 3DMark. Everyone always complains that it's worthless, but it's far from it. 3DMark is excellent as a quick and easy benchmark of your system to compare against other systems or configs with the SAME EXACT benchmark. That's the beauty of the program.

It seems like a lot of people bash it because it obviously doesn't give you real-world expectations. Well, guess what? When you benchmark a game, it will only give you real-world expectations for THAT game. Everything else will always be different (save for same-engine games). Why do you think tech sites bench different engines/games? People complain because you can't run 3DMark and get any reliable info to tell you how well Crysis (an example) is going to run, but you sure can't bench STALKER (another example) and expect it to give you any better of a clue.

3DMark is a comparison benchmark through and through, and it's damned better at it than any game you'll find.

Why not...
By Screwballl on 5/25/2007 4:53:09 PM , Rating: 1
...create GPU-specific versions of the games (2 discs per pack, 1 for NVIDIA, 1 for ATI/AMD), so they take advantage of whatever architecture you have. Since most games are not usually played for more than a few weeks, someone who buys the game now to use with their 2900XT shouldn't worry about whether it plays on their next-gen nVidia N800XTX SpecEd GPU. Very few games span multiple GPU generations and keep the attention of gamers for that length of time. Those games that do span multiple GPU generations, such as Counter-Strike and Unreal, will be the bloated games that have to be made to work with any reasonable card.

RE: Why not...
By boogle on 5/26/2007 4:58:04 PM , Rating: 2
They already do that; any game engine worth its salt has multiple render paths.

Copyright 2016 DailyTech LLC.