

The 12 inch long OEM ATI Radeon HD 2900 XTX

Sizing up the difference between the 12 inch and 9.5 inch cards
AMD's flagship ATI Radeon HD 2900 XTX fails to usurp the GeForce 8800 GTX's performance crown

AMD is close to unveiling its long-awaited R600-based ATI Radeon HD 2900 XT. DailyTech previously posted benchmarks comparing the Radeon HD 2900 XT and GeForce 8800 GTS 640MB.

Up until March of 2007, the spearhead of the ATI Radeon HD 2900-family was the upcoming ATI Radeon HD 2900 XTX. This model is the big daddy of AMD’s DirectX 10 lineup, poised to take on NVIDIA’s GeForce 8800 GTX and upcoming 8800 Ultra.

ATI Radeon HD 2900 XTX video cards feature 1GB of GDDR4 video memory, besting the GeForce 8800 GTX's 768MB of GDDR3. GPU manufacturers often wait until just weeks before launch to set the exact clock frequencies of each bin. Our Radeon HD 2900 XTX sample, released to board partners in the second week of April, features memory clocked at 2.02 GHz and a core clock of 750 MHz.

Physically, the Radeon HD 2900 XTX core is identical to the Radeon HD 2900 XT core.  Both feature 320 stream processors, but the XTX differs by bringing GDDR4 to the package.

Despite the lower reference clocks, DailyTech managed to push the ATI Radeon HD 2900 XT to an 845 MHz core and 1.99 GHz memory.

Unlike the ATI Radeon HD 2900 XT benchmarked yesterday, the HD 2900 XTX is a 12-inch card geared specifically towards OEMs and system integrators. Expect retail box ATI Radeon HD 2900 XTX graphics cards to use the shorter 9.5-inch design, as with the HD 2900 XT.

The test system specifications are as follows:
  • AMD ATI Radeon HD 2900 XT (745 MHz core, 800 MHz GDDR3)
  • AMD ATI Radeon HD 2900 XTX (750 MHz core, 1010 MHz GDDR4)
  • AMD ATI Catalyst v8.361 (drivers slated for retail)
  • NVIDIA GeForce 8800 GTX (650 MHz core, 1000 MHz GDDR3)
  • ASUS P5N32-E SLI (nForce 680i)
  • Intel Core 2 Extreme QX6800
  • Corsair XMS2 PC2-8500 (800 MHz, 5-5-5-18, 1T)
  • Acer X241W
It's important to note that the GeForce 8800 GTX is a vendor-overclocked board that ships with a 650 MHz core clock. A quick sanity check of the theoretical memory bandwidth the listed clocks imply follows below.
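As a reader-side sanity check (not part of the original article), the peak memory bandwidth implied by the clocks above works out to clock x transfers-per-clock x bus width in bytes. The bus widths here (512-bit for R600, 384-bit for the 8800 GTX) are assumptions drawn from the cards' published specifications, not figures stated in this article:

```python
def bandwidth_gb_s(mem_clock_mhz, bus_bits, pumps=2):
    """Peak memory bandwidth in GB/s: MHz x transfers per clock x bus bytes."""
    return mem_clock_mhz * 1e6 * pumps * (bus_bits // 8) / 1e9

# Memory clocks from the test-system list; bus widths assumed, not stated above.
cards = [
    ("HD 2900 XT   (800 MHz GDDR3, 512-bit)",  800, 512),
    ("HD 2900 XTX (1010 MHz GDDR4, 512-bit)", 1010, 512),
    ("8800 GTX    (1000 MHz GDDR3, 384-bit)", 1000, 384),
]
for name, clock, bus in cards:
    print(f"{name}: {bandwidth_gb_s(clock, bus):6.1f} GB/s")
# HD 2900 XT: 102.4, HD 2900 XTX: 129.3, 8800 GTX: 96.0
```

On paper, then, the XTX carries roughly 26% more memory bandwidth than the XT and about 35% more than this overclocked GTX, which makes the flat results below all the more striking.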

Frames per second, 1280x1024

Game                         Radeon HD 2900 XTX   Radeon HD 2900 XT   GeForce 8800 GTX
Company of Heroes                  97.1                  N/A                128.6
F.E.A.R.                           84                    79                 125
Half Life 2: Episode 1             117.9                 118.9              157.1
Elder Scrolls IV: Oblivion         100.3                 101.4              110.5

Frames per second, 1600x1200

Game                         Radeon HD 2900 XTX   Radeon HD 2900 XT   GeForce 8800 GTX
Company of Heroes                  73.7                  N/A                94.5
F.E.A.R.                           58                    54                 90
Half Life 2: Episode 1             91.5                  90.8               134.2
Elder Scrolls IV: Oblivion         87.9                  86.2               102.9

Frames per second, 1920x1200

Game                         Radeon HD 2900 XTX   Radeon HD 2900 XT   GeForce 8800 GTX
Company of Heroes                  53.2                  N/A                80
F.E.A.R.                           53.7                  52                 81.7
Half Life 2: Episode 1             68.2                  67.8               100.2
Elder Scrolls IV: Oblivion         75.1                  73.4               98.4

The benchmarks DailyTech performed yesterday utilized release candidate drivers. Today's tests used retail drivers ATI released to its board partners.

The less-than-stellar benchmark numbers come as no surprise to board partners and AMD insiders. Two independent ATI board builders told DailyTech that the Radeon HD 2900 XTX will not be a part of their initial portfolios. Given the additional memory cost and the bulky card footprint, it seems unlikely the XTX will ever see the light of day.


Comments

Very strange...
By tenks on 4/26/2007 2:13:30 AM , Rating: 4
I am really surprised with how crappy the performance is for the XTX... This seems really odd, as the XT beat the GTS, right? Also, I noticed the clocks for the 8800GTX are higher than the 575/900 MHz clocks of the original. Are you using an OCed 8800?




RE: Very strange...
By KristopherKubicki (blog) on 4/26/2007 2:15:20 AM , Rating: 3
It's a vendor OC (it's best if we don't reveal the exact card), but you are correct, those aren't the reference clocks.


RE: Very strange...
By mkruer on 4/26/2007 3:12:27 AM , Rating: 1
I seriously question AnandTech's benchmarking ethics. Either do stock-to-stock OR best-overclocked-to-best-overclocked. This stock-to-overclocked is crap. The only time stock-to-overclocked is legit is when you are comparing the same make and model to set up a metric for how the part scales with clock speed.

AnandTech is becoming, or perhaps has become, another THG.


RE: Very strange...
By oddity21 on 4/26/2007 3:17:54 AM , Rating: 2
Since they had a limited time window to benchmark the cards, you shouldn't expect a thorough review.

Plus, it doesn't really matter if the XTX's performance is this poor.


RE: Very strange...
By mkruer on 4/26/2007 3:31:25 AM , Rating: 3
That makes very little sense. Limited time window, yes, but if they have a test system that is fully reproducible, then why can they not come back, put in a stock NVIDIA card, and generate the results for a stock NVIDIA comparison? Or are you saying that the test is not reproducible? If that's the case, then you might as well chuck out this entire article, because it's pointless.


RE: Very strange...
By oddity21 on 4/26/2007 3:34:49 AM , Rating: 2
I agree that they could've (and should've) done that. Then again, this is better than nothing, IMO.

Let's just wait for more information. I'm as shocked as everyone else by this.


RE: Very strange...
By mkruer on 4/26/2007 3:54:49 AM , Rating: 2
They still can. If Kris is still reading this, it would take all of a minute to add the note.


RE: Very strange...
By DTAllTheBest on 4/28/2007 5:27:20 PM , Rating: 1
Yeah! Everybody, let's wait. I hope the GTX will not be beaten up by this card.


RE: Very strange...
By Ard on 4/26/2007 4:55:39 AM , Rating: 1
Not quite sure why anyone is complaining about the use of an OCed 8800. That's pretty much all you can buy. Who, in this ridiculously competitive market, uses stock clocks anymore?


RE: Very strange...
By theapparition on 4/26/2007 8:00:26 AM , Rating: 5
Agreed, but I would like to point out that the 8800 is stock. Stock is what you buy, and if the vendors are factory overclocking with full warranty support, then so be it.


RE: Very strange...
By Dactyl on 4/26/2007 2:02:12 PM , Rating: 2
quote:
Agreed, but would like to point out the 8800 is stock. Stock is what you buy
What's stock?
Stock is when you clock, no less than 30 up
Stock is when your chips, ain't safe because of heat


RE: Very strange...
By coldpower27 on 4/26/07, Rating: 0
RE: Very strange...
By Sunrise089 on 4/26/2007 3:18:47 AM , Rating: 4
Good point, but this is DailyTech, not AT. Though we all know most of the staff here are AT vets, they have to be different companies for NDA purposes.


RE: Very strange...
By mkruer on 4/26/2007 3:53:10 AM , Rating: 2
Yeah, that didn't occur to me until after I posted. But still, with such a close association, I would assume they would be using similar if not identical ethics clauses.


RE: Very strange...
By mkruer on 4/26/2007 3:20:59 AM , Rating: 2
I didn't mean to sound pissy, but at the very least, when anyone does one of these "comparisons", they should note it on the graphs listed above. Place an asterisk noting that the results listed for a card are from a non-stock card. Is that too much to ask?


RE: Very strange...
By JSK on 4/26/2007 3:24:38 AM , Rating: 4
quote:
It's important to note that the GeForce 8800 GTX is a vendor-overclocked board that ships with a 650 MHz core clock.


Maybe if you just read the article, they wouldn't need asterisks?


RE: Very strange...
By mkruer on 4/26/2007 3:39:31 AM , Rating: 2
I did see that, that's why I made the comment. Apparently you have never written a proper thesis paper; forgetting to reference your notes usually leads to a failing grade on the paper.


RE: Very strange...
By TomZ on 4/26/2007 9:04:15 AM , Rating: 2
Good thing this is a news site then. :o)


RE: Very strange...
By behemothzero on 4/26/07, Rating: 0
RE: Very strange...
By analex on 4/29/2007 5:10:38 PM , Rating: 2
quote:
I did see that, that's why I made the comment. Apparently you have never written a proper thesis paper; forgetting to reference your notes usually leads to a failing grade on the paper.

Maybe you're the one who has never written a proper thesis paper. That bolded part is bad sentence construction.


According to you, what would be correct?

Changing the punctuation, like so: "I did see that -- that’s why I made the comment." Or reconstructing the sentence completely? Like so: "I made the comment because I saw that?"

Perhaps, I, too, am wrong?


RE: Very strange...
By deeznuts on 4/26/2007 1:14:27 PM , Rating: 4
quote:
I did see that, that's why I made the comment. Apparently you have never written a proper thesis paper; forgetting to reference your notes usually leads to a failing grade on the paper.
If you saw it, then why did you even complain about ethics? Questionable ethics is when you compare stock vs. overclocked and don't even mention it. He made full disclosure here; get over it.


RE: Very strange...
By Crank the Planet on 4/27/2007 1:23:27 PM , Rating: 3
Also agreed that he disclosed the fact that he was testing a stock card against an OC'd card. It doesn't matter if you can buy it OC'd and it has a full warranty: OC'd is OC'd. That means it has been MODIFIED from how the manufacturer created it. The fact that you can buy a Saleen and know you are getting a high-performance car is the same thing; it's no longer just a Mustang.

Another note: even though he puts in his disclaimer, the title is what's misleading. It has been worded to create the shock that everyone is experiencing so more people will read it. That's called journalism, lol


RE: Very strange...
By Ard on 4/27/2007 2:39:19 PM , Rating: 2
Get off the overclocked vs. stock comparison already. If you buy a factory-overclocked card, that's stock, out-of-the-box performance. End of story. That's what the XT/XTX are going to be competing against, or do you expect consumers to look only at stock cards because anything else wouldn't be a fair comparison? The point is, the XTX would get slapped around by a stock GTX as well. Hell, ATI admitted as much. Leave it alone; it's done.


RE: Very strange...
By defter on 4/26/07, Rating: 0
RE: Very strange...
By mkruer on 4/26/07, Rating: 0
RE: Very strange...
By JSK on 4/26/2007 4:02:58 AM , Rating: 3
The 8800Ultra clocks were posted over at Dell's site earlier this week...


RE: Very strange...
By cheetah2k on 4/26/2007 4:10:38 AM , Rating: 2
Post a link dude


RE: Very strange...
By JSK on 4/26/2007 4:15:35 AM , Rating: 4
RE: Very strange...
By cheetah2k on 4/26/2007 5:44:45 AM , Rating: 2
cheers!


RE: Very strange...
By enlil242 on 4/26/2007 8:50:55 AM , Rating: 5
My first thought was like yours: stock-to-stock or nothing. But then I thought about it and came to the conclusion that I have been waiting for a DX10 card until ATI's XTX released, expecting it to stomp on the 8800GTX, not catch up to it.

I would have thought, if anything, the XTX would be on par with an OC'd 8800. To wait this long for ATI to merely meet the 8800's performance is a failure in my opinion. Let's hope further benchmarks reveal a bit more.


RE: Very strange...
By CBone on 4/26/2007 11:08:48 AM , Rating: 3
If it comes from the factory at those clocks, it is stock. Would you prefer they underclock it to reference clocks? What would that prove?


RE: Very strange...
By coldpower27 on 4/26/2007 1:02:53 PM , Rating: 2
It's legitimate if you're comparing a retail-available OC 650 MHz card. I know it's not the NVIDIA reference design, but this card is available to be bought, so it's a retail sample vs. a retail sample.

They will probably do a reference-vs-reference comparison later.


RE: Very strange...
By ninjit on 4/26/2007 6:23:31 PM , Rating: 2
It appears that some people still need to be reminded: DailyTech is NOT AnandTech; they spun off and formed their own website for news.

AnandTech now uses DailyTech as its daily news feed (the headlines section).


RE: Very strange...
By retrospooty on 5/2/2007 11:32:45 AM , Rating: 2
"I seriously question Anandtechs bench marking ethics."

YOu know, it IS mentioned in the article that the 8800 is factory OC'd. Its not like they are trying to SKU the data. The point of the article is that the XTX seems not too fast.


RE: Very strange...
By Chadder007 on 4/26/2007 8:43:04 AM , Rating: 2
I wonder if there is a major flaw in the drivers perhaps?? At least I hope so anyway.


RE: Very strange...
By Anh Huynh on 4/26/2007 2:22:06 AM , Rating: 5
Also note the HD 2900 XT and XTX share the same exact GPU, whereas the GeForce 8800 GTS and GTX differ in shader processing capabilities. The GTS has 96 stream processors and the GTX has 128, which makes a larger performance difference than slightly different clock speeds.


RE: Very strange...
By redbone75 on 4/26/2007 2:30:02 AM , Rating: 2
That seemed odd to me as well: the HD 2900 XT bested the 8800GTS, but the 2900 XTX not only failed but failed miserably against the GTX? I would like to wait for retail cards and a more thorough test; however, it would really suck if there were no retail availability of an XTX.


RE: Very strange...
By redbone75 on 4/26/2007 2:36:13 AM , Rating: 5
Also, looking at the numbers, there would seem to be no reason whatsoever for someone to purchase an XTX, seeing that it only provides on the order of 5-6 fps over the XT. What?! It was even slower than the XT in Half Life 2: Episode One by a frame! There's got to be something seriously wrong with either the card, the drivers, or something. That, or the XTX simply sucks, but I find that hard to believe considering the specs of the card and the stellar showing the XT had against the GTS.


RE: Very strange...
By sol on 4/26/2007 2:38:21 AM , Rating: 4
The difference between 8800 GTS and GTX is huge though. The GTS is actually too crippled if you ask me. :-)


RE: Very strange...
By defter on 4/26/2007 3:40:45 AM , Rating: 5
Yes, since GTS has some units disabled.

Compared to GTS, GTX has:
- 50% more shader power ((1350x128)/(1200x96)=1.5)
- 38% more ROP power ((575x6)/(500x5) = 1.38)
- 53% more TMU power ((575x128)/(500x96) = 1.53)
- 35% more memory bandwidth (86.4/64 = 1.35)
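Those ratios check out; a minimal recomputation, taking the poster's clocks and unit counts as given rather than independently verified:

```python
# GTX vs. GTS throughput ratios, using the figures quoted in the comment above.
gtx = {"shader": 1350 * 128, "rop": 575 * 6, "tmu": 575 * 128, "bandwidth": 86.4}
gts = {"shader": 1200 * 96,  "rop": 500 * 5, "tmu": 500 * 96,  "bandwidth": 64.0}

for metric in gtx:
    print(f"{metric}: {gtx[metric] / gts[metric]:.2f}x")
# shader: 1.50x, rop: 1.38x, tmu: 1.53x, bandwidth: 1.35x
```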


RE: Very strange...
By oddity21 on 4/26/2007 2:47:00 AM , Rating: 2
Architectural efficiency problems, maybe? And isn't XTX just an XT with 1GB memory, since the clock difference is so minute?


RE: Very strange...
By KristopherKubicki (blog) on 4/26/2007 2:50:48 AM , Rating: 2
It's my thinking that the XTX could be set with higher clocks (remember, the final clocks *still* aren't set in stone), but the big thing is that the memory doesn't really seem to show better performance. I'm pretty convinced, though, that OEMs aren't picking this card up.


RE: Very strange...
By oddity21 on 4/26/2007 3:11:01 AM , Rating: 3
Is it possible for you guys to run another benchmark using seriously video-memory-intensive games? Say, Oblivion with Qarl's Texture Pack 3 plus 16xAF; that thing eats video memory like mad.

Doubters should take note of the last paragraph. If the 2900XTX were as awesome as the hype, board builders wouldn't have dropped it from the launch lineup.

This is bad for the competition, IMO.


RE: Very strange...
By cheetah2k on 4/26/07, Rating: 0
RE: Very strange...
By oddity21 on 4/26/2007 4:21:47 AM , Rating: 5
quote:
Imagine an XT with GDDR4 mem and increased clocks? Nice!


2900XT + GDDR4 + increased clocks = 2900XTX

Is it not? XD


RE: Very strange...
By JSK on 4/26/2007 4:23:38 AM , Rating: 2
I'd mod you up if I could.

If they are going to delay the XTX till 65nm, that may make sense, but it certainly isn't ready now.


RE: Very strange...
By cheetah2k on 4/26/2007 5:44:27 AM , Rating: 2
Yes, it would be an XTX, but it seems that the XT base is more optimised. I wonder what the board revision is on the XTX tested in this part-review, compared to the XT tested...

Oddity21, I know you weren't being facetious.


RE: Very strange...
By oddity21 on 4/26/2007 8:40:25 AM , Rating: 2
While I wouldn't consider optimization to be a factor in these matters in other circumstances, this benchmark forces it into my mind. It's shocking how more memory and greater clock speed result in a performance dip.

I still hope that everything turns out differently on May 14th, for the sake of the competition. Doesn't look like it though. *sigh*

Still, it isn't like this is going to be ATI/AMD's end. The 2900XT is set to comfortably thrash the 8800GTS, while the real battle will be fought between the 8600GTS/8800GS and the 2600 series cards anyway.


RE: Very strange...
By defter on 4/26/2007 3:36:02 AM , Rating: 4
That's not surprising; after all, the core clock is almost the same and the only differences are in memory clock and size:
- the extra 512MB will not make any difference in today's games
- the XT already has ~100GB/s of memory bandwidth, which is quite plenty. Increasing memory bandwidth by, e.g., 20% will not bring a significant performance increase without a corresponding core clock increase.

Thus, it's not surprising that the fate of the XTX is still unknown: it has roughly the same performance with significantly higher production costs (more expensive memory, and twice as much of it), which naturally would lead to lower margins for AMD and their board partners.


RE: Very strange...
By ogreslayer on 4/26/2007 7:27:29 AM , Rating: 2
Why would you think it's strange that the clock difference over an XT isn't there? Unlike the GTS-and-GTX relationship, the XT and XTX are the same core with all the same features enabled.

This is exactly where I thought the XTX would be unless it had a massive increase in speed. Although the drop at 1920x1200 relative to the GTX seriously surprised me; it seems that the 1GB of GDDR4 is not doing what they thought it would. I'm sure if they get it to 1200 or so, it will sort itself out. But they'd also need to move the core clock into the 800 range.

And it's not like an XT hoses a GTS. Beats it, sure, but it is not smacking it up and down the street like it stole the XT's money. This was always going to be the problem with R600, and is probably why they are looking to compete on price rather than performance. The card is not that powerful and is now way too late for its own good.

Even at stock the cards would perform similarly; I'd assume the XTX and the GTX would swap positions in some tests. The real point is that the Ultra is supposed to be nothing more than a cherry-picked OC'd GTX. And it's not like a good chunk of the companies don't sell OC cards.

I'd assume from the clocks, however, that the card was a certain 8800GTX, one with water-cooling. That's just a little on the unfair side, even if you are trying to point out the foolishness of the XTX's clocks when an Ultra or even more powerful products are on the way.


RE: Very strange...
By aguilpa1 on 4/26/2007 8:33:34 AM , Rating: 2
Well, no, not necessarily; you don't need H2O cooling to match 650 MHz GPU / 1000 MHz memory on the GTX. I have two running SLI and they can both run at that speed with no issues, air cooled. They actually run set to 650 MHz GPU and 1024 MHz memory. The GTX is just that good and that stable.


RE: Very strange...
By coldpower27 on 4/26/2007 1:07:49 PM , Rating: 2
There is no difference between the XT and XTX in terms of shader processors, so the only increase in shader power comes from the increased core clock. With the GTS-to-GTX 8800 line, there is a 33% increase from the increase in units alone, plus the clock speed hike.


RE: Very strange...
By wolf on 4/26/07, Rating: -1
RE: Very strange...
By KristopherKubicki (blog) on 4/26/2007 5:59:15 PM , Rating: 5
If we took legal advice from everybody who posted a forum comment we would have already been out of business.


RE: Very strange...
By JumpingJack on 4/26/2007 10:53:52 PM , Rating: 2
:) :) :) He did not sound very lawyer-like in his admonishment, did he?

Isn't it funny to see people react in such a bizarre fashion when a review/preview/news article produces data they do not like to see ;) ....

Ostriches and sand.... ostriches and sand...


RE: Very strange...
By Ard on 4/26/2007 6:23:43 PM , Rating: 3
Ahh, laymen who think they know the law. Common sense alone should tell you that legal agreements and contracts are only binding on those who sign them.


RE: Very strange...
By wolf on 4/26/07, Rating: -1
RE: Very strange...
By IntelGirl on 4/27/07, Rating: -1
RE: Very strange...
By scrapsma54 on 4/27/07, Rating: 0
RE: Very strange...
By Ard on 4/27/2007 3:06:13 PM , Rating: 3
Wow, 24xAA, huh? Nothing more than a pissing contest, considering you're hard-pressed to see a difference between 8xAA and 16xAA. Next.


RE: Very strange...
By falc0ne on 4/26/2007 6:30:46 PM , Rating: 2
I was sincerely expecting a lot more performance from ATI/AMD, and by the numbers I see, their "ace" is on par with the 8800GTS or worse (let alone a vendor-overclocked 8800GTS, i.e. eVGA). I have no idea what's on their mind. How can you come up with this and expect to compete? If they don't cut prices they don't stand a chance.


RE: Very strange...
By HurleyBird on 4/27/07, Rating: -1
Numbers are FAKE People
By NextGenGamer2005 on 4/26/2007 4:36:05 AM , Rating: 5
Just let me show you what ATI's current Radeon X1950 XTX scores in these same games, with the same settings:

Company of Heroes - 1600x1200 - 69.2fps
Company of Heroes - 1920x1200 - 53.2fps

F.E.A.R. - 1280x1024 - 70fps
F.E.A.R. - 1600x1200 - 53fps
F.E.A.R. - 1920x1200 - 47fps

Half-Life 2: Episode One - 1280x1024 - 107fps
Half-Life 2: Episode One - 1600x1200 - 82fps
Half-Life 2: Episode One - 1920x1200 - 69fps

I would show you the Elder Scrolls IV: Oblivion scores as well, except that in this case the DailyTech numbers for the GeForce 8800 GTX are completely out of whack. I'm not sure how they got Oblivion running on all-high settings at 1920x1200 at 98fps, but in EVERY other place on the Internet, an XFX GeForce 8800 GTX XXX can only manage 69fps on an INDOOR scene at that resolution...let alone an outdoor one.

Now, look closely at those Radeon X1950 XTX framerates (they all came from past X-bit Labs reviews). Do you guys really think ATI would spend all these months and 700+ million transistors on a card that never performs more than 5% better, and in some cases actually WORSE, than their current champ? Yeah, I didn't think so either...
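For what it's worth, here is a small reader-side script pairing the quoted X-bit Labs X1950 XTX numbers with the HD 2900 XTX figures from the tables above, to put the "never more than ~5% better" claim in concrete terms (pairings as quoted; treat the output as rough deltas, not a verified comparison):

```python
# (X1950 XTX fps as quoted above, HD 2900 XTX fps from this article's tables)
pairs = {
    "Company of Heroes @ 1600x1200": (69.2, 73.7),
    "Company of Heroes @ 1920x1200": (53.2, 53.2),
    "F.E.A.R. @ 1280x1024":          (70.0, 84.0),
    "F.E.A.R. @ 1600x1200":          (53.0, 58.0),
    "F.E.A.R. @ 1920x1200":          (47.0, 53.7),
    "HL2: Episode One @ 1280x1024":  (107.0, 117.9),
    "HL2: Episode One @ 1600x1200":  (82.0, 91.5),
    "HL2: Episode One @ 1920x1200":  (69.0, 68.2),
}
for name, (x1950, hd2900) in pairs.items():
    delta = 100 * (hd2900 - x1950) / x1950
    print(f"{name}: {delta:+.1f}%")  # ranges from about -1% to +20%
```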

I am extremely disappointed in DailyTech right now...




RE: Numbers are FAKE People
By JSK on 4/26/07, Rating: 0
RE: Numbers are FAKE People
By JSK on 4/26/2007 4:41:02 AM , Rating: 2
If thats a testament to the drivers, early hardware, GDDR4 issues, who knows.


RE: Numbers are FAKE People
By Sunrise089 on 4/26/2007 6:00:33 AM , Rating: 1
Nice research, and you're right: AMD wouldn't design a card to perform in that manner. But the consensus seems to be that DT's benches are legit, so we fall back on either the design being (badly) flawed or something being wrong with the drivers.


RE: Numbers are FAKE People
By hadifa on 4/26/2007 6:18:21 AM , Rating: 5
I checked Tom's Hardware's VGA chart for the Oblivion score. It reports that Oblivion outdoor, 1920x1200x32, no AA, 8xAF, max quality, HDRR manages 28 FPS.

From Anh's description I understand that the test was outdoors. While there are different outdoor areas where performance will differ, I am still not sure the 98.4 FPS for the 8800 GTX is correct.

Looking at the numbers for the 2900 XTX, they seem to be very close to the 1950 XTX's, which is difficult to believe.

For example:

Company of heroes 1280*1024:
1950 XTX 99 (tweaktown) E6600
2900 XTX 97 (dailytech) QX6800

Company of heroes 1600*1200:
1950 XTX 70 (tweaktown) E6600
2900 XTX 73 (dailytech) QX6800

http://www.tweaktown.com/articles/1084/6/page_6_be...

FEAR 1280*1024:
1950 XTX 80 (tomshardware) No softshadow 4AA 8AF , X6800
2900 XTX 84 (dailytech) with softshadow 4AA 16AF , QX6800

FEAR 1600*1200:
1950 XTX 57 (tomshardware) No softshadow 4AA 8AF , X6800
2900 XTX 58 (dailytech) with softshadow 4AA 16AF , QX6800

http://www.tomshardware.com/2007/02/12/the_amd_squ...

In F.E.A.R. the soft shadows can make a big difference, but in Company of Heroes it seems the 2900 XTX has no advantage over the 1950 XTX.


RE: Numbers are FAKE People
By DingieM on 4/26/2007 7:18:25 AM , Rating: 2
Looking at your post, I now believe the XTX, with the correct revision and mature drivers, will utterly crush the 1950XTX, as it should.


RE: Numbers are FAKE People
By hadifa on 4/26/2007 7:40:41 AM , Rating: 2
Looking at Sven's numbers, the difference between the 2900XTX and the 8800GTX is more like the difference between the previous and current generations rather than between same-generation cards.

Just a look at the 2900XTX's specs guarantees that the weak results can't be because of limited bandwidth or insufficient shader units or the like. It is a flaw, and one can only hope it is a simple problem that can be easily fixed.


RE: Numbers are FAKE People
By coldpower27 on 4/26/2007 1:21:20 PM , Rating: 2
These results are actually in line with what we should expect: there is no difference in terms of shader units, and given the core clocks, 745 vs. 750, this is within margin-of-error territory. Add to that the fact that GDDR4 typically has higher latency, and that having 1GB is sometimes slower.

Not to mention that more memory bandwidth would only come into play if the card were starved for bandwidth in the first place.

At over 100GB/s for the XT, it already has plenty to work with. So 20% more isn't going to really help when the limitation is shader power.


RE: Numbers are FAKE People
By Regs on 4/26/2007 10:32:59 PM , Rating: 2
I would agree. The XTX just doesn't offer anything more to play with. AMD's line ends with the XT. End of story. They have nothing to compete with on the higher end.

Which would explain why AMD wanted to wait and launch the mid-range with the high end: because they have no high end.

So let's hope that when 65nm comes out they are able to pump some life into the high end, or else they'll be locked into another price war with NVIDIA. NVIDIA could crush AMD right now the same way Intel is: offer a higher-performing and higher-quality product at a competitive price.

Loyal as I am to AMD, after 5 long years with nothing to show, they literally should just knock on my door and piss on my feet. That's how angry, disappointed, and terribly violated I feel as such a loyal customer.


RE: Numbers are FAKE People
By elmikethemike on 4/26/07, Rating: 0
RE: Numbers are FAKE People
By coldpower27 on 4/26/2007 1:23:09 PM , Rating: 2
The numbers are accurate given the core clocks and resolutions covered; it seems that the XTX needs a core clock bump to over 800 MHz to become competitive with the GTX.


RE: Numbers are FAKE People
By tungtung on 4/26/07, Rating: 0
RE: Numbers are FAKE People
By slacker57 on 4/26/2007 1:36:38 PM , Rating: 5
quote:
Just look how flashy this site has been reporting the iPhone, and didn't even bother to mention that LG has actually a product similar to it and its out in the market now (the LG Prada).


http://www.dailytech.com/article.aspx?newsid=5758

Ooh, irrelevant AND incorrect. A double Whammy!


RE: Numbers are FAKE People
By coldpower27 on 4/26/2007 1:55:12 PM , Rating: 4
Worthless unless the testing suite is standardized and done by the same reviewer as you then introduce a huge margin of error.

Company of Heroes done on Xbitlabs normalizing to the X1950 XTX values.

8800 GTX
1600x1200 88.7
1920x1200 71.6

Half Life 2: Episode One It looks like your running into serious CPU limitations, as the decrease is only likely coming from overhead rather then shader limitation,

Performance if your shader limited should be close to these values:

Normalized to 1920x1200 as 100%
1600x1200 = 83.3%
1280x1024 = 56.9%

The figures you have provided show that the X1950 XTX is shader limited in Half Life 2: Episode 1 beyond 1280x1024.

1280x1024 = 107
1600x1200 = 82
1920x1200 = 69

Now let's see the 8800 GTX figures in those benchmarks

1280x1024 = 108
1600x1200 = 104
1920x1200 = 98

1600x1200 is 83% the workload of the 1920x1200 resolution, but the 8800 GTX is still providing 94% the performance of the 19x12 res at this resolution.

It likely going by this analysis the 8800 GTX is still CPU limited at 19x12 and like memory bandwidth limited, as the bottleneck isn't shader power. It's not surprising that the results here are much better as the Athlon FX-60 was a CPU bottleneck at Xbit, given they are using Core 2 QX6800 here with 2.93GHZ and the 20% advantage per clock it gives, give or take.
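A sketch of that normalization, assuming per-frame GPU workload scales linearly with pixel count (resolutions are exact; the fps values are the ones quoted in this comment):

```python
resolutions = {"1280x1024": 1280 * 1024,
               "1600x1200": 1600 * 1200,
               "1920x1200": 1920 * 1200}
base = resolutions["1920x1200"]
for res, px in resolutions.items():
    print(f"{res}: {100 * px / base:.1f}% of the 1920x1200 pixel workload")
# 56.9%, 83.3%, 100.0%

hl2_fps = {"X1950 XTX": {"1600x1200": 82,  "1920x1200": 69},
           "8800 GTX":  {"1600x1200": 104, "1920x1200": 98}}
for card, fps in hl2_fps.items():
    ratio = fps["1920x1200"] / fps["1600x1200"]
    print(f"{card}: 1920x1200 runs at {100 * ratio:.0f}% of its 1600x1200 rate")
# X1950 XTX: ~84% (tracks the pixel load -> GPU limited)
# 8800 GTX:  ~94% (well above the pixel load -> CPU limited)
```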

In F.E.A.R., without knowing which benchmarks were run, it's hard to say.

The article you reference is also quite old, from only a month or so after the 8800 launch; the drivers have matured since then, so DailyTech could be using a newer set, giving the 8800 GTX a considerable boost. I hear the FW 15x drivers are quite good.


RE: Numbers are FAKE People
By JumpingJack on 4/26/2007 10:56:24 PM , Rating: 3
quote:
Do you guys really think ATI would spend all these months and 700+ million transistors on a card that never performs more than 5% better, and in some cases actually WORSE, than their current champ?


That is precisely what AMD did when they released Brisbane...


RE: Numbers are FAKE People
By Azured on 4/27/2007 12:32:47 PM , Rating: 3
Brisbane is simply a process shrink. It may not do much for performance (in fact it shouldn't do anything), but it does make the core a bit smaller, and thus cheaper to manufacture.


RE: Numbers are FAKE People
By JumpingJack on 4/28/2007 1:07:54 AM , Rating: 3
A shrink that took a hit in performance due to a latency increase in the L2 cache... not much, but it did underperform at equivalent clocks...


RE: Numbers are FAKE People
By scrapsma54 on 4/27/2007 12:39:35 PM , Rating: 1
Good call. I also stated in one of my posts how they failed to mention what OS they ran under. I believe AMD and Microsoft have been in close collaboration to design these graphics cards from the ground up for Vista, not backwards-compatible for DX9; therefore, these numbers are shams until DX10 games are released. Then we can see who really has their spine pressed to the back of the throne.


RE: Numbers are FAKE People
By Hawkido on 4/30/2007 6:13:00 PM , Rating: 2
I can't say the numbers are fake, and I really don't like entertaining the thought.

My thoughts are:

Remember when there was a DirectX change (8 or 9, I can't remember) and the cards made for the newer DirectX actually performed slower on the older DirectX games than the last generation of cards did?

The 2900 IS a DirectX 10 card. So is the 8800. While you can compare their DirectX 9 capabilities and decide which you would buy now, I will not buy one till I see DirectX 10 benchies.

How often do you see Ferraris running rally? It's fine and interesting to see how they perform in DX9, but that is not what they were designed for. And if you can't wait for a new card, then get the one that performs best in DX9, because we really don't know how long it will be before DX10 sees the light of day. We really won't know which design performs better until DX10 games are out and the crappo games that come out first are brushed aside.

I think the 8800 has more brute strength, but the AMD/ATI option has more finesse. The ATI, with 320 individual stream processors, will be able to better adapt to a changing and varied environment. From what I have read about the 8800, its shaders are programmed by the environment and cannot be changed just because you are facing a different direction (which can lead to lower minimum FPS and exceedingly high maximum FPS), whereas the ATI can change the programming of each shader with every rendering of every frame, yielding a far more consistent FPS and thus making far better use of its many simple shaders than the brutish 8800. If games are all programmed the way the 8800 is designed, then the ATI 2900 will be destroyed, because it will not be able to flex (the program will not allow it), and the 8800 will power its way through each frame while the 2900 struggles to keep up.

Remember, brute strength can be beaten by finesse. Just compare a V6 race car to a V8. The V8 will usually be faster in the straights (where it is easy), but can't handle the corners as fast.

Do you really need 50-500 FPS? Wouldn't a consistently smooth 60 FPS be far more preferable? Guess when the 500 FPS is going to drop to 50 FPS... right when you need it the most, because all the explosions are happening, debris is flying, guns are blazing, planes are crashing; you get the picture. It's always when you need it the most that your card is going to chug. The card that can reprogram its shaders on the fly to render what is needed (be it pixels, vertices, or physics) will best suit you in the end by not dropping the quality and frame rate on you.


Horrible
By JackPack on 4/26/2007 2:03:41 AM , Rating: 1
And the XTX draws how many watts?

ATI might have a better mid-range product, but nVidia having a halo product like 8800 GTX/Ultra - that's priceless.




RE: Horrible
By PrinceGaz on 4/26/2007 4:07:57 AM , Rating: 4
Priceless is a rather ironic way of describing it; having a better top-end product like the 8800GTX/Ultra is indeed "priceless" if you mean that they make very little money out of it, due to the relatively tiny number sold.

The mid-range is where the money is at, and having the best performing high-end part isn't going to make many people buy lackluster 8600 series cards. As for the 8500GT, on paper it looks like it will perform absolutely dreadfully even compared to the 8600GT, and be even poorer value despite its lower price.


RE: Horrible
By Mudvillager on 4/26/2007 4:42:25 AM , Rating: 2
Actually they've sold incredibly well...


RE: Horrible
By JackPack on 4/26/2007 4:45:53 AM , Rating: 2
quote:
The mid-range is where the money is at, and having the best performing high-end part isn't going to make many people buy lackluster 8600 series cards.


Of course it will. That's why it's called a halo effect. Most uninformed consumers simply ask whether nV or ATI is better. They don't spend the effort analyzing the GT/GTS benchmarks. The rationale is that if nV has the best card, then the rest of their products can't be that bad.

If vendors like Dell recommend the 8800 GTX with their gaming systems, people automatically assume any card from nV is the way to go. It's like the E6300 vs. the 5600+ at the same price. Sure, the 5600+ performs better, but people are still buying the E6300 because Intel claims Core 2 Duo is the "world's best processor" and everyone seems to agree.


RE: Horrible
By StarOrbiter on 4/26/2007 6:10:14 AM , Rating: 2
My AMD fanboy friend, if I'm not mistaken, that is after AMD's price cuts. When put up against the recent Intel price cut, the E6420, which is priced where the E6300 was, is better than the 5600+.


RE: Horrible
By Martimus on 4/26/2007 9:43:34 AM , Rating: 2
Wow, that came out of left field. What is it with fanboy attacks that have very little to do with the actual post? I am getting so tired of reading these unfounded attacks people write to each other that I actually created an account just to point out how stupid they sound.


RE: Horrible
By coldpower27 on 4/26/2007 2:01:08 PM , Rating: 4
Give him some slack; maybe he wasn't aware of the impending Core 2 Duo price drops on the 22nd of April. :)

Once normalized, the E6420 goes against the 5600+, and it's a pretty decent matchup. I would still say the AMD is better in performance, but it's now marginal: a 5200+ can already take on the E6420, the 5600+ lies between the E6600 and E6420, and the 6000+ ties the E6600 overall; they actually trade blows.


RE: Horrible
By bl1nd on 5/10/2007 8:18:14 PM , Rating: 2
Lol, damn AMD fanboys... look, this is how it is: right now AMD has totally lost the war. A 5600+ costs the same as an E6300 because they had to cut prices to AT LEAST keep a space in the market, because they were being annihilated. It would be amazing to know how much money AMD is making per CPU vs. Intel. When Core 2 Duos came out, a 4200+ cost the same as an E6400.

You're putting the second-fastest CPU from the AMD X2 line against the seventh or so in the Core 2 Duo line.


RE: Horrible
By Garson007 on 4/26/2007 10:09:12 AM , Rating: 2
I agree 100%. People not in the know normally only think about the high-end cards. The same goes for any marketed goods. In the end, people only buy the budget or mainstream cards, not really knowing how the performance actually differs. In my country you rarely go into a shop and see anything other than budget cards in stock, carrying the brand of the high-end champion.

The people actually in the know go for the high-end cards because they want the best performance; they -need-* the best performance.

Either way, AMD loses again (referring to CPUs). People are not the researchers (GPU) and long-term investors (CPU) you think they are. Your marketing is completely wrong.

*Sarcasm


RE: Horrible
By kilkennycat on 4/26/2007 4:33:34 PM , Rating: 4
400,000 GTX/GTS cards at last count... two months ago!
Not bad for the highest-profit-margin products in the nVidia/partner lineup.


Something is wrong, no?
By Sunrise089 on 4/26/2007 3:16:30 AM , Rating: 2
I'm not able to take the time to look back at AT's GTX and GTS reviews to see if either of them is performing well off or above the pace, but if we assume DT is getting the performance they should be out of the NVIDIA parts, then it's obvious something (other than the on-paper design) of the XTX is wrong. There is no reason for the XTX to be no faster (and sometimes slower) than the XT. I have to assume the GDDR4 is having a problem with drivers or something else, because even if it's slower than the GTX, the XTX shouldn't be slower than its little brother.

One thing I would predict: if AMD knows it won't sell many of these, they might as well speed-bin the GPUs and raise the clocks so the few they do sell look better in benches.

The moral of this story, if AMD can't fix things, is to learn from NVIDIA's example: when you don't own the high end, make a great midrange part. AMD: please one-up the 8600GTS and take at least some of the market.




RE: Something is wrong, no?
By KristopherKubicki (blog) on 4/26/2007 3:22:24 AM , Rating: 5
quote:
There is no reason for the XTX to be no faster (and sometimes slower) than the XT.

Note that the clock frequencies are so close. If ATI bumps these clocks (significantly) you could see some performance improvements. But this illustrates that the difference between GDDR4 and GDDR3 is almost nonexistent.


RE: Something is wrong, no?
By defter on 4/26/2007 3:29:04 AM , Rating: 2
There have been many rumors that the XTX should have an 800MHz core clock and a 1100MHz memory clock. Do you know anything about that?

However, it looks from these results that raising the clocks to 800/1100MHz (+6% core, +10% memory; recomputed below) will not be enough to surpass the GTX. Maybe AMD just decided to lower the clocks a bit to reduce power consumption?
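For reference, a two-line recomputation of those percentage bumps against the clocks of the sample in this article (750 MHz core, 1010 MHz memory); the poster's +10% memory figure appears to assume a 1000 MHz base:

```python
core_now, mem_now = 750, 1010        # sampled XTX clocks from the article
core_rumor, mem_rumor = 800, 1100    # rumored retail clocks
print(f"core:   +{100 * (core_rumor / core_now - 1):.1f}%")  # +6.7%
print(f"memory: +{100 * (mem_rumor / mem_now - 1):.1f}%")    # +8.9%
```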


RE: Something is wrong, no?
By KristopherKubicki (blog) on 4/26/2007 3:31:40 AM , Rating: 4
We've gotten the core up to 845 MHz stable, so anywhere between 750 MHz and 845 MHz obviously seems feasible for the "official" clocks.

My money is on ATI lowering the price of the XT to the point where it becomes a better per-dollar buy, even if it's not the halo champion.


RE: Something is wrong, no?
By kilkennycat on 4/26/2007 5:00:22 PM , Rating: 2
quote:
My money is on ATI lowering the price of the XT to the point where it becomes a better per dollar buy even if its not the halo champion.


Er, one little problem with that suggestion: little or no price premium to help quickly recover development costs... nVidia has had that particular luxury for the past 6 months, with a high-end market all to itself and at least 400,000 G80s sold. Helps finance the G80 successor now in development, if nothing else...


RE: Something is wrong, no?
By cheetah2k on 4/26/07, Rating: 0
RE: Something is wrong, no?
By coldpower27 on 4/26/2007 2:08:27 PM , Rating: 4
It works if you assume margin of error; 745 to 750 is less than a 1% difference, easily within the noise.

As well, you have to add that larger memory sizes have worse latency at the same speed, not to mention that latency is worse on GDDR4 than GDDR3.

Memory bandwidth only comes into play if you don't have enough; maybe 25x16 will show some differences for the XTX. 100GB/s is plenty already.


Most likely R600=TMU limited
By Sharky974 on 4/26/2007 6:48:48 AM , Rating: 5
Working out the theoretical shader power: ATI claims something like 320 stream procs (scalar ALUs) at ~750MHz. Nvidia has 128; of course, they are "double pumped", but they run at 1350MHz, so for Nvidia it's the same as if they had 256 at 675MHz.

Theoretically:

X2900XT = 320 @ 750
8800GTX = 256 @ 675

You see, the X2900XT should have more shader power (worked out in the sketch below). Now, ATI supposedly uses a vector ALU design and Nvidia supposedly a scalar ALU design, which can be more efficient. However, unless I miss my guess, it's not going to be so efficient as to overcome ATI's raw shader power advantage by the vast amounts shown in these benches. They should be comparable.
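A minimal sketch of that raw-rate comparison, using the poster's unit counts and clocks and his one-scalar-op-per-SP-per-clock accounting (real throughput depends heavily on how well R600's ALUs can be kept fed):

```python
r600_rate = 320 * 750e6    # HD 2900: 320 stream processors at 750 MHz
g80_rate  = 128 * 1350e6   # 8800 GTX: 128 stream processors at 1350 MHz

print(f"R600: {r600_rate / 1e9:.0f} G-ops/s")   # 240
print(f"G80:  {g80_rate / 1e9:.0f} G-ops/s")    # ~173
print(f"G80 at an R600-like 675 MHz would need {g80_rate / 675e6:.0f} SPs")  # 256
```

By this accounting the R600 should have roughly 40% more raw shader issue rate, which is why the poster looks elsewhere (the TMUs) for the bottleneck.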

So my best guess is ATI seriously TMU-crippled their card, just as they did with the R580. The R580 did not have enough texture throughput to really take advantage of its raw power (which was MASSIVELY more than the 7900GTX's). So this isn't the first time ATI has done this.

I heard rumors R600 only has 16 TMUs, the same as the R580 had. The 8800GTX has 32, IIRC.

In other words, my very strong guess is that ATI once again SERIOUSLY AND INTENTIONALLY crippled this card by outfitting it with what was obviously, to anybody, a severe lack of TMUs, for the second generation in a row.

Engineers at ATI should be fired if this proves out. They are crippling their own products on purpose because they want to lose. The company should also be sued by its shareholders for these deliberately anti-competitive actions.




RE: Most likely R600=TMU limited
By DingieM on 4/26/2007 7:31:26 AM , Rating: 3
You won't come far regarding theoretical power.
No electronic "calculation" device (be it CPU or GPU) can function 80% of their maximum theoretical speed.
Even if its 80% than its a real real good product.

Efficiency outweighs theoretical pixel/vertex pushing power.
ATI is not stupid, they choose to accelerate pixel shading because they think the last generation of DX9 games would use.
Now DX10 is a whole different story.

Why do you think (partly) ATI's Xenos inside the Xbox360 is so much faster than nVidia's RSX? right: efficiency and by the way superior architecture.


RE: Most likely R600=TMU limited
By FITCamaro on 4/26/2007 12:02:06 PM , Rating: 4
quote:
Engineers at ATI should be fired if this proves out. They are crippling their own products on purpose because they want to lose. The company should also be sued by its shareholders for these deliberately anti-competitive actions.


Yes, you figured out the conspiracy. ATI actually wants to lose so they can close up shop and not have to work anymore.


RE: Most likely R600=TMU limited
By shabby on 4/26/2007 11:05:50 PM , Rating: 4
Maybe it is crippled, but its not like ati can just add more tmu's at will.
Features/speed/r&d time... pick two.


Quality settings
By Anh Huynh on 4/26/2007 1:50:28 AM , Rating: 6
Before anyone asks, the quality settings for the games were as follows:

Company of Heroes - High shader quality, High model quality, Anti-aliasing enabled (in game), Ultra texture quality, high quality shadows, high quality reflections, Post processing On, High building detail, High physics, high tree quality, High terrain detail, Ultra effects fidelity, Ultra effects density, Object scarring enabled and the model detail slider all the way to the right.

F.E.A.R. - 4x FSAA (in game), maximum light details, shadows enabled, maximum shadow details, soft shadows enabled, 16x anisotropic filtering, maximum texture resolution, maximum videos, maximum shader quality.

Half Life 2: Episode 1 - High model detail, high texture detail, high shader detail, reflect all water details, high shadow detail, 4x multi-sample AA (in-game), 16x anisotropic filtering, v-sync disabled, full high-dynamic range.

Elder Scrolls IV Oblivion – Preset ultra quality settings. The scene used was where you’re at the end of the tunnel and run outside for a little bit.




RE: Quality settings
By drebo on 4/26/2007 2:08:16 AM , Rating: 2
Something is greatly wrong with these benchmarks. I will reserve my judgement until more thorough benchmarks with gold hardware have been conducted.


RE: Quality settings
By Galacticus on 4/26/2007 11:37:31 AM , Rating: 1
There is no way these benchmarks are correct. Have you contacted ATI?


RE: Quality settings
By Christopher1 on 4/29/2007 2:10:26 AM , Rating: 2
I've got to agree with you. There is no way that these benchmarks are correct in the least, unless there is a problem with the drivers.

I've been thinking about getting a new graphics card for my parents' computer (it still has an old X300 in it), and I was thinking of going with another ATI (no 'new driver' problems).

However, if the performance is this bad for the card... it just ain't going to be worth the money.


RE: Quality settings
By jabber on 5/1/2007 6:32:49 AM , Rating: 2
What you say there makes no sense, really. If your folks have been getting by with an X300, then they don't strike me as 'hardcore gamers'.

I hardly think they would be drooling over high-end cards.

So why would you even entertain the idea of upgrading to such a high-end card? You don't use your folks' PC for gaming, do you?

As for the benchmarks, have we not seen enough times that 'just before general release' performance is always rather lacking, and kicks in later on?


RE: Quality settings
By bl1nd on 5/10/2007 8:12:44 PM , Rating: 2
WHY does it have to be wrong? Because it just didn't come out as you expected? lmao, bunch of spoiled kids...
Grow up, fanboys!!


Memory
By CyborgTMT on 4/26/2007 4:47:29 AM , Rating: 2
Can you verify that it's using GDDR4 and not GDDR3? I'm almost certain that earlier revisions for the OEM had GDDR3, which would explain the almost identical scores between the XTX and the XT.




RE: Memory
By KristopherKubicki (blog) on 4/26/2007 4:59:12 AM , Rating: 3
Yes, it's GDDR4. This is not an early sample.


RE: Memory
By CyborgTMT on 4/26/2007 5:03:19 AM , Rating: 2
Thanks for the fast reply... any chance of tearing off that fan shroud/HS and throwing up some pictures of both cards so we can see what's different on both?


RE: Memory
By JSK on 4/26/2007 5:05:10 AM , Rating: 2
If they had time for that they probably would have finished off the benchmarks first?


RE: Memory
By CyborgTMT on 4/26/2007 5:07:44 AM , Rating: 3
Doesn't hurt to ask....


Bad drivers?
By hardwareking on 4/26/2007 3:59:51 AM , Rating: 2
It could just be a case of bad drivers, couldn't it?

The 8800 cards had a lot of issues, so this could just be poor drivers hampering the HD 2xxx cards' performance.




RE: Bad drivers?
By smut on 4/27/2007 12:02:53 PM , Rating: 2
If it's bad drivers, will the same people that have been bitching about NVIDIA's drivers now turn around and bitch about ATI's? I assume the fanboys won't.


RE: Bad drivers?
By smut on 4/27/2007 12:38:17 PM , Rating: 3
Its also funny to see how the ATI fanboys put down NVIDIAs drivers and have been claiming ATI will have better vista drivers, they have the best drivers etc. But now when the XTX loses to a 6 month old card they claim its probably bad drivers. Great job on being a hypocrite!

You complain about Nvidias drivers but when ATI loses "its okay, they probably just have bad drivers". Why arent you bitching up a storm? Ya know, the way you bash Nvidia drivers every chance you get.

As an outsider looking in on the fanboys I find this hilarious! If your going to bitch about one companies drivers, bitch about the others. They'll shrug off one brands bad performance and claim its "driver issues". But when the other company has trouble with their drivers theyre bashing them non stop.


RE: Bad drivers?
By Scabies on 4/27/2007 1:35:45 PM , Rating: 2
Hmm... release candidate drivers vs. 6 months of "in the market" development. Yeah, you're right, we shouldn't point that out.


RE: Bad drivers?
By Ard on 4/27/2007 3:09:33 PM , Rating: 2
You mean like those retail drivers DT blatantly stated they used for this test, right?


By Oxygenthief on 4/26/2007 3:03:37 AM , Rating: 5
Google the 2900 for reviews and benchmarks and you may stumble upon rumblings of poor performance from the XTX cards as opposed to the XT cards. Apparently, it's a memory issue. The XTX cards are actually performing worse than the XT cards in some instances, and no one seems to know why. This is a very costly issue for ATI/AMD if they cannot resolve it; GDDR4 is, by far, more costly to implement than GDDR3.

I consider myself an avid ATI and AMD fanboy (I like to back the underdog). I remember when the 9700 Pro came out and spanked NVIDIA, then the 1900XTX, and again with the spankage. But now it would appear that NVIDIA is fed up and has decided to do some spanking of its own. Kudos to NVIDIA for retaining the enthusiast crown.

I can only hope that ATI/AMD can continue to compete in the mainstream market.




By penter on 4/26/2007 3:12:52 AM , Rating: 2
AMD knows about this. They planned to release this XTX with GDDR4, but as the benchmarks reveal, the card isn't that great, so they will not release it anytime soon.
They would be stupid to release a more expensive card that gives no real advantage.


this is rubbish
By penter on 4/26/2007 3:17:21 AM , Rating: 2
What's going on, DailyTech? Having some problems doing decent benchmarks?
Take a look at your last article, the 2900XT vs. the 8800GTS. Guess what: the scores there were higher, even though you used a similar test setup with a slower CPU.
Next time, if you want to do something ahead of everybody else, do it well.




RE: this is rubbish
By JackPack on 4/26/07, Rating: 0
RE: this is rubbish
By Goty on 4/26/2007 8:09:57 AM , Rating: 3
If you had even read the previous article, you would have seen that they have the XT in hand; these were not canned benchmarks.


RE: this is rubbish
By coldpower27 on 4/26/2007 2:18:42 PM , Rating: 2
GPU power is largely governed by core clock when there are no differences between the two GPU cores; the 2900 XTX and XT have the same number of functional units on the die, all of them enabled. Their only difference is 5 MHz, which is less than a 1% difference.

8800 GTS to 8800 GTX is a much bigger leap, because NVIDIA only has 3/4 of the shader units working on the GTS and the full enchilada on the GTX, so the GTX gets a 33% boost minimum, plus the additional difference in their core clocks: 75 MHz, which is significant at the lower clock frequencies the cores run at on the NVIDIA side.


Oblivion Copy Error
By KristopherKubicki (blog) on 4/26/2007 3:18:33 AM , Rating: 6
We had a copy error on the 1600x1200 Oblivion benchmark (it was the same as the 1280x1024 row). That is corrected now.




RE: Oblivion Copy Error
By osalcido on 4/30/2007 11:18:34 PM , Rating: 1
After a mistake like that... I have to ask...

Is it at all possible that you guys got X1950 XTXs and not 2900s?


This is one funny thread
By Lakku on 4/26/2007 6:31:11 AM , Rating: 5
This is one of the best threads I have seen on DailyTech in a while. You've got one side saying "In your face, biatch!" and the other camp saying "Is it drivers..." or the more obvious "this is marketing FUD!". Good read, no doubt. But let's all step back for a minute.

1) The writing has been on the wall for WEEKS, maybe even MONTHS. Do you think the card is 6 to 7 months behind nVidia for no reason? HardOCP has been saying the XTX would be ATi's 5800, and they have been saying it for a looong time. They aren't the only ones, so if you are surprised by all of this, should it be TRUE, you haven't been paying attention or didn't want to believe. This is just the way it is if it pans out this way.

2) Quit focusing on the negative. The XT looks to be a good performer, especially since it's supposed to come in at under 400 MSRP. Hopefully it has good heat and power draw; otherwise it doesn't look rosy. Either way, it does perform well, so be happy about that. At least it's competition.

3) Wait for the mainstream; that is where the money is made. The HD 2600 series or whatever could be killer.

4) Quit bitching about nVidia drivers. They messed up on Vista, no real excuse. However, they worked just fine in most cases under XP, and the cards perform even better today than at launch (not just bug fixes, but performance). Vista is much better as well. ATi's drivers haven't always been rosy, and I've had as many various issues with them as with nVidia's, though not as bad as nVidia's in Vista. Either way, who cares; those with an 8800gtx are still usually quite happy.

Lay back, deal with the fact that the XTX may be/is ATi's 5800, and accept that the XT is a great card, pending power usage and heat output.




RE: This is one funny thread
By Dustin25 on 4/26/2007 12:40:57 PM , Rating: 2
Yup, I have been enjoying my 8800gtx's since release day. One 8800 is overkill for every game out there at the moment; two is just insane. There is absolutely no need for a card with even higher performance at this point, but that's purely my opinion. I'm going to sell one of my cards and continue enjoying blazing-fast 1920x1200 performance until DX10 games start rolling out and maybe challenging the 8800 goodness. Maybe those with 30" screens have seen their 8800s work, but I don't think mine has had to get out of first gear yet at stock speeds.


Doesn't Add Up
By Goty on 4/26/2007 8:18:47 AM , Rating: 4
According to sources not directly related to AMD, the XTX was supposed to be a "beast". That's not exactly the term I would use if the performance isn't even up to par with the GTX. After reading some of the other posts and doing a little digging of my own, I've come to the conclusion that there's got to be something wrong here.

Kristopher stated somewhere else that there was basically no difference between GDDR4 and GDDR3. Well, how about a 200MHz speed difference? That's pretty significant in my book. You might cite loose timings as the cause for the lack of a performance increase, but video cards are much less sensitive to memory timings than CPUs; bandwidth is the biggest concern, which the XTX has in spades.

So, yeah, basically either AMD has lost its collective marbles and doomed its newly acquired GPU section, or something's not quite right here.




RE: Doesn't Add Up
By coldpower27 on 4/26/2007 2:29:14 PM , Rating: 3
Memory bandwidth only helps in situations where you are bandwidth limited; with both cards over 100GB/s, I don't believe you have reached the limits at these resolutions yet.

Also, 512MB vs. 1024MB introduces additional latency; look at the 320MB vs. 640MB GTS in scenarios where frame buffer size isn't coming into play.

25% more bandwidth when you already have over 100GB/s isn't going to produce much of a gain. I think ATI currently has more bandwidth than it needs with the 512-bit interface, at least at these resolutions.


This is quite funny
By Domicinator on 4/26/2007 11:06:46 AM , Rating: 4
I'm really shocked at gamers sometimes. If these XTX benches end up being legit, or even close to legit, then Nvidia wins this round. So what? These two companies go back and forth constantly. It's just that this time, Nvidia may have come out on top. Again, so what?

Being a fanboy of one company or the other is never a good thing, because you'll cheat yourself out of some great products. For you blind faith ATI users, are you really telling me that if the XTX truly sucks that you will buy it anyway because you hate Nvidia that much? That's ridiculous!! What would be wrong with switching companies for one generation?

For the last couple of generations of GPUs, Nvidia has come out with a great product, only to be slightly bested by ATI a couple of months later. This time that may not be the case. What's the big deal? Maybe, just maybe, Nvidia will have the best card for a few months. That's not a bad thing. If you're in the market for a card, buy the best one you can afford regardless of which brand it is. If you stick to that rule, you won't care which brand you buy, you'll just care about how awesome your games look/perform.




RE: This is quite funny
By cochy on 4/26/2007 11:46:09 AM , Rating: 3
quote:
It's just that this time, Nvidia may have come out on top. Again, so what?


I'd say so what, because Nvidia's released the better card half a year ago. I would expect much better from ATI.


By wingless on 4/26/2007 2:55:14 PM , Rating: 3
I'm an Nvidia fanboy; I admit it honestly and wholeheartedly. I'm also an AMD fanboy, and I would HATE to see AMD go out of business. The world would truly be bland with ONLY Intel processors (though software could finally be highly optimized for one platform). I've been running AMD+Nvidia for 7 years and they have always been great to me.

If AMD goes under, we will lose much needed competition for Nvidia and also Intel. We will be doomed to bland hardware and slow product roll-outs if Intel and Nvidia don't have anybody to push them along. You Intel FANBOYS won't see Penryn for YEARS if AMD goes under.

Also, I wanted to comment on my title. I believe these cards are truly optimized for a DX10 environment, and we probably won't see the benefits until we play Crysis at 1680x1050 (my native resolution on this 22" widescreen with HDMI inputs) on Vista Ultimate edition sometime at the end of the year.

For all my Nissan 240SX/180SX/Silvia guys, the 8800 series is kind of like the KA24DE/T: it has torque everywhere, can pull frames at low resolutions, and isn't too bad up high. These Radeons are like SR20DETs that are no use except at ultra-high revving (very high resolutions), where they work their best with a 512-bit memory bus and 1GB of GDDR4. We won't see the benefits until DX10, high resolutions, and mature, optimized drivers.




By GlassHouse69 on 4/27/2007 12:52:45 AM , Rating: 2
I feel that this is the truth.

I also feel that this is exactly what marketing people do not want to see. Kiddies and benchers want to see frames now, in current games. My X1900 AIW is supposedly slower than the regular X1800 XT. It is equal in some respects, but when the video quality goes up and things start getting advanced, my X1900 surpasses the X1800 series. I hope this is the same for DX10. I will be buying Hellgate: London and the Conan game; I don't do MMORPGs that require payment, but I will for these two titles. I have to say that until the really popular big games come out, we won't know which card to buy. Guild Wars 2 is also coming out, and WoW badly needs a new graphics system too (for all its nice art, it's pretty pathetic next to games like STALKER or even Half-Life 2).

Complexity and ease of use of DX10's new powers is where the real benchmarks will be for the new ATI card. Either that, or it is a washout for 2007.


By Lakku on 4/28/2007 9:02:11 PM , Rating: 2
I disagree. DX10 doesn't matter at this point, so it does say a lot about the card if it has this kind of performance, or lack thereof, in DX9 titles. The other point is this simple fact: the Crysis and BioShock teams have been using 8800 GTXs to demo their games. Crysis has been demoed on 30-inch monitors, at outrageous resolutions, on single-GTX systems. It was apparently running around 30 fps with max settings. That means it should be just fine at anything below that, especially 1680x1050. BioShock also seems to run just fine at high resolutions on a single GTX. True, we need to wait to judge, but if the 8800 GTX can get perfectly playable framerates in DX10 titles right now, what does it matter what the XTX can do? Unless of course you plan on spending thousands of dollars on a gaming machine just to play a couple of DX10 titles and none of the dozens, if not hundreds, of DX9 titles.


OK something is wrong here
By FITCamaro on 4/26/2007 7:56:39 AM , Rating: 2
If you look at the performance at 1280x1024 for the 2900 XT, it doesn't match that of the previous tests vs. the 8800 GTS. The Oblivion performance is nearly twice as good, which doesn't make any sense if the testing was done at the same detail level here.

I have a feeling these benchmarks were done without the high detail settings of the 2900 XT vs. 8800 GTS tests.




RE: OK something is wrong here
By Lakku on 4/26/2007 8:13:26 AM , Rating: 2
The XT vs. the GTS test was on a different computer and setup, though a relatively similar one. It's unfortunate the same two machines couldn't be used, though I don't know if that accounts for the discrepancy. However, the previous test doesn't mention any FSAA setting for Oblivion, so it could have been using it. From personal experience, at least on 8800-series cards, FSAA in Oblivion can severely hamper framerates, especially with transparency MSAA in foliage areas. This review does indicate NO FSAA was used in Oblivion. It seems about right: when I turn off 2x FSAA with transparency MSAA, I get a huge boost in performance, mainly due to the much higher minimum FPS. Also, this test apparently uses a different ATI driver than the previous one did; this test used the release drivers going to card makers now, and the previous tests used an unreleased candidate driver. We'll just have to wait and see.


RE: OK something is wrong here
By Lakku on 4/26/2007 8:20:23 AM , Rating: 2
I forgot to mention: the previous test doesn't name the area used, while this test does. The area just outside the tunnel isn't the most demanding Oblivion can get, though it's still demanding. Also, there is no mention of interior and exterior shadow settings, nor of the shadows-on-grass setting. Ultra High does NOT have these shadows up and on by default: you must turn on shadows on grass, and the default slider for exterior shadows sits just above 1/4 of max, with interior at 1/2 of max. Turning those two up, along with shadows on grass, takes quite a large performance hit, somewhere in the 15 to 25% range.


<no subject>
By Scabies on 4/27/2007 3:39:17 PM , Rating: 2
I know this is going to get lost in the torrent of replies, but could LinkBoost be to blame here? Can we do some testing on an AMD/ATI chipset?

Also, wouldn't AMD realize that the XTX is inferior before releasing it into the wild, and therefore scrap it prematurely? "Here it is, six months late, overpriced, and totally pwned by what is currently available. Enjoy! (Don't forget to grab a second to CrossFire 'em!)"
...I think they would release the XT, do a core revision, then release the XTX. Unless it secretly dominates in the DX10 field, which some speculate.




RE: <no subject>
By Ard on 4/27/2007 4:59:39 PM , Rating: 2
Doubtful. LinkBoost has never had any appreciable effect on performance, on or off. And I'm sure AMD did realize that the XTX was inferior, hence the reason it has been delayed, once again, to Q3'07.


RE: <no subject>
By KristopherKubicki (blog) on 4/27/2007 5:48:29 PM , Rating: 2
I had thought about this beforehand; it was off.


This is not Real R600 perf
By Poogz on 5/7/2007 3:01:21 AM , Rating: 2
Uhuh.

Well, something was very wrong with DailyTech's little 'test' there.

I saw a post on another forum (which I don't want to hotlink to, because I may get into trouble for wasting bandwidth). The poster basically said that the clocks automatically throttle back if the card doesn't get enough power, and that it's best to have a single-rail PSU. He also posted some 3DMark scores for an 'unknown video card' with clocks of 885/2000 on the stock cooler, with a 3.9 GHz Kentsfield.

default 3DMark03: 45,123 points
default 3DMark05: 23,448 points

default 3DMark06: 14,282 points
shader 2.0: 5421
shader 3.0: 6009

Clocks will apparently reach over 1 GHz (on air), and the memory clocks 'will be a surprise'.

And the 8800 Ultra is 'jacked up as high as the G80 can handle' and will cost $850. So if you must, go blow money on that overpriced garbage, even though it is still POSSIBLE for the R600 to be an amazing card which outperforms the G80.

People also talk about the DirectX 10 version of MSFS 10 (FSX). Well, the patch they are developing now is a performance-enhancing patch, NOT the DX10 patch (that will be coming out in late 2007). However, even without the DX10 patch, I'm 100% sure the R600 would obliterate the G80 in FSX. The reason is that the shaders on the G80 have problems with FSX (enabling water using Shader Model 2.0 halves the fps), and FSX is an extremely video-memory-intensive program.

All I'm saying is wait a few weeks for R600 before possibly wasting money on a GeForce 8. You don't have to believe these benchmarks; that is for you to decide. However, something was definitely wrong with DailyTech's test.

The bottom line is, if you're getting a GeForce 8 now, wait two weeks and see how the R600 pans out. Don't believe a website (however reputable it may be) just because it said so.




RE: This is not Real R600 perf
By Poogz on 5/7/2007 3:11:36 AM , Rating: 2
Oh, and no, I do not care if you call me an 'ATi fanboy' or an 'OMFGLOLZOR NOOBZOR!!11111', because I only speak facts as I know them. The person I got these numbers from is a genius from 'Seattle'; I am merely passing the information on. (If I didn't think they were fact, why would I post them?)

DailyTech: the numbers were definitely REAL, you people should know that. However, something was strangely wrong; all indications suggest the R600 is faster and hungrier than the G80. What power supply were you using?


RE: This is not Real R600 perf
By Poogz on 5/7/2007 3:16:59 AM , Rating: 2
And for the record, the real gains from R600 will be made with AMD Direct Connect (NOT DSDC), which will enable the GPU to talk directly to the CPU.


Sucks, but...
By archcommus on 4/26/2007 1:55:13 AM , Rating: 2
Sucks, but they're beating them in the midrange, which is more important.




RE: Sucks, but...
By otispunkmeyer on 4/26/2007 4:06:00 AM , Rating: 2
How are they doing this with no mid-range DX10 part in the shops yet?

OK, the X1950 Pro doesn't have too much trouble handling the 8600, but at the end of the day that's now an old part. It all rests on RV630 now... hopefully it's a bit better equipped than the 8600 is.


Please add all 8800GTS and 2900XT results
By KHysiek on 4/26/2007 2:06:39 AM , Rating: 2
You've used a slightly different setup, but comparing the 8800 GTS results to the GTX here, the GTS seems to be 2x slower than the GTX, which I don't think is true.




By KristopherKubicki (blog) on 4/26/2007 2:08:36 AM , Rating: 3
We had limited testing time with the cards (which is actually why we didn't include COD2 and some of the Radeon XT benchmarks). I'll try to get the GTS in there, but we had a limited window.


The delay is now apparent
By 457R4LDR34DKN07 on 4/26/2007 2:15:35 AM , Rating: 2
It seems that the delay has been explained; I'm still going to pick one up. Could it just be immature drivers?




RE: The delay is now apparent
By Griswold on 4/26/2007 3:33:22 AM , Rating: 2
I don't think the reason for the delay was the highest-end part. That's just a marketing bonus; the bread and butter is in the segments below. After all, they're not even sure the XTX will hit the shelves in this shape.


Nobody reacts on 3dmark06 scores
By brokensoul on 4/26/2007 4:11:43 AM , Rating: 2
The previous test of the 2900 XT gave an overclocked score of 14000, the score of an overclocked 8800 GTX. The 2900 XT and the 8800 GTX seemed very close, as long as we can trust 3DMark (which has been the case in previous years). How come those scores have nothing in common with the hierarchy of this article?




By coldpower27 on 4/26/2007 2:34:13 PM , Rating: 2
The 8600 GTS gets 3DMark06 scores close to those of the 7950 GT, but in typical games it isn't anywhere near it.

The same could be happening with the HD 2900s.


Question
By Min Jia on 4/26/2007 7:32:10 AM , Rating: 2
quote:
Despite the reference clock speed differences, DailyTech managed to push the ATI Radeon HD 2900 XT up to 845 MHz core and 1.99 GHz memory.

How did the overclocked 2900 XT perform then?




By Lugaidster on 4/26/2007 9:47:51 AM , Rating: 4
The R600 has 64 Vec5 (5-wide) units, which gives it a maximum of 320 stream ops per clock, but makes its worst-case scenario a lot worse, at only 64 stream ops per clock (for lack of a better term). You can think of it the same way as SIMD units in our current processors: they can deliver huge amounts of processing power if, and only if, they are used correctly and the code is optimized accordingly; otherwise we see no gains.

In my opinion, AMD/ATI made a design compromise: they used this approach because it could prove to be way better in the DX10 world and, in a much more interesting way, in the GPGPU world.

Think about it: if you open up your architecture with CTM and give people the power of 64 Vec5 units, you end up with an amazing amount of processing power. That's where I think they are focusing.

Nvidia is in a much more favorable position in the gaming world. If you have 128 scalar units, in the worst-case scenario you'd still issue 128 stream ops (all else constant, and given you have the bandwidth). But your best-case scenario isn't that good either: it's still 128.

I believe they delayed it because they were expecting DX10 games (of course, this is just speculation on my part). And I hope, for their sake, that it performs a lot better in that world.

Still, if I am somewhat right, drivers could provide better optimization for shader programs that aren't written with a SIMD architecture in mind. But then again, I could be entirely wrong.
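
To make the packing argument concrete, here is a toy Python model (my own sketch, not anything from AMD or NVIDIA; it deliberately ignores shader clock differences and only counts per-clock issue rates):

# VLIW (R600-style): 64 units, each with 5 slots the compiler must fill.
# Scalar (G80-style): 128 units, 1 op/clock each, regardless of the mix.
def r600_ops_per_clock(packing):  # packing: fraction of slots filled
    return 64 * 5 * packing

G80_OPS_PER_CLOCK = 128

for packing in (0.2, 0.5, 0.8, 1.0):
    print(f"packing {packing:.0%}: R600 {r600_ops_per_clock(packing):.0f} "
          f"ops/clk vs G80 {G80_OPS_PER_CLOCK} ops/clk")
# packing 20%  -> R600  64 ops/clk (worst case: one slot of five used)
# packing 100% -> R600 320 ops/clk (best case: all five slots filled)
# The crossover sits at 40% packing; below that, 128 scalar units win.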




This is a farce
By Mr Anderson on 4/26/2007 12:15:40 PM , Rating: 2
Go to Tom's Hardware, or any other site, and look at the X1950 XTX that uses GDDR4, and look at the 8800 GTX. If you look at the benchmarks on those sites and compare them to the ones in this article, it is easy to see that something is wrong. Also, there is no indication of whether these benchmarks used AA or AF or had HDR lighting. I'm sticking my neck out and calling this what it is... FAKE!!!




RE: This is a farce
By coldpower27 on 4/26/2007 2:38:42 PM , Rating: 2
Unnormalized comparisons are worthless. You can't compare raw benchmarks across websites; you can only compare the relative deltas, i.e., how much slower the X1950 XTX is compared to the 8800 GTX at Tom's versus here.


By kitaro on 4/26/2007 9:32:52 PM , Rating: 2
Folks, take a very close look at the picture of the XTX. I do believe DailyTech benched it, and I believe the benchmark is legitimate. But something isn't right, and that's a fact. This isn't the retail card. The driver... well, maybe, but Nvidia pulled that trick on 3dfx long ago: at release time, voila, they released a driver that unlocked hidden performance, boosted the card 25 percent, and put 3dfx out of business. Careful not to jump the gun, folks.
Use the noggin: it makes no sense that the card is benching the same as the XT, so something is up. Whether it be VERY bad or SNEAKY.

The retail part WILL HAVE A TRUE HDMI connection to support the HDMI 1.2 compliance the card will have. The card is also WINDOWS VISTA CERTIFIED, which means it must sport the HDMI connection. Only an HDMI connection carries audio and video streams, not a DVI connector, as you can see very clearly in the picture (dual DVI). So this is not the HD 2900 XTX RTM version; if it were, it would sport HDMI connectors. It may be leaked or given out, but it's not the exact card in the box you're going to see, as DailyTech has presented it.

I smell something fishy...
DailyTech can prove this finding wrong by explaining the dual DVI connector in the picture, when it's a fact that the card will sport an HDMI connector. For the DVI connections, it will come with a converter to DVI. It's not true HD without the connector, and it's a FACT FACT FACT it will have an HDMI connector with onboard sound to complete the Windows Vista certification. They can also explain further: what driver version of the audio device do they have? I know which one it should be.

I think the benchies are definitely right, but definitely on the wrong board.

Wait and see...
I'm still holding onto my dollars until then.
Too much BS floating around.

Just wait till it comes out and then pay attention. I seriously doubt they would fake anything, play favoritism, etc., but one thing is for sure: they don't have the retail card in their possession. And we all know someone somewhere that would know such things. ;-)




By Darkskypoet on 4/28/2007 1:47:21 PM , Rating: 2
I fully agree. DT has nothing to gain by screwing up benchmarks; attacks of that sort are rubbish. Come on, it's not difficult to bench a card, and to attack them like that is just inane.

The longer cards were supposed to be OEM boards, from what I understand. Anyone remember the days when OEM ATI products always shipped with inferior speed bins, etc.? Added to that, why in the world would they want to lengthen the memory traces for GDDR4? Distance matters, people. Especially if GDDR4 already has higher latencies, etc.

Regardless of that, the longer cards "are not the boards we're looking for"... not retail-release XTX boards, anyway.

DT, thanks for bringing us what you had, and I am sure you did your best to bench the cards. However, I for one am not joining this FUD bath until we see the proper cards.

Kudos to those noticing how AMD/ATI chose a great hybrid design for their stream-processing accelerator (i.e., the CTM add-in cards selling at around $2k a pop) and video card. And to those that see the inherent advantage this gives nVidia's cards in certain circumstances.

The downside is that R600 suffers when there is a lack of extra eye candy; the upside is that far more complex operations can be done for free (relative to a lower baseline object throughput per tick, though).

What I would be interested in seeing from DT is a repeat of the workstation benchmarks against the GTX... the XT kicked its lesser sibling in the nuts. Hard. I want to see how the bigger brother deals in the workstation realm.

Last point: 65nm silicon would bring R600^2 into both the FireGL line and this new CTM stream-accelerator variant quite nicely :)

(Been up far too long building machines, night all.)


650 MHz oc 8800 GTX
By erikejw on 4/27/2007 3:24:40 PM , Rating: 2
How come you don't want to say which card it is?
Maybe the company that sent it doesn't know that they support this site :)))

650 MHz core, 1000 MHz GDDR3... could it be a

BFG NVIDIA GeForce 8800 GTX OC 768MB Water Cooled Edition graphics card?




RE: 650 MHz oc 8800 GTX
By Ard on 4/27/2007 4:57:27 PM , Rating: 2
Your point is? The XFX 8800 GTX XXX runs at 626 MHz/2 GHz. You think 24 MHz makes a difference?


By DocDraken on 4/26/2007 5:15:43 AM , Rating: 3
The scores for the XTX are actually lower than the XT's in some cases, and in other cases practically the same. So there are either driver or hardware issues in this test. I think it would be wise to reserve judgment until retail XTX cards are out and the drivers have matured. Until then, it's pure speculation whether it is "doomed" or not. The XT clearly performs well.




By DingieM on 4/26/2007 7:14:25 AM , Rating: 3
http://www.fudzilla.com/index.php?option=com_conte...

Anyhow, don't expect to be able to buy an XTX this half of the year...




This just in...
By yacoub on 4/26/2007 10:41:54 AM , Rating: 3
In other news, a little girl's My First Hairdryer, first reported stolen nearly a year ago, turned up today attached to a flagship videocard. No charges have been filed yet, though authorities are investigating Vijay Sharma as a "Person of Interest".




haha
By jlm46 on 4/26/2007 12:25:53 PM , Rating: 1
All I have to say to those posters who tried to make us 8800 GTX owners feel bad about our purchase is... that's karma, baby, eat it, haha!

We've had our cards for months now, enjoying blazing performance in games like Vanguard that bring other cards to their knees.

Hopefully we'll see better competition from ATI in the future, since competition is good for everyone... my post is strictly bashing those who tried to make 8800 GTX owners feel bad during the past few months... thanks, please come again.




RE: haha
By GlassHouse69 on 4/27/2007 12:41:37 AM , Rating: 2
Well, you can feel bad that two XTs in CrossFire will be the same price as your GTX WAS, and that they will perform better.

All in all, that would be like 300+ watts of power, lol :) Nothing really to brag about there, but someone will point this out shortly.

Also note: anyone with a GTX absolutely annihilates any game out there, and all of those to come in 2008. Pretty solid purchase, it seems.


People need to get a grip
By slacker57 on 4/26/2007 1:03:00 PM , Rating: 3
Conspiracy Theories, Fanboys, False Benchmarks, oh, my!

Wait till the cards hit the market and we'll find out how they perform. This was just an early article to give you an idea of what to expect.

People in here are actually getting angry over this, laying blame, making up conspiracy theories.

I get excited about new tech, too, but good grief, people, get a hold of yourselves. It's not something that should ruin your day. That's silliness.




the upside
By slash196 on 4/26/2007 1:55:58 PM , Rating: 3
I personally couldn't care less about the ultra-high-range cards. What I see is a solid competitor to the 8800 GTS that should force nVidia to counter with a superior offering at a lower price point, and then we'll have great cards for cheap. Everybody wins.




Bad benchmarking
By Alexstarfire on 4/26/2007 4:23:12 PM , Rating: 3
While I would normally agree that comparing a stock card to an OC'd card isn't fair, to be honest the FPS differences between the cards are just too big for that OC to explain. I highly doubt the OC on the nVidia card makes a 20% FPS difference.

Also, I do know that Vista performance is down on both cards compared to XP performance, and they don't even tell you what OS they used to compare the cards. If they used Vista, then it's no wonder the 2900 XTX does so badly; ATI performance has been abysmal at best on Vista.
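
For what it's worth, the overclock math is easy to check (a quick sketch, assuming the usual 575 MHz core / 900 MHz memory stock GTX clocks the earlier poster cites):

# Factory OC on the tested GTX, as a percentage over stock.
stock_core, oc_core = 575, 650    # MHz
stock_mem,  oc_mem  = 900, 1000   # MHz
print(f"core OC: {oc_core / stock_core - 1:.0%}")   # ~13%
print(f"mem  OC: {oc_mem / stock_mem - 1:.0%}")     # ~11%
# FPS rarely scales 1:1 with clocks, so a ~13% core bump can't plausibly
# explain the 30-50% gaps in the tables above.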




By ToeCutter on 4/28/2007 4:04:14 PM , Rating: 3
Sorry DT, I'm just not buying these benchmarks.

I'm by no means in denial, but there are several factors that simply beg to be questioned:

1. Why use an nVidia mainboard to test an AMD video card? We've seen some mischief from both nVidia and ATI with regard to "optimizations" for their own products. While I'm not implying any skulduggery on DT's part, using an Intel chipset with an Intel CPU would have reduced the possibility of questionable results.

2. Where did you get the card you tested? It's an OEM board, but how old is it? Why OEM and not retail?

3. Most importantly: why use a partner-overclocked part for nVidia? It's perfectly reasonable to question the use of a faster board for preliminary benches. I UNDERSTAND the OC'd 8800 is available at retail, but so are reference-clocked parts.

I've read most of the comments on this story, and I'm surprised how quickly many here consider this the end of the R600. It's ridiculously premature to call this the nail in the coffin for R600, considering earlier benches do not concur with these results (obtained under very questionable circumstances).

We're not even close to the end of this story....

Disclaimer: I currently own an eVGA GeForce 8800 GTS Superclock.




I call BS...
By meyerds on 4/29/2007 2:15:37 AM , Rating: 3
My X1900 GT performs about the same as the HD 2900 XT in these Episode 1 benchmarks (at 1280x1024). Something about these benchmarks is not right. I won't take DailyTech's word for it until I see some 'real' benchmarks (an in-depth review, retail drivers, etc.). I see absolutely no way these parts could underperform *any* 8800; everything about their design suggests otherwise.




just curious
By jay2o01 on 4/30/2007 12:41:35 AM , Rating: 3
Anyone else wondering how these "next-gen" cards will work in "next-gen" games? Maybe some UT2K7 or something else that's DirectX 10. Correct me if I'm wrong, but that's where this line of cards should shine... right?




XTX down! Call 911!
By JSK on 4/26/2007 2:25:31 AM , Rating: 2
This definitely isn't good. Looks like AMD may want to hold off on this card, get their XT out there, and put all their horses behind the R650, or whatever the 65nm die shrink is.

Looks like the 8800 Ultra will hold the crown this gen.




wait till 65nm maybe?
By knowom on 4/26/2007 5:33:03 AM , Rating: 2
At least they can push harder towards 65nm and iron out the kinks in the meantime, assuming it's not something they can fix in the near future.




Technical question
By Dwarden on 4/26/2007 8:16:50 AM , Rating: 2
May I ask for some details, like:

what board revision is this, what core revision, and what exact GDDR4 modules are used?

Thanks (also, if you can, please put up some power-usage values and measure the fan noise level in decibels).




not
By omyg0t on 4/26/2007 8:52:41 AM , Rating: 2
Why don't you guys send me that card? :O I'll make sure to tickle it the right way.

Something is definitely wrong; those benches are ridiculous...
I don't know who would believe such nonsense.
I'm not blaming it on DT, though... anything could have gone bad, not really your fault.




Yikes!
By casket on 4/26/2007 9:27:40 AM , Rating: 2
Usually overclocking memory leads to higher scores.

If the benchmarks are true... they are a disaster.

3DMark has some kind of memory bandwidth benchmark, right? I'd run that first.
************
Possible problems:
Software Bottleneck (Fixable)
Hardware Bottleneck (Yikes!)
GDDR4 stinks (Yikes!)

Under software bottleneck, there are many possibilities, from thread handling (320 stream processors means a lot of threads) to software only using a fixed number of them, like 100, due to bad programming.

Without extensive benchmarking, it is hard to identify the problem.




hmmm
By nefariouscaine on 4/26/2007 10:10:38 AM , Rating: 2
As stated above, I'm concerned with the fact that the XTX they reviewed only got a few frames more than the XT reviewed earlier this week. Forget about the GTX comparison; it doesn't look like there's any contest. I'm gonna wait for release and for the drivers to mature.




I wonder
By Dwarden on 4/26/2007 10:23:14 AM , Rating: 2
If the XTX's speed problem is related to GDDR4...

by that logic, wouldn't it be easier to just make a GDDR3 model with 1GB of memory?

I mean, something else seems to be wrong...




By RussianSensation on 4/26/2007 11:55:06 AM , Rating: 2
Considering there are a large number of situations where an overclocked 8800 GTS 320MB card outperforms a stock 8800 GTS 640MB, we can conclude that there are still games for which 320MB is sufficient. In this case we are comparing a 512MB card to a 1GB card, and we can expect the difference to be even more minute, as hardly any games use 512MB of video memory.

The 2nd point to address is that most games are GPU efficiency/speed limited. The proof of this is the massive increase in bandwidth of the X1950 XTX over the X1900 XTX, which hardly provided any benefit (a 7% speed increase in games for a 29% memory bandwidth increase; Anandtech ran this comparison a while ago). As many posters above have mentioned, the GPU is overwhelmed long before the card becomes memory bandwidth limited.

The disappointing part of all of this is that had ATI released a faster card, 8800 GTX prices would have fallen. On the positive side, recall the flop NV made with the 5800/5900 series (in DX9) and how this forced them into a full 180-degree turnaround with the outstanding 6800 GT/Ultra series. Perhaps this is the best thing that could happen to ATI for R700. The only problem is: will ATI deliver, considering its poor cash position? It certainly cannot fail with Barcelona, or they are done.
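
That 29%-bandwidth-for-7%-speed figure can even be turned into a rough estimate of how bandwidth-bound those games were. Here is a toy two-component model (my own sketch, using only the Anandtech numbers cited above):

# Assume frame time = f (bandwidth-bound part) + (1 - f) (everything else),
# and only the bandwidth-bound part speeds up when bandwidth rises 29%.
bw_gain, fps_gain = 1.29, 1.07
# Solve 1 / (f / bw_gain + (1 - f)) = fps_gain for f:
f = (1 - 1 / fps_gain) / (1 - 1 / bw_gain)
print(f"~{f:.0%} of frame time was bandwidth-limited")   # ~29%
# i.e. roughly 70% of the frame was spent GPU-bound, so piling on even
# more bandwidth (as the XTX does) hits diminishing returns fast.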




By MFer on 4/26/2007 12:43:01 PM , Rating: 2
I have a feeling it might be the new retail driver. Or at least it better be.




confused
By ultimaone on 4/26/2007 12:55:00 PM , Rating: 2
I took a look at the benchmarks of the 2900 XT from yesterday:

Gaming: Maximum Quality, 1280x1024

Game                       HD 2900 XT   8800 GTS 640MB
Call of Duty 2             73.5 FPS     56.7 FPS
Company of Heroes          92.1 FPS     90.1 FPS
F.E.A.R.                   84.0 FPS     83.3 FPS
Half Life 2: Episode 1     112.0 FPS    57.4 FPS *
Oblivion                   47.9 FPS     39.5 FPS
3DMark06                   11447        9836

Oblivion has half the frame rate of what's in the current test, and F.E.A.R. scored... higher...

Maybe Oblivion was tested inside vs. outside?

And you can see in Call of Duty 2 the ATI card is WAY ahead...
weird.

Also, I suspect those quad cores are not being used
properly or being fed enough (sorry if someone mentioned this before).

Whatever; if the 2900 XT is the same price as a GTS,
I'll get the ATI card.




By EastCoast on 4/26/2007 2:30:55 PM , Rating: 2
In all honesty, the scores posted appear to be nothing more than an overclocked HD 2900 XT, IMO. Guys, did you check the RAM to make sure it was GDDR4?




Damn AMD
By nkumar2 on 4/26/2007 2:48:02 PM , Rating: 2
Well, I will also wait for the official benchmarks, though these do seem legit to me. I have been waiting for this card since the day I returned my 8800 GTX to get my money back; spec-wise this card should blow the 8800 away. Honestly, that's what happens when Nvidia is more involved in the gaming community: almost every game I see has Nvidia's logo stamped on it in a cutscene before the game. I am sure ATI is involved too, but not as much as Nvidia.

Also, I read at vr-zone.com, which I always trust, a few weeks ago that the 2900 XT (not the XTX) was beating the 8800 GTX, so that's the only reason I am waiting for the official release.

From the day AMD bought ATI, it was heading for disaster; I wouldn't be surprised if AMD just focuses on Fusion and totally shuts down ATI. ATI would have been better off without AMD, and you hear about Barcelona getting better by the day, so one wonders where their resources are being spent. The architecture looks massive and powerful; I just guess it could be the drivers or some serious efficiency problems. And last but not least, Nvidia probably won't be dropping 8800 GTX prices if these benchmarks are indeed dead-on.




Welcome to Fantasyland
By steve001 on 4/26/2007 4:10:01 PM , Rating: 2
These numbers are CRAP. From the real-world numbers we have, and have confirmed with others that have cards, these numbers only exist in Fantasyland. I love the fact that they call the long cards XTX... LOL. Do some homework and tell me there aren't three XT versions.




By kilkennycat on 4/26/2007 4:29:34 PM , Rating: 2
With the continuing financial woes of AMD it seems that the joint AMD/ATi company cannot finance both leading-edge GPU development and CPU development. And it is obvious which one has to give...

According to Anandtech, the 8800 GPU cost $475 million to develop:

http://www.anandtech.com/video/showdoc.aspx?i=2870
(See bottom of page 5)

In spite of this huge development cost, nVidia makes a handsome net profit and has continued to do so for the past 4 years. ATi has just barely scraped by; last year was the only year in a very long time that the company made any profit. If the 2900 XT(X) does weakly in the marketplace and is unable to command a price premium (to cover development costs), AMD/ATi just cannot afford to spend that sort of money on a successor while their CPU line is under increasing, business-threatening pressure from Intel. BTW, nVidia is already well into development of the G80's successor.

So AMD may be faced with two undesirable but logical alternatives, especially if the other members of ATi's DX10/2900 family also turn out to be only moderately competitive:

(a) Put ATi up for sale and return to the CPU business.

(b) Redefine and tighten ATi's role as the AMD motherboard-chipset satellite, with moderate-performance integrated graphics, consumer TV decoders, and mass-market discrete-GPU technology. No expensive bleeding-edge development in GPUs.

If either of these events takes place, I see the top GPU designers at ATi jumping ship. I'm sure nVidia or Intel would give consideration to the more stellar ATi resumes.




Wow....
By BucDan on 4/26/2007 4:52:56 PM , Rating: 2
That's truly surprising. Even though the HD 2900 XTX is stronger on paper, it still loses to an overclocked 8800 GTX. I can now also see why the OEM builders won't pick up this card: the freaky 12" board and the expensive GDDR4. If I were an OEM builder, I wouldn't take an HD 2900 XTX, because I'd be getting the ugly 12" model; I'd rather do an HD 2600 series and so on.

Now I just want to see the HD 2600 series, compare it with an 8600-series card, and then consider getting one or the other.




DAMN my faith is gone.
By nkumar2 on 4/26/2007 4:56:18 PM , Rating: 2
If this is true... I had been going by the word that R600 was going to be a monster. I really think this card will show its strength in DX10, where the unified shaders will be used to their fullest; with the power it has, I think it is purely a DX10 powerhouse, whereas Nvidia's shader architecture is simpler and powers DX9 games more easily than ATI's shaders do. Or else I shall hail the almighty Nvidia and grab what's best. I freaking hate AMD; they have ruined a perfect battle between Nvidia and ATI. If it wasn't for them, R600 would have been here 6 months ago. I think AMD should just let the ATI people handle the graphics business, even if they are under the same company.




All hell breaks loose
By Saito on 4/26/2007 5:57:11 PM , Rating: 2
Can't this site go one bloody day without this infernal and deeply annoying "OMG lolzor my brand kicked your brand's butt" and whatnot?
If I read what some people write here... "Suck it ATI fanboys"
Jezus, Maria and Joseph :roll:




XT > XTX
By mac2j on 4/26/2007 7:24:44 PM , Rating: 2
You really have faith, posting numbers that show the 2900 XTX underperforming compared to the 2900 XT... sorry, I'm with everyone else here... shenanigans!




would be funny if...
By Jeff7181 on 4/26/2007 8:46:48 PM , Rating: 2
Would be funny if RIGHT before its release they're like, "Oh, by the way, its official clock speeds are 1250 MHz for the GPU and 1400 MHz (2800 effective) for the memory." :D




crossfire and sli?
By Armorize on 4/26/2007 10:11:06 PM , Rating: 2
No one has mentioned CrossFire vs. SLI yet... hmm... only one of each ATI card? I'm just curious whether their performance will be different in CrossFire or not. Also, these cards may have other neat little packages on board that are bogging things down; I read on the INQ that it has HDMI sound capabilities. I believe it could be that, because these aren't the real cards that will be released to the public, only reference cards. Just a thought.




Reason
By scrapsma54 on 4/27/2007 12:53:20 AM , Rating: 2
The previous tests were run under Vista; what OS were these run under? Remember, ATI said their GPU is built from the ground up for VISTA. In that case, I see something odd here, because yesterday's XT benchmarks were much better.




Drivers...
By viperleader on 4/27/2007 1:41:21 AM , Rating: 2
It's funny how ATI fans were talking up how great and polished the R600 drivers were going to be. As they should be... with a 6-month launch delay, they should be pretty good by now. Even ATI has talked up how good the R600 drivers are, not to mention they have admitted to having shippable cards for over a month now. You are just kidding yourselves if you are hoping the drivers are still buggy. Either that, or ATI can't code drivers for shit...

Another thing that's funny is the DX10 argument. Are we supposed to ignore R600 performance until DX10 games ship? It may do better against the 8800 GTX then, but what are you gonna play on it till that happens? Gonna stick with Minesweeper?

ATI is flat-out behind; they haven't made up any ground in the last six months. Nvidia can ship the next gen anytime they want, but there is no rush: they still own the high ground, and they can price their last-gen stuff as aggressively as needed to maintain market share.

Stop whining about factory-OC cards being compared to the nonexistent 2900 XTX. I can buy as many 8800 GTX OC cards as my Visa will allow, but I sure can't get an XTX. It only makes sense to compare ATI's highest-performing retail card against Nvidia's highest-performing retail card. What would it prove to do otherwise? By that logic, the XTX is just an overclocked XT with more memory and should be benchmarked at the XT's clock speeds.




seems to me
By AntDX316 on 4/27/2007 4:14:23 AM , Rating: 2
The ROPs are to blame: the 8800 GTS has 20, the 8800 GTX has 24, and the R600 has only 16.
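
If those ROP counts are right, the peak pixel fillrate math is simple to run (a sketch; ROP counts as the poster states them, clocks per the article and NVIDIA's stock specs):

# Peak pixel fillrate = ROPs x core clock.
cards = {
    "8800 GTS (20 ROPs @ 500 MHz)": (20, 500),
    "8800 GTX (24 ROPs @ 575 MHz)": (24, 575),
    "R600     (16 ROPs @ 750 MHz)": (16, 750),
}
for name, (rops, mhz) in cards.items():
    print(f"{name}: {rops * mhz / 1000:.1f} Gpixels/s")
# GTS 10.0, GTX 13.8, R600 12.0 -- the R600's higher core clock doesn't
# fully offset the missing ROPs, which hurts most at high res with AA.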




Wait, something's up
By SCAxman on 4/27/2007 5:13:51 AM , Rating: 2
Aren't those shots of HD 2900 XTs? If you guys have an XTX, post a close-up or two.




Why
By Heyhey on 4/27/2007 8:36:04 AM , Rating: 2
I don't understand why so many of you think that new drivers would do some kind of magic trick for this card. C'mon. It is only a slightly overclocked 2900 XT with a ridiculous amount of GDDR4 memory. Why should it be dramatically faster than the 2900 XT? I admit that the performance of the whole 2900 family may be improved with new drivers, but the gap between the 2900 XT and 2900 XTX will not change drastically.

You should just admit that this card is a marketing trick with no real value.




3DMark 06
By tonI on 4/27/2007 9:41:21 AM , Rating: 2
Can we see some ATI Radeon HD 2900 XT 3DMark 2006 scores ?




3DMark 2006 scores
By tonI on 4/27/2007 9:44:33 AM , Rating: 2
Can we see some ATI Radeon HD 2900 XTX 3DMark 2006 scores ?




DX10 maybe?
By Snipester on 4/27/2007 9:59:42 AM , Rating: 2
I realize I have no evidence to back up my claim, but I thought it was expected that DX9 wouldn't be that good on the R600 overall? I thought the killer app for that card was DX10? Unfortunately, there are no DX10 games, so it is hard for me to say without benchies.




Just hold your horses people
By tungtung on 4/27/2007 1:55:49 PM , Rating: 2
Just saw this on another forum:

http://www.vr-zone.com/?i=4931

So they bumped up the release date to May 2 for the 2900 XT. I'd say just wait till then before you all go crazy mad about why this and that. (I can't 100% vouch for the validity of the news, since I have not seen any other sites confirming this information yet; I'm only posting so that people will know.)




If these are accurate
By eman 7613 on 4/28/2007 7:48:10 PM , Rating: 2
If these results truly represent the to-be-released versions of ATI's new XT and XTX, what we should focus on is lowering clock settings. Since the XTX performed with less than a 1% difference in frame rate from the XT, one could speculate that lowering the clocks would still produce similar results (granted, there is a limit to how far that will go). If that is the case, ATI has actually produced an incredible piece of hardware with some funky attributes, one that could lead to rapid improvement in the quality of integrated graphics in laptops and low-end desktops.




AMD and ATI
By FakeDetector on 4/29/2007 7:26:31 AM , Rating: 2
Currently, AMD and ATI performance is really pathetic!!!

Hope they get back on track soon.




fps is an old metric
By derdon on 4/30/2007 3:05:54 AM , Rating: 2
Who cares about fps (= raw performance)!
The new metric that's going to become important is performance per watt. If I don't know how much energy I'll waste on ridiculous fps numbers, I can't make a serious buying decision.




I wonder if this was planned
By Ebbyman on 4/30/2007 9:33:25 AM , Rating: 2
My concern is whether the card DailyTech received from its "source" was a true 2900. DailyTech got the card second-hand, and I wonder if AMD has released some weaker cards to weed out sites that leak pre-release benchmarks. Before you flame me for being a fanboy or something: I have an 8800 GTX and an Intel Conroe, so don't go there. I just find it hard to believe that AMD's latest and greatest is so weak compared to its last series, considering the 8800s have been out for over 6 months (plenty of time to bench against internally).

I am sorry if someone already noted something similar to this; I have not had the time to go through all the posts.




By stance on 5/1/2007 4:22:52 PM , Rating: 2
This link might be fake benches, but look at it and tell me
what you think:

http://www.bilgiustam.com/?p=125

It would be funny, seeing how ATI has a lot of optimizing
left to do in their video drivers. We will see what we
see.




By yxalitis on 5/3/2007 6:58:37 PM , Rating: 2
OK, from compiling info over the past few months, this is my conclusion:

The long, 12.5" XTX card is an OEM card intended SOLELY for the Mac.
http://gizmodo.com/gadgets/peripherals/atis-radeon...
(sorry for the looong URL!)
It is not representative of the final XTX card. NO-ONE has seen any benchmarks of the 9.5" XTX product yet, and people are speculating wildly about a product in hand without knowing ATI's intent.
DailyTech has benched the XT, and it performed favorably against the GTS 640MB:
http://www.dailytech.com/article.aspx?newsid=7043
They ALSO benched a 12.5" card called XTX:
http://www.dailytech.com/article.aspx?newsid=7052
Why on God's good Earth would ATI release a premium card only 5 MHz faster than the lower card?
...because they aren't intended to compete with each other...
The 12.5" XTX board will not be available for sale unless you buy a Mac and pull it out (and why would you want to do that?)
The real XTX card has not been authentically benched yet, but it will be a 9.5" card.
There is even some speculation that the old 12.5" card was on the 80nm process, ran too hot, was underclocked, fitted with a massive heatsink, and flogged off to Apple to stick in their overpriced hardware (oh please, no one dispute that Macs are overpriced...!)

That's my researched conclusion for you... we'll know for sure once the NDAs are lifted!

Oh, it IS odd that DailyTech benched the XTX against an 8800 clocked HIGHER than the Ultra:

NVIDIA GeForce 8800 GTX (650 MHz core, 1000 MHz GDDR3)




680i Chipset with ATI?
By feetsniffer944 on 5/4/2007 6:58:28 PM , Rating: 2
Everyone knows that while nVidia GPUs perform well on the 680i chipset, ATI GPUs perform terribly with it, and with most nForce chipsets for that matter. I'm not convinced by those numbers. I'm stuck waiting until the card materializes and is tested on different chipsets better suited to ATI before I lay down any cash on the 2900 or the 8900.




Why not a Reference?
By Anub1s on 5/6/2007 7:05:38 PM , Rating: 2
Out of curiosity, why was an OC'd board used? I mean, you should have known there would be arguments about this showing up all over the place,
so why not use a reference board?




Guys there's no way this is it.
By Ryanman on 4/27/2007 12:23:00 AM , Rating: 1
If ATI had even an INKLING that performance would be like this, they would not even release it. Even if the hardcore fanboys bought it, R&D wouldn't be covered, and they wouldn't be able to live down the shame for generations.
There are multiple things that could have gone wrong; did anyone see the 8800 GTS frames for HL2? Spit happens with computers. Just wait a LITTLE BIT LONGER (the R600 theme song).




Better just cancel the XTX version
By Min Jia on 4/26/07, Rating: 0
What a Joke
By Ard on 4/26/07, Rating: 0
AMD fanboys make me laugh
By Min Jia on 4/26/07, Rating: 0
Risky
By paydirt on 4/26/07, Rating: -1
RE: Risky
By jazkat on 4/26/2007 6:28:20 PM , Rating: 2
Why don't we all just wait until May, when the proper driver for the 2900 series comes out? When I had my X1800 XT it was crap; then ATI released a driver that sorted out something to do with the memory, and then it was awesome. It's a bit unfair to bench the XTX against the GTX, because this site may cause ATI to lose money and probably make computer manufacturers not want to use it in their systems over premature benchmark numbers.
Something isn't right with that card. Anyone with a brain can see it has a 512-bit interface, GDDR4, and 320 stream processors, so in reality it should be faster. Methinks when ATI sorts this problem out we will see its true performance. I wonder what driver they actually used ????
8.136??? 8.37??? I'm gonna wait a bit and not take these benchmarks too seriously; also, maybe it will be better in DX10. Who knows, best we wait and see what ATI has to say.

There is no proof these benchmarks are real: only one photo, and no photos of system details to show what speeds they are running at. Why no screenies?? Why no photo evidence of fps and bench results??? And why did they only use an nVidia mobo? Why not two motherboards, one with an ATI chipset??
The pic shows a closeup of the XTX in an nVidia mobo that could have been taken anytime, anywhere. Just wait until someone respectable does the benches before you make your mind up.


RE: Risky
By paydirt on 4/26/2007 9:02:07 PM , Rating: 2
I don't know why you chose to reply to my post; I'm on your side of the fence. I also don't know why people downrated me. If these benchmarks prove to be B.S., would people still trust DailyTech? We'll find out soon.

My post is just an observation that DT really stuck their neck out.


RE: Risky
By GlassHouse69 on 4/27/2007 1:01:14 AM , Rating: 2
It all sounds whacked out, I agree. It is more than obvious that the drivers are really off for the 1-gig memory card.

My X1900 AIW card has 256 megs on it. It took 4 driver updates from ATI to get my card near the power of the X1900 XTs, which had 512 megs of RAM; there was an error in how the RAM was being utilized. Once it was fixed, I got a solid 20% or higher increase in frames in all of my games. Even games like Vampire: Bloodlines and BF2 didn't get bogged down randomly.

ATI makes really crappy drivers, it seems. This could be, and for some is, the reason they do not use ATI. The Linux drivers are eternally (purposely, I feel) terrible, and only one out of two Windows drivers is good.

I will still buy it, as ATI's VGA signals have, for the past 9 years, beaten nVidia's in 2D quality. I haven't tried the 8800 series, though, and I don't want to bother. Also, All-in-Wonder isn't an nVidia thing, and Avivo is the clearest for DVD viewing (the old ATI multimedia player for my 9800 was better than nVidia's offerings back then too). (Huge, pricey 22" Mitsubishi CRT I've got; analogue quality is the most important.)


RE: Risky
By jazkat on 4/30/2007 8:37:41 AM , Rating: 2
Hey, that's the thing: I ain't a fanboy. I have a GTX in my system as we speak. But the card DailyTech used is only a SAMPLE, which was given to AMD partners. Notice how it stays level with the XT? That's because at the moment the XTX samples use GDDR3. Don't you think it's strange no one else has had these cards to bench?? If DailyTech was honest, they would tell you the truth. The official word was that AMD partners would be getting GDDR3-only samples, and that makes a big difference: GDDR4 gives the XTX a massive 45 GB/s of bandwidth over the GTX, and that 45 GB/s is half of the GTX's bandwidth, because the GTX has only around 90 GB/s while the GDDR4 XTX will have nearly 130 GB/s. That's enough to trounce the GTX at high resolutions with the settings through the roof.
What makes you think DailyTech would have the GDDR4 card if AMD partners were only getting GDDR3 samples on 22.04.07? You have had the wool pulled over your eyes, and that's funny.
DailyTech has done this only because they know others can't comment, since after signing an NDA they can't say any different; it's a kind of windup on their part. No one has a GDDR4 XTX at the moment, otherwise everyone would be benching them.
2900 XTX "GDDR3" samples available to AMD partners 22.04.07
^^^^^
Don't believe stories until you have heard it from the horse's mouth >>>>> AMD

Good day


RE: Risky
By housecat on 4/30/2007 11:17:47 AM , Rating: 2
quote:
My post is just an observation that DT really stuck their neck out.


Yes, they did.

...because AMD refuses to.


RE: Risky
By bl1nd on 5/10/2007 8:10:20 PM , Rating: 2
HAHAHAHAHAHA, man, I'm crying from laughing so much. All those AMD/ATI fanboys making up stories and excuses, hahahahahahaha. Like the guy who said, what if that isn't a 2900 and it is a plan by ATI to release weaker cards? AHAHHAAHAHAHAHAHAHAHAHAHA.

It is so funny to see them making up stuff...

IT SUCKS. Even if it gets close to a GTX, it is going to suck: released after 2090925902 months, with crappy drivers, while Nvidia's drivers will be strong...



RE: Risky
By t5chris on 5/13/2007 7:25:53 PM , Rating: 2
This is the first time in a long time that I've seen benchmarks this lopsided, with one card "winning" under every condition. If this were the case, ATI would probably lose more money producing this card than by abandoning the product line completely. I'll bet money that when other sites start doing their own benchmarks in better-controlled environments, the numbers won't even be close to the same.


ATI just got owned by Nvidia
By IntelGirl on 4/26/07, Rating: -1
RE: ATI just got owned by Nvidia
By Min Jia on 4/26/2007 9:44:26 PM