


ATI Radeon HD 2900 XT puts up some impressive numbers in benchmarks

After several delays, AMD plans to launch its long-awaited R600 graphics processors. AMD is currently briefing select members of the press on its R600 architecture in Tunisia, but there is no embargo date on the R600 for DailyTech -- we can show you benchmarks now.

AMD plans to launch a completely new DirectX 10 lineup with the flagship ATI Radeon HD 2900 XTX. Other models such as the Radeon HD 2900 XT, Radeon HD 2600-series and Radeon HD 2400-series will also join AMD’s DirectX 10 family after the initial high-end launches.

AMD equips the ATI Radeon HD 2900 XT with 320 stream processors to take on NVIDIA’s GeForce 8800 GTS, which features 96 stream processors. However, AMD and NVIDIA have taken different approaches to their unified shader designs, so the stream processor counts are not directly comparable. AMD pairs the R600 GPU with 512MB of GDDR3 memory clocked at 1.65 GHz across an eight-channel, 512-bit memory interface. In comparison, the NVIDIA GeForce 8800 GTS features 640MB of 1.6 GHz GDDR3 memory on a 320-bit memory interface.
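
For context, those bus widths translate directly into peak theoretical bandwidth: the bus width in bytes multiplied by the effective data rate. A minimal sketch of the arithmetic in Python, assuming the quoted clocks are effective GDDR3 data rates:

    # Back-of-the-envelope peak memory bandwidth. Assumes the quoted
    # 1.65 GHz and 1.6 GHz figures are effective GDDR3 data rates.
    def peak_bandwidth_gb_s(bus_width_bits, data_rate_ghz):
        """Peak theoretical bandwidth in GB/s: bytes per transfer x transfers per second."""
        return bus_width_bits / 8 * data_rate_ghz

    print(peak_bandwidth_gb_s(512, 1.65))  # Radeon HD 2900 XT: ~105.6 GB/s
    print(peak_bandwidth_gb_s(320, 1.60))  # GeForce 8800 GTS:  ~64.0 GB/s

The XT gives up framebuffer size (512MB vs. 640MB) but takes a roughly 65 percent lead in raw bandwidth.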

AMD equips the ATI Radeon HD 2900 XT with a dual-slot, blower-type heat sink. Unlike the OEM Radeon HD 2900-series previously pictured, which is an 11.5” long card, the ATI Radeon HD 2900 XT makes use of a smaller cooler so everything fits within the 9.5” PCB. Although the R600 GPU supports HDMI audio and video output, the reference design only features dual dual-link DVI ports.

On to the benchmarks. The tests were conducted on an Intel D975XBX2 BadAxe2 motherboard with an Intel Core 2 Extreme QX6700 and 2x1GB of DDR2-800 memory. The operating system on the test system was Windows XP, with a fresh install before benchmarking each card. Testing of the AMD ATI Radeon HD 2900 XT was performed using the 8.361 Catalyst RC4 drivers, while the GeForce 8800 GTS used ForceWare 158.19 drivers.

All game tests were run with the maximum detail settings at a resolution of 1280x1024. Futuremark’s 3DMark06 was tested with the default settings. Although we ran the benchmarks on our own PC, we were not supplied a monitor capable of higher resolutions.

Gaming: Maximum Quality, 1280x1024

Game                     AMD ATI Radeon HD 2900 XT    NVIDIA GeForce 8800 GTS 640MB
Call of Duty 2           73.5 FPS                     56.7 FPS
Company of Heroes        92.1 FPS                     90.1 FPS
F.E.A.R.                 84.0 FPS                     83.3 FPS
Half Life 2: Episode 1   112.0 FPS                    57.4 FPS *
Oblivion                 47.9 FPS                     39.5 FPS
3DMark06                 11447                        9836

* Our benchmarks for Half Life 2: Episode 1 showed an abnormal framerate for the NVIDIA GeForce 8800 GTS card that scaled with lower resolutions -- we believe there was a copy error. We reran the tests this morning and achieved 119.2 frames per second with the GeForce 8800 GTS.

The following benchmarks were performed under SPECviewperf 9.

Workstation: Maximum Quality, 1280x1024

Benchmark                    AMD ATI Radeon HD 2900 XT    NVIDIA Quadro FX 5500 (G71)
Cadalyst C2006               314                          243
Autodesk 3ds Max v8 OpenGL   129                          101
Autodesk 3ds Max v8 D3D      342                          242
Catia 02                     56.73                        44.87
Maya 02                      224.59                       142.18

Expect AMD to take the wraps off its DirectX 10 product lineup in mid-May, with value, midrange and high-end models in tow. AMD’s flagship ATI Radeon HD 2900-series will have two models at launch – the ATI Radeon HD 2900 XTX and the HD 2900 XT. The ATI Radeon HD 2900 XTX models feature 1GB of GDDR4 memory while the lower HD 2900 XT features 512 MB.

The ATI Radeon HD 2900 XT is poised to have a street price approximately the same as the GeForce 8800 GTS, which currently has a suggested retail price of $449.


Comments

Impressive
By osalcido on 4/24/2007 5:46:52 AM , Rating: 3
those numbers are damn good for an equally priced competitor to the 8800gts.. let's hope that this is not another ati paper launch tho




RE: Impressive
By ashishmishra on 4/24/2007 5:58:18 AM , Rating: 3
Impressive numbers indeed, though the FEAR numbers somehow seem too low in comparison to the 8800GTS; maybe some driver issue is holding back performance there. Also the 8800GTS's score for Episode 1 seems low, unless maximum quality means something like 16X MSAA 16X AF.


RE: Impressive
By Hypernova on 4/24/2007 6:23:50 AM , Rating: 3
At 1280x1024 I would say it's CPU bottlenecked. Definitely needs more testing at high res.


RE: Impressive
By Roland702 on 4/24/2007 9:01:38 AM , Rating: 3
While I would normally consider this res to be bottlenecked, I would like to see a comparison at the higher resolutions to see how they fare. I would assume both cards to be bottlenecked at this resolution.

AMD/ATI also claims to have "free AA", but I would say they are using a bottlenecked resolution and turning on AA/AF as "free and not performance taxing"...


RE: Impressive
By retrospooty on 4/24/2007 10:00:32 AM , Rating: 3
yes... Today's cards can't be stretched until at least 1600x1200@4xAA.


RE: Impressive
By Slaimus on 4/24/2007 10:35:39 AM , Rating: 3
Something is not right here: these 1280x1024 frame rates are quite low for these super high end cards.


RE: Impressive
By Justin Case on 4/24/2007 2:00:49 PM , Rating: 3
I take it that "maximum settings" includes maximum AA, which means that in fact the card is rendering at much higher resolutions.


RE: Impressive
By Araemo on 4/24/2007 4:34:36 PM , Rating: 3
No, I don't know of any consumer-level card that still renders at a higher resolution for anti-aliasing. Every card I know of does multi-sampling (rather than super-sampling, which is what you are referring to).

Multi-sampling involves running PART of the rendering pipeline at a higher resolution, and then tossing out the rest for most pixels. Only pixels that are identified to be on a triangle boundary get rendered as more pixels (and then down-sampled into one pixel).

Both ATI and nVidia are trying to do partial texture anti-aliasing, but neither seems to do it all the way unless there is a transparency affecting the current pixel.

I would really like to see real SSAA again, but I don't expect it any time soon.

That all said: Yes, MSAA still incurs a performance hit, but it is much less than it would actually take to render the scene at a higher resolution and then downsample it to your screen resolution.


RE: Impressive
By Justin Case on 4/29/2007 12:34:55 AM , Rating: 2
Determining if a pixel lies at a polygon boundary requires a good deal of calculation. And with high polygon count models, a very significant number of pixels does lie on a polygon boundary. So 1280x1024 is likely to mean at least 4x that much in terms of samples per frame.

As long as you have good anisotropic filtering, there is no real advantage to full supersampling.
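
To put rough numbers on the sample counts being discussed, here is a quick sketch (illustrative only; it ignores that MSAA shades color for just a subset of those samples):

    # Samples per frame at 1280x1024 with 4x multi-sample AA.
    width, height, msaa = 1280, 1024, 4

    pixels = width * height        # 1,310,720 pixels on screen
    samples = pixels * msaa        # 5,242,880 depth/coverage samples

    print(f"{pixels:,} pixels -> {samples:,} samples at {msaa}x AA")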


RE: Impressive
By StarOrbiter on 4/25/2007 4:04:10 PM , Rating: 2
It is indeed ...


RE: Impressive
By puffpio on 4/24/2007 12:57:44 PM , Rating: 4
If it's CPU bottlenecked, shouldn't they both achieve around the same framerate, as both cards would be waiting on the CPU?

Some of those tests show this... but CoD doesn't look CPU limited since the framerates are so different.


RE: Impressive
By idconstruct on 4/25/2007 12:23:58 PM , Rating: 2
They were only talking about the FEAR benchmark... in which they DO have almost identical framerates.

It makes sense though, since FEAR is frequently bottlenecked by the CPU.


RE: Impressive
By tuteja1986 on 4/25/2007 3:07:46 AM , Rating: 2
Woo , My buddy is angry after i sent him a link... a month ago he bought a used FX 5500 for $1700 and now he sees it getting beaten up so badly by a $400 to $500 card. Anyways the card is looking very good and i would love to see DX10 benchmarks of games like Crysis, UT2007 and Alan Wake.


RE: Impressive
By ssidbroadcast on 4/25/2007 5:42:42 AM , Rating: 1
quote:
Woo , My buddy is angry after i sent him a link... a month ago he bought a used FX 5500 for $1700


lmao. Really wow? I'd have to be super-retarded to pay $1,700 for a FX 5500 ...

/giving the benefit of the doubt. U prbly meant something else.


RE: Impressive
By PlasmaBomb on 4/25/2007 8:12:43 AM , Rating: 2
He is referring to the Quadro line of professional cards for the likes of CAD, which cost a fortune.


RE: Impressive
By FITCamaro on 4/25/2007 10:52:11 PM , Rating: 2
Agreeing with you, and adding that ATI's FireGL line is also in the same high price range.


RE: Impressive
By Justin Case on 4/29/2007 12:42:29 AM , Rating: 3
Why would anyone buy a CAD card (optimized for ultra high polygon counts and low textures) to play games (that use medium polygon counts and lots of texture passes)...? It's like using a chainsaw to make sushi.


RE: Impressive
By Psychless on 4/30/2007 10:28:22 PM , Rating: 2
A gem bladed chainsaw that is.


RE: Impressive
By Conman530 on 5/3/07, Rating: 0
RE: Impressive
By Nightmare225 on 4/24/07, Rating: -1
RE: Impressive
By BZDTemp on 4/24/2007 8:13:10 AM , Rating: 3
Well so far we know nothing about the price - it's simply speculation so can it.


RE: Impressive
By Spoelie on 4/24/2007 9:06:57 AM , Rating: 4
Learn to read, AMD aims to have the street prices the same.
"The ATI Radeon HD 2900 XT is poised to have a street price approximately the same as the GeForce 8800 GTS"

The 499 number is only in reference to nvidia's suggested retail price. There is no such price yet for the ati card.


RE: Impressive
By Nightmare225 on 4/24/07, Rating: -1
RE: Impressive
By PlasmaBomb on 4/25/2007 8:16:01 AM , Rating: 2
If you read the article it suggests that the ati card is priced around $449.


RE: Impressive
By PlasmaBomb on 4/25/2007 3:15:06 PM , Rating: 2
A Dutch site has the HD 2900XT on preorder for 319 euros ex VAT!


RE: Impressive
By defter on 4/24/2007 6:44:39 AM , Rating: 4
Never compare the future price of product Y with the current price of product X.

It's quite obvious that currently the 8800 GTS 640MB and 8800 GTX are relatively expensive simply because there isn't any competition. The current price for the 8800 GTS 640MB is about $400 while the 8800 GTX costs about $550.

When the R600 becomes available, NVidia will introduce a new high-end card (the Ultra) and prices of existing cards will naturally decrease. The 8800 GTX will definitely cost less than $500 once the R600 is widely available. That's why those benchmark numbers aren't very exciting: sure, the R600XT is clearly faster than the GTS, but the gap between the GTS and GTX is also quite large. Where is the comparison against the GTX???


RE: Impressive
By KristopherKubicki (blog) on 4/24/2007 6:46:38 AM , Rating: 4
NVIDIA's current documentation says $999 for the Ultra. I think it's going to be very low volume though.


RE: Impressive
By BladeVenom on 4/24/2007 5:54:43 PM , Rating: 3
The Inquirer reported that it was going to be fewer than 100 cards.


RE: Impressive
By defter on 4/24/07, Rating: -1
RE: Impressive
By AnnihilatorX on 4/24/2007 8:33:12 AM , Rating: 5
You should not compare it to GTX.

R600XTX is the direct competitor to GTX, not R600XT


RE: Impressive
By Phynaz on 4/24/07, Rating: -1
RE: Impressive
By DingieM on 4/24/2007 10:28:14 AM , Rating: 4
No, it is not canned; there will be an XTX with GDDR4, next to the XT.
The XTX will be built on 65nm as well.
AMD is releasing ALL flavours on 65nm, and that includes the XTX.

The power and leakage issues were with the 80nm builds.


RE: Impressive
By Min Jia on 4/24/07, Rating: -1
RE: Impressive
By idconstruct on 4/25/2007 12:30:31 PM , Rating: 2
so basically what defter proved is that the r600xt (the second best r600) is roughly equal in 3dmark to the 8800gtx (the best nvidia has)

I've been waiting for this for so long :D

I'll probably end up saving for the r600xt... since it gives near-8800 performance and i probably won't be able to afford the r600xtx...


RE: Impressive
By kiwik on 4/24/2007 12:15:22 PM , Rating: 5
quote:
See any mention of an XTX?

It was canned due to power issues.


"AMD’s flagship ATI Radeon HD 2900-series will have two models at launch – the ATI Radeon HD 2900 XTX and the HD 2900 XT. The ATI Radeon HD 2900 XTX models feature 1GB of GDDR4 memory while the lower HD 2900 XT features 512 MB."

What part of that can't you read?


RE: Impressive
By KristopherKubicki (blog) on 4/24/2007 1:40:10 PM , Rating: 5
Unless something changed -- Sven got an XTX about 2 hours after Anh got the XT. Sven is benchmarking now and we should have that up this week.


RE: Impressive
By slacker57 on 4/24/2007 12:23:15 PM , Rating: 3
Um, yes. Illiterate much?

quote:
AMD’s flagship ATI Radeon HD 2900-series will have two models at launch – the ATI Radeon HD 2900 XTX and the HD 2900 XT. The ATI Radeon HD 2900 XTX models feature 1GB of GDDR4 memory while the lower HD 2900 XT features 512 MB.


Unless the article was edited in the last two hours. That would make me look like an ass. :)


RE: Impressive
By Min Jia on 4/24/07, Rating: 0
RE: Impressive
By Armorize on 4/25/2007 5:11:44 PM , Rating: 2
Wouldn't that be nice. At most they'll probably drop $50, unless they're going to play a behind-the-scenes Intel vs AMD war here. Although I wouldn't mind seeing a price war with high end video cards ;).

Hopefully we'll see more benchmarks as we get closer to the paperlaun... I mean launch of the 2900. Here's to hoping though.


RE: Impressive
By DTAllTheBest on 4/28/2007 3:18:29 PM , Rating: 2
That's right.
Maybe the 8800 GTX is better if compared to this.


RE: Impressive
By GoatMonkey on 4/24/2007 9:58:25 AM , Rating: 1
...most impressive.


RE: Impressive
By ajfink on 4/24/07, Rating: 0
RE: Impressive
By TheDoc9 on 4/24/07, Rating: -1
RE: Impressive
By Polynikes on 4/24/2007 11:18:58 AM , Rating: 3
It's not the top of the line card. The 2900 XTX is the 8800 GTX's competitor.


RE: Impressive
By TheDoc9 on 4/24/2007 6:36:13 PM , Rating: 2
Ahh, I see now the extra X at the end of the name. That's the one I want to see.


RE: Impressive
By BucDan on 4/24/2007 1:47:08 PM , Rating: 2
im liking the benchmarks...seems impressive...im not a guy with a pocket full of cash but, im lookin forward to a hd 2600 model


RE: Impressive
By Armorize on 4/24/2007 6:07:13 PM , Rating: 2
Paper launch? What's a paper launch? ATI never lies about release dates and releases them 6 months later, pfh. Let's hope that since AMD hopped on the boat there will never be a paper launch ever again DAMMIT =P


RE: Impressive
By cocoviper on 4/25/2007 12:37:15 AM , Rating: 2
agreed. Especially when you consider Nvidia has had a good 6 months to refine the 8800 drivers whereas the R600 drivers have got to be at least somewhat immature.


RE: Impressive
By Spoelie on 4/25/2007 6:43:26 AM , Rating: 2
Not necessarily, as ATi's team has been familiar with the architecture in the R600 since the Xenos GPU. Since this is their second stab at this, they should conceivably have a bit of an advantage.


RE: Impressive
By Goty on 4/25/2007 8:50:04 AM , Rating: 2
I think there's just a slight difference between programming for a console and programming for a computer running a complex OS.


RE: Impressive
By FITCamaro on 4/25/2007 10:58:11 PM , Rating: 2
Yes but ATI has a bit more experience with fully programmable shaders (what we're calling stream processors) since it's had a GPU out with them for over a year longer than Nvidia.

I'm not saying ATI's drivers won't need to mature. But ATI does have an advantage in experience.


RE: Impressive
By boe on 4/25/2007 10:24:28 AM , Rating: 2
The street price for the 640mb 8800 gts is now about $360 so these are pretty good marks for a mid level card.

What I'd love to see is a comparison of their two top cards. Actually a great comparison would be these marks, the two top cards, and the two top DX9 single GPU cards since we aren't doing DX10 benchmarks anyways.


RE: Impressive
By veloxletalis on 4/27/2007 12:16:35 AM , Rating: 2
This is hilarious considering the nearly $200 cheaper 320mb GTS outperforms the R600 and 640mb GTS at that resolution. The ATI fanboys think they are getting such a great deal, when all they are getting is a higher price and less frames, all more than 6 months after Nvidia released their proven superior product.


RE: Impressive
By DingieM on 4/27/2007 7:32:41 AM , Rating: 2
Are you stupid or something??
The price is not fixed and these are only release candidate drivers. Remember AMD can drop the price significantly due to 65nm parts.
nVidia (should have) had enough time, say 6 months, to deliver mature drivers.
Expect AMD to vastly improve the driver speed of this beast.
And did you see the scores for the workstation graphics applications? That XT (with release candidate drivers) utterly crushes the GTS. Look again if you've missed that.
Fully integrated ability to stream (video and sound) via HDMI is already a reason to buy this card or a slightly cheaper one of the R6xx generation.
Now where is nVidia at that??

You are bluffing as hell; don't expect a $200 cheaper 320MB GTS to outperform either the 640MB GTS or the XT unless you overclock it to the maximum. We are talking about stock speeds.

So your post says absolutely nothing.
If I were a vendor recommending a graphics card I would definitely NOT recommend the nVidia, based on all the specs of the card speed- and feature-wise, and I'm being objective here.

I wouldn't buy the XT because I don't want to spend the money on it and there are no DX10 games yet (wait at least half a year). Another reason is the heavily bloated Windows Vista that totally sucks.


RE: Impressive
By veloxletalis on 4/27/2007 6:08:33 PM , Rating: 2
Stop with the insults, fanboy.
Look at the benchmarks of the 320MB GTS and the 640MB GTS: at any res at or below 1600, the 320MB GTS outperforms it because of higher clock speeds. With a stock 320MB GTS at the above tested resolution I get upwards of 15FPS better than the XT on all the above games. All for over $200 less.
Now head on over to the newest news post, fanboy, and see your precious little XTX get smashed by the GTX.


Half Life 2: Episode 1 Numbers
By Schmeh on 4/24/2007 6:12:59 AM , Rating: 2
Maybe I am retarded, but in Anandtech's review of the 8800GTX and GTS, the 8800GTS scores over 120FPS in HL2: Episode 1 at 1600x1200. But here it is only listed at 57.4FPS at 1280x1024. Something isn't right about that.




RE: Half Life 2: Episode 1 Numbers
By KristopherKubicki (blog) on 4/24/2007 6:20:20 AM , Rating: 2
We're aware of the discrepancy and we were a little alarmed when it scaled with resolutions as well. I recommend taking that particular benchmark with a grain of salt.


RE: Half Life 2: Episode 1 Numbers
By MartinT on 4/24/2007 6:29:09 AM , Rating: 2
Besides that HL2 issue, could you clarify exactly what "MAXIMUM QUALITY" entails, if it includes AA and/or AF, and, if yes, at what settings?


RE: Half Life 2: Episode 1 Numbers
By KristopherKubicki (blog) on 4/24/2007 6:31:42 AM , Rating: 4
I've told Anh to reply to this particular thread when he wakes up. He can better answer that.


RE: Half Life 2: Episode 1 Numbers
By MartinT on 4/24/2007 6:34:35 AM , Rating: 3
Thank you!


RE: Half Life 2: Episode 1 Numbers
By Anh Huynh on 4/24/2007 11:30:06 AM , Rating: 6
The quality settings for the games were as follows:

Call of Duty 2 - Anisotropic filtering, 4xAA (in game), V-Sync off, Shadows enabled, a high number of dynamic lights, soften all smoke edges and an insane amount of corpses.

Company of Heroes - High shader quality, High model quality, Anti-aliasing enabled (in game), Ultra texture quality, high quality shadows, high quality reflections, Post processing On, High building detail, High physics, high tree quality, High terrain detail, Ultra effects fidelity, Ultra effects density, Object scarring enabled and the model detail slider all the way to the right.

F.E.A.R. - 4x FSAA (in game), maximum light details, shadows enabled, maximum shadow details, soft shadows enabled, 16x anisotropic filtering, maximum texture resolution, maximum videos, maximum shader quality.

Half Life 2: Episode 1 - High model detail, high texture detail, high shader detail, reflect all water details, high shadow detail, 4x multi-sample AA (in-game), 16x anisotropic filtering, v-sync disabled, full high-dynamic range.

Pretty much as high as you can crank it.


RE: Half Life 2: Episode 1 Numbers
By MartinT on 4/24/2007 2:36:34 PM , Rating: 3
... and thank YOU, too!


RE: Half Life 2: Episode 1 Numbers
By Furen on 4/24/2007 7:09:29 PM , Rating: 2
Shadows in FEAR, no wonder...


RE: Half Life 2: Episode 1 Numbers
By PrezWeezy on 4/25/2007 7:06:28 PM , Rating: 2
I'm still a little new to DailyTech but I noticed that some people have the little DT next to their name and there are some posts with ratings of 6 but if someone has a rating of 5 my "worth reading" goes away. Sorry it's off topic, I was just wondering.


RE: Half Life 2: Episode 1 Numbers
By Goty on 4/25/2007 7:37:53 PM , Rating: 2
Those would be the DailyTech writers (i.e. the ones with the DT and the rating of 6).


By Anh Huynh on 4/26/2007 1:13:14 AM , Rating: 2
We rate posts that are worthwhile, with information that's beneficial to our readers, with a 6. Only DT staff can rate posts with a 6.


bah
By yacoub on 4/24/2007 7:51:00 AM , Rating: 2
quote:
The ATI Radeon HD 2900 XT is poised to have a street price approximately the same as the GeForce 8800 GTS, which currently has a suggested retailer price of $499.


You must mean $299-399, which is what 8800GTSs sell for these days (that range covers 320MB to 640MB).

If that thing costs more than $375, I'd rather have the GTS, sorry. The cooler on that thing looks like the typical ATi loud jet-engine design, and it probably runs just as hot as the X1800 and X1900 series cards, because they clearly didn't learn their lesson to make quieter and cooler cards that last longer.

They're going to drive me to buy an 8800GTS, aren't they? :(




RE: bah
By DublinGunner on 4/24/2007 8:15:37 AM , Rating: 2
I think a few people are forgetting thats the XT, NOT the XTX.


RE: bah
By yacoub on 4/24/2007 8:58:13 AM , Rating: 1
so you're saying the XTX will cost even MORE.


RE: bah
By Goty on 4/24/2007 8:23:59 AM , Rating: 2
Notice where he said suggested retail price.


RE: bah
By someguy123 on 4/24/2007 8:26:10 AM , Rating: 2
It's talking about MSRP, not pricing after the fact.


RE: bah
By DingieM on 4/24/2007 9:15:13 AM , Rating: 2
I've read multiple times that the new ATI coolers were going to be as good as nVidia's for the R6xx generation.


RE: bah
By GoatMonkey on 4/24/2007 10:03:06 AM , Rating: 2
If you're worried about the design of the cooler then don't get a copy of the reference board. I guarantee that there will be other cooler designs available from other companies.


RE: bah
By Omega215D on 4/24/2007 4:03:31 PM , Rating: 2
I dunno about you but to me the cooler on the HD 2900XT looks similar to the one on the 8800GTS/GTX.


Stream processor(s)
By Stele on 4/24/2007 6:20:04 AM , Rating: 3
It's curious to note that for more than 3x the number of stream processors of the 8800 (320 vs. 96), the new ATI card delivers on average roughly 1.5x the performance of the NVIDIA card. As such, I'm very eagerly waiting for Anandtech's detailed study of the R600's architecture when the time comes... For one thing, I wonder if the 320SPs could be referring to some form of 'virtual SPs' instead of 'true SPs', if such a thing were possible - especially considering the amount of memory bandwidth that would be needed to keep the SPs fed vs. that which seems to be actually available on the R600. I guess time will tell :)




RE: Stream processor(s)
By encia on 4/24/2007 6:29:59 AM , Rating: 3
Note that G80’s shaders are double-pumped, i.e. twice the clock speed of the rest of the GPU. It's effectively 192 shaders.


RE: Stream processor(s)
By MartinT on 4/24/2007 6:39:44 AM , Rating: 2
quote:
For one thing, I wonder if the 320SPs could be referring to some form of 'virtual SPs' instead of 'true SPs'

I fully expect R600 to have a Vec4+Scalar architecture (like, say, the Xbox 360 chip), and that AMD used some "creative math" to arrive at the "320 shader processors" bullet point.


RE: Stream processor(s)
By James Holden on 4/24/2007 6:42:50 AM , Rating: 2
I thought I read somewhere that it was 320 shaders, but that the shaders were 4-way or something. I just assumed that meant 80 physical 4-way shaders.


RE: Stream processor(s)
By Dactyl on 4/25/2007 11:18:14 PM , Rating: 1
quote:
I fully expect R600 to have a Vec4+Scalar architecture
I fully expect your expectations to be WRONG.

R600 has Vec5. The Inq says so, that means it's true.
http://www.theinquirer.net/default.aspx?article=39...


RE: Stream processor(s)
By AnnihilatorX on 4/24/2007 8:34:22 AM , Rating: 2
The 320 stream processors in ATI's offering run at a slower clock speed. While in the 8800s they run at 1.4GHz, I think the ones in ATI's only run at 800MHz or so.

That explains the difference
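
A crude way to put numbers on that argument: multiply stream processor count by shader clock. This is a sketch only; the 742 MHz XT core clock is quoted later in this thread, and the ~1.2 GHz figure for the 8800 GTS's double-pumped shader domain is an assumption, not a confirmed spec.

    # Crude shader issue-rate comparison: SP count x shader clock (GHz).
    # Clocks are assumptions: 742 MHz for the HD 2900 XT (quoted in this
    # thread) and ~1.2 GHz for the 8800 GTS's shader domain.
    r600_rate = 320 * 0.742   # ~237 billion SP-cycles per second
    g80_rate = 96 * 1.2       # ~115 billion SP-cycles per second

    print(f"ratio: {r600_rate / g80_rate:.2f}x")  # ~2.1x, not the 3.3x raw SP counts suggest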


For engineering, this card might be incredible...
By Toolpost on 4/24/2007 9:17:44 AM , Rating: 4
What I'm really interested in is that this card has much better numbers than the NVIDIA Quadro FX 5500 when testing "engineering" features, and the 5500 retails for ~$2.5K! If the numbers hold for real-world applications, I'll be pulling the FX 3500 from my Dell 690 and throwing one of these in.

David




By Anh Huynh on 4/24/2007 8:35:27 PM , Rating: 2
The card is an early sample that had all the workstation features enabled. However, we'll reveal more on upcoming workstation products later ;)


By Toolpost on 4/25/2007 12:40:32 AM , Rating: 2
Hi Anh,

Looking forward to it! If I remember correctly, it's been easier to unlock the workstation features of the gamer ATI cards than the Nvidia cards; hope that tradition continues! (ATI engineers, you didn't read this...)

David


By Anh Huynh on 4/25/2007 11:11:34 AM , Rating: 2
The NVIDIA products have historically been a lot easier to unlock, whether it's a BIOS flash or simply using RivaTuner to enable a few strings.

I've unlocked a few 6800GTs, 6800 non-Ultras and FX5700s myself for CATIA and Pro/E use. In the past, GeForce 2 MXs, GeForce 5-series and 7-series cards were easy to unlock.


By Toolpost on 4/25/2007 10:35:15 PM , Rating: 2
Hmm - I'll adjust my perceptions. I remember reading about it being fairly easy to make some of the 68xx cards into their Quadro equivalents, but I'd thought that later cards were more difficult to unlock. And I thought I'd read that it was easier to make ATI cards into the Fire versions, but I can't trace down where I saw this.

I'll still hope it's possible with this new card, as the specs look great.

David


Why 1280x1024?
By Breogan on 4/24/2007 6:04:25 AM , Rating: 2
Why benchmark them at 1280x1024? If I wanted a card to run stuff at that resolution, I'd rather buy an ATI X1950Pro and save $300. These cards are meant to run games at 1920x1200 or higher.




RE: Why 1280x1024?
By KristopherKubicki (blog) on 4/24/2007 6:11:05 AM , Rating: 3
We had the misfortune of getting one-on-one time with the card but there were no displays capable of higher resolutions at the facility. We'll get more cards and benchmarks later this week.


RE: Why 1280x1024?
By Breogan on 4/24/2007 6:13:49 AM , Rating: 2
Great, can't wait to see further testing on them!


RE: Why 1280x1024?
By JarredWalton on 4/24/2007 6:29:17 AM , Rating: 2
Besides higher res benches, it would be nice to have an 8800 GTX in the table as well. Seems like this card could beat out both 8800 chips in a lot of games.


RE: Why 1280x1024?
By Omega215D on 4/24/2007 4:07:13 PM , Rating: 2
I'm guessing there'll be Crossfire results as well, to get an idea of how much effect two would have for, say, a 30" display? I'm just curious, and Crysis is almost due for launch.


512 vs 640
By Stan11003 on 4/24/07, Rating: 0
RE: 512 vs 640
By elmikethemike on 4/24/2007 8:12:05 AM , Rating: 1
If the benchmarks are done at a higher resolution, you're likely to see ATI's lead increase rather than disappear. That little bit of extra memory is meaningless. The fact of the matter is the ATI card is better all around spec-wise, except for the amount of memory.


RE: 512 vs 640
By bunnyfubbles on 4/24/2007 9:15:36 AM , Rating: 1
yeah, pretty sure bandwidth is going to trump capacity in just about every instance in this scenario


RE: 512 vs 640
By Spoelie on 4/24/2007 11:19:23 AM , Rating: 5
As long as there is enough memory, the huge bandwidth advantage of the ATi will mean that the lead actually becomes larger, not smaller. I haven't seen any games yet that need more than 512mb at 1600x1200+

You might have a point for 30" panels, but testing needs to be done first
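
A back-of-the-envelope framebuffer estimate supports that point (illustrative only: 4 bytes each for color and depth per sample, with textures and driver overhead ignored):

    # Rough framebuffer footprint: color + depth per sample, with MSAA
    # multiplying the per-pixel sample count.
    def framebuffer_mb(width, height, msaa=4, bytes_per_sample=8):
        return width * height * msaa * bytes_per_sample / 2**20

    print(framebuffer_mb(1600, 1200))  # ~58.6 MB at 1600x1200 with 4x AA
    print(framebuffer_mb(2560, 1600))  # ~125.0 MB on a 30" panel

Even the 30" case leaves most of 512MB free for textures, which is why capacity rarely binds before bandwidth does at these settings.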


RE: 512 vs 640
By mendocinosummit on 4/24/2007 11:26:21 PM , Rating: 2
If someone has a 30" screen most likely they will have the XTX though.


RE: 512 vs 640
By Zoomer on 4/24/2007 6:57:47 PM , Rating: 2
At higher resolutions, nvidia's performance should fall faster than the R600's.


512-bit
By Kaleid on 4/24/2007 6:57:46 AM , Rating: 2
For a 512-bit offering the benchmarks don't seem all too encouraging.
But I'll go with ATi once again when I upgrade later this year.




RE: 512-bit
By otispunkmeyer on 4/24/2007 7:56:16 AM , Rating: 1
i think they have gambled here; games are becoming less and less reliant on huge textures and the like... and more focused on compute-intensive stuff like shaders.

i really think piling on the memory bandwidth isn't going to provide the returns that would make it worthwhile


RE: 512-bit
By Sahrin on 4/24/2007 2:51:52 PM , Rating: 3
Games are more dependent on textures than ever before - hence the ever-increasing video memory, and the reason HL2 looks great on all systems (great texture work - not lots of triangles and shader magic).


RE: 512-bit
By ssidbroadcast on 4/25/2007 5:59:24 AM , Rating: 2
You're forgetting that lighting technology has essentially tripled the number of texture files. Previously we only needed 1 map (diffuse); now shaders call for 3 or more maps (bump, normal/displacement, etc.)

The only thing getting smaller, arguably, are the physical meshes. (Master Chief, for example, is more polys in Halo 1 than he is in Halo 2.)


RE: 512-bit
By bunnyfubbles on 4/24/2007 9:11:23 AM , Rating: 2
what do you expect from 1280x1024 and little to no information regarding AA/AF used?


HD2900XT OC ability...
By EastCoast on 4/24/2007 9:03:26 AM , Rating: 2
I wonder how well the HD 2900XT scales in OCing? IMO, you will get more use out of the 512-bit memory interface at a higher OC. But it may take some water cooling to do it. IMO it may compete with (if not beat) the GTX.




RE: HD2900XT OC ability...
By James Holden on 4/24/2007 2:20:36 PM , Rating: 2
I concur.


RE: HD2900XT OC ability...
By sviola on 4/24/2007 3:07:40 PM , Rating: 2
With 240W+ at stock, you'll need to run it inside a fridge if you overclock it.


RE: HD2900XT OC ability...
By Scabies on 4/24/2007 3:10:46 PM , Rating: 2
current wattage numbers are inaccurate. Could be higher (heaven forbid) and could be lower. Besides, I think you are detailing the req's of the XTX instead of the XT (reviewed above)


RE: HD2900XT OC ability...
By EastCoast on 4/24/2007 8:03:32 PM , Rating: 2
Core Clocks
By Kougar on 4/24/2007 9:38:49 AM , Rating: 2
Can you tell us what the core clock frequency was for this card? If so that would be great. :)




RE: Core Clocks
By Pitbulll0669 on 4/24/07, Rating: 0
RE: Core Clocks
By Pitbulll0669 on 4/24/2007 10:01:00 AM , Rating: 2
This will do HD with an HDMI-to-DVI cable, so one will run video and the other will run audio... plain and simple! If the card can do it on board there is no reason for a sound card anymore. They just single-handedly destroyed the sound card market! Nvidia better hope this doesn't work or they are SCREWED! For one, I'll be getting the XTX version and then running it through my 47in LCD! Nice! I can't wait.. Peace.


RE: Core Clocks
By Spoelie on 4/24/2007 11:31:20 AM , Rating: 2
Clocks are set at 742 MHz for the GPU


RE: Core Clocks
By Kougar on 4/24/2007 12:35:47 PM , Rating: 2
Really?? Then this card is still in the running... It was stated by ATI that performance is HIGHLY dependent on the core clock speed, which is why they wanted to eke out every last MHz they could manage.

It seemed to me that 800MHz or more was going to be the core clock for launched cards, with higher clocks via Overdrive + the 8-pin PCIe connector being possible.


RE: Core Clocks
By Spoelie on 4/25/2007 6:38:35 AM , Rating: 2
Well, if memory isn't the only distinguishing factor between the XT and XTX, the latter may well have a core clock at or in excess of 800MHz.. the possibility is still there ^^


By scrapsma54 on 4/24/2007 4:41:10 PM , Rating: 2
I knew I should have held off on getting an SLI mobo. Let's just hope this isn't a paper launch.




By scrapsma54 on 4/24/2007 4:49:24 PM , Rating: 2
Btw, the reason the GTS did better with Half-Life is because Half-Life uses all of the available video memory. Half-Life makes heavier use of textures than of shaders, therefore the GTS is much more effective. Oblivion uses nothing but shaders (figure of speech), so ATi must have reworked something in their setup for it to run its shaders 10 frames faster. Hello Crossfire.


By Goty on 4/25/2007 8:58:13 AM , Rating: 2
Yeah... the GTS almost had its framerate doubled by the XT....


By Mojo the Monkey on 4/27/2007 11:34:20 AM , Rating: 2
since the new drivers came out, aren't you able to use 2 ATI cards in Crossfire on any motherboard with 2 PCI-E slots?

does an SLI board specifically stop 2 x ATI from operating correctly?


CPU Limited?
By Egglick on 4/24/2007 8:24:53 AM , Rating: 2
As others have said, it could very well be that the 2900XT is CPU limited at 1280x1024. Wait until we get more benchmarks before jumping to conclusions.




RE: CPU Limited?
By DingieM on 4/24/2007 9:28:15 AM , Rating: 3
According to one reading, this card is supposed to be less dependent on the bandwidth to the main CPU because more geometry and/or pixel-generating hardware is inside the GPU now.
This also has to do with DX10, so this new generation of DirectX may see much faster graphics due to fewer bottlenecks from the CPU.
That makes bandwidth to video memory inside the GPU even more crucial than it was in the DX9 generations. This is why ATI/AMD is rumoured to be much better in DX10 than nVidia will ever be. But we will see, about one year later when *some* games developers create DX10 games...


RE: CPU Limited?
By GoatMonkey on 4/24/2007 10:32:42 AM , Rating: 2
quote:
This is why ATI/AMD is rumoured to be much better in DX10 than nVidia will ever be

If "ever" means the next year or so, then you're probably right.


AGP Version?
By kdog03 on 4/24/2007 1:20:13 PM , Rating: 2
I know I'm going to get fried for asking this, but are we going to get an AGP version of any of these cards?




RE: AGP Version?
By hellokeith on 4/24/2007 3:03:55 PM , Rating: 2
Ditto.

My X1950 AGP is already somewhat cpu-limited, even w/ P4 3.2GHz, but getting a mainstream or performance DX10 AGP card would be a great last card for my trusty P4 system. Hopefully HIS will make an AGP version.. we'll see.


RE: AGP Version?
By Omega215D on 4/24/2007 4:12:46 PM , Rating: 2
I don't think AGP can supply that much power compared to a PCIe slot and that's including the use of power from the PSU PCIe connectors as well. I guess it's almost time to dump the "legacy" interfaces.

For now it looks like you can choose a GeForce 7900GS or Radeon 1950Pro, either way both cards are more expensive than their PCIe counterparts (of course you would need a new mobo anyway.)


By Darkskypoet on 4/24/2007 1:40:24 PM , Rating: 2
Now I know almost everyone is looking at the gaming benchmarks, and rightfully so. However, if you look at the past-gen comparisons it seems quite evident that the older ATI architecture was more "fill rate" limited than nVIDIA's. Namely, that while ATI could dress up the pixels with eye candy far more efficiently (as in shader ops per pixel), Nvidia's last gen could churn out more simple pixels.

If you look at the workstation benchmarks from above, I think again you'll see this almost holds true here. Gaming benchmarks rely far more on fillrate than workstation benchmarks do. Now in this case the R600 is not lagging at all; however, in some of the benchmarks it's been noted that perhaps the R600 should be trouncing the 8800 even more.

Take a look at the workstation D3D benchmarks, however, and the R600 devastates the 8800. Simply put, in the early stages of driver development, and on this as-of-yet-unknown process, the R600 completely destroys the 8800 in all Direct3D workstation benchmarks. Catia, utilizing OpenGL, shows a still-recurring weak spot in ATI implementations vs. nVidia: OpenGL.

If ATI has implemented their 320 stream processors in the form of 4 x 80, then we should see a case whereby 4 ops per pixel could be implemented essentially freely; however, you'd then be limited by that same 80 number as to how many independent pixels / entities you were working with per clock. (Heavily estimated - more likely less; I am just illustrating an idea on architecture.) In this case the architecture would be "fill-rate" limited (again taking a term that may not 100% apply, but the idea is the same).

If this is the case, then I would expect the R600 to be less dominant in cases with x amount of dumb / plain pixels, and more dominant in cases of x amount of complex pixels (shader ops, AA, AF, etc.). Much like last gen.

Also, this would explain the reliance on core clock for performance, as it would directly impact the number of pixels (theoretical max) pushed through the architecture per second, much more than with the 8800.

If my basic understanding of Nvidia's architecture is correct, they have gone a different path, creating more (using the 80 sp number) but simpler shaders; this would give an edge in simple pixel production, but would require more of these simpler shaders for complex pixels. (For complex pixels it may be more accurate to use the 320sp number for comparison, as then nVidia shaders must work together to achieve the same effect as the ATI architecture. 4:1)

Any extra input?




By gescom on 4/24/2007 2:45:23 PM , Rating: 2
hmm... how about the Maya benchmark - 224.59 vs. 142.18. If that is true then ATi's new card is more revolution than evolution. Can you imagine ATi's middle card performing like dual nvidia Quadro 5500s?
whoaa....


By Darkskypoet on 4/24/2007 3:13:10 PM , Rating: 2
I am not aware of this for sure, but is the Maya chunk also utilizing OpenGL? Edit: (checked, and it is... my bad.) If so, perhaps it's the specific libraries being used, or simply immature drivers not yet fully OpenGL optimized (going off the D3D vs OpenGL 3DSMax benches listed above, vs the differences between the R600 and 8800 in other OpenGL benches).

In both cases, the movement to unified shaders would be revolutionary... by def'n I think. :)

On a side note, I would seriously hope ATI is better at unified shaders than nVidia, considering this isn't the first US graphics implementation done by ATI, whereas it was nVidia's first.


more interested in midrange
By Visual on 4/24/2007 8:02:29 AM , Rating: 2
personally i am more interested in ati's performance in the $150 segment.. seeing how they can match nvidia's dx10 cards there with their last gen, there is a potential for good surprises.




RE: more interested in midrange
By Goty on 4/25/2007 8:59:13 AM , Rating: 2
I feel the same way. That's where the money is.


Sound and Physics
By aftlizard01 on 4/24/2007 1:10:42 PM , Rating: 2
Since the processor supports sound, but the reference board only came with dual-link DVI, does that mean it came with the sound disabled? Or was the sound still eating up clock cycles even though it was inaccessible? (Or a third question: is the sound able to be fully disabled for those who want to use a standalone solution?)

On the physics, were you guys able to see any noticeable difference in objects staying on screen during the tests, or is that something that is unable to be tested?

I think, other than outperforming at the same price point, the two other big selling points will be physics and sound, and I would like to hear about any impressions or opinions on them.




RE: Sound and Physics
By oopyseohs on 4/24/2007 2:52:39 PM , Rating: 2
The 8.361 drivers do not include the files for the HDMI Audio. The hardware capability thing pops up, but it does not find files to install.


If I may offer a suggestion
By Chaotic42 on 4/25/2007 12:03:20 AM , Rating: 2
As far as I am concerned, I'd really like to see high resolutions in video card reviews. How well do the cards do at 1920x1200 or 2560x1600? Maybe throw a 1600x1200 in there. I can't imagine spending $300+ on a video card to use it to game on a 1280x1024 monitor.




RE: If I may offer a suggestion
By Chaotic42 on 4/25/2007 12:05:13 AM , Rating: 2
Oops, I completely missed that this was explained. Sorry!


Slight dissapointment
By Jaegs on 4/24/2007 5:55:55 AM , Rating: 2
A good showing for ATI, but after making me wait this long to upgrade I was hoping for a larger margin of victory.

Here's hoping that the XTX tools the GTX.




Can't Wait!
By 457R4LDR34DKN07 on 4/24/2007 5:57:08 AM , Rating: 2
I'm building a new rig around this monster




AMD are back at last
By mrteddyears on 4/24/2007 6:01:17 AM , Rating: 2
I can’t wait to get my hands on one and I am happy I waited. Those figures are really good, especially for the gaming rig I am building.




Sweet
By 5150Joker on 4/24/2007 6:07:30 AM , Rating: 2
Hopefully AMD gets these out in quantity asap. nVidia's 8800 GTS is going to be in trouble if both are priced equally.




Process
By jay2o01 on 4/24/2007 10:08:27 AM , Rating: 2
Was the reference card you benchmarked 65nm? 65nm may give the R600 more headroom when it comes to overclocking; a die shrink should mean less power consumption/heat. Get it to market already, AMD/ATI...




All around good news.
By VooDooAddict on 4/24/2007 10:13:07 AM , Rating: 2
Good to see some competition to the 8800s.

Now we can easily recommend any of the ATI X2900 or NVIDIA 8800 series.

Prices should start coming down a little on the DX10s and maybe we'll see some more DX10 features start filtering into some games.




Finally
By jlanders646 on 4/24/2007 1:02:24 PM , Rating: 2
I'm just happy it's finally here; where can I buy mine!!




Mid May Launch
By just4U on 4/24/2007 1:22:34 PM , Rating: 2
You know, here and there people are saying let's hope that it is not a "Paper Launch".

I think that one of the reasons AMD has delayed is because that's precisely what they want to avoid doing. It would make a lot of sense when you think about it.

Will be interesting to see if they pull it off, and whether it's a limited supply across the board or they're there in volume. :)




Tunisia
By lplatypus on 4/25/2007 4:07:26 AM , Rating: 2
So where is the news from Tunisia? There were rumours that an Agena FX system was going to be demo'ed, but nothing seems to have come of it.




By Staples on 4/25/2007 2:00:09 PM , Rating: 2
Glad to see that there may actually be some competition now in DX10-capable video cards; however, these cards are way too expensive for me to buy. $450 for the second best is too expensive. Reviews have always centered on comparing the top-of-the-line products and have little regard for the mainstream products. That used to be OK five years ago when top of the line was not insanely expensive, but now more emphasis should be placed on the mainstream, because even me as an enthusiast will not be paying the prices that the high end parts cost nowadays.




By iollmann on 4/26/2007 12:28:00 AM , Rating: 2
These newer stream GPGPUs deserve more than just gaming benchmarks. How about some LinPack (single precision) numbers?





Why not test using a TV and s-Video
By pplapeu on 4/28/2007 4:35:44 PM , Rating: 2
The best they can come up with for testing is a 15" LCD??????

What a joke of a comparo test.




2900XT pure DX10
By MustardTheoRy on 4/28/2007 10:51:16 PM , Rating: 2
Nvidia was originally going to have a fixed architecture and at the last minute changed to dynamic pipelines similar to ATI's configuration. I think the 2900xt will be able to outperform any 8800 in DX10 games, contrary to early benchmark reports. The 320 scalar pipes are utilized on a more granular level, down to a single pipe, compared to Nvidia's 128, which can be allocated only in blocks of four from what I remember. I believe we are seeing scores closer to current ATI hardware and the Nvidia GTS because the 2900xtx being used is in a static configuration with premature drivers to render DX9. Don't forget about the 2900 being able to process sound and physics also.

http://www.bilgiustam.com/resimler/donanim/test2.j...
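
The granularity argument above can be illustrated with a toy allocation model (the block sizes are the commenter's claim, not confirmed specs):

    # Illustrative only: ALU slots wasted when work must be allocated in
    # fixed-size blocks of pipes, per the granularity claim above.
    def wasted_slots(jobs, block):
        """Slots wasted if each job is rounded up to a whole block."""
        return sum(-j % block for j in jobs)

    jobs = [3, 5, 2, 7, 1]         # hypothetical per-batch pipe demands
    print(wasted_slots(jobs, 4))   # blocks of four: 10 slots wasted
    print(wasted_slots(jobs, 1))   # single-pipe granularity: 0 wasted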




Meh
By Domicinator on 4/24/2007 5:06:42 PM , Rating: 1
With the exception of the CoD2 bench, it looks like I have no reason to regret buying my 8800GTS, which is a terrific overclocker, by the way. We're talking about just a few frames per second less in performance here, and I can easily squeeze those frames out of my GTS with a very light overclock.

I have always been a fan of Nvidia, but ATI makes some great cards too, especially in the last couple of years. It's good that the R600s are finally coming out, because the new mid range Nvidia stuff will become very affordable for people, and prices across the board will start dropping. But I'm still standing by my trusty 8800.




HD 2900XTX at launch?
By JSK on 4/24/07, Rating: 0
hdmi?
By Borkil on 4/24/07, Rating: -1
RE: hdmi?
By Borkil on 4/24/2007 9:12:03 AM , Rating: 3
nvm...i've now read the entire article lol


Another NDA violation...
By cornfedone on 4/24/07, Rating: -1
RE: Another NDA violation...
By digitalwanderer on 4/24/2007 4:48:04 PM , Rating: 2
...pretty much entirely wrong.

Tuan on the other hand seems to be pretty much spot on with his info in the article, at least it fits with all the leading rumors right now rather nicely...and the fact that Tuan ain't really known for lying.

Thanks for your opinion though, come back soon ya hear? :)


RE: Another NDA violation...
By JSK on 4/24/2007 10:32:42 PM , Rating: 3
Oh, if only you guys knew... blind leading the blind around here...


RE: Another NDA violation...
By RyanVM on 4/24/2007 7:56:14 PM , Rating: 2
Chris Tom, is that you? Seriously, get a life. I can see why you get pissed that you do nothing but kiss AMD's ass and they do nothing to reward you for your blind loyalty, but I don't think you need to be fighting their battles for them.


By digitalwanderer on 4/24/2007 8:49:35 PM , Rating: 2
ROFL!!!


RE: Another NDA violation...
By Dactyl on 4/25/2007 11:14:52 PM , Rating: 2
If DT says there is no embargo date on these benchmarks, I believe DT until I hear AMD say otherwise.

What's shameful is accusing journalists of malpractice when AMD isn't accusing them of it!


RE: Another NDA violation...
By palindrome on 4/30/2007 6:07:39 PM , Rating: 2
Clearly a violation of the NDA, as DT is full of Anandtech people who DID sign the NDA, just like everyone else.

Here is a VERY helpful link explaining why DT and Anandtech are going to be in deep sh!t:
http://forums.amd.com/forum/messageview.cfm?catid=...


RE: Another NDA violation...
By palindrome on 4/30/2007 6:10:16 PM , Rating: 1
Quoted from Kab @ AMD Forums:
"K. TBVH I was nearly heart broken when I've waited for 8 months for this card and all my associates had been leading me with opposite info. until I saw those benches, although I was very skeptical. The first thing I did is contact all who I know in regards to the card and the reports.

You want some truth?

I will tell you a few things: Vista, DX10, CrossFire, 3Dmark 06/07 - After that I am :HUSH:

Just SEE: http://r800.blogspot.com/2007/...eon-hd-2900xt_28....

People who have the cards, affiliated, and replying on those Daily Tech reports:

Kombatant: www.kombatant.com AMD/ATI employee, (ex) Beyond3D/Rage3D mod:

On a serious note, I still see some stuff out there that have no merit. Patience.

They christened the OEM version we've known and loved for quite a few months as an "XTX". That should tell you a lot about their credibility actually.

Ok, so before this goes totally out of hand, let me say this, and this will be my final say on the matter, until the NDA is lifted: AMD made certain decisions concerning this card. I took a hard look out there, to see what's being leaked, and it seems there are still some stuff that are totally made up - tbh, it smells like a FUD campaign to me, if I take into consideration certain emails that are flying around lately. Certainly there are some stuff out there that are true, and you will know which is which when the NDA lifts soonish. The journalists that were in Tunis certainly know, and are probably laughing at some of those at this minute.

So, to recap: What I've said in the past stands about the card (Sound_Card been on a mission to put all of my quotes in his sig so that we don't miss anything ). Unfortunately I can't reveal more at this point due to the NDA. And for those who are wondering, no, I am not a moderator/staff on Rage3D anymore, I stepped down two months ago, because it wouldn't be ethical imo with the new job I now have to continue do work here.

As I said, some info out there are accurate, some are not. Whatever rumours are out there certainly won't force AMD to reveal stuff sooner, that simply doesn't work, whether you're called Intel, AMD, nVIDIA or whatever.

Bum_JCRules @ THG (under NDA with cards):

Total Crap.. well almost:
While I am required to follow the NDA, the stuff up on Daily Tech today is almost worthless. Yes Anandtech was present in Tunisia (signing Non-disclosure agreements like the Inquirer), why they are posting this stuff is beyond me because their numbers are off. They must be only using the XP drivers and OS because the numbers in CF vs the GTX are very much different. So until I can officially comment on the architecture and the performance.. hold all of this as useless until the rest of the world writes about it.

I really would love to comment on this stuff...

I understand that DT and Anand are seperate but that is so childish. Derek was there and his cards got to his place of business before he returned home from Tunisia. That long board they have ... Not what Derek should have gotten in his delivery. That is all I will say before I go too far.

Kink (under NDA with cards):

Dailytechs benchmarks are inaccurate. Atleast in terms of 3dmark 06 (commenting on HD 2900 XT)

Metro.cl (under NDA with cards): www.chilehardware.com

laughable (whistling emoticon (wasn't me))

BenchZowner (under NDA and has card):

1) These benchies from DailyTech are quite off the reality, that's all I can say.

2)The 2900XTX & the 8800GTX are performing on par in Crysis at the moment ( direct info from a developer of the game )

3) Is it a deja-vu ? Remember the Mr Sanders case ? The...non invited to the press event at Ibiza editor of Hardware Analysis who wanted to "punish" ATi by publishing his...numbers of desire for the R580 ? He he, Deja-Vu

http://i12.tinypic.com/2py29hz.png

"Do I really have to express myself here ?
If these guys are so called my colleagues ( as of being reviewers like me ) then I should feel ashamed, really Evil or Very Mad

What are we looking at here ?

a) They used a better test system today, better drivers ( supposed to be ) and they managed to get the 2900XT to perform worse than their previews bench session with a worse test system & worse drivers ?
How come ?
On April 24 they got 84FPS on F.E.A.R. with the 2900XT with a QX6700 and pre-release drivers, and they got 79FPS today with a QX6800 and retail drivers ? Oh really ? Very Happy

b) Where's the result for the 2900XT in Company Of Heroes today ? Why N/A ?

c) On Half-Life 2:E1 today they got the 2900XT to score 1FPS more than the 2900XTX.
What could've caused this ? A typo ? Quite angelic.
Something else ? Using a CPU limited resolution which would cause both cards to behave like they're the same.And then there's the GTX surpassing the R600s by ~40 FPS. Quite the real leap over the Radeon X1950XTX at that game. Evil ? heh

d) Now, the best part...they scored ~48FPS in Oblivion on the 24th, and now they present us a 54FPS gain by a move from the QX6700 to the QX6800 and the small gain from running the RAM at 1T Command Rate ?

e) A reviewer in order to conduct comparable and a valuable & trustworthy review must use the same testbed, quite they opposite is what they did ( if they really went through a performance testing process )

f) And for what reason would somebody present unclear results in combination with unclear drivers & filters ( AF & AA ) settings ?
***** right boy Very Happy

My two cents ( oh wait, I have another one ) [ I'll save it for later ]

P.S. The stock core clock for the 2900XTX as of current is 800MHz and not 745MHz as they state.

P.S.2. That's pretty much all I can say at the moment.

P.S.3. Now I have to finish a memory roundup and then pack my stuff for a trip.

What did Crysis lead Developer (has the cards) say about the two (G80GTX and R600XT)?

And remember, Crysis is a DX9 game with an additional DirectX 10 codepath which will only help in getting better performance (gains) for the ones with DX10 cards.

28th April:
I'm gonna spit out a very last detail ( before they get me behind the bars for disclosing information )

The 2900XTX is 4fps in front of the 8800GTX ( in average ) in Crysis ( information directly from a developer of the game )

The 2900XTX won't be the G80 killer that everybody was expecting, but it'll be in front in most cases, with little to good margins depending on the game and settings.

nVIDIA did a great job with the G80 this time, and the performance leap over the previous generation is huge and only compared to the R300 launch in the past

That's STOCK clocks guys, it can OC high

BTW, from Fudzilla by "mouth": GeForce 8800 Ultra get 14000 scores in 3DMARK06 with no detail of test platform and no driver detail:
http://www.fudzilla.com/index....=view&id=678&Item...

From DailyTech: Radeon HD 2900XT get 14005 scores in 3DMARK06 with full detail of test platform and driver detail:
http://www.dailytech.com/Overc...+R600/article7044...

AND NONE of them are with with release drivers, look at the GPU clocks very carefully aswell, look for the silicon version, look at the benchmarks used aswell and the detail in each setting (compare), look at the motherboard used too and that 8800GTX is no where to be found BUT by BFG at $950 as a watercooled card. PLUS those are EARLY pre-release sample cards, ES samples if you understand what that means and do you remember the abysmal performance of the faulty 8800GTX early ES samples with the wrong resistor values on release? What about the driver optimization that took place over months on end to get decent performance? Joe is on the JEDEC board BTW, and an AMD/ATi employee who helped produce GDDR4 for X1950XTX and made it a killer product so now you are telling me after 2 years spent on it like nVidia had spent 4 years on the G80, and millions of dollars, it simply doesn't beat the X1950 in benchmarks and no higher clock on the GDDR4? BTW, those benchmarks are not remotely correct either. We have 100's of GTX split around in our corporation and my own work system has 2 in SLi at 650/2100 with watercooling. Seriously, some of them benchmarks for let's say Oblivion, are 100% increased by h*ll knows what!

Hint #200: Memory clocks can go much higher and bandwidth (i.e. performance too) is much greater in upcoming titles and higher res/detail.

BTW the owner of Fudzilla, Fuad, is the ex-Graphics Editor of The Inquirer if you didn't already know. Take everything with a brick of salt.

I'll leave you with a little pre-release bang"


RE: Another NDA violation...
By frahman on 5/4/2007 5:41:33 PM , Rating: 2
Well, does the ATI Radeon HD 2900 XT support native CrossFire?


RE: Another NDA violation...
By frahman on 5/4/2007 5:45:08 PM , Rating: 2
Well, does the ATI Radeon HD 2900 XT support native CrossFire?

ps. double post .. lol


RE: Another NDA violation...
By subarusvx on 5/15/2007 10:21:10 AM , Rating: 2
yes it does


PR Fluff
By michaelheath on 4/24/07, Rating: -1
RE: PR Fluff
By DingieM on 4/24/07, Rating: 0
RE: PR Fluff
By michaelheath on 4/24/2007 2:27:54 PM , Rating: 2
Contrary to what the jaded may believe, I am unbiased between ATI and Nvidia. Believe it or not, I have purposely held off from purchasing a DX10 card because ATI/AMD has not released their product yet. I am simply stating that the information provided is not telltale of overall performance, and I am waiting for an in-depth review that provides more information. I am also stating that ATI/AMD is 8 months behind Nvidia at this point for DX10 parts, so, in my mind, they really have to produce a significant performance advantage at similar pricing versus the Nvidia parts in order to produce sales. As previously stated, I currently own an ATI card, and I have owned both ATI and Nvidia cards in various generations. My primary concern is performance, not who it comes from.

I defend Nvidia only in the instance that the HL2 score is abnormally low for a high performance card, especially considering that I have a mid-range card that seemingly out-performs it.


RE: PR Fluff
By Goty on 4/24/2007 2:51:55 PM , Rating: 2
They specifically stated that they noticed an anomaly in the HL2:EP1 framerates and were looking into it. Even if the GTS outperformed the XT in that one benchmark, would that change the outlook as a whole? No, I don't think so.

Also, who cares if ATI is late with their DX10 parts? There are no (and I mean absolutely zero) DX10 games out there, so who cares whether or not the card supports DX10? Also, the X1950 series keeps up quite well with the GTS (which is pretty respectable considering the generation gap).

So, again, who really cares that AMD is late to the party? The only people who have reason to complain are those who spent their money on an 8800GTX when it was first released.


RE: PR Fluff
By MartinT on 4/24/07, Rating: 0
RE: PR Fluff
By Goty on 4/24/2007 6:15:40 PM , Rating: 2
I'm just talking about spending the roughly $600 it cost when it first came out; I agree that it's a great card (with the exception of the whole Vista thing that all nvidia's cards have gone through).


RE: PR Fluff
By Min Jia on 4/24/2007 9:14:49 PM , Rating: 2
Well, while there's no DX10 game out yet, there are games that you need an 8800 series card to run if you want everything maxed; Rainbow Six Vegas, for example.


RE: PR Fluff
By Nandro on 4/26/2007 9:45:09 PM , Rating: 2
This is quite silly, comparing a new release to a 6-month-old release. It should be compared to whatever Nvidia releases to spoil its launch, probably the 8900 series. Being on par with the GTS is moot because the GTX price will be lowered to match as soon as it's released, making the money aspect nonexistent.


RE: PR Fluff
By TOAOCyrus on 4/26/2007 11:20:18 PM , Rating: 2
So reviewers should wait for Nvidia to release something new before they post benchmarks? That's just silly. They should compare it to whatever is out now, and compare the new Nvidias when they come out, if ever. The only thing rumored is a $999 Ultra, which certainly won't push down the price of the GTX.


nice but...
By michal1980 on 4/24/07, Rating: -1
BIZARRE comparison
By thartist on 4/24/07, Rating: -1
RE: BIZARRE comparison
By l33th41 on 4/24/2007 1:09:09 PM , Rating: 5
This is yet another reply stating that the card mentioned is not the top of the line 2900.


"Nowadays, security guys break the Mac every single day. Every single day, they come out with a total exploit, your machine can be taken over totally. I dare anybody to do that once a month on the Windows machine." -- Bill Gates

Related Articles
ATI Releases More "R600" Details
April 12, 2007, 8:06 PM
The AMD "R600" in Pictures
April 11, 2007, 11:24 AM
AMD Releases Final "R600" Specs
February 16, 2007, 9:35 PM
NVIDIA Announces GeForce 8800-series
November 8, 2006, 1:33 AM













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki