[Images: Sapphire ATI Radeon HD 2900 XT TOXIC; ATI Radeon HD 2400; ATI Radeon HD 2600; ATI Radeon HD 2900 XT]
AMD announces its long-awaited ATI Radeon HD 2900 XT, seven months after its initial launch date

AMD is prepared to launch its ATI Radeon HD 2000 family tomorrow. The new ATI Radeon HD 2000 family consists of the HD 2400, HD 2600 and HD 2900 graphics processors, formerly known as RV610, RV630 and R600, respectively. Although AMD will announce its ATI Radeon HD 2000 series, only the ATI Radeon HD 2900 XT will have immediate availability. The accompanying ATI Radeon HD 2400 and HD 2600-based products are paper launches.

After months of delays, AMD’s R600 GPU is finally ready for consumer purchase. Only one ATI Radeon HD 2900 model will launch tomorrow, in XT guise. DailyTech previously pitted the ATI Radeon HD 2900 XT against NVIDIA’s GeForce 8800 GTS. The ATI Radeon HD 2900 XTX previously benchmarked by DailyTech will not hit retail channels for now.

AMD’s ATI Radeon HD 2900 XT packs 320 unified stream processors with an estimated 47.5 Gigapixels/sec pixel processing rate. The 320 stream processors are joined by 16 texture units and 16 render backends. AMD claims the 740 MHz ATI Radeon HD 2900 XT delivers 475 GigaFLOPS of processing power in multiply-add, or MADD, calculations.

AMD pairs the 700 million-transistor GPU with 512MB of GDDR3 video memory. The 1.65 GHz GDDR3 memory communicates with the GPU via a 512-bit memory interface, delivering 106 GB/sec of bandwidth. The ATI Radeon HD 2900 XT is still manufactured on an 80nm fabrication process and consumes approximately 215 watts of power.
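As a sanity check, the headline numbers above follow from simple arithmetic on the listed specifications. The short Python sketch below is only a back-of-the-envelope illustration using the usual peak-throughput approximations (not an AMD-published formula); it reproduces the quoted GFLOPS and memory bandwidth figures from the 740 MHz clock, 320 stream processors and 512-bit bus.

# Back-of-the-envelope check of the quoted ATI Radeon HD 2900 XT figures.
core_clock_ghz = 0.740     # 740 MHz engine clock
stream_processors = 320    # unified stream processors
flops_per_madd = 2         # one multiply-add (MADD) counts as two FLOPs

# 320 units * 2 FLOPs * 0.74 GHz = 473.6 GFLOPS, in line with AMD's ~475 GFLOPS claim
peak_gflops = stream_processors * flops_per_madd * core_clock_ghz

mem_data_rate_ghz = 1.65   # effective GDDR3 data rate
bus_width_bits = 512       # memory interface width

# (512 / 8) bytes per transfer * 1.65 GT/s = 105.6 GB/s, roughly the quoted 106 GB/sec
peak_bandwidth_gbs = (bus_width_bits / 8) * mem_data_rate_ghz

print(f"Peak MADD throughput:  {peak_gflops:.1f} GFLOPS")
print(f"Peak memory bandwidth: {peak_bandwidth_gbs:.1f} GB/s")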

Although images have shown the ATI Radeon HD 2900 XT with dual dual-link DVI ports, the card does in fact support HDMI output. An included adapter allows users to experience high-definition video and 5.1 surround-sound audio over HDMI. Also bundled with the ATI Radeon HD 2900 XT are keys for Valve’s upcoming Half-Life 2: Episode Two and Team Fortress 2; the keys allow owners to download the games over Steam when they are released.

Taking on NVIDIA’s recently launched GeForce 8600 family is the new ATI Radeon HD 2600 series. AMD plans to paper launch the ATI Radeon HD 2600 in PRO and XT guises. The ATI Radeon HD 2600 series features 120 stream processors. The GPU is clocked anywhere from 600 to 800 MHz depending on the model. AMD backs the 120 stream processors with eight texture units and four render backends.

All that processing power brings the ATI Radeon HD 2600 transistor count to 390 million. Unlike the HD 2900 XT, however, the HD 2600 is manufactured on a 65nm process. AMD rates power consumption at approximately 45 watts.

AMD pairs the ATI Radeon HD 2600 GPU with GDDR4, GDDR3 or DDR2 memory. Manufacturers can equip cards with 256MB of video memory clocked anywhere from 800 MHz to 2.2 GHz on a 128-bit memory interface. ATI Radeon HD 2600-based cards will also support HDMI audio and video output via an adapter.

At the bottom of the new ATI Radeon HD 2000-series lineup is the ATI Radeon HD 2400 series with PRO and XT models. The new ATI Radeon HD 2400 features 40 stream processors with four texture units and four render backends. GPU clocks vary from 525 MHz to 700 MHz.

The ATI Radeon HD 2400 series features less than half the transistors of the HD 2600, at 180 million. AMD has the ATI Radeon HD 2400 series manufactured on the same 65nm process as the HD 2600. Power consumption of the ATI Radeon HD 2400 hovers around 25 watts.

Add-in board manufacturers are free to equip ATI Radeon HD 2400 series graphics cards with GDDR3 or DDR2 memory. The GPU supports 128MB or 256MB memory configurations on a 64-bit memory interface. Memory clocks vary from 800 MHz to 1.6 GHz depending on the model. Expect ATI Radeon HD 2400 series graphics cards to support HDMI audio and video output via a DVI-to-HDMI adapter.
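The article does not quote bandwidth figures for the lower-end parts, so the ranges below are simply derived from the bus widths and memory clocks listed above using the same back-of-the-envelope formula; they show how much narrower the HD 2600 and HD 2400 memory subsystems are than the HD 2900 XT's.

# Peak memory bandwidth implied by bus width (bits) and effective memory clock (GHz).
def bandwidth_gbs(bus_width_bits, data_rate_ghz):
    return (bus_width_bits / 8) * data_rate_ghz  # bytes per transfer * transfers per ns

# Bus width plus the low and high ends of the memory clock ranges quoted above.
cards = {
    "HD 2900 XT": (512, 1.65, 1.65),
    "HD 2600":    (128, 0.80, 2.20),
    "HD 2400":    (64,  0.80, 1.60),
}

for name, (bus, lo, hi) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, lo):.1f} to {bandwidth_gbs(bus, hi):.1f} GB/s")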

Despite an announcement for the complete ATI Radeon HD 2000 series, the ATI Radeon HD 2900 XT will be the only card available for purchase tomorrow. The ATI Radeon HD 2600 and HD 2400 series will hit retail in late June 2007 with an accompanying benchmark NDA lift.

Expect to pay $399 for an ATI Radeon HD 2900 XT tomorrow from the usual add-in board partners including Diamond Multimedia, HIS, PowerColor and Sapphire Technology. AMD expects to target the ATI Radeon HD 2600 series towards the $99 to $199 market segment and ATI Radeon HD 2400 series towards the less than $99 segment.


Comments

WEB reviews for 2900 XT
By hadifa on 5/13/2007 11:26:07 PM , Rating: 2
Apart from DailyTech's early numbers, there are some reviews and benchmarks on the web:

http://it-review.net/index.php?option=com_content&...

http://vr-zone.com/?i=4946&s=1

And finally this one seems to be down because of web traffic
http://www.tweaktown.com/articles/1100/ati_radeon_...




RE: WEB reviews for 2900 XT
By James Holden on 5/13/2007 11:29:28 PM , Rating: 2
I doubt it was web issues. He broke his NDA and was told to take it down, but wanted to save face.


RE: WEB reviews for 2900 XT
By hadifa on 5/13/2007 11:48:20 PM , Rating: 2
I didn't buy that either. The whole site seems to be functioning quite well!


RE: WEB reviews for 2900 XT
By TechLuster on 5/14/2007 12:00:28 AM , Rating: 2
Agreed, but I wonder why other sites that posted early reviews haven't taken theirs down.

In any case, here are two more:

http://theinq.com/default.aspx?article=39580

http://www.chilehardware.com/Revisiones/Tarjetas-d...

These aren't the most professional websites/reviews (compared to AnandTech anyway), but the second seems to include DX10 benchmarks.


RE: WEB reviews for 2900 XT
By James Holden on 5/14/2007 12:04:44 AM , Rating: 1
DX10 is bullshit. Call of Juarez isn't shipping yet, and the bundle that comes with the review kits is actually just a demo. It won't even run on NVIDIA cards, so the fact that CHW put numbers up should be a testament to their validity.

And I would guess the reason those two didn't take their reviews down is because they only broke the NDA by a few hours. TweakTown broke it by a day -- and believe me, when one of those guys breaks the NDAs, every editor calls ATI until ATI makes the offending site take it down.


RE: WEB reviews for 2900 XT
By TechLuster on 5/14/2007 6:52:59 PM , Rating: 2
Regarding reviews, I posted this below, but I wanted to make sure people actually read this, so here it is again:

-------------------------------------------------

A few weeks ago, [H]ard|OCP gave a glowing review of the 8600 GTS, in which they concluded that it soundly beats the X1950 PRO (for now, we'll ignore the fact that they were comparing the $150 1950 to $220 OC'ed versions of the GTS, which cost roughly 50% more). Of course, every other review on the web concluded that in the best cases, the 8600 GTS keeps up with ATI's 1950 Pro, and in the worst case is completely embarrassed by the 1950 Pro; in several cases, even a stock 7900 GS can cream the 8600 GTS.

So when they stated in the intro of their HD 2900 XT review that the 8600 offers "the best performance in [its] class," I was reminded of their deceptively positive review of the 8600, and thus was not a bit surprised when they showed the 8800 GTS outperforming the 2900 XT in virtually ALL cases. Most reviews I've seen so far have had the 2900 keeping up with, and in some cases soundly beating, the 8800 GTS in at least a few significant games (see, e.g., Rainbow Six: Vegas in AnandTech's review).

I won't go as far as to say that [H]ard|OCP massaged or fabricated data, but I do think they deliberately chose their tests in both their 8600 and 2900 reviews to make the ATI parts look bad.

In any case, I don't think any of us can ever trust that website again.

Here are the relevant links:

[H] 8600 review:
http://enthusiast.hardocp.com/article.html?art=MTM...

[H] 2900 review:
http://enthusiast.hardocp.com/article.html?art=MTM...

AnandTech Rainbow Six: Vegas benchmark of the 2900 & 8800:
http://www.anandtech.com/video/showdoc.aspx?i=2988...


RE: WEB reviews for 2900 XT
By Rugar on 5/15/2007 2:02:30 PM , Rating: 2
I'm trying to understand your point and it still seems to be missing me. I accept that you do not like [H]. That's fine; everyone has likes and dislikes. I actually like the way [H] does their reviews because I am more interested in what IQ options I can use with each card rather than reviews that look at framerates at some insane resolution (really, how many people run a native resolution of 2560x1600?) without any concern for IQ.

You appear to be comparing the 8600GTS with the 2900XT. Why would you choose those two cards to compare? Just looking at Newegg, the 8600GTS (the one reviewed by [H], actually) can be had for $175+S/H. The 8800GTS 640MB can be had for $330+S/H-MIR, and the HD2900XT is running $430+S/H. Clearly, the $430 for the 2900 will fall in the next few months, but it will never be less expensive than the 8800GTS. That being said, the 8800GTS is the closest competitor to the 2900XT.

OK, we've established that the 8800GTS and the 2900XT are competitors, rather than the 8600GTS vs. the 2900XT. Now let's look at the reviews you linked. The "sound beating" the 2900XT delivers to the 8800GTS is in a game with no AA or AF included. It is true that the 2900XT clearly demonstrates greater framerates in this particular graph, but it's not really the full story. You may argue that AA/AF is unnecessary in R6: Vegas, but that is subjective. You should consider such data when describing how one card "soundly beats" another. Look back at the review Derek did of the 8800 Ultra, and at the R6: Vegas graphs with and without AA (notice that the graphs aren't at the same resolution; you have to look down into the tables to compare with and without AA at the same resolution). Turning on 4xAA in R6: Vegas hammers the framerates (about a 45% drop in FPS). Without data on how AA affects the performance of the 2900XT in R6: Vegas, we have to look to other games where data with and without AA is presented.

Let's look over at [H] for a second and look at Oblivion. There, they show the 8800GTS producing basically the same framerates as the 2900XT, but with increased grass draw distance. In Derek's review, he doesn't say what the grass settings are, but you can see that when AA/AF is enabled the 2900XT goes from clearly leading the 8800GTS to slightly behind.

Also, look at the S.T.A.L.K.E.R. results. Both Derek's and the [H] review show the 8800GTS clearly leading the 2900XT even if you don't include AA/AF, as Derek chose not to do. Adding the IQ factors, the [H] review shows similar performance between the 2900XT and the 8800GTS but with substantial increases in IQ for the 8800GTS (1600x1200 for the 8800 vs. 1280x1024 for the 2900, full dynamic lighting for the 8800 vs. objects dynamic lighting for the 2900, and sun and grass shadows enabled for the 8800).

Is the 2900XT a total dog? Absolutely not. It has features that could potentially add a lot of value as they are implemented down the line. Based on pricing today, however, the 2900XT clearly lags behind the 8800GTS, which delivers at least as good performance and in many situations is substantially better. The 8600GTS is not a competitor, in either performance or price, with the 2900XT. Derek clearly points out that where AMD really has a chance to shine is if they can bring a product to market to compete with the 8600 in price and performance, but that product isn't here yet. For now, you have to stipulate that, based on price/performance in current applications, the 2900XT is of less value than the 8800GTS.


RE: WEB reviews for 2900 XT
By hadifa on 5/13/2007 11:46:16 PM , Rating: 2
Here is an interesting part of VR-Zone conclusion:

quote:
As said before, I had results from both the Catalyst 8.36 and Catalyst 8.37 Drivers. How did performance improve between this small driver jump of 0.01 version?


quote:
Taking a setting at 1600x1200 with 16xAF, I saw a major increase in performance, particularly Company Of Heroes and Quake 4. Performance went up by 11% on COH and 42% on Quake 4! This shows that the drivers is still very raw on this card , with just a minor driver revision boosting up performance that much, it gives us quite a lot of hope for a fair bit of improvement to come. Let's hope for that!


quote:
In many non Anti-Aliasing, High Definition game settings, you have seen the X2900XT push ahead of the performance of it's closest competitor, the GeForce 8800GTS 640MB, sometimes by quite a large margin, sometimes falling behind or ahead by a small percentage. In a select few games, the GTS is slightly faster, and vice versa. When Anti-Aliasing is turned on, the X2900XT showed that it carries it off with great efficiency in games that the drivers are optimized for, performing significantly better than the GTS; while the AA efficiency is piss-poor in some games due to the raw driver which has not fully blossomed to take advantage of ATi's new GPU technology. Just take a look at how performance has boosted from Drivers 8.36 to 8.37, that shows the potential in performance growth... a whole lot of it to reap.


I don't know about Quake 4, but the performance boost for Company of Heroes on X1000-class cards has existed since Catalyst 7.1 or maybe 7.2. The new generation are complicated products with DX10, audio, and maybe physics handling, but immature drivers after a 6-7 month delay are not acceptable.


RE: WEB reviews for 2900 XT
By James Holden on 5/13/2007 11:54:07 PM , Rating: 2
8.39 got released to reviewers yesterday.


RE: WEB reviews for 2900 XT
By hadifa on 5/14/2007 2:03:05 AM , Rating: 2
Is there any indication on how it performs compared to the previous releases?


RE: WEB reviews for 2900 XT
By KristopherKubicki (blog) on 5/13/2007 11:58:12 PM , Rating: 2
quote:
Performance went up by 11% on COH and 42% on Quake 4! This shows that the drivers is still very raw on this card

That's a hell of a conclusion to pull. What I suspect happened (this happened with us until we figured it out) was that you had to put two cards into Crossfire first, then back down to single-card. Doing so would increase single card operation by like 20%. I'm guessing the 8.37 addressed this.


RE: WEB reviews for 2900 XT
By hadifa on 5/14/2007 2:00:49 AM , Rating: 2
quote:
... you had to put two cards into Crossfire first, then back down to single-card. Doing so would increase single card operation by like 20%. I'm guessing the 8.37 addressed this.


~_~

Sorry, but I am a bit confused here. Are you implying that prior to 8.37, in a two-card setup, the second card would still influence the test even if you chose to test a single card? Or are you saying that the single-card result was 20% lower than it should have been, and adding another card corrected the mistake?


By KristopherKubicki (blog) on 5/14/2007 2:46:38 AM , Rating: 2
Correct, you had to go to two cards, then back down to one card. This same problem used to happen a lot in the original Crossfire days.


RE: WEB reviews for 2900 XT
By GlassHouse69 on 5/14/2007 12:26:58 AM , Rating: 1
NVIDIA never had, nor seems to have, updated drivers for its year-plus-old products. ....


RE: WEB reviews for 2900 XT
By hadifa on 5/14/2007 4:09:02 AM , Rating: 2
I just checked TweakTown and the review is up again:

http://www.tweaktown.com/articles/1100/ati_radeon_...


RE: WEB reviews for 2900 XT
By RW on 5/19/2007 4:45:34 AM , Rating: 1
After seeing the benchmarks all over the web I can honestly say that:
This HD 2900 XT is a fucking joke; whoever developed it should be fired immediately, because such low gains in performance over the old X1950 XTX are unacceptable for a card with 700 million transistors.

They should have known they had to deliver at least 2x the performance of the old X1950 XTX; that's what NVIDIA does with every new chip, developing it to be at least 2x faster than the old model.


delay of xtx?
By venny on 5/13/2007 11:28:58 PM , Rating: 3
Why delay the XTX?




RE: delay of xtx?
By James Holden on 5/13/2007 11:30:27 PM , Rating: 3
It looks like it's not going to come to fruition. I heard ATI scrapped it after DailyTech's review:

http://dailytech.com/article.aspx?newsid=7052


RE: delay of xtx?
By cheetah2k on 5/14/2007 12:19:56 AM , Rating: 3
Indeed, but it's pure speculation that AMD intends to scrap it. I understand there is an issue with the GDDR4 memory and memory addressing. AMD will certainly sort this out, but I would guess not in the near future.

They really jumped the gun on this - like what NVIDIA did, releasing the 8800 Ultra straight after allowing resellers to offer overclocked versions of the stock 8800GTX.


RE: delay of xtx?
By KristopherKubicki (blog) on 5/14/2007 12:23:13 AM , Rating: 3
Well, it's still kicking around, I believe. Some manufacturers will release a long card and a 1GB short card. I don't know if they're still planning to call it XTX, though -- the guidance certainly doesn't suggest that's the case.


Seven months late...
By Quiksel on 5/13/2007 11:27:48 PM , Rating: 2
and somehow they are still paper-launching the lower-end cards?

Seriously?

What in the world?

~q




RE: Seven months late...
By James Holden on 5/13/2007 11:30:58 PM , Rating: 2
July for mobility parts. Didn't Santa Rosa just launch? That's awful


RE: Seven months late...
By neon on 5/13/2007 11:37:21 PM , Rating: 2
I thought the latest 1 month delay was so that they could launch the entire X2000 family at the same time, and at 65nm?


RE: Seven months late...
By James Holden on 5/13/2007 11:45:11 PM , Rating: 2
That's why people shouldn't believe the inquirer.


RE: Seven months late...
By neon on 5/14/2007 12:00:52 AM , Rating: 2
In this article, they claim to be quoting David Orton on that directly:
quote:
"We pushed out the launch of the R600 and people thought is must be a silicon or software problem…it's got to be a bug," said Dave Orton, president and chief executiveof ATI. "In fact, our mainstream chips are in 65nm and are coming out extremely fast. Because of that configuration, we have an interesting opportunity to come to market with a broader range of products," he explained.

"Instead of having them separate, we thought, lets line that up, so we delayed for several weeks," Orton continued, referring to the R600 family as a whole, which AMD now says will come out at the same time (a matter of weeks as opposed to months, according to Richard) instead of just the high-end version.

http://www.extremetech.com/article2/0,1697,2099613...


RE: Seven months late...
By James Holden on 5/14/07, Rating: -1
RE: Seven months late...
By neon on 5/14/2007 12:29:24 AM , Rating: 2
Whether it was Orton's intent to obfuscate the situation, or the writer of the extremetech article just didn't understand, they certainly led me to believe that there would be a simultaneous launch of 65nm parts.


benchies
By ttnuagadam on 5/14/2007 8:56:29 AM , Rating: 2
Hmmm, the 2900 seems to put out some pretty inconsistent numbers. Looks like ATI has some work to do on its drivers. It looks like it has some serious potential, however.




RE: benchies
By Goty on 5/14/2007 10:17:44 AM , Rating: 2
Yeah, I think AMD can pull the proverbial rabbit out of the hat with some decent drivers.


RE: benchies
By jarman on 5/14/2007 12:01:58 PM , Rating: 2
quote:
Yeah, I think AMD can pull the proverbial rabbit out of the hat with some decent drivers.


There's no harm in remaining optimistic, but when you're counting on better drivers to make a video card competitive, you're better off sticking a fork in it and starting over. Better drivers will not fix the HD2900XT's horrible power management issues.


RE: benchies
By Chaser on 5/14/2007 7:54:12 PM , Rating: 2
Considering NVIDIA's almost bi-monthly release of 8-series drivers since their launch six months ago, compared against AMD's first DX10 GPU driver release, there is most certainly room for optimization and performance improvement.

At the $400.00 price point this card does remarkably well at launch against the competition. Most legitimate review sources agree that there's a considerable amount of performance room to harness with this "forward-looking" GPU.


RE: benchies
By jarman on 5/14/2007 8:17:18 PM , Rating: 2
quote:
Most legitimate review sources agree that there's a considerable amount of performance room to harness with this "forward-looking" GPU.


Normally that is a good thing, but AMD/ATI still must deal with the HD 2900XT's major power leakage issues. No driver will fix that problem...

At $400.00 this card is soundly beaten by a $359.00 8800GTS that is much more power efficient.


memory bandwidth...
By Darth Farter on 5/13/2007 11:55:31 PM , Rating: 2
512-bit top end and 128-bit low/mid... why not 256-bit for the low or mainstream parts, AMD/ATI? That'd get my dollar for sure. Hope they do in the future.




RE: memory bandwidth...
By Alexvrb on 5/14/2007 12:27:18 AM , Rating: 2
I'd like a wider memory interface on the cheaper parts as well, but if you look at their price points, it's easy to see why they went this route. They're going to be very competitive on price. Besides, look at the memory interfaces and GDDR speeds on the low-to-mid NVIDIA cards. Nothing impressive there.

Their lineup actually looks very strong against NVIDIA until you get to the fastest 8800s. They've got nothing to compete there that I have seen yet. Are they holding back? Or are they putting something else together while launching the already-completed parts?


RE: memory bandwidth...
By Griswold on 5/14/2007 4:18:21 AM , Rating: 2
That's what I'm probably waiting for as well, considering they paper-launched the mainstream parts. Rumor has it that there will be a performance part like the X1950 Pro with a 256-bit bus sometime in Q3 - I guess I can wait for that, since the 2900XT was never what I aimed for...


RE: memory bandwidth...
By Spoelie on 5/14/2007 6:51:49 AM , Rating: 2
I was excited reading the initial specs of the HD2600, but now...

8 texture units and 4 render backends??? WTF?
The 8600 has 12 texture units and 8 render backends, and it's considered too slow.
Forget the shader output; DAAMIT should seriously reconsider their stance on rendering the final pixels. The 2600 is seriously underpowered in that department. The 9600 years ago had 4 render backends.

It seems they have some mad mathematician at the helm, screaming 'more shaders, more shaders, moooooooore' but ignoring the rest of the GPU.


More reviews
By Quander on 5/14/2007 5:47:45 AM , Rating: 2
http://enthusiast.hardocp.com/article.html?art=MTM...

quote:
The Bottom Line

“A day late and a dollar short.” Cliché but accurate. The Radeon HD 2900 XT is late to the party and unfortunately is bringing with it performance that cannot compete. The GeForce 8800 GTS 640 MB is $50 cheaper, performs better, and draws a lot less power than the 2900 XT.

This is as good as it is going to get for a while from ATI. The GeForce 8800 GTX will still dominate at the high end of the video card market. Of course we do not know about DX10 games yet, and there is no way to make any predictions how that comparison will turn out. As it stands right now the Radeon HD 2900 XT, in our opinion, is a flop . ATI needs to get its act together quickly. It needs to push out the mainstream cards soon and it needs to deliver a high end card that can actually compete at the high end of the market.




RE: More reviews
By Spoelie on 5/14/2007 6:54:50 AM , Rating: 2
HardOCP has been a little too NVIDIA-minded lately, both in their wording and their tests. I remember a lot of negative backlash on the 8600 GT/GTS test that they did not address or explain.


RE: More reviews
By Goty on 5/14/2007 10:16:35 AM , Rating: 2
HardOCP has been extremely nvidia-biased ever since the delay of the X1800, and the delay of the HD 2900 hasn't helped anything at all.


RE: More reviews
By jarman on 5/14/2007 11:16:31 AM , Rating: 2
quote:
HardOCP has been extremely nvidia-biased ever since the delay of the X1800, and the delay of the HD 2900 hasn't helped anything at all.


You've gotta be kidding me... Do you think that ATI deserves any praise whatsoever since the X1800? The X1800 was a failure, the X1900 series performed better, yet NVIDIA still had a better solution and multi-GPU implementation (dongle, anyone?). ATI's chipsets have not been as competitive as NVIDIA and Intel offerings, and now ATI is seven months late with a power-hungry lame duck?

I'm sorry, but maybe that bias is the reality of the high-end GPU landscape. ATI is a non-player at this point in time in the high-end market.

We're already hearing "just wait for R700!", and I for one am finished waiting for ATI to execute. It appears that my 9700Pro will be the last ATI card that I will own for some time to come...


Apology
By James Holden on 5/14/2007 12:11:28 AM , Rating: 5
Hi Dailytech,

I called bullshit on your original benchmarks due to the inconsistencies. After reading about 5 reviews now (still waiting for Tech Report and Anand), I'd like to say I'm sorry.

It looks like you guys were spot on, good job. Can someone tell me exactly what's going on with the XTX? I read it was canned from disti, but maybe they will still release it as an OEM card?

JH




Ok
By thebrown13 on 5/14/2007 2:39:49 AM , Rating: 1
So NVIDIA had a faster card (8800 GTX) out last September. 9 months ago. NINE. Wow. Does that mean we should not expect any meaningful boosts in GPU power for a long time?




RE: Ok
By Le Québécois on 5/14/2007 3:46:42 AM , Rating: 2
http://www.anandtech.com/video/showdoc.aspx?i=2870

More like November, not September.

And why do you say:

quote:
we should not expect any meaningful boosts in GPU power for a long time?


The X1900 series came just a couple of months after the ill-fated X1800. I would bet NVIDIA has more to come than the 8800 Ultra. Probably not tomorrow, but before 2008, I'm pretty sure.


RE: Ok
By Griswold on 5/14/2007 4:20:25 AM , Rating: 2
Try November...

And why would you care? I'm sure you bought an 8800GTX already.


RE: Ok
By Dactyl on 5/14/2007 6:39:38 PM , Rating: 2
In a few short months we will have the HD 2950 XT, based on the 65nm RD650. If that's as efficient as the 65nm HD 2600 XT, it will be better than all of the 8800 cards.

Further, Crossfire benchmarks have come out very well for AMD. That suggests the R600's architecture will scale well to 4-way Crossfire (or even 8- or 16-way for professional applications). If ATI can get power consumption down to 150W--which is 3 times that of the HD 2600 XT, which is 1/3 as big and 1/3 as powerful--an HD 2950 XT Crossfire setup will be feasible, and will be better than two 8800 Ultras.

The question is how well NVidia has come along in the race to 65nm. If they've got a 65nm 8900 GTX in the pipe, that could potentially trump the HD 2950 XT.

A further benefit of AMD's architecture is that it is more suited to general-purpose computing (such as physics, folding @ home, etc.). That's not good enough to make the HD 2900 XT worth buying, but the 65nm version should be awesome, assuming AMD can pull it off.

I don't get all the gloom and doom. We knew the 80nm R600 parts were really hot, and as a result they would be slower than the NVidia models. The question is whether AMD has a version of R600 that is 65nm, and if so, how close is that to being released?

Funny thing, the fear among AMD fanboys a month ago was that they wouldn't be able to tell which R600 cards were 80nm (hot and slow) and which were 65nm (cool and fast) and they would get stuck buying the wrong one. Now the fear is that AMD doesn't even have a high-end 65nm card at all.


darn it 128-bit again
By shraz on 5/14/2007 12:34:56 AM , Rating: 2
Well, it seems the X2900XT ranks in the middle between the GTS and GTX. It's sad the X2600 isn't 256-bit... they are making the same mistake as the GeForce 8600 =[




RE: darn it 128-bit again
By Chadder007 on 5/14/2007 1:08:38 AM , Rating: 2
Agreed. The 2600 should have had a 256-bit memory interface for sure, leaving the 2400 with 128-bit instead.


Vista Driver
By subhajit on 5/14/2007 2:38:56 AM , Rating: 2
ATI's Vista drivers are really impressive - very little to no performance hit.




RE: Vista Driver
By subhajit on 5/14/2007 2:42:30 AM , Rating: 2
Forgot to mention, Guru3d has the Vista vs. XP numbers.
http://www.guru3d.com/article/Videocards/431/1/


[H]ard|OCP paid off by Nvidia?
By TechLuster on 5/14/2007 6:44:37 PM , Rating: 2
A few weeks ago, [H]ard|OCP gave a glowing review of the 8600 GTS, in which they concluded that it soundly beats the X1950 PRO (for now, we'll ignore the fact that they were comparing the $150 1950 to $220 OC'ed versions of the GTS, which cost roughly 50% more). Of course, every other review on the web concluded that in the best cases, the 8600 GTS keeps up with ATI's 1950 Pro, and in the worst case is completely embarrassed by it; in several cases, even a stock 7900 GS can cream the 8600 GTS.

So when they stated in the intro of their HD 2900 XT review that the 8600 offers "the best performance in that class," I was reminded of their deceptively positive review of the 8600, and thus was not a bit surprised when they showed the 8800 GTS outperforming the 2900 XT in virtually ALL cases. Most reviews I've seen so far have had the 2900 keeping up with, and in some cases soundly beating, the 8800 GTS in at least a few significant games (see, e.g., Rainbow Six: Vegas in AnandTech's review).

I won't go as far as to say that [H]ard|OCP massaged or fabricated data, but I do think they deliberately chose their tests in both their 8600 and 2900 reviews to make the ATI parts look bad.

In any case, I don't think any of us can ever trust that website again.

Here are the relevant links:

[H] 8600 review:
http://enthusiast.hardocp.com/article.html?art=MTM...

[H] 2900 review:
http://enthusiast.hardocp.com/article.html?art=MTM...

AnandTech Rainbow Six: Vegas benchmark of the 2900 & 8800:
http://www.anandtech.com/video/showdoc.aspx?i=2988...




RE: [H]ard|OCP paid off by Nvidia?
By maroon1 on 5/15/2007 12:25:55 PM , Rating: 2
Why don't you look at the benchmarks of other games, like S.T.A.L.K.E.R.?

http://www.anandtech.com/video/showdoc.aspx?i=2988...

The 8800GTS 640MB kicks the HD2900XT's ***, and it is cheaper, consumes less power and produces less noise.

HD2900XT = Failure


8800 > HD2900
By maroon1 on 5/14/2007 10:53:21 AM , Rating: 2
http://www.vr-zone.com/?i=4946&s=12
http://enthusiast.hardocp.com/article.html?art=MTM...

Nvidia > ATI

And this new HD2900XT consumes more power than the 8800 and produces more noise...




RE: 8800 > HD2900
By TomZ on 5/14/2007 3:25:50 PM , Rating: 1
Yeah, the 215W figure is a bit crazy for a video card. They need to work harder to get that under control.


ATI is DOOMED
By IntelGirl on 5/15/2007 2:41:03 AM , Rating: 1
NVIDIA just royally owned ATI with the 8800 GTX, not to mention the 8800GTS.

AMD might as well quit the graphics business now. Their latest effort has been an embarrassment. LOL, it's so funny, even the mid-end G80 can beat the high-end R600.

ATI fanboys have been owned.




RE: ATI is DOOMED
By Proteusza on 5/15/2007 4:36:30 AM , Rating: 2
The only ones who ever get owned are the fanboys who won't switch brands when offered a better card. NVIDIA and ATI have both made good cards and bad cards, and they will both continue to make good cards and bad cards. Restricting yourself to one brand means you get the good and the bad. If you only went NVIDIA, you would have bought the DustBuster, AKA the FX5800.

Mixing and matching gets you the best of both companies.

But fanboys (and girls) don't see that and buy on blind brand loyalty, and sometimes end up buying utter junk.


2600 Power Consumption
By Goty on 5/14/2007 12:10:57 AM , Rating: 2
If the 2600 cards come in around 45W (which is wonderful), imagine what they could do with the 2900 cards at 65nm!




8-pin 6-pin
By shraz on 5/15/2007 3:34:16 PM , Rating: 2
Why did they make 8-pin and 6-pin and not just 12 pins (2x 6-pin) for the cable power? Many power supplies now come with 4x 6-pin cables... Such a hassle to get an 8 and a 6.




From the source-
By crystal clear on 5/14/2007 3:03:10 AM , Rating: 1
Add this on-

ATI Mobility™ Radeon HD 2000 Series Mobile Line-Up
For unparalleled graphics performance on the go, the ATI Mobility Radeon™ HD 2600 series combines the ultimate DirectX® 10 gaming experiences with the graphics performance gamers demand in a high-end mobile system. Thin and light notebooks featuring ATI Mobility Radeon HD 2400 series can multitask with ease using the new Windows Aero™ interface, run the hottest upcoming DirectX® 10 games or play exciting new HD DVD and Blu-Ray™ discs. Those in the market for a value notebook will enjoy great performance in Windows Vista™ and astonishing multimedia using the ATI Mobility Radeon™ HD 2300. ATI Mobility Radeon HD 2000 products feature ATI Avivo™ HD technology and UVD to deliver outstanding HD DVD™ and Blu-ray™ disc playback, and ATI PowerPlay™ 7.0 power management for long battery life and optimized performance-per-watt operation.6

Notebooks from OEM partners including Acer, ASUS, Fujitsu, Fujitsu-Siemens Computers, Gateway, HP, LG, Packard Bell, Samsung, and Toshiba, and ODM whitebook partners including Arima, ASUS, ECS, First International Computer Inc., and MSI, will be available beginning in May.

quote:
AMD Introduces the ATI Radeon™ HD 2000 Series, Delivering The Ultimate Visual Experience™ for Desktop and Mobile Platforms


http://www.amd.com/us-en/Corporate/VirtualPressRoo...




Keep the ATI name...
By Seymourbbuts on 5/14/2007 10:16:20 PM , Rating: 1
ATI could now stand for...
A. M.D.
T echnological
I magery

Just a thought...




Time for some slide shows
By crystal clear on 5/15/2007 12:29:12 AM , Rating: 1
Slideshow: AMD's H2000 Graphics launch

http://www.tgdaily.com/index.php?option=com_conten...




By crystal clear on 5/15/2007 6:52:44 AM , Rating: 1
Stream computing
AMD is making it clear they're not going to cede the burgeoning GPGPU market to NVIDIA's G80, and the company's pre-launch press materials tout the 2900's usefulness in high-performance computing applications. In particular, there's a sort of software component to the new GPU that hasn't gotten much attention in any of the launch coverage, and indeed I hadn't seen news of it anywhere before coming across it in AMD's press materials.

The R600 is an extremely wide VLIW/SIMD design that relies heavily on a special software layer to dynamically manage its large volume of parallel execution resources. This software layer is called the Accelerated Computing Software Stack, and it includes both compile-time and run-time components. The compile-time component is a set of stream extensions for C/C++ and a math library that AMD calls ACML (probably for Accelerated Computing Math Library). These tools allow coders to write stream computing (or "data parallel") code in C and C++ for both the R600 and AMD's multicore CPUs.

This code isn't run natively on the AMD/ATI hardware, but instead it's passed to a runtime component called the Compute Abstraction Layer (CAL), which sits between the programmer and both the multicore CPU and the GPU and appears to contain a just-in-time (JIT) compiler that dynamically translates the code for either x86 or CUDA before passing it on to the appropriate piece of hardware.

The GPU's CTM assembler interface is itself covered by another hardware abstraction layer (HAL) that appears to reside within the CAL. Third party developers can write to either the CAL or the HAL, depending on whether they need to talk only to the GPU (via the HAL, as in the case of display drivers and some HPC applications) or to both the CPU and GPU (via the CAL, for generic "stream processing").

The CAL and HAL portions of ACSS are complex yet integral parts of the driver for the R600 family, and I'd bet money that together they're one of the bottlenecks that's holding back the system from achieving its full potential on gaming benchmarks. It appears that on all of the benchmarks run so far, both DX9 and DX10, all of the graphics calls are going to the CAL via the DirectX and OpenGL CAL bindings, where they're dynamically farmed out to the available stream computing resources on the GPU. If the CAL/HAL stack, which is a brand new piece of software that probably has quite a bit of optimization overhead left in it, doesn't do its job optimally, then the graphics code that's running on it won't be able to get peak performance out of the hardware.

People who really want to max out the R600 will write directly to the GPU hardware using CTM, bypassing the ACSS entirely. This is probably behind AMD's recent promise to open source the R600 drivers—they may be hoping that developers will step up and use CTM to write card-specific drivers that are fully optimized, game-console-style, so that all of the R600's potential can be unlocked.

http://arstechnica.com/news.ars/post/20070514-amd-...
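To make the layering described above a bit more concrete, here is a toy Python sketch of the dispatch idea: application code expresses a data-parallel multiply-add kernel and a small abstraction layer hands it to whichever backend is available. This is illustrative structure only, written against the excerpt's description; it is not AMD's actual CAL/HAL or CTM interface, and real R600 stream code would be written in C/C++ against AMD's tools.

# Toy model of a compute abstraction layer dispatching a data-parallel kernel.
# Hypothetical structure for illustration; not AMD's CAL/HAL/CTM API.

def madd_kernel(a, b, c):
    """Element-wise multiply-add, the kind of stream kernel R600's units execute."""
    return [x * y + z for x, y, z in zip(a, b, c)]

class ComputeAbstractionLayer:
    def __init__(self, backends):
        # backends maps a device name to a callable that runs a kernel on it
        self.backends = backends

    def run(self, kernel, *args, prefer="gpu"):
        # Prefer the GPU backend; fall back to the CPU backend if it is absent.
        backend = self.backends.get(prefer, self.backends["cpu"])
        return backend(kernel, *args)

# The "CPU backend" here just calls the kernel directly; a real GPU backend would
# translate it for the hardware at run time, as the excerpt describes.
cal = ComputeAbstractionLayer({"cpu": lambda kernel, *args: kernel(*args)})
print(cal.run(madd_kernel, [1, 2, 3], [4, 5, 6], [7, 8, 9]))  # -> [11, 18, 27]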




Finally is right....
By Jassi on 5/13/07, Rating: -1
RE: Finally is right....
By kobymu on 5/13/2007 11:29:53 PM , Rating: 4
Mills Lane: Ok, I want a good clean fight. No final-output-altering rendering techniques, no driver tricks of any kind or form, and no straw-man arguments. Now, LET’S GET IT ON!


RE: Finally is right....
By James Holden on 5/13/2007 11:31:37 PM , Rating: 2
quote:
Mills Lane: Ok, I want a good clean fight. No final-output-altering rendering techniques, no driver tricks of any kind or form, and no straw-man arguments. Now, LET’S GET IT ON!

Lol, like "DirectX performance" ?


RE: Finally is right....
By tuteja1986 on 5/13/2007 11:33:10 PM , Rating: 2
Waiting for the FiringSquad, X-bit labs and bit-tech reviews ;)


"Mac OS X is like living in a farmhouse in the country with no locks, and Windows is living in a house with bars on the windows in the bad part of town." -- Charlie Miller

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki