118 comment(s) - last by Targon on Mar 14 at 8:51 AM

Six weeks from now, the world will get the first retail Radeon X2900 XTX

Late yesterday DailyTech was briefed on the final details for the upcoming R600 retail specifications, just in time for everyone to go on vacation for Chinese New Year.  AMD has briefed its board partners on the specifications that will appear on the marketing material for the card launches.

AMD's guidance claims R600 will feature 700 million transistors.  By comparison, the Radeon X1900 series R580 GPU incorporated 384 million transistors into its design; the half-generation before that, R520, only featured 320 million.  

As disclosed by DailyTech earlier this year, the GPU features a full 512-bit memory interface with support for GDDR3 and GDDR4.  R580 was also similar in this regard as it supported GDDR3 and GDDR4. 

The R600 boasts 320 stream processors.  ATI does not clearly define what a stream processor is, though insiders claim 64 4-way unified shaders would yield only 256 stream processors (64 shaders, 4 interfaces each). 
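The arithmetic behind the insiders' skepticism is easy to sketch. The 4-way figure below is what the insiders describe; the 5-way grouping is pure speculation, shown only because it would match AMD's marketing number:

```python
# Back-of-the-envelope check on the stream processor claim.
# ATI has not defined "stream processor", so these groupings are guesses.
shaders = 64

four_way = shaders * 4   # 64 shaders x 4 interfaces each, per the insiders
print(four_way)          # 256 -- falls short of the advertised 320

five_way = shaders * 5   # a hypothetical 5-way arrangement
print(five_way)          # 320 -- the figure that matches AMD's guidance
```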

According to company guidance, on March 30, 2007, AMD will initially debut the R600 as the ATI Radeon X2900 XTX in two separate configurations: one for OEMs and another for retail.  The OEM version is the full length 12" card that will appear in high-end systems.

ATI guidance claims the X2900 XTX retail card comes as a two-slot, 9.5" design with a vapor chamber cooler.  Vapor chambers are already found on high-end CPU coolers, so it would be no surprise to see such cooling on a high-end GPU as well.  The OEM version of the card is a 12" layout and features a quiet fan cooler. 

1GB of GDDR4 memory is the reference configuration for Radeon X2900 XTX.  Memory on the reference X2900 XTX cards was supplied by Samsung.

Approximately one month later, the company will launch the GDDR3 version of the card.  This card, dubbed the Radeon X2900 XT, features 512MB of GDDR3 and lower clock frequencies than the X2900 XTX.  The X2900 XT is also one of the first Radeons to feature heatpipes on the reference design. 

AMD anticipates the target driver for X2900 XT to be Catalyst 8.36.  A WHQL release of the X2900 XTX driver will appear around the Ides of March.

Radeon X2900 will feature native CrossFire support via an internal bridge interface -- there is no longer a need for the external cable found on the Radeon X1000 series CrossFire.  There is no Master card, as was the case with other high-end CrossFire setups. Any Radeon X2900 can act as the Master card.

A much anticipated feature, native HDMI, will appear on all three versions of Radeon X2900.

One 6-pin and one 8-pin (2x4) power connector are featured on Radeon X2900, but both connectors are also backwards compatible with 6-pin power supply cables.

AMD claims the R600 target schedule will be a hard launch -- availability is expected to be immediate.  Board partners will be able to demonstrate R600 at CeBIT 2007 (March 15 - 21), but the only available cards will be reference designs. 

Why was there such discrepancy with the board layouts and designs up until now?  An ATI insider, who wished to remain nameless, states:  "The original Quad-Stealth design is what we built the R600 on: GDDR4, full-length and dual-slot cooling.  As the silicon was further revised, [ATI] took up several alternative designs, which eventually brought GDDR3 and heatpipes into the specification.  The release cards demonstrate the versatility of R600 in each of these unique setups."

Final clock frequencies will likely remain estimates until later this month.


X2900? Not X2800?
By Warren21 on 2/16/2007 11:54:11 PM , Rating: 4
I swear this card is supposed to be the X2800 series... It's been all over the net, and every website so far except DT has referred to the R600 as such.

If they called the first gen X2900, they wouldn't leave themselves much room for refresh part naming schemes.. a la X2900/X2950.

RE: X2900? Not X2800?
By KristopherKubicki on 2/16/2007 11:55:38 PM , Rating: 3
Yep, X2900.

RE: X2900? Not X2800?
By Chillin1248 on 2/17/2007 12:48:46 AM , Rating: 5
Was there any reason given for the deviation? Because from looking at the previous generation, it was as follows:

X1800 = First release
X1900 = Second release
X1950 = Modification of second release SKUs

So they are not really leaving themselves much room now are they?

And perhaps this would also explain why some leaked driver sets showed the following:


Instead of the normal:

X1100 = Integrated GPU
X1300 = Low end segment

Seems they just added a 100 to every GPU segment this generation.


RE: X2900? Not X2800?
By KristopherKubicki on 2/17/2007 12:58:49 AM , Rating: 3
It seems to me ATI pretty much does something different every generation.

The Radeon 9700 was the top of its generation once, and then the 9800 was the revamp revision.

RE: X2900? Not X2800?
By maevinj on 2/17/2007 1:33:19 AM , Rating: 2
I like AMD (ATI), but wow, they've got to get their naming down. It's confusing as hell. Not that nVidia is any better.

RE: X2900? Not X2800?
By Chillin1248 on 2/17/2007 6:38:58 AM , Rating: 4
Have fun trying to figure them out:


Rage and R1xx
Rage XL

8xxx series (R2xx)
8500 Pro
9000 Pro 64-Bit
9000 Pro
9200 SE
9250 64-Bit

9xxx series (R3xx)
9600 Pro
9550 SE
9600 SE
9600 Pro
9600 XT
9700 Pro
9800 SE
9800 Pro 128-Bit 128MB
9800 Pro 128MB
9800 Pro 256MB
9800 Pro 256MB DDR2

Xxxx Series (R4xx)
X300 LE
X300 SE
X300 SE Hyper Memory
X550 Hyper Memory
X550 XT
X600 SE
X600 Pro
X600 XT
X700 SE
X700 LE
X700 Pro
X700 XT
X740 XL
X800 128MB
X800 256MB
X800 GTO
X800 GTO/2
X800 Pro
X800 Pro VIVO
X800 XL
X800 XL 512MB
X800 XT PE
X850 SE
X850 Pro
X850 XT
X850 XT "Crossfire Edition"
X850 XT PE

X1xxx Series (R5xx/RV5xx)
X1300 Pro
X1300 CE
X1300 XT
X1600 Pro
X1600 XT
X1650 Pro
X1650 XT
X1800 GTO
X1800 XL
X1800 XT
X1800 XT 512MB
X1800 Crossfire Edition
X1900 GT
X1900 Pro
X1900 AIW
X1900 XT 256MB
X1900 XT 512MB
X1900 XTX
X1950 GT
X1950 Pro
X1950 XT
X1950 XTX
X1900 Crossfire Edition
X1950 Crossfire Edition

Radeon X2xxx Series (R6xx)
Radeon X2200
Radeon X2400
Radeon X2900 XL
Radeon X2900 XT (512MB GDDR3)
Radeon X2900 XTX (OEM) (1024MB GDDR4)
Radeon X2900 XTX (Retail) (1024MB GDDR4)


MAC Edition:

9800 Pro MAC EDITION 256MB
X1900 MAC Edition

124 Cards


TNT2 Pro
TNT2 Ultra
TNT2 Model 64 (M64)
TNT2 Model 64 (M64) Pro
Vanta LT

GeForce 256
GeForce DDR

Geforce 2 (NV1x)
GF2 MX400
GeForce2 GTS
GeForce2 Pro
GeForce2 Ti
GeForce2 Ultra

Geforce 3 (NV2x)
GF3 TI-200
GF3 TI-500

Geforce 4 (NV1x/NV2x)
GF4 MX4000
GF4 MX420
GF4 MX440
GF4 MX440 SE
GF4 MX440 8X AGP
GF4 MX460
GF4 TI-4200
GF4 TI-4200 8X AGP
GF4 TI-4600
GF4 TI-4800
GF4 TI-4800 SE
GF4 PCX4300

Geforce FX series (NV3x)
5200 LE
5200 Ultra
5200 128MB
5200 256MB
5600 Pro
5600 SE/XT
5600 Ultra
5700 LE
5700 VE
5700 Ultra DDR2
5700 Ultra GDDR3
5800 Ultra
5900 XT
5900 ZT
5900 Ultra
5950 Ultra
5990 Ultra
PCX 4300
PCX 5300
PCX 5750
PCX 5900

Geforce 6 series (NV4x)
6200 LE
6200 LE TC (Turbo-Cache)
6200 SE TC (Turbo-Cache)
6200 TC (Turbo-Cache)
6600 LE
6600 128MB
6600 256MB DDR2
6600 GT
6610 XL
6700 XL
6800 XE
6800 LE
6800 XT
6800 GS
6800 GT
6800 Ultra
6800 Ultra "Extreme Edition"

Geforce 7 series (NV4x/G7x)
7100 GS
7300 SE
7300 LE
7300 GS
7300 GT
7500 LE
7600 LE
7600 GS
7600 GTL
7600 GST
7600 GT
7800 GS (G70 256MB)
7800 GS+ (G71 512MB)
7800 GT
7800 GTX
7800 GTX (Ultra) 512MB
7800 GX2
7900 GS
7900 GT
7900 GTO
7950 GT
7900 GTX
7900 GX2
7950 GX2

Geforce 8 series (G8x)
8300 GS
8300 GT
8600 GS
8600 GT
8600 Ultra
8800 GTS 320MB
8800 GTS 640MB
8800 GTX 768MB

104 Cards


3dfx Voodoo 5 5500

1 Card

That's a total of 227 choices for a consumer choosing a video card. Enjoy.

RE: X2900? Not X2800?
By mino on 2/17/2007 8:44:21 AM , Rating: 2
AFAIK many X-series parts such as the X600 belong to the RV3x line...
However, as both the R/RV3x and R/RV4x share a common design, I would not consider them two generations; technologically they are a single generation.
The same goes for the GF 6xxx and 7xxx series. The first true next-gen part after the 6800 is in fact the 8800...

RE: X2900? Not X2800?
By melgross on 2/17/2007 6:56:36 PM , Rating: 2
You missed most of the Mac cards from ATI, and all of the ones from Nvidia.

RE: X2900? Not X2800?
By xsilver on 2/17/2007 10:48:11 AM , Rating: 2
and how many of those are still available in retail channels?

granted, realistically you've got about 20-30 choices

I like how you quote the Voodoo 5500, which doesn't even have full DirectX support, yet don't quote Matrox or Volari?

RE: X2900? Not X2800?
By StevoLincolnite on 2/17/2007 10:38:44 PM , Rating: 1
He didn't make this list, some other guy on some other forum did. I prefer people to quote where they get their information from.
Original source to his list:
Original source to his list:

RE: X2900? Not X2800?
By Chillin1248 on 2/18/2007 7:00:51 AM , Rating: 5
Actually, I did make the list.

Brute Force on that forum is me, watch as I add "chillin1248" to the top of the first post at that link.

I do not condone stealing people's work, and if I borrow theirs I always list my sources to the best of my ability.


RE: X2900? Not X2800?
By bohhad on 2/18/07, Rating: 0
RE: X2900? Not X2800?
By colgateam on 2/25/2007 3:05:52 AM , Rating: 2
Haha seeing StevoLincolnite get owned really made my day.

RE: X2900? Not X2800?
By Crusader on 2/26/2007 9:31:54 AM , Rating: 2
8800 GTX 768MB

Done picking. Thanks.

RE: X2900? Not X2800?
By sprockkets on 2/26/2007 3:04:43 PM , Rating: 1
That's why I like how Linux identifies it by the chip itself and not the marketing crap. But even then, that is confusing too.

RE: X2900? Not X2800?
By Targon on 3/14/2007 8:51:03 AM , Rating: 2
The problem comes from both ATI and NVIDIA when it comes to the names of cards. When you have GS, GT, GTS, Ultra, and the "plain" versions of cards, that's almost as bad as the Pro, XT, and XTX you see from ATI. What's wrong with releasing new numbers, like the X2800, X2805, X2810, X2815 and dropping all the extras?

RE: X2900? Not X2800?
By obeseotron on 2/17/2007 4:54:51 PM , Rating: 2
I would have to say nVidia's naming scheme is a lot more consistent and to some extent easier to understand than ATI's. Since the Geforce 4 it's pretty consistently been the generation number followed by a number that differentiates it in speed from the others in the same generation. Comparing between generations is nearly impossible based solely on the number, but within a generation, it's been consistent with a few exceptions.

ATI's problem was starting with 8000 in the same generation that nVidia used 4000. After the 9000 series they went with X000 instead of 10000, and that leads to today's strange Latin/Arabic hybrid numbering scheme. ATI has also put misleading prefixes on some of their low-end cards, making them seem to be from a newer generation than they actually are.

RE: X2900? Not X2800?
By Dactyl on 2/17/2007 6:35:49 PM , Rating: 2
Seems they just added a 100 to every GPU segment this generation.
They can't keep that up, or their numbers will wrap around (the best card of the X3x00 series would be the X4100 XTX)

Or they'll use two-digit numbers

Or (my preference) they'll move to hexadecimal

RE: X2900? Not X2800?
By Warren21 on 2/17/2007 8:49:24 PM , Rating: 2
The numbers in the Radeon series aren't just for show, they mean something about the generation.

The X in front of 2900 is meant to be read as the Roman numeral 10, meaning that an X2900 is a 12-series card.

Using wrap around or two digit numbers doesn't work.

Also, using hex would be severely bad marketing. 95% of the population doesn't know hex, and it would confuse customers. People also believe (as in school grading) that the earlier the letter in the alphabet, the better. I.e., Joe Schmoe would think that an A is better than a B, C, D, E or F.

RE: X2900? Not X2800?
By Vinnybcfc on 2/22/2007 10:24:01 AM , Rating: 2
Or maybe they might go back to using a 7 and you have the X3700

RE: X2900? Not X2800?
By Hawkido on 2/22/2007 10:51:54 AM , Rating: 2
Or (my preference) they'll move to hexadecimal

Oh No! I can already see it:

Truly, Hex is not the way to go.

Talk about Marketing nightmare. Sales may go up in some segments though...

RE: X2900? Not X2800?
By jay401 on 2/24/2007 12:34:44 PM , Rating: 2
Is there a "G" in hex? I thought hex only went out to "F".

RE: X2900? Not X2800?
By Tyler 86 on 2/24/2007 3:54:03 PM , Rating: 2
Unfortunately, there are some GS series cards, like the 6800 GS...

RE: X2900? Not X2800?
By Jayceon Carter on 2/18/2007 4:19:25 PM , Rating: 2
ATI cards with 800 in their name weren't the most popular or best-performing in the past, except for the 9800 (Pro). They can always upgrade the name to X2950 if they need to.

RE: X2900? Not X2800?
By Future145 on 2/18/2007 9:53:05 PM , Rating: 2
It's true that it doesn't leave much room for expansion, but I'm guessing they will just add more letters to the end, like X2900 XT, X2900 XTX, X2950 XT, X2950 XTX, etc.

By paydirt on 2/19/2007 9:38:53 AM , Rating: 2
Either buy or don't buy, your card will likely be obsolete in a few years. Is the obsolescence planned? Hmmm.

By timmiser on 2/20/2007 9:42:48 PM , Rating: 2
I understand what you are saying, but there is one thing to keep in mind. This new round of high-powered video cards means you will most likely have to invest in a multi-rail power supply that runs upwards of $300-$500! In addition to the need for increased wattage, you won't be able to boot up with a single-rail power supply regardless of how many watts it is.

I have an 8800 GTX, which requires two PCI-E direct feeds from the PS. My 500W single-rail high-quality PS with power adapters could not feed the video card what it needed, and I had to upgrade to a triple-rail PS, and I'm not even planning on going SLI. If you did want to SLI your 8800 GTXs, you'd need a PS with 4 PCI-E plugs to run optimally. Those power supplies are pricey.

By Whedonic on 2/17/2007 12:23:58 AM , Rating: 5
Agreed. While Intel and AMD have been making CPUs cooler and more efficient, it looks like the current GPU tactic is to pack as many transistors on the board as possible...use brute force to boost performance. I seriously hope that there's a new architecture in the works for the next cycle that won't require 12 inches and 700 million transistors.

By saratoga on 2/17/2007 1:19:17 AM , Rating: 2
Not likely. For perfectly parallel problems like graphics, there's a fairly hard limit on performance/watt for a given nm process. You can give up clock speed, or give up shaders, but that drops your performance almost proportionally.

By Trogdor on 2/19/2007 11:42:45 AM , Rating: 2
The retail R600 will not be 12" long, that's only for the OEM/SI card. Retail cards will be 9.5" dual slot. Around the same size/possibly slightly shorter than the 8800.

By Targon on 3/14/2007 8:44:17 AM , Rating: 2
With a CPU, there is only so much of a performance increase you can get by adding more cores. With a GPU, each additional pixel pipeline adds that much more to the performance of the video card. That's why graphics processors have continued to increase in transistor count: because of the continued increase in the number of shaders and pixel pipelines.

Now, an interesting side-effect of the AMD/ATI merger is that we may see Radeon GPUs move to 65nm and 45nm a LOT faster than previous transitions. That should help in the size/power department. It won't help with the increase in the data path between GPU and video memory. Remember the R600 is supposed to use a 512 bit connection to memory, which is only part of the reason for the size of the cards.

When you need to fit a gig of memory on a video card, that will also increase the physical size of the cards until higher density memory chips are released.
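The appeal of that 512-bit path is easy to quantify. A minimal sketch of peak bandwidth; note the 2000 MHz effective GDDR4 data rate here is a placeholder assumption, since final memory clocks are still unannounced:

```python
def peak_bandwidth_gbs(bus_width_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_mhz * 1e6 / 1e9

# R600's 512-bit bus at a purely hypothetical 2000 MHz effective GDDR4 rate
print(peak_bandwidth_gbs(512, 2000))   # 128.0 GB/s
# R580's 256-bit bus at the same rate, for comparison
print(peak_bandwidth_gbs(256, 2000))   # 64.0 GB/s
```

Doubling the bus width doubles peak bandwidth at any given memory clock, which is exactly why the wide bus drives up board size and routing complexity.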

By saratoga on 2/17/2007 1:16:58 AM , Rating: 2
What is the point of salivating over this card if the power consumption is greater than its overall performance? I mean, for the amount of power required to run these cards, what will they equate to in performance?

How exactly do you compare watts to FPS? They're not the same thing. IE:

"10 watts faster at Quake" doesn't make any sense.

By Axbattler on 2/17/2007 5:56:55 PM , Rating: 2
Card A consumes 200W to deliver 150 FPS in a certain application, under a certain setting.
-> Each watt gets you 0.75 FPS in that scenario.

Card B consumes 300W to deliver 180 FPS in the same application using the same setting.
-> Each watt gets you 0.6 FPS in this scenario.

Assuming those two cards were, at the time of the test, running at 100% (or close to it), then you can say that Card A is more efficient than Card B, at least under the test circumstances. And under those very circumstances, Card A may be more attractive to some, especially if Card A runs much cooler (as it should). Of course, if one is only interested in more FPS, then Card B is the better choice. And in situations where the game does not hit three-digit frame rates, people may be more willing to pay the premium in power consumption over the more efficient choice if it means going from an unplayable frame rate to a playable one.

[I do not know anything about the X2900's performance or power consumption, so I am not agreeing or disagreeing with the quoted statement]
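The comparison above reduces to a one-line calculation; a minimal sketch using the hypothetical card numbers from the comment:

```python
def fps_per_watt(fps, watts):
    """Efficiency metric: frames per second delivered per watt drawn."""
    return fps / watts

card_a = fps_per_watt(150, 200)   # 0.75 FPS/W
card_b = fps_per_watt(180, 300)   # 0.60 FPS/W

# Card A is the more efficient part, even though Card B is faster outright
print(card_a, card_b, card_a > card_b)
```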

By brute1248 on 2/17/2007 7:23:43 AM , Rating: 4
That "review" has come under extreme criticism for the many errors in the article, which cast a shadow on its validity.


By Tyler 86 on 2/21/2007 11:16:09 AM , Rating: 2
TSSAA: Transparency Super-Sample Anti-Aliasing
AAA: Adaptive Anti-Aliasing

Both deal with improving the image quality of textures with transparent components, to make fences and railings seem more realistic (less blocky, sharper edges but smoother contours, less moiré effect, and undistorted at distances)...

By Enoch2001 on 2/17/2007 11:07:10 AM , Rating: 2

I don't mind waiting a bit longer for this card.

Laughable - even at 26 fps, ALL of the cards in that questionable "benchmark" perform like arse. When I can do 60 fps or higher at 1600 x 1200 with maxed-out effects, then and only then will any video card be worth it.

catalyst 8.36????
By hardwareking on 2/17/2007 2:05:17 AM , Rating: 2
AMD anticipates the target driver for X2900 XT to be Catalyst 8.36

They have barely reached Catalyst 7.3, and this card is due in about 2-3 months.
So at the rate of one driver update a month, they'll only come up to about Catalyst 7.6 or 7.7.

Must be an error.

RE: catalyst 8.36????
By Warren21 on 2/17/2007 2:43:59 AM , Rating: 2
This card is due in less than 2 months, man, more like 1.5... RTFA, hard launch = Mar 30th.

RE: catalyst 8.36????
By ButterFlyEffect78 on 2/17/2007 2:44:51 AM , Rating: 2
The card is due in 6 weeks, not 2-3 months. The driver version (Catalyst 8.36) is obviously a typo. They most likely meant to say 7.36 around the end of March.

RE: catalyst 8.36????
By Saist on 2/17/2007 3:22:36 AM , Rating: 2
It's not a typo.

The current Linux driver version is 8.33.6

8.36 sounds about right given that the next Linux driver release this month will be 8.34

Next month will be 8.35, and then 8.36 after that, which will be the launch of R600.

RE: catalyst 8.36????
By Griswold on 2/17/07, Rating: 0
RE: catalyst 8.36????
By peternelson on 2/18/07, Rating: -1
RE: catalyst 8.36????
By paydirt on 2/19/2007 9:36:43 AM , Rating: 4
Sorry, you're in the minority, even for "high-end" gamers. Just because it's your case doesn't make it so for other hardcore gamers. Most l337 gam3rs aren't going to care much about fooling around with a 2nd OS, so they are just going to use Windoze.

RE: catalyst 8.36????
By KristopherKubicki on 2/17/2007 5:34:33 AM , Rating: 2
It's not a typo.

8.xx refers to the driver, Linux or otherwise. We're talking about the Catalyst package based on the 8.36 display driver.

RE: catalyst 8.36????
By Snipester on 2/17/2007 10:32:48 AM , Rating: 2
They mean the display driver version. The Catalyst version is just a package number; 7.1 just means it came out in (2007, 1st month).

CCC itself and the display driver have their own version numbers. You can check in the CCC panel what driver number you have, and also the CCC version number.
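The package-versus-driver distinction can be illustrated with a tiny parser for the Catalyst package scheme described above (a sketch; it simply assumes the year.month convention holds):

```python
def catalyst_release(package_version):
    """Decode a Catalyst package number like '7.1' into (year, month).

    Assumes the convention described above: the part before the dot is
    the two-digit year, the part after is the release month.
    """
    year, month = package_version.split(".")
    return 2000 + int(year), int(month)

print(catalyst_release("7.1"))   # (2007, 1): January 2007
print(catalyst_release("7.3"))   # (2007, 3): March 2007
```

The 8.xx display driver number underneath follows its own sequence, which is why "Catalyst 8.36" in the article is not a package name from 2008.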

RE: catalyst 8.36????
By johnsonx on 2/18/2007 12:55:03 PM , Rating: 2
From my CCC running on Vista 32-bit:

Driver Packaging Version 8.333-070118a2-041241C-ATI
Catalyst® Version 07.1
Provider ATI Technologies Inc.
2D Driver Version
Direct3D Version
OpenGL Version
Catalyst® Control Center Version 0122.1848.2579.33475

By Rebel44 on 2/17/2007 10:04:21 AM , Rating: 2
I hope power consumption won't go much higher than 200W.

RE: Power
By shabby on 2/17/2007 10:14:07 AM , Rating: 2
240w for the retail, 270w for the oem.
Your hopes are diminished.

RE: Power
By Rebel44 on 2/17/2007 10:59:48 AM , Rating: 2
I don't think so, because you can use two 6-pin power cables, which means 225W max.

RE: Power
By shabby on 2/17/07, Rating: 0
RE: Power
By Rebel44 on 2/17/2007 2:28:59 PM , Rating: 2
I know that it has a 6-pin and an 8-pin connector, but the 8-pin is backwards compatible, which means you can use either 6+8 or 2x6 - and those numbers (240 and 270W) are just more rumors.
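The 225W figure follows from the PCI Express power budgets of the time: 75W through the x16 slot, 75W per 6-pin connector, and 150W for an 8-pin. A quick sketch of both wiring options:

```python
# PCI Express power delivery ceilings (per the PCIe specs of the era)
SLOT_W = 75    # through the x16 slot itself
PIN6_W = 75    # one 6-pin supplemental connector
PIN8_W = 150   # one 8-pin (2x4) supplemental connector

two_six_pin    = SLOT_W + 2 * PIN6_W         # 2x 6-pin wiring
six_plus_eight = SLOT_W + PIN6_W + PIN8_W    # 6-pin + 8-pin wiring

print(two_six_pin)      # 225 -- the ceiling with two 6-pin cables
print(six_plus_eight)   # 300 -- headroom for the rumored 270W OEM card
```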

RE: Power
By melgross on 2/17/07, Rating: 0
RE: Power
By Hawkido on 2/22/2007 11:05:10 AM , Rating: 2
With the 2x6 you lose the Overdrive feature (the integrated overclocking feature of the driver).
1x6 + 1x8 unlocks the Overdrive feature.

RE: Power
By Targon on 2/17/2007 8:26:11 PM , Rating: 2
With more memory comes more power demand, and it's something that many people may not be thinking about. How much power is needed for the GPU is one question, then how much power is needed to drive both the GPU and the 1 gig of memory on the card?

The 512 meg version will have a lower power demand just because the memory is cut in half, though it remains to be seen how much less, since the GPU will also have its speed reduced.

To be honest, do we REALLY need a 512 meg video card right now, let alone a 1 gig video card? I've been fairly satisfied with my 128 meg Radeon 9800 Pro, though I HAVE been feeling the need for 256 megs of memory and a more powerful GPU for a while now. I am personally going to wait for the X2900 XT, and leave the XTX for those who feel that 512 megs of video memory isn't enough.

The price difference between cards is also something to keep in mind. If the 512 meg XT is $150 cheaper and only a touch slower, many people will jump on it if the performance is there compared to the GeForce 8800 cards.

RE: Power
By DeepBlue1975 on 2/19/2007 7:36:19 AM , Rating: 2
In benchmarks at high res with anti-aliasing turned on (the most desired target for 20" and larger LCD panels), the 320MB 8800 GTS takes a big hit versus the 640MB model.
With no AA enabled, the difference is easily forgettable (at least it is for me; a $100 saving, which accounts for 25% of the price, for losing less than 10% in performance is a great deal).
Anyway, I like AA and HDR, but I guess I could live without AA if I can get HDR to work (right now I can't - using an X800 XL here :( )

Will R600 be DX10.1?
By sandytheguy on 2/17/2007 1:14:14 AM , Rating: 2
I remember hearing somewhere that R600 was going to be a DirectX 10.1 part. Can this be confirmed or denied?

RE: Will R600 be DX10.1?
By FITCamaro on 2/17/07, Rating: -1
RE: Will R600 be DX10.1?
By Url on 2/17/2007 10:31:33 AM , Rating: 2
DirectX 10.1 is being worked on by Microsoft. DirectX 10 has been released; it came with Windows Vista.

RE: Will R600 be DX10.1?
By xsilver on 2/17/2007 10:52:02 AM , Rating: 2
Sorry, how is the 8xxx series not true DX10 compliant?

RE: Will R600 be DX10.1?
By kilkennycat on 2/17/2007 2:58:08 PM , Rating: 2
Specifics on the 8800's lack of compliance with DX10, please. Together with URLs to authoritative sources, please.

RE: Will R600 be DX10.1?
By RyanVM on 2/18/2007 6:05:36 PM , Rating: 2
Umm, DX10 shipped with Vista, which, last time I checked, is out.

GeForce 8950GX2........???
By crystal clear on 2/18/2007 3:38:25 AM , Rating: 2
"Six weeks from now, the world will get the first retail Radeon X2900 XTX"

Read this.....

GeForce 8900GTX and 8950GX2 Pricing and Information

"As you can see, nVidia have a whole range of new graphics cards ready and apparently waiting to spoil AMD and ATI’s R600 launch. Word around Taiwan is that the cards are basically ready and nVidia is just waiting for AMD to take the covers off their range of upcoming R600 graphics cards."

Our friends over at OCW seem to be talking to the right people here in Taiwan and have a bunch of information regarding the most expensive card, the GeForce 8950GX2, which will be sold for $599 USD. It is rumored to be a dual-GPU solution much like the 7950GX2 and will come with a total of 512MB of GDDR4 memory per GPU.

The second on the list is the GeForce 8900GTX, which will replace the current high-end 8800GTX. It will ship at a retail price of $549 USD and sees total pipelines increased to 128, an increase of 32, or 25%, as suggested by Fudo over at the INQ recently. It will also see the inclusion of GDDR4 memory, which allows nVidia to increase the clock speed to a huge 2200MHz DDR. It also seems like the manufacturing process is being reduced to 80nm from 90nm in the latest batch of G80 chips.

RE: GeForce 8950GX2........???
By Lakku on 2/18/2007 3:48:05 AM , Rating: 2
Increased to 128 pipelines? I am pretty certain the 8800 already has 128 shader units, so what exactly is being conveyed here? I also thought the whole 'pipeline' idea of GPU architecture did not apply to the 8800.

RE: GeForce 8950GX2........???
By crystal clear on 2/18/2007 4:33:30 AM , Rating: 2
Refer to the table displayed in the link I gave.

Right now I am checking other Chinese/Taiwanese sites; that's the reason I put the "???" in the heading.

Translation of these sites is cumbersome.

RE: GeForce 8950GX2........???
By AnnihilatorX on 2/18/2007 7:01:08 AM , Rating: 2
According to the table, the 8900 GTX and 8800 GTX have the same 128 shader units. Yes, I can read Chinese.

So what?
By fence on 2/17/07, Rating: 0
RE: So what?
By justjc on 2/17/2007 9:43:40 AM , Rating: 3
To some people it does mean a lot to have the new stuff; it can be an addiction. For others, this is the way to make their new computers last longer; after all, we know DX10 will become mainstream someday.

I'm looking forward to R600 because I have a good idea that ATI knows how it's done, as they already made DX10-class graphics for the Xbox 360. However, I won't buy a DX10 card for at least a year either, as the use for it currently isn't there in my case.

RE: So what?
By kilkennycat on 2/17/2007 3:07:47 PM , Rating: 2
DX10 in the Xbox 360? Er, what are you smoking? Maybe partial DX10 emulation at 0.1 frames per second? The Xbox 360 hardware is frozen in obsolescence. Maybe when M$$ gets around to the Xbox 720, with an integrated HD-DVD drive, a beefed-up CPU, and a DX10 GPU derived from the upcoming mid-range DX10-capable PC graphics cards, your statement might have validity. And your old Xbox 360 can join all the others of the same vintage on eBay.

RE: So what?
By Tyler 86 on 2/21/2007 11:21:32 AM , Rating: 2
With games like WoW grabbing so much media attention, what makes you think the PC gaming scene is dead? (I don't play WoW, so let me know if it died since last I checked...)

By Gigahertz19 on 2/16/2007 11:15:15 PM , Rating: 1
As Borat would say, "Very Niiice, how much?"

RE: Nice
By crazydingo on 2/16/2007 11:17:37 PM , Rating: 5
How many shaders?

Scalar? Vec4?



By Lakku on 2/18/2007 4:03:17 AM , Rating: 2
I didn't painfully experience anything due to drivers, because I didn't jump on the Vista bandwagon right away. Was nVidia at fault? Certainly, at least from a marketing standpoint of touting their Vista-ready GPUs for the ultimate Vista experience. Were people who expected Vista to have perfect driver support out of the gate at fault? Certainly, because no MS OS, since Win 95 at least, has ever had good driver support to start out. XP had problems with games etc. at launch as well. At any rate, since you are talking about DX9 support, well, stick with XP. While not perfect, and no display driver usually is nowadays, the 8800 GTX runs just fine for 98% of what I play/do, so there's nothing painful about it. Until DX10 games come out, I see no need for Vista to play games. If you need it for work or other functionality, try dual booting.

By Tyler 86 on 2/21/2007 11:24:36 AM , Rating: 2
Were people who expected Vista to have perfect driver support out of the gate at fault? Absolutely not.

nVidia and ATI both had a reasonably long amount of time with the prerelease versions of Vista to have had their drivers ready for its release.

ATI managed to pull it off; what the hell happened to nVidia?

By crystal clear on 2/18/2007 5:25:05 AM , Rating: 2
VR-Zone has learned that the official marketing name for R600 will be the Radeon X2900 series, and we will start using the name from now on. We can expect AMD to launch the Radeon X2900 XTX and XT first, while the X2900 XL will follow later. The recently leaked R6xx series chart with pricing contains lots of false information is all we can say. RV610 and RV630 are 65nm, and RV610 has a 64-bit memory interface.

Among other things we can confirm right now: R600 is 80nm, contains 700M transistors, and has a 512-bit memory interface. X2900 requires 2 supplemental power connectors from the PSU, one 8-pin and one 6-pin, but you are able to plug a 6-pin power cable into the 8-pin connector. The targeted driver release version for the X2900 series is Catalyst 8.35.1. Systems with X2900 XTX cards and tech demos will be at CeBIT, but inside a private room for the privileged only.

VR-Zone has learned some new details on the 80nm R600 today, and there will be 2 SKUs at launch: XTX and XT. There will be 2 versions of the R600 XTX; one is for OEM/SI and the other for retail. Both feature 1GB of GDDR4 memory on board, but the OEM version is 12.4" long to be exact and the retail is 9.5" long. The above picture shows the 12.4" OEM version. The power consumption of the card is huge: 270W for the 12" version and 240W for the 9.5" version. As for the R600 XT, it will have 512MB of GDDR3 memory onboard, is 9.5" long, and consumes 240W of power. Lastly, there is a cheaper R600 XL SKU to be launched at a later date.

R600 final specs?
By xNIBx on 2/17/2007 4:37:50 PM , Rating: 5
Where are the specs? Maybe you forgot to write them down? Core frequency(ies)? Memory frequency? How many shaders? What type of shaders? How many TMUs and ROPs?

This article offers nothing that we didn't already know and mentions almost no specs. The title of the article is misleading.

three things
By soydios on 2/17/2007 12:25:41 AM , Rating: 2
First: 700 million transistors?!? Wow. Power will be something to see, and I really hope that it's a 65nm part.

Second: the XTX has 1GB of GDDR4, the XT has 512MB of GDDR3. That's a big gap between what have been, in past generations, two nearly identical parts except for clock speeds.

Third: Even though I'm waiting until R700/G90 to get a DX10 card, I want to see benchmarks and the nitty-gritty details of the silicon on this puppy ASAP!

RE: three things
By coldpower27 on 2/17/2007 5:02:44 AM , Rating: 2
It's on the 80nm node, pretty much.

Yeah, I guess throwing 1GB of VRAM on the entire lineup isn't quite feasible at this time.

ATI is trying to extend the gap between the XT and XTX, it seems, rather than have an X1900 XT and XTX situation again, where the performance difference wasn't a whole lot.

Power Consumption
By AninParadX on 2/17/2007 1:21:38 AM , Rating: 2
Personally I think the X2900 is a great chip. I just read on another news site that the OEM version uses 270W (max) and the consumer version around 240W (max). I am just a little bit afraid that consumers won't be happy with so much power consumption, and I am a bit concerned this will set a new trend in the graphics sector. Intel and AMD are trying to lower power consumption while graphics cards are using more and more power. Where will this end? Damn, now I sound like a doomsday prophet ;)

RE: Power Consumption
By coldpower27 on 2/17/2007 5:11:51 AM , Rating: 2
You also have to keep in mind that die sizes have been increasing over the past few generations, and these days a new process is usually only cooler if you run it at around the same clock frequencies and around the same die size.

The reason Intel and AMD have been able to lower power consumption is that their die sizes haven't fluctuated much in the past few years on the mainstream front; we're talking about 80-150mm2 for mainstream SKUs. Not to mention the larger dies were mostly large-cache parts, which aren't too power hungry to begin with, as cache doesn't suck a ton of juice.

It's very hard to lower power consumption when all the power-reducing effects of a process have been used to increase the number of transistors on the chip.

Look at ATI: they used a newer 80nm process, but it looks like any power-reducing effects were countered by the large increase in transistor count and the increase in clock speeds over the prior generation. You can't have it all anymore; something has got to give. GPUs increase performance at a rate of 2x each significant generation through parallelism rather than clock speed, and that requires adding transistors, which generate more heat and require more energy.

Good news.
By bunnyfubbles on 2/17/2007 3:38:38 AM , Rating: 2
Although I'll most likely wait until next gen (or even half gen), since I've heard that by then they (both NV and AMD) should be back on track and have a handle on ballooning card size/power/heat (noise).

RE: Good news.
By Pitbulll0669 on 2/17/2007 8:43:55 AM , Rating: 2
They are all supposed to be DX10 cards, i.e. the new batch, with the top end being $600 and the bottom being $100. It will be interesting to see what they do about Vista drivers and whether they will enable full 64-bit support. For one, I'm getting one. I run the X1950 XTX now, but I can't wait to push the X2900 XTX. I water cool anyhow, so heat won't be an issue. Plus I'm running a QX6700 Kentsfield quad at 3.4 right now, so I can't wait to unleash that bad boy. And they are supposed to be coming out with their version of two GPUs on one card too, the X2900 XTX2. -Pit.

Chinese New Year FTW!
By Kromis on 2/17/2007 3:24:47 PM , Rating: 2
Chinese New Year for the win! Time to get some of those red envelopes and save up to get the card!

RE: Chinese New Year FTW!
By slacker57 on 2/20/2007 2:15:22 PM , Rating: 2
I only get 20 bucks every year. Not going to cover it :(

So we are all to HDMI 1.3 now?
By iwod on 2/18/2007 8:07:47 AM , Rating: 2
UDI is dead after its remaining backers, Intel and Samsung, jumped ship to DisplayPort. ATI was supposed to be one of the backers of DisplayPort (as they also stated in their roadmap).

Now all of a sudden they decide HDMI is the way to go? Did they change the roadmap, or is HDMI's cost no longer a problem?

By Predatorgsr on 2/18/2007 11:58:00 AM , Rating: 2
Why are there different board designs for retail and oem?

Everyone takes vacation for the Chinese New Year?
By EglsFly on 2/18/2007 5:15:19 PM , Rating: 2
just in time for everyone to go on vacation for Chinese New Year


By Pythias on 2/19/2007 9:16:06 AM , Rating: 2
Any excuse to party :)

X2900 for the hardcore gamers
By psychobriggsy on 2/19/2007 9:57:00 AM , Rating: 2
I think most people are more interested in the X2200 or X2400 (and is there going to be an X2600?) cut-down variants of R600. And on the downward pricing pressure they'll put on X1xxx series cards...

Impressive to get 700m transistor chips in the consumer market though.

By kilkennycat on 2/21/2007 5:52:16 PM , Rating: 2
Yep. nVidia's 681 million transistors in the G80 is indeed quite an achievement.

A bit disapointed...
By Snowy on 2/17/2007 9:46:37 AM , Rating: 2
Ah, the R600 sounds like a beast indeed, but is anybody else disappointed with the March 30th launch? That's giving Nvidia quite a big lead with their G80.

I'm wondering when the mid-range R600 parts are coming out, I'd almost say that they are more important than the high end, because not too many people can afford a $600 graphics card.

Although I'm sure that, since this launch is quite late, the R600 will be worth waiting for.

RE: A bit disapointed...
By fence on 2/17/07, Rating: -1
RE: A bit disapointed...
By gilboa on 2/18/2007 8:53:10 AM , Rating: 2
I just bought a 24" Dell display and my GF6800 doesn't even come close to handling Doom3/Q4/X2 at native resolutions. (1920x1200)
While I'm not in the market for the new R600 (lack of iron-clad Linux support) - I will be getting the 8800/640 once the Linux support matures.

Given my age, I couldn't care less about the size of my E-penis.

- Gilboa

By yacoub on 2/16/2007 11:14:19 PM , Rating: 2
Final clock frequencies will likely remain estimates until later this month.

Prices too, then?

That's a little hasty.
By Dodgeballa03 on 2/17/2007 12:56:10 AM , Rating: 2
From the estimates that I have heard, the R600 will launch at or around $600 same as the 8800GTX. nVidia will lower their prices and AMD will be the cream of the crop for a few months.

RE: That's a little hasty.
By Rebel44 on 2/17/07, Rating: 0
By suryad on 2/17/2007 1:02:33 AM , Rating: 2
I understand this R600 will be an 80 nm part?

ATI possibly fastest?
By SilverBack on 2/17/2007 11:59:43 AM , Rating: 2
Not for long, if it even is. The 8950 GX2 dual-GPU card is sitting quietly at Nvidia, ready to be released...

New Nvidia Cards
By TheRequiem on 2/18/2007 5:12:34 PM , Rating: 2
So is Nvidia releasing the 8900 GTX's immediately when the R600 comes? Also, I wonder if the nvidia cards have HDMI too...

Any chance for an "AGP" version?
By Blood1 on 2/20/2007 10:53:37 AM , Rating: 2
Does anyone know if there will be a possible AGP release for this chipset? Anyone? Ideas?

Vacation for Chinese new year?
By jediknight on 2/21/2007 10:19:44 AM , Rating: 2
Late yesterday DailyTech was briefed on the final details for the upcoming R600 retail specifications, just in time for everyone to go on vacation for Chinese New Year

Who gets that?

By kilkennycat on 2/21/2007 5:45:47 PM , Rating: 2
The launch of the R600 has been delayed into Q2. Discount this article.

From AMD today:

To better align our strategy with current market opportunities, we've changed the launch plan for R600. We are going to deliver a competitive configuration to market with an extremely attractive combination of performance, features and pricing, targeting a broader market segment in Q2. With the revised strategy, AMD will be better able to capitalize on the broad appeal of 3D graphics and DirectX 10, being driven in part by the growing popularity of Microsoft Windows Vista.

Two things
By gramboh on 2/17/07, Rating: 0
RE: Two things
By TheRequiem on 2/17/07, Rating: -1
RE: Two things
By mjz on 2/18/2007 9:57:21 AM , Rating: 2
Any power supply works on any card, Nvidia or ATI, so buying Nvidia based on an Nvidia-certified PSU is stupid.

AMD cash-flow warning
By crystal clear on 2/18/2007 4:37:30 AM , Rating: 2
"Analyst sounds AMD cash-flow warning"

A US analyst has expressed "increasing concerns" that AMD is heading toward a cash-flow crisis, even as shareholders' ears prick to whispers that a private equity company is looking to buy a stake in the chip maker - or even the whole kit and caboodle.

AMD's cash flow was questioned yesterday by American Technology Research analyst Doug Freeman, who told the firm's investor customers: "We were surprised to see AMD shares rally yesterday given what we believe to be increasing concerns about cash flow at the company... we think management will be forced to come to the capital markets for operating cash before the end of the summer."

RE: AMD cash-flow warning
By crystal clear on 2/18/07, Rating: -1
What happened to your wallet?
By pauldovi on 2/17/07, Rating: -1
RE: What happened to your wallet?
By bbomb on 2/17/2007 3:57:23 PM , Rating: 2
Quote your source?

R600 will prove worthy.
By ButterFlyEffect78 on 2/17/07, Rating: -1
RE: R600 will prove worthy.
By tkSteveFOX on 2/17/07, Rating: 0
RE: R600 will prove worthy.
By Sailorboy on 2/17/07, Rating: -1
RE: R600 will prove worthy.
By FakeDetector on 2/25/2007 3:22:48 PM , Rating: 2
by Sailorboy on February 17, 2007 at 6:47 AM

You're right about the performance. See benchmarks at:

FAKE !!!! FAKE !!!!FAKE !!!!FAKE !!!!FAKE !!!!FAKE !!!!
