
Historical pricing of the ATI Radeon X1900 CrossFire - Image courtesy AnandTech
Cheap ultra-high-end Radeons are upon us

For those of you who can't wait for the Radeon X1950 to debut on the 23rd, ATI has already started cutting pricing on the ASICs for the Radeon X1900 series cards: Radeon X1900 CrossFire, X1900XTX, X1900XT and X1900GT.  Essentially, all of the ATI cards have shifted down a price tier, leaving the top $500 price tag open for the Radeon X1950. 

The pricing change began to affect some merchants early this week, with the majority of the merchants following suit now.  Here are a few of the average price changes over the last few days:
Although the price reduction has mostly affected built-by-ATI cards, Sapphire and PowerColor are also starting to see some movement toward the new pricing.  Expect the rest of the industry to follow shortly, and definitely expect all pricing to be adjusted before August 23rd.  ATI distributors claim there will be some availability of the Radeon X1950 on launch day.

The Radeon X1950 is ATI's first GDDR4 graphics card.  The card is expected to fill in the pricing tier of the older Radeon X1900s. 

Update 08/09/2006:
We originally published that the release date for the Radeon X1950 XTX was August 27.  This was a typo; the correct date is in fact August 23.


verdict: wait for DX10 parts
By pyrosity on 8/8/2006 2:38:40 AM , Rating: 2
August 27 is my birthday.

Too bad I will be a poor college student around that time.

The upcoming DX10 parts are what sound the most interesting to me. Better parts causing large price reductions are always welcome in my book, however.

RE: verdict: wait for DX10 parts
By hwhacker on 8/8/2006 5:43:30 AM , Rating: 2
I agree on waiting for DX10 or the X1650 Pro/X1950 Pro, as they certainly fill in nice spots in ATI's midrange... but for now, when you can get an X1900 AIW (notably missing from the list) for as cheap as $209, there are certainly some good deals to be had compared to what is top of the line right now.

RE: verdict: wait for DX10 parts
By JeffDM on 8/8/2006 7:22:14 AM , Rating: 2
Isn't DX10 going to be Vista-only? As such, that's not much of a feature to me.

RE: verdict: wait for DX10 parts
By killerroach on 8/8/2006 8:00:03 AM , Rating: 2
Even if you're not upgrading to Vista, the DX10 parts are slated to offer more pipelines at a higher clock speed, which would provide for even greater DX9 performance than anything currently out on the market. Not to mention I'm not exactly thrilled about the concept of buying a third DX9-generation video card, and will hobble around on my GeForce 6600 until I can snare a G80-based card (probably an 8800GT or something equivalent).

RE: verdict: wait for DX10 parts
By Le Québécois on 8/8/2006 9:47:42 AM , Rating: 2
From what I know, the next Nvidia part isn't going to be unified, at least not completely. So I guess you will be buying a semi-4th-generation DX9 card after all. If I remember correctly, you need a completely unified architecture to be allowed to put DX10 on your product.

RE: verdict: wait for DX10 parts
By Nelsieus on 8/8/2006 11:34:23 AM , Rating: 2
No, you don't; there's nothing in the DX10 spec / API that *requires* a unified shader architecture.

G80 will be SM4 and WGF2.0 compliant, and fully DX10 compliant. I think you're just confused by the misconception that DX10 requires unified shaders, which it does not. It does, however, support them, granted that's the direction a firm wants to go in (as ATI is opting to do).

RE: verdict: wait for DX10 parts
By Le Québécois on 8/8/2006 2:07:00 PM , Rating: 2
Have you read the link I had in my post?

If so, and you still think DX10 doesn't need unified shaders, then I misunderstood the article, and thank you for correcting my error.

And if the link IS wrong in what it says, could you please give me a link with the correct information so I can be up to date on future products? Because until I read what you said, I was only waiting for the R600. Having the G80 in mind for my next GPU would be a nice option. My X800XL is getting old...

RE: verdict: wait for DX10 parts
By mrzeld on 8/10/2006 6:09:02 PM , Rating: 2
I believe the answer you are looking for is software vs. hardware. To the programmers, DX10 is unified: they won't have to worry about this shader or that shader. As for the hardware, the video drivers will take an incoming request for some action and send it to the appropriate hardware. It is all about abstraction.

RE: verdict: wait for DX10 parts
By Engine of End on 8/8/2006 9:21:39 AM , Rating: 2
Too bad the DX10 cards are going to be power hungry as well as expensive. Over 300 W under load? No thanks.

RE: verdict: wait for DX10 parts
By rrsurfer1 on 8/8/2006 9:25:20 AM , Rating: 2
Why, it's not like we're in an energy crisis or anything... oh wait...

RE: verdict: wait for DX10 parts
By Engine of End on 8/8/2006 9:35:29 AM , Rating: 2
Some of us don't want huge electric bills, especially when you would probably need at least a 1 kW PSU to run DX10 cards along with everything else.

RE: verdict: wait for DX10 parts
By RamarC on 8/8/2006 9:40:26 AM , Rating: 2
a least a 1KW PSU to run DX10

Some of the new cards are going to come with external power supplies, so you won't need to upgrade your internal unit. But the back of your PC will become more of a rat's nest of cables, cords, and power bricks.

RE: verdict: wait for DX10 parts
By MrDiSante on 8/8/2006 10:24:34 AM , Rating: 2
That's why I'm happy to live in a condo - the electricity and water bills are shared equally by all of the residents.

RE: verdict: wait for DX10 parts
By ChuckvB on 8/8/2006 3:31:41 PM , Rating: 3
Hmm, you got me thinking about how you could use your set-price electricity in your condo to produce a transportable energy source to sell or to use for your car. How about a little hydrogen electrolysis in the spare bedroom?

RE: verdict: wait for DX10 parts
By Phynaz on 8/8/2006 2:46:04 PM , Rating: 2
Too bad the DX10 cards are going to be power hungry as well as expensive. Over 300W on load? No thanks

Don't believe it. There's not a chip on earth that could withstand that. 300 W being dissipated by something the size of your fingernail would glow orange, if not white.

By rrsurfer1 on 8/8/2006 3:31:32 PM , Rating: 2
Believe it or not, it is possible. You can already overclock CPUs built on previous-generation processes to 150+ watts on air cooling. And that figure includes no memory.

You have to realize the memory, running at high clock speeds, is also consuming quite a bit of power. It could easily get to 300 W. And of course the chip would probably fry in seconds without a heatsink, but the sink continually draws heat away so it never gets white hot. The cooling demands for the next generation of video cards are going to be high.
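The back-of-the-envelope numbers in this exchange can be sanity-checked with a quick sketch. The die size and the GPU/memory power split below are illustrative assumptions, not specs for any real card:

```python
# Rough power-density check (all figures are illustrative assumptions).
die_area_cm2 = 3.5       # a large high-end GPU die, roughly 350 mm^2
gpu_power_w = 150.0      # share of board power dissipated by the GPU die itself
board_power_w = 300.0    # total board power: GPU + GDDR + VRM losses

density = gpu_power_w / die_area_cm2          # watts per cm^2 at the die
other_w = board_power_w - gpu_power_w         # memory and power-delivery share
print(f"Die power density: {density:.1f} W/cm^2")
print(f"Non-GPU board power: {other_w:.0f} W")
```

With a heatsink continuously removing that heat, the die holds a steady temperature well below incandescence, which is why a 300 W *board* figure is plausible even though a bare chip would cook itself in seconds.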

By MonkeyPaw on 8/8/2006 4:12:14 PM , Rating: 2
The entire card will consume 300 W, not just the GPU. Granted, the GPU will consume the most energy, but the GDDR will draw some power, along with any inefficiencies in the power delivery circuitry on the graphics card. IIRC, most high-end cards already have heat sinks on these components. If those components are giving off that much heat, then they are likely a decent part of the energy consumption.

RE: verdict: wait for DX10 parts
By oddity21 on 8/8/2006 9:28:31 AM , Rating: 2
I myself am going for either the G80 or R600 for my new build. Just bought a 700W PSU. Let's hope that will be enough for an SLI/CF setup...

RE: verdict: wait for DX10 parts
By rrsurfer1 on 8/8/2006 11:40:07 AM , Rating: 2
SLI - doubtful. If you go high-end, you're talking nearly 300 W per card, which leaves 100 W to run the rest of your system. If you go with the base model, it should use something like 150 W if the rumors are correct, which you could handle. God knows what kind of cooling system will be required for those inefficient monsters... CPU-style coolers on each chip...

RE: verdict: wait for DX10 parts
By Seer on 8/8/2006 1:09:29 PM , Rating: 2
You're exaggerating the power needs. 700 should be enough.

By rrsurfer1 on 8/8/2006 2:03:59 PM , Rating: 2
Your "700 should be enough" is only a guess, too. The rumors put the new generation at between 140 to 300+ watts. If it's 300*2, the it is indeed 600, which IS 100w for the system, far too low to run stably off a 700w supply.

But we'll both have to wait and see, unless you are employed by either ATI or nVidia ;)

RE: verdict: wait for DX10 parts
By archcommus on 8/8/2006 9:47:12 AM , Rating: 2
I'm hoping my 256 MB X800 XL can last me until the NEXT generation of energy-efficient video cards, but I'm not sure that's possible. Either way, I don't feel like replacing my 400 W PSU, so if I get a first-gen DX10 card, it had better come with a simple, cheap, 5.25" bay supplemental power supply.

By rrsurfer1 on 8/8/2006 10:15:02 AM , Rating: 2
Same here. Video companies are ridiculous. There's absolutely no need for these super power-hungry designs. I understand the logistics/costs involved in developing smaller process tech, but come on, a video card that uses more power than the whole rest of the computer!?! At a time when oil prices are rising and every other industry is striving for better power efficiency.

I have an X800XL as well and it will be my last card until they get the power consumption under control. I'm not having my computer be more power hungry than my damn air conditioner! I don't care if they include the power supply.

By crazyeinstein on 8/9/2006 10:41:06 AM , Rating: 2
My birthday is also on August 27th! Nice to meet someone sharing the same birthday!

I currently own an XFX 6600GT 256 MB and am planning to sell it off and get a 7900GT 256 MB, maybe Asus or XFX.
What do you think: should I wait till the DX10-based Nvidia cards are out, or go ahead?
BTW, I own an Asus A8N-E (nForce4 Ultra chipset) and an AMD Athlon 64 3200+ with 1.5 GB of DDR400 Transcend RAM.

By L33tMasta on 8/8/06, Rating: 0
RE: nVidia
By SixDixonCider on 8/8/06, Rating: 0
RE: nVidia
By Nelsieus on 8/8/2006 2:19:09 AM , Rating: 2
Or nVidia's G80, which is supposed to be debuting in September.

RE: nVidia
By The Cheeba on 8/8/2006 2:26:07 AM , Rating: 4
Oh yeah. I'm sure that's going to be cheap.

RE: nVidia
By tuteja1986 on 8/8/2006 7:39:29 AM , Rating: 2
X1900XT for $310 would be a real killer seller :)

RE: nVidia
By RamarC on 8/8/2006 10:13:04 AM , Rating: 2
X1900XT for $310 would be a real killer seller :)

The X1800 XT 256 MB for $205 ain't bad either. It's the fastest card in that price range.

By christovski on 8/8/2006 2:33:07 PM , Rating: 2
Lots of people are running this card, and shopATI recently dropped the price dirt cheap (I just got one for $130); in that price bracket the XL is unbeatable. I think some press about the mid-to-high-end parts would have helped sales a lot more. These prices are outrageous to begin with, so these cuts might make CrossFire systems more bearable.

RE: x800XL
By Phynaz on 8/8/2006 2:46:53 PM , Rating: 2
You can pick up an XT for $150.

By smokenjoe on 8/9/2006 12:11:39 AM , Rating: 2

Don't forget that the only reason you need a 700-watt PSU is that PSU makers aren't really putting out the kind of power they advertise, at least not continuously. Many have trouble powering a system that only uses 250 W. Even worse, even some of the more expensive ones are way off on the real power they can deliver cleanly enough to keep your system running smoothly.

Remember that video cards are almost doubling in performance each year; that kind of power needs to be fed well.

"If a man really wants to make a million dollars, the best way would be to start his own religion." -- Scientology founder L. Ron. Hubbard
Related Articles
ATI Radeon X1950 Announced
July 21, 2006, 5:58 PM

Copyright 2016 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki