



[Pictured: Sapphire's, PowerColor's and GeCube's X1900 GT cards]
ATI ships R580 for under $300

We've received word that Sapphire, GECUBE and PowerColor have all announced the latest additions to the high-end X1900 line of graphics cards.  The ATI Radeon X1900 GT is a single-slot video card built on the ATI R580 core, with a stock core clock of 575MHz, 256MB of GDDR3 memory and a 1.2GHz effective memory clock.


                  Radeon X1900 GT     Radeon X1900 XTX
Pixel Shaders     36                  48
Core Clock        575MHz              650MHz
Memory Size       256MB               512MB
Memory Clock      600MHz/1.2GHz       775MHz/1.55GHz

The card was not supposed to be made available until May 9, to coincide with E3, but several Best Buy stores mistakenly put the card on sale to consumers last week.  Even before the public launch, the card's price had been cut by several manufacturers.  The PowerColor X1900 GT will cost $299, with Sapphire and GECUBE within a buck or two.  Expect to see the cards on sale at the major online retailers this week, and at brick-and-mortar stores immediately.


Comments



Shame no life in AGP parts
By lemonadesoda on 5/9/2006 12:53:20 PM , Rating: 2
I have a perfectly acceptable Intel 865PE system. Northwood 3.0GHz, 2GB DDR, etc. running Windows 2003 Server as a workstation. Absolutely perfect except that I can't buy a current generation GPU. I would really like to drive 2 x 1600x1200 DVI TFTs from the GPU.

I wonder how many other millions of PCs are also non-upgradeable from the GPU perspective?

In fact, I have 5 such machines (in our office). Another 5 lost sales for ATi.

And no, we won't upgrade to an Intel 925/75 platform just to get a PCIe x16 slot. There is really no material performance gain for a complete re-investment in the whole platform (five times over).

So we have had to buy PCI GPUs to provide the second (and third :-) DVI output to coexist with the X800s we have installed.

5 sales to Nvidia! (Quadro4 PCI boards).





RE: Shame no life in AGP parts
By phaxmohdem on 5/9/2006 12:56:57 PM , Rating: 2
Um, couldn't you just buy an AGP Quadro w/ dual DVI?


By lemonadesoda on 5/9/2006 3:24:17 PM , Rating: 2
Good point, but unfortunately the AGP Quadro doesn't have enough horsepower compared to the X800 for driving one screen in DirectX at high FPS.

I would still prefer to have a new generation AGP card for the main screen, rather than the X800. The secondary (and third) display just needs high quality 2D. Not used in 3D.

Perversely the Quadro PCI makes a perfect secondary controller, except it cannot drive > 1600x1200.

I would still prefer an X1800 or X1900 on the main screen for, you know, after-work-hours entertainment ;-)


RE: Shame no life in AGP parts
By Tebor0 on 5/9/2006 12:59:31 PM , Rating: 2
BOTH ATI and Nvidia are guilty of this.


RE: Shame no life in AGP parts
By Plasmoid on 5/9/2006 1:23:24 PM , Rating: 2
How do you make that out?

As I write this you can get a 7900 GT core in AGP from Gainward (sold under the 7800 GS name), or a regular 7800 GS.

Nvidia can hardly do more, considering the AGP 8x transfer rate is being reached, especially since it is one-way and limited by the entire PCI bus transfer rate. Sales of AGP cards are small, so Nvidia aren't going to continue development on it.

In ATI's case, if they offered the X1600 in AGP it would not outperform their X800s or X850s. That being said, they should offer a more modern AGP card.


By lemonadesoda on 5/9/2006 3:30:25 PM , Rating: 2
Good point about the 7800GS, although performance reviews indicate this card is essentially half a generation behind.

On your other point, I beg to differ. AGP 8x saturation is not being reached. What we are doing is increasing resolutions, e.g. from 1280x1024 to 1600x1200 or 1920x1200, and going from 0xAA to 8xAA, etc.

This adds almost nothing to bus bandwidth requirements. AGP 4x is probably still OK. As the market has moved to more memory on the graphics card and to processing more and more effects on the GPU (rather than rendering on the CPU and submitting the results over the AGP bus), we have in fact reduced the bandwidth requirements of a typical GPU.

You may have noted that many 2x SLI and 4x SLI setups are quite happy at PCIe x8 or even PCIe x4.
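As a rough back-of-envelope sketch in Python of the point above (the per-frame upload size is a hypothetical figure for illustration, and the 8-bytes-per-pixel framebuffer estimate assumes 32-bit color plus a 32-bit depth buffer; nominal AGP peak rates are used):

# Sketch: host-to-card bus traffic vs. on-card framebuffer storage.
AGP_4X_GBPS = 1.066   # nominal peak, GB/s
AGP_8X_GBPS = 2.133

def bus_load_gbps(upload_mb_per_frame, fps):
    """GB/s the host must push over the bus (geometry, textures, commands)."""
    return upload_mb_per_frame * fps / 1024

def framebuffer_mb(width, height, aa_samples, bytes_per_pixel=8):
    """On-card color+depth storage; grows with resolution and AA, while bus traffic does not."""
    return width * height * aa_samples * bytes_per_pixel / (1024 * 1024)

# Hypothetical game pushing ~10 MB of new data per frame at 60 fps:
print(bus_load_gbps(10, 60))              # ~0.59 GB/s -- well under even AGP 4x
print(framebuffer_mb(1280, 1024, 1))      # ~10 MB kept on the card
print(framebuffer_mb(1600, 1200, 4))      # ~59 MB on the card -- bus need unchanged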


RE: Shame no life in AGP parts
By Trisped on 5/9/2006 3:28:54 PM , Rating: 3
I am sorry your computer hardware is out of date, but buying a top-of-the-line video card does not seem intelligent under those circumstances. If the X800 on AGP or the NVIDIA 7800 on AGP isn't enough, then you need a new system.


RE: Shame no life in AGP parts
By lemonadesoda on 5/9/2006 5:22:50 PM , Rating: 2
Your answer is partly correct.

1./ With a current-generation card (e.g. an ATI X1900), even on the AGP bus, one could expect roughly 200% of the current framerate in most GPU-limited applications (e.g. DirectX FPS games). I am a framerate-greedy guy. The cost of upgrade would be circa $300 per machine; 5 upgrades would be $1500. I could also drive 2x DVI on each, which the X800 does not have (it has 1x DVI and 1x VGA), without having to add a second GPU controller card.

2./ With the latest-generation CPU and GPU, I could expect up to 250% of the current performance compared to the Northwood 3.0GHz. The cost of upgrade would be mainboard $150+, CPU $400+, memory (2GB DDR) $250+, GPU circa $300. Total $1100+; 5 upgrades would be $5500+. Difference: $4000+.

Yes, option 2 would give me an up-to-date machine, and probably an overall win of 250%/200% = 25% extra over option 1. But at an additional cost of $4000+, it is an investment of nearly 4x option 1 with only a 25% improvement.

(Note that these figures are approximate and based on a quick Google of hardware benchmarking sites.)

My rule for upgrading or renewing hardware is a performance gain on the order of 2x or more. If there is only a small percentage gain, then I don't update the system.

I'm happy to blow $1000 over a weekend, and $1500 if I'm in a really good mood. But $5500? Wishful thinking. It ain't going to happen. Not for 25%.
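Spelled out as a quick Python sketch (using the rough prices and performance ratios quoted above, which are estimates from this comment rather than measured benchmarks):

# Cost/benefit check of the two upgrade options described above.
machines = 5

# Option 1: drop a current-generation AGP card into each existing box.
gpu_only_cost = 300 * machines            # $1,500
gpu_only_gain = 2.00                      # ~200% of the current framerate

# Option 2: full platform upgrade (board + CPU + RAM + GPU) per box.
full_upgrade_cost = (150 + 400 + 250 + 300) * machines   # $5,500
full_upgrade_gain = 2.50                  # ~250% of the current performance

extra_cost = full_upgrade_cost - gpu_only_cost            # $4,000
extra_gain = full_upgrade_gain / gpu_only_gain - 1         # 0.25 -> ~25% more
print(extra_cost, f"{extra_gain:.0%}")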

Consider these analogies:

a) latest HDD available IDE (legacy)
b) latest DVD burners available IDE (legacy)
c) latest sound cards available PCI (legacy)
d) latest raid controllers available PCI, PCI-X (legacy)
e) latest TFTs available VGA/DVI (legacy!)
f) latest PPU available PCI ;-) (legacy)

...I am very surprised that there is not more current product available for AGP.

I hardly want to stand in the way of progress; I'm pleased the path of the future is already being trodden. But what surprises me is the commercial opportunity being lost by offering no upgrade path to literally millions of home and business users.

I personally am sitting out the first round of PCIe (v1.0)... since we have 5 machines to upgrade, the investment cost is too high for too small a benefit. Just as with USB 2.0, PCI 66MHz over PCI 33MHz, and AGP 4x/8x over the original AGP, I'm sure that within the next 18-24 months we will have PCI Express 2.0, quad-core and ULV parts. That's when I'll have to dig deep into my pockets.

My observation over many upgrade cycles (and in this comment I must exclude the wealthy enthusiast who upgrades at every opportunity) is that full system replacement occurs approximately every 4 years. Based on this, hardware manufacturers should plan products that fit the (bi-)annual upgrade path.

It may be a strategic issue: over the last ten years, the market has grown significantly and there has been sufficient appetite for completely new machines. As the (western) market matures and becomes saturated, perhaps "upgrading" rather than "new systems" will have a larger impact on marketing priorities.


RE: Shame no life in AGP parts
By Zoomer on 5/10/2006 6:42:08 AM , Rating: 2
Well, any card manufacturer should be able to use the ATI bridge chip and create an AGP version, just like the X800.

But no one bothers.


According to Sapphire Web Site
By Link on 5/9/2006 1:22:49 PM , Rating: 2
their X1900GT is HDCP ready. Does this mean their card has the chip and keys for HDCP output over DVI and is Windows Vista ready?




RE: According to Sapphire Web Site
By Plasmoid on 5/9/2006 1:25:05 PM , Rating: 2
I doubt it... it doesn't have HDMI outputs, which is what HDCP needs.


RE: According to Sapphire Web Site
By abhaxus on 5/9/2006 1:35:53 PM , Rating: 2
Umm... HDCP works over DVI just fine. And the DVI-HDCP interface is a little cheaper than licensing the HDMI interface.


RE: According to Sapphire Web Site
By Plasmoid on 5/9/2006 1:45:07 PM , Rating: 2
That's a new one for me; won't you need something that supports HDCP but accepts it over DVI?

Last time I checked, all the TVs that accepted HDCP signals had HDMI connectors and component connectors. I guess an adapter could be used, but there is still the matter of sound. Seems all too messy to me for anyone to have products out which go this way.

I still doubt it has actual HDCP keys on the board; I suspect it again refers to the "HDCP Ready" status of the ATI GPU.


By deeznuts on 5/9/2006 2:41:00 PM , Rating: 2
I use a DVI-to-HDMI cable, and it works just fine. As for sound, well, the TV only has two speakers anyway, which I disconnected. Cheap RCA wires will usually suffice here, because who the hell has 5.1 sound going to their TV? Yeah, it's convenient, but not a deal killer.


RE: According to Sapphire Web Site
By Odeen on 5/10/2006 3:45:20 AM , Rating: 2
By and large you DON'T want sound over the HDMI connection.

Why?
You don't want source-to-TV HDMI audio because DD-capable HDMI receiver chips are not making it into consumer TVs. They exist in Silicon Image labs, sure, but the chips in HDMI-equipped TVs only handle PCM or DD 2.0.

After all, adding a 5.1-to-2.0 downmixer chip would increase the cost of the TV (and you need to downmix to 2.0 because no TV has 5.1 speakers built in). But if you switch your source to stereo, you're also feeding stereo to your 5.1- or 7.1-capable receiver.

This stems straight from my experience with consumer-level HDMI equipment, namely a DirecTV HD receiver and a Panasonic plasma TV. The two are connected via HDMI only, with a coax digital audio cable running to the 5.1 receiver.

When I was working on the equipment, I noticed the box wasn't passing through 5.1. I switched it to 5.1, but then you couldn't listen to JUST the TV, because the TV speakers were outputting static. Conclusion: the TV can't receive multichannel audio (and it rightfully shouldn't), and all digital audio outputs on the satellite box are tied together (also reasonable, in retrospect).

A better idea would have been to run HDMI + RCA audio to the TV (feeding the TV downmixed-by-definition analog audio) while keeping the box's digital output at multichannel-if-possible, with the digital out running to the multichannel receiver. Alas, the plasma was bolted to the wall, and running additional cables is impossible. I had to teach the owner how to switch the box from stereo out (for receiverless watching) to DD out (for maximum audio impact when using the receiver).

Single-cable connections only become useful when they are plentiful, standardized, and the source is smart enough to be flexible about the output.

The ideal situation is something like this: a DVD player or cable box/satellite receiver has two HDMI ports, and one port can output video + stereo audio to the TV while the other outputs multichannel and/or high-resolution audio to the receiver. A bidirectional link can inform the source about the capabilities of the device on the other side (e.g. "I am a 1024x768 plasma, and I support stereo PCM and DD 2.0" or "I am a receiver; I can handle DD and DTS up to 7.1, along with PCM at up to 192/24, and DSD"), so the source can automatically engage mixdown circuitry and not send incompatible or disallowed signals where they don't belong.
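Purely as an illustration of that kind of capability exchange (this is a toy Python sketch, not any real HDMI or EDID API; the device names and format lists are made up):

# Illustrative only: a source picks an audio format per output based on what
# the far-end device reports, instead of one global stereo/5.1 switch.
SINKS = {
    "plasma_tv":   {"video": "1024x768", "audio": ["PCM 2.0", "DD 2.0"]},
    "av_receiver": {"audio": ["DD 5.1", "DTS 5.1", "PCM 192/24", "DSD"]},
}

def pick_audio(sink_name, preferred=("DTS 5.1", "DD 5.1", "DD 2.0", "PCM 2.0")):
    """Send the best format the sink advertises; downmix only where needed."""
    supported = SINKS[sink_name].get("audio", [])
    for fmt in preferred:
        if fmt in supported:
            return fmt
    return "PCM 2.0"  # safe fallback

print(pick_audio("plasma_tv"))    # DD 2.0  -- the TV gets a downmix
print(pick_audio("av_receiver"))  # DTS 5.1 -- the receiver gets full multichannel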


I'd love to see some benchmarks
By bamacre on 5/9/2006 1:13:48 PM , Rating: 2
With this priced so close to the 7900 GT, I'd LOVE to see some benchies. Also, I assume this thing will come with a 256-bit memory interface as opposed to 128-bit.




RE: I'd love to see some benchmarks
By coldpower27 on 5/9/2006 1:23:08 PM , Rating: 2
It's been a while now since ATI or NV put 128-bit memory interfaces on their flagship cores.


RE: I'd love to see some benchmarks
By bamacre on 5/9/2006 1:33:01 PM , Rating: 2
Definitely. But at $300, this is certainly the cheapest of any X1900 card. And I assume it's 256-bit, but I'd like to know for certain. :D


By littlebitstrouds on 5/9/2006 2:14:55 PM , Rating: 2
Click on the bottom picture... the box says 256-bit memory.


Worst Buy has them in stock, but.....
By Link on 5/9/2006 3:25:35 PM , Rating: 3
It's a whopping $400 apiece!!!




By toyota on 5/9/2006 4:48:01 PM , Rating: 2
Yep, they are $399 at Best Buy, and they have had the X1900 GT on the shelves for over a week.


What would be cool...
By nrb on 5/10/2006 6:56:04 AM , Rating: 2
What would make this really interesting is if someone could figure out a way of enabling the other 12 shaders, perhaps via a VGA BIOS flash. :-)

But I'm not holding my breath. :-(





RE: What would be cool...
By Ianirvin on 5/11/2006 11:11:43 AM , Rating: 2
Unfortunately the 7900 GTX outperforms this card. I went with an X850 Pro PCIe for my last card because of ATI's superior pricing; however, if they are going to keep prices similar to the 7900, they won't get my business.


"If you mod me down, I will become more insightful than you can possibly imagine." -- Slashdot













