


Say hello to the first dual Radeon video card

GeCube sends us word that the company has officially announced its Dual Radeon X1600 Gemini video card. The card is the first of its kind to feature two Radeon GPUs on a single PCB.  NVIDIA is certainly no stranger to multi-GPU PCBs, but those solutions require NVIDIA's SLI platform to run.  The X1600 Gemini is unique in that it does not require CrossFire (or SLI) and will run on any PCIe motherboard.

GeCube accomplishes this bit of magic by putting separate core logic directly on the PCB.  GeCube would not release high-resolution images of the card, but we were able to confirm that the design uses three processors: two Radeon X1600 GPUs and a third, unnamed core-logic chip tying the two together.

The card features two DMS59 outputs -- meaning you can actually run four DVI (or four analog) displays from one card. 

ATI is currently co-developing the driver for this card with GeCube, and we should expect a retail launch sometime in late April.  ATI is also expected to launch a new GPU, RV570, later this year that will not require additional core logic to run in CrossFire mode; each RV570 will have CrossFire support on-die.


Comments



Proof of concept?
By lemonadesoda on 3/23/2006 7:41:43 PM , Rating: 2
I guess a dual X1600 is really a proof-of-concept card. If it works, then swap the X1600 for an X1800 or X1900 GPU. I'll get me one of those. Nice.

Alternatively, if there are many X1800 or X1900 dies not quite making it, then you can stick two failed X1800s (that CAN work at X1600 speeds) together and get performance back up to X1800 levels. Hence effective yield increases. No chip wastage. Nice.
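As a rough back-of-the-envelope sketch of that yield argument (a hypothetical calculation; the die counts and bin splits below are invented, since real numbers were never published):

```python
# Hypothetical salvage-yield arithmetic for the argument above; every number
# here is invented, since real bin splits were never published.
wafer_dies = 200          # X1800-class die candidates per wafer (assumed)
full_speed_yield = 0.55   # fraction passing full X1800 validation (assumed)
salvageable = 0.25        # fraction failing X1800 but stable at X1600 clocks (assumed)

good_x1800 = wafer_dies * full_speed_yield
gemini_boards = (wafer_dies * salvageable) / 2  # two salvaged dies per board

print(f"X1800 cards per wafer:     {good_x1800:.0f}")
print(f"dual-GPU boards per wafer: {gemini_boards:.0f}")
print(f"sellable-product uplift:   {gemini_boards / good_x1800:.0%}")
```

Under these made-up numbers, pairing salvaged dies turns would-be scrap into roughly a quarter more sellable product per wafer.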

Or perhaps designed solely to meet the corporate workstation DVI x4 market.

It seems to me that the manufacturing cost will be more than a single-GPU X1800 or X1900 unless they are using the failed-die approach. I wonder what the street price will be for this card?

How about an AGP version, so that millions of corporate workstations can be upgraded?

***BONUS*** >> I guess with TWIN AVIVO you could do some pretty quick encoding. :-) Nice.




RE: Proof of concept?
By UNCjigga on 3/23/2006 8:23:10 PM , Rating: 2
Wouldn't dual-core GPUs be a much more elegant solution?


RE: Proof of concept?
By Jep4444 on 3/23/2006 9:11:11 PM , Rating: 2
This solution is far from elegant, and so would dual-core GPUs be.

The thing is, it makes no sense to throw two GPUs onto one PCB, since it's cheaper just to double up the hardware on a single GPU. GPUs are already highly parallel, so dual core makes no sense (whereas CPUs have not been parallel up to this point).
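A toy throughput model makes the point: since pixel work is embarrassingly parallel, one die that is twice as wide reaches the same peak rate without any inter-chip overhead. Every number here is assumed for illustration:

```python
# Toy peak-throughput comparison: one die twice as wide vs. two dies that
# must split and merge work. All numbers are invented for illustration.
pipes = 12            # pixel pipelines per mid-range die (assumed)
clock_mhz = 590.0     # core clock in MHz (assumed)
sync_overhead = 0.15  # fraction lost to inter-chip splitting/merging (assumed)

one_wide_die = (2 * pipes) * clock_mhz                    # Mpixels/s, idealized
two_dies = 2 * (pipes * clock_mhz) * (1 - sync_overhead)  # Mpixels/s, idealized

print(f"one die, twice as wide: {one_wide_die:.0f} Mpixels/s")
print(f"two dies with overhead: {two_dies:.0f} Mpixels/s")
```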


RE: Proof of concept?
By BrynS on 3/24/2006 7:48:23 AM , Rating: 3
While GeCube's experience developing the X1600 Gemini should give them a leg up on the other AIBs for future products using the more formidable RV560/RV570 GPUs in a similar fashion -- allowing for possible Quad-CrossFire configurations across two PCBs -- this product (the X1600 Gemini) was never intended as a proof of concept for a gaming solution.

In this sense, the arguments against the viability of multiple GPUs per PCB as a gaming solution are moot, since the GeCube product was developed for an entirely different niche (video). The likely take-up of Quad SLI at the extreme high end will put paid to the multi-GPU-PCB argument anyway, even if each 7900 GX2 PCB is not quite the same engineering implementation as Gigabyte's dual 6600 or Asus's dual 7800 GT PCBs.

In this DigiTimes interview - http://www.digitimes.com/mobos/a20060222PR202.html - (requires subscription) GeCube states that the X1600 Gemini is aimed at the video market, which makes sense given the four DVI ports:
quote:
...We also have another product we’ll be releasing, a dual GPU graphics card, which we’ve tentatively named “Gemini.” It’ll be based on the X1000 series GPU from ATI and will have four DVI output ports which can simultaneously display four different DVD streams. Our objective here is to get into the video market. This probably will be available by April after we display it at CeBIT and fine tune the drivers. We’ll probably first launch the X1600XT high-end version for around US$399...


I believe it was mentioned in various CeBIT coverage that the final price should be considerably lower, due to ATI price cuts on their X1K range, particularly the X1600.


RE: Proof of concept?
By lemonadesoda on 3/24/2006 9:35:56 PM , Rating: 2
Interesting info. Thanks for the research.

However, my goodness! Do people really want to watch 4 videos at the same time?

>> I guess GeCube actually meant to say "video-editing"/"video-production" market?

And 4 DVD streams? Do people actually have 4 DVD players in their PC?

>> I guess GeCube actually meant to say DVD quality "video-sources"? ie. from video capture card, internet/cable, or HDD streams.


RE: Proof of concept?
By lemonadesoda on 3/24/2006 9:27:57 PM , Rating: 2
Elegant or efficient?

A one-chip solution would need a new fab process and a huge piece of silicon. The "tooling costs" are not worth it. It's cheaper in the short run to have two separate chips. (Short run is always true for GPUs, which never last more than 6 months before being superseded.)
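A minimal sketch of that silicon-cost tradeoff, assuming a simple Poisson yield model (the wafer cost, defect density, and die areas are all invented, and packaging, bridge-chip, and board costs are ignored):

```python
import math

# Toy die-cost model behind the "huge piece of silicon" argument. Wafer cost,
# defect density, and die areas are all assumed values for illustration.
WAFER_AREA_MM2 = math.pi * 150 ** 2  # 300 mm wafer
WAFER_COST = 5000.0                  # USD per processed wafer (assumed)
DEFECT_DENSITY = 0.002               # defects per mm^2 (assumed)

def cost_per_good_die(die_area_mm2):
    """Poisson yield model: yield falls exponentially with die area."""
    dies_per_wafer = WAFER_AREA_MM2 / die_area_mm2      # ignores edge loss
    good_fraction = math.exp(-DEFECT_DENSITY * die_area_mm2)
    return WAFER_COST / (dies_per_wafer * good_fraction)

small, merged = 150.0, 300.0  # mm^2: two mid-range dies vs. one merged die
print(f"two small dies: ${2 * cost_per_good_die(small):.2f}")
print(f"one merged die: ${cost_per_good_die(merged):.2f}")
```

With these made-up inputs the merged die costs noticeably more in raw silicon than two small ones, which is the short-run economics the post describes.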


History repeats itself
By armagedon on 3/23/2006 4:50:24 PM , Rating: 2
Anybody remember the ATI Rage 128(?) dual-chip card from years ago? It disappeared as quickly as it saw the light.
Dual-chip cards were rapidly superseded by quicker single cores.
I'm not sure this one has a better future, even though it's surprising that the old Voodoo SLI has found new life at NVIDIA recently.




RE: History repeats itself
By Saist on 3/23/2006 4:59:43 PM , Rating: 2
Rage MAXX

Thing is, it is cheaper and easier to build one chip that does all the functions than to link two chips together.

Think about it for a second. If you have two chips, you now have to figure out how to supply power to them both, figure out how to balance the data sent to both chips, and determine whether the chips are aware of each other or controlled by another chip.
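For a sense of what balancing work across two chips involves, here is a minimal alternate-frame-rendering sketch. It is illustrative only, and the names are hypothetical; neither GeCube nor ATI documented how the Gemini's core logic actually splits work:

```python
# Illustrative alternate-frame-rendering (AFR) dispatcher. The class and
# method names are hypothetical; this is not GeCube's or ATI's actual scheme.
from collections import deque

class DualGpuDispatcher:
    """Round-robins whole frames between two GPUs; the on-board core logic
    would still have to merge results back into one scan-out stream, in order."""

    def __init__(self):
        self.queues = [deque(), deque()]  # one command queue per GPU

    def submit_frame(self, frame_id, commands):
        gpu = frame_id % 2  # even frames -> GPU 0, odd frames -> GPU 1
        self.queues[gpu].append((frame_id, commands))
        return gpu

dispatcher = DualGpuDispatcher()
for frame in range(6):
    gpu = dispatcher.submit_frame(frame, [f"draw calls for frame {frame}"])
    print(f"frame {frame} -> GPU {gpu}")
```

Even this trivial split leaves open the hard parts the post alludes to: keeping frames in order, sharing textures between chips, and deciding which chip drives the display.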

It's why ATi took the stance during 2003, 2004, and 2005 that dual-chip solutions were going nowhere, and we have, in fact, seen that come true. A Radeon X1900 decimates two X850s hooked together. The GeForce 7900 toasted two 6800s.

However, most consumers are unwilling to look at the long-term viability of their purchase, and instead look to the short term. Consumers want it "now".

That is why SLI and CrossFire sell. Consumers can get that extra power, now. Not later.

Also, the current CrossFire and NVIDIA SLI techniques are nowhere close to the old Voodoo SLI. So, no, it is NOT surprising that the old Voodoo SLI has shown a new life. It's not even being used.

So... also, no. History is not repeating itself in regards to Voodoo SLI. History, however, is repeating itself in regards to the Rage MAXX.


RE: History repeats itself
By Bonrock on 3/23/2006 7:26:44 PM , Rating: 2
In regards to dual-chip graphics solutions, you wrote:

However, most consumers are unwilling to look at the long-term viability of their purchase, and instead look to the short term. Consumers want it "now".

The thing is, there's no "long term viability" to worry about if you're a consumer. Your dual-chip video card won't suddenly stop working if GeCube and ATI decide to stop making this card. Now, I don't know how much this thing will cost, and I probably won't be able to afford it. But if you can afford it, and it works as advertised, I don't see the problem with getting one.


RE: History repeats itself
By Sunbird on 3/24/2006 3:34:27 AM , Rating: 2
I had a Rage Fury MAXX; they were too lazy to release a Windows 2000 driver, and since then I have never had another ATI card. My TNT2s can still be used with Win XP...


History Repeats Itself
By rhisgen on 3/23/2006 5:19:42 PM , Rating: 2
I had a Rage MAXX when they came out (I still have it somewhere). It was a good card, 64MB.
The problem was their design for dual-processor video was not supported under NT or 2000, and MS products were headed that way. So ATI discontinued it. I wish they would have redesigned it for use under the NT kernel.




RE: History Repeats Itself
By ksherman on 3/23/2006 5:50:57 PM , Rating: 2
I actually bought one from a garage sale about 2 years ago for $16... but it wouldn't work on any of my PCs (all XP). Do you know if it is supported in Linux in any way?


RE: History Repeats Itself
By Saist on 3/23/2006 6:31:01 PM , Rating: 2
Technically, yes.

Keeping in mind that ATi begins their 3D support for Linux at the Radeon 8500, the Rage MAXX isn't even a candidate for support in the FireGL drivers.

However, the card "should" run under Mesa. Emphasis on should. I've never actually tried it.


Not always
By Egglick on 3/23/2006 6:03:00 PM , Rating: 2
Just because the Rage Fury MAXX failed doesn't necessarily mean that this card will fail. The MAXX had no core-logic chip, and it also had very poor software/driver support. To top it off, the Rage 128 chip was already outdated by the time the MAXX came out.

This card looks like it could do a lot better, but because of the logic chip, it will probably require custom drivers. I guess we'll see how all that goes.




Here's an idea
By shaw on 3/23/2006 6:39:48 PM , Rating: 2
Bundle two useful GPUs together!




day late and a dollar short
By Anemone on 3/24/2006 9:10:09 PM , Rating: 2
I'm going to start calling ATI "me too me too"!

Look, if they could do it, why'd they wait for someone else to announce it? Oh wait, that would be because they said "wow, what a great idea! Hey wait, our chips can do that!"

Now, I'm not really a fan of taking something as expensive as a graphics chip and putting it to use doing physics, but I guess we all need something more to do, lol. However, NVIDIA does have dual-GPU boards out there; while they use mobile chips, they do not require SLI on the motherboard to work.

However, one poster hit on the burning truth early on: a dual-chip solution buys you one year of extra speed at most. After that, the single core catches up and passes the dual-chip solution. Now, in some cases people need that extra speed, say, to drive a 30" LCD! OK, maybe "need" wasn't the right word. But in many cases you are far better off buying the top-end $500 card now and the next $500 card a year from now than two of the first card now.

$.02




"It's okay. The scenarios aren't that clear. But it's good looking. [Steve Jobs] does good design, and [the iPad] is absolutely a good example of that." -- Bill Gates on the Apple iPad
