
Sapphire's dual-GPU Radeon X1950, dubbed "The Godfather," in action
Quad-Crossfire, here we come

If waiting for R600 has you down, Sapphire has an upcoming alternative.  Early this morning the company internally demonstrated its dual-GPU Radeon X1950 Pro board, dubbed The Godfather.  The dual-slot card uses two 6-pin power connectors but still fits within the standard 9" board dimensions.

Sapphire representatives tell us the performance of the single card is comparable to two Radeon X1950s in Crossfire mode.  The card runs on a single x16 PCIe connector. Both dual-link DVI connectors can be utilized at the same time.

Previous Radeon X1950 Pro cards have a GPU core clock of around 575MHz with a memory clock of 700MHz.  The clock frequencies for The Godfather were not revealed.
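
For reference, a memory clock translates into raw bandwidth roughly as follows -- a back-of-envelope sketch, assuming the X1950 Pro's standard 256-bit GDDR3 bus running at double data rate:

```python
# Rough memory-bandwidth estimate from the clocks quoted above.
# Assumes a 256-bit GDDR3 bus (standard on the X1950 Pro), double data rate.
mem_clock_mhz = 700
bus_width_bits = 256
ddr_factor = 2  # GDDR3 transfers data on both clock edges

bandwidth_gb_s = mem_clock_mhz * 1e6 * ddr_factor * (bus_width_bits / 8) / 1e9
print(f"Per-GPU memory bandwidth: {bandwidth_gb_s:.1f} GB/s")  # 44.8 GB/s
```

A dual-GPU board would carry two such buses, one per GPU, though each GPU can only address its own memory.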

Also visible on the adaptor are the two interfaces needed for AMD Crossfire.  We have confirmed that the upcoming adaptor does not require a Crossfire motherboard in single-card configurations.  However, memos from the company reveal that the card can be put into Crossfire mode with another identical card on a Crossfire motherboard -- the upcoming implementation would be the first capable of Quad-Crossfire.

Like other X-series Radeon cards, The Godfather is also capable of physics processing via Havok or any other middleware capable of interfacing with the Radeon FPU for physics calculations.

Expect to see demonstrations and benchmarks of the card during the upcoming CES 2007.  Sapphire claims the card will be available in Q1'07, well ahead of the March 2007 R600 launch.  Sapphire recently announced its AGP version of the Radeon X1950 Pro, but the company would not say whether The Godfather will be available in an AGP version.


By leidegre on 12/21/2006 4:54:38 AM , Rating: 4
Dual-core GPU is nothing new, but would you really buy this? Now, I hate to say it, but nVidia as of today does provide a better alternative. The money this would cost simply can't compete with what you get from nVidia.

Then again, everything that can be done should at least be tried. But I would definitely wait and see what the R600 will bring, and what nVidia will do with the G80.

But for the moment, I wouldn't buy anything in terms of hardware... there are too many changes going around with Vista and DX10 shipping in the beginning of '07.

By masher2 on 12/21/2006 5:22:45 AM , Rating: 1
> "Dual core GPU is nothing new, but would you really buy this?"

All depends on the cost, of course. It's likely to be considerably cheaper than the 8800 series, and probably somewhat close in performance to the 8800 GTS. If those parameters wind up correct, then there will be a market for it.

By Aikouka on 12/21/2006 8:13:51 AM , Rating: 2
First, I wanted to note that this is more than likely a dual-GPU/die solution rather than a dual-core solution.

I'm kind of curious about the power specs on this thing... I know the 1950xtx is a bit of a power-hungry monster, but this is based off of the pro (which features the 2 cross-fire connectors, which is most likely how this works).

CES 2007 just seems a bit late to be introducing such a product though. It'll either be too close to R600's release or after R600 is released and I'm not sure anyone would want The Godfather over an R600-based video card solution.

Also, about AGP solutions. I don't know how viable an AGP solution would be, because the fact is that only AGP 3.0 supports multiple AGP processors per AGP controller. I'm not sure how well a PCI-E to AGP bridge chip (sort of like nVidia used, but they used AGP to PCI-E) would work in this situation to make this a possibility. Maybe it's just time for people who expect decent graphics to finally upgrade?

By Spoelie on 12/21/2006 8:20:38 AM , Rating: 1
The Godfather will not come in AGP form; they just mentioned that Sapphire launched an X1950 Pro AGP as well.

By Aikouka on 12/21/2006 1:14:51 PM , Rating: 2
> "Sapphire recently announced its AGP version of the Radeon X1950 Pro, but the company would not say whether The Godfather will be available in an AGP version."

The second part in that sentence is what my comments on AGP were directed at :).

By sbanjac on 12/21/2006 9:10:32 AM , Rating: 2
You are correct when you say that the 1950 XTX is power hungry, but I believe that doesn't mean all 1950 graphics cards are automatically power hungry. For example, the 1950 Pro is one of the more efficient ATI cards on the market: it has 36 pixel processors and 12 ROPs/pipes, and it runs at 580/700.
One 1950 XTX has 16 ROPs/pipes and 48 pixel processors. It runs at 650/775-1000.
When it comes to power consumption (according to official data), the X1950 XTX consumes 125 W while the X1950 Pro consumes 75 W.
That means this dual-GPU card should have power consumption a little above a single XTX. If you ask me, that's great.

One 1950 Pro Dual will have 24 ROPs/pipes and 72 pixel processors. On a clock-to-clock basis it will be 50% faster than the XTX. However (if the clocks remain the same), the XTX has a 12% faster GPU clock and up to 42% faster memory (which is not quite right, because the Radeon X1950 Pro Dual should have two memory buses, meaning twice the memory bandwidth, so I believe it is safe not to consider this difference extreme). All this puts the Radeon X1950 Pro Dual in front of everything else in AMD's lineup. One PowerColor Radeon 1950 Pro with 512 MB of memory (core 600, RAM 700) costs 195 € while one XTX with 1000 MHz memory costs 340 €. I believe the price for the new dual-GPU Radeon X1950 could be around 360 €. That is not a bad offer.
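
The arithmetic in the comment above can be checked with a quick sketch -- using the figures the poster quotes (36 shaders at 580/700 per Pro GPU, 48 shaders at 650/1000 for the XTX, official 75 W / 125 W power numbers), not confirmed specs for The Godfather:

```python
# Back-of-envelope comparison using the figures quoted in the comment above.
pro = {"shaders": 36, "core_mhz": 580, "mem_mhz": 700, "power_w": 75}
xtx = {"shaders": 48, "core_mhz": 650, "mem_mhz": 1000, "power_w": 125}

# Raw shader throughput: two Pro GPUs on one PCB vs. a single XTX.
dual_pro_work = 2 * pro["shaders"] * pro["core_mhz"]
xtx_work = xtx["shaders"] * xtx["core_mhz"]
print(f"Shader throughput ratio (dual Pro / XTX): {dual_pro_work / xtx_work:.2f}")

# Each Pro GPU keeps its own memory bus, so aggregate memory clock doubles.
print(f"Aggregate memory clock: {2 * pro['mem_mhz']} vs {xtx['mem_mhz']} MHz")

# Power: two Pro GPUs vs. one XTX, from the official figures quoted above.
print(f"Estimated board power: {2 * pro['power_w']} W vs {xtx['power_w']} W")
```

By these numbers the dual card comes out roughly a third ahead of the XTX in raw shader throughput at stock clocks, at about 25 W more board power.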

By oralpain on 12/21/2006 4:29:56 PM , Rating: 1
It is clearly two separate GPUs on the same PCB.

By Aikouka on 12/22/2006 9:58:57 AM , Rating: 1
Yeah, I loved that clear shot of the PCB with the heatsink removed too.

Oh wait...

By surt on 12/21/2006 6:59:55 PM , Rating: 3
A 'dual core' graphics solution is very unlikely. Graphics card designs are already essentially 8 or 16 core, and to design a 'dual core' GPU rather than just a 32 pseudo-core chip would be more difficult rather than less.

By Aikouka on 12/22/2006 9:57:17 AM , Rating: 2
... what was the point of this? I said it wouldn't be dual-core. Frankly, Sapphire is a card manufacturer, not a chip manufacturer; they wouldn't develop their own ATI-based chip just to put in their own cards. However, a quasi-Crossfire single-PCB card isn't out of the question: two graphics subsystems on the same PCB, connected to each other through one of the two Crossfire interfaces, with each subsystem exposing its additional Crossfire interface at the top of the card like a normal 1950 Pro.

Also, graphics cards are not "essentially multi-core." With that logic, you could consider half of the computing components today "multi-core" just because they have more than one component capable of completing a task.

By surt on 12/26/2006 3:36:46 PM , Rating: 2
The point was that claiming it wasn't dual core was itself pointless. There won't be any dual-core graphics designs because they are already highly parallel designs. Making them 'dual core' would add cost, not reduce it, so the likelihood of any of the graphics players doing this for anything other than marketing purposes is very, very low.

By abhaxus on 12/21/2006 1:34:17 PM , Rating: 2
Those of us that own motherboards with one PCIe slot are always interested in dual gpu/single slot configurations. Now if I could only afford it... :)

By Wwhat on 12/22/2006 9:48:34 AM , Rating: 2
Agreed, too little too late, people want a cheap (less than $250) card or a DX10 capable card now.

Quad?
By DigitalFreak on 12/21/2006 10:39:05 AM , Rating: 2
I guess ATI has learned nothing from the failure that is Quad-SLI?

RE: Quad?
By peldor on 12/21/2006 11:43:14 AM , Rating: 3
ATI probably did learn a lot.

Sapphire? Not so much.

RE: Quad?
By therealnickdanger on 12/21/2006 12:25:13 PM , Rating: 5

Crossfire on motherboards? What a crock!
By Araemo on 12/21/2006 8:39:46 AM , Rating: 2
"We have confirmed that the upcoming adaptor does not need a Crossfire motherboard as well."

Good riddance! Seriously, I could understand crossfire/SLI support being special on a motherboard if A: crossfire/SLI didn't require a side-band connection between the two cards(The crossfire/SLI bridge), and if B: PCI Express didn't allow two PCIe devices to communicate directly without controller arbitration at high speeds as part of the spec. (And since you can crossfire low end nVidia cards without a bridge connector, nVidia realizes that too.)

Now then, if nVidia will follow suit, we'll be back to a fair and level graphics playing field. I have not heard a single technical reason why crossfire/SLI would NOT work on a given motherboard that has the physical PCIe lanes, and the driver inf hacks that allow it to work on any motherboard that has the physical PCIe lanes seem to support my suggestion. ;)

By KristopherKubicki on 12/21/2006 8:47:20 AM , Rating: 1
My apologies if there is some confusion here. For one card, you will not need a Crossfire motherboard. This is unique since previous dual-GPU Radeons did require Crossfire boards, and previous dual-GPU Geforces required SLI motherboards.

For multi-card solutions, you'll still need a Crossfire motherboard I believe. Sapphire might have something planned, but they were pretty tight lipped about it.

RE: Crossfire on motherboards? What a crock!
By Jedi2155 on 12/21/2006 8:48:21 AM , Rating: 2
> "you can crossfire low end nVidia cards without a bridge connector"

Crossfire nVidia? That's new!

RE: Crossfire on motherboards? What a crock!
By jackalsmith on 12/21/2006 9:42:01 AM , Rating: 2
Not really.
You can run 2 nVidia cards in SLI on a Crossfire board with a modified nVidia driver. I've seen 2 8800 GTXs on a 975X Crossfire board.

By DigitalFreak on 12/21/2006 10:38:23 AM , Rating: 2
Yep, it's all political. I hate that!

Irrelevant, too late by a year.... ??
By kilkennycat on 12/21/2006 3:38:46 PM , Rating: 2
Dual Crossfire, no DX10. It would have to be a blind ATi (er, AMD) enthusiast who would buy this board rather than wait to see what the R600 has to offer in competition with the G80/8800 series.... and then make a decision between an R600 board and the 8800 family. Sapphire must think that they can make some money, so maybe there are some very rich and blind enthusiasts out there for this dead-end product. BTW, the nVidia 7950GX2 is also a dead-end product, obviously rendered immediately obsolete by the readily-available 8800GTX. Maybe ATi (er, AMD) and Sapphire know something that we don't, and R600-based video cards will not be available in the retail chain for a very long time? Hence this stop-gap product might be viable for maybe a year, instead of the ATi (er, AMD)-hinted 3-4 months before R600 shipments.

RE: Irrelevant, too late by a year.... ??
By masher2 on 12/21/2006 3:59:33 PM , Rating: 3
> "so maybe there are some very rich and blind enthusiasts out there for this dead-end product"

This card is going to be what, about $400? It hardly takes a rich man to afford one. And this card will be just fine until DX10 becomes prevalent, which is still likely a year or more away.

By fumar on 12/29/2006 2:43:13 AM , Rating: 2
I agree, this product is not irrelevant in single-card situations. But it is impractical to attempt Quad-Crossfire. Even Core 2 Quads/QX6800s won't provide the four GPUs with enough data to run at their full potential.

And Cousin Vinny says...
By Kurumi on 12/21/2006 12:40:50 PM , Rating: 2
You're going to buy this card, along with a payment for the "protection" he's offering :p

x1950 pro overclocking don't mix
By jackalsmith on 12/21/06, Rating: -1
"Young lady, in this house we obey the laws of thermodynamics!" -- Homer Simpson
Related Articles
Sapphire Announces Radeon X1950 Pro AGP
November 30, 2006, 2:18 PM
ATI Announces Stream Computing Technology
September 29, 2006, 3:11 PM
ATI GPU 2006 Roadmap
June 6, 2006, 3:20 PM

Copyright 2016 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki