
Sapphire's Dual-Radeon X1950, dubbed "The Godfather," running in action
Quad-Crossfire, here we come

If waiting for R600 has you down, Sapphire has an upcoming alternative.  Early this morning the company internally demonstrated its dual-GPU Radeon X1950 Pro board, dubbed The Godfather.  The dual-slot card draws power through two 6-pin connectors but still fits within the standard 9" board dimensions.

Sapphire representatives tell us the single card's performance is comparable to two Radeon X1950s in Crossfire mode.  The card runs from a single x16 PCIe slot, and both dual-link DVI connectors can be used at the same time.

Previous Radeon X1950 Pro cards have a GPU core clock of around 575MHz with a memory clock of 700MHz.  The clock frequencies for The Godfather were not revealed.

Also visible on the adaptor are the two interfaces needed for AMD Crossfire.  We have confirmed that the upcoming adaptor does not require a Crossfire motherboard in single-card implementations.  However, memos from the company reveal that the card can be put into Crossfire mode with another identical card on a Crossfire motherboard -- this would be the first implementation capable of Quad-Crossfire.

Like other X-series Radeon cards, The Godfather is also capable of physics processing via Havok or any other middleware capable of interfacing with the Radeon FPU for physics calculations.
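As a rough illustration (this is not the Havok or ATI API, just a NumPy stand-in), the kind of workload such middleware offloads is an embarrassingly parallel per-particle update: the same arithmetic applied independently across thousands of elements, which maps naturally onto a GPU's many parallel pipelines.

```python
import numpy as np

# Hypothetical sketch, not middleware code: each particle's update is
# independent of every other's, so the whole step can run in parallel.
def step_particles(pos, vel, dt, gravity=np.array([0.0, -9.81, 0.0])):
    """Advance every particle one timestep with simple Euler integration."""
    vel = vel + gravity * dt      # identical update applied to all particles
    pos = pos + vel * dt
    return pos, vel

# 10,000 particles advanced in one data-parallel pass
pos = np.zeros((10_000, 3))
vel = np.zeros((10_000, 3))
pos, vel = step_particles(pos, vel, 1.0 / 60.0)
```

On a CPU this loop is vectorized; on a GPU, physics middleware can dispatch the same per-element math across the shader units instead.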

Expect to see demonstrations and benchmarks of the card during the upcoming CES 2007.  Sapphire claims the card will be available in Q1'07, well ahead of the March 2007 R600 launch.  Sapphire recently announced its AGP version of the Radeon X1950 Pro, but the company would not say whether The Godfather will also be available in an AGP version.

Comments

By surt on 12/21/2006 6:59:55 PM , Rating: 3
A 'dual core' graphics solution is very unlikely. Graphics card designs are already essentially 8- or 16-core, and designing a 'dual core' GPU rather than just a 32-pseudo-core chip would be more difficult, not less.

By Aikouka on 12/22/2006 9:57:17 AM , Rating: 2
... what was the point of this? I said that it wouldn't be dual-core. Frankly, Sapphire is a card manufacturer, not a chip manufacturer; it wouldn't develop its own ATI-based chip just to put in its own cards. However, a quasi-Crossfire single-PCB card isn't out of the question: two graphics subsystems on the same PCB, connected to each other through one of the two Crossfire interfaces, with each subsystem exposing its additional Crossfire interface as output at the top of the card, like a normal X1950 Pro.

Also, graphics cards are not "essentially multi-core." By that logic, you could consider half of today's computing components "multi-core" just because they have more than one unit capable of completing a task.

By surt on 12/26/2006 3:36:46 PM , Rating: 2
The point was that your claim that it wasn't dual core was itself pointless. There won't be any dual-core graphics designs because GPUs are already highly parallel designs. Making them 'dual core' would add cost, not reduce it, so the likelihood of any of the graphics players doing this for anything other than marketing purposes is very, very low.


