
AMD's January GPU launch includes the dual-GPU R680  (Source: AMD)
Get ready to enter 2008 with a bang: AMD has a bunch of GPUs on the way

AMD's newest R680 graphics processor might look a whole lot like the ill-fated R600 GPU, but the reality couldn't be more different.  Instead of one 80nm behemoth of a GPU, the R680 consists of two 55nm processor cores.

Representatives from AMD would not confirm that the R680 is essentially two RV670 GPU cores on the same board, though the company did confirm that each core has the same specifications as an RV670 processor.

The RV670 graphics core, announced last November alongside the Phenom processor, is the first 55nm desktop graphics adapter.  AMD does not position this card as a high-end adapter, though reviewers were quick to herald the RV670 as AMD's best product of 2007.

The company also made quick mention of the RV620 and RV635 GPU cores.  These cores are nearly identical to the previous RV610 and RV630 processors, but will be produced on the 55nm node instead. 

All three of AMD's new GPUs are scheduled to launch next month. 

Dual-GPU technology is not new.  3dfx's flagship Voodoo 5 family also resorted to multiple processors to achieve its relatively high performance.  ASUS, Gigabyte, Sapphire, HIS and PowerColor all introduced dual-GPU configurations of just about every graphics processor on the market, though these were never "sanctioned" ATI or NVIDIA projects.  Ultimately, all of these projects were canned due to long development times and low demand.
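Dual-GPU boards of this sort typically split work between the two cores by alternate-frame rendering: even frames go to one GPU, odd frames to the other. The article does not confirm which scheme the R680 uses, so the sketch below is an illustration of the general technique, not AMD's driver; all names are hypothetical.

```python
# Toy sketch of alternate-frame rendering (AFR), the usual load-splitting
# scheme for dual-GPU boards: successive frames are assigned to the GPUs
# in round-robin order.  Illustrative only -- not AMD's actual driver API.

def assign_frames_afr(frame_ids, gpu_count=2):
    """Map each frame number to a GPU index, alternating every frame."""
    return {frame: frame % gpu_count for frame in frame_ids}

schedule = assign_frames_afr(range(6))
# GPU 0 renders frames 0, 2, 4; GPU 1 renders frames 1, 3, 5
```

The appeal of the approach is that each GPU renders complete frames independently; the cost, as the dual-GPU projects mentioned above discovered, is driver complexity and frame-pacing issues.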

Cross-state rival NVIDIA isn't sitting idle, either.  The company publicly announced plans to replace all 90nm G80 graphics cores with G92 derivatives by the end of the year.  G92's debut, the GeForce 8800 GT, met with wide acclaim from reviewers and analysts alike.  G92's second outing, the GeForce 8800 GTS 512MB, received a similar but less enthusiastic reception at Tuesday's launch.

NVIDIA's newest roadmap claims its DirectX 10.1 family of 65nm processors will also hit store shelves this spring.  The processors -- codenamed D9E, D9M and D9P -- are architecturally different from the G80/G92 family.

Comments

RE: Is Ati aware of what happened to 3Dfx?
By Lonyo on 12/13/2007 9:52:16 PM , Rating: 1
Are you aware of what happened to ATi?
Rage Fury MAXX anyone?
This is nothing new to any major graphics company, ATi included.
Also, the whole concept of dual chips is nothing particularly odd. Compare it to Intel vs AMD, with 2 dual core die on one package vs a single quad core die, although with graphics cards it's slightly less refined in terms of implementation.

RE: Is Ati aware of what happened to 3Dfx?
By thornburg on 12/14/2007 9:28:24 AM , Rating: 2
Good point.

Lots of people forget that ATI video cards used to be complete garbage. The entire Rage line was trash compared to 3Dfx's VooDoo cards and even nVidia's TNT line.

So why did ATI live to fight another day? Because all sorts of Gateways and Compaqs and Macs had ATI video cards in them. They were king of cheap video for the mass market.

Incidentally, for those who aren't aware, the Rage Fury MAXX was a multi-GPU video card in the days when people were getting away from the original SLI (Voodoo 2 supported SLI), in favor of bigger/better/faster single chips. I don't remember whether the Fury MAXX was competing with the TNT2 Ultra or the GeForce (the first one), but whichever it was, that card blew the Fury MAXX (and all other available cards) out of the water: although multiple GPUs pushed a lot of pixels, the MAXX didn't have any innovative technology.

RE: Is Ati aware of what happened to 3Dfx?
By DublinGunner on 12/14/2007 11:37:17 AM , Rating: 2
Technically, out of all those you listed, only the GeForce is a GPU, so none of those would have competed with the GeForce.

None of the other cards performed transform & lighting on the card; the CPU handled that until the GeForce 256.

By Flunk on 12/15/2007 4:30:32 AM , Rating: 2
The MAXX was actually released after the GeForce 256 to try and compete (even though it had no T&L support), and it did actually beat the GeForce in a few tests, but godawful driver issues basically made the card an overpriced piece of crap.

Also, what the GPU handles in the rendering pipeline has increased with each generation. Saying that something cannot compete because of the lack of one feature is a crass oversimplification. That's like claiming the Radeon X1950 XT is outperformed by the Radeon HD 2400 PRO because it lacks DirectX 10 support.



Copyright 2016 DailyTech LLC.