

AMD's January GPU launch includes the dual-GPU R680  (Source: AMD)
Get ready to enter 2008 with a bang: AMD has a bunch of GPUs on the way

AMD's newest R680 graphics processor might look a whole lot like the ill-fated R600 GPU on paper, but the two could hardly be more different.  Instead of one 80nm behemoth of a GPU, the R680 consists of two 55nm processor cores.

Representatives from AMD would not confirm that the R680 is essentially two RV670 GPU cores on the same board, though the company did confirm that each core carries the same specifications as an RV670 processor.

The RV670 graphics core, announced last November alongside the Phenom processor, is the first 55nm desktop graphics adapter.  AMD does not position the card as a high-end part, though reviewers were quick to herald the RV670 as AMD's best product of 2007.

The company also briefly mentioned the RV620 and RV635 GPU cores.  These are nearly identical to the earlier RV610 and RV630 processors, but will be produced on the 55nm node instead.

All three of AMD's new GPUs are scheduled to launch next month. 

Dual-GPU technology is not new.  3dfx's flagship Voodoo 5 family also resorted to multiple processors to achieve its relatively high performance.  ASUS, Gigabyte, Sapphire, HIS and PowerColor all introduced dual-GPU configurations of just about every graphics processor on the market, though these were never "sanctioned" ATI or NVIDIA projects.  Ultimately, all of these projects were canned due to long development times and low demand.

Cross-state rival NVIDIA isn't sitting idle, either.  The company publicly announced plans to replace all 90nm G80 graphics cores with G92 derivatives by the end of the year.  G92's debut, the GeForce 8800 GT, met wild support from reviewers and analysts alike.  Its second outing, the GeForce 8800 GTS 512MB, received a similar, if less enthusiastic, reception at Tuesday's launch.

NVIDIA's newest roadmap claims its DirectX 10.1 family of 65nm processors will also hit store shelves this spring.  The chips -- codenamed D9E, D9M and D9P -- are architecturally different from the G80/G92 family.



Comments

Is Ati aware of what happened to 3Dfx?
By AdamK47 on 12/13/2007 7:58:38 PM , Rating: -1
3Dfx got desperate in the performance race by using multiple VSA-100 chips. Ati is following the same path.




By Shadowmage on 12/13/2007 8:28:30 PM , Rating: 2
That's completely incorrect. Both companies are following the same path. NVIDIA's next high end will be 2xG92, like the 7950GX2.


By Targon on 12/13/2007 8:36:17 PM , Rating: 2
3Dfx was suffering from a lot more than just performance issues. Glide (the API that 3Dfx used and championed) was limited to 16-bit color, and that really hurt them compared to NVIDIA's 32-bit color and good performance. It wasn't that NVIDIA was so far ahead of 3Dfx in terms of performance; it was an issue of marketing and complacency on the part of 3Dfx.

Many people also forget that the majority of GPU sales come from integrated graphics and the low to mid range graphics products. As a result, if ATI/AMD can be competitive in the $200 and below range in terms of price vs. performance, it's really not as bad for Radeon graphics as many people seem to think.

There was also talk about BOTH ATI and NVIDIA moving to a multi-GPU approach as the way to scale graphics going forward. The reason is that since 3D graphics can be handled in parallel with almost unlimited benefit (OK, one GPU pixel pipeline per displayed pixel being the limit), it really makes more sense. A good GPU core that ships as a single GPU at the low end and 4 to 8 GPUs at the high end would be a lot easier to deal with in terms of power and heat demands.

The real key is to make each GPU core work together for ALL applications. Right now, both SLI and Crossfire suffer from the multi-card setup not working for all applications, indicating some serious issues for both platforms.
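Targon's scaling argument is easy to make concrete. The sketch below is a minimal Python illustration of split-frame rendering, not how CrossFire or SLI drivers actually schedule work; every name and the resolution are hypothetical. Because each pixel can be shaded independently, the frame divides into bands, one per GPU, and N GPUs cut the per-frame pixel work roughly N ways, up to the one-pipeline-per-pixel limit he mentions.

    # A minimal sketch of split-frame rendering (illustrative only; not how
    # CrossFire or SLI actually work). All names and values are hypothetical.

    WIDTH, HEIGHT = 640, 480  # hypothetical display resolution

    def shade_pixel(x, y):
        """Stand-in for a real pixel shader; each pixel is independent."""
        return ((x * 31 + y * 17) % 256, x % 256, y % 256)  # dummy RGB value

    def render_band(y_start, y_end):
        """The work handed to one GPU: every pixel in its band of rows."""
        return [[shade_pixel(x, y) for x in range(WIDTH)]
                for y in range(y_start, y_end)]

    def split_frame_render(num_gpus):
        """Divide the frame into horizontal bands, one per GPU, then recombine."""
        rows_per_gpu = HEIGHT // num_gpus
        frame = []
        for gpu in range(num_gpus):
            y0 = gpu * rows_per_gpu
            y1 = HEIGHT if gpu == num_gpus - 1 else y0 + rows_per_gpu
            frame.extend(render_band(y0, y1))  # bands stack into a full frame
        return frame

    frame = split_frame_render(num_gpus=2)  # e.g. an R680-style dual-GPU board
    assert len(frame) == HEIGHT  # no row rendered twice or missed

The sketch also hints at why SLI and Crossfire stumble on some applications: the clean split only holds while pixels stay independent, and effects that make one pixel depend on others, or on the previous frame, force the GPUs to communicate.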


RE: Is Ati aware of what happened to 3Dfx?
By Lonyo on 12/13/2007 9:52:16 PM , Rating: 1
Are you aware of what happened to ATi?
Rage Fury MAXX anyone?
This is nothing new to any major graphics company, ATi included.
Also, the whole concept of dual chips is nothing particularly odd. Compare it to Intel vs. AMD, with two dual-core dies on one package vs. a single quad-core die, although with graphics cards it's slightly less refined in terms of implementation.


RE: Is Ati aware of what happened to 3Dfx?
By thornburg on 12/14/2007 9:28:24 AM , Rating: 2
Good point.

Lots of people forget that ATI video cards used to be complete garbage. The entire Rage line was trash compared to 3Dfx's Voodoo cards and even NVIDIA's TNT line.

So why did ATI live to fight another day? Because all sorts of Gateways and Compaqs and Macs had ATI video cards in them. They were king of cheap video for the mass market.

Incidentally, for those who aren't aware, the Rage Fury MAXX was a dual-GPU video card in the days when people were getting away from the original SLI (Voodoo2 supported SLI) in favor of bigger/better/faster single chips. I don't remember whether the Fury MAXX was competing with the TNT2 Ultra or the GeForce (the first one), but whichever it was, it blew the Fury MAXX (and all other available cards) out of the water, because although two GPUs pushed a lot of pixels, the MAXX didn't have any innovative technology.


RE: Is Ati aware of what happened to 3Dfx?
By DublinGunner on 12/14/2007 11:37:17 AM , Rating: 2
Technically, of all those you listed, only the GeForce is a GPU, so none of them would have competed with it.

None of the other cards performed transform & lighting on the card; the CPU handled that until the GeForce 256.


By Flunk on 12/15/2007 4:30:32 AM , Rating: 2
The MAXX was actually released after the GeForce 256 to try and compete (even though it had no T&L support), and it did beat the GeForce in a few tests, but god-awful driver issues basically made the card an overpriced piece of crap.

Also, what the GPU handles of the rendering pipeline has increased with each generation. Saying that something cannot compete because it lacks one feature is a gross oversimplification -- like claiming that the Radeon X1950 XT is outperformed by the Radeon HD 2400 PRO because it lacks DirectX 10 support.


By Spacecomber on 12/13/2007 10:24:08 PM , Rating: 1
Maybe the better parallel is that 3dfx went out and bought themselves a factory in Mexico so they could start producing their own cards. Previously, they were just a chip maker and let others put the cards together. I think this ended up being a money loser they never recovered from once NVIDIA turned up the heat with its GeForce series of GPUs.


"We’re Apple. We don’t wear suits. We don’t even own suits." -- Apple CEO Steve Jobs














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki