

AMD's January GPU launch includes the dual-GPU R680  (Source: AMD)
Get ready to enter 2008 with a bang: AMD has a bunch of GPUs on the way

AMD's newest R680 graphics processor might look a whole lot like the ill-fated R600 GPU, but the reality couldn't be more different.  Instead of one 80nm behemoth of a GPU, the R680 consists of two 55nm processor cores.

Representatives from AMD would not confirm that the R680 is essentially two RV670 GPU cores on the same board, though the company did confirm that each core has the same specifications as an RV670 processor.

The RV670 graphics core, announced last November alongside the Phenom processor, is the first 55nm desktop graphics adapter.  AMD does not position the card as a high-end part, though reviewers were quick to herald the RV670 as AMD's best product of 2007.

The company also made quick mention of the RV620 and RV635 GPU cores.  These cores are nearly identical to the previous RV610 and RV630 processors, but will be produced on the 55nm node instead. 

All three of AMD's new GPUs are scheduled to launch next month. 

Dual-GPU technology is not new.  3dfx's flagship Voodoo 5 family also resorted to multiple processors to achieve its relatively high performance.  ASUS, Gigabyte, Sapphire, HIS and PowerColor all introduced dual-GPU configurations of just about every graphics processor on the market, though these were never "sanctioned" ATI or NVIDIA projects.  Ultimately, all of these projects were canned due to long development times and low demand.

Cross-town rival NVIDIA isn't sitting idle, either.  The company publicly announced plans to replace all 90nm G80 graphics cores with G92 derivatives by the end of the year.  G92's debut, the GeForce 8800 GT, met with wide support from reviewers and analysts alike.  G92's second outing, the GeForce 8800 GTS 512MB, was met with similar though less enthusiastic acceptance during Tuesday's launch.

NVIDIA's newest roadmap claims the DirectX 10.1 family of 65nm processors will also hit store shelves this Spring.  The chipsets -- codenamed D9E, D9M and D9P -- are architecturally different from the G80/G92 family.



Comments

Expecting 3870 X2 to do well
By NullSubroutine on 12/13/2007 8:49:01 PM , Rating: 5
The 3870 really works well in X-fire right now, keeping up with and a lot of times beating the 8800 GT in SLI (and sometimes the single GTX/Ultra, despite costing less).

I would expect the 3870 X2 to perform equal to, if not better than, the 3870 in X-Fire. I am not entirely sure the drivers will be there for the 3870 X2 in X-fire (effectively quad-fire), but as time goes on I think multi-GPU cards will become the standard (as it looks like the R700 is a multi-GPU design).

The G92 X2 will probably be pretty good as well, but X-fire seems to give more of a boost than SLI (in one article I saw a 96% boost).

The end of 2007 has been exciting for video cards and it looks as though the start of 2008 will be as well.




RE: Expecting 3870 X2 to do well
By MonkeyPaw on 12/13/2007 10:22:13 PM , Rating: 5
Yeah, I think they may have no choice but to go multi-GPU soon. Both high-end DX10 solutions have 600-700 million transistors, quickly passing today's CPUs (where most of the transistors are cache anyway). If AMD and nVidia were to upstage these current DX10 cores with something considerably faster (as is the trend), they would likely be crossing the 1 billion transistor mark. Even with the best of today's process tech, yields would probably be awful on such monsters.

I guess technically speaking, the R680 should pass the 1 billion mark (1,332,000,000), but it will be doing it in an MCM-like fashion. I guess the main question is whether drivers will ever make such solutions practical across all situations. It looks like we may have no choice!
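For a rough feel of why yields push designers toward two smaller dies, here is a back-of-the-envelope sketch using a simple Poisson defect model. The defect density is invented for illustration, and the die area is only an approximation of an RV670-class chip:

```python
import math

def poisson_yield(area_mm2, defects_per_cm2=0.5):
    """Fraction of defect-free dies under a simple Poisson defect model."""
    return math.exp(-(area_mm2 / 100.0) * defects_per_cm2)

# Illustrative figures only: an RV670-class die is roughly 192 mm^2 and
# ~666 million transistors, so two of them give the 1,332,000,000 above.
# The 0.5 defects/cm^2 density is made up for the example.
small_die = 192
monolithic_die = 2 * small_die  # hypothetical single die with the same transistor count

print(f"~666M-transistor die yield:  {poisson_yield(small_die):.0%}")      # ~38%
print(f"~1.33B-transistor die yield: {poisson_yield(monolithic_die):.0%}") # ~15%
# Good small dies can be paired from anywhere on the wafer, so the usable
# fraction of silicon tracks the small-die number, not the monolithic one.
```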


RE: Expecting 3870 X2 to do well
By TMV192 on 12/14/2007 3:08:42 AM , Rating: 4
The drivers should be there at least by launch or soon after, and it's not called Quad-Crossfire; as you see in the picture, it's officially Crossfire X.


RE: Expecting 3870 X2 to do well
By maroon1 on 12/14/2007 9:20:01 AM , Rating: 1
quote:
The 3870 really works well in X-fire right now, keeping up with and a lot of times beating the 8800 GT in SLI


8800GT SLI beats HD3870 crossfire a lot of times
http://www.hothardware.com/articles/ATI_Radeon_HD_...
http://techreport.com/articles.x/13772/3

Why didn't you mention that?


RE: Expecting 3870 X2 to do well
By glynor on 12/14/2007 11:09:18 AM , Rating: 4
I would hope! Considering the price differential, and the single-card performance, the GTs in SLI should beat the 3870s in CrossFire all of the time.

IMHO, it is a testament to the efficiency of the Crossfire design that two slower cards combined can eke out a win occasionally over two faster cards combined.


RE: Expecting 3870 X2 to do well
By just4U on 12/14/2007 12:54:00 PM , Rating: 2
Will this new gpu be on a single die type thing? Or two chips on one board?


RE: Expecting 3870 X2 to do well
By KingstonU on 12/14/2007 1:16:28 PM , Rating: 5
The pictures show two chips on one board.

http://www.fudzilla.com/index.php?option=com_conte...


RE: Expecting 3870 X2 to do well
By just4U on 12/14/2007 1:33:32 PM , Rating: 2
Interesting. Thanks! I'd rate you up but heh .. im posting :)

My only real concern then is heat. A 200-watt card isn't optimal either, I suppose, but as they say, that's the price of power. I wonder if it will be better than 2 crossfired 3870s... being on the same board and all.


RE: Expecting 3870 X2 to do well
By retrospooty on 12/14/2007 2:46:09 PM , Rating: 5
He did... He said the 3870 keeps up with the 8800GT and beats it a lot of the time. That would imply that the 8800GT wins some too.

The point is that even though Nvidia has the upper hand with a single card, crossfire is more efficient than SLI. 2 ATI cards in crossfire are 90+% faster than one ATI card. 2 NV cards in SLI are nowhere near 90% faster than one NV card.
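To make that scaling figure concrete, a tiny sketch; the frame rates below are invented placeholders rather than benchmark results:

```python
def scaling_gain(fps_single, fps_dual):
    """Percent frame-rate gain from adding a second GPU."""
    return (fps_dual / fps_single - 1.0) * 100.0

# Hypothetical numbers, chosen only to illustrate the argument above.
print(f"CrossFire pair: {scaling_gain(45.0, 88.0):.0f}% gain -> 88 fps")  # ~96%
print(f"SLI pair:       {scaling_gain(50.0, 80.0):.0f}% gain -> 80 fps")  # ~60%
# The slower single card (45 vs. 50 fps) ends up ahead once paired,
# because its scaling sits closer to the ideal 100%.
```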


RE: Expecting 3870 X2 to do well
By Flunk on 12/15/2007 4:21:35 AM , Rating: 2
Seems like a good product. Hopefully it won't end up like the Rage Fury MAXX (ATI's last dual-GPU card) and the drivers will be up to snuff immediately and kept up as well (ATI never did release drivers for the MAXX for anything over Windows 98).


hmm
By ttnuagadam on 12/13/2007 9:25:11 PM , Rating: 2
It's about time they did this. With the inherent parallelism of graphics processing, you would think this would have happened before Intel/AMD started doing it with their CPUs.




RE: hmm
By phusg on 12/14/2007 7:14:32 AM , Rating: 2
The latest GPUs are up to something like 320 stream processors, so actually they have been parallel for quite some time. See the earlier post by MonkeyPaw; this move has more to do with manufacturing technology.
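A minimal sketch of what that parallelism looks like: every pixel's colour is computed independently of every other pixel, which is why the work divides cleanly across hundreds of stream processors or, in the R680's case, across two dies. The shading function here is invented purely for illustration:

```python
def shade(x, y):
    """Stand-in pixel shader: any pure function of the pixel position and scene data."""
    return ((x * 13 + y * 7) % 256, (x ^ y) % 256, (x * y) % 256)

WIDTH, HEIGHT = 320, 240

# No pixel depends on any other, so the loop below could be split across
# any number of workers with no coordination at all.
framebuffer = [[shade(x, y) for x in range(WIDTH)] for y in range(HEIGHT)]
print(len(framebuffer), len(framebuffer[0]))  # 240 320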


RE: hmm
By DallasTexas on 12/14/07, Rating: -1
RE: hmm
By Proteusza on 12/14/2007 11:25:26 AM , Rating: 5
Even if you use ray tracing instead of traditional lighting and shading, you still need to rasterize graphics to display them on the screen.

Rasterization refers to converting the scene's geometry into colour values for discrete pixels, so it can be displayed on the screen.


RE: hmm
By DallasTexas on 12/15/2007 11:14:22 AM , Rating: 2
Really? I thought Raster Graphic images are data structures based on rectangles. Maybe we are talking about different things or someone needs to update the encyclopedia.


RE: hmm
By spluurfg on 12/20/2007 11:17:24 AM , Rating: 3
Rasterization is the process of translating vectors/triangles into pixels. It is only one part of a modern-day graphics engine, which also includes geometry/transform/clipping/lighting stages and shading.

Consider the old ATI Fire GL4, which had independent processors for the geometry engine (IBM GT1000) and the rasterizer (IBM RC1000).
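For anyone curious what "translating triangles into pixels" actually involves, here is a bare-bones software rasterizer sketch using the classic edge-function (half-space) test; a real GPU does this in dedicated hardware and adds interpolation, depth testing and shading on top:

```python
def edge(ax, ay, bx, by, px, py):
    """Signed area test: which side of edge A->B the point P lies on."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(v0, v1, v2, width, height):
    """Return the integer pixel coordinates covered by a 2D triangle."""
    xs = [v[0] for v in (v0, v1, v2)]
    ys = [v[1] for v in (v0, v1, v2)]
    # Only test pixels inside the triangle's bounding box.
    x_lo, x_hi = max(int(min(xs)), 0), min(int(max(xs)) + 1, width)
    y_lo, y_hi = max(int(min(ys)), 0), min(int(max(ys)) + 1, height)
    covered = []
    for y in range(y_lo, y_hi):
        for x in range(x_lo, x_hi):
            px, py = x + 0.5, y + 0.5  # sample at the pixel centre
            w0 = edge(*v1, *v2, px, py)
            w1 = edge(*v2, *v0, px, py)
            w2 = edge(*v0, *v1, px, py)
            # Inside if the sample sits on the same side of all three edges.
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                covered.append((x, y))
    return covered

# A small triangle on a 16x16 "screen".
print(rasterize_triangle((2, 2), (12, 3), (6, 11), 16, 16))
```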


Best of AMD is ATI
By 16nm on 12/13/2007 8:00:28 PM , Rating: 2
quote:
reviewers were quick to herald the RV670 as AMD's best product of 2007


I'm sure that does not make AMD too happy. I wish AMD had more to toot their horn about.




RE: Best of AMD is ATI
By ChronoReverse on 12/14/2007 12:30:58 PM , Rating: 5
Despite what AMD said about overpaying for ATI, ATI might be the only reason why AMD will survive. R600 was pretty bad, but RV670 is great at its price niche. Since the price niches for the 3850 and 3870 are where most cards are sold, coupling that with the smaller process should mean good profit there.


RE: Best of AMD is ATI
By skyyspam on 12/21/2007 10:36:01 PM , Rating: 2
I do agree there...at least, for the short term.

AMD is trading at 7.79 as of now. It hasn't been this low for years...

AMD's taken on a LOT of risk recently, what with their buying ATI, making Hector Ruinz--err, Ruiz--their CEO, and of course pumping out several flops of chips compared to Intel and nVidia.

However, I don't think AMD's done fighting yet. I really do think they'll come back strong sometime soon--maybe not CY2008, but I doubt that they'll go under. After all, they weren't exactly a smash hit against the early Pentium chips, compared to their 486 cloning days. It was quite a while after that before they were successful again--I think it was in 2000-2001 when they came out with their first Athlon chips. After that we had the Tbirds and Durons of early 21st-century fame, which really spanked the early P4s. After that, Intel came back strong with their Northwood P4 cores, only to be spanked again a year or so later when AMD came back with the Athlon 64.

Anyway, before I go off on a complete tangent, my point is that AMD has had many dire eras in its existence. Historically, despite how bad things have gotten with AMD, it's always sprung back and shocked the world by crushing the competition, David-vs-Goliath-style.

Anyway, I seriously doubt that AMD is going to stay down forever. The odds are that it'll come back strong and scare the piss out of Intel yet again.

Might as well buy a few shares while it's down, cause I doubt it'll be this cheap for too long.


8800GTS 512MB > Two HD3850 in crossfire
By maroon1 on 12/14/2007 9:30:27 AM , Rating: 1
8800GTS 512MB (G92) performs better than two HD3850s in crossfire, especially when you run AA

Here is the proof
http://www.tweaktown.com/articles/1246/1/page_1_in...
http://techreport.com/articles.x/13772/3

So, there is no reason to get two HD3850s because you can get a better single card for the same price.

Don't forget the fact that many games are not optimized for crossfire or SLI. So, it is better to have one high-end video card than two midrange video cards.




By ChronoReverse on 12/14/2007 12:27:30 PM , Rating: 2
I actually agree with you that 2x3850 is not a good buy but I should still point out that 2x3850 costs less than a single 8800GTS 512MB.

The real issue with 2x3850 is that the 3850 (a great card btw) has only 256MB of RAM. In Crossfire, that severely limits the potential.

A single card with 2 chips at the 3850 level might not be as expensive because it's one PCB; however, the card would still need 1024MB (that's right, 1GB) of RAM to be equivalent to a 512MB card.

Unless AMD (or maybe Nvidia) figures out some kind of clever trick to avoid the necessity of two sets of memory banks for SLI and Xfire, multiple chips still seem like a poor proposition.
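A quick sketch of why the memory is duplicated, assuming the usual alternate-frame-rendering model in which each GPU keeps its own full copy of textures and buffers (the sizes are just examples):

```python
def effective_vram_mb(per_gpu_mb, num_gpus, assets_duplicated=True):
    """Memory games can actually use in an AFR-style multi-GPU setup."""
    # With alternate-frame rendering each GPU holds its own copy of every
    # asset, so the usable pool is only as big as one GPU's memory.
    return per_gpu_mb if assets_duplicated else per_gpu_mb * num_gpus

print(effective_vram_mb(256, 2))  # 2x HD3850 256MB -> 256MB usable
print(effective_vram_mb(512, 2))  # a dual-chip 2x512MB board -> 512MB usable, 1GB fitted
```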


RE: 8800GTS 512MB > Two HD3850 in crossfire
By therealnickdanger on 12/14/2007 12:35:24 PM , Rating: 2
I recently snagged a single HD3850 to replace my 7950GT. Oh my science. It is so much faster in every game, plus it has full VC-1 and MPEG4 off-loading, which is also great. Awesome $170 purchase.


By just4U on 12/14/2007 12:52:29 PM , Rating: 4
It should also be noted that AMD has a 3850 512MB card that runs you about $35 more than its lower-memory part (not sure what the difference is for the Europeans reading this).

I was rather surprised that the 3850 I currently have (new system) is doing so well compared to my 8800GTS 640MB card. I bought it because there were no GTs and no 3870s at the time, and you know... I am impressed. For those of you who game at 1280x1024 (or its widescreen equivalent) it really is a damn good card.

I think both Nvidia and ATI dropped the ball on their midrange this past Q2. Perhaps these Q4 launches have made up for that somewhat.


Waw ! What a big nose for a model !
By greylica on 12/14/07, Rating: 0
By Schadenfroh on 12/16/2007 1:29:39 AM , Rating: 2
Who is the lady in the pic, by the way?


RE: Waw ! What a big nose for a model !
By dice1111 on 12/18/2007 1:42:23 PM , Rating: 2
You need to look harder...
http://wingedmammal.com/action_photos_2005/IMG_191...
I definitely wouldn't say no to her! And definitely give an ATI box cover a second look!


By dice1111 on 12/18/2007 1:43:39 PM , Rating: 2
Die Size
By Zurtex on 12/14/2007 6:58:04 AM , Rating: 1
I find it interesting that Intel is able to keep one step ahead of AMD on die size and AMD is able to keep one step ahead of NVIDIA on die size.

This may lead to some interesting situations in the graphics and CPU markets. Maybe ATI really needed AMD's help to stay ahead... Can't wait to see how it'll all play out :D.




RE: Die Size
By 91TTZ on 1/1/2008 3:27:13 PM , Rating: 2
It's actually completely unrelated.

Intel's process is more advanced than AMD's process, but AMD isn't producing these GPUs in their own fabs. They're using TSMC to produce their GPUs, just like NVidia is.


Wait...the picture...Ruby's REAL?
By yxalitis on 12/13/2007 9:18:54 PM , Rating: 2
Oh my...my fantasies CAN come true!




2nd gen UVD?
By phusg on 12/14/2007 7:56:10 AM , Rating: 2
This 2nd generation UVD, is this new, or did the 2x00 cards already have 2nd generation UVD?




By psychobriggsy on 12/14/2007 9:12:06 AM , Rating: 2
This is what AMD should have done with its darned CPUs a year ago.

Instead of selling 3800+ X2s for peanuts, they could have been selling equivalent clockspeed X4s for way more. 1.8GHz X4 in late 2006 anyone?

Sad really. AMD's technology makes this an obvious thing to do - connect up the second core to the first with HyperTransport (possibly running even faster because of the short on-package link). I hope AMD learns and makes dual-die octocores in 2008 once they've sorted out their issues with Barcelona.




Bang for the buck.
By Hieyeck on 12/14/2007 12:55:38 PM , Rating: 2
I've been reading a lot of johnny petting for nVidia, and sure, while Nvidia may beat out AMD, everyone seems content to ignore the price. While I AM a gamer, I don't have $400 to blow on a video card. Simply put, ATI gets me better performance in the $200 range.




Efficiency - Turn off second core
By praeses on 12/13/2007 10:59:52 PM , Rating: 1
The first thing annoys me the most: if they had managed to make the two cores share the same memory, then I could see it being great, but all the other pictures reveal otherwise. There is also a fair amount of redundancy between the chips, which of course increases the cost, but probably less than the additional engineering to thin out the die.

My one hope is that they shut down the second core while it's not needed, dropping to something like < 10 watts. If it's just idling there, this card will disappoint me.




hate it
By Screwballl on 12/13/07, Rating: 0
RE: hate it
By Sifl on 12/13/07, Rating: 0
Power consumption at idle?
By 9nails on 12/13/07, Rating: 0
Is Ati aware of what happened to 3Dfx?
By AdamK47 on 12/13/07, Rating: -1
By Shadowmage on 12/13/2007 8:28:30 PM , Rating: 2
That's completely incorrect. Both companies are following the same path. NVIDIA's next high end will be 2xG92, like the 7950GX2.


By Targon on 12/13/2007 8:36:17 PM , Rating: 2
3Dfx was suffering from a lot more than just performance issues. Glide (the API that 3Dfx used and championed) was limited to 16-bit color, and that really hurt them compared to NVIDIA's 32-bit color and good performance. It wasn't that NVIDIA was so far ahead of 3Dfx in terms of performance; it was an issue of marketing and complacency on the part of 3Dfx.

Many people also forget that the majority of GPU sales come from integrated graphics and the low to mid range graphics products. As a result, if ATI/AMD can be competitive in the $200 and below range in terms of price vs. performance, it's really not as bad for Radeon graphics as many people seem to think.

There was also talk about BOTH ATI and NVIDIA moving to a multi-GPU approach as the way to scale graphics going forward. The reason for this is that since 3D graphics can be handled in parallel with almost infinite benefits (ok, one GPU pixel pipeline per displayed pixel being the limit), it really makes more sense. A good GPU core that shows up with 1 GPU for the low end, and 4 to 8 for the high end, would be a lot easier to deal with in terms of power and heat demands.

The real key is to make each GPU core work together for ALL applications. Right now, both SLI and Crossfire suffer from the multi-card setup not working for all applications, indicating some serious issues for both platforms.
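A toy model of the "works for all applications" problem, assuming the common alternate-frame-rendering scheme; the frame costs are invented, and real drivers schedule work far less naively than this:

```python
def afr_render_time(frame_cost_ms, num_frames, num_gpus, interframe_dependency=False):
    """Simulated wall-clock time to draw num_frames with alternate-frame rendering."""
    if interframe_dependency:
        # Each frame needs the previous frame's output (e.g. render-to-texture
        # feedback), so the GPUs effectively take turns: no speedup at all.
        return frame_cost_ms * num_frames
    # Independent frames can be issued round-robin across the GPUs.
    frames_per_gpu = -(-num_frames // num_gpus)  # ceiling division
    return frame_cost_ms * frames_per_gpu

print(afr_render_time(20, 100, num_gpus=2))                              # 1000 ms -> ~2x faster
print(afr_render_time(20, 100, num_gpus=2, interframe_dependency=True))  # 2000 ms -> no gain
```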


RE: Is Ati aware of what happened to 3Dfx?
By Lonyo on 12/13/2007 9:52:16 PM , Rating: 1
Are you aware of what happened to ATi?
Rage Fury MAXX anyone?
This is nothing new to any major graphics company, ATi included.
Also, the whole concept of dual chips is nothing particularly odd. Compare it to Intel vs AMD, with two dual-core dies on one package vs a single quad-core die, although with graphics cards it's slightly less refined in terms of implementation.


RE: Is Ati aware of what happened to 3Dfx?
By thornburg on 12/14/2007 9:28:24 AM , Rating: 2
Good point.

Lots of people forget that ATI video cards used to be complete garbage. The entire Rage line was trash compared to 3dfx's Voodoo cards and even nVidia's TNT line.

So why did ATI live to fight another day? Because all sorts of Gateways and Compaqs and Macs had ATI video cards in them. They were king of cheap video for the mass market.

Incidentally, for those who aren't aware, the Rage Fury MAXX was a dual-GPU video card in the days when people were getting away from the original SLI (Voodoo2 supported SLI), in favor of bigger/better/faster single chips. I don't remember whether the Fury MAXX was competing with the TNT2 Ultra or the GeForce (the first one), but whichever it was, it blew the Fury MAXX (and all other available cards) out of the water, because although two GPUs pushed a lot of pixels, it didn't have any innovative technology.


RE: Is Ati aware of what happened to 3Dfx?
By DublinGunner on 12/14/2007 11:37:17 AM , Rating: 2
Technically, out of all those you listed, only the GeForce is a GPU, so none of those would have competed with the GeForce.

None of the other cards performed transform & lighting on the card; the CPU handled that until the GeForce 256.


By Flunk on 12/15/2007 4:30:32 AM , Rating: 2
The MAXX was actually released after the GeForce 256 to try and compete (even though it had no T&L support), and it did actually beat the GeForce in a few tests, but godawful driver issues basically made the card an overpriced piece of crap.

Also, what the GPU handles in the rendering pipeline has increased with each generation. Saying that something cannot compete because of the lack of one feature is a crass oversimplification, such as claiming that the Radeon X1950 XT is outperformed by the Radeon HD 2400 PRO because it lacks DirectX 10 support.


By Spacecomber on 12/13/2007 10:24:08 PM , Rating: 1
Maybe the better parallel is that 3dfx went out and bought themselves a factory in Mexico so that they could start producing their own cards. Previously, they were just a chip-maker, and they let others put the cards together. I think this ended up being a money loser that they never recovered from, when NVIDIA turned up the heat with their GeForce series of GPUs.


So..
By Comdrpopnfresh on 12/13/07, Rating: -1
RE: So..
By pattycake0147 on 12/13/2007 11:12:19 PM , Rating: 2
Why not try the 3870?


RE: So..
By sweetsauce on 12/14/07, Rating: -1
RE: So..
By just4U on 12/14/2007 1:50:35 PM , Rating: 1
It does however come close and is intended to be priced less. All in all, it's a good time to buy a card if you're on a previous generation, as both companies have a couple of choices at their upper midrange.

I personally thought that the launches by Nvidia and Ati were winners all around.


GeForce 9???
By marsbound2024 on 12/13/07, Rating: -1
RE: GeForce 9???
By marsbound2024 on 12/13/2007 8:10:23 PM , Rating: 5
I am guessing that the end of the article is talking about it with the parts to be released in the Spring. I wish NVIDIA would release more information.


RE: GeForce 9???
By 3kliksphilip on 12/14/2007 8:48:55 AM , Rating: 2
I get all excited about new graphics cards normally, waiting to see them rip benchmarks to shreds etc. However, since I bought a Geforce 8800 GTS at the beginning of this year, I've willed the progress to stop. And I guess it has, kind of. Though I agree, DX10 needs a GPU powerful enough to run games at over 60 fps on high resolution settings.

Can anybody name some big-name games coming out to utilize this new technology? We went through a stage in 2004 with Far Cry, Half Life 2, Rome TW and Doom 3. I wish that could happen again. I know the occasional new game springs up and impresses everyone. I want a game with new, interesting technology! Like 3D water physics, so a dam can burst and a town further downstream gets washed away, with you hiding in one of the buildings. Yes, I want strange things. But that's what moves technology on. (Who would have thought that watching falling corpses in Unreal Tournament 2003 would have been so FUN?)


RE: GeForce 9???
By Homerboy on 12/14/2007 9:00:40 AM , Rating: 2
Umm, Crysis already makes current cards weep. Have you seen the 3x SLI review? Not even 40 FPS. I'm sure games like Alan Wake etc. will do the same.


RE: GeForce 9???
By 3kliksphilip on 12/14/2007 11:33:19 AM , Rating: 2
From what I've seen 3 x GPU isn't worth it, as hardly anything supports it effectively.

As this is in the GeForce 9 thread, I was suggesting that a GeForce 9 would be capable of running Crysis with everything on highest. I'm not that interested in new GeForce 8 cards - I have one, and new ones only seem about 10-15% faster, which is hardly enough to get excited over. I want a 256-pipeline, 1 GB (minimum) card to come out! It probably won't for over a year - GeForce 8s are going to be milked dry before the new ones come out. It's an expensive business researching new cards, and until ATI delivers something that will bring the GeForce 8s to their knees, what incentive does Nvidia have for releasing faster cards? All it would do is reduce the amount of money they gain from the 8 series.

As much as I love AMD, I didn't like the way they released the X2 series and kept the prices at about £200 for over a year until the Core 2 Duos came out. I know it's the same as Intel did when they had the upper hand, and AMD would be struggling even more if they hadn't cashed in on it, but businesses are businesses and competition can only be a good thing. Hopefully these new ATI cards are something super special and it will lead Nvidia to release their new series.


RE: GeForce 9???
By retrospooty on 12/13/2007 9:49:04 PM , Rating: 2
The next high-end card is coming in January. It's a dual-GPU solution like the ATI high end... basically it is two 8800 GTs in SLI on one card, just like the 7950 GX2 was. It will be the high-end card until the GeForce 9 (or whatever they name it) series is introduced later this year. A date has not yet been set for GeForce 9.


RE: GeForce 9???
By cochy on 12/13/07, Rating: -1
RE: GeForce 9???
By retrospooty on 12/13/2007 11:25:16 PM , Rating: 2
I doubt it; the dual 8800GT is releasing in January as the high-end card. The G9 would kill its sales.

It may be ready, but NV will milk it until ATI has a competitive product on the high end.


RE: GeForce 9???
By cochy on 12/13/2007 11:28:09 PM , Rating: 2
I was hoping to upgrade around Feb too. I was looking forward to a next-gen upgrade. Might have to settle for one of these now.


RE: GeForce 9???
By Omega215D on 12/14/2007 12:29:01 AM , Rating: 2
I just bought an 8800GT at Best Buy for $260 after taxes. Before I know it the next gen stuff will be here and beat the pants off the GT. Oh well such is life dealing with technology.


RE: GeForce 9???
By Goty on 12/13/2007 9:50:32 PM , Rating: 2
NVIDIA really doesn't have any incentive to introduce a new line of video cards right now since ATI still can't compete at the high end.


"If you mod me down, I will become more insightful than you can possibly imagine." -- Slashdot




Latest Headlines
4/21/2014 Hardware Reviews
April 21, 2014, 12:46 PM
4/16/2014 Hardware Reviews
April 16, 2014, 9:01 AM
4/15/2014 Hardware Reviews
April 15, 2014, 11:30 AM
4/11/2014 Hardware Reviews
April 11, 2014, 11:03 AM










botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki