
NVIDIA GTX 280 3-Way SLI  (Source: NVIDIA)

The GPUs in the series include the GTX 280 and GTX 260

NVIDIA launched a new family of GPUs today called the GTX 200 series. Within the series there are currently two GPUs -- the GTX 280 and the GTX 260. The NVIDIA GTX 280 is now the flagship GPU from NVIDIA and sits in the line above the 9800 GX2.

NVIDIA is stressing that the GTX 200 family goes beyond gaming: these GPUs are among the most powerful processors in a PC and can be used for rendering video and other functions. NVIDIA says its goals for the GTX 200 architecture were to design a processor twice as powerful as the GeForce 8800 GTX, rebalance the architecture for future games with more complex shaders and more memory, improve efficiency per watt and per square millimeter, enhance CUDA performance, and significantly reduce idle power requirements.

NVIDIA says that the GTX 200 line provides nearly a teraflop of computational power. The GTX 200 family also supports PhysX-powered physics processing right on the GPU. Both the new GTX 280 and GTX 260 support SLI and 3-way SLI; the previous NVIDIA 9800 GX2 could not support 3-way SLI.

Key features in the new GTX 200 GPUs include support for three times the number of threads in flight at any given time. A new scheduler design allows for 20% more texturing efficiency. The memory interface is 512-bit (GTX 280), and full-speed raster-operation (ROP) frame blending is supported. The GTX 200 series also features twice the number of registers, for longer and more complex shaders, and IEEE 754R double-precision floating point. The GTX 200 line also supports 10-bit color scan-out, via DisplayPort only.

One of the main goals with the GTX 200 line was improved power management. Both the GTX 200 series GPUs have idle power requirements of about 25W; during Blu-ray playback power requirements are around 35W; full 3D performance requirements vary with the most power needed being 236W (GTX 280). The GTX 200 line is compatible with HybridPower, which makes the power needs of the GPU effectively 0W.

The GTX 280 is built on a 65nm process and has 1.4 billion transistors. The stock video cards have a graphics clock of 602 MHz, processor clock of 1,296 MHz, and a memory clock of 2,214 MHz. The GTX 280 has 1GB of GDDR3 and 240 processing cores and 32 ROPs.

The GTX 260 is also built on the 65 nm process and has 1.4 billion transistors. The graphics clock for the GTX 260 is 576 MHz, the processor clock is 1,242 MHz, and the memory clock is 1,998 MHz. The memory interface on the GTX 260 is 448-bit and it has 896MB of GDDR3 memory. The GTX 260 has 192 processing cores and 28 ROPs. The maximum board power is 182W.
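NVIDIA's "nearly a teraflop" claim can be sanity-checked from the clocks above. A quick sketch, assuming the commonly cited 3 FLOPs per core per cycle (dual-issue MAD + MUL peak-rate convention) and treating the listed memory clocks as DDR-effective rates:

```python
# Theoretical peak throughput from the published GTX 200 specs.
# The 3 FLOPs/core/cycle figure is NVIDIA's peak-rate convention,
# not a measured number.

def peak_gflops(cores, shader_mhz, flops_per_cycle=3):
    return cores * shader_mhz * flops_per_cycle / 1000.0

def mem_bandwidth_gbs(bus_bits, effective_mhz):
    # bus width in bits, DDR-effective memory clock in MHz
    return bus_bits / 8 * effective_mhz / 1000.0

gtx280_flops = peak_gflops(240, 1296)     # ~933 GFLOPS -- "nearly a teraflop"
gtx260_flops = peak_gflops(192, 1242)     # ~715 GFLOPS
gtx280_bw = mem_bandwidth_gbs(512, 2214)  # ~141.7 GB/s
gtx260_bw = mem_bandwidth_gbs(448, 1998)  # ~111.9 GB/s

print(gtx280_flops, gtx260_flops, gtx280_bw, gtx260_bw)
```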

Both video cards will support PhysX processing on the GPU. NVIDIA purchased Ageia in early 2008.

GTX 280 video cards will be available tomorrow for $649 and the GTX 260 cards will be available on June 26 for $399.


Great Achievement
By Mitch101 on 6/16/2008 3:21:27 PM , Rating: 5
I think it's a great achievement; however, I feel it's in desperate need of a die shrink to be successful.

Fast: it might possibly be faster than ATI's 4870 in GPU power. But:

High price.
High heat.
No DX10.1.
Low yields, so it's going to be hard to find for purchase.
Requires extra power connections, which some might not have, requiring a power supply purchase.

I know this will certainly merit a -1 thread rating from NVIDIA fanbois; however, I think this will be a great chip once it gets a die shrink, even with the lack of DX10.1.

RE: Great Achievement
By gyranthir on 6/16/2008 3:38:22 PM , Rating: 1
No DX 10.1 isn't a big loss because it does support the only real useful part of DX10.1 already.

"We support Multisample readback, which is about the only dx10.1 feature (some) developers are interested in. If we say what we can't do, ATI will try to have developers do it, which can only harm pc gaming and frustrate gamers."

Requiring a 6-pin and an 8-pin annoys me greatly, as my beast of a computer will require a brand new PSU to be able to use one. My 8800GTX will have to suit me for a while because I don't have $850 to waste on a new video card and a new PSU....

RE: Great Achievement
By JLL55 on 6/16/2008 3:46:03 PM , Rating: 4
No DX 10.1 isn't a big loss because it does support the only real useful part of DX10.1 already.

"We support Multisample readback, which is about the only dx10.1 feature (some) developers are interested in. If we say what we can't do, ATI will try to have developers do it, which can only harm pc gaming and frustrate gamers."

Requiring a 6-pin and an 8-pin annoys me greatly, as my beast of a computer will require a brand new PSU to be able to use one. My 8800GTX will have to suit me for a while because I don't have $850 to waste on a new video card and a new PSU....

If you continued to read TA on AnandTech, you would realize that they basically said BS to that comment. However, I completely agree with you on the last part. Requiring those extra pins is a big PITA, because if I wanted to stick it in my machine, I would have to get a new PSU on top of the purchase of the new video card. I can't wait to see the 4870 X2, and pray that it can be used without a new PSU.

RE: Great Achievement
By FITCamaro on 6/16/2008 3:53:06 PM , Rating: 2
Pretty sure the 4870X2 is going to require 2 8-pin connectors.

RE: Great Achievement
By Warren21 on 6/16/2008 5:11:42 PM , Rating: 2
I don't think it will. The vanilla HD 4870 may come with 2 x 6-pin PEG connectors, but it certainly doesn't need them; rather, they are for more clocking headroom. A better way to think of it is as an HD 4850 X2 with a little bit of an OC. The 4850s only require one 6-pin, so I would assume 1 x 75W 6-pin PEG + 1 x 150W 8-pin PEG + PCIe 1.1 75W slot (it must be backwards compatible) = 300W should be enough.
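Warren21's budget adds up as follows. A minimal sketch, assuming the PCI Express power limits he cites (75 W from the slot, 75 W per 6-pin connector, 150 W per 8-pin connector):

```python
# Maximum board power available to a card from the connector
# combination Warren21 describes.
SLOT_W = 75    # PCIe slot limit
PEG6_W = 75    # 6-pin auxiliary connector
PEG8_W = 150   # 8-pin auxiliary connector

available = SLOT_W + PEG6_W + PEG8_W
print(available)  # 300 W budget for the hypothetical HD 4850 X2-style card
```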

RE: Great Achievement
By gyranthir on 6/17/2008 11:14:48 AM , Rating: 2
Brilliant to rate me down for explaining NVIDIA's point of view as to why they excluded DirectX 10.1.

Let's not forget that DX 10.1 is Vista-only, and something like 75-80% of gamers aren't using Vista because they see ZERO benefit from it. So the decision not to use DX10.1 isn't really that big of a stretch anyway.

Latest hardware + lamest software = LOL for gaming.

RE: Great Achievement
By Mitch101 on 6/17/2008 1:19:55 PM , Rating: 3
No problem here running Vista and gaming on it. So maybe I don't get 130 FPS anymore, but my LCD monitor's refresh rate wouldn't allow it anyway. It's not like Linux would run Crysis any better; the engine exceeds today's hardware abilities, and I'm not sure it's necessary to push the graphics that high after seeing Cinema 2.0.

Everyone knows that DX10.1 gave ATI hardware in Assassin's Creed a what, 25-35% speed increase in some anti-aliasing modes. It didn't put ATI light-years ahead of NVIDIA, but it gave it enough of a boost to make the cards relatively the same. By that I mean you can play it with the same level of visual quality on either system.

Seriously, if Assassin's Creed was broken using DX10.1, why doesn't anyone have a screenshot of one of those gameplay-reducing images we hear about but haven't seen proof of? If DX10.1 was so broken, where are all the screenshots and complaints to prove it? It seems there are more complaints asking to put DX10.1 back than there ever were of anyone screaming that the visuals are broken.

RE: Great Achievement
By MrBlastman on 6/16/2008 3:39:08 PM , Rating: 5
If all games utilized Open GL, then we wouldn't be bickering over 10.1 would we? :)

Not to mention that in the modding community we wouldn't have such strict limitations and hurdles to overcome as we do if, say, we were to take a DX7 game and port it to DX9.

Oh, and the cost - I think at some point something is going to need to reverse the trend of $600+ video cards, in the downward direction.

RE: Great Achievement
By FITCamaro on 6/17/2008 10:00:17 AM , Rating: 5
No we'd be bickering over the latest and greatest OpenGL features.

RE: Great Achievement
By FITCamaro on 6/16/2008 3:51:12 PM , Rating: 4
The performance of these cards right now definitely does not justify the price. A pair of 8800GTs in SLI routinely outperforms even the GTX 280. Cost of a pair of 8800GTs is $300-350.

In the single card world, yes these are good parts. But if you can get SLI, the 8800GT or GTS is a far better value. And yes it will be interesting to see how the 4850 stacks up against these two.

RE: Great Achievement
By AssassinX on 6/16/2008 4:30:40 PM , Rating: 5
Yeah, IMO, it's kind of a win-lose/lose-win situation with SLI.

Create a new video card that is more powerful than the top of the line video card in SLI mode and people will say that SLI is useless and just buy the new video card.

Create a new video card that is less powerful than the top of the line video card in SLI mode and people will say it is not worth the money.

I realize price is a huge factor in determining if it is worth it or not but really if Nvidia did drop the price of the GTX 200 family then we will have people saying again to screw SLI and just go for the new card. Obviously this is just my opinion but it seems like almost any tech forum site swings back and forth on the subject of SLI. Although, I do feel it is interesting to see SLI as a viable upgrade option since it TYPICALLY SEEMS like it is a waste of money, power/heat and space.

RE: Great Achievement
By FITCamaro on 6/16/2008 4:47:14 PM , Rating: 3
If even the GTX 260 performed at the level of two 8800GTs in SLI for $300-325, I'd consider it a good buy. But when it's about 33% more money and around 10-15% slower, it's not.

When you look at the fact that the GTX 280 is over double the cost and doesn't even perform identically most of the time, it's definitely not a good buy.

The only time it'd seem reasonable to run one is if you don't have a dual-slot board. But in my mind, if you can afford a $650 graphics card, you probably have a motherboard with dual 16x slots.
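FITCamaro's rough percentages can be checked against the launch prices in the article ($399 GTX 260, $649 GTX 280) and his quoted $300-325 for a pair of 8800GTs. A quick sketch using the low end of his pair price:

```python
# Price-premium check: new GTX 200 cards vs. an 8800GT SLI pair,
# using launch prices from the article and the commenter's pair price.
PAIR_8800GT = 300        # low end of the quoted $300-325 SLI pair
GTX260, GTX280 = 399, 649

premium_260 = GTX260 / PAIR_8800GT - 1   # ~0.33 -> "about 33% more money"
ratio_280 = GTX280 / PAIR_8800GT         # ~2.16 -> "over double the cost"
print(round(premium_260, 2), round(ratio_280, 2))
```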

RE: Great Achievement
By Parhel on 6/16/2008 4:56:33 PM , Rating: 2
. . . it's definitely not a good buy.

I wish I could disagree with you there. I guess that NVIDIA's yields were so bad that the only choice was to release these at prices far beyond the price/performance levels of other options.

It's a very impressive chip. If the die shrink allows them to raise clock speeds, reduce heat and power, and lower the price it might be worth purchasing. Hopefully they will execute on that. That said, if I were looking for a card right now, I'd be anxiously awaiting AMD's new lineup.

RE: Great Achievement
By Adonlude on 6/16/2008 6:13:06 PM , Rating: 5
Holy heatsinks, Batman! The only power supplies certified to run SLI with these bad boys are 1200W or above. I have a single 8800GTX in my machine and it literally heats my small room. In the summer it gets pretty hot, and I always have to have my window and my door open to get airflow through my room.

I am amused that I would literally need to install an A/C unit in my room if I wanted to SLI these suckers.

RE: Great Achievement
By ImSpartacus on 6/16/2008 5:20:49 PM , Rating: 2
I agree. I never thought about what would happen if NVIDIA released a product that was inferior to two last-gen GPUs in SLI.

I actually have an 8800GT, and I may wait until 8800GTs go down in price and get another.

RE: Great Achievement
By paydirt on 6/17/2008 8:51:21 AM , Rating: 2
Huh? Maybe you're being sarcastic? Recent history (the past 3-4 years?) shows that NVIDIA releases a single card that underperforms two lesser-gen cards in SLI, and the two lesser-gen cards are less expensive combined than the new single card. This isn't a new concept; I'm not sure why people are reacting the way they are, or why people think things should have been different this time.

RE: Great Achievement
By FITCamaro on 6/17/2008 9:00:23 AM , Rating: 3
I don't think the prices on 8800GTs are going to get much lower. I mean you can get an ECS 512MB 8800GT with the AC Accelero S1 on it for $145 right now. The heatsink is $30 so you're paying $115 for the GPU.

RE: Great Achievement
By Reclaimer77 on 6/16/2008 9:07:50 PM , Rating: 4
Good points.

Personally, as a PC gamer, I would like to see SLI not become the standard that every game requires in order to be enjoyable.

I mean, Crysis isn't even that good of a game. It's certainly not revolutionary. It's just one game that someone decided to make at a time when hardly any system could run it at good frame rates. But everyone is running around acting like you need the two hottest video cards in SLI to enjoy high frame rates on the PC. I just don't get it.

Video cards are expensive enough. I enjoy building my own systems and buying all the components, with the emphasis on performance. But I would hate to see the day when you're pretty much required to buy two of the hottest video cards in SLI to enjoy PC gaming.

RE: Great Achievement
By SlyNine on 6/16/2008 11:56:56 PM , Rating: 2
You can play Crysis with an X800XT. Why people insist you need the best of the best in video cards, times two, to play this game is beyond me. Here's a clue: turn down the graphics. Unless you are implying that the devs should just turn the graphics down for you so no one else's game can look better?

As far as gameplay, that's pure opinion. I loved Crysis and thought it was a great game with a bad ending, but a lot of games and movies end up that way.

RE: Great Achievement
By othercents on 6/16/2008 4:51:29 PM , Rating: 2
The 9800GX2 outperforms this card too, and is running ~$480.


RE: Great Achievement
By Cunthor666 on 6/16/2008 7:14:05 PM , Rating: 2
...I'm still not too keen on buying two of something just to play new games for the next six months.

I wonder how much it actually costs to produce a video card these days, because the price for something that *still* can't give you decent frames at max settings on a 24" screen is beyond justification for an average gamer like myself.

RE: Great Achievement
By Silver2k7 on 6/17/2008 2:53:22 AM , Rating: 2
I agree, 75-80 fps @ 1920x1200 should be required of a flagship video card.

RE: Great Achievement
By FITCamaro on 6/17/2008 9:02:20 AM , Rating: 1
Why when the most the human eye can see is 60 fps?

RE: Great Achievement
By ali 09 on 6/17/08, Rating: 0
RE: Great Achievement
By FITCamaro on 6/17/2008 9:03:19 AM , Rating: 2
I am going off the review numbers here at Anandtech. Perhaps you should read the article. If you have an issue with their numbers, take it up with them.

RE: Great Achievement
By GhandiInstinct on 6/16/2008 4:48:55 PM , Rating: 2
Nvidia released the unappealing 9800 followed by the underwhelming GTX 200.

It's safe to say we're back to the GeForce days, where there's no competition and a company becomes stagnant.

I'm skipping the GT for their 2009 product, and will pick up a nice-performing 9800GX2 once it goes under $400.


RE: Great Achievement
By VahnTitrio on 6/16/2008 5:08:05 PM , Rating: 2
You forgot a pro (looking at that 3 way SLi):

World's best multipurpose space heater (well, con if you live in a warm environment).

1200w needed for SLI?
By DeepBlue1975 on 6/16/2008 4:02:37 PM , Rating: 2
I think this is the time when they should start thinking about power consumption as hard as they do about the GPU's performance.

RE: 1200w needed for SLI?
By FITCamaro on 6/16/08, Rating: -1
RE: 1200w needed for SLI?
By freaqie on 6/16/2008 4:37:10 PM , Rating: 4
Still, having an 800+ W PSU is no waste there. The efficiency of a PSU goes down as one approaches its maximum output, and so does its life expectancy. 1.2kW is overkill, yeah, but 880W or even 1kW is not, especially when one starts overclocking the cards and/or CPU. Also, I want to have some headroom when I buy a new PC so I can upgrade later (if needed).

RE: 1200w needed for SLI?
By theapparition on 6/17/2008 11:41:30 AM , Rating: 2
The efficiency of the PSU goes down as one approaches its maximum output. So does its life expectancy.

Pure FUD.
The efficiency of any power supply is greatest at max output. Anything less than max output and the efficiency drops.
As for the life of the components, running them at rated levels does not affect life. Heat, cycling (cold-hot), and running time affect life. All else equal, a PS running at max capacity will last just as long as one running at half capacity.
Information from my own experience and confirmed in Telcordia SR-332 and MIL-HDBK-217.

RE: 1200w needed for SLI?
By theapparition on 6/17/2008 11:43:26 AM , Rating: 2
I should also note, that every power supply manufacturer rates efficiency at max output as well.

RE: 1200w needed for SLI?
By Warren21 on 6/16/2008 5:17:35 PM , Rating: 3
Read the Anand review. Their OCZ WhateverStream 1000 would not post with GTX 280 SLI. NV recommends 1200W models on their supported PSU list.

RE: 1200w needed for SLI?
By Ringold on 6/16/2008 7:14:20 PM , Rating: 2
If an OCZ 1kw PSU can't even post with a system at idle, that clearly must indicate that the PSU is crap. Yes, maybe that indicates the GPUs are gulping power, but the PSU should be stable up to its full rated load.

Is that OCZ a rebadged lower quality unit, or is it 1000w 'peak'? Something must be cheap about it.

RE: 1200w needed for SLI?
By SlyNine on 6/17/2008 12:03:41 AM , Rating: 2
Post is one of the most demanding times for a power supply. I've seen many systems that wouldn't post but, after hitting the reset button, start up fine (most likely because the hard drives and everything have already spun up).

Not that I recommend they run their systems like that. But try convincing someone they need to spend $100 when their computer is running fine otherwise.

RE: 1200w needed for SLI?
By Silver2k7 on 6/17/2008 2:58:53 AM , Rating: 2
Can't post with a 1kW PSU, gah... is that really SLI or Tri-SLI??

I've never really known much about OCZ PSUs, but this does not sound so good. I bet their recent acquisition, PC Power & Cooling, is much better, and they have a 1200W-something PSU.

RE: 1200w needed for SLI?
By DeepBlue1975 on 6/16/2008 10:57:18 PM , Rating: 2
My main point was that a single GTX 280 draws 23 more watts (almost 10%) than a 9800GX2 while performance is usually lower, and yet it'll set you back an extra $150.

Not the best price/performance/power draw combo out there if you ask me :D

RE: 1200w needed for SLI?
By SlyNine on 6/17/2008 12:12:49 AM , Rating: 1
Ever since the 9700 Pro required a power connector, we've been on this path to inevitability. Every generation of video cards since has required more power and bigger, faster fans, and even then they produce more heat. The 1900XT and its big copper heatsink ran at around 80C at 3D speeds. Now 8800GTs run at around 95-100C.

It was only a matter of time before they either hit this wall or just started requiring what no user would be willing to buy.

The wall I'm talking about is the wall we hit when the 8800GTX was first released, how many years ago now? And since then, what have we seen: a card with an 80% performance increase, four times the power, and I have no clue how much heat that sucker must put out.

Unless things evolve fast in the manufacturing process, it could be a while before we see 4x the performance of the previous generation in 6 months like we did with the 9700 Pro over the Ti4600, because we can no longer just double the power requirements, the heat of the GPU, and the HSF size.

RE: 1200w needed for SLI?
By FITCamaro on 6/17/2008 10:09:02 AM , Rating: 2
Well, my 8800GTSs run under 65C at full load. :)

I do think you're right though. GPUs are eventually going to hit a wall because of the power requirements. A wall outlet can only put out so much power.

Unfortunately, to make things faster you have to add more transistors. More transistors mean more power. More power means more heat.

GPUs have far more transistors than CPUs in terms of transistors that are actually used for logic. The vast majority of a CPU's transistor count these days is for L2 cache, at least on Intel CPUs.

All that said, I am interested to see where ATI's 4xx0 series fits in.

RE: 1200w needed for SLI?
By DeepBlue1975 on 6/17/2008 10:40:18 AM , Rating: 2
That's where die shrinks come in to help.
The problem is, as NVIDIA is a fabless company, they have to rely on someone like TSMC, which has its own timing and investment strategy, and that can certainly mean not upgrading a process facility at the time a customer like NVIDIA needs it most.

As for the wall outlet's power, that's far from being a problem yet, but if 2000W PSUs start appearing on the PC market, I guess more than one person will have to choose whether to turn on the air conditioning or the PC :D

RE: 1200w needed for SLI?
By SlyNine on 6/17/2008 9:55:10 PM , Rating: 2
I wonder, since AMD does have that, could it help ATI's division? Or is it something that just doesn't translate into GPUs, or they wouldn't have enough resources to produce both?

Before, I thought TSMC did both AMD and ATI.

RE: 1200w needed for SLI?
By SlyNine on 6/17/2008 9:50:50 PM , Rating: 2
Not sure who rated me down, but they need to double-check what they know. While not all 8800GTs may run at 95C under load, many do run at 95, and the card doesn't even start to clock itself down until 110C. EVGA's first 8800GTs didn't even spin the fan up until they hit around 90C. The cards are supposed to run at up to 100C.

RE: 1200w needed for SLI?
By Das Capitolin on 6/17/2008 9:06:21 PM , Rating: 2
Where the hell do you get that figure? Under full load, this card consumed almost 200W. Take two of them for SLI, and you're up to 400W in video cards. 3-Way SLI moves it to 600W.

So even if you run 3-way SLI, you're still safe with an 800W PSU for your entire system.
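Das Capitolin's headroom argument, spelled out. A rough sketch using his ~200 W per-card full-load figure (a measured number he quotes, lower than the 236 W maximum board power in the article):

```python
# Rough GPU power totals for GTX 280 multi-card setups, using the
# ~200 W per-card full-load draw quoted in the comment above.
CARD_W = 200

gpu_totals = {n: n * CARD_W for n in (1, 2, 3)}
print(gpu_totals)  # 3-way SLI -> 600 W of video cards

# On an 800 W PSU, 3-way SLI leaves this much for the rest of the system:
headroom = 800 - gpu_totals[3]
print(headroom)  # 200 W
```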

Just curious
By GDstew4 on 6/16/2008 3:32:07 PM , Rating: 2
Has nVidia mentioned why they chose to stick with 10 and not go with 10.1 for this GPU family?

RE: Just curious
By Doormat on 6/16/2008 3:34:33 PM , Rating: 2
Ars had a quote from nvidia along the lines of "game developers didn't find it useful enough for us to include it in the chip".

RE: Just curious
By gyranthir on 6/16/2008 3:39:30 PM , Rating: 2

"We support Multisample readback, which is about the only dx10.1 feature (some) developers are interested in. If we say what we can't do, ATI will try to have developers do it, which can only harm pc gaming and frustrate gamers."

RE: Just curious
By JLL55 on 6/16/2008 3:49:30 PM , Rating: 2
As mentioned above, NVIDIA came up with a reason and AnandTech said, uh, BS. AnandTech further tried to reason out why they would not, and came up with some interesting ideas. I would recommend reading:

It's a very long article, but very interesting IMHO.

RE: Just curious
By gyranthir on 6/16/2008 4:03:15 PM , Rating: 2
Right, but the question was: has NVIDIA mentioned why...

That is their reasoning behind it. Also, I don't really agree with all of Anand's reasoning. Most of that stuff, although cool, isn't terribly useful, and does not improve performance by more than like 1/10 of a percent.

Lots of developers are barely catching up to 10, let alone 10.1, and they have to make games backwards compatible too, which gives them more issues to deal with.

RE: Just curious
By Chadder007 on 6/16/2008 4:34:30 PM , Rating: 2
Wasn't there a game that supposedly supported 10.1 and ran faster on ATI's cards, since they have supported it? I think NVIDIA (supposedly) then got with the game developers and patched the game to run only in 10 mode, which made the game slower on ATI cards.
I believe the game was Assassin's Creed.


RE: Just curious
By Polynikes on 6/16/2008 5:15:35 PM , Rating: 2
It was revealed that the reason for the faster performance in DX10.1 was because of the way it was implemented in the game. They basically forgot some stuff so since there was less load it had higher framerates. They got rid of DX10.1 support for the game because they didn't implement it correctly.

RE: Just curious
By Polynikes on 6/16/2008 5:16:26 PM , Rating: 1
Forgot to add: It had nothing to do with influence from Nvidia.

RE: Just curious
By ninjaquick on 6/17/2008 11:41:47 AM , Rating: 2
Give us a quote. I saw some screens and it looked fine. It ran faster on DX10.1 hardware and the incompatible NVIDIAs suffered. So give me some quotes that prove the move back to DX10 was just a bug fix, and I'd also like to see the bad code.

RE: Just curious
By ninjaquick on 6/17/2008 11:43:46 AM , Rating: 2
Oh, and don't give me performance gains without AA, since DX10.1 is supposed to show performance gains with 4xAA on (it's an API hardware requirement).

RE: Just curious
By Polynikes on 6/17/2008 1:44:04 PM , Rating: 2
However, the performance gains that were observed in the retail version are inaccurate since the implementation was wrong and a part of the rendering pipeline was broken.

Unfortunately, our original implementation on DX10.1 cards was buggy and we had to remove it.

As for your retarded demand that I give you code, well, I don't work for Ubisoft.

RE: Just curious
By kilkennycat on 6/17/2008 5:40:58 PM , Rating: 2
The reasons quoted by nVidia for not going with DX10.1 appear to be mere platitudes to calm consumer purchasers. The focus for the release of the GTX 280/260 does not appear to be desktop gamers at all -- though if PC gamers with lots of spare cash and the "need to have the best" buy them, it sure helps cover the development cost, and the retail availability of the 280/260 will fend off competition for that money from ATi. The true focus of the 280/260 is the non-consumer crowd that needs massively parallel processing on the desktop: the Tesla initiative, the CUDA users. No need for any increment in DX10 for these folk... The GTX 280/260 removes some very significant time-consuming computational barriers for these users. Notice the double-precision data paths and the significant additional on-board processing horsepower. A very rapidly growing and extremely profitable chunk of nVidia's business. No doubt nVidia has its true next-gen (DX10.x etc.) GPU/GPGPU family in development. Expect to see it some time early next year.

A die-shrink of the 280/260 for either the TSMC 55nm or 45nm process is highly likely before the end of 2008. The new family of GPU/GPGPUs then will no doubt be introduced on the same process. A variant of Intel's "tic-toc" strategy.

Impressive.... But not the GT200
By pauldovi on 6/16/2008 4:30:18 PM , Rating: 1
Remember, the G80 came out in November 2006 and it still hasn't been topped by either Nvidia or ATI. That speaks very highly of the achievement that was the G80. Now ATI and Nvidia need to step it up. Yes, the G92 was a great improvement in cost and heat, but I honestly expected more from 1.5 years of development.

RE: Impressive.... But not the GT200
By Parhel on 6/16/2008 4:44:12 PM , Rating: 2
Remember, the G80 came out in November 2006 and it still hasn't been topped by either Nvidia or ATI.

What do you mean? These cards beat the G80 soundly.

RE: Impressive.... But not the GT200
By pauldovi on 6/16/2008 7:25:24 PM , Rating: 2
You have to look at the overall picture and not just the top score.

2 x 8800GTs are cheaper than, and beat, the GT280.

RE: Impressive.... But not the GT200
By Lightning III on 6/16/2008 8:57:54 PM , Rating: 2
an 8800gt is really a g92 part

By pauldovi on 6/16/2008 10:14:57 PM , Rating: 1
Which is nothing but a 65nm G80.

RE: Impressive.... But not the GT200
By just4U on 6/16/2008 10:32:42 PM , Rating: 2

Remember, the G80....

I didn't actually expect all that much more over the past 1.5 years. It's hard to top what they did with the G80 in such a short time frame. It's one of those cards that comes out every 5 years or so; after that, it's speed bumps until the next great design comes along.

RE: Impressive.... But not the GT200
By SlyNine on 6/17/2008 12:17:49 AM , Rating: 4
The 9700 Pro was easily one of those cards; ever since then we have just tripled the size of the heatsink, doubled power demands, and increased the heat the card runs at.

We are at a point where that is no longer acceptable. What are they going to do, release a card that doubles power demands, increases the heat the card runs at, and uses a 3-slot GPU cooler to get 4x the performance again? And even if that worked for this generation, what about the next one?

RE: Impressive.... But not the GT200
By just4U on 6/18/2008 4:50:32 PM , Rating: 2
I agree with you.

ALTHOUGH... some of those new coolers (the 3870 and 8800 GTS/X line) do look kinda funky. People's eyes get big and round when I say...

...and that's your video card!

The high price makes sense...
By Brian23 on 6/16/2008 8:34:50 PM , Rating: 1
Everyone is complaining about the high price of these chips, but if you look at the numbers, you'll see that Nvidia isn't making as much money on them as you think.

Here are some example calculations:
1. We'll assume that yields are 100% for the sake of argument. I know this isn't right, but I don't know what they actually are.
2. We'll assume that we're using 300mm wafers.
3. We'll assume that the wafer manufacturing costs are equal to the retail price of the product (i.e. it costs $650 to produce one die of a GT280, and the cost of all other materials is 0). I know this isn't true, but it's to illustrate a point.

One 300mm wafer can hold 105 GT200s, 630 Penryns, or 2,500 atoms.

Assume a Penryn costs $189 and an atom costs $50.

105 GT200 x $650 = $68,250
630 Penryn x $189 = $119,070
2,500 Atom x $50 = $125,000

As you can see from the above numbers, it is much more profitable for TSMC to make something other than the GT200. When you factor in that the GT200 will have lower yields, and that it takes a lot more than a piece of silicon to make a video card compared to just a CPU, you'll see that Nvidia really isn't making much money here.
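Brian23's revenue-per-wafer comparison, under his stated (admittedly unrealistic) assumptions of 100% yield and the die priced at full retail:

```python
# Revenue per 300 mm wafer under the comment's simplifying assumptions:
# 100% yield, and the whole retail price attributed to the die.
wafers = {
    "GT200":  (105, 650),   # (dies per wafer, $ per die)
    "Penryn": (630, 189),
    "Atom":   (2500, 50),
}

revenue = {chip: n * price for chip, (n, price) in wafers.items()}
print(revenue)
# The GT200 wafer grosses the least, which is the comment's point:
# a wafer start is worth more to TSMC making smaller dies.
```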

RE: The high price makes sense...
By Parhel on 6/17/2008 12:15:38 AM , Rating: 2
Also Intel will sell more Penryns than Nvidia will sell GT200s, probably by a few orders of magnitude. Nvidia needs to make more money per chip sold, and there isn't anything wrong with that. If companies couldn't make money off GPUs, they couldn't stay in business making them, and we'd be some pretty unhappy gamers.

That said, $650 is too much to charge for a card that can be nearly matched with a pair of $140 8800GTs. I don't fault them for the price of the card. I fault Nvidia for designing a next-gen card that cut the price/performance ratio of the previous generation in half. From a business perspective, I fault them for diluting the market and cannibalizing their own lineup with cheap kick-ass cards mere months before introducing their uber-expensive next-gen card. With cheap 8800GTs on the market, who in their right mind would buy this?

RE: The high price makes sense...
By Maskarat on 6/17/2008 4:24:29 AM , Rating: 2
And how's all that the consumer's problem???

Nvidia is trying to sell a hot, not-so-good, expensive chip!! That's Nvidia's problem, not the consumer's problem! Why should the consumer carry around Nvidia's problem?

RE: The high price makes sense...
By Brian23 on 6/17/2008 7:03:35 AM , Rating: 2
I'm not saying it's the consumer's fault; I'm just pointing out that the price isn't unreasonable for what you're getting, as far as silicon goes.

That doesn't change the fact that the G80 and G92 are a real bargain.

Nvidia saw a market for a new high end chip and took advantage of it. The thing is, if you want one of these enough, you'll pay for it. If not, then you're not part of the market that Nvidia is targeting.

By VooDooAddict on 6/17/2008 5:51:22 PM , Rating: 2
All that said, it's better that they produce a few of these for sale to recoup some costs, then concentrate on getting the new GPUs out the door.

I hope we'll simply see NVIDIA spend less on marketing this generation and more on accelerating the die shrink.

RE: The high price makes sense...
By MAIA on 6/17/2008 12:06:01 PM , Rating: 2
The real problem was never nvidia profiting from this chip. The real problem is the price you have to pay for it compared with similar performers at lower prices.

As a consumer, what matters most is the money you have to pay, not what the company profits.

RE: The high price makes sense...
By MrBungle123 on 6/17/2008 12:23:23 PM , Rating: 2
Here are some example calculations:
1. We'll assume that yields are 100% for the sake of argument. I know this isn't right, but I don't know what they actually are.
2. We'll assume that we're using 300mm wafers.
3. We'll assume that the wafer manufacturing costs are equal to the retail price of the product (i.e., it costs $650 to produce one GT280 die and the cost of all other materials is zero). I know this isn't true, but it's to illustrate a point.

My dad works at a 200mm TSMC fab; according to him, the 200mm wafers cost $1600 each. Assuming that price level...

A = pi*r^2
A = 3.14 * (100mm)^2 = 31,400mm^2

So there is 31,400 square mm of space on a 200mm wafer. The GT200 chip is 576mm^2, which means they can get 54 chips out of a 200mm wafer. However, because of a combination of yields and the fact that the chips are square and the wafers are round, that number should be multiplied by 0.80, which gives us 43 chips per 200mm wafer.

$1600 / 43 = $37.20 per chip.

By the time it's all said and done there is maybe $100 worth of parts in those cards, so don't feel too sorry for nVidia.
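The back-of-the-envelope math above can be sketched in a few lines of Python. All figures ($1600 per wafer, 576mm^2 die, 80% edge-loss/yield factor) are the commenter's assumptions, not published numbers:

```python
import math

# Commenter's assumed figures (not official data)
WAFER_COST_USD = 1600.0     # quoted price for one 200mm wafer
WAFER_DIAMETER_MM = 200.0
DIE_AREA_MM2 = 576.0        # reported GT200 die size
YIELD_FACTOR = 0.80         # rough correction for edge loss and defects

# Usable wafer area: A = pi * r^2
wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2  # ~31,416 mm^2

gross_dies = int(wafer_area // DIE_AREA_MM2)   # 54 dies, ignoring edges
good_dies = int(gross_dies * YIELD_FACTOR)     # 43 usable dies
cost_per_die = WAFER_COST_USD / good_dies      # ~$37 per die

print(f"{gross_dies} gross dies, {good_dies} good dies, ${cost_per_die:.2f} each")
```

Note this ignores packaging, board components, memory, and the R&D amortization the reply below brings up, so it is a floor on the silicon cost, not an estimate of the card's cost.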

By Das Capitolin on 6/17/2008 9:10:29 PM , Rating: 2
I suppose you should account for the cost of research and development, testing, failures, marketing, shipping, logistics, and last but not least employees and their healthcare. At the end of the day, I doubt it will be a $37 chip.

Now, try doing the math on an Intel or AMD CPU. Better yet, a quad-core or extreme edition. Intel is making HUGE bank.

By Staples on 6/16/2008 3:57:26 PM , Rating: 4
I really hope ATI does really well this time around so that NVIDIA will stop this crappy strategy of releasing really expensive cards 6+ months before their mainstream cards.

RE: Uninterested
By KCjoker on 6/16/2008 5:10:57 PM , Rating: 1
If you call Nvidia's strategy crappy, what do you call ATI's, since they've been behind for a very long time now?

RE: Uninterested
By Goty on 6/16/2008 8:34:34 PM , Rating: 4
By "very long time", he means "one generation".

RE: Uninterested
By SlyNine on 6/17/2008 12:21:43 AM , Rating: 3
Anyone remember the 5800 Ultra? Nvidia's had its share of screw-ups.

By gyranthir on 6/16/2008 3:19:45 PM , Rating: 5
Too rich for my blood. And my blood's pretty rich....

RE: $649
By ajfink on 6/16/2008 3:50:40 PM , Rating: 2
They're both overpriced. The 3870X2 trades blows with the GTX 260, and two 8800GTs in SLI make the 280's price seem obscene.

I'm sure ATI's upcoming launch will bring the prices down a little more, though.

RE: $649
By GhandiInstinct on 6/16/2008 6:34:15 PM , Rating: 2
Considering that fewer and fewer developers are making games for the PC, this pricing seems insane. Who are they targeting? Minimal performance gains over current solutions and no game support?

RE: $649
By Pirks on 6/16/08, Rating: 0
RE: $649
By SlyNine on 6/17/2008 12:20:14 AM , Rating: 2
We PC gamers have been hearing this for years. So far, plenty of PC games still get released and bought. We are not going anywhere.

It's not like anyone needs this. Just get a 3850 and game on.

By Chosonman on 6/16/08, Rating: 0
RE: fck!
By majBUZZ on 6/16/2008 10:14:16 PM , Rating: 3
Seems like we have reached a point where the hardware is outpacing the software; by that I mean games that need, or can use, the full power of current video cards are few and far between.

RE: fck!
By BruceLeet on 6/16/2008 11:41:34 PM , Rating: 2
Don't forget all those "But can it run" threads. Are graphics that good, or is it just poorly optimized? I say both.

RE: fck!
By SlyNine on 6/17/2008 12:26:14 AM , Rating: 2
I remember when video cards got 3-4 times more powerful every year. It's been 1 1/2 years and this is only 80-90% more powerful.

So either software development has slowed down a lot, which isn't the case when you consider Supreme Commander and Crysis, or hardware progress has. No computer setup can run SC at max settings and keep simulation speeds at 1x.

Looking for mid range cards
By jeromekwok on 6/16/2008 9:56:52 PM , Rating: 2
It sounds great. Yet another expensive toy that will be obsolete in 2 years. However, Nvidia needs to address mainstream users.

I have my PC next to my bed and I have to run it overnight. I need a video card that is silent and cool. I have a 3470 now, but it is too weak to play any game.

I am looking for a passively cooled video card, preferably without a 6-pin power connector to save on my electricity bill, and able to run Red Alert 3 at 1920x1200 at 30fps.

The 9600GT looks OK, but it's too hot and power hungry. I hope Nvidia addresses this market next refresh.

RE: Looking for mid range cards
By kake on 6/16/2008 11:43:50 PM , Rating: 2
It sounds like you're looking for a Hummer H2 with the performance and handling of a Mustang GT and the gas mileage of a Prius.

Sign me up, I'll take one too.

RE: Looking for mid range cards
By MAIA on 6/17/2008 12:09:01 PM , Rating: 2
Yeah, lol. This is more like chicken-and-egg reasoning ...

Con DX10.1
By Woobagong on 6/17/2008 3:34:16 AM , Rating: 2
I think the decision against 10.1 came out of several arguments.

Software: We are still happy to see new 10.0 titles that really use the advantages of 10.0; developers move slower than the hardware industry. Hopefully next-gen graphics APIs will help there in the future.

Operating System: 10.1 is Vista-exclusive; XP users won't get any advantage from it. Since Vista is still struggling to win a majority among gamers, it's a good idea to wait, milk some more out of the older technology, and invest later. Maybe we won't ever see an NVidia 10.1 card and they'll skip straight to 11.0?

GPU die: this GPU is already packed with a huge number of transistors. Maybe they wanted to wait for the next die shrink to add decent support for the newest DirectX features?

Stick to their strengths: We'll see whether this strategy is a success. After next Xmas we'll know more.

RE: Con DX10.1
By Goty on 6/19/2008 1:48:57 AM , Rating: 2
Software: Saying "we're satisfied" is another way of saying either "we're too lazy/greedy to give our paying customers better features" or "our competition's implementation is better."

Operating System: Ok, by that argument, why support DX10 at all? It's not supported on XP, either.

GPU die: Didn't ATI implement DX10.1 on the 3800 series of cards and REDUCE transistor count?

Video Card... or Blade?
By alphadog on 6/18/2008 2:39:49 PM , Rating: 2
Anyone notice how the new video cards (and motherboards) look more and more like blades (and chassis)?
