
NVIDIA GTX 280 3-Way SLI  (Source: NVIDIA)

The GPUs in the series include the GTX 280 and GTX 260

NVIDIA launched a new family of GPUs today called the GTX 200 series. The series currently includes two GPUs -- the GTX 280 and the GTX 260. The GTX 280 is now NVIDIA's flagship GPU and sits above the 9800 GX2 in the lineup.

NVIDIA is stressing that the new GTX 200 family goes beyond gaming: the GPUs are among the most powerful processors in a PC and can be used for rendering video and other tasks. NVIDIA says its goals with the GTX 200 architecture were to design a processor twice as powerful as the GeForce 8800 GTX; to rebalance the architecture for future games with more complex shaders and more memory; to improve efficiency per watt and per square millimeter; to enhance CUDA performance; and to significantly reduce idle power requirements.

NVIDIA says that the GTX 200 line provides nearly a teraflop of computational power. The GTX 200 family also offers support for PhysX-powered physics processing right on the GPU. Both the new GTX 280 and GTX 260 support SLI and 3-way SLI; the previous NVIDIA 9800 GX2 could not support 3-way SLI.

Key features in the new GTX 200 GPUs include support for three times the number of threads in flight at any given time. A new scheduler design allows for 20% more texturing efficiency. The GTX 280 has a 512-bit memory interface, and full-speed raster-operation (ROP) frame blending is supported. The GTX 200 series also features twice the number of registers, for longer and more complex shaders, as well as IEEE 754R double-precision floating point. The GTX 200 line also supports 10-bit color scan-out, via DisplayPort only.

One of the main goals with the GTX 200 line was improved power management. Both GTX 200 series GPUs have idle power requirements of about 25W; during Blu-ray playback, power requirements are around 35W; full 3D power requirements vary, topping out at 236W (GTX 280). The GTX 200 line is also compatible with HybridPower, which drops the GPU's effective power draw to 0W.

The GTX 280 is built on a 65nm process and has 1.4 billion transistors. The stock video cards have a graphics clock of 602 MHz, processor clock of 1,296 MHz, and a memory clock of 2,214 MHz. The GTX 280 has 1GB of GDDR3 and 240 processing cores and 32 ROPs.

The GTX 260 is also built on the 65 nm process and has 1.4 billion transistors. The graphics clock for the GTX 260 is 576 MHz, the processor clock is 1,242 MHz, and the memory clock is 1,998 MHz. The memory interface on the GTX 260 is 448-bit and it has 896MB of GDDR3 memory. The GTX 260 has 192 processing cores and 28 ROPs. The maximum board power is 182W.
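The "nearly a teraflop" figure above follows directly from the published core counts and shader clocks. As a back-of-the-envelope sketch (the factor of 3 flops per core per clock, from the dual-issue MAD + MUL, is an assumption about how NVIDIA's marketing number is derived, not something stated in this article):

```python
def shader_gflops(cores, shader_clock_mhz, flops_per_clock=3):
    """Peak single-precision GFLOPS = cores * shader clock (MHz) * flops per clock / 1000.

    flops_per_clock=3 assumes the dual-issue MAD (2 flops) + MUL (1 flop)
    per core per cycle commonly cited for this generation.
    """
    return cores * shader_clock_mhz * flops_per_clock / 1000.0

# Figures from the spec paragraphs above:
gtx280 = shader_gflops(240, 1296)  # 240 cores at 1,296 MHz
gtx260 = shader_gflops(192, 1242)  # 192 cores at 1,242 MHz
print(f"GTX 280: {gtx280:.0f} GFLOPS, GTX 260: {gtx260:.0f} GFLOPS")
```

Under that assumption the GTX 280 works out to roughly 933 GFLOPS, which matches the "nearly a teraflop" claim, with the GTX 260 at roughly 715 GFLOPS.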

Both video cards will support PhysX processing on the GPU. NVIDIA purchased Ageia in early 2008.

GTX 280 video cards will be available tomorrow for $649 and the GTX 260 cards will be available on June 26 for $399.

Comments

RE: Great Achievement
By FITCamaro on 6/16/2008 3:51:12 PM , Rating: 4
The performance of these cards right now definitely does not justify the price. A pair of 8800GTs in SLI routinely outperforms even the GTX 280. Cost of a pair of 8800GTs is $300-350.

In the single card world, yes these are good parts. But if you can get SLI, the 8800GT or GTS is a far better value. And yes it will be interesting to see how the 4850 stacks up against these two.

RE: Great Achievement
By AssassinX on 6/16/2008 4:30:40 PM , Rating: 5
Yeah, IMO, it's kind of a win-lose/lose-win situation with SLI.

Create a new video card that is more powerful than the top of the line video card in SLI mode and people will say that SLI is useless and just buy the new video card.

Create a new video card that is less powerful than the top of the line video card in SLI mode and people will say it is not worth the money.

I realize price is a huge factor in determining whether it's worth it or not, but really, if Nvidia did drop the price of the GTX 200 family, we'd have people saying again to screw SLI and just go for the new card. Obviously this is just my opinion, but it seems like almost any tech forum swings back and forth on the subject of SLI. Still, I do find it interesting to see SLI as a viable upgrade option, since it TYPICALLY SEEMS like a waste of money, power/heat, and space.

RE: Great Achievement
By FITCamaro on 6/16/2008 4:47:14 PM , Rating: 3
If even the GTX 260 performed at the level of two 8800GTs in SLI for $300-325, I'd consider it a good buy. But when it's about 33% more money and around 10-15% slower, it's not.

When you look at the fact that the GTX 280 is over double the cost and doesn't even perform identically most of the time, it's definitely not a good buy.

The only time it'd seem reasonable to run it is if you don't have a dual-slot board. But in my mind, if you can afford a $650 graphics card, you probably have a motherboard with dual x16 slots.

RE: Great Achievement
By Parhel on 6/16/2008 4:56:33 PM , Rating: 2
. . . its definitely not a good buy.

I wish I could disagree with you there. I guess that nvidia's yields were so bad that the only choice was to release them at prices far beyond the price/performance levels of other options.

It's a very impressive chip. If the die shrink allows them to raise clock speeds, reduce heat and power, and lower the price it might be worth purchasing. Hopefully they will execute on that. That said, if I were looking for a card right now, I'd be anxiously awaiting AMD's new lineup.

RE: Great Achievement
By Adonlude on 6/16/2008 6:13:06 PM , Rating: 5
Holy heatsinks, Batman! The only power supplies certified to run SLI with these bad boys are 1200W or above. I have a single 8800GTX in my machine and it literally heats my small room. In the summer it gets pretty hot, and I always have to have my window and door open to get airflow through my room.

I am amused that I would literally need to install an A/C unit into my room if I wanted to SLI these suckers.

RE: Great Achievement
By ImSpartacus on 6/16/2008 5:20:49 PM , Rating: 2
I agree. I never thought about what would happen if nVidia released a product that was inferior to two last-gen GPUs in SLI.

I actually have an 8800GT, and I may wait until 8800GTs go down in price and get another.

RE: Great Achievement
By paydirt on 6/17/2008 8:51:21 AM , Rating: 2
Huh? Maybe you're being sarcastic? Recent history (the past 3-4 years?) shows that nVidia releases a single card that underperforms two lesser-gen cards in SLI, and the two lesser-gen cards combined cost less than the new single card. This isn't a new concept; I'm not sure why people are reacting the way they are, or why they think things should have been different this time.

RE: Great Achievement
By FITCamaro on 6/17/2008 9:00:23 AM , Rating: 3
I don't think the prices on 8800GTs are going to get much lower. I mean you can get an ECS 512MB 8800GT with the AC Accelero S1 on it for $145 right now. The heatsink is $30 so you're paying $115 for the GPU.

RE: Great Achievement
By Reclaimer77 on 6/16/2008 9:07:50 PM , Rating: 4
Good points.

Personally as a PC gamer I would like to see SLI not become the standard by which every game will be required to run to enjoy it.

I mean, Crysis isn't even that good of a game. It's certainly not revolutionary. It's just one game that someone decided to make at a time when hardly any system could run it at good frame rates. But everyone is running around acting like you need the two hottest video cards in SLI to enjoy high frame rates on the PC. I just don't get it.

Video cards are expensive enough. I enjoy building my own systems and buying all the components, with the emphasis on performance. But I would hate to see the day where you're pretty much required to buy two of the hottest video cards in SLI to enjoy PC gaming.

RE: Great Achievement
By SlyNine on 6/16/2008 11:56:56 PM , Rating: 2
You can play Crysis with an X800XT. Why people insist you need the best of the best in video cards x2 to play this game is beyond me. Here's a clue: turn down the graphics. Unless you are implying that the devs should just turn the graphics down for you so no one else's game can look better?

As far as gameplay, that's pure opinion. I loved Crysis and thought it was a great game with a bad ending, but a lot of games and movies end up that way.

RE: Great Achievement
By othercents on 6/16/2008 4:51:29 PM , Rating: 2
The 9800GX2 outperforms this card too and is running ~$480.


RE: Great Achievement
By Cunthor666 on 6/16/2008 7:14:05 PM , Rating: 2
...I'm still not too keen on buying two of something just to play new games for the next six months.

I wonder how much it actually costs to produce a video card these days, because the price for something that *still* can't give you decent frames at max settings on a 24" screen is beyond justification for an average gamer like myself.

RE: Great Achievement
By Silver2k7 on 6/17/2008 2:53:22 AM , Rating: 2
I agree, 75-80 fps @ 1900x1200 should be required of a flagship video card.

RE: Great Achievement
By FITCamaro on 6/17/2008 9:02:20 AM , Rating: 1
Why, when the most the human eye can see is 60 fps?

RE: Great Achievement
By ali 09 on 6/17/08, Rating: 0
RE: Great Achievement
By FITCamaro on 6/17/2008 9:03:19 AM , Rating: 2
I am going off the review numbers here at Anandtech. Perhaps you should read the article. If you have an issue with their numbers, take it up with them.

Related Articles
Update: NVIDIA to Acquire AGEIA
February 4, 2008, 5:31 PM

Copyright 2016 DailyTech LLC.