



NVIDIA GTX 280 3-Way SLI  (Source: NVIDIA)

NVIDIA GTX 280 GPU  (Source: NVIDIA)
The GPUs in the series include the GTX 280 and GTX 260

NVIDIA launched a new family of GPUs today called the GTX 200 series. Within the series there are currently two GPUs -- the GTX 280 and the GTX 260. The GTX 280 is now NVIDIA's flagship GPU and sits above the 9800 GX2 in the product line.

NVIDIA is stressing that the new GTX 200 family goes beyond gaming: the GPU is one of the most powerful processors in a PC and can be used for video rendering and other tasks. NVIDIA says its goals with the GTX 200 architecture were to design a processor twice as powerful as the GeForce 8800 GTX, rebalance the architecture for future games with more complex shaders and more memory, improve efficiency per watt and per square millimeter, provide enhanced CUDA performance, and significantly reduce idle power requirements.

NVIDIA says that the GTX 200 line provides nearly a teraflop of computational power. The GTX 200 family also offers support for PhysX-powered physics processing right on the GPU. Both the new GTX 280 and GTX 260 support SLI and 3-way SLI; the previous NVIDIA 9800 GX2 could not support 3-way SLI.
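As a rough sanity check on the "nearly a teraflop" claim, peak throughput can be estimated from the core count and processor clock quoted later in this article, assuming NVIDIA's stated 3 floating-point operations per core per clock (dual-issue MAD plus MUL). A back-of-the-envelope sketch in Python, not a measured number:

# Peak single-precision throughput estimate for the GTX 280, using the
# specs quoted below. The 3 ops/clock figure is NVIDIA's architectural
# claim (MAD + MUL per core per clock), not a benchmark result.
cores = 240                # processing cores (GTX 280)
shader_clock_hz = 1.296e9  # "processor clock" of 1,296 MHz
ops_per_clock = 3          # dual-issue MAD (2 flops) + MUL (1 flop)

peak_gflops = cores * shader_clock_hz * ops_per_clock / 1e9
print(f"GTX 280 peak: {peak_gflops:.0f} GFLOPS")  # ~933 GFLOPS, "nearly a teraflop"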

Key features in the new GTX 200 GPUs include support for three times the number of threads in flight at any given time. A new scheduler design allows for 20% more texturing efficiency. The memory interface is 512-bit (GTX 280), and full-speed raster-operation (ROP) frame blending is supported. The GTX 200 series also features twice the number of registers, for longer and more complex shaders, and IEEE 754R double-precision floating-point. The GTX 200 line also supports 10-bit color scan-out, via DisplayPort only.

One of the main goals with the GTX 200 line was improved power management. Both GTX 200 series GPUs have idle power requirements of about 25W; during Blu-ray playback, power requirements are around 35W; full 3D power requirements vary, topping out at 236W (GTX 280). The GTX 200 line is also compatible with HybridPower, which can take the discrete GPU's power needs to effectively 0W by handing work off to the motherboard's integrated GPU.

The GTX 280 is built on a 65nm process and has 1.4 billion transistors. Stock video cards have a graphics clock of 602 MHz, a processor clock of 1,296 MHz, and a memory clock of 2,214 MHz. The GTX 280 has 1GB of GDDR3 memory, 240 processing cores, and 32 ROPs.

The GTX 260 is built on the same 65nm process and also has 1.4 billion transistors. The graphics clock for the GTX 260 is 576 MHz, the processor clock is 1,242 MHz, and the memory clock is 1,998 MHz. The memory interface on the GTX 260 is 448-bit, and it has 896MB of GDDR3 memory. The GTX 260 has 192 processing cores and 28 ROPs. The maximum board power is 182W.
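Peak memory bandwidth for both cards follows directly from the bus width and memory clock quoted above. A small sketch of that arithmetic, treating the quoted memory clocks as effective GDDR3 data rates:

# Peak bandwidth = bus width in bytes x effective data rate.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

print(f"GTX 280: {bandwidth_gb_s(512, 2214):.1f} GB/s")  # ~141.7 GB/s
print(f"GTX 260: {bandwidth_gb_s(448, 1998):.1f} GB/s")  # ~111.9 GB/s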

Both video cards will support PhysX processing on the GPU; NVIDIA purchased PhysX developer Ageia in early 2008.

GTX 280 video cards will be available tomorrow for $649 and the GTX 260 cards will be available on June 26 for $399.



Comments



The high price makes sense...
By Brian23 on 6/16/2008 8:34:50 PM , Rating: 1
Everyone is complaining about the high price of these chips, but if you look at the numbers, you'll see that Nvidia isn't making as much money on them as you think.

Here are some example calculations:
1. We'll assume that yields are 100% for the sake of argument. I know this isn't right, but I don't know what they actually are.
2. We'll assume that we're using 300mm wafers.
3. We'll assume that wafer manufacturing costs are equal to the retail price of the product (i.e., it costs $650 to produce one GT200 die and the cost of all other materials is $0). I know this isn't true, but it's to illustrate a point.

Numbers:
One 300mm wafer can hold 105 GT200s, 630 Penryns, or 2,500 Atoms.

Assume a Penryn costs $189 and an Atom costs $50.

105 GT200 x $650 = $68,250
630 Penryn x $189 = $119,070
2,500 Atom x $50 = $125,000

As you can see from the above numbers, it is much more profitable for TSMC to make something else than the GT200. When you factor in that the GT200 will have lower yields, and that it takes a lot more than a piece of silicon to make a video card compared to just a CPU, you'll see that Nvidia really isn't making much money here.
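For anyone who wants to rerun the numbers, here is the comparison above reproduced as a short Python sketch; the die counts and per-die prices are the commenter's stated assumptions (100% yield, retail price standing in for cost), not real figures:

# Revenue per 300mm wafer under the commenter's assumptions.
chips = {
    "GT200":  (105, 650),   # (dies per wafer, assumed $ per die)
    "Penryn": (630, 189),
    "Atom":   (2500, 50),
}
for name, (dies, price) in chips.items():
    print(f"{name:>7}: {dies:>5} dies x ${price:>3} = ${dies * price:,}")
# GT200  comes to  $68,250 per wafer,
# Penryn comes to $119,070, and Atom to $125,000.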




RE: The high price makes sense...
By Parhel on 6/17/2008 12:15:38 AM , Rating: 2
Also, Intel will sell more Penryns than Nvidia will sell GT200s, probably by a few orders of magnitude. Nvidia needs to make more money per chip sold, and there isn't anything wrong with that. If companies couldn't make money off GPUs, they couldn't stay in business making them, and we'd be some pretty unhappy gamers.

That said, $650 is too much to charge for a card that can be nearly matched with a pair of $140 8800GTs. I don't fault them for the price of the card. I fault Nvidia for designing a next-gen card that cut the price/performance ratio of the previous generation in half. From a business perspective, I fault them for diluting the market and cannibalizing their own lineup with cheap kick-ass cards mere months before introducing their uber-expensive next-gen card. With cheap 8800GTs on the market, who in their right mind would buy this?


RE: The high price makes sense...
By Maskarat on 6/17/2008 4:24:29 AM , Rating: 2
And how's all that the consumer's problem???

Nvidia is trying to sell a hot, not so good, expensive chip!! That's nvidia's problem .. not the consumer's problem! Why should the consumer carry around nvidia's problem?


RE: The high price makes sense...
By Brian23 on 6/17/2008 7:03:35 AM , Rating: 2
I'm not saying it's the consumer's fault. I'm just pointing out that the price isn't unreasonable for what you're getting, as far as silicon goes.

That doesn't change the fact that the G80 and G92 are a real bargain.

Nvidia saw a market for a new high end chip and took advantage of it. The thing is, if you want one of these enough, you'll pay for it. If not, then you're not part of the market that Nvidia is targeting.


By VooDooAddict on 6/17/2008 5:51:22 PM , Rating: 2
All that said, it's better that they produce a few of these for sale to recoup some costs, then concentrate on getting the next GPUs out the door.

I hope we'll simply see NVIDIA spend less on marketing this generation and more on accelerating the die shrink.


RE: The high price makes sense...
By MAIA on 6/17/2008 12:06:01 PM , Rating: 2
The real problem was never about Nvidia profiting from this chip. The real problem is the price you have to pay for it compared with similar performers at lower prices.

As a consumer, what matters most is the money you have to pay, not what the company profits.


RE: The high price makes sense...
By MrBungle123 on 6/17/2008 12:23:23 PM , Rating: 2
quote:
Here are some example calculations:
1. We'll assume that yields are 100% for the sake of argument. I know this isn't right, but I don't know what they actually are.
2. We'll assume that we're using 300mm wafers.
3. We'll assume that wafer manufacturing costs are equal to the retail price of the product (i.e., it costs $650 to produce one GT200 die and the cost of all other materials is $0). I know this isn't true, but it's to illustrate a point.


My dad works at a 200mm TSMC fab... according to him, the 200mm wafers cost $1600 each. Assuming that price level...

A = pi * r^2 = 3.14 * (100 mm)^2 ≈ 31,400 mm^2

So there is 31,400 square mm of space on a 200mm wafer. The GT200 chip is 576mm^2, so that means they can get 54 chips out of a 200mm wafer. However, because of a combination of yields and the fact that the chips are square and the wafers are round, that number should be multiplied by 0.80, which gives us 43 chips per 200mm wafer.

$1600 / 43 = $37.21 per chip.
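The same estimate, parameterized as a Python sketch; the $1600 wafer price is the commenter's secondhand figure, and the 0.80 factor is his rough allowance for edge loss and yield, not a real defect-density model:

import math

wafer_diameter_mm = 200
wafer_cost_usd = 1600      # commenter's quoted 200mm wafer price
die_area_mm2 = 576         # GT200 die size at 65nm
usable_fraction = 0.80     # rough edge-loss/yield fudge factor

wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2      # ~31,416 mm^2
dies = int(wafer_area / die_area_mm2 * usable_fraction)  # ~43 dies
print(f"~{dies} dies/wafer -> ${wafer_cost_usd / dies:.2f} per die")  # ~$37.21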

By the time it's all said and done, there is maybe $100 worth of parts in those cards, so don't feel too sorry for nVidia.


By Das Capitolin on 6/17/2008 9:10:29 PM , Rating: 2
I suppose you should account for the cost of research and development, testing, failures, marketing, shipping, logistics, and last but not least employees and their healthcare. At the end of the day, I doubt it will be a $37 chip.

Now, try doing the math on an Intel or AMD CPU. Better yet, a quad-core or extreme edition. Intel is making HUGE bank.


"I want people to see my movies in the best formats possible. For [Paramount] to deny people who have Blu-ray sucks!" -- Movie Director Michael Bay














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki