

NVIDIA GTX 280 3-Way SLI  (Source: NVIDIA)

NVIDIA GTX 280 GPU  (Source: NVIDIA)
The GPUs in the series include the GTX 280 and GTX 260

NVIDIA launched a new family of GPUs today called the GTX 200 series. The series currently comprises two GPUs -- the GTX 280 and the GTX 260. The GTX 280 is now NVIDIA's flagship GPU and sits above the 9800 GX2 in the lineup.

NVIDIA is stressing that the new GTX 200 family goes beyond gaming: the GPUs are among the most powerful processors in a PC and can be used for video rendering and other general-purpose work. NVIDIA says its goals for the GTX 200 architecture were to design a processor twice as powerful as the GeForce 8800 GTX, rebalance the architecture for future games with more complex shaders and more memory, improve efficiency per watt and per square millimeter, enhance CUDA performance, and significantly reduce idle power requirements.
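As a concrete illustration of that "beyond gaming" pitch, the sketch below is a minimal, hypothetical CUDA kernel that brightens one grayscale video frame entirely on the GPU. The kernel and buffer names are illustrative only and are not part of any NVIDIA tooling.

    // Hedged sketch of the kind of non-gaming work the article refers to:
    // a trivial CUDA kernel that brightens an 8-bit grayscale frame on the GPU.
    #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void brighten(unsigned char* pixels, int count, int delta) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per pixel
        if (i < count) {
            int v = pixels[i] + delta;
            pixels[i] = (unsigned char)(v > 255 ? 255 : v);  // clamp to 8-bit range
        }
    }

    int main() {
        const int count = 1920 * 1080;            // one HD frame of grayscale pixels
        unsigned char* d_pixels;
        cudaMalloc(&d_pixels, count);
        cudaMemset(d_pixels, 100, count);         // dummy frame data

        // Launch enough 256-thread blocks to cover every pixel.
        brighten<<<(count + 255) / 256, 256>>>(d_pixels, count, 40);
        cudaDeviceSynchronize();

        cudaFree(d_pixels);
        printf("frame processed on the GPU\n");
        return 0;
    }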

NVIDIA says the GTX 200 line provides nearly a teraflop of computational power. The GTX 200 family also supports PhysX-powered physics processing directly on the GPU. Both the new GTX 280 and GTX 260 support SLI and 3-way SLI; the previous NVIDIA 9800 GX2 could not support 3-way SLI.
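A rough sanity check of the "nearly a teraflop" figure, assuming the commonly cited three single-precision FLOPs per shader core per clock (a multiply-add plus a multiply), works out as follows:

    // Back-of-the-envelope peak throughput for the GTX 280 (assumption: each of
    // the 240 cores retires a MAD plus a MUL, i.e. 3 FLOPs, per shader clock).
    #include <cstdio>

    int main() {
        const double cores       = 240;     // GTX 280 processing cores
        const double clock_ghz   = 1.296;   // shader (processor) clock in GHz
        const double flops_clock = 3;       // MAD (2 FLOPs) + MUL (1 FLOP)
        printf("Peak: %.0f GFLOPS\n", cores * clock_ghz * flops_clock);  // ~933 GFLOPS
        return 0;
    }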

Key features in the new GTX 200 GPUs include support for three times the number of threads in flight at any given time, and a new scheduler design that allows for 20% more texturing efficiency. The GTX 280 has a 512-bit memory interface, and full-speed raster-operation (ROP) frame blending is supported. The GTX 200 series also features twice the number of registers for longer and more complex shaders, as well as IEEE 754R double-precision floating point. The GTX 200 line also supports 10-bit color scanout, via DisplayPort only.
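Of those features, double precision is the one most visible to CUDA developers. The sketch below is a hypothetical AXPY kernel that simply shows 'double' arithmetic running on the GPU; GT200 corresponds to CUDA compute capability 1.3, the first GeForce generation with native double-precision support. All names here are illustrative.

    // Hedged sketch: a trivial double-precision AXPY kernel. GT200 (compute
    // capability 1.3) is the first GeForce generation to run 'double' math
    // natively; with period tools this would be built with nvcc -arch=sm_13.
    #include <cuda_runtime.h>

    __global__ void axpyDouble(double a, const double* x, double* y, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per element
        if (i < n) y[i] = a * x[i] + y[i];               // evaluated in 64-bit precision
    }

    int main() {
        const int n = 1024;
        double *x, *y;
        cudaMalloc(&x, n * sizeof(double));
        cudaMalloc(&y, n * sizeof(double));
        cudaMemset(x, 0, n * sizeof(double));            // dummy input data
        cudaMemset(y, 0, n * sizeof(double));

        axpyDouble<<<(n + 255) / 256, 256>>>(2.0, x, y, n);
        cudaDeviceSynchronize();

        cudaFree(x);
        cudaFree(y);
        return 0;
    }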

One of the main goals with the GTX 200 line was improved power management. Both GTX 200 series GPUs idle at about 25W, draw around 35W during Blu-ray playback, and vary under full 3D load, topping out at 236W for the GTX 280. The GTX 200 line is also compatible with HybridPower, which can shut the discrete GPU down and hand display duties to the motherboard GPU, making the card's power draw effectively 0W.

The GTX 280 is built on a 65nm process and has 1.4 billion transistors. Stock cards have a graphics clock of 602 MHz, a processor clock of 1,296 MHz, and a memory clock of 2,214 MHz. The GTX 280 has a 512-bit memory interface, 1GB of GDDR3 memory, 240 processing cores, and 32 ROPs. The maximum board power is 236W.

The GTX 260 is also built on the 65nm process and has 1.4 billion transistors. The graphics clock for the GTX 260 is 576 MHz, the processor clock is 1,242 MHz, and the memory clock is 1,998 MHz. The GTX 260 has a 448-bit memory interface, 896MB of GDDR3 memory, 192 processing cores, and 28 ROPs. The maximum board power is 182W.
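From those clocks and bus widths, the peak memory bandwidth of each card can be estimated (assuming the quoted memory clocks are effective GDDR3 transfer rates):

    // Rough memory-bandwidth estimate from the quoted clocks and bus widths
    // (assumption: the listed memory clocks are effective GDDR3 transfer rates).
    #include <cstdio>

    static double bandwidth_gbs(double effective_mhz, int bus_bits) {
        // transfers per second * bytes per transfer, scaled to GB/s
        return effective_mhz * 1e6 * (bus_bits / 8.0) / 1e9;
    }

    int main() {
        printf("GTX 280: %.1f GB/s\n", bandwidth_gbs(2214.0, 512));  // ~141.7 GB/s
        printf("GTX 260: %.1f GB/s\n", bandwidth_gbs(1998.0, 448));  // ~111.9 GB/s
        return 0;
    }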

Both video cards support PhysX processing on the GPU; NVIDIA acquired PhysX developer Ageia in early 2008.

GTX 280 video cards will be available tomorrow for $649 and the GTX 260 cards will be available on June 26 for $399.



Comments

RE: Great Achievement
By gyranthir on 6/16/2008 3:38:22 PM , Rating: 1
No DX 10.1 isn't a big loss, because the card already supports the only really useful part of DX 10.1.

http://www.anandtech.com/video/showdoc.aspx?i=3334...

"We support Multisample readback, which is about the only dx10.1 feature (some) developers are interested in. If we say what we can't do, ATI will try to have developers do it, which can only harm pc gaming and frustrate gamers."

Requiring both a 6-pin and an 8-pin connector annoys me greatly, as my beast of a computer would need a brand new PSU to run one. My 8800 GTX will have to suit me for a while, because I don't have $850 to waste on a new video card and a new PSU....


RE: Great Achievement
By JLL55 on 6/16/2008 3:46:03 PM , Rating: 4
quote:
No DX 10.1 isn't a big loss, because the card already supports the only really useful part of DX 10.1.

http://www.anandtech.com/video/showdoc.aspx?i=3334...

"We support Multisample readback, which is about the only dx10.1 feature (some) developers are interested in. If we say what we can't do, ATI will try to have developers do it, which can only harm pc gaming and frustrate gamers."

Requiring both a 6-pin and an 8-pin connector annoys me greatly, as my beast of a computer would need a brand new PSU to run one. My 8800 GTX will have to suit me for a while, because I don't have $850 to waste on a new video card and a new PSU....


If you continued to read the article on AnandTech, you would realize that they basically called BS on that comment. However, I completely agree with you on that last part. Requiring those extra pins is a big PITA, because if I wanted to stick one of these in my machine I would have to get a new PSU on top of buying the new video card. I can't wait to see the 4870 X2, and I'm praying it can be used without a new PSU.


RE: Great Achievement
By FITCamaro on 6/16/2008 3:53:06 PM , Rating: 2
Pretty sure the 4870X2 is going to require 2 8-pin connectors.


RE: Great Achievement
By Warren21 on 6/16/2008 5:11:42 PM , Rating: 2
I don't think it will. The vanilla HD 4870 may come with 2 x 6-pin PEG connectors, but it certainly doesn't need them; rather, they're there for extra clocking headroom. A better way to think of it is as an HD 4850 X2 with a bit of an OC. The 4850s only require one 6-pin, so I would assume one 75W 6-pin PEG + one 150W 8-pin PEG + the PCIe 1.1 75W slot (it must be backwards compatible) = 300W should be enough.


RE: Great Achievement
By gyranthir on 6/17/2008 11:14:48 AM , Rating: 2
Brilliant to rate me down for explaining NVIDIA's point of view as to why they excluded DirectX 10.1.

Let's not forget that DX 10.1 is Vista-only.

Something like 75-80% of gamers aren't using Vista because they see ZERO benefit from it, so the decision not to use DX 10.1 isn't really that big of a stretch anyway.

Latest hardware + lamest software = LOL for gaming.


RE: Great Achievement
By Mitch101 on 6/17/2008 1:19:55 PM , Rating: 3
No problem here running Vista and gaming on it. So maybe I don't get 130 FPS anymore, but my LCD monitor's refresh rate wouldn't allow it anyway. It's not like Linux would run Crysis any better; the engine exceeds today's hardware abilities, and I'm not sure it's necessary to push graphics that far after seeing Cinema 2.0.

Everyone knows that DX 10.1 gave ATI hardware in Assassin's Creed a what, 25-35% speed increase in some anti-aliasing modes. It didn't put ATI light-years ahead of NVIDIA, but it gave them enough of a boost to make the cards roughly equal. By that I mean you can play it on either system with the same level of visual quality.

Seriously, if Assassin's Creed was broken using DX 10.1, why doesn't anyone have a screenshot of one of those gameplay-ruining glitches we keep hearing about but haven't seen proof of? If DX 10.1 was so broken, where are all the screenshots and complaints to prove it? There seem to be more complaints asking to put DX 10.1 back than there ever were of anyone screaming that the visuals are broken.














