
DailyTech spends more quality time with NVIDIA's upcoming GeForce 8800GTX

NVIDIA is set to launch its upcoming G80-based GeForce 8800GTX and 8800GTS graphics cards next week; however, DailyTech snagged a GeForce 8800GTX board early to run a few quick benchmarks. The GeForce 8800GTX used for testing is equipped with 768MB of GDDR3 video memory on a 384-bit memory bus, as previously reported. Core and memory clocks are set at 575 MHz and 900 MHz, respectively. Other GeForce 8800 series features include 128-bit HDR with 16x anti-aliasing and NVIDIA’s Quantum Physics Engine.

Previous NVIDIA graphics cards in single card configurations were limited to lower levels of anti-aliasing. With the GeForce 8800 series, users can experience 16x anti-aliasing with only a single card. DailyTech has verified the option is available in the NVIDIA control panel.

The physical card itself is quite large, approximately an inch and a half longer than an AMD ATI Radeon X1950 XTX based card. It requires two PCI Express power connectors and occupies two expansion slots. An interesting tidbit of the GeForce 8800GTX is its two SLI bridge connectors toward the edge of the card. This is a first for a GeForce product, as SLI-compatible graphics cards typically have one SLI bridge connector.

Having two SLI bridge connectors onboard may allow users to equip systems with three G80 GeForce 8800 series graphics cards; with two bridge connectors per card, three cards can be linked together without any trouble. NVIDIA is expected to announce its nForce 680i SLI and 650i SLI chipsets alongside the GeForce 8800 series, and motherboards based on those chipsets are expected to have three PCI Express x16 slots.

Moving on to performance, DailyTech selected Half Life 2: Lost Coast, Quake 4, Prey and 3DMark06 for benchmarking. These games and applications were chosen because many other titles use the same game engines. In addition to performance tests, DailyTech was also able to measure power consumption.

The test system configuration is as follows:
  • Intel Core 2 Extreme QX6700
  • NVIDIA nForce 650i SLI based motherboard
  • 2x1GB PC2-6400
  • NVIDIA GeForce 8800GTX
  • PowerColor ATI Radeon X1950 XTX
  • Western Digital Raptor 150

Futuremark 3DMark06

[Chart: Radeon X1950 XTX vs. GeForce 8800GTX]

Kicking off the benchmarking festivities is 3DMark06. NVIDIA’s GeForce 8800GTX scores 59% higher than ATI’s current flagship. This isn’t too surprising as the GeForce 8800GTX has plenty of power.

Half Life 2 4xAA/16xAF 1600x1200

[Chart: Radeon X1950 XTX vs. GeForce 8800GTX]

Quake 4 4xAA 1600x1200

[Chart: Radeon X1950 XTX vs. GeForce 8800GTX]

Prey 4xAA/16xAF 1600x1200

[Chart: Radeon X1950 XTX vs. GeForce 8800GTX]

Half Life 2: Lost Coast loves the GeForce 8800GTX. Here the GeForce 8800GTX shows a significant performance gain over AMD’s ATI Radeon X1950 XTX of approximately 92%.

Quake 4 shows gains similar to Half Life 2: Lost Coast’s, with an approximate 92% improvement.

Prey is based on the same game engine as Quake 4. However, Prey shows a smaller performance difference between the GeForce 8800GTX and ATI Radeon X1950 XTX, though it’s still 60%.

Power Consumption

[Chart: Radeon X1950 XTX vs. GeForce 8800GTX]

Power consumption was measured using a Kill-A-Watt power meter, which measures the power supply’s draw directly from the wall outlet. The power supply used in the test system is a Thermaltake Toughpower with an efficiency rating of up to 85%.

DailyTech previously reported that NVIDIA recommends a 450-watt power supply for a single GeForce 8800GTX graphics card. This isn’t too far-fetched a recommendation, as power consumption of NVIDIA’s GeForce 8800GTX isn’t as bad as expected. Compared to AMD’s current flagship ATI Radeon X1950 XTX, the GeForce 8800GTX consumes only 24% more power at idle, and the difference under load shrinks to around 4%. Considering the performance differences, the GeForce 8800GTX is no worse than AMD’s ATI Radeon X1950 XTX in terms of performance-per-watt.
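As a quick sanity check on that 4% figure, here is a minimal Python sketch; the under-load wall-draw numbers of roughly 321W and 308W are assumptions taken from the discussion in the comment thread, not additional measurements:

```python
# Sanity check on the load-power gap between the two test systems.
# Wall-draw figures below are assumed values from the comment thread.
WALL_LOAD_8800GTX = 321.0   # watts at the wall, 8800GTX system under load
WALL_LOAD_X1950XTX = 308.0  # watts at the wall, X1950 XTX system under load

load_gap = WALL_LOAD_8800GTX / WALL_LOAD_X1950XTX - 1.0
print(f"load power gap: {load_gap:.1%}")  # roughly the 4% cited above
```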

Expect NVIDIA’s GeForce 8800GTX and 8800GTS graphics cards to be available next week. As NVIDIA has had plenty of time to ramp up production and ship out cards, this will be a hard launch with immediate availability.

Comments

RE: Power
By xFlankerx on 11/4/2006 8:55:30 PM , Rating: 2
Thanks, also to Joepublic2, for clearing that up. So in your scenario, we would need 340W supplied, and a 485W PSU @ 70% efficiency (70% is common) to supply it, correct?

So for the 8800GTX, the system was pulling 273W from the PSU, so you would need 273W/.7 = 390W PSU @ 70% efficiency to supply that. That makes more sense, haha. So the 450W PSU recommendation isn't too bad, and gives some headroom as well.

RE: Power
By Lord Evermore on 11/4/2006 10:22:01 PM , Rating: 5
The PSU would need to be called a 340W PSU, indicating it can supply up to 340W (actually it'd need to be higher since they never can supply their max rating continuously). The actual draw at the wall would be 485W at 70% efficiency, with 30% of that being lost as heat in the PSU. With an 85% efficiency PSU, it'd only need to draw 400W at the wall, but it would still be rated as a 340W PSU.

Whatever amount the system pulls from the PSU is what the PSU has to be rated for, not what it pulls from the wall. PSU rating tells you what it can supply to the components, efficiency tells you how much higher the actual draw at the wall will be.

If the 8800 system draws 321W at the wall, at 85% efficiency it's only sending 272.85W to the components. So you need a 273W-rated PSU. For the Radeon, 308W at the wall is 261.8W to the components.
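In code, the wall-to-component conversion above is just a multiplication by the efficiency factor; a minimal sketch, assuming the 85% figure quoted earlier:

```python
# Wall draw vs. component draw: a PSU's rating describes what it can
# deliver to the components; wall draw is higher by the efficiency factor.
EFFICIENCY = 0.85  # the Toughpower's quoted (up to) 85% efficiency


def delivered_power(wall_watts: float, efficiency: float = EFFICIENCY) -> float:
    """Watts actually reaching the components; the rest is shed as heat."""
    return wall_watts * efficiency


print(delivered_power(321.0))  # ~272.85W for the 8800GTX system
print(delivered_power(308.0))  # ~261.8W for the X1950 XTX system
```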

The actual percentage differences (4% at load) stay the same though whether you use the wall or actual output, because the amount of difference between the two cards, in terms of how much they add to the draw from the wall, is affected by the same 85% efficiency difference.

Given the components besides the video cards, 150W when the system is fully loaded probably isn't unlikely for the draw for the rest of the system. That would give a 9.9% higher draw for the 8800 itself at load.
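That 9.9% figure can be reproduced by subtracting the assumed 150W for the rest of the system from the component-side draws; note the 150W and the wall figures are this thread's estimates, not measured values:

```python
# Isolating each card's own draw by subtracting an assumed 150W for
# everything else in the system (CPU, drives, board) under load.
EFFICIENCY = 0.85
REST_OF_SYSTEM_LOAD = 150.0  # assumed draw of all non-GPU components

card_8800gtx = 321.0 * EFFICIENCY - REST_OF_SYSTEM_LOAD   # ~122.85W
card_x1950xtx = 308.0 * EFFICIENCY - REST_OF_SYSTEM_LOAD  # ~111.8W
print(f"{card_8800gtx / card_x1950xtx - 1.0:.1%}")  # ~9.9% higher for the 8800
```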

Assume 75W for all other components when idle (which sounds pretty low to me), then the 8800 is drawing 46.8% more power. Even if you use 100W for all other components under load or idle, no matter how you cut it, at idle and presumably during any plain 2D work, the Radeon is astoundingly more efficient, but ramps up power draw way faster when loaded. Of course when you consider how much performance you're getting out of the 8800, the per-watt performance is much better at load.

At load using the above, in HL2, the 8800 gets .95 frames per watt, while the Radeon gets .54. That's a 76% increase in frames per watt for the 8800. It's 79% higher if you assume 100W for the other components.
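A quick check of that 76% claim, taking the two frames-per-watt estimates above as given:

```python
# Frames-per-watt comparison using the estimates from the comment above.
FPW_8800GTX = 0.95   # HL2 frames per watt, 8800GTX (estimated)
FPW_X1950XTX = 0.54  # HL2 frames per watt, X1950 XTX (estimated)

print(f"{FPW_8800GTX / FPW_X1950XTX - 1.0:.0%}")  # ~76% more frames per watt
```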

Of course, this is the next generation compared to current generation.

Since most systems are actually idling the majority of the time, or running 2D apps, the 8800 would end up costing you more to use, while not giving you any difference in performance the majority of the time. But when you do need the speed, it costs relatively less per unit performance, but costs more overall due to higher performance.

RE: Power
By xFlankerx on 11/5/2006 12:44:53 AM , Rating: 1
Lol, so you're saying that you can run an 8800GTX in your system with a 300W PSU?

RE: Power
By Lord Evermore on 11/5/2006 5:38:21 AM , Rating: 4
I'm not. DailyTech is, just not in so many words. If you had a really great quality PSU, 300W would cut it for this specific configuration. Realistically, though, you'd want at least a 350W unit: PSUs don't like to run close to maximum continuously, we don't know what sort of fans are in this test case, and particular systems will differ unless you have exactly the same components. At 300W there's only a bit over 2A of headroom beyond the component draw in this test, and you don't know what parts might need a little boost. In particular, hard drives use more power at spin-up than when running, a CDROM spinning up needs more, and video cards may need max power at boot.

This is why Dell and other OEMs have gotten away with putting 250W and 300W PSUs into P4 systems and the like for so long. They were just enough to run the system as configured.

RE: Power
By mindless1 on 11/5/2006 11:09:28 PM , Rating: 2
Dell et al. also got by because they were buying supplies rated for sustained output rather than peak, so the rating on the label was one the PSU could actually deliver continuously.

RE: Power
By Spivonious on 11/7/2006 2:21:45 PM , Rating: 2
Thank you very much for clearing that up. I was always wondering why we "need" 800W power supplies when system draw from the wall was always under 400W.

