
DailyTech spends more quality time with NVIDIA's upcoming GeForce 8800GTX

NVIDIA is set to launch its upcoming G80 GeForce 8800GTX and 8800GTS graphics cards next week; however, DailyTech snagged a GeForce 8800GTX board to run a couple of quick benchmarks. The GeForce 8800GTX used for testing is equipped with 768MB of GDDR3 video memory on a 384-bit memory bus, as previously reported. Core and memory clocks are set at 575 MHz and 900 MHz, respectively. Other GeForce 8800 series features include 128-bit HDR with 16x anti-aliasing and NVIDIA’s Quantum Physics Engine.

Previous NVIDIA graphics cards in single card configurations were limited to lower levels of anti-aliasing. With the GeForce 8800 series, users can experience 16x anti-aliasing with only a single card. DailyTech has verified the option is available in the NVIDIA control panel.

The physical card itself is quite large, approximately an inch and a half longer than an AMD ATI Radeon X1950 XTX based card. It requires two PCI Express power connectors and occupies two expansion slots. An interesting tidbit of the GeForce 8800GTX is its two SLI bridge connectors toward the edge of the card. This is a first for a GeForce product, as SLI-compatible graphics cards typically have one SLI bridge connector.

Having two SLI bridge connectors onboard may allow users to equip systems with three G80 GeForce 8800 series graphics cards, as three cards can be chained together without any trouble. NVIDIA is expected to announce its nForce 680i SLI and 650i SLI chipsets alongside the GeForce 8800 series. NVIDIA nForce 680i SLI and 650i SLI based motherboards are expected to have three PCI Express x16 slots.

Moving on to performance, DailyTech selected Half Life 2: Lost Coast, Quake 4, Prey and 3DMark06 for benchmarking. These titles were chosen because many other games use the same engines. In addition to performance tests, DailyTech was also able to measure power consumption.

The test system configuration is as follows:
  • Intel Core 2 Extreme QX6700
  • NVIDIA nForce 650i SLI based motherboard
  • 2x1GB PC2-6400
  • NVIDIA GeForce 8800GTX
  • PowerColor ATI Radeon X1950 XTX
  • Western Digital Raptor 150

Futuremark 3DMark06


Kicking off the benchmarking festivities is 3DMark06. NVIDIA’s GeForce 8800GTX scores 59% higher than ATI’s current flagship. This isn’t too surprising as the GeForce 8800GTX has plenty of power.

Half Life 2 4xAA/16xAF 1600x1200


Quake 4 4xAA 1600x1200


Prey 4xAA/16xAF 1600x1200


Half Life 2: Lost Coast loves the GeForce 8800GTX. Here the GeForce 8800GTX is able to show significant performance gains over AMD’s ATI Radeon X1950 XTX—approximately 92%.

Quake 4 shows gains similar to Half Life 2: Lost Coast, with an approximately 92% improvement.

Prey is based on the same game engine as Quake 4. However, Prey shows a smaller performance difference between the GeForce 8800GTX and ATI Radeon X1950 XTX, albeit still 60%.

Power Consumption

Power consumption was measured using a Kill-A-Watt power meter that measures a power supply’s power draw directly from the wall outlet. The power supply used in the test system is a Thermaltake Toughpower that carries an efficiency rating up to 85%.

DailyTech previously reported that NVIDIA recommends a 450-watt power supply for a single GeForce 8800GTX graphics card. This isn’t too farfetched a recommendation. Power consumption of NVIDIA’s GeForce 8800GTX isn’t as bad as expected. When compared to AMD’s current flagship ATI Radeon X1950 XTX, the GeForce 8800GTX consumes only 24% more power at idle. The power consumption difference under load decreases to around 4%. Considering the performance differences, the GeForce 8800GTX is no worse than AMD’s ATI Radeon X1950 XTX in terms of performance-per-watt.
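The percentage comparison above is straightforward to sanity-check. In the sketch below, the wall-outlet wattages are illustrative assumptions (in line with load figures discussed in the comments), not exact chart values:

```python
# Hedged sketch: recomputing the power-draw delta described above.
# The wattages are illustrative assumptions, not the article's exact
# chart values (the charts are not reproduced in the text).

def pct_increase(base, new):
    """Percentage increase of `new` over `base`."""
    return (new - base) / base * 100.0

x1950_load = 308.0  # assumed whole-system draw at load, watts
gtx_load = 321.0    # assumed whole-system draw at load, watts

print(f"Load delta: {pct_increase(x1950_load, gtx_load):.1f}%")  # ~4.2%
```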

Expect NVIDIA’s GeForce 8800GTX and 8800GTS graphics cards to be available next week. As NVIDIA has had plenty of time to ramp up production and ship out cards, this will be a hard launch with immediate availability.

Comments

2 power connectors?!?
By Myrandex on 11/3/2006 7:49:41 AM , Rating: 2
It looks pretty sweet, and that black PCB and heatsink is a real attention grabber, however 2 power connectors seems pretty absurd, even with all of that power. Under load it consumes only 4% more power than the ATI; they should be able to squeeze 4% more through 1 connection :-/

RE: 2 power connectors?!?
By coldpower27 on 11/3/2006 8:02:57 AM , Rating: 3
They are being careful not to load the PCI-E 6-pin power connectors too heavily.

Also remember that is system power consumption, so when you compare the difference between the actual video cards themselves it is more than a measly 4%.

For example, the 7900 GT has one power connector but only consumes 48W or so, which is well under the 75W limit of the PCI-E slot alone, so it doesn't technically need the power connector; nevertheless, it has it to be on the safe side.

Same kind of issue here on the 8800 GTX. I wonder how it would fare against the older, more power-hungry X1900 XTX.

RE: 2 power connectors?!?
By Oxygenthief on 11/3/2006 9:38:14 AM , Rating: 5

I don't think anyone is too concerned with overloading the "power connectors" as you stated.

The simple fact is that with the ATX 2.2 PSU standard, a single 12V rail on a power supply cannot exceed ~20 amps. The 1900 series pulls much more than this and only runs well with PSUs designed to "violate" the ATX 2.0 and 2.2 standards.

NVIDIA's move to work within the ATX 2.2 standard will help the rest of us by allowing us to use mid-range PSUs with single 12V rails instead of the high-end four-rail $200+ PSUs.

By distributing the load across two different rails the G80 can pull more power without causing instability issues and thus is a better product.

It is obvious from the numbers shown in this article that it doesn't pull much more power than the 1950s. Again, it's a stability issue.

RE: 2 power connectors?!?
By coldpower27 on 11/3/2006 9:42:44 AM , Rating: 4
Which is exactly what I am getting at, a level of redundancy acts for stability.

Since they don't have to pull all the juice through a single rail, which you alluded they can't, the load is distributed across 2. So the product becomes more stable in the end.

RE: 2 power connectors?!?
By lopri on 11/3/2006 1:33:37 PM , Rating: 2
So is it redundancy or safety? They're not the same, and I'm curious to know which one it is.

RE: 2 power connectors?!?
By feelingshorter on 11/3/2006 2:55:57 PM , Rating: 2
What are you asking? Redundancy is safety. Why do people who skydive have two parachutes? If your computer "flicks off" from a loss of power due to a weak PSU, your video card could fry due to a power surge. By spreading the power across two 12V lines, there is redundancy built into the video card, should the PSU not have that redundancy. Good PSU manufacturers will underestimate their amp ratings; bad ones won't. This can easily be checked by finding the max output per line, adding it up, and comparing it to what the manufacturer is marketing the PSU as. Just go to websites that benchmark PSUs in a max-overload test and see what max amps you can run them at.

RE: 2 power connectors?!?
By othercents on 11/3/2006 4:10:57 PM , Rating: 3
You're not really getting redundancy until you are using two separate PSUs. When a PSU shorts out, it usually takes everything at one time instead of just a part of it. Plus you still have a single connector on the motherboard, making it impossible for the computer to be totally redundant.

They put two connectors on the video card because they believed it would require that much power. Now if you had three of these bad boys in a computer, you are going to need a decent-size PSU. Total output should be around 800 or 900W with at least three 20A 12V rails.


RE: 2 power connectors?!?
By Clauzii on 11/3/2006 9:53:28 PM , Rating: 2
The only thing is that doubling the number of connections means a THEORETICALLY higher failure rate.

RE: 2 power connectors?!?
By mindless1 on 11/5/2006 11:00:30 PM , Rating: 2
Untrue. When the additional power consumption is minimal over a card having one connector, it is conceivable the card could continue running from only one. Having two, there is actually redundancy in the mechanical portion of the supply chain and, ideally, roughly halved current through each connector, which might also improve connector reliability.

RE: 2 power connectors?!?
By Clauzii on 11/9/2006 8:20:32 PM , Rating: 1
Unless that card exceeds the ATX specification on one connector: the load that was previously shared equally across two connectors now runs through one, and if the spec is exceeded, it might be able to blow the supply.

RE: 2 power connectors?!?
By Griswold on 11/3/2006 12:26:53 PM , Rating: 2
I don't think anyone is too concerned with overloading the "power connectors" as you stated.

Maybe not in the case of the 8800GTX, but in general. Have you ever seen a plug like that melt? You can kiss your whole computer goodbye if that happens.

RE: 2 power connectors?!?
By johnsonx on 11/3/2006 1:58:00 PM , Rating: 2
The simple fact is with the ATX 2.2 PSU standard single 12V rails on a power supply cannot exceed ~20 amps. The 1900 series pulls much more than this and only runs well with psu's designed to "violate" the ATX 2.0 and 2.2 standards.

12v at 20 amps is 240 watts. No graphics card draws that much. I'm not arguing with your general point, as I do not know whether you are correct or not. However your stated reason cannot be correct, at least not as you have worded it.

RE: 2 power connectors?!?
By Goty on 11/3/2006 2:44:08 PM , Rating: 2
I run an X1900XT, a Laing DDC 12V water pump, an overclocked A64 X2, and a number of other peripherals on a PSU with a single 12V rail and I have no issues....

RE: 2 power connectors?!?
By Ard on 11/3/2006 3:41:54 PM , Rating: 2
How do you figure that the difference btw the two cards themselves is more than 4%? We have two identical systems here. The only thing that changes is the video card. The only way what you say could be valid is if the 8800 isn't pumping out as much juice as possible. Other than that, because the difference btw these two systems is 4%, we know that the difference btw the only variable in those systems (the two cards) is also 4%.

RE: 2 power connectors?!?
By howardluo on 11/3/2006 4:05:14 PM , Rating: 2
Do you need a math lesson? First take out the constants (the rest of the system components), then calculate the percentage difference between the two cards. Then tell me if it's 4% or not.

RE: 2 power connectors?!?
By Ard on 11/4/2006 1:37:31 AM , Rating: 2
The proper question is, do YOU need a math lesson? Let's compute it shall we? Say the constant draw of both systems, without any vid card, is 100W. What does that tell us? That the X1950 is sucking down 208W and G80 is sucking down 221W. What's the difference btw those two cards? Approx. 5-6%.

The difference will clearly shrink/grow depending on the constant draw of both systems (higher constant draw, higher difference in the two cards), but the point remains that because we know that the difference between the two complete systems is only 4%, we know that the difference between the two individual cards is going to be damn close to that. 5-6%, and that might be a liberal value, is certainly not the "more than a measly 4%" I was expecting from such a comment.

RE: 2 power connectors?!?
By howardluo on 11/4/2006 12:51:10 PM , Rating: 2
You have just countered your own statement and supported mine. You first stated that the difference between the two cards is 4% because the difference between the two systems is 4%. Now you state that it is more than 4%, which was my point of argument to begin with. I just gave you a math lesson.

RE: 2 power connectors?!?
By coldpower27 on 11/4/2006 1:51:23 PM , Rating: 2
Your numbers are too high for the X1950 XTX, since all it has access to is one PCI-E 6-pin power connector plus the PCI-E slot, limiting the power draw to 150W. The same goes for the 8800 GTX.

As well, it is unlikely that it could be drawing that much, as NVIDIA and ATI would leave some safety margin in their designs. So even 150W for the X1950 XTX would be deemed a lot and highly unlikely, given that NVIDIA already uses a PCI-E 6-pin connector when the load exceeds 50W for the 7900 GT.

I consider 8-10% more than a measly 4%, since they are double.

RE: 2 power connectors?!?
By Spoelie on 11/4/2006 4:21:34 PM , Rating: 2
u da funnah.
Your twisted theory only works when you take the constants that small. If the system would be sucking down 300W, then the numbers go XTX 8W and GTX 21W. Now your 'almost the same percentage' is 260% versus 4%. And since these numbers were pulled out of my arse, just like yours were, they are just as valid.

We do not know the constant factor, we can only say "14W more". Making a statement that percentage points do not change significantly because of constants is retarded.

RE: 2 power connectors?!?
By coldpower27 on 11/3/2006 4:09:19 PM , Rating: 2
Simply because the isolated video cards' power consumption will be a difference between 100-something watts and 100-something watts. Since the values are lower, the percentage difference will definitely increase.

308W to 321W is 4%.

For instance, 128W to 141W is 10%. I am not saying those are the exact figures, mind you; I am just giving those numbers as an example.

The absolute difference remains the same since, like you said, the only difference between the two systems is the video card.
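The arithmetic in this comment can be sketched as follows; the 180W rest-of-system constant is hypothetical, chosen only to reproduce the 128W/141W example above:

```python
# Sketch of the point above: subtracting a constant rest-of-system draw
# from both wall readings widens the percentage gap between the cards.
# The 180W constant is a hypothetical value matching the 128W/141W example.

def card_only_gap(sys_a, sys_b, rest_of_system):
    """Percentage gap between the two cards after removing the shared draw."""
    card_a = sys_a - rest_of_system
    card_b = sys_b - rest_of_system
    return (card_b - card_a) / card_a * 100.0

print(card_only_gap(308, 321, 0))    # whole-system gap: ~4.2%
print(card_only_gap(308, 321, 180))  # card-only gap: ~10.2%
```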

RE: 2 power connectors?!?
By Clauzii on 11/3/2006 9:56:40 PM , Rating: 2
Remember that efficiency is different with different loads.

RE: 2 power connectors?!?
By coldpower27 on 11/4/2006 1:53:44 PM , Rating: 2
That is something to be concerned about, but I am going to put some faith in DailyTech that they didn't use an inefficient PSU and that the efficiency is in the 75%-85% range.

RE: 2 power connectors?!?
By Anh Huynh on 11/4/2006 4:05:53 PM , Rating: 2
75% efficiency and this is the power draw of the whole system from the wall outlet. Speedstep was turned off.

RE: 2 power connectors?!?
By mindless1 on 11/5/2006 11:05:07 PM , Rating: 2
Not necessarily, as when the video card is reaching higher performance it will tend to utilize the CPU and memory subsystems more, too, increasing their power consumption.

RE: 2 power connectors?!?
By Ebbyman on 11/3/2006 9:25:54 AM , Rating: 2
I read somewhere that the card is CPU limited at this point. Higher power requirements and performance will likely come with faster chips and/or overclocked chips. So the power consumption might be lower than what it is capable of using.

RE: 2 power connectors?!?
By Aikouka on 11/3/2006 3:12:08 PM , Rating: 2
Now, this is only my speculation...

Now, I don't know how viable this would be, and it'd still probably require a PSU with 3+ true 12V rails, but if we believe the card doesn't need the power from two 12V PCI-E 6-pin connectors, perhaps it's for the same purpose as having separate rails in the first place? I'm wondering if the point of the two connectors is to provide two separate power "feeds" into the card.

Based on this, the separation of power streams may help stability by reducing noise. I can't be definitive as I'm no electrical engineer (I'm a software engineer :P), but hey, it's something to mull over :).

RE: 2 power connectors?!?
By yipsl on 11/6/2006 11:19:01 PM , Rating: 2
Keep in mind that it's next generation Nvidia vs. this generation ATI. We'll see how it compares when ATI gets DX10 parts out.

I'm an ATI fan because of the All in Wonder cards. I've preferred a combo gaming/video recording card since the Radeon AIW 8500 128 days.

Still, I like the idea of DX10 parts arriving before DX10 games. I wonder what's taking ATI so long; they should have it down based on what they did with the Xbox 360's graphics.

By Spivonious on 11/3/2006 10:39:01 AM , Rating: 2
So why would anyone need more than a 400W power supply?

RE: Power
By killerroach on 11/3/2006 11:33:12 AM , Rating: 1
For the same reason that 640K of RAM wasn't "good enough for anybody", methinks. Even before my recent power supply upgrade, I had over a 400 in my own system (albeit 420 watts), but, when it decided to start going flaky on me, I picked up a 580W PSU. Yes, it's absolutely overkill at the moment, but it gives you headroom for future upgrades, not to mention these newer high-wattage PSUs are able to provide cleaner and more efficient power to your particularly twitchy components. For those of us who do any overclocking at all, both of these are rather important things, especially the "cleaner" part.

Now, if only the rest of my system was as nice as the power supply...

RE: Power
By Griswold on 11/3/2006 12:38:38 PM , Rating: 3
For the same reason that 640K of RAM wasn't "good enough for anybody"

Scroll down to "Misattributions". Claiming that nobody actually ever said that wouldn't be too far from the truth, I think.

Let this urban myth die already.

RE: Power
By BladeVenom on 11/3/2006 7:14:20 PM , Rating: 4
He never attributed it to Bill Gates. But if you want some amusing ones with, I hope, correct attribution.

"There is no reason anyone would want a computer in their home." Ken Olson, president/founder of Digital Equipment Corp., 1977

"I think there is a world market for maybe five computers." -Thomas Watson, chairman of IBM, 1943

RE: Power
By Clauzii on 11/10/2006 8:05:23 PM , Rating: 2
And having headroom means, at least for temperature-controlled air-cooled supplies, that it will run more quietly. I use an NQ4001 (400W) supply on a machine that, measured at the wall outlet, draws 212W.

I NEVER hear the supply fan :)

RE: Power
By Jkm3141 on 11/3/2006 6:53:25 PM , Rating: 2
If there is a 400-watt power draw, a 400-watt power supply will not suffice. No power supply is 100% efficient, and only the best PSUs are 85%. So to get 400 watts you would need like 500-550 at least to make up for the inefficiencies of the PSU.

RE: Power
By xFlankerx on 11/4/2006 1:49:11 AM , Rating: 2

Pay attention to that, people. Important. Don't think that a 400W will do you any good. A 550W PSU @ 70% efficiency (normal) is what you need for this. I don't know why NVIDIA would say 450W; that'd be a very dangerous minimum, and with a super-efficient PSU.

RE: Power
By Joepublic2 on 11/4/2006 2:52:42 AM , Rating: 2
Those figures are measured from the wall:

Power consumption was measured using a Kill-A-Watt power meter that measures a power supply’s power draw directly from the wall outlet. The power supply used in the test system is a Thermaltake Toughpower that carries an efficiency rating up to 85%.

Power supplies are rated by how much current they can continuously supply, not how much they draw from the wall.

321 * .85 = ~273W draw from the system.

RE: Power
By saratoga on 11/4/2006 6:34:28 PM , Rating: 2
That's not how it works. The efficiency does not impact how much power a supply can deliver, only how efficiently it delivers it. A 450W supply should deliver 450W (though many cheaper supplies are overrated!). If the efficiency sucks, it'll get hotter, but it will still deliver the power.

RE: Power
By ATWindsor on 11/4/2006 3:29:10 AM , Rating: 2
These measurements are at the wall; if you measure 400 watts at the wall, you have to calculate "the other way". If the efficiency is 85%, you would actually only need a 340-watt supply (in theory).


RE: Power
By xFlankerx on 11/4/2006 8:55:30 PM , Rating: 2
Thanks, also to Joepublic2, for clearing that up. So in your scenario, we would need 340W supplied, and a 485W PSU @ 70% efficiency (70% is common) to supply it, correct?

So for the 8800GTX, the system was pulling 273W from the PSU, so you would need 273W/.7 = 390W PSU @ 70% efficiency to supply that. That makes more sense, haha. So the 450W PSU recommendation isn't too bad, and gives some headroom as well.

RE: Power
By Lord Evermore on 11/4/2006 10:22:01 PM , Rating: 5
The PSU would need to be called a 340W PSU, indicating it can supply up to 340W (actually it'd need to be higher since they never can supply their max rating continuously). The actual draw at the wall would be 485W at 70% efficiency, with 30% of that being lost as heat in the PSU. With an 85% efficiency PSU, it'd only need to draw 400W at the wall, but it would still be rated as a 340W PSU.

Whatever amount the system pulls from the PSU is what the PSU has to be rated for, not what it pulls from the wall. PSU rating tells you what it can supply to the components, efficiency tells you how much higher the actual draw at the wall will be.

If the 8800 system draws 321W at the wall, at 85% efficiency it's only sending 272.85W to the components. So you need a 273W-rated PSU. For the Radeon, 308W at the wall is 261.8W.

The actual percentage differences (4% at load) stay the same though whether you use the wall or actual output, because the amount of difference between the two cards, in terms of how much they add to the draw from the wall, is affected by the same 85% efficiency difference.

Given the components besides the video cards, 150W probably isn't an unlikely draw for the rest of the system when fully loaded. That would give a 9.9% higher draw for the 8800 itself at load.

Assume 75W for all other components when idle (which sounds pretty low to me), then the 8800 is drawing 46.8% more power. Even if you use 100W for all other components under load or idle, no matter how you cut it, at idle and presumably during any plain 2D work, the Radeon is astoundingly more efficient, but ramps up power draw way faster when loaded. Of course when you consider how much performance you're getting out of the 8800, the per-watt performance is much better at load.

At load using the above, in HL2, the 8800 gets .95 frames per watt, while the Radeon gets .54. That's a 76% increase in frames per watt for the 8800. It's 79% higher if you assume 100W for the other components.

Of course, this is the next generation compared to current generation.

Since most systems are actually idling the majority of the time, or running 2D apps, the 8800 would end up costing you more to use, while not giving you any difference in performance the majority of the time. But when you do need the speed, it costs relatively less per unit performance, but costs more overall due to higher performance.
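The wall-draw vs. PSU-rating arithmetic in the comment above can be sketched like this, reusing the same 85% and 70% efficiency figures from the thread:

```python
# Sketch of the efficiency arithmetic above: a PSU is rated by what it can
# deliver to the components, while the wall draw is higher by the
# inefficiency. Figures mirror the examples discussed in the thread.

def delivered(wall_watts, efficiency):
    """Power reaching the components for a given wall draw."""
    return wall_watts * efficiency

def wall_draw(delivered_watts, efficiency):
    """Wall draw needed to deliver a given component load."""
    return delivered_watts / efficiency

print(delivered(321, 0.85))   # ~272.85 W to components (8800GTX system)
print(wall_draw(340, 0.70))   # ~485.7 W at the wall for a 70%-efficient PSU
```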

RE: Power
By xFlankerx on 11/5/2006 12:44:53 AM , Rating: 1
Lol, so you're saying that you can run a 8800GTX in your system with a 300W PSU?

RE: Power
By Lord Evermore on 11/5/2006 5:38:21 AM , Rating: 4
I'm not. DailyTech is, just not in so many words. If you had a really great quality PSU, 300W would cut it for this specific configuration. Although realistically, given the fact that PSUs don't like to run close to maximum continuously, and we don't know what sort of fans are in this test case, and differences between particular systems unless you got exactly the same components, you'd at least want a 350W, since at 300W that's only a bit over 2A more than the component draw in this test, and you don't know what parts might need a little boost. Particularly upon bootup, hard drives will use more power than when running, a CDROM spinning up needs more, video cards may need max power at boot.

This is why Dell and other OEMs have gotten away with putting 250W and 300W PSUs into P4 systems and the like for so long. They were just enough to run the system as configured.

RE: Power
By mindless1 on 11/5/2006 11:09:28 PM , Rating: 2
Dell et al also got by because they weren't buying the supplies rated for peak rather than sustained output.

RE: Power
By Spivonious on 11/7/2006 2:21:45 PM , Rating: 2
Thank you very much for clearing that up. I was always wondering why we "need" 800W power supplies when system draw from the wall was always under 400W.

By RussianSensation on 11/3/2006 9:17:23 AM , Rating: 5
Nice performance increases. On the other hand,

X1900 series is easily playable in HL2 and Prey at 60 and 55 fps respectively. What we really need is new software that is 2x more demanding since I'd rather play a game that looks 2x better than HL2 at 50-60fps than HL2 at 120 frames (which adds 0 to the single player experience of the game).

Come on software, catch up already!

By killerroach on 11/3/2006 9:30:31 AM , Rating: 3
You're also assuming that Prey and HL2 are particularly graphically demanding games... they run pretty well on a GeForce 6-series card, to say nothing of an 8800GTX. What might be more interesting to see is what this card does in games like F.E.A.R. and Oblivion, two much more visually intensive games that, until recently, have given sizable performance advantages to ATI silicon. If the 8800GTX is still able to be 50-60+% faster than an X1950XTX in those games, then it's going to be really hard for me to keep from picking one of these things up.

By FITCamaro on 11/3/2006 4:48:58 PM , Rating: 4
Seriously. Why didn't they do a benchmark using Oblivion, which is one of the most graphically demanding games out there, or with Black and White 2, which brings most cards to their knees? I kind of think they used games NVIDIA cards are known to be good at instead of ones that really push the envelope. Half Life 2 ran fine on my 3200+, 2GB RAM, and 6800GT system at high quality settings and 1280x1024.

By Frank M on 11/3/2006 9:36:04 AM , Rating: 1
Why? So that the 1% of the population who can/will drop 1K on this card can play? That wouldn't be a wise investment.

By Griswold on 11/3/2006 11:10:48 AM , Rating: 2
Just because a game can make use of the power of such cards doesn't mean it shouldn't run on lower specs (which in and of itself is a different story), but there are actually companies who deliver to everyone, regardless of wallet size.

By RussianSensation on 11/3/2006 11:51:08 AM , Rating: 2
That's not true at all. Look at Doom 3 when it came out. At 800x600 it still looked better than almost all games (besides Far Cry at the time). At the same time, you could crank it up to 1600x1200 if you had the horsepower. As it stands now, it usually takes software developers 1-2 years for us gamers to appreciate an investment in a $500 graphics card. Sure it's nice to have the extra power, but I want a benefit in the majority of my games today, not tomorrow.

I am not advocating that Geforce 8 is not necessary, since for some people I am sure it is, but every new generation when cards are released you are essentially paying a premium for the bragging rights as maybe at most 1-3 games really take advantage of it.

By Tsuwamono on 11/4/2006 7:16:36 AM , Rating: 2

Image quality
By Phynaz on 11/3/2006 10:45:27 AM , Rating: 3
Did Nvidia use some of this processing power to fix their image quality?

RE: Image quality
By imaheadcase on 11/3/06, Rating: -1
RE: Image quality
By Ard on 11/3/2006 1:44:42 PM , Rating: 2
Incorrect. There are differences in image quality between the two cards, AF being the most striking at this point. That being said, yes, some of this processing power has been devoted to image quality. NVIDIA now has angle-independent AF and improved AA.

RE: Image quality
By gramboh on 11/3/06, Rating: -1
RE: Image quality
By Phynaz on 11/3/2006 3:31:18 PM , Rating: 5
#2) Turn on high quality mode with NV driver then it is fine

The smart-ass reply would be to ask how you know this, since you don't have an 8-series NV card in your possession.

The real reply is along the lines of: while NV high quality mode is roughly equivalent to ATI normal quality, this is very rarely how the cards are benchmarked. Secondly, there is nothing you can do on a 6 or 7 series NV card to fix the texture shimmering issue.

RE: Image quality
By Chillin1248 on 11/3/2006 4:44:23 PM , Rating: 3
What is ironic is that the ill-fated 5800 Ultra still has the best quality AF to date, if you don't believe this then head over to and look at their AF tests on the 5800 and then on the 6/7 series and the ATI X8xx/X18xx series.


RE: Image quality
By Xavian on 11/3/06, Rating: 0
RE: Image quality
By Ard on 11/4/06, Rating: 0
RE: Image quality
By Chillin1248 on 11/4/2006 10:30:10 AM , Rating: 2
Actually the Inquirer stole that story directly from who were the first to break the story.


RE: Image quality
By Hare on 11/4/2006 11:00:38 AM , Rating: 2
It's funny that you laugh at the Inq considering that they aired G80's specs a few months ago, before DailyTech or anyone else.

The inq has published maybe 50 different specifications for the G80. It's just statistics. Guess enough and sometimes you get it right...

RE: Image quality
By otispunkmeyer on 11/6/2006 5:20:00 AM , Rating: 2
Yes, they have.

They have a new AF mode which is like ATI's HQAF (angle independent), and they also offer 16xAA from a single card.

There is also, I think, a mode called 16xQ (AA) which provides select areas on screen with 16xAA, and it has the same performance hit as 4xAA.

I think this mode picks certain edges (trees, power lines, contrasting edges, where jaggies usually reside and are easy to see) and applies more AA to them, while applying less AA to other areas where jaggies are less obvious.

By clayclws on 11/3/2006 12:36:57 PM , Rating: 5
FEAR, GRAW, Battlefield, Oblivion, Gothic 3, NFS: Carbon, W:MoC, Medieval 2... great games with great graphics, depending on your tastes. I don't see that a lot of them are ports from consoles.

Although I agree that I won't be shelling out RM2000++ (USD 540++) for one of these cards, I don't think PC games are at their lowest point in 10+ years. Take a look at what we have now and what these cards are meant for...Alan Wake, Crysis, C&C3, Supreme Commander, etc. I don't think many people would say that PC games are at their worst.

If you think console games are better, then, I'd say, "To each, his own." I love console games, but my root is still PC games.

By shabby on 11/4/2006 11:22:41 PM , Rating: 1
Carbon? Please...

By clayclws on 11/5/2006 12:05:38 AM , Rating: 2
To each, his own. I don't play racing games, but hell lots of people love em, especially with Carbon here. In reality, Carbon is the type of racing you get in most Asian countries. Familiarity breeds fanaticism...or something like that.

By goku on 11/4/2006 6:48:52 AM , Rating: 2
I would hardly compare FEAR to Oblivion. Oblivion actually had a decent game and storyline; FEAR, not so much. FEAR was overhyped, not Oblivion, because nobody saw Oblivion coming. FEAR had demos and all this hype prior to release last October; Oblivion caught everyone off guard. Oblivion [u]is[/u] a good game with good graphics.

By AzureKevin on 11/3/2006 4:13:18 PM , Rating: 2
Alan Wake. Period.

By AzureKevin on 11/3/2006 4:21:41 PM , Rating: 2
Though, you're totally right. The cost of the video cards doesn't really justify the benefit of being able to play one or two good games. But that won't stop some of us from upgrading, will it? Personally, I've only bought two graphics cards in my life, and I'm ready to upgrade to something that will handle a generation or two of decent games. Of course, I'm waiting for the 8800GT.

By Ringold on 11/4/2006 3:23:18 PM , Rating: 2
Supreme Commander. Exclamation point. Eleven.

I think there is a gem coming out in almost every category over the next 6 months, 12 at the most. The only category not so blessed graphically is 4X space strategy, but SE4/5 and GalCiv2 aren't played for their graphics anyway.

Then there's also current games like Eve that continuously improve themselves graphically as technology allows them to do so.

Not wanting to pay the money is a personal preference when it comes to a video card, but saying there aren't any good games on the horizon, or that if there are they're merely ports from a console (which for the really good ones is almost never the case; Halo comes to mind, and it was a horrible porting job performance-wise), is painting with slightly too wide a brush.

Impressive, but not enough yet
By tseng517 on 11/3/06, Rating: 0
RE: Impressive, but not enough yet
By gramboh on 11/3/2006 12:39:29 PM , Rating: 2

RE: Impressive, but not enough yet
By theprodigalrebel on 11/3/2006 3:40:16 PM , Rating: 3
Well, it would be pretty stupid of them to wait a few months to launch a card that will NOT beat this. There (obviously) is no proof, but it sure as hell won't make any sense for them to launch a card that won't do much to pay for its few hundred million dollars(?) of R&D.

And to the original dude: I don't know what you mean by "Impressive, but not enough yet". Seriously, what were you expecting? More??? This takes a teeny-tiny bit more power than the X1950XTX and delivers up to 90% more performance. Has integrated physics acceleration? 16X AA?

RE: Impressive, but not enough yet
By Xavian on 11/3/2006 5:44:26 PM , Rating: 2
Remind me again... what was the X1800 graphics card?

If I remember rightly, the X1800 was delayed greatly, and when it finally launched it only managed to be on par with or below the 7800 series of the time, a card that had been out some 4-5 months before it.

It's very possible that ATI could launch a card late that can only compete with NVIDIA's offerings.

By Chillin1248 on 11/3/2006 6:14:08 PM , Rating: 4
However, I don't believe the R600 is delayed per se; I believe it is still launching in the same fiscal quarter as originally planned. And I am sure I can count on DailyTech, when the time comes, to leak the first detailed info about that card as well.


By mindless1 on 11/5/2006 11:15:52 PM , Rating: 2
LOL, so you think they can just CHOOSE to launch ahead of time, or to make it more powerful than they can cost-effectively manage? Hardly.

It might be faster, it might not, but it has nothing at all to do with your concept of launch date.

RE: Impressive, but not enough yet
By h04x on 11/10/2006 2:22:41 PM , Rating: 2
by tseng517 on November 3, 2006 at 12:18 PM

AMD/ATI's R600's gonna kick this puppy's bum

Fanboi'ism at its best..

No proof, no (hard) architecture, no facts, just speculation on the final specs of the R600. Yet it will kick this card's "bum". Go surf the web more.

Pricing point
By clayclws on 11/3/2006 10:17:13 AM , Rating: 2
I'm from Malaysia, and there are a few stores that I particularly frequent to check their newest stock and prices. One of the stores has already posted the 8800GTS and 8800GTX at RM2000 (USD540) and RM2500 (USD675) respectively. I have no idea whether that is the price point for those cards in the US, though. But at such a price point, I don't think a lot of people will be jumping ship to DirectX10 immediately.

BTW, that looks kind of "army"-like in its colour coding. Reminds me of some stealth helicopter or fighter.

RE: Pricing point
By clayclws on 11/3/2006 10:22:43 AM , Rating: 2
Hmm...if only RSX was based on G80.

RE: Pricing point
By hubajube on 11/3/2006 1:29:42 PM , Rating: 2
Speaking of pricing, does anyone know what the price of these are going to be?

RE: Pricing point
By Chillin1248 on 11/4/2006 10:35:32 AM , Rating: 3
$649 USD for the 8800 GTX
$499 USD for the 8800 GTS


RE: Pricing point
By Ringold on 11/4/2006 3:27:47 PM , Rating: 3
Reminds me of some stealth helicopter or fighter.

Their marketing department has earned their pay. :)

RE: Pricing point
By clayclws on 11/5/2006 12:09:47 AM , Rating: 2
Hopefully the cards also work like those stealth vehicles...silent and deadly =D

Power consumption
By Mudvillager on 11/3/2006 7:47:57 AM , Rating: 2
Not that bad at all.

RE: Power consumption
By defter on 11/3/2006 8:10:02 AM , Rating: 2
Yes, it is quite low. Taking power supply efficiency into account, the 8800GTX consumes only 11W more power than the X1950XTX under load. Not bad considering it has about 50-100% higher performance, almost double the transistor count, and it's made on the same process.

RE: Power consumption
By clayclws on 11/3/2006 10:31:07 AM , Rating: 2
Not bad at all...compared to AMD/ATI's product. What about comparing it to NVIDIA's own products?

RE: Power consumption
By XataX on 11/3/2006 11:37:12 AM , Rating: 3
I was thinking...if one card needs 2 PCI-E power connectors, then with the launch of the nForce 6 series of motherboards, 3 G80s could be a possibility. And assuming one does have the cash to shell out for 3 of these monsters, think about the PSU that would be needed.

That PSU would need at least 6 PCI-E power connectors...Gulp!

I wonder if even the 1kW PSUs available today have that many PCI-E power connectors.

RE: Power consumption
By lkalbert on 11/4/2006 8:56:23 AM , Rating: 2
PC Power & Cooling's 1kW unit has the following specs:
24-pin, dual 8-pin, 4-pin, quad 6-pin video
15 drive connectors (6 SATA, 8 Molex, 1 mini)

The problem is that this PSU is VERY long and will not fit in many cases. I have a Lian Li 1200, and a PC Power & Cooling 850 (same size as the 1kW) just barely fits.

RE: Power consumption
By mindless1 on 11/5/2006 11:13:24 PM , Rating: 2
You're going to need a bit of a deep case for this card too, unless you have a rather unusual drive rack arrangement.

compared to crossfire?
By shamgar03 on 11/3/2006 8:00:20 AM , Rating: 3
So this would be about as fast as two XTXs in Crossfire? Is that about right? I suppose it uses a lot less power than two XTXs, though.

RE: compared to crossfire?
By AzureKevin on 11/3/2006 9:48:58 AM , Rating: 2
That's a good point. I'm glad the power requirement of this card is a lot lower than everyone had expected. This is good news; I'm planning to ditch my current 6800GS AGP PC and build a microATX media/gaming computer that'll handle UT '07, Alan Wake, Crysis, and whatever else 2007 brings. Seeing the amazing performance of this card, I would expect the 8800GT to be close, but with a lower power requirement and price tag. I won't even need to mess with SLI or Crossfire. I'm excited.

RE: compared to crossfire?
By Chillin1248 on 11/5/2006 4:15:44 AM , Rating: 2
The million dollar question then is how you would fit a 9+ inch 8800 GTS (or the 10 1/2 inch 8800 GTX) into a microATX case. Then there is the obvious problem of heat, then an SLI-capable PSU, and then you have the general issue of it taking up two slots minimum. My drift is that you should perhaps wait for the 8600/8700 series before you embark on a microATX adventure.


RE: compared to crossfire?
By AzureKevin on 11/9/2006 4:16:36 PM , Rating: 2
That might be a good idea, actually. The 8800GTS is too expensive for my tastes right now. I'm actually a poor college student.

GTS review and noise comparison!
By yacoub on 11/3/2006 8:41:19 AM , Rating: 2
Looking forward to seeing how the GTS lines up amidst the GTX and X1900 series cards, as well as the 7950GT and GTX for reference against other NVidia 512MB products.

Also curious how the card's noise level compares at idle and load compared to the ATI solutions and previous 7900 series GTX cooler.

RE: GTS review and noise comparison!
By Chillin1248 on 11/3/2006 8:54:32 AM , Rating: 2
as the 7950GT and GTX

7950 GTX? Care to provide a link as this is most interesting.

Also, other previewers said the noise was at quite a normal level for cards of its type, and not another repeat of the infamous "blower" on the 5800 Ultra series.


By johnsonx on 11/3/2006 1:59:57 PM , Rating: 2
he probably means the 7950GX2

RE: GTS review and noise comparison!
By Griswold on 11/3/2006 11:12:06 AM , Rating: 2
Ayup, I also want to know how much of a screamer (or not?) the stock fan is.

RE: GTS review and noise comparison!
By Pirks on 11/3/2006 1:57:12 PM , Rating: 1
I couldn't hear it; very silent.
Got a retail sample yesterday, a GTX one.
The heatsink was mildly hot, maybe 50 degrees Celsius.
Drivers seem to be in beta, 'coz some DX10 samples ran really slow.
The card looks gorgeous: killer black, slick beauty. Nice to hold in my hand (like a woman, hehe ;)

Power in Idle
By iwod on 11/3/2006 8:07:23 AM , Rating: 2
Is there currently any tech in graphics cards that lowers their power draw at idle?

The way I look at it, these graphics monsters suck too much power even when we are only using normal Windows applications.

RE: Power in Idle
By Donegrim on 11/3/2006 8:36:34 AM , Rating: 2
I would have expected it to at least use the two different "2D" and "3D" profiles like my 6800 used. You could set the clock speed for 2D operation quite low, then whack it up for 3D. I also would have expected more advanced tech, like shutting down some of those shaders when they're not needed. Maybe this was just an engineering sample.

RE: Power in Idle
By saratoga on 11/3/2006 10:46:36 AM , Rating: 2
I'm hoping the high power consumption is just a driver issue that will be fixed at launch. Otherwise it would kind of suck to have that kind of PSU load all the time, even when the computer isn't doing anything.

RE: Power in Idle
By hadifa on 11/3/2006 6:28:55 PM , Rating: 2
Maybe it is designed with Vista's 3D interface in mind.

Many people will buy this for DX10, which comes with Vista.

Quake 4 results
By SirPsychoZeo on 11/3/2006 12:03:51 PM , Rating: 2
60-odd FPS? Check out the following link:

I wouldn't expect an 8800 GTX to only manage 60-odd FPS. As such, I would now question all the results, particularly Prey's.

RE: Quake 4 results
By Anh Huynh on 11/3/2006 12:11:54 PM , Rating: 2
They didn't enable 4xAA.

RE: Quake 4 results
By Spoelie on 11/5/2006 5:51:27 AM , Rating: 2
They did in the second graph.

RE: Quake 4 results
By killerroach on 11/3/2006 12:13:25 PM , Rating: 2
You also don't know what demo is being used in each instance... comparisons across sites are often an "apples and oranges" kinda deal. While I am somewhat surprised by the numbers being that low (Quake 4 isn't exactly a game that'll tax most high-end hardware), we also don't know what part of the game that the DT guys were using for their comparison. I'm sure there probably are things you could cherry-pick out of that game that would be more stressful than normal on a video card.

Dailytech, more info perhaps please?
By Chillin1248 on 11/3/2006 8:58:10 AM , Rating: 2
Can you list the weight of the card, as well as the driver set you are using to test with? Perhaps you can also show what exactly the smaller die near the end of the card is/does?

But thank you very much for these preview numbers; I'm going to hold off on getting a 7600 GT and wait for these boys to drop in price.

Also, can you give us a heads-up on what the overclocking situation with these cards is? We have been hearing varied reports across the web on whether you are even able to overclock them.


RE: Dailytech, more info perhaps please?
By KristopherKubicki on 11/3/2006 4:53:40 PM , Rating: 2
We use Forceware 96.94 non-WHQL and Catalyst 6.10.

I'll see if we can get a scale for the weight of the card.


By Chillin1248 on 11/3/2006 6:11:31 PM , Rating: 2
Thank you very much for taking the time to reply and furthermore even submit one of my requests for review.

Out of curiosity, is there any place where somebody can download the 96.94 drivers, as Google turns up empty? That is, if 96.94 even supports cards other than the G80 series.

Looking forward to the weight measurements.


DX9 or DX10?
By wingless on 11/3/2006 11:04:44 AM , Rating: 2
Were these tests done in Windows XP or Vista? If the 8800 is running in DX9 mode, then it's a whole lot slower than what we'll see when it's using DX10. That's a very FAST video card either way you look at it. A worthy upgrade.

RE: DX9 or DX10?
By KeithTalent on 11/3/2006 11:52:07 AM , Rating: 3
This is what I was wondering. When will we have some DX10 content to test? I was pretty sure this thing would blow everything else out of the water in DX9, but I would love to see it work with some DX10 content.

Also, I would like to see it compared to a 7950GX2.

7950 GX2?
By RMSistight on 11/3/2006 5:24:08 PM , Rating: 2
What I want to know is how one 8800GTX card goes against a Quad-SLI setup.

RE: 7950 GX2?
By Soviet Robot on 11/6/2006 2:52:23 AM , Rating: 3
The 8800GTX is supposed to be a little bit faster than a 7950GX2, which is two cards. It'd be cheaper to buy two GX2s than two 8800s, but you don't get DX10 or the quantum physics thingummy.

maybe nvidia's in bed with american coal companies
By 8steve8 on 11/4/2006 3:26:46 AM , Rating: 4
45 extra watts at idle????

That is horrible.

By Clauzii on 11/5/2006 1:29:45 PM , Rating: 1
Agree, that's ridiculous!

By Dfere on 11/3/2006 9:37:16 AM , Rating: 2
So I guess video cards are now going the way of Hummers? Bigger is better and so is shiny?

By Aikouka on 11/3/2006 9:49:04 AM , Rating: 2
Nah, it looks like we get pretty decent "gas mileage" too ;).

One thing, and I hate to nitpick over it, but it might be better: instead of talking about the cards using more power, you should say the system. The G80 system may use 7% more under load, but that's system to system. I'm not complaining, as the point still stands that power consumption isn't as bad as expected, but for the sake of understanding exactly what's being tested, including "system" may help.

This article does make me feel better about holding off to buy a G80 :).

Just my opinion
By boe on 11/3/2006 10:11:00 AM , Rating: 2
I'm stoked there is a new king of the hill; however, I wish they had made a few design changes. First is the two power connectors: overkill, since one would have worked, plus the fact that not many people have power supplies with two PCIE connectors. Granted, if you have this card you can probably afford to replace your power supply, although it seems ridiculous to do so if you already have adequate power.

The length is absurd; they should have made it fatter instead of so long. Not many cases will handle that without the card hitting the drive cage. Most cases have plenty of width to spare even after the cover is on.

As for the article, it is great getting the early scoop, but I agree with the earlier poster. While Prey runs fine on the X1950, people who want to upgrade are probably more concerned about the most demanding games, such as FEAR or Oblivion.

RE: Just my opinion
By boe on 11/3/2006 10:18:57 AM , Rating: 2
I should be clear: this is a very nice article and covers many great things. FEAR results are about the only additional information that would have been great. Sound dB would be nice as well, but the power and other information provided in this article is GREAT!

Can you imagine...
By GTaudiophile on 11/3/2006 10:44:21 AM , Rating: 2
Just how many people does NVIDIA expect to buy THREE of these things for their next nForce6 platform with THREE x16 PCI-E slots? That's like $1800 on video cards, $200 for the board...does a power supply even exist yet that can power three of them? Anyone who does buy three doesn't need to worry about a heater for the house, either.

RE: Can you imagine...
By Aikouka on 11/3/2006 10:57:28 AM , Rating: 2
I've seen some cases that allow for two power supplies in the same case, which was meant for the newer video cards. They even have the proper ATX connector splitter to allow both to turn on with the same switch.

More than three cards
By aiken666 on 11/3/2006 2:20:17 PM , Rating: 2
You know, if they fabricated a diagonal SLI bridge that ran from the back connector of one card to the front connector of the next, you could daisy chain as many of these as you wanted. Sure, with $600 cards, it would be kind of silly, but why wouldn't Nvidia just design it that way and let the market sort out what's actually wanted? The incremental cost of the bridge would be pennies per card.


RE: More than three cards
By Rock Hydra on 11/3/2006 3:06:02 PM , Rating: 2
Question about that: what if you used both connectors in SLI with two cards? Could it possibly be for more SLI data bandwidth?

No oblivion benchmarks?
By DukeN on 11/3/2006 2:59:09 PM , Rating: 4

3X's DoubleWide 8800GTXs on one MB!?
By Spacecomber on 11/4/2006 8:49:29 AM , Rating: 2
I'm waiting to see what this looks like. Will there still be room for a sound card? Will you need noise cancelling headphones when playing games?

Inquiring minds want to know. ;-)

By davekozy on 11/5/2006 3:31:14 AM , Rating: 2
How about 3 8950GX2s for sex-SLI with dual quad-core Extremes? I'll need to rewire my place with dedicated circuits for my dual 1kW PSUs. Maybe that's a little excessive. Good for performance but bad for my finances and the environment.

Break in NEWS
By crystal clear on 11/6/2006 7:08:40 AM , Rating: 2
News : GPUs & Graphic Cards : All 8800GTX Cards Being Recalled.

RE: Break in NEWS
By Clauzii on 11/11/2006 7:39:25 PM , Rating: 2
Where does it say that??

ATI vs Nvidia
By blotch87 on 11/9/2006 3:41:04 AM , Rating: 2
This NVIDIA card is looking very good, but it's up against a card a bit older than itself. Also, from what I read, this NVIDIA card is using a hybrid of unified shaders and old shader caps.
ATI will have a fully unified shader architecture with no caps, and it will probably be using GDDR4, whereas NVIDIA is still using GDDR3.
Yes, I am an ATI fan, but I try not to be biased...let's wait and see what ATI comes out with. My money is on ATI's card beating this one.

RE: ATI vs Nvidia
By valkator on 11/13/2006 1:51:06 PM , Rating: 2
Look at the size difference between ATI's top card and NVIDIA's. Ever notice how NVIDIA always tries to get the bigger/longer card out? Maybe they need the space on the PCB, but it sure makes it seem like they have that "I have the bigger dick" kind of theme with their cards. I dunno, I could be wrong on this.

By Leeman on 11/3/2006 7:49:58 AM , Rating: 1
I want one of those bad boys! I was expecting a more power-hungry part. I may still get more use out of my 560W PSU!

RE: 8800gtx
By Snoopvelo on 11/5/2006 10:12:30 PM , Rating: 2
I am getting this card tomorrow, and I won't say from which store because they may get in trouble. They even gave me a discount on it ($75). It was put on the shelf by mistake and wasn't supposed to sell until 2 weeks later, according to them. I talked the store manager into selling me the card, but getting $75 off on top of that was surprising. I will post pictures tomorrow when I get it. BTW, the card I saw was an EVGA 8800 GTX at $649.99, but with the $75 off it will only cost me $575 plus tax.

What about the Temp.
By alaaaweee on 11/3/2006 8:03:11 AM , Rating: 1
In terms of power consumption, NVIDIA beats ATI not only in high-end cards but also in mid-range, and NVIDIA is still using 90nm technology, not 80nm like ATI.

Looking forward to the full review of the new cards, especially the GTS!!!

RE: What about the Temp.
By dagamer34 on 11/3/2006 2:29:40 PM , Rating: 2
Comparing across generations is usually pointless, especially when they are in different price brackets.

Half Life 2
By granulated on 11/4/2006 11:22:38 AM , Rating: 1
The bench for the ATI seems a bit low???

Half Life 2 4xAA/16xAF 1600x1200

.......X1950 XTX.....GeForce 8800GTX
FPS..... 60.74 ........... 116.93

RE: Half Life 2
By granulated on 11/4/2006 11:23:22 AM , Rating: 2
ahh lost coast...SORRY IGNORE ^^^^

Proper power ratings
By Lord Evermore on 11/4/2006 10:28:35 PM , Rating: 3
It would be pretty simple to judge, at least well enough for our purposes, how much power the system components aside from the video card draw at idle and at load. Boot the system with no video card, which all OSes can do (well, I assume OSX can), and measure the draw at idle. Then remotely access the system and run a stress test like Prime95 using maximum memory and CPU. It might not perfectly mimic the load during gaming, but it's close enough. Once you have those numbers, reviewers could report what they've determined to be the actual draw of the video cards themselves. Giving us numbers that indicate the draw of this particular system configuration doesn't really tell us anything about the video cards except an exact wattage difference between them, which doesn't tell us how much power they actually use or how much of a difference they'd make in our own systems.

Heck, if nothing else, put some mangy old TNT2 Vanta card in there to test it.
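The subtraction method described above is easy to sketch in code. This is just an illustration; the wall-wattage figures below are made-up placeholders, not measurements from the article:

```python
# Estimate a video card's own DC draw by measuring the whole system at
# the wall with and without the card installed, under the same workload.
def card_draw(wall_with_card, wall_without_card, psu_efficiency=0.75):
    """Convert the AC difference at the wall into the card's DC draw."""
    return (wall_with_card - wall_without_card) * psu_efficiency

# Hypothetical wall readings in watts, for illustration only:
idle = card_draw(wall_with_card=250, wall_without_card=170)
load = card_draw(wall_with_card=370, wall_without_card=175)
print(f"idle: {idle:.0f} W, load: {load:.0f} W")
```

The efficiency factor matters because a watt-meter at the wall sees AC input, while the card consumes DC output; without it the card's draw is overstated.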

RE: Proper power ratings
By L1NUXownz1fUR1337 on 11/5/06, Rating: 0
RE: Proper power ratings
By Pirks on 11/6/06, Rating: 0
By muzu x2 on 11/6/2006 1:46:02 AM , Rating: 3
Does anyone here bother to comment on the performance of the 8800GTX? A 59% advantage in 3DMark06 over the X1950XTX; is this because of unified shaders?

Wow, that's pretty freakin cool.
By jon1003 on 11/3/2006 7:58:27 AM , Rating: 2
Wow, 2 PCIe connectors per card, x3 cards = 6 power connectors for video! Muhahahaha... just WOW. What kind of monitor do you need, at what resolution, to even put 3x G80s under some stress!?!?

CPU scaling in games
By jmke on 11/3/2006 9:34:23 AM , Rating: 2
Would love to see some high resolution results (1600x1200 and up) at different CPU speeds, as some people are claiming that the 8800 will make system CPU dependent again.. (yeah right ;)... Crysis?)

Very nice :-)
By Zurtex on 11/3/2006 9:35:01 AM , Rating: 2
Those numbers look very good. I have a friend programming physics engines on graphics cards; I'll see if I can get him to tell his department to request NVIDIA give him one for 'physics testing' :P.

I wasn't planning to put a computer together till sometime next year; hopefully by then there will be an 8900GTX using GDDR4 and the AMD ATI equivalent (imagine something like Supreme Commander running on 2 high-resolution widescreen monitors, lol).

By qdemn7 on 11/3/2006 10:05:06 AM , Rating: 2
That is the only way to describe that card. WOW!

2 questions
By Mudvillager on 11/3/2006 12:43:11 PM , Rating: 2
How noisy is it (in dB)?

Can I use it in my Lian Li V-series case where the mobo is placed upside-down?

By inthell on 11/3/2006 7:19:12 PM , Rating: 2
Graphics cards are starting to scare me, they are so big..

$625 @ & more details...
By miahallen on 11/4/2006 8:39:06 AM , Rating: 2
..."standard cooler Of geForce 8800 GTX came out sufficiently quiet."
..."geForce 8800 GTX and geForce 8800 GTS: 30 amperes for the first and 26 amperes for the second. The overall power that come from power units must be equal to 450 W and 400 W respectively."

Minimum framerates
By DingieM on 11/6/2006 6:31:22 AM , Rating: 2
What are the minimum framerates, and what is the drop in framerate between minimum stress and maximum stress of the card?
The G7x had massive framedrops under stress while the X1K had much less. That's one of the advantages of the ATI technology.

By masher2 on 11/6/2006 9:34:10 AM , Rating: 2
Assuming a rather conservative system draw of 130 watts under load, and a 75% efficient supply, the actual consumption of each card would be:

- x1950: 135w
- 8800 : 148w

Or roughly a 10% increase under load. The real killer, though, is the idle difference. Assuming an 85w idle load, it works out to:

- x1950: 71w
- 8800 : 116w

Or over a 60% increase.
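For anyone who wants to check the arithmetic, the model above is card = wall draw × efficiency − rest-of-system. A sketch (the 130 W/85 W system loads and 75% efficiency are the assumptions stated above; the wall readings are back-computed from the results, since the raw figures aren't quoted here):

```python
# masher2's model: the card gets whatever DC power is left after the
# rest of the system's load is subtracted from the PSU's DC output.
def card_power(wall_watts, efficiency, system_watts):
    return wall_watts * efficiency - system_watts

# Load: 130 W system, 75% efficient PSU (wall figures back-computed)
x1950_load = card_power(353.3, 0.75, 130)  # ~135 W
g80_load   = card_power(370.7, 0.75, 130)  # ~148 W

# Idle: 85 W system
x1950_idle = card_power(208, 0.75, 85)  # 71 W
g80_idle   = card_power(268, 0.75, 85)  # 116 W
print(round(g80_idle / x1950_idle - 1, 2))  # ~0.63, i.e. >60% more at idle
```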

By DrewBear11 on 11/6/2006 12:37:01 PM , Rating: 2
It will be interesting to see if this card, or any of the 8-series cards, will support simultaneous HDR + AA.

Also, on the Lost Coast stress test results: it says high resolution and high settings, but does that include HDR?

By ultimaone on 11/8/2006 11:33:59 PM , Rating: 2
ya considering other sites that now have reviews...

the numbers here, DON'T even come close to results on
other sites with the same suppossed settings (resolution and AA, etc)

this card fluxuates all over the place, i do see a 60% gain in some cases, in others, hardly any gain (unless you play on 2000+ resolution....ahem....)

We need oblivion benchmark
By xNIBx on 11/7/2006 8:59:35 AM , Rating: 2
What I need to see is how the G80 performs in Oblivion, in an outdoor area with plenty of trees/foliage, preferably with HDR+AA on (if possible). Foliage is for DX10 what water was for DX8, IMO. So show me some bushes :P

"Can anyone tell me what MobileMe is supposed to do?... So why the f*** doesn't it do that?" -- Steve Jobs
Related Articles
NVIDIA nForce 680i Board Image Leaked
October 13, 2006, 3:54 PM
"G80" To Feature 128-bit HDR, 16X AA
October 5, 2006, 11:21 AM
Power and the NVIDIA "G80"
October 4, 2006, 11:56 PM
