
More G80 features abound

As if we mere mortals needed more reasons to be excited about G80, here are a couple more tidbits: 128-bit high dynamic-range rendering and antialiasing with 16X sampling.

The high dynamic-range (HDR) engine found in GeForce 7950 and Radeon series graphics cards is technically 64-bit rendering.  The new HDR approach comes from OpenEXR, a file format developed by Industrial Light and Magic (the LucasFilm guys).  In a nutshell, we will have 128-bit floating-point HDR as soon as applications adopt code to use it. OpenEXR's features include:
  • Higher dynamic range and color precision than existing 8- and 10-bit image file formats.
  • Support for 16-bit floating-point, 32-bit floating-point, and 32-bit integer pixels. The 16-bit floating-point format, called "half", is compatible with the half data type in NVIDIA's Cg graphics language and is supported natively on their new GeForce FX and Quadro FX 3D graphics solutions.
  • Multiple lossless image compression algorithms. Some of the included codecs can achieve 2:1 lossless compression ratios on images with film grain.
  • Extensibility. New compression codecs and image types can easily be added by extending the C++ classes included in the OpenEXR software distribution. New image attributes (strings, vectors, integers, etc.) can be added to OpenEXR image headers without affecting backward compatibility with existing OpenEXR applications.
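The "half" type called out above is the standard 16-bit floating-point layout: 1 sign bit, 5 exponent bits (bias 15), and 10 mantissa bits. As a rough illustration of what those 16 bits mean, here is a minimal decoder (our own sketch, not code from the OpenEXR distribution):

```python
import struct

def half_to_float(bits: int) -> float:
    """Decode a 16-bit IEEE 'half' (1 sign, 5 exponent, 10 mantissa bits)."""
    sign = -1.0 if (bits >> 15) & 1 else 1.0
    exp = (bits >> 10) & 0x1F
    mant = bits & 0x3FF
    if exp == 0:                      # subnormal: no implicit leading 1
        return sign * mant * 2.0 ** -24
    if exp == 0x1F:                   # all-ones exponent: infinity or NaN
        return sign * float("inf") if mant == 0 else float("nan")
    return sign * (1.0 + mant / 1024.0) * 2.0 ** (exp - 15)

print(half_to_float(0x3C00))  # 1.0
print(half_to_float(0xC000))  # -2.0

# Cross-check against Python's own half-precision codec ('e' format)
assert half_to_float(struct.unpack("<H", struct.pack("<e", 0.5))[0]) == 0.5
```

The limited mantissa is why half gives far more dynamic range than 8- or 10-bit integer formats while still fitting a 64-bit RGBA pixel (4 x 16 bits); the 128-bit mode simply uses 32-bit floats per channel instead.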
NVIDIA already has 16X AA available for SLI applications.  The GeForce 8800 will be the first card to feature 16X AA on a single GPU.  Previous generations of GeForce cards have only been able to support 8X antialiasing in single-card configurations.

This new 16X AA and 128-bit HDR will be part of another new engine, similar in spirit to the PureVideo and Quantum Effects engines also featured on G80.


Small leap
By TURTLE786 on 10/5/2006 11:54:14 AM , Rating: 4
One small leap for Nvidia one giant leap for gaming!

RE: Small leap
By cnimativ on 10/5/2006 12:06:03 PM , Rating: 5
and one gigantic leap for your electric bill

RE: Small leap
By wingless on 10/5/2006 12:16:34 PM , Rating: 1
I'm not a tree hugger or anything like that, but my current system pretty much makes up the majority of our $500 light bill here in Texas. Keep in mind, energy in Texas is relatively cheap.

Specs: X2 4200+ @ 2500MHz, 2GB DDR500, 4 HDDs, overclocked 7900GT, 2 CRT monitors, and a 500-watt power supply.

I'm actually thinking about building a super energy-efficient computer next year. If I could shave $100 a month off that bill, which I bet I could do, my family would have a bit more disposable income to play with. As much as I love the new stuff from Nvidia and ATI, they just suck way too much power. I hope Nvidia will have some energy-saving features on this card.

RE: Small leap
By Knish on 10/5/2006 12:25:01 PM , Rating: 5
You need to find out who has their searchlights plugged into your house and disconnect them. A 500W power supply and two monitors (let's round everything up to 1kW) runs just about $800 *for the year* at $0.10 per kWh.

And that's at full load, 24 hours a day, 365 days a year.
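Those back-of-the-envelope numbers check out; a quick sketch (the $0.10/kWh rate and the round 1 kW worst-case draw are the commenter's assumptions):

```python
def energy_cost_dollars(watts, hours, cents_per_kwh=10.0):
    """Electricity cost: kWh consumed times the per-kWh rate."""
    kwh = watts / 1000.0 * hours
    return kwh * cents_per_kwh / 100.0

# Full 1 kW draw, 24/7 for a year, at $0.10/kWh:
print(round(energy_cost_dollars(1000, 24 * 365)))  # 876

# A 500 W system running 24/7, per 30-day month:
print(round(energy_cost_dollars(500, 24 * 30)))    # 36
```

So even the pessimistic full-load figure is well short of a $500 monthly bill, which matches the $36/mo estimate given further down the thread.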

RE: Small leap
By UNCjigga on 10/5/2006 1:09:34 PM , Rating: 2
Hmm, I wonder how much power a triple-play setup of Xbox 360, PS3 and Wii would consume? Not like you'd be running all 3 at the same time though.

I am completely in awe and blown away at the specs of this chip though. 384-bit memory interface, 768MB of GDDR3 (or 4?), 128 unified shaders with Shader Model 4.0 and DirectX goodness, etcetera etcetera ad infinitum. And to think this will likely launch (or at least be widely available) BEFORE the PS3 even hits store shelves (and cost only a little bit more for the GTX). I guess all the PC gamers win this round??

RE: Small leap
By Pirks on 10/5/06, Rating: -1
RE: Small leap
By Clienthes on 10/5/2006 3:18:09 PM , Rating: 2
I think he meant that the GTX version would be just a bit more expensive than the PS3.

RE: Small leap
By Pirks on 10/5/2006 3:37:03 PM , Rating: 2
my mistake - you're right, I stand corrected. GTX is slightly more expensive than PS3 but there's no Crysis for PS3 - fair enough :)

RE: Small leap
By EclipsedAurora on 10/6/2006 5:28:46 AM , Rating: 2
But still we need to include the other PC component costs (RAM, CPU), and he probably also missed out on those hit titles on consoles.

RE: Small leap
By theapparition on 10/5/2006 12:47:21 PM , Rating: 5
Why does it not surprise me your from Texas?

First off.........$100 is not the majority part of $500. If you were to run your 500W system 24 hrs a day, your bill would be $36/mo (assuming $0.10/kWh). Not to mention the fact that your computer should go into standby mode, and under normal operating circumstances is drawing less than 200W.

Even more power hungry would be the 2 CRT monitors, although I'm sure that they also go into standby mode and consume less energy. Go LCD, but don't expect to save more than $5/mo by getting LCDs and a new energy-efficient computer. You're deluding yourself if you think you'll get better gains.

The primary factors in an energy bill are always:
1. Air conditioning/heat pump
2. Hot water heater (electric)
3. Clothes dryer (electric)
4. Oven/rangetop
and, if you're like my household.........too much 1500W hair dryer use.

To reduce your bill, invest (<--notice I did not say spend) in a new 16-18 SEER/12 HSPF heat pump (I'm assuming HP since that's Texas) and really save that $100/mo.

With regard to the Nvidia card, I'm disappointed that it takes so much power.

RE: Small leap
By Lakku on 10/5/2006 1:38:56 PM , Rating: 3
Why does it not surprise me your from Texas?

If that's a swipe at Texans or their intelligence, or maybe lack thereof, perhaps you should use YOU'RE, instead of your.

RE: Small leap
By Ringold on 10/5/2006 2:39:24 PM , Rating: 3
Why does every post that tries to combat the whiny "my computer gets hot!" posts get voted to oblivion? Admittedly, the swipe at Texans wasn't smart, but the rest of the post was worthy.

I run my X2 @ 2.6GHz system, 600W Seasonic S-12, which has the standard components, a pump for the water cooling loop, and an LCD (all fed through a UPS, which probably introduces minor inefficiency) 24-7, and the impact on my Florida bill is actually quite negligible. And when I say 24-7, I mean full CPU load 24-7, non-stop. (Recently, before a reboot after 29 days of uptime, I had 6 hours of idle time noted in Task Manager.) And yet, a few months ago, the system was dead for about 3 weeks of a month while I fooled with a busted motherboard. Did I notice a drop in my light bill? Nope.

Therefore, will anyone that can afford such a monster of a video card notice a significant swing in their bills? Very much doubt it..

It's a concern for laptops and SFF for energy drain and heat respectively, but for desktops? Give me a break.

RE: Small leap
By MrSmurf on 10/6/2006 9:23:04 AM , Rating: 1
First off.........$100 is not the majority part of $500.

Read more carefully. He never stated his computer caused $100 of his bill. He mentioned buying a new computer and SHAVING $100 off his bill. Whether his statements are accurate is moot. My point is you should read and comprehend before you start criticizing people's posts.

RE: Small leap
By MrSmurf on 10/6/2006 9:24:13 AM , Rating: 1
My point is you should read and comprehend before you start criticizing people's posts.
...and intelligence for that matter.

RE: Small leap
By ReblTeen84 on 10/5/2006 1:32:21 PM , Rating: 2
Dude.. I have 3 computers running in my house, plus EVERYTHING in my house is electric: AC/HP, stove, water heater, etc. I live in VA and my light bill is around $170/mo, plus or minus 10 bucks. My computers never go into standby, I usually leave lights on, and the fan outside is always running. Seems to me like you're getting the shaft with no Vaseline from your power company.

RE: Small leap
By Christobevii3 on 10/5/2006 1:42:24 PM , Rating: 2
Electricity is like $0.16 per kWh in Texas. Include your heating bill in winter...

RE: Small leap
By ReblTeen84 on 10/5/2006 3:22:49 PM , Rating: 2
Heating bill in winter isn't much different.. we run the AC most of the time in the summer. I think we had a $300 electric bill once when we never turned the thing off. Why turn the PCs off? 2 desktops plus my server. Still not seeing how anyone has a $500 light bill with one PC.

RE: Small leap
By PlasmaBomb on 10/5/2006 1:56:57 PM , Rating: 2
Ever think of turning some of that stuff off?

RE: Small leap
By FITCamaro on 10/5/2006 1:40:13 PM , Rating: 2
Seriously. I have a C2D E6600 @ 3GHz, 2GB RAM, X1950XTX, 2 IDE drives, 4 120mm case fans, and 5 hard drives. At idle my system, including the 19" LCD monitor, cable modem, and router, uses around 240W through my UPS. At max load my system uses 274W. My energy bill this month was $63 without much AC usage.

In a 3-bedroom, 1100 sq ft apartment in college, my two roommates and I only had a summer power bill of $250. That's with 5 desktops (3 were high-end gaming systems), 3 laptops, 3 TVs, speaker setups, AC, and lights. And that bill was only that high because the AC unit in that apartment was insanely inefficient.

Your computer maybe accounts for $10-15 of your energy bill a month.

RE: Small leap
By DangerIsGo on 10/5/2006 2:57:28 PM , Rating: 3
I find that BS. I have a more powerful computer than yours (with more stuff in it) and a 700W PSU (800W max), and my bill is nowhere near that amount. Texas = lots of AC. Try not having your AC on 24/7 :p There are definitely other factors in your statement.

RE: Small leap
By AxemanFU on 10/5/2006 5:12:34 PM , Rating: 2
Yah. I have a gaming rig that only gets used for gaming, video editing, and stuff like that, and a much less powerful machine that gets used for web browsing, email, and word processing. The big machine gets shut down a lot. It can warm a rather large room ten degrees F higher in just a couple of hours.

RE: Small leap
By AxemanFU on 10/5/2006 5:17:55 PM , Rating: 2
And yes, I'm from TX too. I have to run my AC 9 months out of the year, and if it's not run in the computer room, it gets annoyingly hot unless it is winter and cold out. You have to factor in cooling all that heat you make with the PC too, and the efficiency factor for cooling 500 watts with an A/C is low, so at, say, a 0.5 factor you are probably talking another 1000 watts of A/C on top of 500 watts of PC. I have no idea what the real efficiency factor is. But it does make running a PC in the South and Southwest expensive.
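For what it's worth, the real "efficiency factor" runs the other way: an air conditioner moves several watts of heat per watt of electricity it consumes (its coefficient of performance, typically around 2.5-3.5 for home units), so the AC overhead is a fraction of the PC's draw rather than double it. A rough sketch, with the COP value being an assumed typical figure:

```python
def ac_overhead_watts(heat_watts, cop=3.0):
    """Compressor power needed to pump `heat_watts` of PC heat back outdoors.

    COP (coefficient of performance) ~2.5-3.5 is typical for home AC:
    each watt of electricity removes roughly 3 watts of heat.
    """
    return heat_watts / cop

print(round(ac_overhead_watts(500)))  # 167 -> ~167 W extra, not 1000 W
```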

RE: Small leap
By Sharky974 on 10/5/06, Rating: -1
RE: Small leap
By Sharky974 on 10/6/06, Rating: 0
PS3's video chip?
By Chadder007 on 10/5/2006 1:06:28 PM , Rating: 3
And hence it seems that the PS3's video chipset is now obsolete. PC > consoles!

RE: PS3's video chip?
By michal1980 on 10/5/2006 1:21:44 PM , Rating: 1
and hence your video card costs as much as a PS3, and without lots of other expensive goodness it does not come equal to one.

And while you don't need it, a Blu-ray drive will cost you like 700 bucks, or about 200 more than the PS3.

add os,
add ram,
add hd
add case
add cpu
add mobo

yup the pc is > ps3.

(lol I love my pc, but comparing the 2 like people love to do is stupid)

RE: PS3's video chip?
By JWalk on 10/5/2006 1:49:08 PM , Rating: 2
My guess is that he meant the PC will once again be capable of putting out better graphics than the latest consoles. This will likely be true. But then, it always has been. The big upside to a console will always be the amount of gaming power you get for a "reasonable" price.

Of course, I might not be the right person to talk about "reasonable" prices. I owned all 3 of the major consoles during the last generation, as well as an overclocked gaming PC. I currently own a gaming PC and an XBox 360, and I plan on buying a Wii when it launches, and eventually a PS3 at some point down the road. So, I may not be the "voice of reason". LOL

RE: PS3's video chip?
By Ringold on 10/5/2006 2:44:44 PM , Rating: 2

Consoles vs PCs, Part LXVIII..

Pretty much lays out the old XBox360 vs PC thing.

But on the other hand, only uber-enthusiasts will be getting these puppies. The rest of us will be waiting for the 8600GT or whatever their $200-300 mid-range card is. Or possibly a cut-down GTS.

RE: PS3's video chip?
By Sharky974 on 10/5/06, Rating: -1
RE: PS3's video chip?
By Sharky974 on 10/5/06, Rating: -1
RE: PS3's video chip?
By EclipsedAurora on 10/6/2006 5:57:48 AM , Rating: 2
You made a good point.

I can't live without my 40" LCD TV and THX home theater system now, while the PC is stuck with small monitors and cheapie speakers.

One more thing: the upscaler in any recent TV performs something similar to what the GPU does with FSAA, but with far better quality. If you guys want to focus on FSAA, why not talk about how great video systems like Faroudja DCDi, Philips Pixel Plus 3, and Sony DRC-MFv2.5 can boost the visual quality of consoles?

The greatest benefit of consoles starts from here: they can make use of your current home theater hardware to improve your gaming experience, while the PC is never capable of doing this. Why do you need to use a silly mouse+keyboard for Quake, but not a gun in Time Crisis?

RE: PS3's video chip?
By Sharky974 on 10/6/2006 7:18:34 AM , Rating: 1
More stupidity in that article:

He says the 360 costs 650, and video cards that can challenge it for 299. That works out to 484 and 223 in US dollars.

Well, first of all, in the USA it's 399/299 for the Xbox 360. And currently you CAN probably get a $200 card that will challenge the 360 (like an X1900GT or 7900GS). But it's close. I'm not sure those cards are up to 360 level, but they're close enough. Those cards also just came out; the 360 is a year old. Anytime in the last 12 months you'd generally pay $299+ for cards like that. At the time the 360 came out in Nov 2005, making the comparison fairer, it was more like $599 (but the 360 was still $299). His currency conversion alone made the comparison seem worse than it is even today. Even today you are generally looking at $299 for a video card alone to equal the 360, and the 360 costs $299. His figures make it look like the 360 is twice the price of a decent card, which is pretty much false and due to Aussie currency, plus he shoots pretty low on the card price ($223).

But his real backwards statements are the ones about the TVs. He says you must pay a few thousand dollars for one. First of all, he's got it backwards: one advantage consoles have always had over PCs is that everybody already owns a TV. Nobody buys consoles and TVs in pairs, yet that's exactly what you do with PCs. The fact is the monitor as an added expense applies much more to PCs than consoles. You buy a PC, you don't have a monitor (unless you already had one, which may often be the case nowadays, but you will still probably need to upgrade from CRT to LCD or something like that). You buy a console, you already have a TV. Historically.

Now there's a slight temporary monkey wrench this time because of HDTV. But it's completely false to state you must have an HDTV to enjoy the 360. Very false. It helps, yeah, but the machine is just fine on an SDTV (and still toasts previous consoles graphically at 640x480). Hell, look at something like Gears of War; that's going to blow away old consoles on SDTV. Unless you're of the opinion SDTV is useless for games, in which case there are about 100 million PS2s out there you just called useless.

The other thing is "thousands of dollars". Lie. I got a 27" 720p LCD for ~$500. No dead pixels and it is super nice. From Office Depot. You can get that deal all the time. Either Syntax or Westinghouse make great-quality, low-price LCDs. And hell, for $1000 you could easily get some big name brands, or better yet a 32" or 37" Westinghouse. Or for under $500, Best Buy and Wal-Mart carry big-name-brand (Toshiba, etc.) 30-32" CRT HDTVs all day long. I mean, I don't even think many Sonys, which are the most expensive around, top 2 grand anymore. So basically the "thousands" is a pretty blatant fabrication; in reality $500 is all that's required, tops.

Also, it's a TV, and you're going to buy an HDTV in the next five years (at least most Americans are) ANYWAY, whether you have an Xbox 360/PS3 or not. So it's really not part of the cost at all; it's something most people will be buying regardless.

He also neglects to mention that that 27", 32", or 37"+ TV is going to be hugely bigger and more impressive to play on than your puny PC monitor, which is likely 19-22". Your PC monitor will do higher res, but 37" is going to blow that away as an overall visual/audio experience every time.

Overall he makes some good points, and I generally agree with him, he just throws some biased misinformation at times.

RE: PS3's video chip?
By beepandbop on 10/5/2006 10:39:24 PM , Rating: 1
The PS3 and the PC have the right to be compared: Sony has set up the PS3 to be a "PC". Only thing is that PS3'ers are stuck with a proprietary mobo, last-gen graphics cards, last-gen crap. The PS3 is a crap deal considering that PCs already surpass it in every way. You're basically paying for a souped up Dell from a year ago with a DVD drive nobody cares about except people who hype it up and have no idea what they're talking about.
Congratulations, have a nice day on your way out.

RE: PS3's video chip?
By EclipsedAurora on 10/6/2006 5:35:18 AM , Rating: 2
>>>The PS3 is a crap deal considering that PCs already surpass it in every way. You're basically paying for a souped up Dell from a year ago with a DVD drive nobody cares about except people who hype it up and have no idea what they're talking about.

Before you say that, please prove:

1. Can any Intel or AMD x86 offering beat the Cell BE in floating-point performance?
2. Is your $500 PC fast enough for Quake 4 at 1280x1024 32-bit (around 1MP resolution)?
3. Can your $500 PC play back SACD and Blu-ray?
4. Does even HT 2.0 have more bandwidth than Rambus FlexIO?
5. A $500 PC still uses outdated DDR2, while the PS3 uses XDR.

Then it'll be time to say the PC is a crap deal.

RE: PS3's video chip?
By obeseotron on 10/5/2006 2:32:03 PM , Rating: 2
The PS3's graphics chip is basically a G70 based design as I recall. It was outdated before now.

RE: PS3's video chip?
By EclipsedAurora on 10/6/2006 5:40:00 AM , Rating: 2
It's normal for a console to have a slower GPU than a high-end PC.
It's nonsense to compare a $2000 high-end PC with consoles.
The performance of the PS3's GPU should come close to a 256MB 7800GTX, if programmers can utilize the RSX's Rambus XDR and FlexIO interfaces properly to overcome the reduced GDDR bandwidth.

RE: PS3's video chip?
By EclipsedAurora on 10/6/2006 5:47:04 AM , Rating: 2
One thing I wanna mention:

The 8-year-old PS2 is capable of running the well-known GT4 title at 1080i (1920x1080, 2MP resolution), while the best PC hardware announced at that time (TNT2 Ultra) even had performance trouble displaying 2D graphics at the same resolution.

The main problem is that PC developers stop development efforts on older systems, while in the console world, game developers are continuously pushing the limits of the console systems. It's another way to protect your investment, without the need to upgrade.

RE: PS3's video chip?
By Sharky974 on 10/6/2006 7:23:20 AM , Rating: 3
It isn't 1080i though; the PS2 isn't even capable of 1080i due to the small eDRAM.

GT is like 540x480 line-doubled or something stupid. I forget the exact tactic, but it's definitely highly "fake". You can bet the PS2 doesn't have anywhere near the power to do that game at 1920x1080.

Your point is still right (consoles get more out of fixed hardware), but that example is incorrect.

RE: PS3's video chip?
By EclipsedAurora on 10/6/2006 5:48:23 AM , Rating: 3
If you are keen on programming, you'll also notice how stupidly DirectX and Windows waste your bandwidth and processing power.

By Sunbird on 10/5/06, Rating: 0
RE: ...
By JoKeRr on 10/5/2006 12:27:35 PM , Rating: 2
Any word on the tranny count on G80 yet?? Is it really the rumored 700 million?

RE: ...
By vhx500 on 10/5/06, Rating: 0
RE: ...
By Arkham1 on 10/5/2006 1:32:49 PM , Rating: 5
But it certainly opens up some interesting connectivity options, port-wise...

RE: ...
By Lakku on 10/5/2006 1:43:57 PM , Rating: 2
You FTW my friend.

RE: ...
By therealnickdanger on 10/5/2006 1:57:05 PM , Rating: 2
LMAO! Flawless victory!

RE: ...
By R3MF on 10/5/2006 2:03:03 PM , Rating: 1
but very funny

RE: ...
By tdawg on 10/5/2006 2:31:07 PM , Rating: 1

RE: ...
By horatio777 on 10/5/2006 2:49:12 PM , Rating: 2
I have an auto-repair shop near my office with a sign that reads:

"Free Tranny Fluid Exchange with Oil Change"

I got a similar chuckle out of that.

Problem with article
By Chillin1248 on 10/6/2006 2:03:28 AM , Rating: 2
I quote this from a B3D member:

The DailyTech article completely misinterpreted the whole "NVIDIA and OpenEXR share fp16/32 pixel formats" issue (which has been true since NV30 and the launch of EXR). It may have been referenced again in the marketing materials on which these posts were based, presumably in the context of the new fp32 framebuffer blending (required for DX10...) However, this is just a matter of using a common binary representation for 16-bit and 32-bit floating-point image values. EXR, the file format, is a much more complex standard than the uncompressed fp16/32 pixel formats it specifies (and shares with NVIDIA).

DailyTech then went and read about the OpenEXR *software* library -- a C++ library for reading and writing EXR files, including RLE, wavelet, and other compressed disk-storage formats.

This is not at all related to G80.

Can we clear this up?


RE: Problem with article
By Chillin1248 on 10/6/2006 2:05:28 AM , Rating: 2
Another thing pointed out by another B3D member:

It should also be noted that DailyTech lists some rather unlikely numbers in terms of memory clocks and texture rate. They list the GTX as having 86GB/s of bandwidth, and the GTS only 64GB/s. However, they also claim both have 900MHz GDDR3, which is highly unlikely. It should be relatively clear that this was a simple typo, and that the GTS actually sports 800MHz GDDR3. As for texture rates, they give the number of 38.4GPixels/s for the GTX, but few logical ways exist to attain such a number with the specified clockrates. As such, it seems much more likely that this part of the specification relates to the GeForce 8800GTS and its 1200MHz stream processors.

Looking forward to a response from DailyTech on what's going on.
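The arithmetic behind that quoted objection is easy to check. Peak GDDR3 bandwidth is bus width (in bytes) times the effective transfer rate (double the memory clock, since GDDR3 is double data rate). A quick sketch — the 320-bit GTS bus width is an assumption taken from the reported specs, and the 800 MHz GTS clock is the correction the poster argues for:

```python
def peak_bandwidth_gb_s(bus_width_bits, mem_clock_mhz):
    """Peak memory bandwidth: bytes per transfer x transfers per second."""
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_sec = mem_clock_mhz * 2 * 1e6  # DDR: two transfers per clock
    return bytes_per_transfer * transfers_per_sec / 1e9

print(peak_bandwidth_gb_s(384, 900))  # 86.4 -> matches the quoted GTX figure
print(peak_bandwidth_gb_s(320, 800))  # 64.0 -> matches the quoted GTS figure
```

At 900 MHz the 320-bit GTS would be 72 GB/s, not the listed 64 GB/s, which is why an 800 MHz typo is the plausible explanation.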


RE: Problem with article
By KristopherKubicki on 10/6/2006 7:01:47 AM , Rating: 2
Unfortunately, until we have a way to test some of these metrics I can't really go into much more detail other than what was provided for us.

RE: Problem with article
By KristopherKubicki on 10/6/2006 6:49:59 AM , Rating: 2
Can we clear this up?

You'll have to bear with me a little here because some of this information is paraphrased from marketing materials, and we obviously would rather not have people tracking down the exact materials we are referencing.

The 128-bit implementation of HDR on the G80 is based on several technologies from OpenEXR, the binary storage method I'm sure being just one of them. This part was quoted near-verbatim from the included material we received, though as the B3D reader pointed out, you'll have to look at it in context and not as a "feature" of G80.

RE: Problem with article
By Chillin1248 on 10/6/2006 10:35:15 AM , Rating: 2
Thank you very much for responding; looking forward to further updates.

Yours truly,

RE: Problem with article
By skroh on 10/10/2006 5:38:58 PM , Rating: 2
Let me clear it up for you even more.

You're quoting the forums at Beyond3D. Those forums are noted for a membership that is rabidly, passionately, unabashedly pro-ATi/anti-nVidia.

That you are finding posts there that are trying to "debunk" the apparently fantastic specs of the G80 with nitpicks about OpenEXR and nakedly subjective speculation such as "doubtful" and "highly unlikely" should come as no surprise. This has been the tone of discourse on that board for a long time now.

Bringing their pro-ATi bigotry over here and using it to imply criticism of the competence and journalistic integrity of the DailyTech staff is just spreading the disease.

By skroh on 10/5/2006 12:01:17 PM , Rating: 2
Does this new improved HDR support HDR+AA like the ATi X1900 series? And does the card support angle-independent anisotropic filtering? I'd like to see nVidia use the power of this beast to catch up to ATi on the image-quality front.

By KristopherKubicki on 10/5/2006 12:04:55 PM , Rating: 2
Does this new improved HDR support HDR+AA like the ATi X1900 series

And does the card support angle-independent anisotropic filtering?

That I don't know yet.

By skroh on 10/5/2006 12:10:34 PM , Rating: 2
Excellent! Now we can at least experience apples-to-apples gameplay in Serious Sam 2 and Oblivion in that respect. I much prefer choosing my upgrades based on overall performance rather than having to pick and choose which features I can live without...

OpenEXR - what exactly is it?
By iwod on 10/5/2006 1:18:59 PM , Rating: 2
I think I am a little lost on what exactly OpenEXR does. It lists many uses in movie making, but how does it work from a gaming perspective?

RE: OpenEXR - what exactly is it?
By iwod on 10/5/2006 1:20:30 PM , Rating: 3
Ah, I found the answer myself on Wikipedia.

By Nightmare225 on 10/5/2006 4:03:48 PM , Rating: 2
This thing's amazing. I've honestly never seen such a leap in Graphics processing power since the original GeForce! :D

By Gigahertz19 on 10/5/2006 5:53:38 PM , Rating: 2
"This thing's amazing. I've honestly never seen such a leap in Graphics processing power since the original GeForce! :D "

WOAHHHHHHHH, slow down... specs don't mean anything. Wait until we see benchmarks done by AnandTech before you make a statement like that.

Pretty awesome
By 05SilverGT on 10/5/2006 4:11:21 PM , Rating: 2
This might be the card that puts my 6800GT out of service. My monitor is limited to 1280x1024, so it has kept me from going all out. However, Father Time is creeping up on my system.

RE: Pretty awesome
By Xavian on 10/5/2006 4:36:38 PM , Rating: 2
Indeed, I'm using a 6800GT PCIe myself and it's starting to show its age at 1600x1200. This is my next card for sure.

By Chillin1248 on 10/5/2006 12:36:03 PM , Rating: 2
What manufacturing process is the G80 based on? Did Nvidia release it on TSMC's 80nm, or are we still on TSMC's 90nm?


RE: Kristopher...
By Chillin1248 on 10/5/2006 12:45:19 PM , Rating: 1
P.S. - What drivers are you using with your card? Are you still on the ForceWare 90 series?


By Mudvillager on 10/5/2006 2:21:43 PM , Rating: 3
How many dB at idle and load?

Cool HDR... to disable
By Hare on 10/5/2006 5:27:57 PM , Rating: 3
This is great for some games, but most serious gamers turn off HDR when playing online. People want to see every corner of a dark room. No time for HDR when there are other players trying to shoot you :)

Yesterdays News?
By trabpukcip on 10/6/2006 9:32:54 AM , Rating: 3
... new GeForce FX and Quadro FX 3D graphics solutions.

Erm... Welcome to 2004?!

By Wwhat on 10/6/2006 12:48:14 PM , Rating: 3
All those bits (128-bit HDR, 32-bit color), yet most people moved to a TN-panel LCD, which doesn't even do 24-bit color.

Looking forward to it.
By JWalk on 10/5/2006 11:52:23 AM , Rating: 2
It will be hot and expensive, but honestly I can't wait to see what these new cards are capable of. :)

Actually Excited
By maevinj on 10/5/2006 11:54:41 AM , Rating: 2
Normally I'm not all about the hype of the new cards but this G80 sounds awesome. I might have to take out a loan to get it tho.

32x AA
By TheDoc9 on 10/5/2006 6:25:01 PM , Rating: 2
I wonder if SLI'ing these cards will produce high quality 32x AA. That would be one sick upgrade.

By Jethrow on 10/5/2006 9:55:09 PM , Rating: 2
Will this fit in my Asus A8N32-SLI Deluxe?
Or do I need a new motherboard?
Thank you much.

By gramboh on 10/5/2006 12:13:02 PM , Rating: 2
Yes, all 6xxx and 7xxx cards can do HDR; they just cannot do HDR+AA in all games, e.g. Oblivion.

By AmbroseAthan on 10/5/2006 2:37:17 PM , Rating: 2
Ahh, didn't know that.

It works in HL2, which is why I thought it might be different. I run at 1920x1200 with HDR and 4xAA. I had assumed the extra 256MB (over other 7900GTs) was what enabled the switch to let me run them together in HL2.

By Xavian on 10/5/2006 4:35:24 PM , Rating: 2
HL2 uses a different form of HDR, I believe; this is what allows HDR+AA on nVidia cards. Games like Oblivion and Far Cry use a different form of HDR, which currently cannot be combined with AA on nVidia cards because of their choice of FP precision (32-bit).

This new HDR support on nVidia cards, however, allows HDR+AA in all forms.


Copyright 2016 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki