
Gentlemen, start your DirectX10 engines

DailyTech received its first look at a GeForce 8800 production sample today, and by the looks of it, the card is a monster: at least with regard to size and power requirements.

The GeForce 8800 comes in two flavors, which we will cover in more detail over the next few days.  The first card, the GeForce 8800GTX, is the full-blown G80 experience, measuring a little less than 11 inches in length.  The GeForce 8800GTS is a cut-down version of the GTX, measuring only 9 inches in length.

The marketing material included with the card states that NVIDIA requires at least a 450W power supply for a single GeForce 8800GTX, and 400W for the 8800GTS.  Top-tier vendors in Taiwan have already confirmed with DailyTech that GeForce 8800 cards in SLI mode will likely carry a power supply "recommendation" of 800W.  NVIDIA's GeForce 7950GX2, currently the company's top-performing video card, carries a recommendation of 400W to run in single-card mode.
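For readers sizing a power supply, the check is simple arithmetic. The sketch below is for illustration only: the recommended wattages are the figures quoted above, while everything else (the hypothetical meets_recommendation helper and the 600W example PSU) is our own assumption, not NVIDIA guidance.

# Rough PSU sizing check. The recommended wattages are the figures quoted above;
# the rest is an illustrative assumption, not an NVIDIA tool or formula.

RECOMMENDED_PSU_W = {
    "GeForce 8800GTS": 400,
    "GeForce 8800GTX": 450,
    "GeForce 8800GTX SLI": 800,  # vendor-reported recommendation
}

def meets_recommendation(psu_rating_w: int, config: str) -> bool:
    """True if a PSU of the given rating meets NVIDIA's recommendation."""
    return psu_rating_w >= RECOMMENDED_PSU_W[config]

for config, watts in RECOMMENDED_PSU_W.items():
    verdict = "OK" if meets_recommendation(600, config) else "below the recommendation"
    print(f"600W PSU vs {config} ({watts}W recommended): {verdict}")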

NVIDIA is slated to launch both versions of the GeForce 8800 in November of this year.  More details on the GeForce 8800 will be available later today on DailyTech.

Update 10/05/2006: We originally reported that the GeForce 8800GTX and 8800GTS are 9" in length.  The reference design for the 8800GTX is actually a little less than 11 inches.  The GTX has two 6-pin power connectors; the GTS has only one.


Comments

RE: They're going in reverse...
By Ringold on 10/5/2006 1:39:52 AM , Rating: -1
I don't understand the 'performance per watt' craze that seemed to start, I *think*, when Intel began hinting it was about to crush AMD with it.

Call me old school, but I honestly couldn't care less. Technology advances, and actually I'm pleased that my two-year-old Seasonic S-12 600W is not only plenty powerful enough, but still a relatively up-to-date model! Darn quiet, too.

Again, maybe I'm old school, but with it established that my 600W PSU (I'll be building a whole new Vista system, but with a comparable PSU) will be powerful enough to supply the juice, all I care about now are three things: performance over the last generation (one has to be nuts to think there will be none), overclocking, and noise. Everything else is a cost of doing business. Water cooling? So what? I've already got a nearly silent water cooling loop as it is, with CPU and GPU both on the loop. My next system, OH MY, I actually plan to slap blocks on the RAM and motherboard chipset if it needs it! I must be CRAAAAZZYY!! Besides, the average joe won't hear these things; the dust-buster FX cards stole all our hearing years ago.

So what does this leave us with? A new computer that, like every other significant upgrade cycle in computer history, actually requires new parts here and there on the low end. More performance than the last generation at comparable price points to when last-generation parts were first introduced, maybe accounting for some inflation. So what if one part runs hotter while the rest runs cooler? So what if it's the equivalent of one extra conventional lightbulb?

Seriously now people, if you can't afford an extra lightbulb, what the hell are you doing buying an 8800GTX anyway?! Spend that money on some college courses so you can afford more lightbulbs, please!

Besides, just twisting back the water heater's temp dial the tiniest fraction would save tremendously more power than the extra an 8800GTS draws over a 7800GT, I'd bet.

Performance per watt has its place when talking about mobile solutions, or HTPC ones, but none at all on enthusiast-class gaming video cards. My opinion, anyway.


RE: They're going in reverse...
By Dactyl on 10/5/2006 1:52:28 AM , Rating: 1
Gamers who only run 1 computer/1 graphics card don't care about performance per watt.

If I had to run 100 computers, I would care about performance per watt.

Therefore: gamers won't care about performance per watt until NVidia releases an SLI system with 100 GPUs.


RE: They're going in reverse...
By Jkm3141 on 10/9/2006 8:42:04 PM , Rating: 2
And 80% of people in car accidents ate carrots in the week preceding the accident, so therefore carrots must cause car accidents? No. Gamers will never care that much about performance per watt. The places where it actually matters are in businesses where there are hundreds of computers (hence a hundred times the power consumption of one computer), or servers that have to be online 24/7, stay stable, and cannot overheat. Believe it or not, power does cost money; granted, one computer doesn't cost much, but it adds up when you have a lot of computers. To the average gamer, though, performance per watt is a waste of people's time. People go on a rant about the performance per watt of their CPU or GPU, but then proceed to put 2 or 4 GPUs in their computers. Way to go.
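To put a rough number on how it adds up (the 100W delta per machine, 8 hours/day of use, and $0.10/kWh rate below are all assumptions for illustration, not measured figures):

# Back-of-the-envelope electricity cost. All inputs are illustrative
# assumptions: 100W of extra draw, 8 hours/day of use, $0.10 per kWh.

def annual_cost_usd(extra_watts: float, hours_per_day: float = 8.0,
                    rate_per_kwh: float = 0.10, days: int = 365) -> float:
    """Extra yearly electricity cost for `extra_watts` of additional draw."""
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * days
    return kwh_per_year * rate_per_kwh

one_pc = annual_cost_usd(100)       # one gaming PC: about $29/year
hundred_pcs = 100 * one_pc          # a lab of 100 machines: about $2,900/year
print(f"1 machine: ${one_pc:.2f}/yr, 100 machines: ${hundred_pcs:.2f}/yr")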


RE: They're going in reverse...
By Pirks on 10/5/06, Rating: -1
RE: They're going in reverse...
By Scrogneugneu on 10/6/2006 1:26:33 AM , Rating: 2
quote:
there are still people with brains AND money and they're with you!


Should have been

quote:
there are still people with dead brains AND too much money and they're with you!


RE: They're going in reverse...
By Pirks on 10/6/2006 2:24:30 PM , Rating: 1
Envy is a bad thing - if you don't have the money to afford a G80, go cry and call your mommy or something - crying here just makes other people annoyed. Thank you.


RE: They're going in reverse...
By Kim Leo on 10/11/2006 6:03:38 AM , Rating: 2
Hmm, OK Pirks, so you are the smart one with money? Wow, well, I'm convinced. You have proven you are "sooo smart" ;)..

Not everybody lives at home and gets to skip paying their own electricity bill. I like performance per watt, and I like the idea of my system being quiet when I'm not using it (C&Q), and even if I did win the lottery or got a job that paid good money, I would still go out and buy an energy-efficient, smart system.


RE: They're going in reverse...
By Pirks on 10/11/2006 2:42:56 PM , Rating: 1
The problem is not with you, Kim, or anyone who likes SFF stuff that is quiet and slow (slow compared to high end, of course) - the problem is with the downmodding el cheapos here who think that if the G80 was not designed for THEM, the cheap SFF lovers, then they have the right to downmod ANYONE who even THINKS that SFF is not The Best Thing Since Sliced Bread. I just laugh at those idiots.

Those who own SFF and also KNOW that the G80 is not for them at all, because IT WAS NOT DESIGNED for SFF - these guys are normal and don't downmod people who say the G80 is just that: a pilot DX10 card designed for high-end 1337 boxes, just this and nothing more.

See, Kim, it's not about me being smart and rich (which is not true btw ;) - it's about those clowns being cheap and dumb as wood. They can downmod me and Ringold all they want, I don't care - this, unfortunately, won't help their poor little brains understand some important things about the G80 that Ringold and I were talking about. They're such sad people; just leave the poor souls alone ;)


RE: They're going in reverse...
By VooDooAddict on 10/6/2006 11:18:11 AM , Rating: 2
Performance per watt matters for mobile computing and data center implementations. In the data center, more efficient CPUs can not only decrease your power bill but also increase the effectiveness of your UPS and AC implementations.

It should not be a concern for gamers. The "massive" power draw will only occur while gaming; I'm sure the card won't need to draw as much power to render Vista's Aero Glass. For most gamers, this won't be an issue. The X1900 series already produces a boatload of heat and draws significantly more power than its competitor ... this hasn't stopped it from being a performance favorite for so many.

This isn't to say that I'm not slightly disappointed by the power supply requirements. I build easy-to-transport LAN party rigs. I've been going with NVIDIA for "customers" over the past 9-18 months due to the quieter stock coolers and lower heat output. This new card is going to be problematic for the SFF/mATX crowd. While you can drop a massive new power supply in a Q-Pack or an UltraFly, many SFF (Shuttle-like) cases are still stuck with smaller form factor power supplies. I may have to wait for a more efficient "8900" series. I guess I'll just have to wait and see what the performance is like... I may have to put SFF aside for a few months and stick with overloaded Q-Packs.


RE: They're going in reverse...
By Pirks on 10/6/2006 2:28:06 PM , Rating: 1
That's OK - since the G80 is just a pilot generation of DX10 hardware, it will get much cooler (in all senses of the word) in the next generations, so you'll get your SFF back eventually. It's just the moronic downmodding clowns that don't understand some obvious things, but we're talking about normal people here, aren't we?


"There's no chance that the iPhone is going to get any significant market share. No chance." -- Microsoft CEO Steve Ballmer











botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki