

95 comment(s) - last by leidegre.. on Oct 16 at 4:42 AM

Gentlemen, start your DirectX10 engines

DailyTech received its first look at a GeForce 8800 production sample today, and by the looks of it, the card is a monster, at least with regard to size and power requirements.

The GeForce 8800 comes in two flavors, which we will cover in more detail over the next few days.  The first card, the GeForce 8800GTX, is the full-blown G80 experience, measuring a little less than 11 inches in length.  The GeForce 8800GTS is a cut-down version of the first, measuring only 9 inches in length.

The marketing material included with the card states that NVIDIA requires at least a 450W power supply for a single GeForce 8800GTX, and 400W for the 8800GTS.  Top-tier vendors in Taiwan have already confirmed with DailyTech that GeForce 8800 cards in SLI mode will likely carry a power supply "recommendation" of 800W.  NVIDIA's GeForce 7950GX2, currently the company's top-performing video card, carries a recommendation of 400W to run the card in single-card mode. 

NVIDIA is slated to launch both versions of the GeForce 8800 in November of this year.  More details on the GeForce 8800 will be available later today on DailyTech.

Update 10/05/2006: We originally reported that the GeForce 8800GTX and 8800GTS are 9" in length.  The reference design for the 8800GTX is actually a little less than 11 inches.  The GTX has two 6-pin power connectors; the GTS has only one.


Comments



They're going in reverse...
By Scrogneugneu on 10/5/2006 1:02:50 AM , Rating: 5
The whole industry is converging on power savings. CPUs now consume less power, we talk about solid-state hard drives that are faster but also much less energy-hungry, we want systems everywhere and we want them to be silent, we want laptops that won't burn the user if they sit on a lap...

But here they are, with a monster requiring an incredible amount of power to run. And it can be used in dual mode. What are they thinking? Performance is not the only factor. I want a system that is almost silent, so I can still hear what happens around me. Water cooling? Yeah... as if it's good news that it's now required if you want to keep the card cool enough.

It's really a shame. I won't be looking at these. They're like Prescotts, but with a higher price tag and an integrated water cooling mechanism. Would someone like to test that kind of system? An overclocked Prescott with 2 of those beasts inside. Just watch the heat / electricity bill jump.

The performance has to be astonishing compared to the competition, given the drawbacks. If it's not, then I guess NVIDIA won't make much money this round with the ultra high-end...




RE: They're going in reverse...
By Ringold on 10/5/06, Rating: -1
RE: They're going in reverse...
By Dactyl on 10/5/2006 1:52:28 AM , Rating: 1
Gamers who only run 1 computer/1 graphics card don't care about performance per watt.

If I had to run 100 computers, I would care about performance per watt.

Therefore: gamers won't care about performance per watt until NVidia releases an SLI system with 100 GPUs.


RE: They're going in reverse...
By Jkm3141 on 10/9/2006 8:42:04 PM , Rating: 2
And 80% of people in car accidents ate carrots in the week preceding the accident, so therefore carrots must cause car accidents? No. Gamers will never care that much about performance per watt. The places where it actually matters are businesses where there are hundreds of computers (hence a hundred times the power consumption of one computer), or servers that have to be online 24/7, stay stable, and cannot overheat. Believe it or not, power does cost money; granted, one computer doesn't cost much, but it adds up when you have a lot of computers. To the average gamer, though, performance per watt is a waste of time. People go on a rant about the performance per watt of their CPU or GPU, but then proceed to put 2 or 4 GPUs in their computers. Way to go.
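To put the "power does cost money, but it adds up" point in numbers, here's a rough back-of-envelope sketch. The wattages, usage hours, and the 10¢/kWh rate are illustrative assumptions, not measured figures:

```python
def yearly_cost(watts, hours_per_day, cents_per_kwh=10.0, days=365):
    """Yearly electricity cost in dollars for a constant load."""
    kwh = watts / 1000.0 * hours_per_day * days  # energy used per year
    return kwh * cents_per_kwh / 100.0           # convert cents to dollars

# One gamer's PC drawing ~400 W for 4 hours a day: roughly $58/year.
single = yearly_cost(400, 4)

# A business running 100 such machines around the clock: roughly $35,000/year.
fleet = 100 * yearly_cost(400, 24)

print(f"single PC: ${single:.0f}/yr, 100-machine fleet: ${fleet:.0f}/yr")
```

Which is the commenter's point exactly: one rig is pocket change, a fleet of them is a real line item on the power bill.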


RE: They're going in reverse...
By Pirks on 10/5/06, Rating: -1
RE: They're going in reverse...
By Scrogneugneu on 10/6/2006 1:26:33 AM , Rating: 2
quote:
there are still people with brains AND money and they're with you!


Should have been

quote:
there are still people with dead brains AND too much money and they're with you!


RE: They're going in reverse...
By Pirks on 10/6/2006 2:24:30 PM , Rating: 1
Envy is a bad thing - if you don't have the money to afford a G80, go cry and call your mommy or something - crying here just makes other people annoyed. Thank you.


RE: They're going in reverse...
By Kim Leo on 10/11/2006 6:03:38 AM , Rating: 2
Hmm, OK Pirks, so you are the smart one with money? Wow, well, I'm convinced. You have proven you are "sooo smart" ;)..

Not everybody lives at home without having to pay their electricity bill themselves. I like performance per watt, and I like the idea of my system being quiet when I'm not using it (C&Q), and even if I did win the lottery or got a job that paid good money, I would still go out and buy an energy-efficient and smart system.


RE: They're going in reverse...
By Pirks on 10/11/2006 2:42:56 PM , Rating: 1
The problem is not with you, Kim, or anyone who likes SFF stuff that is quiet and slow (slow compared to high end, of course) - the problem is with the downmodding el cheapos here who think that if the G80 was not designed for THEM, cheap SFF lovers, then they have the right to downmod ANYONE who even THINKS about SFF being not The Best Thing Since Sliced Bread. I just laugh at those idiots.

Those who own SFF and also KNOW that the G80 is not for them at all, because IT WAS NOT DESIGNED for SFF, absolutely - these guys are normal and don't downmod people who say the G80 is just that - a pilot DX10 card designed for high-end 1337 boxes, just this and nothing more.

See, Kim, it's not me being smart and rich (which is not true btw ;) - it's those clowns being cheap and dumb as wood. They can downmod me and Ringold all they want, I don't care - this, unfortunately, won't help their poor little brains understand some important things about the G80 that I and Ringold were talking about. They're such sad people, just leave the poor souls alone ;)


RE: They're going in reverse...
By VooDooAddict on 10/6/2006 11:18:11 AM , Rating: 2
Performance per watt matters for mobile computing and data center implementations. In the data center, more efficient CPUs can not only decrease your power bill but also increase the effectiveness of your UPS and AC implementations.

It should not be a concern for gamers. The "massive" power draw will also mostly occur while gaming. I'm sure the card won't need to draw as much power to render Vista's Aero Glass. For most gamers, this won't be an issue. The X1900 series already produces a boatload of heat and draws significantly more power than its competitor... this hasn't stopped it from being a performance favorite of so many.

This isn't to say that I'm not slightly disappointed by the power supply requirements. I build easy-to-transport LAN party rigs. I've been going with NVIDIA for "customers" over the past 9-18 months due to the quieter stock coolers and lower heat output. This new card is going to be problematic for the SFF/mATX crowd. While you can drop a massive new power supply into a Q-Pack or an UltraFly, many SFF (Shuttle-like) cases are still stuck with smaller form factor power supplies. I may have to wait for a more efficient "8900" series. I guess I'll just have to wait and see what the performance is like... I may have to put SFF aside for a few months and stick with overloaded Q-Packs.


RE: They're going in reverse...
By Pirks on 10/6/2006 2:28:06 PM , Rating: 1
That's OK, since the G80 is just a pilot generation of DX10 hardware and will get much cooler (in all senses of the word) in the next generations, so you'll get your SFF back eventually. It's just the moronic downmodding clowns who don't understand some obvious things - but we're talking about normal people here, now aren't we?


RE: They're going in reverse...
By mindless1 on 10/5/2006 2:03:34 AM , Rating: 2
What are they thinking? It's pretty obvious - those who want lower-powered video cards already have several choices, but those wanting the utmost performance will be on a never-ending buying spree.

It's great that you recognize your needs and don't waste power or money on an overkill video card, and most OTHER people don't either - remember that Intel integrated video IS the most used video solution in PCs.

So it is a bit ridiculous how much power they're using, but they'll also devalue lesser cards, which is good for consumers.


RE: They're going in reverse...
By Pirks on 10/5/2006 9:45:59 AM , Rating: 2
Exactly - this is the most sensible business strategy, and everyone follows it: reap the most money off the first-gen cool technology, then improve and refine it and release polished mass-market versions later, once all the early adopters have paid $$$$$$ for the first-gen hardware. This is what we see here.

Unfortunately, I'd bet the DX10 cards when released will set a new record in price - they will actually be marketed to an "over the ultra" segment, you know, people who want even more than the latest 7900 quad SLI, so this will obviously create a new price level. I predict street/eBay prices around $700/$800 at launch, for the very first buyers, maybe even more, but then quickly declining. So you're wrong that they will push mass-market prices down and "devalue" them. No they won't, not in THIS generation, but in the next they WILL (when power requirements are back to normal after a die shrink) - this is just how the market works.













Copyright 2014 DailyTech LLC.