
Gentlemen, start your DirectX10 engines

DailyTech received its first look at a GeForce 8800 production sample today, and by the looks of it, the card is a monster, at least with regard to size and power requirements.

The GeForce 8800 comes in two flavors, which we will cover in more detail over the next few days.  The first card, the GeForce 8800GTX, is the full-blown G80 experience, measuring a little less than 11 inches in length.  The GeForce 8800GTS is a cut-down version of the first, and measures only 9 inches in length.

The marketing material included with the card claims NVIDIA requires at least a 450W power supply for a single GeForce 8800GTX, and 400W for the 8800GTS.  Top-tier vendors in Taiwan have already confirmed with DailyTech that GeForce 8800 cards in SLI mode will likely carry a power supply "recommendation" of 800W.  NVIDIA's GeForce 7950GX2, currently the company's top-performing video card, carries a recommendation of 400W to run the card in single-card mode.

NVIDIA is slated to launch both versions of the GeForce 8800 in November of this year.  More details on the GeForce 8800 will be available later today on DailyTech.

Update 10/05/2006: We originally reported the GeForce 8800GTX and 8800GTS are 9" in length.  The reference design for the 8800GTX is actually a little less than 11 inches.  The GTX has two 6-pin power adaptors, the GTS has only one.



They're going in reverse...
By Scrogneugneu on 10/5/2006 1:02:50 AM , Rating: 5
The whole industry is converging on power savings. CPUs now consume less power, we talk about solid-state hard drives for faster but also much less energy-hungry storage, we want systems everywhere and we want them to be silent, we want laptops that won't burn the user when they sit on a lap...

But here they are, with a monster requiring an incredible amount of power to run. And it can be used in dual-card mode. What are they thinking? Performance is not the only factor. I want a system which will be almost silent, so I can still hear what happens around me. Water cooling? Yeah... as if it's good news that it's now required if you want to keep the card cool enough.

It's really a shame. I won't be looking at these. They're like Prescotts, but with a higher price tag and an integrated water cooling mechanism. Would someone like to test that kind of system? An OC'd Prescott with 2 of those beasts inside. Just watch the heat / electricity bill jump.

The performance has to be astonishing compared to the competition, given the drawbacks. If it's not, then I guess NVidia won't make much money this round with the ultra high-end...




RE: They're going in reverse...
By Ringold on 10/5/06, Rating: -1
RE: They're going in reverse...
By Dactyl on 10/5/2006 1:52:28 AM , Rating: 1
Gamers who only run 1 computer/1 graphics card don't care about performance per watt.

If I had to run 100 computers, I would care about performance per watt.

Therefore: gamers won't care about performance per watt until NVidia releases an SLI system with 100 GPUs.


RE: They're going in reverse...
By Jkm3141 on 10/9/2006 8:42:04 PM , Rating: 2
and 80% of people in car accidents ate carrots in the week preceding the accident, so therefore carrots must cause car accidents? No. Gamers will never care that much about performance per watt. The places where it actually matters are businesses with hundreds of computers (hence a hundred times the power consumption of one computer) or servers that have to be online 24/7, stay stable, and cannot overheat. Believe it or not, power does cost money; granted, one computer doesn't cost much, but it adds up when you have a lot of computers. To the average gamer, though, performance per watt is a waste of time. People go on a rant about the performance per watt of their CPU or GPU, but then proceed to put 2 or 4 GPUs in their computers. Way to go.


RE: They're going in reverse...
By Pirks on 10/5/06, Rating: -1
RE: They're going in reverse...
By Scrogneugneu on 10/6/2006 1:26:33 AM , Rating: 2
quote:
there are still people with brains AND money and they're with you!


Should have been

quote:
there are still people with dead brains AND too much money and they're with you!


RE: They're going in reverse...
By Pirks on 10/6/2006 2:24:30 PM , Rating: 1
envy is a bad thing - if you don't have money to afford G80 - go cry and call your mommy or something - crying here just makes other people annoyed. thank you.


RE: They're going in reverse...
By Kim Leo on 10/11/2006 6:03:38 AM , Rating: 2
hmm ok pirks, so you are the smart one with money? wow, well I'm convinced, you have proven you are "sooo smart" ;)..

not everybody lives at home and gets to skip paying their own electricity bill.. I like performance per watt, and I like the idea of my system being quiet when I'm not using it (C&Q), and even if I did win the lottery or got a job that paid good money, I would still go out and buy an energy-efficient and smart system.


RE: They're going in reverse...
By Pirks on 10/11/2006 2:42:56 PM , Rating: 1
the problem is not with you Kim or anyone who likes SFF stuff which is quiet and slow (slow compared to high end of course) - the problem is with the downmodding el cheapos here who think that if the G80 was not designed for THEM, the cheap SFF lovers, then they have the right to downmod ANYONE who even THINKS that SFF is not The Best Thing Since Sliced Bread. I just laugh at those idiots.

those who own SFF and also KNOW that the G80 is not for them at all, because IT WAS NOT DESIGNED for SFF - these guys are normal and don't downmod people who say the G80 is just that: a pilot DX10 card designed for high-end 1337 boxes, just this and nothing more.

see, Kim, it's not about me being smart and rich (which is not true btw ;) it's about those clowns being cheap and dumb as wood. they can downmod me and Ringold all they want, I don't care - this, unfortunately, won't help their poor little brains understand some important things about the G80 that Ringold and I were talking about. they're such sad people, just leave the poor souls alone ;)


RE: They're going in reverse...
By VooDooAddict on 10/6/2006 11:18:11 AM , Rating: 2
Performance per watt matters for mobile computing and data center implementations. In the data center, more efficient CPUs can not only decrease your power bill but also increase the effectiveness of your UPS and AC implementations.

It should not be a concern for gamers. The "massive" power draw will also only occur while gaming; I'm sure the card won't need to draw as much power to render Vista's Aero Glass. For most gamers, this won't be an issue. The X1900 series already produces a boatload of heat and draws significantly more power than its competitor... this hasn't stopped it from being a performance favorite of so many.

This isn't to say that I'm not slightly disappointed by the power supply requirements. I build easy-to-transport LAN party rigs. I've been going with NVIDIA for "customers" over the past 9-18 months due to the quieter stock coolers and lower heat output. This new card is going to be problematic for the SFF/mATX crowd. While you can drop a massive new power supply into a Q-Pack or an UltraFly, many SFF (Shuttle-like) cases are still stuck with smaller form factor power supplies. I may have to wait for a more efficient "8900" series. I guess I'll just have to wait and see what the performance is like... I may have to put SFF aside for a few months and stick with overloaded Q-Packs.


RE: They're going in reverse...
By Pirks on 10/6/2006 2:28:06 PM , Rating: 1
that's ok, since the G80 is just a pilot generation of DX10 hardware and will get much cooler (in all senses of the word) in the next generations, so you'll get your SFF back eventually. it's just the moronic downmodding clowns that don't understand some obvious things, but we're talking about normal people now, aren't we?


RE: They're going in reverse...
By mindless1 on 10/5/2006 2:03:34 AM , Rating: 2
What are they thinking? It's pretty obvious: those who want lower-powered video cards already have several choices, but those wanting the utmost performance will be on a never-ending buying spree.

It's great that you recognize your needs and don't waste power or money on an overkill video card, and most OTHER people don't either - remember that Intel integrated video IS the most used video solution in PCs.

So it is a bit ridiculous how much power they're using, but they'll also devalue lesser cards, which is good for consumers.


RE: They're going in reverse...
By Pirks on 10/5/2006 9:45:59 AM , Rating: 2
exactly, this is the most sensible business strategy and everyone follows it - reap the most money off the first-gen cool technology, then improve it, refine it, and release polished mass-market versions later, once all the early adopters have paid $$$$$$ for the first-gen hardware - this is what we see here. unfortunately, I'd bet the DX10 cards will set a new price record when released - they will actually market them as an "over the ultra" segment, you know, for people who want even more than the latest 7900 quad SLI, so this will obviously create a new price level - I predict street/eBay prices around $700/$800 at launch for the very first buyers, maybe even more, but then quickly declining. so you're wrong that they will push mass-market prices down and "devalue" them. no they won't, not in THIS generation, but in the next they WILL (when power requirements are back to normal after a die shrink) - this is just how the market works.


Is this where we're going?
By keitaro on 10/5/2006 2:29:24 AM , Rating: 3
It's already been mentioned how power-hungry this bad boy will be. But is this really the direction we're going for 3D performance?

We all know that as technology advances, the process of producing these chips becomes cheaper (due to better fabrication processes) and the products themselves become smaller. A chip will also generate less heat and consume less energy if it is tweaked to be more efficient. This is all commonly true in most respects. Yet I cannot help but wonder... is it really true here?

In the Pentium era, the most we ever needed was probably a 100W(??) power supply. We're now in an era where we need 300W to 500W just to power some of our stuff. The video cards keep getting more and more power hungry. But are they getting any more efficient at using that power?

Intel's move to be more energy efficient is long overdue, and their Core 2 designs show that they have succeeded in making the processor more powerful without the need to be power hungry or dissipate a lot of heat. I'm sure both NVIDIA and ATI do not have the luxury of doing that. But somehow I am starting to think that both need to look at how their products are designed and what they are aiming for.

I'm all for more 3D power as long as it isn't something requiring some insane configuration. But am I wrong to think that this is getting out of hand? It's understandable that there's an extreme market for high-end products. But what about those who have the money to spend but also spend it wisely on which high-end products to buy?

My concern with this and possibly all future 3D card products is how much more power we are going to need. I still feel it's crazy to have something like a 1-kilowatt power supply readily available at retail and e-tail. And I question whether this is the direction computers should be heading.

The power requirement and power consumption of the G80 only make me believe that somehow, in some form or another, console systems (i.e. Xbox 360, Wii, and PS3) are a better investment for gaming.




RE: Is this where we're going?
By kelmon on 10/5/2006 3:26:05 AM , Rating: 2
I'm in total agreement. The graphics card industry has been nuts for the last few years and the problem is only getting worse.

Wake me when the power available from one of these cards comes in a form that doesn't require its own power station and can be used in a laptop or something.


RE: Is this where we're going?
By Wwhat on 10/5/06, Rating: -1
RE: Is this where we're going?
By Googer on 10/5/2006 9:02:11 AM , Rating: 2
I think you may be wrong.

The last thing I read about Vista is that the Looking Glass Aero interface will be using DX9 to render the user interface.


RE: Is this where we're going?
By Pirks on 10/5/06, Rating: -1
RE: Is this where we're going?
By Wwhat on 10/5/06, Rating: -1
RE: Is this where we're going?
By Pirks on 10/5/06, Rating: -1
RE: Is this where we're going?
By Wwhat on 10/5/06, Rating: -1
RE: Is this where we're going?
By nilepez on 10/5/2006 9:36:39 PM , Rating: 2
quote:
And yes once you have DX10 games you need a DX10 gfx card to play them, ie it becomes a requirement in a sense, and DX10 will only be available on vista.


Unfortunately, this statement is also incorrect. Crysis, for example, is a DX10 game. Just because a game has DX10 features doesn't mean it can't run on DX9 hardware or that it's incompatible with DX9.

It's unlikely that DX10 will be required for ANY game for at least 2 years. Game companies aren't in the business of cutting out 90% or more of their potential customers.


RE: Is this where we're going?
By Pirks on 10/6/06, Rating: -1
RE: Is this where we're going?
By Wwhat on 10/8/2006 12:45:32 PM , Rating: 1
Dummies, I pluralised! :)
btw if a DX10 game uses the geometry shader for NURBS or something, good luck getting that to run on a DX9 card.


RE: Is this where we're going?
By Pirks on 10/10/2006 2:41:01 PM , Rating: 1
and if you remove all these "if"s from your statements there won't be much left :))) not a sign of "Vista requires blahblah"

just try to remove all those "if"s and enjoy the result ;)


RE: Is this where we're going?
By Wwhat on 10/5/2006 1:33:59 PM , Rating: 3
Oh and no private individual likes to waste money, and no sensible person wants a computer that uses up 1000 Watts and gets hotter than a heater.


RE: Is this where we're going?
By Pirks on 10/5/06, Rating: -1
RE: Is this where we're going?
By Wwhat on 10/5/2006 5:04:07 PM , Rating: 3
I've found that the cheapest, most stingy people are also the richest; if you haven't, then it must be your daddy who has the money, or you are some hip-hop star.


RE: Is this where we're going?
By Pirks on 10/5/06, Rating: -1
RE: Is this where we're going?
By MrFluffo on 10/5/2006 4:01:35 PM , Rating: 3
I like to waste money.... on stuff I want.


RE: Is this where we're going?
By Wwhat on 10/8/2006 12:46:48 PM , Rating: 1
If you want it, is it a waste then? I sense self-respect issues possibly O_o


RE: Is this where we're going?
By del on 10/6/2006 5:01:37 PM , Rating: 2
Yeah, I'm in total agreement too. I'm not going to use an SLI configuration for the GeForce 8800 GTX... haha... I'm just going to go with the single-card solution. Not to mention that there aren't many PC games being made these days; using a console is a better option. Nevertheless, the PC will always be the ultimate gaming platform in terms of performance.


Yes, and Winter's almost here
By rupaniii on 10/5/2006 1:01:18 AM , Rating: 4
This is great news. For $300, I can buy a great 1300-watt PSU and then spend the money I would have spent on a new heating system on these cards in SLI mode. I'm hoping they have a quad configuration as well, because I need to exhaust the entire side of the case directly into my AC ducts, I need the extra fan power to push it upstairs, and I want it to be as warm and toasty as possible. I may just have to fit all silent heatsinks, set up a warning/monitoring system to generate EVEN MORE HEAT, and then use an external exhaust fan to whiff it up into the house.

I am very happy that such technology can provide true comfort for the family room upstairs, where we will stream video for family visits.
In the basement, where this rig will be, I may have to wear shorts to keep from sweating too much, but it will be worth it.

The Pentium 4 Extreme Edition doesn't, by itself, generate enough heat to warm more than an entire basement, but Intel technology together with NVIDIA pushes the thermal envelope to new heights, and I thank them, and my cat thanks them.




RE: Yes, and Winter's almost here
By nerdye on 10/5/06, Rating: 0
RE: Yes, and Winter's almost here
By Furen on 10/5/2006 1:42:28 AM , Rating: 1
You might wanna get a 4-way Paxville setup with that, lol.


RE: Yes, and Winter's almost here
By GNStudios on 10/5/06, Rating: 0
By Darth Farter on 10/5/2006 4:42:14 AM , Rating: 3
"and my cat thanks them" that all just made my day LMFAO!!!


RE: Yes, and Winter's almost here
By shamgar03 on 10/5/06, Rating: 0
RE: Yes, and Winter's almost here
By protosv on 10/5/06, Rating: 0
RE: Yes, and Winter's almost here
By Trisped on 10/5/06, Rating: 0
RE: Yes, and Winter's almost here
By vanka on 10/5/2006 2:30:25 PM , Rating: 2
Since you have decided to convert your PC into a heating appliance, I have a suggestion for you. You said that the P4EE doesn't generate enough heat to be effective as a house-wide heating unit, therefore making the purchase of two 8800GTXs necessary. Here's a suggestion: sell the P4EE on eBay and get a Pentium D 805, then clock it to about 4 GHz. This should solve your insufficient heat generation issue; note that water cooling may be necessary to keep the CPU stable and the 1300-watt PSU may not be enough. With the money you save on the 805, you will be able to get an AC unit.


RE: Yes, and Winter's almost here
By Arc4173c7 on 10/6/2006 12:40:56 PM , Rating: 2
Laughing Out Loud

I totally agree with you, winter is coming along with this superb G80 and its super-duper heat, so forget about paying for coal or gas... a new era of heating devices is here.

What's gonna be next? Well, I'm really looking forward to DirectX 11 and quad or more G90s with a power consumption of about 10kW... graphics even based on 65nm!

Well, there might be separate graphics card cases in the future: while computers are getting smaller, graphics cards keep increasing their performance and their size too. If that's the rule, then I don't want to see them in the distant future. :)


RE: Yes, and Winter's almost here
By carl0ski on 10/10/2006 7:31:17 PM , Rating: 2
This was the philosophy I used choosing an Athlon XP over a Pentium 4 years back, with all the talk about how ridiculously hot Athlons ran. And it was winter.

You have no idea how extremely disappointed I was when, to my shock, it ran extremely cool compared to my mate's Pentium 4: sub-40C, while his Pentium 4 ran a steady 60C :(


The rumour is just an example of how people stereotype a company.


Interesting facts
By Chillin1248 on 10/5/2006 12:58:27 AM , Rating: 3
It seems that the rumor of 32 pixel, 16 texture and 16 geometry shaders may be false. According to leaked tech documents on vr-zone.com (taken down shortly after), it appears that the card has unified shaders, is 384-bit, and comes in watercooling/fan hybrid and fan-only versions.

What is ironic is that the supposed pictures of the G80 that surfaced on the web recently do show a 384-bit memory configuration and a watercooling/fan hybrid version... So could the third part of the document stating USA (Unified Shader Architecture) be true?

-------
Chillin




RE: Interesting facts
By KristopherKubicki (blog) on 10/5/2006 12:59:43 AM , Rating: 5
It is unified. We'll post the details of the architecture later today when the DT writers recover :-P


RE: Interesting facts
By Chillin1248 on 10/5/2006 1:03:39 AM , Rating: 3
Yeah just noticed it, nearly had a heart attack. Keep up the good work Kris!

-------
Chillin


RE: Interesting facts
By tuteja1986 on 10/5/2006 1:43:42 AM , Rating: 2
I already solved my power issue... I have 2 PSUs:

1st PSU for mobo, optical drive, HDD
2nd PSU for GPU, case fans

problem solved... Crossfire R600 and SLI G80 ready.


Amperage?
By theslug on 10/5/2006 7:58:40 AM , Rating: 2
More important than just the total wattage: how many amps are needed on the 12V lines to power this thing?




RE: Amperage?
By shamgar03 on 10/5/2006 8:44:28 AM , Rating: 3
Probably 1/12th the power draw...


RE: Amperage?
By theslug on 10/5/2006 9:21:54 AM , Rating: 2
Guess I wasn't clear.. I was looking for an actual number, like the way some cards require 25 amps, some 30, etc.



RE: Amperage?
By sadffffff on 10/5/2006 11:13:15 AM , Rating: 4
LOL you missed that one

P=V*I
so
I=P/V...
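
For anyone who wants to plug in numbers, here is a minimal Python sketch of that formula. The card wattages in it are purely assumed figures for illustration, since neither NVIDIA nor this article has published the actual per-card draw.

# I = P / V: rough 12V-rail amperage for a few assumed card-level draws.
# The wattages below are illustrative guesses, not published figures.
def amps_on_12v(watts, volts=12.0):
    return watts / volts

for watts in (150, 200, 225):
    print(f"assumed {watts} W card -> {amps_on_12v(watts):.1f} A on the 12V rail")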


RE: Amperage?
By theslug on 10/5/2006 4:46:07 PM , Rating: 2
Ok, upon re-reading my first post, I realize I should have mentioned that I'm talking about the PSU. The total wattage (or combined amps) on the 12V lines of a PSU must meet or exceed what the card manufacturer recommends. But I'm guessing no card maker has mentioned this yet, considering how far off the release is.


Oh no.....!
By dude on 10/5/2006 4:12:23 AM , Rating: 5
Global warming is going to hit an all-time high in acceleration!




RE: Oh no.....!
By goku on 10/5/2006 5:24:58 AM , Rating: 3
Buy the G80 TODAY!
That's right, these video cards are massively powerful. Unlike old cards, these new cards now have the power to render the entire globe's forecast for global warming! We've finally got the processing power to predict the events and observe the process that will occur due to global warming!

The bad news is that these new graphics cards will likely be the cause of global warming.


RE: Oh no.....!
By Wwhat on 10/5/2006 7:32:30 AM , Rating: 1
Sign up for green power and you've got that solved.
Over here they use managed forests for it: you grow trees, they absorb CO2, you burn them and the CO2 balance stays the same because they release what they previously absorbed, then you grow new trees in the same place. And that's just one way, of course. Sure it costs a few percent more, but with oil prices so high, not that much more, or possibly even less.





RE: Oh no.....!
By glennpratt on 10/5/2006 11:43:13 AM , Rating: 2
Hrmm, while burning trees sounds good for CO2, I doubt it's that great in terms of overall pollution. With ethanol that rationale seems to make sense, but that's after the corn has been turned into alcohol with questionable efficiency.

FYI, for those that live in Texas, we are supposedly the biggest wind energy producer in the country now. So get wind power, like Green Mountain, and help fund more of it.


a
By hans007 on 10/5/2006 4:20:56 AM , Rating: 2
honestly I don't think the power requirements even matter. performance per watt was a big server / rack space thing. most enthusiasts probably don't care.

people still buy X1900XTs even though they use over twice the power of a 7900GT.

if this card performs, people will buy it, that simple. besides, the ATI one is supposed to use even more power. and the demographic this is being sold to, "gamers", generally couldn't care less about power / noise. generally anyway




RE: a
By Wwhat on 10/5/2006 7:38:16 AM , Rating: 2
Over twice? talk about being silly with exaggeration, ha.
And don't forget that graphics cards rev up for 3D gameplay and use much less power during normal operation.
but with Vista it gets tricky; ATI cards, I know, won't rev up unless 3D is in fullscreen, so I hope the Vista Aero interface won't put the card in high gear 24/7 and will keep it in check.


RE: a
By Pirks on 10/6/2006 5:00:40 PM , Rating: 2
don't worry Wwhat, just read this
http://techreport.com/onearticle.x/10945 and stay happy ;)


RE: a
By Wwhat on 10/7/2006 2:13:28 PM , Rating: 2
Cool, thanks for the link.


exciting stuff comming
By SignalPST on 10/5/2006 12:50:52 AM , Rating: 2
I'm getting the feeling that NVIDIA will be launching some pretty crazy stuff in November. There's the G80, nForce 680i/650i SLI, and supposedly the AMD 4x4 stuff.




RE: exciting stuff comming
By danz32 on 10/5/2006 12:57:51 AM , Rating: 2
I can't wait :) I am building my new computer next summer... I originally planned for this winter, but prices are going to be too high at launch for the DX10 video card I would want. I also want to see how quad core pans out, and hopefully the ASUS Republic of Gamers motherboard will come out for Core 2 Duo.

And Vista is coming out... It is a sweet time in the technology world :)


RE: exciting stuff comming
By Wwhat on 10/5/2006 7:24:52 AM , Rating: 1
Sweet apart from the Vista part, you mean.


RE: exciting stuff comming
By kilkennycat on 10/5/2006 10:59:05 AM , Rating: 2
Your timing of next summer (or maybe even fall) is perfect. Both AMD and Intel will be battling it out on quad-core processors, Vista will be slightly stable, the core and drivers might even have got to 80% of XP's efficiency in gaming, and second-generation DX10 cards will be available.
And all the "early adopters" will have paid their dues. Meanwhile, I'll happily soldier on with my o/c'd X2 4400+.


ahahaha
By Armorize on 10/5/2006 1:02:04 AM , Rating: 2
This pales in comparison to the R600: http://www.theinquirer.net/default.aspx?article=34...
"R600 to clock at more than 700MHz
ATI's R600 will consume over 250W
ATI's R600 delayed until next year
ATI's R600 has 64 real pipes "

one way or another, it sounds like we're going to have to build our own power plants to run our machines, lol, or say hello to a not-so-nice electric bill.




RE: ahahaha
By Chillin1248 on 10/5/2006 1:09:38 AM , Rating: 5
May I divert your attention to this article, please brace yourself:

http://www.dailytech.com/Article.aspx?newsid=4441&...

quote:
NVIDIA’s GeForce 8800GTX will be the flagship product. The core clock will be factory clocked at 575 MHz. All GeForce 8800GTX cards will be equipped with 768MB of GDDR3 memory, to be clocked at 900 MHz. The GeForce 8800GTX will also have a 384-bit memory interface and deliver 86GB/second of memory bandwidth. GeForce 8800GTX graphics cards are equipped with 128 unified shaders clocked at 1350 MHz. The theoretical texture fill-rate is around 38.4 billion pixels per second.
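
As a rough sanity check on the quoted bandwidth figure, here is a short back-of-the-envelope calculation in Python, assuming GDDR3 transfers data on both clock edges; the 86GB/s number follows directly from the 384-bit bus and 900 MHz memory clock.

# Memory bandwidth = bus width in bytes * memory clock * 2 transfers per clock (DDR)
bus_width_bytes = 384 / 8
memory_clock_hz = 900e6
bandwidth_bytes_per_s = bus_width_bytes * memory_clock_hz * 2
print(f"{bandwidth_bytes_per_s / 1e9:.1f} GB/s")  # ~86.4 GB/s, matching the quoted spec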



-------
Chillin


Hmm
By archcommus on 10/5/2006 12:46:25 AM , Rating: 3
I suppose a 400/450W overall PSU requirement is not too bad; 300/350W is kind of outdated for an enthusiast system now anyway. However, I wonder how many things you could run in that box alongside that card with only a 450W PSU.

Hate to say it, but I may want to wait until the second-gen DX10 cards arrive, with designs more focused on power savings.




RE: Hmm
By soydios on 10/5/2006 12:50:56 AM , Rating: 2
Damn that's a beast of a card. I tremble to think of the power R600 will draw.

I'll be waiting until second-generation DX10 as well. My X1900XT is hot enough already.


It could be more useful.
By mindless1 on 10/5/2006 2:05:51 AM , Rating: 3
It would be nice if they just told us the power consumption per rail of the video card, through the slot and through the connector, instead of trying to provide a not-entirely-useful total PSU wattage rating, particularly given the variables involved in current per rail and even the accuracy of the ratings on some PSUs, especially for sustained current, which matters all the more with the continuous draw of gaming.




RE: It could be more useful.
By Wwhat on 10/5/2006 7:41:22 AM , Rating: 2
Yes, it's weird, and if you email manufacturers they claim they don't know the actual power draw. How hard is it to measure a system with and without the card and see whether your designers really have no clue? Which would be weird in the first place.


Well I won't be buying it...
By NT78stonewobble on 10/5/2006 5:20:21 AM , Rating: 3
I've already given up on trying to follow the need for speed.

I neither can nor will try to afford the latest and greatest, almost $1000 graphics card. Times 2, because you want them in SLI. And on top of that a huge power bill...

I'm still gaming at 1280x1024 on an old quirky 17" CRT, for crying out loud. I might need newer features, but a lot of cards can push 1280x1024 just fine.

Yes, yes, I'll buy something new when it's at half the price and half the power usage for the same money... And live happily with the thought of not owning something which lost $500 in value over the last year...





RE: Well I won't be buying it...
By Wwhat on 10/5/2006 7:43:58 AM , Rating: 2
"still" at 1280x1024? uhm that's the standard right now, albeit LCD's course.


no 8050 version?
By FXi on 10/5/2006 7:59:07 AM , Rating: 2
Kind of expected a repeat of there being 3 cards: an 8xxx GTX, an 8xxx GT, and an 8xxx GX2 kind of thing.

Interesting news anyway :)




RE: no 8050 version?
By Warder45 on 10/5/2006 8:15:16 AM , Rating: 1
Isn't this the PC version of what is in the PS3? This could explain the overheating problems cited in the PS3s.


RE: no 8050 version?
By fumar on 10/9/2006 9:42:58 PM , Rating: 2
No, the PS3's GPU, the RSX, is based on the 7800GTX 512, except the RSX has only 256MB of GDDR3 and a 128-bit memory interface. This might be 2x as fast as the RSX. It certainly is more than 2x as fast as the Xbox 360's Xenos.


horrified, yes. but a question
By RedStar on 10/5/2006 9:34:22 AM , Rating: 2
I won't cover the large negative of the power draw.

But I am wondering if it might not be so large a negative.

Assumption: the G80 will be 2x as good as the 7900GTX.

Thus, won't one G80 be the equivalent of two 7900GTXs running in SLI? If so, is the power draw actually a bit smaller?

Of course, this is based on one pretty large assumption :)




By mindless1 on 10/5/2006 6:47:01 PM , Rating: 2
The assumption might be that people only buy as much video card as they need, instead of by budget or for bragging rights. Those who SLI'd the 7-series would be among those most likely to do it again.


11" is Nuts
By Ard on 10/5/2006 8:17:26 PM , Rating: 2
I really hope they find a way to shrink that reference board down. My GTO sits at 9" and I'm damn near entering the HD cage as it is. No way in hell an 11" card would fit in my case (which is no slouch btw).




RE: 11" is Nuts
By ultimaone on 10/10/2006 7:46:05 PM , Rating: 2
ya, and the non-GTX is at least manageable, I think 8 or 8 1/2 inches

I really would like to know why the GTX is 3 inches longer, like really

personally I'm not going to get one, not if it needs that much power and makes that much heat

it'll either be picked up by the enthusiasts as normal or be a very slow hit....



From bad to worst.
By Apprentic3 on 10/7/2006 2:38:12 PM , Rating: 2
After reading most of the posts, I do feel that the graphics card industry is going haywire, man. A single next-gen graphics card will possibly draw as much as an entire rig running on integrated graphics. Personally, I think I will stick with my current 7900GT and run WinXP. I'll go for a Mac for my next rig since it can dual boot, and it's graphically impressive and speedy without the need for super-high-end components.




RE: From bad to worst.
By fumar on 10/9/2006 9:46:53 PM , Rating: 2
That 400W figure is not for the GPU alone; it is for the entire system: HDD, CPU, RAM, mobo, optical drives, etc. It still requires a crazy amount of power, though, and I'm not one of those low-end users - I have 2x 7800GT.


just a quick Q.
By crowol on 10/9/2006 10:54:11 PM , Rating: 2
I run quad SLI; I spent $1200 on the four 7950s alone. Is it true that DirectX 10 will NOT work with the GeForce 7950s? I have the XFX version of the 7950s.

sincerely,
-Total Noob About This Stuff.




RE: just a quick Q.
By Pirks on 10/10/2006 2:49:14 PM , Rating: 2
yep, DX10 will work on DX10 hardware only, and that is NOT the 7950. but there won't be many games with nice DX10 graphics for a long time, so I'd bet you can enjoy your QSLI setup for another year easily. and then you may be able to buy another four DX10 cards - gee, I wish you could film that Crysis stuff and post it in HD video somewhere - I'd even buy it from you! given decent quality of course.

but I doubt FRAPS can capture DX10, and even if it can, there's no horsepower to play and capture at the same time - do you mind purchasing a quad CPU as well and making some nice video captures of quad DX10 Crysis while you're at it?

just kidding of course ;)


Uhhh
By JonMooring on 10/5/2006 12:50:25 AM , Rating: 1
800W? Wow. Glad I don't plan to run SLI anytime soon! I'm all set with my 500W PSU.




RE: Uhhh
By Hare on 10/7/2006 9:22:39 AM , Rating: 2
The specs are ridiculous. An X6800 + 7900GX2 setup draws around 200W. There is absolutely no way the new cards could draw anywhere even close to 800W without melting their own PCBs.
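
To see why, here is a crude tally in Python under assumed component draws (none of these are measured or published numbers); even a generous estimate lands well below the 800W SLI recommendation.

# Illustrative component draws only; every figure here is an assumption.
assumed_watts = {
    "CPU": 90,
    "motherboard + RAM": 50,
    "drives + fans": 40,
    "two G80 cards (assumed 175 W each)": 350,
}
total = sum(assumed_watts.values())
print(f"estimated peak load: {total} W")  # ~530 W, far below the 800 W recommendation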


Ya'know...
By knowyourenemy on 10/5/2006 8:14:54 AM , Rating: 1
It's funny. I was just looking to upgrade to a new 24" Apple LCD when news of this comes. Looks like I'm building a new rig entirely. I cannot wait. :)




RE: Ya'know...
By Spoonbender on 10/5/2006 2:55:43 PM , Rating: 2
Remember when NVIDIA went out and said the FX series "required" a 450W PSU or whatever it was? Sounded outrageous at the time. Then people realized it really meant "If you have a power-hungry CPU, a crappy-quality PSU, and a GeForce FX, you're still 100% sure everything will be stable with a 450W PSU."
In the real world, people still got by with 350W.

So I choose to be optimistic about this, and assume that the 450W for a single card means roughly the same.

Well, I can hope. :)


wait and see
By shamgar03 on 10/5/2006 7:51:57 AM , Rating: 2
Well, I suppose I will suspend judgement until I see the performance numbers, but it is extremely annoying that the only thing in my case getting less efficient is my video card. Does anyone else remember the last time NVIDIA had a dual-slot card? Hopefully this will be better than the FX series.




Cold Fusion
By shamgar03 on 10/5/2006 8:43:38 AM , Rating: 2
In other news, NVIDIA decided it would probably need cold fusion to power its next generation of cards, so they went ahead and solved the case-sized cold fusion problem...




By Narutoyasha76 on 10/5/2006 10:11:30 AM , Rating: 2
WHOA WHOA... Unbelievable, the G80 at last... yes, at long last I will be able to play DirectX 10 games... 450W minimum... shaders galore!!! pixel beauty avalon!!! more powerful than a locomotive, and it surpasses the Xbox 360 and PS3... Even Vista 2008 wouldn't be a challenge for it... oh but wait... $600+ per card for games that don't even utilize 100% of DirectX 9c yet... nah, I'll wait for the mainstream versions, those G80 and R600 parts with a price tag of $250 or less. For now my GeForce 6800 works just fine.




Good news for cooling market
By Lazarus Dark on 10/5/2006 10:31:52 AM , Rating: 2
Actually, with the cool performance of Conroe even when overclocked, I was getting worried that the watercooling and extreme cooling market might die out, but this pumps new life into the industry.

I think I'll be getting an OCZ Cryo-Z for my G80 and "just" watercooling my Conroe or Kentsfield. The extreme cooling community will thrive for some time! yeahhhhhh!




Power
By iwod on 10/5/2006 1:11:13 PM , Rating: 2
Well, not everyone is going to play games 24/7 at full graphics power.

So what I am wondering is whether the GeForce 8 series has any power-saving features. I just want to know, for example, how much power it will use when I am only running the Windows Vista Aero interface.




By UltraSparc on 10/5/2006 7:18:06 PM , Rating: 2
I believe it's cheaper to heat your house with two of these than it is to use a full heating system. Hell, I used to have a Prescott clocked to 3.6 heating my room, and I'll tell you one thing, it's cheaper than powering a room heater! :)




Do we only care about power?
By leidegre on 10/16/2006 4:42:54 AM , Rating: 2
It's like hooking up the loudest possible vacuum cleaner to your computer.

Jokes aside, the fact that graphics card manufacturers keep pushing this kind of development, with tremendous power draw, is kind of odd. It's been said before, but while development everywhere else worldwide is going toward more environmentally friendly, more power-saving, less heat-producing designs, the graphics manufacturers are still going for power through watts.

ATI, being the leader in mobile graphics, might actually turn up with some interesting parts in the next year; the acquisition by AMD will hopefully create a new market, and hopefully ATI/AMD will be able to establish a more environmentally friendly and less power-consuming graphics solution, and might even be able to compete with Intel's integrated graphics - not performance-wise, but value- and environment-wise.

I want power, but I also want to avoid spending money on expensive cooling/soundproofing solutions.

It's an interesting thought, anyway...




"If you mod me down, I will become more insightful than you can possibly imagine." -- Slashdot











botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki