

More GeForce 8800 derivatives make an appearance

NVIDIA has informed its partners of plans to announce a 320MB GeForce 8800 GTS product.  The embargo on the 8800 GTS 320MB is slated to lift in mid-February.

The 320MB GeForce 8800 GTS is designed to reduce the cost of enthusiast-level DirectX 10.  Samsung's K4J55323QG-BC12 (800MHz GDDR3) memory has a spot-market price of about $50 for a 640MB complement, so we can reasonably expect a $25 reduction in the bill of materials for the 320MB version of the card.
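
A back-of-the-envelope sketch of that figure in Python (assuming, as a simplification, that memory cost scales linearly with capacity):

    # Rough BOM arithmetic for the 320MB card, using the quoted ~$50
    # spot price for a 640MB set of Samsung GDDR3 chips.
    cost_640mb_memory = 50.00                 # USD, quoted spot price
    cost_320mb_memory = cost_640mb_memory / 2 # half the capacity, roughly half the cost

    savings = cost_640mb_memory - cost_320mb_memory
    print(f"Estimated BOM reduction: ${savings:.2f}")  # -> $25.00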

The 320MB card is one of the many G80 derivatives expected for Q1'07.  NVIDIA CEO Jen-Hsun Huang confirmed the company's plans for more DirectX 10 video cards in an interview late last year.

The GeForce 8800 GTS, also known as G80-100, already exists as a 640MB product.  All other aspects of the 640MB card carry over to the 320MB derivative: 500MHz core frequency, 800MHz memory frequency and 96 stream processors.  The PCB layout and cooling also remain the same.

Other DirectX 10 GeForce cards such as G84-300 and G86-300 are expected to ship before the AMD R600 graphics card.  R600 is currently scheduled to launch at Cebit 2007 in late Q1'07.


Comments

$25!!
By Slappi on 1/16/2007 8:26:11 PM , Rating: 1
Hmmm, $400 for an 8800GTS 640MB or $375 for an 8800GTS 320MB...


...hmmm, I guess I'll go with double the memory for about 1/15th extra on the price.




RE: $25!!
By KristopherKubicki (blog) on 1/16/2007 8:27:31 PM , Rating: 3
Well, keep in mind that's $25 off the BOM, not off the MSRP.

But I sort of agree with you.


RE: $25!!
By kdog03 on 1/16/2007 8:32:55 PM , Rating: 5
Sort of?... You'd be two fries short of a Happy Meal if you didn't.


RE: $25!!
By Furen on 1/16/2007 10:23:14 PM , Rating: 2
Just in case you guys didn't know, these video cards sell at a huge premium; that's why every company under the sun is trying to get into the video card business.

While the $25 BOM drop for half the memory sounds insanely mediocre, card manufacturers would likely also take a profit-margin hit to make these more appealing, since they'd end up not being as "high-end" 'cause everyone knows less memory is bad (I'm being sarcastic here). Nvidia might also drop the price for the GPUs, if needed, so don't expect these parts to be well-priced.


RE: $25!!
By Furen on 1/16/2007 10:25:03 PM , Rating: 2
Err, I meant "Nvidia might also drop the price for the GPUs, if needed, so expect these parts to be well-priced."


RE: $25!!
By Samus on 1/17/07, Rating: -1
RE: $25!!
By Pirks on 1/17/2007 11:53:46 AM , Rating: 3
quote:
even if its $100 less for the 320mb, it isn't worth it. it loses DX10 compliance having less than 512mb memory...at least I think the base requirements for DX10 are 512mb, might be 448mb
Sounds like BS from your a## unless you prove it with a link. Thank you.


RE: $25!!
By Samus on 1/17/07, Rating: 0
RE: $25!!
By Sulphademus on 1/17/2007 2:44:52 PM , Rating: 3
The weird memory configurations are because of the width of the memory controller nVidia is using.

"The GTS also has only five 64-bit memory controllers with 640MB of GDDR3 memory running at 800MHz."

http://anandtech.com/video/showdoc.aspx?i=2870&p=1...

also: http://www.tomshardware.com/2006/11/08/geforce_880...


RE: $25!!
By InsaneScientist on 1/17/2007 2:53:05 PM , Rating: 4
I haven't seen anything saying that 512MB is a minimum for DX10 requirements... nor does it make any sense... It would be impossible to create midrange and low-end cards that are DX10 compliant.

Since DX10 is a big part of Vista (Aero Glass doesn't run at its full capabilities without a DX10 card), I don't think that you're going to see many people in IT departments being very happy with the idea that they would have to get a $300+ video card, just for full functionality in Vista. Nor would your average home user, who might like to play games, but doesn't want to drop that much on a video card (or 2 or 3... many people have multiple computers).

I personally fall into both those categories, and I suspect that most would just ignore it and dump DX10 (for a long while at least, probably permanently in the IT sector), along with which goes one of Vista's major selling points...
Microsoft would have shot themselves in the foot if they put that in the DX10 Specs, since that would hurt the sales of Vista.

quote:
why else do you think the cards come with such weird memory capacities and requirements? your welcome.


That's actually quite simple...

nVidia wanted more bandwidth to the Video RAM.
There are 2 ways to increase memory bandwidth:
1st (and more commonly) you can hike the clockspeed of the memory.
Your 2nd option is to increase the bus width. (High end DX9 parts all had 256-bit memory, most mid and low range cards had 128-bit memory, and some of the really low end parts had 64-bit.)
The caveat with increasing the bus width is that it requires a lot of additional circuitry, which increases the complexity of the board, and therefore the overall production cost.

Since memory can only go so fast, with the ultra high-end of this new generation, nVidia decided to take the second option; they increased the bus width. However, for whatever reason (probably the additional cost, and they likely don't need quite that much bandwidth), they didn't want to take it up to a full 512-bit bus...

384-bit for the 8800GTX and a lower (and cheaper) 320-bit for the 8800GTS
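
For reference, here is a minimal sketch of how clock and bus width combine into peak bandwidth, using the 800MHz GDDR3 figure quoted above; treat the numbers as back-of-the-envelope:

    # Peak memory bandwidth = effective transfer rate x bus width.
    # GDDR3 is double data rate: an 800MHz clock yields 1600 MT/s.
    def peak_bandwidth_gbs(clock_mhz: float, bus_width_bits: int) -> float:
        effective_mts = clock_mhz * 2                     # million transfers per second
        return effective_mts * bus_width_bits / 8 / 1000  # MB/s -> GB/s

    print(peak_bandwidth_gbs(800, 320))  # 8800 GTS, 320-bit bus: 64.0 GB/s
    print(peak_bandwidth_gbs(800, 256))  # same clock on a 256-bit bus: 51.2 GB/s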

Since each GDDR3/4 chip has a 32-bit interface, to get a 384-bit interface, you need to run 12 chips in parallel (32*12=384), which is what we find on the GTX. For the GTS's 320-bit interface, we only need 10 chips (32*10=320), which, again, is what we find.

However... computers like to work in powers of 2. (2, 4, 8, 16, 32, 64, 128, 256, etc...)
Though it is not necessary to run a number of chips in parallel that is a power of 2 (and therefore the bus width does not necessarily need to be a power of 2), the same is not true of memory capacities...
Each VRAM chip's capacity must be a power of 2.
Since that is fixed, there is no way to offset the fact that the number of chips used isn't a power of 2 and get the memory capacity back to a standard 512MB, 1GB or whatever.

E.g., the normal GTX and GTS use 512Mb memory chips. In the case of the GTX's 12 chips, we get 512*12/8=768MB of RAM. With the GTS's 10 chips we see 512*10/8=640MB of VRAM.
The only thing that they can do is halve the capacity of the chips (again, because it must be a power of 2... so they can't just take off a third), in which case the GTS's 10 chips give us 256*10/8=320MB of VRAM. Which is exactly what we see in this article.
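
A small sketch of that chip-count arithmetic, with the 32-bit chip interface and power-of-two densities described above:

    # Bus width and capacity follow directly from the chip count:
    # each GDDR3 chip has a 32-bit interface, and densities come in
    # powers of two (256Mbit, 512Mbit, ...).
    def card_config(num_chips: int, chip_density_mbit: int) -> tuple[int, int]:
        bus_width = num_chips * 32                      # bits
        capacity = num_chips * chip_density_mbit // 8   # MB
        return bus_width, capacity

    print(card_config(12, 512))  # GTX:        (384, 768)
    print(card_config(10, 512))  # GTS:        (320, 640)
    print(card_config(10, 256))  # 320MB GTS:  (320, 320)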


RE: $25!!
By JCheng on 1/18/2007 11:04:17 AM , Rating: 2
quote:
Since DX10 is a big part of Vista (Aero Glass doesn't run at its full capabilities without a DX10 card)


Yeah, I'm pretty positive that's not true, full glass runs on DX9. I'm sitting here with a (don't laugh) GeForce 6500 looking at full glass.

Unless there are some special glass features that Microsoft has not announced that you know of...?


RE: $25!!
By UsernameX on 1/18/2007 11:18:03 AM , Rating: 2
quote:
quote:
Since DX10 is a big part of Vista (Aero Glass doesn't run at its full capabilities without a DX10 card)


Yeah, I'm pretty positive that's not true, full glass runs on DX9. I'm sitting here with a (don't laugh) GeForce 6500 looking at full glass.

Unless there are some special glass features that Microsoft has not announced that you know of...?


I would like to know this as well. I haven't heard of any DX10-only features for Windows Vista. Can someone enlighten us?


RE: $25!!
By InsaneScientist on 1/20/2007 2:46:59 AM , Rating: 2
My apologies... I look at that now, and realize that I phrased that wrong...

It's not really a huge thing (it doesn't look any better or anything; it's just a small functionality issue.)

It isn't Aero glass, per se, that doesn't function at its full capabilities, but rather the Desktop Window Manager engine (DWM).

This might not be exactly correct, it is just my best understanding of what's going on... if someone has better information, please enlighten us.
Basically, DirectX 9 cards can only process one graphical stream at a time. Since the Aero glass interface requires a constant stream of GPU acceleration, if something else that requires GPU horsepower is started on the system, Aero glass needs to shut down so that the other process can get access to the resources of the GPU. (and it does)
Most of the time you don't see this, because the majority of the things that require graphics acceleration are full screen games... there are a few things, though, that will revert Vista to the Windows Vista Basic theme.

For example, one of them is Java. If you want to check this out, install Sun's Java package (if you don't already have it) and then load a webpage (any page) that has Java embedded in it. Vista should revert to the Windows Vista Basic theme until you close your internet browser. (I haven't actually done this with anything other than IE, but considering how the engine is supposed to work, I would assume that it's the same with Firefox, Opera, etc.)

Anyhow, DX10 allows for simultaneous processing of multiple graphics streams, allowing Vista to continue using the glass interface, while still giving the other process the resources it needs.

That's all it really is... I didn't mean to make it seem like a big deal. It's not... but there is a little bit more functionality that you get with a DX10 part over a DX9 one.

P.S. Forgive my incoherency... I'm extremely tired at the time of this post. Hopefully I got my point across.


RE: $25!!
By tigen on 1/17/2007 3:15:22 PM , Rating: 2
Why is it weird? It's just because they have two separate memory interfaces on the chip. That's not a DX10 requirement.


RE: $25!!
By SunAngel on 1/16/07, Rating: 0
RE: $25!!
By KristopherKubicki (blog) on 1/16/2007 9:03:30 PM , Rating: 2
Good heads up. Fixed.


RE: $25!!
By AzureKevin on 1/16/2007 9:46:09 PM , Rating: 3
I'm pretty sure I read somewhere that the 320MB version of the 8800 GTS is likely to have an MSRP of $299, making it a much more reasonable buy. Less RAM isn't a big issue, since the GPU itself is a much more important aspect of a video card.


RE: $25!!
By nurbsenvi on 1/17/2007 7:46:47 AM , Rating: 2
The 320MB version will be $100 cheaper than the 640MB version for sure.

My only concern is whether 320MB will be enough to run Crysis and UT2007.

If it is, my money is on the 320MB version.






RE: $25!!
By walk2k on 1/17/2007 5:11:57 PM , Rating: 2
It's $100 less at retail.

320MB will be more than enough; 256MB is overkill now.

No game comes even close to using that much texture memory, unless you run Doom 3 on "ultra" settings, which doesn't use texture compression...


RE: $25!!
By edge929 on 1/18/2007 11:36:43 AM , Rating: 2
Just wait for ATi to release R600 soon (next month?). By the time Crysis and UT07 come out, there will also be a mid-range R600 competing for your money, and that means prices will drop on the GTS/GTX.


Wait for the 8600 series
By Shark Tek on 1/16/2007 8:54:54 PM , Rating: 3
I prefer to wait for an 8600GT card; those 8800s are very power hungry. Hopefully it will be close in performance to a 7800 or 7900 card, and probably with DX10 support. If the power requirements of the 8600 are high, I will go with a 7950GT card.

Video card manufacturers should begin to design solutions that consume less power, radiate less heat and offer the same or more power. Just like AMD and Intel do with their CPUs.




RE: Wait for the 8600 series
By Shark Tek on 1/16/2007 8:57:32 PM , Rating: 2
quote:
Video card manufacturers should begin to design solutions that consume less power, radiate less heat and offer the same or more power .


I really mean with the same processing power or better than current cards.


RE: Wait for the 8600 series
By Ringold on 1/16/2007 9:14:02 PM , Rating: 2
I was really, really impressed by that 600 or 650 watt PSU shown at CES running dual 8800GTXs. I guess I don't see the problem. If it got much higher, it'd start to be one, but given that just about every enthusiast with a recent build already has a 500-800 watt PSU, I don't think it's an issue yet. 600 watt PSUs aren't even that expensive any more.


RE: Wait for the 8600 series
By xFlankerx on 1/16/2007 9:23:10 PM , Rating: 2
Forget watts. The 550W Antec True Power Trio is my new favorite. Three +12V rails with 18A on each of them. Nvidia recommends 30A for a system with one 8800GTX, so 54A for two shouldn't be a problem.
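
A quick sanity check of those rail numbers (a sketch only; real multi-rail supplies can't always combine their full rated amperage):

    # Combined 12V capacity of a 3x18A triple-rail unit, taken at face value.
    rails, amps_per_rail = 3, 18
    combined_amps = rails * amps_per_rail   # 54A if the rails could be fully combined
    combined_watts = combined_amps * 12     # 648W on the 12V side
    print(combined_amps, combined_watts)    # 54 648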


RE: Wait for the 8600 series
By Ringold on 1/16/2007 11:04:35 PM , Rating: 3
I'm partial to Seasonic myself, but yeah, I'll be getting an E4300 + 2GB DDR2-800 + G80 (of some variety - probably an 8800GTS 640MB) + 2 SATA HDs, pairing it with an S12 or M12 of around 600 watts, overclocking the bloody hell out of it, and I suspect I'll never even hear the PSU fan boost its speed under load. Not even the slightest concern about power consumption at this point in time.

Oh, and watercooling. Not because it's necessary, I should note, but because I want to OC (the bloody hell out of it, as I said).


RE: Wait for the 8600 series
By KayKay on 1/17/2007 12:30:22 PM , Rating: 2
That is almost exactly my planned system build; I'll have the watercooling going, and as heavy an overclock as possible.

I recommend a Silverstone Zeus 650W; that's the one I got and it is fantastic!


RE: Wait for the 8600 series
By Pythias on 1/17/2007 7:52:32 AM , Rating: 2
THANK you! It's not hard to find a good PSU that'll power one of these. Just don't go with a manufacturer known for inflating its specs. All it takes is a little research :)



RE: Wait for the 8600 series
By Ard on 1/17/2007 3:27:22 PM , Rating: 2
The PSU he was referring to also has 3 rails at 18A a pop (Corsair's 620W HX). I've been partial to my boys at Antec, so a Trio is definitely a consideration. Of course, Corsair's 620W really impressed me with that demo at CES, so I might buy that for my next build.


RE: Wait for the 8600 series
By Ard on 1/17/2007 3:30:01 PM , Rating: 2
We need a goddamn edit button. The one thing that is keeping me from the Trio is that it's not modular. So actually, I'd be deciding between the NeoHE (which also has 3 12V rails at 18A) or the 620HX.


RE: Wait for the 8600 series
By phusg on 1/17/2007 12:00:06 PM , Rating: 2
The PSUs may not be expensive any more, but electricity IS, and it's getting more so every year. There's also the little issue of GLOBAL warming that some consider a problem ;-)


RE: Wait for the 8600 series
By timmiser on 1/17/2007 12:21:30 PM , Rating: 2
Of course, PSU wattage is just a rating of the maximum wattage the PSU is capable of outputting. It doesn't mean your system will be running at that wattage constantly, if ever.


RE: Wait for the 8600 series
By bespoke on 1/17/2007 1:25:49 PM , Rating: 2
Right. I have a 420 watt SeaSonic PSU powering an Athlon 3500+, 2 gigs of RAM, a 7900GT, 2 hard drives and 2 cd/dvd drives and even when running Prime95 or the 3D view in ATITool, the power draw never goes above 200 watts.

(Neat tool to view power draws - http://www.p3international.com/products/special/P4... )


RE: Wait for the 8600 series
By therealnickdanger on 1/17/2007 12:56:39 PM , Rating: 2
quote:
There's also the little issue of GLOBAL warming

Sheesh... G80 ain't that hot.


RE: Wait for the 8600 series
By ADDAvenger on 1/17/2007 2:29:29 PM , Rating: 2
Yup, far more important is the electricity usage of server farms. I don't know how many there are, but seriously, there are more servers in the world than there are enthusiast-level computers, and I'm sure the big ones easily draw more current than any enthusiast rig.

I think there was an article here on DT about some audit or something on companies' use of efficient servers, and someone made the same comment about how they need to look into G80 instead of servers; I don't think they were voted down, but it was pretty clearly laid out why G80 is negligible from an electrical cost perspective, especially on a national scale.


RE: Wait for the 8600 series
By Hoser McMoose on 1/18/2007 4:30:33 PM , Rating: 2
Just to toss some numbers at it, an extra 100W of power consumed by a system running 24x7 for a year will cost $105 extra, assuming an electricity cost of $0.12/kWh (fairly typical for North America). If the system is run 8x5 (i.e. typical office use) that number is still $25/year.

Now, obviously simply having an extra 200W on your power supplies rated output does not translate to an extra 200W of power used. In fact, even looking at the maximum power draw doesn't mean much of anything since most household computer systems are idle ~99% of the time. However a difference of 100W at idle DOES matter, and a pair of GeForce 8800 cards in SLI will easily consume 100W more at idle than almost any other single video card. Something to think about, especially for those who leave their computer on all the time.
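
Those figures check out; a minimal sketch of the calculation:

    # Annual cost of an extra 100W at $0.12/kWh, per the post above.
    def annual_cost(extra_watts: float, hours_per_week: float,
                    rate_per_kwh: float = 0.12) -> float:
        kwh_per_year = extra_watts / 1000 * hours_per_week * 52
        return kwh_per_year * rate_per_kwh

    print(round(annual_cost(100, 24 * 7)))  # 24x7: ~$105/year
    print(round(annual_cost(100, 8 * 5)))   # 8x5 office use: ~$25/year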


RE: Wait for the 8600 series
By jabber on 1/17/2007 7:02:36 AM , Rating: 2
I think they should revisit the PowerVR method of only rendering what you can actually see at any one time. Don't bother with what isn't visible. That cuts the actual amount of processing by a significant amount.

I don't know how much longer this 'brute force' method of graphics rendering can continue. We need something a little more sophisticated.


You'd think...
By Messudieh on 1/16/2007 11:46:34 PM , Rating: 2
They'd come out with a 512MB version of this card. I haven't heard of it anywhere. I'd think that 320MB is getting awfully close to the lower limit for a card of this power before it becomes a problem. As I remember it:

128 was good for the ATI 9700/9800 series

256 was generally good for the 6800/X800 series, and into the 7800/7900/X1800/X1900 series

It seems fairly logical that the 8800 series would need around 512 to be future proofed for the next generation of games that will be coming out.






RE: You'd think...
By Warren21 on 1/17/2007 12:12:37 AM , Rating: 3
The 640/320 divide is because of the odd 320-bit bus on the 8800 GTS; that's why you won't see 512MB versions of this card.

Also, while -- "128 was good for the ATI 9700/9800 series, 256 was generally good for the 6800/X800 series, and into the 7800/7900/X1800/X1900 series. It seems fairly logical that the 8800 series would need around 512 to be future proofed for the next generation of games that will be coming out." -- there have always been cards in those generations with double the 'needed amount'. E.g. 256MB Radeon 9 series/FX cards, 512MB 6800/X800/7800/7900/X1800/X1900 cards, and then there will be 1GB on the R600, while it may possibly only 'need' 512 (we won't know until it's out/they make a 512MB version to compare).

Usually, for high resolutions (>1600x1200) it is nice to have twice as much memory as the bus width suggests, so the frame buffer always has room and the memory bus stays full. E.g.: Radeon X1800/X1900 & GeForce 7800/7900: 512MB on a 256-bit bus; 8800 GTX: 768MB on a 384-bit bus; R600: 1024MB on a 512-bit bus.


RE: You'd think...
By otispunkmeyer on 1/17/2007 4:03:22 AM , Rating: 2
I think the GTX and GTS PCBs are different; they're not the same PCB with one getting a surgeon's knife to some memory traces. They are physically different... one wired for 384-bit, the other wired for 320-bit.

384-bit = 6x 64-bit controllers, each controlling 2 64MB chips = 12x 64MB = 768MB GDDR3

320-bit = 5x 64-bit controllers and 10x 64MB chips for 640MB

If they wanted to produce a 512MB version... another PCB would have to be made with a 256-bit interface connecting to 8 64MB chips.

It's far cheaper and easier to use the existing GTS PCB and just plonk on some 32MB chips instead of the 64MB ones currently used.
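
A short sketch enumerating the configurations that reasoning allows (two 32-bit chips per 64-bit controller, power-of-two chip sizes):

    # Capacities reachable from each PCB wiring, given 32MB (256Mbit)
    # or 64MB (512Mbit) GDDR3 chips, two per 64-bit controller.
    for controllers in (6, 5, 4):          # 384-, 320-, and 256-bit designs
        chips = controllers * 2
        for chip_mb in (32, 64):
            print(f"{controllers * 64}-bit: {chips} x {chip_mb}MB = {chips * chip_mb}MB")

Running it shows 512MB only appears on the 256-bit row, which is the post's point: the existing GTS PCB can yield 320MB or 640MB, never 512MB.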


$25 is still $25
By Loser on 1/16/2007 8:51:27 PM , Rating: 2
What's the point of spending $25 on more RAM which will give you no benefit?




RE: $25 is still $25
By deeznuts on 1/16/2007 8:55:57 PM , Rating: 2
Because it MIGHT get you a benefit. Which would you rather be out, $25 or a $350 video card?


RE: $25 is still $25
By FITCamaro on 1/17/2007 6:59:04 AM , Rating: 2
More RAM helps at higher resolutions where the textures are much larger. Yes, the game has to be able to use that RAM, but with newer games using larger and larger textures, the extra memory can help. Vanguard is one of the most visually demanding games I've ever seen; I'm glad I have a 512MB card for it.

The only time having 512MB of RAM is silly is when it's on a low-end card such as the 7600GS. The card doesn't have the processing power to play games at high resolutions anyway; the extra memory is there solely for the idea that "more is always better".


cost
By Spoelie on 1/16/2007 8:32:17 PM , Rating: 2
Normally, component costs do not correlate linearly with end-user prices at all. The price differential will be at least double that - $50, probably more. I'm guessing a $350 street price.




RE: cost
By 05SilverGT on 1/16/2007 8:46:47 PM , Rating: 2
Even at $350 street, I just snagged a 640MB 8800GTS for $369.99 with a mail-in rebate from EVGA. I think I'll take the extra 320MB for a couple bucks more.


MSRP
By Russell on 1/16/2007 10:02:15 PM , Rating: 2
Every other site describing this card is saying it'll have a $299 MSRP, not $375 or $350.

If that's the case then this is a steal.




RE: MSRP
By Pwnt Soup on 1/16/2007 11:01:06 PM , Rating: 2
I think it's best to see how it performs before thinking it's a steal. While it very well may be, it could also be a flop. Aren't we seeing a need for 512MB buffers with the last-gen GPUs to keep them fed? Seems to me, with a faster GPU, you would want to start off with at least that big a buffer and go up. Just how much more often will the 320MB cards be asking for fresh data compared with the 640MB version? And what effect will it have on performance?


No point
By electriple9 on 1/17/2007 8:46:05 AM , Rating: 2
Soon we will see a 128MB 8800.
Thanks




RE: No point
By Samus on 1/17/2007 1:21:51 PM , Rating: 2
Crysis will run like crap on it. It also won't meet most DX10 game recommendations; Crysis recommends 512MB of video memory for peak performance.


G80 series
By crystal clear on 1/17/2007 5:26:25 AM , Rating: 2
A few new G80 strings:

G80-200, G80-400, G80-600, G80-850 and G80-875.







Perhaps
By praeses on 1/17/2007 12:50:42 PM , Rating: 2
A more attractive card, at least to me, would be a 384MB 8800GTX, provided the price is right. If it were to have a $375-400 street price I'd definitely consider it.




Why 320MB is enough
By xNIBx on 1/18/2007 3:22:23 AM , Rating: 2
Many people have LCD monitors with a 1280x1024 maximum resolution. They don't want to get a new monitor, but they do want to play games with all the eye candy on. At that resolution, 320MB is enough to play any current and probably most future games with all the eye candy on.

So you get the raw power of the 8800GTS in a more suitable package. If you have such a monitor, there is no reason to buy something with more memory; you would just be wasting your money. And almost all 19" or smaller LCD monitors have a 1280x1024 maximum resolution.

That's the market this card is aiming for. And for that market, it's ideal.




"People Don't Respect Confidentiality in This Industry" -- Sony Computer Entertainment of America President and CEO Jack Tretton

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki