
DirectX 10 compliant GeForce 8800GTX and 8800GTS headed your way

DailyTech's hands-on with the GeForce 8800 series continues with more information about the GPU and the retail boards. The new NVIDIA graphics architecture will be fully compatible with Microsoft's upcoming DirectX 10 API with support for Shader Model 4.0, and represents the company's eighth-generation GPU in the GeForce family.

NVIDIA will sell G80-based products under the GeForce 8800 name. While the 7900 and 7800 series launched with GT and GTX suffixes, the G80 lineup does away with the GT suffix. Instead, NVIDIA has revived the GTS suffix for its second-fastest graphics product—a suffix that hasn't been used since the GeForce 2 days.

NVIDIA's GeForce 8800GTX will be the flagship product, with a core factory clocked at 575 MHz. All GeForce 8800GTX cards will be equipped with 768MB of GDDR3 memory clocked at 900 MHz. The GeForce 8800GTX will also have a 384-bit memory interface, delivering 86.4GB/s of memory bandwidth. GeForce 8800GTX graphics cards are equipped with 128 unified shaders clocked at 1350 MHz, and the theoretical texture fill-rate is around 38.4 billion texels per second.

Slotted right below the GeForce 8800GTX is the slightly cut-down GeForce 8800GTS. These graphics cards will have a G80 GPU clocked at a slower 500 MHz. The memory configuration for GeForce 8800GTS cards differs slightly from the GeForce 8800GTX: GeForce 8800GTS cards will be equipped with 640MB of GDDR3 graphics memory clocked at 800 MHz. The memory interface is reduced to 320-bit, and overall memory bandwidth is 64GB/s. GeForce 8800GTS graphics cards also have fewer unified shaders—96, clocked at 1200 MHz.
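The bandwidth figures above follow directly from the bus width and the memory clock: GDDR3 is double data rate, so the effective transfer rate is twice the clock. A minimal sketch of the arithmetic (the helper function below is illustrative, not an NVIDIA tool; note that the GTS's 64GB/s figure corresponds to an 800 MHz memory clock on its 320-bit bus):

```python
def bandwidth_gb_s(bus_width_bits, mem_clock_mhz):
    """Peak memory bandwidth in GB/s for a DDR memory subsystem."""
    bytes_per_transfer = bus_width_bits / 8
    transfers_per_sec = mem_clock_mhz * 1e6 * 2  # GDDR3: two transfers per clock
    return bytes_per_transfer * transfers_per_sec / 1e9

# GeForce 8800GTX: 384-bit interface, 900 MHz GDDR3
print(f"{bandwidth_gb_s(384, 900):.1f} GB/s")  # 86.4 GB/s
# GeForce 8800GTS: 320-bit interface, 800 MHz GDDR3
print(f"{bandwidth_gb_s(320, 800):.1f} GB/s")  # 64.0 GB/s
```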

Additionally, GeForce 8800GTX and 8800GTS products are HDCP compliant with support for dual dual-link DVI, VIVO and HDTV outputs. All cards will have dual-slot coolers too. Expect GeForce 8800GTX and 8800GTS products to launch the second week of November 2006. This will be a hard launch, as most manufacturers should have boards ready now.

Power requirements for the G80 were detailed in an earlier DailyTech article.

Comments

why odd?
By ForumMaster on 10/5/2006 1:15:00 AM , Rating: 2
why is there an odd amount of memory? aren't equal amounts better? 1GB and a 512bit link? wonder who's going to buy these monsters.

RE: why odd?
By Knish on 10/5/2006 1:19:00 AM , Rating: 1
It has to do with the specs for GDDR4

RE: why odd?
By Doormat on 10/5/2006 1:34:28 AM , Rating: 4
768MB is a round amount - for a 384b memory interface that's six 64b channels, each channel addressing 128MB of RAM. The whole six channels thing is kinda odd, but they needed more bandwidth and a 512b memory interface is too expensive.
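Doormat's channel arithmetic checks out; a quick sketch using the figures from the comment above:

```python
# A 384-bit interface split into 64-bit channels gives six channels;
# with each channel addressing 128MB, the total is 768MB.
channels = 384 // 64       # six 64-bit channels
total_mb = channels * 128  # 128MB per channel
print(channels, total_mb)  # 6 768
```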

RE: why odd?
By coldpower27 on 10/5/2006 8:37:23 AM , Rating: 2
It's the first time in a while that we have had something that isn't 2 to the power of x. Remember when we had 24-bit precision of Pixel Shader 2.0 on the R300 series? Well, the same kinda thing applies here: it's doable, just not as mathematically elegant per se.

RE: why odd?
By Lonyo on 10/5/2006 10:59:14 AM , Rating: 2
XGI were going to make a 192-bit product for mainstream/low end market sectors, but it got cancelled.
It's no surprise that we are seeing an odd number of bits, it's been (IMO) a long time coming, although I expect it to be seen first in lower end/mid range cards.

Related Articles
Power and the NVIDIA "G80"
October 4, 2006, 11:56 PM
