
DirectX 10 compliant GeForce 8800GTX and 8800GTS headed your way

DailyTech's hands-on with the GeForce 8800 series continues with more information about the GPU and the retail boards. The new NVIDIA graphics architecture will be fully compatible with Microsoft’s upcoming DirectX 10 API with support for shader model 4.0, and represents the company's 8th generation GPU in the GeForce family.

NVIDIA has code-named G80 based products as the GeForce 8800 series. While the 7900 and 7800 series launched with GT and GTX suffixes, G80 will do away with the GT suffix. Instead, NVIDIA has revived the GTS suffix for its second fastest graphics product—a suffix that hasn’t been used since the GeForce 2 days.

NVIDIA’s GeForce 8800GTX will be the flagship product. The core will be factory clocked at 575 MHz. All GeForce 8800GTX cards will be equipped with 768MB of GDDR3 memory clocked at 900 MHz. The GeForce 8800GTX will also have a 384-bit memory interface and deliver 86GB/second of memory bandwidth. GeForce 8800GTX graphics cards are equipped with 128 unified shaders clocked at 1350 MHz. The theoretical texture fill-rate is around 38.4 billion texels per second.

Slotted right below the GeForce 8800GTX is the slightly cut-down GeForce 8800GTS. These graphics cards will have a G80 GPU clocked at a slower 500 MHz. The memory configuration for GeForce 8800GTS cards differs slightly from the GeForce 8800GTX: 640MB of GDDR3 graphics memory clocked at 800 MHz. The memory interface is reduced to 320-bit, for an overall memory bandwidth of 64GB/second. GeForce 8800GTS cards also have fewer unified shaders: 96, clocked at 1200 MHz.
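The quoted bandwidth numbers follow directly from bus width and memory clock. As a rough sanity check (a sketch of the standard arithmetic, not any official NVIDIA figure — the function name and rounding are my own), GDDR3 transfers data twice per clock:

```python
def mem_bandwidth_gb_s(bus_width_bits, mem_clock_mhz, data_rate=2):
    """Peak memory bandwidth in GB/s; GDDR3 is double data rate (2 transfers/clock)."""
    bytes_per_transfer = bus_width_bits / 8.0          # bus width in bytes
    transfers_per_sec = mem_clock_mhz * 1e6 * data_rate
    return bytes_per_transfer * transfers_per_sec / 1e9

print(mem_bandwidth_gb_s(384, 900))  # 8800GTX: 86.4 GB/s, matching the ~86GB/s figure
print(mem_bandwidth_gb_s(320, 800))  # 8800GTS: 64.0 GB/s -- note a 900 MHz clock would
                                     # give 72 GB/s, so the quoted 64GB/s implies 800 MHz
```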

Additionally, GeForce 8800GTX and 8800GTS products are HDCP compliant with support for dual dual-link DVI, VIVO and HDTV outputs. All cards will have dual-slot coolers as well. Expect GeForce 8800GTX and 8800GTS products to launch the second week of November 2006. This will be a hard launch, as most manufacturers should have boards ready now.

Power requirements for the G80 were detailed in an earlier DailyTech article.



R600 Pwned
By AggressorPrime on 10/5/2006 4:04:42 PM , Rating: 1
With 2x the shaders of the R600, pwned is the perfect word.




RE: R600 Pwned
By hwhacker on 10/5/2006 7:10:41 PM , Rating: 3
With all due respect, what an ignorant comment.

We *think* we know R600 has 64 pixel arrays, similar to the way R500 (Xenos) and R580 (X1900) have 16. Now, I don't know anything you don't, but I'm taking a shot in the dark assuming each array can process more than one ALU. NVIDIA has 128 ALUs total according to these specs; probably 48/64 x 2 full ALUs. Now... what is that number on R600? If it's 64x2, it has an equal number of ALUs to G80, and if it's x3, like R580/R500 before it, then R600 could process 192 shader ops. Of course, how it would compare to G80's architecture even with more shading processors would depend on a lot of architectural variables, as well as clock speed and other specs (raster ops, TMUs, bus, memory, etc.). I won't even begin to guess who will come out where, as not enough information is available... perhaps you should consider that approach yourself. :)



Related Articles
Power and the NVIDIA "G80"
October 4, 2006, 11:56 PM
