NVIDIA prepares its next-generation mid-range and mainstream DirectX 10 GPUs

Earlier today DailyTech received its briefing on NVIDIA’s upcoming GeForce 8600GTS, 8600GT and 8500GT graphics processors. NVIDIA’s GeForce 8600GTS and 8600GT are G84-based GPUs that target the mid-range market. The lower-positioned, G86-based GeForce 8500GT serves as the flagship of the low-to-mid-range segment.

The budget-priced trio features full support for DirectX 10, including pixel and vertex shader model 4.0. NVIDIA has yet to reveal shader counts or shader clocks, though. Nevertheless, all three support NVIDIA SLI and PureVideo technologies.

NVIDIA touts three dedicated video engines on the G84 and G86-based graphics cards for PureVideo processing. The video engines provide MPEG-2 high-definition and WMV HD video playback at resolutions up to 1080p. G84 and G86 support hardware-accelerated decoding of H.264 video as well; however, NVIDIA makes no mention of VC-1 decoding. G84 and G86 also feature advanced post-processing video algorithms. Supported algorithms include spatial-temporal de-interlacing, inverse 2:2 and 3:2 pull-down, and 4-tap horizontal and 5-tap vertical video scaling.

At the top of the mid-range lineup is the GeForce 8600GTS. The G84-based graphics core clocks in at 675 MHz. NVIDIA pairs the GeForce 8600GTS with 256MB of GDDR3 memory clocked at 1000 MHz. The memory interfaces with the GPU via a 128-bit bus. The GeForce 8600GTS does not integrate HDCP keys on the GPU. Add-in board partners will have to purchase separate EEPROMs with HDCP keys; however, all GeForce 8600GTS-based graphics cards feature support for HDCP.

GeForce 8600GTS-based graphics cards require an eight-layer PCB. Physically, the cards measure 7.2 x 4.376 inches and are available in full-height form factors only. NVIDIA GeForce 8600GTS graphics cards feature a PCIe x16 interface, unlike ATI’s upcoming RV630. GeForce 8600GTS-based cards still require external PCIe power. NVIDIA estimates total board power consumption at around 71 watts.

Supported video output connectors include dual dual-link DVI, VGA, SDTV and HDTV outputs, and analog video inputs. G84-based GPUs do not support a native HDMI output. Manufacturers can adapt one of the DVI-outputs for HDMI.

NVIDIA’s GeForce 8600GT is not as performance-oriented as the 8600GTS. The GeForce 8600GT GPU clocks in at a more conservative 540 MHz. The memory configuration has more flexibility, letting manufacturers decide between 256MB or 128MB of GDDR3 memory. NVIDIA specifies the memory clock at 700 MHz. The GeForce 8600GT shares the same 128-bit memory interface as the 8600GTS. HDCP support on the GeForce 8600GT is optional. The GPU and reference board design support the required HDCP-key EEPROM; however, implementation is up to NVIDIA’s add-in board partners.
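As a quick sanity check, the peak memory bandwidth implied by these reference clocks can be computed directly. This is a minimal sketch, assuming standard GDDR3 double-data-rate signaling (two transfers per clock); the function name is illustrative:

```python
# Peak theoretical memory bandwidth from memory clock and bus width.
# GDDR3 transfers data twice per clock cycle (double data rate).
def peak_bandwidth_gb_s(mem_clock_mhz: int, bus_width_bits: int) -> float:
    transfers_per_second = mem_clock_mhz * 1e6 * 2  # DDR: 2 transfers/clock
    bytes_per_transfer = bus_width_bits / 8
    return transfers_per_second * bytes_per_transfer / 1e9

print(peak_bandwidth_gb_s(1000, 128))  # GeForce 8600GTS: 32.0 GB/s
print(peak_bandwidth_gb_s(700, 128))   # GeForce 8600GT: 22.4 GB/s
```

On these numbers, the 8600GT's 700 MHz memory on a 128-bit bus lands at roughly 22.4 GB/s, while the 8600GTS's 1000 MHz memory reaches about 32 GB/s.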

GeForce 8600GT-based graphics cards require only a six-layer PCB instead of the eight-layer PCB of the 8600GTS. The physical board is also smaller, measuring 6.9 x 4.376 inches. GeForce 8600GT-based cards do not require external PCIe power. NVIDIA rates the maximum board power consumption at 43 watts, 28 watts less than the 8600GTS.

The GeForce 8600GT supports similar video outputs to the 8600GTS; however, the 8600GT does not support video input features.

NVIDIA has revealed very little information on the GeForce 8500GT besides support for GDDR3 and DDR2 memory. It supports dual dual-link DVI, VGA and TV outputs as well.

Expect NVIDIA to take the wraps off its GeForce 8600GTS, 8600GT and 8500GT next quarter, in time to take on AMD’s upcoming RV630 and RV610.

Comments

Disappointing 128-bit bus
By hadifa on 3/14/2007 12:50:38 AM , Rating: 0
The 128-bit bus of the 8600 GTS seems like a serious bottleneck even at a 1000 MHz memory clock.

It seems both AMD/ATI and NVIDIA are using 128-bit interfaces for their mid-range cards, even as they have pushed beyond 256 bits for their high-end cards. I was hoping mid-range/performance cards like the 8600 GTS would be 256-bit, or at least 172-bit.

This is disappointing.

RE: Disappointing 128-bit bus
By coldpower27 on 3/14/2007 10:52:50 AM , Rating: 2
This isn't possible, as NVIDIA is reserving 256-bit bus interfaces for lower-range high-end SKUs rather than the mid-range. The mid-range will remain at 128-bit, probably until some breakthrough comes along that reduces the actual cost of implementing a 256-bit bus.

RE: Disappointing 128-bit bus
By InsaneScientist on 3/14/2007 7:49:47 PM , Rating: 3
Wow... I can't believe the amount of complaining about this 128-bit bus that I'm seeing as I go through these comments.

First, it's not possible to have a 172-bit bus. Each memory chip has a 32-bit interconnect, so each additional chip adds another 32 bits to the bus width. No other figure is even theoretically possible.
After 128 we'd see 160, then 192, 224, and then 256. And 160 and 224 aren't at all likely, as they would require an odd number of chips.
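The chip-count arithmetic above can be made concrete with a short sketch (the range of chip counts shown is illustrative):

```python
# Feasible bus widths with 32-bit-wide memory chips: width = chips * 32.
BITS_PER_CHIP = 32

for chips in range(4, 9):
    width = chips * BITS_PER_CHIP
    note = " (odd chip count, unlikely)" if chips % 2 else ""
    print(f"{chips} chips -> {width}-bit bus{note}")

# A 172-bit bus would require a fractional number of chips:
print(172 / BITS_PER_CHIP)  # 5.375 chips, not physically possible
```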

Second, the bus may be only 128-bit, but with the memory clocked at 1GHz it still blows past the old generation. The 7600GT has 22.4GB/s of memory bandwidth. Memory at 1GHz on a 128-bit bus should yield 32GB/s of bandwidth.
That's an increase of nearly 10GB/s, over 40%!
Why are we complaining????

These aren't going to be ultra-high-end GPUs. There comes a point where more memory bandwidth becomes redundant because the GPU can't process any faster. Going to a 256-bit bus would increase the cost, but not necessarily deliver much more than that.

RE: Disappointing 128-bit bus
By hadifa on 3/15/2007 8:27:57 PM , Rating: 2
With cards like the X1950 Pro and 7900 GS, the expectation for upper mid-range cards is not what it used to be. In the US$200 price range, we should expect solid performers that do not cost as much as the higher end.

First, it's not possible to have a 172-bit bus. Each memory chip has a 32-bit interconnect, so each additional chip adds another 32 bits to the bus width. No other figure is even theoretically possible.

I made a mistake. I calculated 128+64 and got 172!! I meant 192-bit. Thanks for the correction.

"When an individual makes a copy of a song for himself, I suppose we can say he stole a song." -- Sony BMG attorney Jennifer Pariser