NVIDIA GeForce 8600-Series Details Unveiled
Anh Tuan Huynh
March 14, 2007 12:16 AM
NVIDIA GeForce 8600GTS
NVIDIA GeForce 8600GT
NVIDIA prepares its next-generation mid-range and mainstream DirectX 10 GPUs
DailyTech received its briefing on NVIDIA’s upcoming GeForce 8600GTS, 8600GT and 8500GT graphics processors. NVIDIA’s GeForce 8600GTS and 8600GT are G84-based GPUs and target the mid-range markets. The lower-positioned, G86-based GeForce 8500GT serves as the flagship low to mid-range graphics card.
The budget-priced trio features full support for DirectX 10, including pixel and vertex shader model 4.0. NVIDIA has yet to reveal the number of shaders or shader clocks, though. Nevertheless, the trio supports NVIDIA SLI and PureVideo technologies.
NVIDIA touts three dedicated video engines on the G84- and G86-based graphics cards for PureVideo processing. The video engines provide MPEG-2 high-definition and WMV HD video playback at resolutions up to 1080p.
Both GPUs support hardware-accelerated decoding of H.264 video as well; however, NVIDIA makes no mention of VC-1 decoding.
The GPUs also feature advanced video post-processing algorithms. Supported algorithms include spatial-temporal de-interlacing, inverse 2:2 and 3:2 pull-down, and 4-tap horizontal and 5-tap vertical video scaling.
At the top of the mid-range lineup is the GeForce 8600GTS. The G84-based graphics core clocks in at 675 MHz. NVIDIA pairs the GeForce 8600GTS with 256MB of GDDR3 memory clocked at 1000 MHz. The memory interfaces with the GPU via a 128-bit bus. The GeForce 8600GTS does not integrate HDCP keys on the GPU. Add-in board partners will have to purchase separate EEPROMs with HDCP keys; however, all GeForce 8600GTS-based graphics cards feature support for HDCP.
GeForce 8600GTS-based graphics cards require an eight-layer PCB. Physically, the cards measure in at 7.2 x 4.376 inches and are available in full-height form factors only. NVIDIA GeForce 8600GTS graphics cards feature a PCIe x16 interface, unlike ATI’s upcoming RV630. GeForce 8600GTS-based cards still require external PCIe power. NVIDIA estimates total board power consumption at around 71 watts.
Supported video output connectors include dual dual-link DVI, VGA, SDTV and HDTV outputs, and analog video inputs. G84-based GPUs do not support a native HDMI output, though manufacturers can adapt one of the DVI outputs for HDMI.
NVIDIA’s GeForce 8600GT is not as performance-oriented as the 8600GTS. The GeForce 8600GT GPU clocks in at a more conservative 540 MHz. The memory configuration is more flexible, letting manufacturers choose between 256MB and 128MB of GDDR3 memory. NVIDIA specifies the memory clock at 700 MHz. The GeForce 8600GT shares the same 128-bit memory interface as the 8600GTS. HDCP support on the GeForce 8600GT is optional: the GPU and reference board design support the required HDCP-key EEPROM, but the implementation is up to NVIDIA’s add-in board partners.
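For context, the quoted 128-bit bus and memory clocks translate directly into theoretical peak memory bandwidth. The short Python sketch below is not from NVIDIA’s briefing; it simply works the arithmetic, under the assumption that GDDR3’s double-data-rate signaling doubles the effective transfer rate of the listed 1000 MHz (8600GTS) and 700 MHz (8600GT) clocks.

def peak_bandwidth_gb_s(bus_width_bits, memory_clock_mhz, data_rate=2):
    # Bytes moved per transfer across the bus: a 128-bit bus moves 16 bytes.
    bytes_per_transfer = bus_width_bits / 8
    # GDDR3 transfers data on both clock edges, hence data_rate=2 by default.
    transfers_per_second = memory_clock_mhz * 1e6 * data_rate
    return bytes_per_transfer * transfers_per_second / 1e9

print(peak_bandwidth_gb_s(128, 1000))  # GeForce 8600GTS: ~32.0 GB/s
print(peak_bandwidth_gb_s(128, 700))   # GeForce 8600GT:  ~22.4 GB/s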
GeForce 8600GT-based graphics cards only require a six-layer PCB instead of the eight-layer PCB of the 8600GTS. The physical board is also smaller, measuring in at 6.9 x 4.376 inches. GeForce 8600GT-based cards do not require external PCIe power. NVIDIA rates the maximum board power consumption at 43 watts – 28 watts less than the 8600GTS.
The GeForce 8600GT supports the same video outputs as the 8600GTS; however, the 8600GT does not support video input features.
NVIDIA has revealed very little information on the GeForce 8500GT besides support for GDDR3 and DDR2 memory. It supports dual dual-link DVI, VGA and TV outputs as well.
Expect NVIDIA to pull the wraps off its GeForce 8600GTS, 8600GT and 8500GT next quarter, in time to take on ATI’s upcoming RV630 and RV610.
Don't be so quick to judge..
3/14/2007 8:30:53 AM
While I would like to see a 256-bit memory bus, it's not always the defining factor of performance. Remember that a 6800 Ultra is slower than a 7600GT despite the 6800 Ultra's 256-bit memory bus.
Memory bandwidth is important, however these days the trend seems to be toward increasing shader performance above all. Look at the X1900 XTX vs the X1800 XTX. Yes, it had more memory bandwidth with its GDDR4, but the main improvement was with shaders.
Why don't we wait for more information or previews before we bring out the torches and pitchforks?
RE: Don't be so quick to judge..
3/14/2007 8:48:35 AM
The X1900 series used GDDR3, the same as the X1800 series. Only the X1950XTX used GDDR4. I have one though, and it is quite a good card. I run Vanguard: SOH at 1280x1024 at max detail and get 50-60 fps in the open and in dungeons, and 20-30 fps average in cities (still perfectly playable). All running at a "cool" 40C despite an OC on memory and core, thanks to Swiftech's Apex Ultra+ water cooling system.