NVIDIA GeForce 8600-Series Details Unveiled
Anh Tuan Huynh
March 14, 2007 12:16 AM
NVIDIA GeForce 8600GTS
NVIDIA GeForce 8600GT
NVIDIA prepares its next-generation mid-range and mainstream DirectX 10 GPUs
DailyTech received its briefing on NVIDIA's upcoming GeForce 8600GTS, 8600GT and 8500GT graphics processors. NVIDIA's GeForce 8600GTS and 8600GT are G84-based GPUs and target the mid-range market. The lower-positioned G86-based GeForce 8500GT serves as the flagship low-to-mid-range graphics card.
The budget-priced trio features full support for DirectX 10, including pixel and vertex Shader Model 4.0. NVIDIA has yet to reveal shader counts or shader clocks, though. Nevertheless, the trio supports NVIDIA SLI and PureVideo HD.
NVIDIA touts three dedicated video engines on the G84- and G86-based graphics cards for PureVideo processing. The video engines provide MPEG-2 high-definition and WMV HD video playback at resolutions up to 1080p. The GPUs support hardware-accelerated decoding of H.264 video as well; however, NVIDIA makes no mention of VC-1 decoding. The new GPUs also feature advanced post-processing video algorithms, including spatial-temporal de-interlacing, inverse 2:2 and 3:2 pull-down, and 4-tap horizontal and 5-tap vertical video scaling.
At the top of the mid-range lineup is the GeForce 8600GTS. The G84-based graphics core clocks in at 675 MHz. NVIDIA pairs the GeForce 8600GTS with 256MB of GDDR3 memory clocked at 1000 MHz. The memory interfaces with the GPU via a 128-bit bus. The GeForce 8600GTS does not integrate HDCP keys on the GPU. Add-in board partners will have to purchase separate EEPROMs with HDCP keys; however, all GeForce 8600GTS-based graphics cards feature support for HDCP.
GeForce 8600GTS-based graphics cards require an eight-layer PCB. Physically, the cards measure 7.2 x 4.376 inches and are available in full-height form factors only. NVIDIA GeForce 8600GTS graphics cards feature a PCIe x16 interface, unlike ATI's upcoming RV630. GeForce 8600GTS-based cards still require external PCIe power. NVIDIA estimates total board power consumption at around 71 watts.
Supported video output connectors include dual dual-link DVI, VGA, SDTV and HDTV outputs, and analog video inputs. G84-based GPUs do not support native HDMI output; however, manufacturers can adapt one of the DVI outputs for HDMI.
NVIDIA's GeForce 8600GT is not as performance-oriented as the 8600GTS. The GeForce 8600GT GPU clocks in at a more conservative 540 MHz. The memory configuration has more flexibility, letting manufacturers decide between 256MB and 128MB of GDDR3 memory. NVIDIA specifies the memory clock at 700 MHz. The GeForce 8600GT shares the same 128-bit memory interface as the 8600GTS. HDCP support on the GeForce 8600GT is optional. The GPU and reference board design support the required HDCP-key EEPROM; however, the implementation is up to NVIDIA's add-in board partners.
GeForce 8600GT-based graphics cards require only a six-layer PCB instead of the eight-layer PCB of the 8600GTS. The board is also smaller, measuring 6.9 x 4.376 inches. GeForce 8600GT-based cards do not require external PCIe power. NVIDIA rates the maximum board power consumption at 43 watts – 28 watts less than the 8600GTS.
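NVIDIA quotes clocks rather than bandwidth figures, but peak memory bandwidth follows directly from the bus width and the effective memory clock. A minimal sketch in Python, using decimal gigabytes and assuming GDDR3's double-data-rate (two transfers per clock); the function name is illustrative, not from NVIDIA:

```python
def memory_bandwidth_gbs(bus_width_bits, memory_clock_mhz):
    """Peak memory bandwidth in GB/s for double-data-rate memory.

    GDDR3 transfers data on both clock edges, so the effective
    data rate is twice the memory clock.
    """
    effective_mtps = memory_clock_mhz * 2  # mega-transfers per second
    # bytes per transfer * MT/s = MB/s; divide by 1000 for GB/s
    return bus_width_bits / 8 * effective_mtps / 1000

# GeForce 8600GTS: 128-bit bus, 1000 MHz GDDR3
print(memory_bandwidth_gbs(128, 1000))  # 32.0 GB/s

# GeForce 8600GT: 128-bit bus, 700 MHz GDDR3
print(memory_bandwidth_gbs(128, 700))   # 22.4 GB/s
```

The 128-bit bus is what separates this mid-range pair from the 8800 series' wider interfaces, which is why the commenters below debate whether bandwidth or shader throughput will matter more.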
The GeForce 8600GT supports similar video outputs to the 8600GTS; however, the 8600GT does not support video input features.
NVIDIA has revealed very little information on the GeForce 8500GT besides support for GDDR3 and DDR2 memory. It supports dual dual-link DVI, VGA and TV outputs as well.
Expect NVIDIA to pull the wraps off its GeForce 8600GTS, 8600GT and 8500GT next quarter, in time to take on ATI's upcoming RV630.
Don't be so quick to judge..
3/14/2007 8:30:53 AM
While I would like to see a 256-bit memory bus, it's not always the defining factor of performance. Remember that a 6800 Ultra is slower than a 7600GT despite the 6800 Ultra's 256-bit memory bus.
Memory bandwidth is important; however, these days the trend seems to be increasing shader performance above all. Look at the X1900XTX vs. the X1800XTX. Yes, it had more memory bandwidth with its GDDR4, but the main improvement was with shaders.
Why don't we wait for more information or previews before we bring out the torches and pitchforks?
RE: Don't be so quick to judge..
3/14/2007 8:48:35 AM
The X1900 series used the same GDDR3 as the X1800 series. Only the X1950XTX used GDDR4. I have one, though, and it is quite a good card. I run Vanguard: SOH at 1280x1024 at max detail and get 50-60 fps in the open and in dungeons, and 20-30 fps average in cities (still perfectly playable). All running at a "cool" 40C despite an OC on memory and core, thanks to Swiftech's Apex Ultra+ water cooling system.