Pictured: Gigabyte GV-NX86S256H (GeForce 8600 GTS), Gigabyte GV-NX86GT256H (GeForce 8600 GT), Gigabyte GV-NX86T256H (GeForce 8500 GT), MSI NX8600 GTS and MSI NX8500 GT
NVIDIA pulls the wraps off its new mainstream products

NVIDIA today announced its latest DirectX 10 product offerings to serve the $129-$229 price points. The new GeForce 8600 GTS, 8600 GT and 8500 GT introduce technology previously exclusive to the high-end GeForce 8800 series at more affordable levels. NVIDIA's new mainstream lineup features Shader Model 4.0 support, GigaThread technology and NVIDIA's Quantum Effects physics processing technology. The addition of NVIDIA's Lumenex engine to the mainstream sector provides value-conscious buyers with 128-bit floating-point HDR rendering and up to 16x anti-aliasing.

The new mainstream flagship is the GeForce 8600 GTS, which takes the spot previously held by the GeForce 7900 GS. NVIDIA slots the GeForce 8600 GTS below the GeForce 8800 GTS 320MB, which currently occupies the $299-$329 price points, while GeForce 8600 GTS-based products fill in the $199-$229 range. GeForce 8600 GTS-based products feature 32 stream processors clocked at 1.45 GHz. NVIDIA has set the reference core and memory clock speeds at 675 MHz and 1.0 GHz, respectively.

Catering to the $149-$159 price points is the slightly detuned GeForce 8600 GT. GeForce 8600 GT-based products feature 32 stream processors, as with the GeForce 8600 GTS; however, NVIDIA lowers the shader clock to 1.18 GHz. Reference core and memory clock speeds are set at 540 MHz and 700 MHz, respectively. NVIDIA slots the GeForce 8500 GT at the bottom of its new mainstream lineup. The new GeForce 8500 GT fills in the $89-$129 price points with its 16 stream processors, which NVIDIA clocks at 900 MHz on the value-oriented offering. Reference core and memory clock speeds of the GeForce 8500 GT are 450 MHz and 400 MHz, respectively.

All three new models support NVIDIA's PureVideo HD video processing technology. PureVideo HD remains unchanged from the previous product generation, providing hardware acceleration for H.264, VC-1 and MPEG-2 high-definition and standard-definition video formats.

NVIDIA GeForce 8-series

Model                             8600 GTS     8600 GT      8500 GT      8800 GTS
Stream processors                 32           32           16           96
Core clock                        675 MHz      540 MHz      450 MHz      500 MHz
Shader clock                      1.45 GHz     1.18 GHz     900 MHz      1.2 GHz
Memory clock                      1.0 GHz      700 MHz      400 MHz      800 MHz
Memory interface                  128-bit      128-bit      128-bit      320-bit
Memory bandwidth                  32 GB/sec    22.4 GB/sec  12.8 GB/sec  64 GB/sec
Texture fill rate (billion/sec)   10.8         8.6          3.6          24
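The bandwidth and fill-rate rows follow directly from the clocks and bus widths listed. A sketch of the arithmetic for the three new parts; note that the texture-unit counts used below (16 for the 8600s, 8 for the 8500 GT) are assumptions chosen to match the published fill rates, not figures NVIDIA quoted here:

```python
# Derive the table's memory bandwidth and texture fill rate figures.
# Bandwidth: (bus width in bits / 8) bytes per transfer x memory clock x 2,
# since GDDR3 transfers data on both clock edges.
# Fill rate: core clock x texture units.

def memory_bandwidth_gbs(bus_bits: int, mem_clock_ghz: float) -> float:
    return bus_bits / 8 * mem_clock_ghz * 2  # GB/sec

def fill_rate_gtexels(core_clock_mhz: int, texture_units: int) -> float:
    return core_clock_mhz * texture_units / 1000  # billion texels/sec

#             bus bits, mem GHz, core MHz, texture units (assumed)
cards = {
    "8600 GTS": (128, 1.0, 675, 16),
    "8600 GT":  (128, 0.7, 540, 16),
    "8500 GT":  (128, 0.4, 450, 8),
}

for name, (bus, mem, core, tmus) in cards.items():
    print(f"{name}: {memory_bandwidth_gbs(bus, mem):.1f} GB/sec, "
          f"{fill_rate_gtexels(core, tmus):.1f} billion texels/sec")
```

The computed values (32.0, 22.4 and 12.8 GB/sec; 10.8, 8.6 and 3.6 billion texels/sec) line up with the table above.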

NVIDIA does not integrate HDCP support into its GeForce 8600 and 8500 products; add-in board manufacturers must purchase external EEPROMs with HDCP keys. Expect all GeForce 8600 GTS-based products to ship with HDCP support, however. NVIDIA does not require HDCP on GeForce 8600 GT and 8500 GT-based products, so manufacturers are free to include or exclude the feature at their own discretion.

Expect NVIDIA’s new products to hit retail immediately after the announcement. AMD expects to launch its attack on the GeForce 8600 and 8500-series next month with its upcoming ATI Radeon HD 2600 and HD 2400.


Comments

R600...where are u...
By klingon on 4/17/2007 6:07:24 AM , Rating: 2
well...im jus waitin for the upcoming ATI Radeon HD 2900 XT, HD2600 and HD 2400....

ATi's R600 delay is somewhat similar to the PS3's delay, except for the fact that there weren't any bogus announcements with the R600.

As for the nvidia...the problem is that they should have launched 8600gts, 8600gt and 8500gt during the 8800GTX & GTS launch.




RE: R600...where are u...
By Metroid on 4/17/2007 6:32:46 AM , Rating: 2
It is terrible to think that the R600 still looks premature compared to the G80 in benchmarks because of driver support. I hope things will change for its sake. The truth is, as long as NVIDIA does not have a solid driver for Vista, I do not see any reason to blame ATI for delaying the R600.


RE: R600...where are u...
By AndreasM on 4/17/2007 8:13:25 AM , Rating: 2
quote:
The truth is as long as Nvidia does not have a driver for vista


http://www.nvidia.com/object/winvista_x86_100.65.h...
http://www.nvidia.com/object/winvista_x64_100.65.h...

Both are WHQL certified. One can argue about their quality (haven't personally bothered to install Vista yet so I have no clue), but claiming that they don't exist is incorrect.


RE: R600...where are u...
By SpaceRanger on 4/17/2007 9:45:53 AM , Rating: 3
While they ARE WHQL Certified, I would hardly call them solid performing drivers. I'm currently using the Beta 101.41's and I'm still not pleased with the performance I am getting out of them.

I'd like to see more effort put into releasing better drivers rather than crippled hardware (compared to their GTX counterparts).


RE: R600...where are u...
By othercents on 4/17/2007 11:05:17 AM , Rating: 3
This could still be an issue with Vista, since all manufacturers are having performance issues. Microsoft changed a bunch of things, which is the same problem we had moving from Windows 98 to Windows XP. It is just going to take time, and it will usually take game developers changing the way the game interacts with the video card.

Not everyone has the extra money to spend on the top of the line cards, but DX10 cards available in everyone's price range will generate more sales than updating drivers for early adopters to a new operating system. Plus I'm sure that Nvidia has multiple teams running that have different tasks. One for hardware development and another for Software/Driver development.

Other


RE: R600...where are u...
By SpaceRanger on 4/17/2007 11:31:32 AM , Rating: 2
I'm aware there are multiple teams for multiple tasks (R&D, etc.), but the team behind the Driver Development has not been performing all that great (from the eyes of a consumer).


RE: R600...where are u...
By jrb531 on 4/17/2007 12:23:30 PM , Rating: 2
Going to make a small change in your comment....

Not everyone has the extra money to spend on the top of the line cards, but DX9 cards available in everyone's price range will generate more sales than updating drivers for early adopters to a new operating system. Plus I'm sure that Nvidia has multiple teams running that have different tasks. One for hardware development and another for Software/Driver development.

See what the change was? *points to DX9*

Does anyone remember those great "first to market before the drivers were finished" DX9 cards?

Anyone?

Anyone?

Why the now "infamous" "FX" cards... namely the 5200, 5600 and 5800 cards that were perhaps the very worst cards Nvidia ever produced. Also remember that the FX cards actually ran the then current DX8 code very well but when real DX9 games came out the fireworks began.

While I'm not saying that the 8xxx cards will be a bust in DX10 the similarities are interesting. The 8xxx's run DX9 code very well and the new drivers are late, lacking and untested in real DX10 games.

After the last incident I am no longer taking Nvidia's word for it that they will be killer DX10 cards. I'll wait and see once real DX10 games come out and not some DX9 games with minor DX10 "hacks" added.

-JB


RE: R600...where are u...
By griffynz on 4/17/2007 8:41:01 AM , Rating: 1
what ever !

I was waiting for AMD to release something that would beat a C2D; well, I bought the Intel E4300 and over-clocked it to 3200ghz because I can't wait forever. This is my first non-AMD computer. I bought an 8800 GTS 320, and Vista, and you know what? I am loving every second.
If you wait for ever you might miss the boat.
All these web site tell you stay with XP, what they should say is :
If you are playing openGL games you MIGHT need XP.
The new drivers from NVIDIA are fine, I get 75 to 145 fps on Rainbow six Vegas, oh and I'm happy... have been for 4 weeks now...


RE: R600...where are u...
By GoatMonkey on 4/17/2007 9:01:52 AM , Rating: 2
That's some serious overclocking. All the way to 3200ghz from 1.8ghz, awesome. It must be running at absolute zero.


RE: R600...where are u...
By deeznuts on 4/17/2007 1:09:33 PM , Rating: 2
Nope bro, the E4300 apparently does some serious oc'ing without phase.


RE: R600...where are u...
By throughhyperspace on 4/18/2007 11:17:32 AM , Rating: 2
What? You don't have your Kelvin Kooler (TM) pushing your cpu to 3.2 terahertz?


Where will it end???
By jabber on 4/17/2007 5:05:26 AM , Rating: 2
Hmm extending the heatsink outside of the case now?

Time for a new, more sophisticated approach to graphics, rather than the current bruteforce method? PowerVR style maybe?

Nice to see the new mid-range though. Will be checking these out to see what they offer against the 7900GT.




RE: Where will it end???
By Jedi2155 on 4/17/2007 5:21:14 AM , Rating: 2
But like the PowerVR, using a new approach would probably cause problems in rendering unless developers specifically designed around those problems (as in the Kyro's case with tile rendering). We're unlikely to see a new method unless the big 2 both decide to switch to it at the same time (like a DX generation jump) to ease the burden on developers.


RE: Where will it end???
By jabber on 4/17/2007 5:31:44 AM , Rating: 2
Surely there would be no harm in offering a tile style rendering method as a built in alternative? Could well help the budget sector if you could switch between full image rendering and tile rendering.

Would be interesting to see what todays cards could put out framerate wise if they only had to render 60% of a scene.

Could well mean less power consumption too.


RE: Where will it end???
By Chadder007 on 4/17/2007 10:21:22 AM , Rating: 2
Agreed. It's a great loss to the gaming community that tile rendering has taken a backseat.


RE: Where will it end???
By jmke on 4/17/2007 5:27:50 AM , Rating: 2
Actually a good idea to move the heat outside the case; do note that this is NOT the reference heatsink. You can't see a fan on there, so cooling is achieved through heatsink surface area.


RE: Where will it end???
By Anh Huynh on 4/17/2007 10:08:44 AM , Rating: 3
The images posted are the Gigabyte models with the Silent-Pipe heat sinks. Silent-Pipe is Gigabyte's passive heat sink for graphics cards. NVIDIA board partners can install any cooler they choose to and Gigabyte's solution does not reflect the choice of other board partners.


RE: Where will it end???
By rbuszka on 4/17/2007 12:34:50 PM , Rating: 2
I, for one, am actually glad that these OEM partners (Gigabyte for AMD and NVidia, and HIS and PowerColor for AMD) are coming out with silent, passively-cooled thermal solutions. Not only is there no fan to break or fail, but these designs are completely noiseless. I'm sick of listening to noisy PCs, which is why I spent a good $200-$250 more on my most recent PC project for silent fans, a silent CPU cooler, a microprocessor-based programmable fan controller from MCubed Tech, and the fanless HIS Radeon X1650XT. I'm glad to see that all these manufacturers are finally ready to talk about noise elimination. If you want your bleeding-edge overclocking, you can just buy one of the cards with a fan-forced heatsink instead, and quit complaining.


Not promising...
By Rookierookie on 4/17/2007 6:19:04 AM , Rating: 2
Yesterday I read some magazine reviews which put the 8600GTS' performance at below that of the X1950Pro, the 8600GT's performance below the 7900GS, and the 8500GT's performance below the 7600GS.

I'm still waiting to see some online reviews, but if the results I read were legit, it really doesn't speak well for Nvidia's new toy...I do think I'd rather sacrifice DX10 to have 50% higher DX9 performance.




RE: Not promising...
By redbone75 on 4/17/2007 8:37:12 AM , Rating: 2
Why don't you guys go check out [H]ard|OCP for a sec. Seems your speculations and worries are all for naught. The 8600 GTS kicks the X1950 Pro's butt. Plain and simple. Pretty good read, too. Just might have to pick up one of these babies. I'm comfortable with the sub $250 category.


RE: Not promising...
By kalak on 4/17/2007 10:01:33 AM , Rating: 2
quote:
go check out [H]ard|OCP for a sec


Links, please ?


RE: Not promising...
By KristopherKubicki (blog) on 4/17/2007 10:04:27 AM , Rating: 2
RE: Not promising...
By kalak on 4/17/2007 2:11:19 PM , Rating: 2
I checked the review at HardOCP and, if their review is correct, this beauty is really better than my ATI X1950 PRO. But I can't find one to buy here in Brazil, only the 8800 GTS 320 ($600). So I will wait some more time to change my video card....


RE: Not promising...
By kalak on 4/17/2007 2:14:05 PM , Rating: 2
Castrated
By psychobriggsy on 4/17/2007 7:49:43 AM , Rating: 2
I don't understand why there is no 64 shader variant.

We have 128, 96, 32, 32 and 16 at a paltry clock speed.

The 16 shader variant is not 1/6th of an 8800GTS, due to the slower shaders it's more like 1/10th. The $200-$230 8600GTS is 1/3 of a $300-$330 8800GTS - you simply cannot justify the purchase with this comparative performance/dollar. Sure the core is clocked a little higher to make up some of the difference...
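The ratios being estimated here can be checked against the spec table in the article: multiplying stream processors by shader clock gives a crude throughput figure (this deliberately ignores per-clock instruction-issue differences between the parts):

```python
# Crude shader throughput = stream processors x shader clock (GHz).
# Ignores per-clock instruction-issue differences between the parts.
def shader_throughput(stream_processors: int, shader_clock_ghz: float) -> float:
    return stream_processors * shader_clock_ghz

gts_8800 = shader_throughput(96, 1.2)   # 115.2
gts_8600 = shader_throughput(32, 1.45)  # 46.4
gt_8500  = shader_throughput(16, 0.9)   # 14.4

print(gt_8500 / gts_8800)   # roughly 0.125, i.e. about 1/8 of an 8800 GTS
print(gts_8600 / gts_8800)  # roughly 0.40, a bit better than 1/3
```

By this crude measure the 16-shader part comes out at about 1/8 of an 8800 GTS, and the 8600 GTS at roughly 40% for two-thirds of the price.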

I guess this shows how expensive the 8800GT[X|S] core is to make, if these cut down variants are so costly still.

Could DailyTech include a table of the family, with all the specifications inside (shaders, clock speeds, texture fill rate, etc) for easier comparison?




RE: Castrated
By Pitbulll0669 on 4/17/07, Rating: 0
RE: Castrated
By bubbacub616 on 4/17/2007 8:53:59 AM , Rating: 2
over the last 2 generations ATI/AMD haven't given the best value (e.g. 6600gt>x700, 7600gt>x1600 etc).

here's hoping they can go back to the days of the 9600pro


RE: Castrated
By Vanilla Thunder on 4/17/2007 1:06:11 PM , Rating: 2
No one knows whose offering will take the cake, Nvidia or Dammit, but I think buying a card now, without waiting to see what the next few weeks will offer, is pure idiocy. Options are the reason we have PCs, not an iMac, and I think they should all be considered before a purchase like this. Let's wait for some benchmarks, and stop trying to look into the crystal ball of bullshit and make unfounded "guesses". It'll all be on paper soon enough.

Vanilla


RE: Castrated
By Chadder007 on 4/17/2007 1:19:08 PM , Rating: 2
Who knows....that may come as a 8600GT X


RE: Castrated
By Aquila76 on 4/18/2007 7:49:36 AM , Rating: 2
I think you may be on to something. What a way for nVidia to steal thunder from ATI's upper mid-range release - show decent but not exceptional performance up front and then surprise ATI with a killer part at the end.


Disappointment
By TechLuster on 4/17/2007 5:16:59 AM , Rating: 5
I strongly suspect that the universal disappointment with the specs of the 8600 GTS (less than half an 8800 GTS) expressed in these forums over the past few days is going to give way tomorrow to universal disappointment with the performance numbers.

NVIDIA, would it have been so hard to give us a 64-shader, 192-bit, 384MB card for around $230??




RE: Disappointment
By KaiserCSS on 4/17/2007 9:54:05 AM , Rating: 2
Amen to that.

I've been eagerly awaiting the lower-cost 8X00 series cards for a while now, preparing for DX10, but when I looked at the specs, I was underwhelmed, if not outright disheartened. I suppose I'll just wait for the R600 series. After ATI's release, we'll be able to see some solid comparisons performance-wise.


RE: Disappointment
By edge929 on 4/17/2007 3:17:09 PM , Rating: 2
R600 FTW. Patience is a virtue.


Lack of hdcp on ALL models...
By stepone on 4/18/2007 10:42:14 AM , Rating: 2
I for one am more disappointed by the lack of HDCP on the 8600 GT & 8500 GT. Yes, the individual card makers can add support themselves, but from looking through the specs on the initial cards released from XFX, Asus, BFG etc., it appears that NONE of the cards below the GTS are sporting HDCP support.

I was in the process of choosing components for a new media PC when these cards were released, and it seemed like a stroke of luck for low-power cards with excellent HD video decode to arrive at just the right moment to go into my new system (coupled with a cheap Xbox 360 HD DVD drive, or possibly Blu-ray in the future when it becomes cheaper or if it wins the format war)... boy was I mistaken!

For most gamers out there this won't be a problem; however, with the new PureVideo 2 engine providing excellent hardware decode of all the codecs used in both HD DVD and Blu-ray, these cards would be perfect for a media centre PC, and at a good price (around £55 for an 8500 GT / £90 for an 8600 GT). I just assumed that even though HDCP wasn't mandatory on these cards, most of the board makers would still integrate HDCP support, but since that has fallen through I then cast my eye to the GTS. Now remember, I won't be using the card for gaming, so lukewarm frame rates in <insert generic fps game name here> don't bother me so much.
However, what DOES matter is the noise of the card, and lo and behold it's strike 2 against the card, with just about every review stating that the fans are pretty damn noisy. Worse still, the 8500GT cards have only 2-pin fan connectors and thus no thermal control, meaning they run the fan at full speed ALL THE TIME!

After having carefully picked out Noctua case fans and a CPU heatsink & fan precisely for their low noise and good cooling, I'm not about to have the graphics card mess that up for me.
Now, I realise that the 2nd revision of these cards will hopefully redress these issues, but they probably won't be available for another 2-3 months and I'm not waiting that long to build my new system.

My only hope now lies with the new AMD ATI cards due out in May, in the hope that they offer HDCP on their HD 2400/2600 and that their new UVD lives up to expectations.

Who knows, perhaps this was just fate dashing my hopes, but I hope that by making me wait a few more weeks until ATI release their new cards it may all work out in the end...





Copyright 2014 DailyTech LLC.