The 9800 GTX+ reference cards have a shader clock of 1836 MHz and a graphics
clock of 738 MHz. The new GTX+ GPU supports GPU-accelerated PhysX as well as
CUDA-based applications such as Folding@Home and Badaboom video transcoding.
The GPU supports both 2-way and 3-way SLI -- just as the original 9800 GTX
does. NVIDIA claims that, pitted against the Radeon HD 4850, the GTX+ is on
average 22% faster. NVIDIA’s reviewers guide for the 9800 GTX+ shows the card
delivering 30 fps in Crysis at 1920 x 1200, versus about 27 fps for the HD
4850. PhysX on the 9800 GTX+ is supported with driver version 177.39. NVIDIA
also claims that the PhysX capability of the 9800 GTX+ gives a noticeable
performance boost on 3DMark Vantage CPU test 2, to the tune of a 7.5x gain.
Perhaps the best news for gamers looking for a video card that performs well
but doesn’t cost a fortune is that the 9800 GTX+ will retail for $229. The
old 9800 GTX drops to just $199, a great price-for-performance bargain
considering the card sold for over $300 just a few weeks ago.
One more important note on the 9800 GTX+ GPU: it is built on a 55nm refresh
of the original G92 core used in the 9800 GTX, which was built on a 65nm
process. Unconfirmed rumors point to a July 16 launch for the 9800 GTX+.
quote: 9800 GTX+ = 8800 revision number 3
quote: Originally Posted by Slappi, 09-Jun-2008, 22:26 You guys didn't learn your lesson from the 2900 fiasco. Same pre-launch hype all based on rumors. Everyone looked stupid launch day. Guess we are all in for a repeat.
quote: your card wasn't ramping up under load fan wise
quote: While the idea of a quality product AND a low price is a new one for nVidia...
quote: Beautiful way of encouraging ATi, without whom nVidia would continue to line their pockets with your parents hard earned money.
quote: I believe that the 1836MHz clock speed is of the shaders, the memory clock speed should be something like 1000MHz - 1200MHz.
quote: ...am I the only person who hasn't seen these benchmarks yet? Or has AMD gone 'hey Nvidia lets compare cards, yes, that should be a great experience' before giving them to the rest of us?