
New GeForce 7100GS, 7900GS and 7950GT

NVIDIA is set to release three new products in the coming weeks to do battle with ATI during the holiday season. This time around NVIDIA is launching new mid-range and entry level offerings to take on ATI’s recently launched Radeon X1300XT, X1650 Pro, X1900XT 256MB, X1950XTX and X1950 CrossFire. NVIDIA’s new offerings arrive in the form of the GeForce 7100GS, 7900GS and 7950GT.

At the entry level, NVIDIA is replacing its GeForce 6200TC with the new GeForce 7100GS. The G72-based 7100GS retains Turbo Cache support like the GeForce 6200TC it replaces: 128MB of onboard memory translates into 512MB of effective memory with NVIDIA’s Turbo Cache technology. The GeForce 7100GS pairs a 64-bit memory interface with DDR2 memory and offers four pixel shader units, with the core clock set at a rather conservative 350 MHz. NVIDIA SLI and PureVideo technologies are supported by the GeForce 7100GS, though it lacks the SLI bridge connector.

Moving up is the retail NVIDIA GeForce 7900GS. Previously the GeForce 7900GS was available only as an OEM offering from system manufacturers such as Dell; Woot also sold OEM versions of the MSI GeForce 7900GS for $139 a pop a couple of days ago. Specifications for the retail GeForce 7900GS are identical to the OEM versions: a 450 MHz core clock is joined by a 1.32 GHz memory clock, 20 pixel shader units and 256MB of graphics memory. Support for HDCP is dependent on individual graphics card manufacturers.

Dell has been selling the 7900GS in its XPS systems for some time, but the cards have not been available for individual purchase.

At the upper end of NVIDIA’s mid-range lineup is the GeForce 7950GT. Sporting a 550 MHz core clock and a 1.4 GHz memory clock, the GeForce 7950GT succeeds the previous GeForce 7900GT. Memory size has increased to 512MB, while the GeForce 7950GT retains the 24 pixel shader units of the GeForce 7900GT. HDCP compliance and dual dual-link DVI outputs are standard on all GeForce 7950GT cards as well.

SLI technology is supported on the GeForce 7900GS and 7950GT. GeForce 7100GS, 7900GS and 7950GT cards are expected to be available in the first half of September for less than $100, $199 and $299, respectively.


Comments



G80...
By Nightmare225 on 8/25/2006 2:18:46 PM , Rating: 4
G80, G80, G80, G80, G80, G80, G80, G80...

Do you think this subliminal message is working? ;)




RE: G80...
By mendocinosummit on 8/25/2006 2:59:57 PM , Rating: 1
I thought the G80 was going to be released in less than two months? Why are they releasing these?


RE: G80...
By Josh7289 on 8/25/2006 3:26:32 PM , Rating: 2
Because G80 is going to be some ultra high end GPU. These refreshes serve the vast majority of the rest of us quite well.

Besides, early DX10 cards are going to suck for DX10.


RE: G80...
By Smurfer2 on 8/25/2006 3:45:41 PM , Rating: 1
I hope not and I still want a G80. :( I hope that message of yours works.

Wait wait wait, G80 ultra high end? The 7900GS does not replace the 7900GT and the 7950GT is just an attempt to unseat ATI. Right? I mean, the G80 is a whole new core, so I hope this is just trying to keep pace with ATi.


RE: G80...
By RussianSensation on 8/25/2006 4:02:34 PM , Rating: 3
Yes, but...take the 9700 Pro for example, the first DirectX 9.0 card. It was a really fast card for its time, and even the 9800 Pro was not worth the upgrade from the 9700 Pro. It was 2x faster than the GeForce 4 when it was released and was still faster than the 5800 series that arrived six months later. It wasn't amazing for later DX9 games, you are right, but it still delivered a 1024x768 Doom 3 experience and ate all the DX8.1 games for breakfast.

In fact, with R600 being a unified shader design, it should be a lot faster in all current DX9 titles and still good enough for early DX10 titles. Whether DX10 titles will take a year or longer to arrive, as DX9 titles did, remains to be seen, but remember that next-gen cards will probably be 2x faster in new graphics-heavy DX9 titles such as U2k7, ET: Quake Wars, Crysis and COD3, where the 7900 and X1900 series will start to struggle.



RE: G80...
By hadifa on 8/25/2006 7:16:30 PM , Rating: 2
I second that.

The 9700 Pro was a sweet, sweet card. It was amazing from the beginning and it kept improving with driver releases.


RE: G80...
By sxr7171 on 8/25/2006 10:31:18 PM , Rating: 2
Yeah, the 9700 Pro was a sick card. I almost regret selling mine; that thing was the best. It really put ATI back on the map at a time when it seemed that nVidia was going to make a 3DFX out of them.


RE: G80...
By Hare on 8/26/2006 5:33:00 AM , Rating: 2
Don't forget the 9500 cards with the 9700 PCB. Easy to mod to a 9700. Can't beat the value. I also had a 9700, great card.

Hard to imagine seeing another leap like the one with 9700 (GeForce4 -> 9700).


RE: G80...
By Jedi2155 on 8/26/2006 5:34:41 PM , Rating: 2
I can't agree more. I had a 9700 Pro for almost 3 years before I finally succumbed to the upgrade urge.


RE: G80...
By mykelu on 9/3/2006 1:09:43 AM , Rating: 3
Crysis will use DX10 by the way, so next gen cards will make the game look shinier ;D


RE: G80...
By FITCamaro on 8/27/2006 5:20:51 PM , Rating: 2
Maybe NVIDIA's are, since their architecture doesn't use unified shaders as DX10 calls for. However, ATI's does, so it should run DX10 just fine.


RE: G80...
By coldpower27 on 8/29/2006 4:48:48 PM , Rating: 4
Unified shaders aren't a necessary feature to be DX10 compliant. We will have to see how fast each implementation is.


RE: G80...
By Crusader on 8/29/2006 6:12:06 PM , Rating: 2
You'll see the R600 being equal to or faster than the G80 in native DX10 apps... which won't exist on a mass scale for years and years, until the UE4.0/Source2/Doom4 engines are released.

Meanwhile, G80 will be able to run DX10 games in all their glory while providing vastly superior legacy DX9 support, thanks to specialized, efficient shaders that require no driver coding and/or balancing schemes to weigh what each application demands for vertex/pixel shading at a given moment in-game.

G80 will most likely be the best card considering the longevity of the DX9 era... ponder the fact that we don't even have Unreal Tournament 2007 (UE3) yet, quite possibly the most influential and important game engine "series" available, and you will see that there will be very few DX10 games.

Even Dark Messiah of Might and Magic is Source-based. The vast majority of people won't have fast DX10 cards for years, nor is Vista exactly a hit either... DX9 will be here for quite a while, as there's a lot of life in the efficient and beautiful Source/Doom3/UnrealEngine3.0-based games.


RE: G80...
By SyK on 8/31/2006 2:12:42 PM , Rating: 2
And you don't think anybody will drop DX10 support into UE3/Source/Doom3, given that the huge majority of changes are in the shaders?

Overly pessimistic!

That said, I voted "Worth a read" for the first time; at least you have a reasonably informed opinion. But please do not underestimate the motivation of developers: what % do you think even has "fast" DX8 cards these days, yet here we are with DX10?...


RE: G80...
By SyK on 8/31/2006 2:14:05 PM , Rating: 2
Wait, apparently I cannot post here and still vote for one? How weird! Apologies... But nonetheless, consider yourself that little bit respected. ;)


"Nowadays you can buy a CPU cheaper than the CPU fan." -- Unnamed AMD executive
