
NVIDIA documents reveal details about the GeForce 7900 GTX, including its core clock speed

We've come across information confirming NVIDIA's upcoming GeForce 7900 GTX, the company's next flagship. According to NVIDIA documents, the new cards won't be announced until early March (the 8th or 9th). The new GPU will be manufactured on a 90nm process and is set to operate at 655MHz. Although the clock speed has not been officially confirmed, our sources indicate it is final.

Quick specifications for GeForce 7900 GTX:
  • PCIe native
  • 655MHz core frequency
  • 256-bit memory interface
  • 52GB/sec. memory bandwidth
  • 15B pixels/sec. fill rate
  • 1450M vertices/sec.
  • 24 pixels per cycle
  • Built-in dual dual-link DVI support for driving two 2560x1600 displays
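The quoted fill rate follows directly from the core clock and pixels-per-cycle figures above; here is a minimal sketch of that arithmetic, treating the listed numbers as simple peak theoretical rates (an assumption on our part, not something the NVIDIA documents spell out):

```python
# Sanity check of the quoted peak rates, treating them as plain
# clock * per-cycle products (theoretical peaks, not measured throughput).
core_clock_hz = 655e6        # 655MHz core frequency
pixels_per_cycle = 24        # 24 pixels per cycle

pixel_fill_rate = core_clock_hz * pixels_per_cycle
vertices_per_clock = 1450e6 / core_clock_hz  # from the 1450M vertices/sec figure

print(f"Pixel fill rate: {pixel_fill_rate / 1e9:.2f}B pixels/sec")  # ~15.72B, matching "15B pixels/sec"
print(f"Implied vertices per clock: {vertices_per_clock:.2f}")      # ~2.21
```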
The documents from NVIDIA also indicate that the GeForce 7900 GTX will be "twice as fast as previous generation chipsets" in floating-point performance, though it would be difficult to consider the 7900 series a true next-generation part relative to the GeForce 7800. No confirmed board pictures were available at the time of this article, but the GeForce 7900 GTX is set to be a dual-slot design. According to the documents, NVIDIA is making sure the launch of the GeForce 7900 GTX is a hard launch, meaning users can expect products to be available for purchase on the day of the announcement.

NVIDIA does have one trick up its sleeve. In late March, NVIDIA will also announce a single-slot version of the GeForce 7900 specifically for Quad SLI setups. Some manufacturers have suggested that Quad SLI may even become available as a discrete solution -- not just Dell's plaything.


Comments

What happened to 32 pipes?
By tonjohn on 2/21/2006 5:39:02 PM , Rating: 2
This card was supposed to have 32 pipes but now it just has 24... That is rather unfortunate.

I feel sorry for all those guys waiting to get a PS3...




RE: What happened to 32 pipes?
By gersson on 2/21/2006 6:14:42 PM , Rating: 2
I agree. So, is it faster or slower than an X1900 XTX? Because if it's slower I'll place my order this second. YES, I know no one can know for sure, but surely some of you nerds (I mean that affectionately) know enough to make an educated guess.


RE: What happened to 32 pipes?
By tonjohn on 2/21/2006 6:19:43 PM , Rating: 2
I'm gonna go with slower, but I bet performance would be pretty similar.


RE: What happened to 32 pipes?
By feraltoad on 2/21/2006 6:58:38 PM , Rating: 2
I would go with faster, but pretty similar. Look at this http://www.theinquirer.net/?article=29229

That is a 7800GTX 512 (o/ced, I think) and it keeps a little bit ahead of the X1900 XTX. I would expect the new 7900 to be 25 to 30 fps over a regular GTX in most things due to that massive clock increase, but I'm just guessing. It does suck that for months now they've been saying 32 pipelines and now it's 24. lol, just 24 *remembers his 8-pipe 9700 Pro*


RE: What happened to 32 pipes?
By crazydingo on 2/21/2006 7:10:45 PM , Rating: 2
25 to 30 fps over a regular GTX is a blanket statement.

~15% faster than the 512MB GTX is a better guess.


RE: What happened to 32 pipes?
By Sharky974 on 2/21/2006 9:04:43 PM , Rating: 2
Err, no. At your link, at higher res with AA and AF the 7800 loses almost every single benchmark to the X1900 XTX.

Everybody agrees the X1900 is faster. Really, what benchmarks in that INQ link are you looking at where the 7800 wins?

Now, add another 105mhz, and yeah, 7900 will be more competitive.

Stating 7800 512 beats X1900XTX is simply false, though.


RE: What happened to 32 pipes?
By Sharky974 on 2/21/2006 9:08:49 PM , Rating: 2
quote:
That is a 7800gtx 512(o/ced i think) and it keeps a little bit ahead of the 1900xtx. I would expect the new 7900 to be 25 to 30 fps over a regular GTX in most things due to that massive clock increase, but I'm just guessing. It does suck that for months now they've been saying 32 pipelines and now its 24. lol just 24 *remembers his 8 pipe 9700 pro*

On second thought, you clearly don't know what you're talking about.

First, nobody goes to the INQ for reviews.

Second, the 7800 loses all those benchmarks.

25-30 FPS increase? Where'd you pull that number out of?

And massive clock increase for 7900? It's like under 20%.

All this is not to say the 7900 won't be a nice card. Just that you're crazy here.


RE: What happened to 32 pipes?
By feraltoad on 2/22/2006 2:34:32 AM , Rating: 2
Whoops, looks like I was looking at the 1800 XT. Oh, and you guys are *holes; you can correct or critique without being jerks about it. The guy above wanted to know if the 7900 GTX would be faster than the X1900 XTX, and it will be for sure. I only threw my 2 cents in to try and help someone, and I even said I "EXPECTED" and I was "GUESSING"! Like him, I am also waiting with bated breath to buy one of these cards, either the X1900 XT or the 7900 GTX, depending on how they compare to one another. (My 6600GT AGP can't keep up.)

Well, crazy people like me think a clock speed jump from the 7800 GTX 256's 430 to 650 is massive. As far as fps goes, I looked at benchmarks for the difference between stock and o/ced 7800 GTXs (unfortunately, no one usually posts core-clock-only benchmarks, leaving memory speed at stock).

Yeah, yeah now tell me about linearity and AIB partner clocks..yeesh...
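For reference, a quick sketch of the clock deltas being argued over in this thread, using the 430MHz figure cited above for the stock 7800 GTX 256MB and the 550MHz stock clock of the 7800 GTX 512MB (the "+105MHz" mentioned earlier implies the same 550MHz baseline):

```python
# Percentage core-clock increase of the rumored 655MHz 7900 GTX over the
# 7800 GTX parts, using the 430MHz and 550MHz baselines cited in the thread.
baselines_mhz = {"7800 GTX 256MB": 430, "7800 GTX 512MB": 550}
gtx_7900_mhz = 655

for card, mhz in baselines_mhz.items():
    delta = gtx_7900_mhz - mhz
    print(f"vs {card}: +{delta}MHz ({delta / mhz * 100:.0f}%)")
# vs 7800 GTX 256MB: +225MHz (52%)
# vs 7800 GTX 512MB: +105MHz (19%)
```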


RE: What happened to 32 pipes?
By kilkennycat on 2/21/2006 8:16:21 PM , Rating: 2
And what are you smoking in your pipes? Wait for the full technical analysis of the 7900 GTX (on AnandTech, of course) before judging whether the pixel-pipe count has any relevance to the overall performance. As ATi has already demonstrated with their unified architecture, there are many ways of 'skinning' the video-card performance 'cat'.


RE: What happened to 32 pipes?
By tonjohn on 2/22/2006 1:01:05 AM , Rating: 2
quote:
As ATi has already demonstrated with their unified architecture


Are you referring to the R500 GPU used in the Xbox 360? Because that is the only ATi GPU that uses a unified architecture...


And people were criticizing ATI...
By Shadowmage on 2/21/2006 5:31:55 PM , Rating: 2
Looks like dual slot cooling is back again!




RE: And people were criticizing ATI...
By crazydingo on 2/21/2006 6:01:58 PM , Rating: 2
Back when Nvidia started the dual slot trend, people used to criticize Nvidia.

Hypocrites!


RE: And people were criticizing ATI...
By bob661 on 2/21/2006 7:16:27 PM , Rating: 2
quote:
Back when Nvidia started the dual slot trend, people used to criticize Nvidia.

Yep. Big-time hypocrites. When ATI did the dual-slot thing it was magically OK to use 2 slots. Wankers.


RE: And people were criticizing ATI...
By nts on 2/21/2006 7:44:26 PM , Rating: 3
ATi's solutions don't sound like a vacuum cleaner though :p


RE: And people were criticizing ATI...
By DeathByDuke on 2/21/2006 7:58:15 PM , Rating: 2
And NVIDIA's top cards stayed dual slot for a LOOONNNG time, while ATi's stayed single slot until the X850 XT (though even that can be single-slot cooled; the GTO2, anyone?).

It's inevitable now that dual-slot cooling is needed, with the frequencies cores are reaching.


RE: And people were criticizing ATI...
By Ard on 2/22/2006 2:26:17 PM , Rating: 2
What exactly is a long time? IIRC, NVIDIA had dual-slot cooling on the 5800 Ultra, 5950 Ultra, and 6800 Ultra. ATI has dual-slot cooling on the X850 XT, X1800 XT, and X1900 XTX. Seems about the same to me. Oh, and there were single-slot versions of the Ultra as well. Therefore, I stand by the original statement made.


By johnford64 on 3/6/2006 1:20:18 PM , Rating: 2
But it's become accepted now. I personally prefer the 2-slot coolers as they ALWAYS work better! I don't mind losing one PCI slot, who does?


By Heatlesssun on 2/21/2006 5:57:04 PM , Rating: 2
Shaders are FP intensive, correct? Maybe this is how nVidia will combat ATI's better shader performance.




By OvErHeAtInG on 2/21/2006 6:58:55 PM , Rating: 2
eggs ACT ly.

Everyone's multiplying MHz times pipelines and being all disappointed. Look at the whining in the 7900GT article, WOW. I wish they would just wait a bit and see. What people don't realize is you're almost never fillrate-limited.


By DeathByDuke on 2/21/2006 8:00:00 PM , Rating: 2
Yeah, let's wait and see first, instead of bitching that these are just a core bump. It's possible they've done extra optimisations (improved hidden-surface culling, FSAA routines, etc.) to further its performance over a 7800.


By Ard on 2/22/2006 2:31:06 PM , Rating: 2
Thank you. People need to stop jumping on unconfirmed information. Just last month the Inq was dead sure G71 was going to have 32 pipes; now they're saying it has 24 but maybe has 32. Truth be told, they don't know, and that's all there is to it. For all you know, NVIDIA and its partners are being extremely tight-lipped about what exactly has changed. History suggests that this is going to be more than just a bump in clock speed.


By lik on 2/23/2006 4:12:22 PM , Rating: 2
Fudo at the INQ is a fan of ATI, so NVIDIA news from the INQ can never be trusted.


What about the other 7900s?
By Mr Perfect on 2/21/2006 6:22:35 PM , Rating: 2
Every time either company announces a new top card, there is never anything said about the rest of the high-end line. What information is there about a 7900 GT? Will there even be one? What will it have over a 7800 GT or X1800 XL? I'm willing to bet that most of us don't care about a $550+ card, but a $300 card would be of considerable interest.




RE: What about the other 7900s?
By tonjohn on 2/21/2006 6:24:46 PM , Rating: 2
RE: What about the other 7900s?
By Mr Perfect on 2/21/2006 6:35:48 PM , Rating: 2
Doh. Thanks for the links. I hadn't seen those, and figured this was just like when ATI launched the X1900 XTX and no mention was made of an X1900 XL. Heh, my bad.


RE: What about the other 7900s?
By tonjohn on 2/21/2006 6:36:34 PM , Rating: 2
It's all good! :)


RE: What about the other 7900s?
By DeathByDuke on 2/21/2006 8:00:56 PM , Rating: 2
Personally, forget all the $300+ cards; give me the juicy details on the $179-249 ones!


blah
By RallyMaster on 2/21/2006 6:04:25 PM , Rating: 2
Why would anyone need quad SLi? Competition is great...but not when it's overkill.




RE: blah
By bob661 on 2/21/2006 7:19:48 PM , Rating: 2
I've got a 1kW PCP&C power supply waiting for some quad action!


RE: blah
By smitty3268 on 2/21/2006 9:00:50 PM , Rating: 2
Perhaps for those 30" monitors.


RE: blah
By Griswold on 2/22/2006 7:46:45 AM , Rating: 2
Some people bought diesel generators for when there is a power outage. Well, soon they can put them to good use, as standard wall outlets won't be enough for a top-notch gaming PC.

So yeah, we don't need quad SLI, we need more! Octagon SLI!


nVidia's getting lazy
By Assimilator87 on 2/22/2006 12:52:06 AM , Rating: 2
I now understand why the 7900 is so measly compared to what we thought it would be. Instead of making a really powerful GPU, nVidia is just relying on quad SLI to trump ATi. This is just despicable =P




RE: nVidia's getting lazy
By Brassbud on 2/22/2006 1:39:35 AM , Rating: 2
When I read the 7900 GT specs today I knew the GTX wasn't going to be that special. If the 7900 GTX were the 32-pipe, 700+ MHz monster we were led to believe, NVIDIA would have had the performance crown again. But with these measly specs, G71, from the end-user standpoint, is just an OCed G70. How disappointing; I hope this report is wrong.


RE: nVidia's getting lazy
By Lakku on 2/22/2006 4:03:26 AM , Rating: 2
Well, according to the above links (the ones posted by tonjohn), the 7900 GT has 24 pixel pipes. It is essentially a 7800 GTX on the surface, but they may be adding things such as FSAA+HDR, better aniso, and the like to match ATi. I highly doubt nVidia would release a GT and GTX with the same number of pipes, so these stories conflict. Since this article says 24 pixels per cycle, maybe it means ROPs? Perhaps someone can give us a better answer, but the clear fact is that these stories conflict as to how many pipes the GTX will have, since the linked story clearly states the 7900 GT will have 24 actual pipes.


RE: nVidia's getting lazy
By OvErHeAtInG on 2/22/2006 6:20:43 PM , Rating: 2
I don't see the contradiction. Kubicki's story from early this morning also states the GT will have 24 pipes.
http://www.dailytech.com/article.aspx?newsid=907

They will both have 24 pipes, with the GTX having higher clocks, a dual-slot cooler, and probably twice the memory (assuming the GT has 256MB). Just like the 6800 GT vs. Ultra, yes?


Competition FTW
By Fenixgoon on 2/21/2006 5:13:23 PM , Rating: 5
Regardless of whether you're an ATI or NVIDIA fanboy, you have to admit: competition = better graphics cards = good news for all of us.

Hopefully both ATI and NVIDIA will start competing heavily in the low-midrange ($100-200), because that's what my budget will be :)




RE: Competition FTW
By granulated on 2/21/2006 9:57:14 PM , Rating: 2
Bah!

Due to apparent technical difficulties, no vendor managed to release an AGP 6800 series card with VIVO functionality.

Now it looks like the same thing is going to happen with the weirdly priced 7800GS.




In the end...
By sircuit on 2/22/2006 3:00:42 PM , Rating: 2
It comes down to pushing 5 billion more pixels per second than the X1900 XTX can. If the GTX part does indeed have 32 pipes, make that 10 billion.
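A rough sketch of where those figures come from, assuming the commonly cited 650MHz core clock and 16 pixels per clock for the X1900 XTX (numbers not taken from this article):

```python
# Peak fill-rate comparison behind the "5 billion" / "10 billion" figures.
# The X1900 XTX values (650MHz, 16 pixels/clock) are commonly cited specs,
# not from the article, so treat this as a back-of-the-envelope check.
x1900_xtx = 650e6 * 16          # ~10.4B pixels/sec
gtx_7900_24pipe = 655e6 * 24    # ~15.7B pixels/sec
gtx_7900_32pipe = 655e6 * 32    # ~21.0B pixels/sec (if the 32-pipe rumor had held)

print(f"24-pipe advantage: {(gtx_7900_24pipe - x1900_xtx) / 1e9:.1f}B pixels/sec")  # ~5.3B
print(f"32-pipe advantage: {(gtx_7900_32pipe - x1900_xtx) / 1e9:.1f}B pixels/sec")  # ~10.6B
```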




RE: In the end...
By czarchazm on 2/22/2006 6:59:27 PM , Rating: 2
Yep, pushing those extra pixels out with no aniso. When you turn the shaders on, though, the pipes are waiting for the shader calculations to finish and aren't pushing pixels out. This is just the sort of problem that ATI ran into with Doom 3 and Quake 4: an inherent bottleneck in the specular lighting calculations prevents the pixels from being rendered, so they had to implement an approximation technique to alleviate the problem. Link:

http://www.anandtech.com/video/showdoc.aspx?i=2701...

With such complex shader programs and limited resolutions, more pixels are not the answer. You've got to prepare the surface before you can pave the road.


hmm
By MMilitia on 2/27/2006 7:51:16 AM , Rating: 3
Being in the market for a ridiculously high-end video card at the moment, I am eager to see some benchmarks for the GTX. I would assume that the card will perform at least as well as the X1900 XT. It's not like NVIDIA to release a new high-end card which performs worse than the competition.

Assuming it does perform as well as the XT (or even the XTX, but the difference is too small to care about IMO), I'm a bit worried that it's going to be much more expensive. :/




Awesome!
By del on 2/21/2006 5:18:40 PM , Rating: 2
I can't wait to build a new computer.




asdfas
By Spikesoldier on 2/21/2006 5:31:09 PM , Rating: 2
/drool




Memory Speeds, etc.
By nordicpc on 2/22/2006 9:50:56 AM , Rating: 2
Well, looks like the memory on it will be a touch slower than the 512MB GTX, probably around 1.6GHz. I was really hoping that they would use the same 1.8GHz memory for that extra boost that ATI left out. Those 200MHz could cost them the crown.

I agree with you guys in that I'm a little disappointed that this is just a die shrink. I was hoping to see at least one extra quad, but it looks like we'll all miss that until G80 or whatever is around the corner in summer.

This floating-point performance increase is fairly interesting, though. Maybe tweaked branch prediction or something? Not to mention the effect on the GPGPU movement. Gaming should get a fairly substantial boost with better FP, seeing as most everything is floating point in 3D these days.

Guess we'll just have to wait for the benchmarks to see what the real deal is.
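Incidentally, the ~1.6GHz estimate above is consistent with the bandwidth figure in the article; a quick sketch backing the effective memory data rate out of 52GB/sec on a 256-bit bus (treating both as peak theoretical numbers):

```python
# Back out the effective memory clock implied by the article's figures.
# Peak bandwidth = (bus width in bytes) * (effective memory data rate).
bandwidth_bytes_per_sec = 52e9    # 52GB/sec memory bandwidth (from the article)
bus_width_bytes = 256 / 8         # 256-bit memory interface

effective_memory_clock_hz = bandwidth_bytes_per_sec / bus_width_bytes
print(f"Implied effective memory clock: {effective_memory_clock_hz / 1e9:.3f}GHz")
# ~1.625GHz effective, in line with the ~1.6GHz estimate above
```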




Oh well
By Anemone on 2/23/2006 2:37:09 PM , Rating: 2
Was hoping for 32 pipes. I guess I'll have to wait for the G80. Wonder if we'll see that early in the year or at some far-off point in the future, even IF they can get it to work.

:)




"Intel is investing heavily (think gazillions of dollars and bazillions of engineering man hours) in resources to create an Intel host controllers spec in order to speed time to market of the USB 3.0 technology." -- Intel blogger Nick Knupffer











botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki