

NVIDIA's GeForce 7950GX2 adaptor

Preliminary GeForce 7950GX2 reference benchmarks (rescaled)
NVIDIA is prepping the launch of its Radeon X1900XTX killer

DailyTech earlier reported that consumers will have to wait a little while longer for the 7950GX2 QUAD SLI adaptor, but details of the card were not revealed.  Today we take a deeper look at the specifications of the NVIDIA GeForce 7950GX2 video card. 

NVIDIA's sales material claims the following specifications:

  • 500 MHz core clock
  • 600 MHz memory clock
  • 1GB GDDR3 memory
  • Dual dual-link DVI+HDTV-out
  • HDCP Support

GeForce 7900GTX 512MB cards typically run core clock speeds around 650MHz, meaning the 7950GX2 is significantly underclocked.  However, there are two cores present.

NVIDIA's GeForce 7950GX2 is a dual-PCB adaptor directly derived from the GeForce 7900GX2, which was itself derived from the GeForce 7900GTX ASIC.  The GeForce 7950GX2 takes two GeForce 7900GTX boards and joins them via 32 PCIe lanes; 16 additional lanes are routed out through the card's PCIe connector to the motherboard.  The GeForce 7900GX2 was designed specifically for OEM system builds, and as a result no compromises were made on performance.  The GeForce 7950GX2, however, is designed to be the retail component, and as such a few things needed tweaking for retail sales.
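To put those lane counts in perspective, here is a minimal sketch (in Python) of the implied link bandwidths, assuming standard PCIe 1.x signaling at 250 MB/s per lane per direction; the exact switch topology is NVIDIA's own and has not been disclosed:

    # Implied one-direction bandwidth of the 7950GX2's PCIe links,
    # assuming PCIe 1.x at 250 MB/s per lane per direction.
    PCIE1_MB_PER_LANE = 250

    def link_gbps(lanes: int) -> float:
        """One-direction bandwidth of a PCIe 1.x link, in GB/s."""
        return lanes * PCIE1_MB_PER_LANE / 1000

    print(f"board-to-board (32 lanes): {link_gbps(32):.1f} GB/s")  # 8.0 GB/s
    print(f"card-to-host   (16 lanes): {link_gbps(16):.1f} GB/s")  # 4.0 GB/s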

For example, the cooler on the 7900GX2 has been drastically redesigned and reduced for the 7950GX2.  Gone is the large copper heatsink/fan combo, in favor of a much smaller aluminum heatsink with a more powerful fan.  Furthermore, the majority of the MOSFETs and capacitors positioned at the rear of the card have been moved toward the DVI outputs instead, allowing engineers to cut down the size of the PCB substantially, to nine inches.  It's also important to note that the SLI bridge is not present on the 7950GX2.

NVIDIA plans to target the Radeon X1900XTX with the 7950GX2 launch.  Both cards are approximately the same size, and both consume two expansion slots in a standard chassis.  The marketing material bundled with the 7950GX2 claims that a full system using a Radeon X1900XTX requires just under 400W during peak operation, while a 7950GX2 system will only consume 358W and still provide greater performance. 

Like the GeForce 7900 and 7600, the 7950GX2 has HDCP support built into the PureVideo portion of the 7950 ASIC.  However, once again, it's up to the manufacturers to incorporate the CryptoROM for HDCP keys.  Without this additional chip, the GeForce 7950GX2 cannot do DVI-HDCP. 

The NVIDIA GeForce 7950GX2 graphics card will launch on June 6, and will retail for $599-649.  Even though the card is designed for QUAD SLI, there will be no four-GPU support at the time of launch with the 7950GX2 due to driver constraints.  ATI's Radeon X1900XTX retails for approximately $480 USD.

Update 05/27/2006: You can still view the original performance scaling image here. We originally reported that the 7950GX2 would require SLI to work, but this is only true of QUAD SLI performance.



Comments



So.......
By Bull Dog on 5/24/2006 7:12:34 PM , Rating: 2
It's not really targeted at the X1900XTX. It costs a fair chunk more. And calling the 7950GX2 a single graphics card is really pushing it. I mean, it has 2 PCB's, 2 GPU chips, 2 of just about everything; isn't that really like SLI?

IMO it's not a single graphics card, and it's not really two either. It's sort of a hybrid. Oh, and here's something really funny: essentially it's a couple of 7900GT's mashed together. And whadya know, with a retail price of $600, it costs like two of them mashed together too? How about just getting 2x 7900GT's? They will OC better and, as a result, give better performance for the same money........




RE: So.......
By TheBaker on 5/24/2006 8:57:40 PM , Rating: 2
quote:
Oh, and here's something really funny: essentially it's a couple of 7900GT's mashed together. And whadya know, with a retail price of $600, it costs like two of them mashed together too? How about just getting 2x 7900GT's? They will OC better and, as a result, give better performance for the same money........


Check the article one more time: It's two 7900GTX PCB's mashed together, not two 7900GT's. Two GTX's cost $900-1000, not $600-650, and of course it can't be OC'd as well as a GT, because they are GTX's, which are already basically OC'ed GT's with proper voltage and more memory.


RE: So.......
By Egglick on 5/24/2006 10:55:36 PM , Rating: 3
Yeah, but if you check the article one more time, you'd see that the chips are clocked at 500MHz core and 1200MHz memory. Most 7900GTX's are clocked at 650/1600 or higher.

The 500/1200 speeds of the 7950GX2 are actually slower than a lot of 7900GT's with factory overclocks, especially the memory. Bulldog was correct in saying that this card is more like two 7900GTs.


RE: So.......
By Bull Dog on 5/25/2006 12:08:05 AM , Rating: 2
Lol Egglick. ;)


RE: So.......
By TheBaker on 5/25/2006 1:26:44 AM , Rating: 2
quote:
The 500/1200 speeds of the 7950GX2 are actually slower than a lot of 7900GT's with factory overclocks, especially the memory. Bulldog was correct in saying that this card is more like two 7900GTs.


That depends entirely upon the vGPU on the card. If it is 1.2 volts, like the GT, then I'll stand corrected. But the article states the GPU's sit on GTX PCB's which likely means the vGPU is 1.4 volts, and 500/1200 would be a massive underclock of the card, so why the complaints about OC'ing? Factory clocks mean nothing! What matters is voltage and heat dissipation. If they are underclocked G71's with GTX voltages they can be OC'ed back to GTX speeds. They have probably dropped the clock speeds to reduce heat.

So, let me see if I've got this straight: The card performs like two GTX's in SLI (nVidia claims SLI increases performance 50-70%, which is just slightly above this card's average) and can most likely be taken even higher, is priced like two GT's (which it clearly out-performs), and takes up two slots, instead of the probable four assigned to two GT's or GTX's with aftermarket cooling, and you're complaining about clock speeds? Look to the CPU market, where they've figured out that nobody really cares what the specs on a processor are as long as it performs. If the chart is accurate, this card (or cards, if you prefer) is a strong performer.



RE: So.......
By Sunrise089 on 5/25/2006 3:43:39 AM , Rating: 4
How do you assume it will outperform 2 7900GTs? It's clocked equal to my eVGA on the core and MUCH lower on the memory, and my card isn't even the highest clocked part. The above poster is right when he says this part is closer to 2x 7900GTs than 2 GTXs. So what are your arguments? It might OC better due to higher voltage? Well, if we're overclocking, then that just brings it right back to the 7900GT's level, which also has to OC to perform as well as a 7900GTX. Its possible higher voltage (which I really doubt) is meaningless, since 7900GTs voltmod easily.

There is probably only one reason this card is downclocked: reliability, cost, and/or yield issues necessitated either moving to 7900GT cores or a VERY downclocked GTX core. If heat was the problem, why did they put a SMALLER heatsink on the card (by the way, good luck overclocking to GTX levels with that heatsink)? nVidia rushed this product, and the delays and downgrading of specs easily tell the story of a part that they couldn't get to perform as promised.

In summary you claim it's a 7900GTX, but it:
1) Is downclocked to below GT levels
2) Uses a GT-style cooler
3) Isn't called a GTX

I suppose it's still possible that nVidia used GTX cores, but why would they? In the video card e-penis match, it's just living in denial if you really think nVidia could have extracted extra reliable performance out of this card at this price point and not released it. More than likely 2 GT cores were all they could afford at $600, and marketing told them an $800 part wouldn't sell.


RE: So.......
By Spoelie on 5/25/2006 4:22:31 AM , Rating: 2
The cores they are using are the mobile cores. Its voltage is as low as possible to reduce heat output.

Remember that there are people who want to pop 2 of those in their puter and expect it NOT to burn down their house on bootup.


RE: So.......
By TheBaker on 5/25/2006 10:35:14 AM , Rating: 2
quote:
The cores they are using are the mobile cores. Its voltage is as low as possible to reduce heat output.


OK, now we're getting somewhere. I for one didn't realize they were the mobile cores. I haven't checked to verify that statement, but if it is as you say, then all my arguments are wrong. Everything I said was based on them being chips from the desktop cards.


RE: So.......
By Bull Dog on 5/25/2006 11:00:59 AM , Rating: 2
Spoelie doesn't know what the hell he is talking about. There is NO reason why nVIDIA should use mobile versions of the G71. It would be stupid (cost wise) and unnecessary. You've seen the dinky little cooler on the 7900GT's, right? Well, the 7950GX2 is going to be running at similar speeds and (logically) similar voltages to a 7900GT. nVIDIA would like nothing more than to have people think that the 7950GX2 is two 7900GTX's in SLI, but the fact is, they're not. They are little more than the equivalent of some slightly factory-OCed 7900GTs. Why can't some of you people wrap your heads around that? The 7900GT, 7900GTX, 7900GX2 and the 7950GX2 all have 24 pixel pipelines, so the only thing that matters is clock speed. And there isn't anything magical about that; a 7900GT running at 500/600 would perform virtually identically to a 7900GTX downclocked to 500/600. Performance may vary slightly (1-2% due to memory timings being different), but that's it. So putting two G71 chips onto a special set of PCBs isn't going to somehow magically make them perform better at lower clocks.


RE: So.......
By Spoelie on 5/26/2006 4:12:25 AM , Rating: 2
Oh yes I do know. Look around. You'll find plenty of articles that mention this thing uses the 7900 Go chip. The Go as in mobile. And yes, it does indeed retain its 24 pipelines; why would the top mobile chip have fewer?

The reason is very obvious. The initial batch of their quad SLI setup ran too hot and drew so much power that only specialized, non-retail PSUs and cooling could sustain it. So what better way to reduce power and heat than to use the chips that were designed to run in constrained spaces on limited power?

It would be stupid not to use them. So look up some info before you post.


RE: So.......
By Bull Dog on 5/26/2006 8:22:48 PM , Rating: 2
If you say something as a fact, you gotta back it up with LINKS PLEASE!


RE: So.......
By nangryo on 5/26/2006 4:33:35 AM , Rating: 2
quote:
OK, now we're getting somewhere. I for one didn't realize they were the mobile cores. I haven't checked to verify that statement, but if it is as you say, then all my arguments are wrong. Everything I said was based on them being chips from the desktop cards.


That's why people should read first before making an argument. It could be embarrassing, you know.


RE: So.......
By Bull Dog on 5/25/2006 10:46:53 AM , Rating: 2
Oh so when people pop two 7900GT's (or 2x 7900GTX/X1900XTX) in their computer it burns the house down?


RE: So.......
By TheBaker on 5/25/2006 1:56:33 PM , Rating: 2
Two single cards in SLI have proper cooling, which the GX2's don't have the space for.


RE: So.......
By TheBaker on 5/25/2006 10:40:45 AM , Rating: 2
quote:
There is probably only one reason this card is downclocked: reliability, cost, and/or yield issues necessitated either moving to 7900GT cores or a VERY downclocked GTX core. If heat was the problem, why did they put a SMALLER heatsink on the card (by the way, good luck overclocking to GTX levels with that heatsink)?


Actually, the reason for the downclock is heat dissipation. They used a smaller HSF because the one from the 7900GX2 wouldn't even fit into a standard ATX case. They had to make the card smaller, which means a smaller heatsink, which means lower dissipation, which necessitates a lower clock and voltage.

I agree, on stock cooling it will be virtually impossible to overclock this thing. That also applies to stock voltage. But no hard-core overclocker (meaning someone who will blow $600+ on a piece of hardware just to void the warranty) uses stock voltage or cooling.


RE: So.......
By PrinceGaz on 5/25/2006 4:14:34 PM , Rating: 3
Stock clocks:

7900GT - 450/1320 (256MB)
7900GTX - 650/1600 (512MB)

7950GX2 - 500/1200 (512MB) SLI

What you are getting with the 7950GX2 is closest to a pair of 512MB 7900GT cards, with the core slightly overclocked (like many 7900GTs already are) and slower memory to compensate for there being double the normal amount per card. Really it's no better than a pair of 7900GTs, and arguably worse because it uses slower memory; it's nowhere near as good as a pair of 7900GTXs, and even comparing it with them is silly.

- Gaz.
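For reference, those clocks translate into per-GPU memory bandwidth as follows. A minimal sketch, assuming the 256-bit memory bus used across the 7900 GT/GTX line and the effective GDDR3 rates listed above:

    # Per-GPU memory bandwidth = bus width (bytes) x effective transfer rate.
    BUS_BYTES = 256 // 8  # 256-bit bus, as on the 7900 series

    def mem_gbps(effective_mts: int) -> float:
        return BUS_BYTES * effective_mts / 1000

    for card, rate in [("7900GT", 1320), ("7900GTX", 1600), ("7950GX2, per GPU", 1200)]:
        print(f"{card}: {mem_gbps(rate):.1f} GB/s")
    # 7900GT: 42.2 GB/s, 7900GTX: 51.2 GB/s, 7950GX2: 38.4 GB/s per GPU

By this measure, each half of the 7950GX2 has less memory bandwidth than even a stock 7900GT, which is the crux of the comparison above.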


RE: So.......
By maevinj on 5/25/2006 5:13:44 PM , Rating: 2
Actually, the 7950GX2 has 1GB of memory.

NVIDIA's sales material claims the following specifications:

500 MHz core clock
600 MHz memory clock
1GB GDDR3 memory
Dual dual-link DVI+HDTV-out
HDCP Support


RE: So.......
By PrinceGaz on 5/25/2006 6:25:44 PM , Rating: 3
Yes, but what they mean by 1GB GDDR3 memory is that each of the two cards has 512MB so in total there is 1GB. In practice it is the same as having two 512MB cards, which means the total texture-memory available to the 1GB 7950GX2 is less than 512MB because the textures have to be loaded into the 512MB maximum available to each core.

The "double memory" scam with dual-GPU display-adapters is something that has been going on since the days of the V5 5500, and possibly even earlier (I believe ATI had a dual GPU card many many years ago)

- Gaz.


RE: So.......
By hstewarth on 5/25/2006 6:47:55 PM , Rating: 2
Interesting point. I am curious whether, in SLI mode, the texture memory is distributed between the GPUs, and how that differs between having a single GPU with 1024MB and two GPUs with 512MB each.

Then again, is there anything that would take advantage of it?


RE: So.......
By PrinceGaz on 5/25/2006 7:13:23 PM , Rating: 3
The texture memory cannot be distributed between the GPUs as there is no practical way of knowing well in advance which textures will need to be accessed by each core.

Distributing the textures across the memory directly accessible by each core would be even worse than relying on the equivalent of nVidia's TurboCache or ATI's HyperMemory in their low-end cards, as not only would you be pulling the data across a much slower bus between the two cores, you would be imposing memory-bandwidth overheads on the core with the data just to satisfy the core which wants to read it.

High-performance GPUs have carefully designed 256-bit memory buses operating at speeds in excess of 1GHz, which allows for a bandwidth of well over 32GB/s (256 Gbps). There is no way whatsoever that sort of bandwidth can be passed between two graphics cards, whether they be two separate cards in SLI, or the equivalent glued together that is the 7950GX2.

- Gaz.
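The gap described here is easy to quantify. A minimal sketch, assuming PCIe 1.x rates (250 MB/s per lane per direction) for whatever link joins the two cards:

    # Local GDDR3 bandwidth vs. what an inter-GPU link could carry.
    local_gbps = 256 / 8 * 1.0   # 256-bit bus at an effective 1 GT/s -> 32 GB/s
    x16_gbps = 16 * 250 / 1000   # a full PCIe 1.x x16 link -> 4 GB/s

    print(f"local memory : {local_gbps:.0f} GB/s")
    print(f"PCIe x16 link: {x16_gbps:.0f} GB/s ({local_gbps / x16_gbps:.0f}x slower)")

Even before latency is considered, a fetch that crosses the link runs at roughly an eighth of the speed of one served from local memory.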


RE: So.......
By Bull Dog on 5/25/2006 12:07:36 AM , Rating: 2
I'd advise you to check the article one more time.

I mean, excuse me, have you taken a look at the clock speeds? 500/600. Either I'm as blind as a bat, or that falls into the lower range of factory overclocked 7900GTs.

The article is WRONG about it being two 7900GTX's "mashed" together. It's a custom PCB with two G71 cores on it, along with 16 (8x2) high-density 1.4ns GDDR3 RAM chips. Oh, and maybe you haven't read about how soundly a 7900GTX SLI setup slaps around the 7900GX2 (I mean it should after all, seeing as how it costs a bit more). And for the record, the 7900GX2 is the same as the 7950GX2; you just have a different PCB with a few other minor changes. Clock speeds have remained the same.



RE: So.......
By TheBaker on 5/25/2006 1:38:22 AM , Rating: 2
quote:
Oh, and maybe you haven't read about how soundly a 7900GTX SLI setup slaps around the 7900GX2 (I mean it should after all, seeing as how it costs a bit more).


Maybe you haven't read about how the performance and stability increase dramatically with each driver beta that is given to the OEMs. Everything I've seen indicates that the GX2's will be practically identical to standard SLI setups when they get the bugs ironed out.

And I will stand corrected on the voltage. Apparently stock vGPU is 1.3 volts. However, the guys at VR-zone already figured out a volt mod for this thing, so OC'ing it to the sky will be no sweat with good cooling.


RE: So.......
By Bull Dog on 5/25/2006 11:25:01 AM , Rating: 2
I'm not saying that the 7950GX2 is behind two card SLI setups. In fact, I'd bet that it'll provide performance equal to two 7900GTs in SLI clocked at 500/600.


RE: So.......
By TheBaker on 5/25/2006 2:01:42 PM , Rating: 2
quote:
I'm not saying that the 7950GX2 is behind two card SLI setups. In fact, I'd bet that it'll provide performance equal to two 7900GTs in SLI clocked at 500/600.


Until we see real (read: non-nVidia) benchmarks, I won't say that you're categorically wrong. However, based on the nVidia chart included with the article, the GX2 does in fact outperform two GT's in SLI. Not dramatically, but pretty well, considering the memory clock. My guess is it's because of the memory capacity. In my opinion, 1 GB of memory (instead of 2 x 256 on 2 GT's) makes up for the lowered clocks. Again, though, we won't really know until they are made available.


RE: So.......
By PrinceGaz on 5/25/2006 6:45:22 PM , Rating: 2
What you are getting are two 512MB 7900GTs clocked with a core at 500MHz instead of 450MHz (though many 7900GTs come factory overclocked well above 500), and the memory clocked at 1200MHz (which is well below the usual 1320MHz of standard 7900GTs, and bodes as badly for overclocking the memory as the small heatsink does for the core).

1GB really means that each of the 7900GTs that make up the 7950GX2 has 512MB of onboard memory, it is not unified memory shared between both GPU cores that would make a true 1GB graphics-card. The 1GB is really just for marketing as it would be more accurate for it to be called a 512MB card. Everything except the back-buffer has to be loaded into the memory twice (once for each GPU) and the back-buffer only takes 16MB even at 2560x1600x32 meaning all the rest of the memory is duplicated.

Some people might find it convenient to buy a single card that is equivalent to a pair of 7900GTs (and at about the same price as them), but anyone with any sense would buy the separate cards, as that offers more flexibility and most likely better overclocking potential, because the 7950GX2 uses inferior memory.

In short, the 7950GX2 is a card that will be bought at one extreme by gullible gamers with money to burn, and at the other by extreme overclockers who'll mount their own DIY coolers and use it as part of a quad-SLI setup in order to break records.

- Gaz.
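The arithmetic behind the "really a 512MB card" claim, as a minimal sketch using the back-buffer figure cited above:

    # Only the back buffer is unique to each GPU; textures are mirrored,
    # so usable capacity is one GPU's pool, not the sum of both.
    def backbuffer_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
        return width * height * bytes_per_pixel / 2**20

    per_gpu_mb = 512
    print(f"back buffer at 2560x1600x32: {backbuffer_mb(2560, 1600):.1f} MB")  # ~15.6 MB
    print(f"usable texture memory: ~{per_gpu_mb} MB, not {2 * per_gpu_mb} MB")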


RE: So.......
By PrinceGaz on 5/25/2006 6:56:17 PM , Rating: 2
And just to clarify my point about the 1GB memory being really only 512MB in practice, how often do we refer to a Toledo Athlon 64 X2 processor as having 2MB L2 cache rather than 2x1MB? That is exactly what you are doing if you call the 7950GX2 a 1GB card. Except it isn't even that.

A Toledo can make full use of the whole 1MB L2 cache it has available for each core to perform independent tasks so there is no duplication of data. The 7950GX2 has to load identical data into both halves of its 1GB total memory because both GPU cores are processing the same set of data meaning in practice it behaves almost exactly like a 512MB card.

Calling the 7950GX2 a 1GB card is ridiculous. Even 2x512MB is stretching the truth when it comes to how it operates in practice. But obviously they aren't going to market it as a 512MB card (which is what it really is most like) when they can get away with calling it a 1GB card, and there are fools who will believe them.


RE: So.......
By fsardis on 5/29/2006 8:26:20 PM , Rating: 1
I don't really know everything there is to know about SLI, but if the things you say about the onboard RAM of the two cards are true, then there would be no point for an SLI bridge.

It's my understanding that the bridge transfers data from one GPU to the other, which in essence means one GPU has access to the RAM of the other GPU and vice versa.


How can they claim...
By AppaYipYip on 5/24/2006 7:11:21 PM , Rating: 2
It's the fastest single card in the world...when it's TWO cards using a single connection? Anyone else confused by this statement? It's two cards, using one bus.... @_@




RE: How can they claim...
By Gigahertz19 on 5/24/2006 7:21:05 PM , Rating: 2
Yeah.. this is definitely pushing it. I totally agree; kind of lame of them to say fastest single card. What is ATI going to do, link 4 cards together and say fastest single card? Single card should mean it takes up one PCI Express slot.


RE: How can they claim...
By Lonyo on 5/24/2006 7:47:35 PM , Rating: 3
I thought it did take up one PCI-Express slot....


RE: How can they claim...
By PT2006 on 5/24/2006 7:48:56 PM , Rating: 2
It does


RE: How can they claim...
By Tsuwamono on 5/24/2006 8:31:40 PM , Rating: 2
If you look, the card takes up one, the fan for it takes up another, then the second card takes up one, then the fan for that takes up another. Or my eyes are tricking me and it's only 2 slots. But either way.. it's just two graphics cards put together onto one bus connector.. it's not the fastest SINGLE graphics card in the world... in fact it's not even the fastest TWO cards in the world.

If you notice the graph's left side, you will see that the 7800 GTX is the baseline, and it's considered 1.0. Each line is 0.2 times faster. The TWO 7950 GX2s aren't even twice as fast as the 7800... Sad. Nvidia needs to spend a few hours in the ATI R&D centre.


RE: How can they claim...
By Tsuwamono on 5/24/2006 8:35:01 PM , Rating: 2
Sorry, meant to say 7900 GTX, not 7800 GTX.


RE: How can they claim...
By TheBaker on 5/24/2006 8:52:40 PM , Rating: 2
quote:
If you notice the graph's left side, you will see that the 7800 GTX is the baseline, and it's considered 1.0. Each line is 0.2 times faster. The TWO 7950 GX2s aren't even twice as fast as the 7800... Sad. Nvidia needs to spend a few hours in the ATI R&D centre.


I know you corrected that to the 7900 GTX, but you're missing the point. A single 7900 GTX costs $450-500. Two in SLI cost $900-1000 and have the same performance increase as a single 7950 GX2 that costs $650. This is simply a cheaper way to run an SLI configuration, which makes perfect sense, since (according to the article) that's basically what it is: Two cards soldered together in SLI with a hard-wired bridge and a single bus. You even have to have an SLI motherboard to use it. It shouldn't be any faster than two 7900GTX's - it IS two 7900GTX's! The only REAL difference is the price.


RE: How can they claim...
By akugami on 5/25/2006 6:08:31 PM , Rating: 2
This is closer to two 7900GT's than 7900GTX's.

7950GX2 - 500mhz core, 600mhz memory
7900GT - 450mhz core, 660mhz memory
7900GTX - 650mhz core, 800mhz memory

There are many factory overclocked 7900GT's running at 500-550mhz core and 700-750mhz memory. I guess you can credit the marketing department at nVidia for everyone thinking they're 7900GTX's. Though the main difference between a 7900GT and 7900GTX is memory (256MB vs 512MB) and core speed (450mhz vs 650mhz).


RE: How can they claim...
By dilz on 5/25/2006 2:18:42 AM , Rating: 2
This is an excellent example of a marketing department doing its job. They are attempting to make people abandon critical thinking when evaluating their product.

"World's Fastest Single Video Card"

If you look at the PCI-E connectors, this is a simple "yes."

If you consider the width of the product, you might temper nVidia's statement with an alternative form of their slide title, such as "World's Fastest Video Card That Requires SLI And Two Slot Spaces But Only One PCI-E Slot." If that's how you see it, then maybe you're on to understanding marketing. I think.


RE: How can they claim...
By AppaYipYip on 5/25/2006 8:44:29 PM , Rating: 2
I fully understand that it is in fact TWO video cards, sharing a SINGLE connector. My first post was questioning the validity of the claim "Fastest Single Card." It is NOT a single card, but rather two cards that share a single bus. So by that logic I can claim that my SLi setup is the fastest single video card setup, since it's sharing a single PCI-e graphics bus on the motherboard!


RE: How can they claim...
By MrSmurf on 5/28/2006 9:56:19 AM , Rating: 2
IT'S ONE CARD WITH TWO GPU'S. My lord it's not that difficult to understand... sheesh...!!!


RE: How can they claim...
By tuteja1986 on 5/26/2006 10:33:16 AM , Rating: 2
Also, people that end up buying this will become Nvidia's sucker beta testers, since the technology ain't ready. Drivers are buggy, and there are game crashes and issues everywhere.



HDCP support on non dual GPU setups
By hstewarth on 5/25/2006 12:22:11 PM , Rating: 2
I am curious if nVidia is going to have new cards that include the CryptoROM for HDCP support like the new GX2. Say, maybe a 7950GT.

I am curious if one day someone could have 4 of these GX2's in 8-way SLI - a massive number of video cards.

Video card manufacturers should follow Intel's (and now AMD's) suit and make lower-power processing units.




RE: HDCP support on non dual GPU setups
By maevinj on 5/25/2006 1:51:58 PM , Rating: 2
damn dude can you go 1 post without showing your intel fanboyism?


By hstewarth on 5/25/2006 4:45:01 PM , Rating: 3
At least I am not cussing at others. It's not Intel fanboyism but logic. I assume you're talking about the "(and now AMD)". Intel made its announcement about Conroe, and shortly after you see AMD stating they are also coming out with lower-power chips. I will admit that Intel Netburst chips use more power than current AMD's... but this is no longer going to be true.

In no way was my statement fanboyism toward Intel; I was instead desiring video companies to follow suit and do what the CPU manufacturers have done with lowering the power.

It is pretty sick how fans out there pick on Intel so much, but then when AMD fans think that their chips are being badly commented on, they attack the commenter.

Anyway, comments in places like this, and even articles, are mostly just opinions - so it really doesn't matter much.


RE: HDCP support on non dual GPU setups
By Zoolook on 5/25/2006 4:44:43 PM , Rating: 2
Boy, you've really gotten it wrong. AMD has been leading the way on low power consumption/performance, and it's Intel who is just now catching up with the Core/Woodcrest designs.


RE: HDCP support on non dual GPU setups
By hstewarth on 5/25/2006 4:46:22 PM , Rating: 2
As I told the other person, I was only referring to recent events - post-Netburst - not Netburst and below. Sorry for the confusion.


RE: HDCP support on non dual GPU setups
By maevinj on 5/25/2006 5:05:03 PM , Rating: 2
Nice of you to assume I'm an AMD fanboy. I happen to use both Intel and AMD PC's. Your original statement said the GPU makers should follow Intel and AMD now, but AMD was doing it before and is still producing chips that use less power. Look at the new AM2 EE chips that will be out in a few months. So your statement makes you look like an Intel fanboy by saying Intel started it and AMD just followed. If you're not, then I apologize, but the comments you make on the forums lean heavily that way.


RE: HDCP support on non dual GPU setups
By hstewarth on 5/25/2006 5:27:39 PM , Rating: 3
Also, if you look at my comments closely, I did not assume that you were a fanboy - but instead said that you were cussing at me, which I thought was rude.

Personally, my opinion is that the whole AMD vs Intel thing is very stupid. Intel actually came up with the original architecture, and yes, they made mistakes with Netburst - but it was an evolution toward the Core 2 architecture, in my opinion. I know it's hard for some to think that Hyper-Threading is actually a precursor to dual core, but I believe that it is.

It's not that I don't believe AMD should be out there, but trying to sue their way (anti-trust) to more customers really makes me sick. If it was only Intel, then the prices of CPUs would go through the roof.

Anyway - the real point of my statement, which none of these responses address, is that I wish the GPU makers (ATI and nVidia) would research lowering the power requirements on their GPUs.

Of course, this is my opinion, for what it's worth. Same as anything else in places like this.


RE: HDCP support on non dual GPU setups
By Squidward on 5/26/2006 11:25:03 AM , Rating: 1
This is a test of the emergency comments section, this is only a test. had this been an actual emergency intel stocks would have dropped 20%...

This concludes the test. Have a great day!

p.s. just testing my login here



RE: HDCP support on non dual GPU setups
By hstewarth on 5/26/2006 1:22:38 PM , Rating: 3
FYI: the last couple of days, Intel stock has been going up. So has AMD's. It's best to check before making silly responses. As for the future, a lot depends on that. But I expect Intel will do fine because Conroe and Woodcrest are coming out.

http://money.cnn.com/quote/quote.html?symb=INTC


By Squidward on 5/26/2006 2:21:02 PM , Rating: 2
No offense at all, but that was just a joke, I was only testing my new login and just wrote something as filler. The first post I attempted gave an error. I'm sure Intel stocks are doing great right now, sounds like their new architecture is going to be a huge success for them.





Nice graph....
By the Chase on 5/24/2006 9:40:58 PM , Rating: 2
Wow- twice the performance of the 7900GTX! Oh wait- just misleading advertising- just 40% faster than the GTX. But I'm sure it will be quieter with 2 small high speed fans rather than 1 large slow speed fan. Oh, well maybe not. But the price! I'm sure after the initial week the availability will dry up till people are paying $750 for one of these. What a deal! Er...scratch that too. Why did Nvidia make this again?




RE: Nice graph....
By Fenixgoon on 5/24/2006 9:53:09 PM , Rating: 2
40% should be expected when this "single card" is two cards mashed into one. It's single-card SLI. Quite frankly, I don't see why anyone would find this graph surprising at all. SLI vs. non-SLI with the same card? Well DUH, the SLI will do better.


RE: Nice graph....
By Devil Bunny on 5/24/2006 11:38:59 PM , Rating: 2
If you read the article, they state that it is two cards in SLI using one x16 PCIe slot, and that you have to have an SLI-capable MoBo.


RE: Nice graph....
By mushi799 on 5/25/2006 12:13:05 AM , Rating: 2
First - this takes one slot, not two.
Second - it retails for $600, not $750. And of course you can find it cheaper online.


RE: Nice graph....
By Fenixgoon on 5/25/2006 1:07:28 AM , Rating: 2
Essentially, though, it's on-board SLI, much like the Voodoo5 5500 (of which I am a proud owner :D).

They're sticking 2 cards into 1, big deal. It damn well better be faster.


RE: Nice graph....
By akugami on 5/25/2006 6:19:16 PM , Rating: 2
Its MSRP is $600, which stands for Manufacturer's Suggested Retail Price. Retailers can sell it for $6 or $6,000 if they want to. What people are worried about is that the availability will be so low that we'll see another 7800GTX 512MB redux. Heck, due to nVidia's inability to supply adequate amounts of 7900GT's and 7900GTX's, the price actually went up during the first couple of months and is just now starting to slide downward.


RE: Nice graph....
By Sunrise089 on 5/25/2006 3:46:53 AM , Rating: 2
exactly right TheChase


GPU and the Results
By AggressorPrime on 5/24/2006 11:16:01 PM , Rating: 2
First, these are 2 GeForce Go 7900 GTX GPUs.
Second, it shows the same performance increase as SLI. At low resolutions, it is CPU bound. At high resolutions, it competes better.

The only thing I'm sad about is that it doesn't have an SLI link (which will hurt GPU-GPU transfers) and that it will not launch with good drivers. Hopefully it will support Quad SLI soon afterwards.




RE: GPU and the Results
By Bull Dog on 5/25/2006 12:10:32 AM , Rating: 2
How about some links for the claim that these are mobile versions of the G71 GPU?


PSU would now need more power?
By Blackraven on 5/25/2006 12:22:08 AM , Rating: 2
Since this would be a power hog, would it require me to get a 500w Power Supply Unit? (430w would be inferior & cause problems?)


RE: PSU would now need more power?
By toyota on 5/25/2006 1:38:47 AM , Rating: 2
quote:
Since this would be a power hog, would it require me to get a 500w Power Supply Unit? (430w would be inferior & cause problems?)

Did you even read the article?? It says
quote:
The marketing material bundled with the 7950GX2 claims that a full system using a Radeon X1900XTX requires just under 400W during peak operation, while a 7950GX2 system will only consume 358W and still provide greater performance.


RE: PSU would now need more power?
By Bull Dog on 5/25/2006 10:45:13 AM , Rating: 2
No, I'm not happy. You didn't provide any proof. Neither page you linked to proves that the 7950GX2 is using a mobile version of the G71.

I can tell you right now that it doesn't. nVIDIA is going to use some regular G71 dies. They'll probably put the core voltage at 1.2v, or somewhere close to that, to help keep heat dissipation down.


RE: GPU and the Results
By Bull Dog on 5/25/2006 11:26:58 AM , Rating: 2
^^^ That was odd, see my comment above ^^^


AA and HDR
By almvtb on 5/24/2006 7:53:51 PM , Rating: 2
Did anyone look at the benchmark picture, and notice they used AA and HDR with Serious Sam? I thought that was impossible with the 7 series, so maybe they did a little more than strapping two cards together in SLI.




RE: AA and HDR
By Bull Dog on 5/24/2006 8:38:54 PM , Rating: 2
They can't, and WTH is 1xAA? Maybe it's like the XYZ x 1 = XYZ thing.


RE: AA and HDR
By PrinceGaz on 5/25/2006 1:14:47 PM , Rating: 2
1xAA means that each pixel is generated using one sub-sample (2xAA uses two sub-samples, 4xAA uses four sub-samples, etc). Using a single sub-sample means that it is not doing any anti-aliasing at all; 1xAA is just another way of saying "No AA".

So on the graph where it says HDR, 1xAA; what they are really saying is HDR, No AA.
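Put another way, the resolved pixel is just the average of its sub-samples, and averaging a single sample changes nothing. A minimal sketch:

    # A multisample resolve: the final pixel is the mean of its N sub-samples.
    # With N = 1 the mean is the sample itself, i.e. 1xAA == no AA.
    def resolve(subsamples):
        return sum(subsamples) / len(subsamples)

    print(resolve([0.8]))                 # 1xAA: value unchanged -> 0.8
    print(resolve([0.8, 0.2]))            # 2xAA: edge blends -> 0.5
    print(resolve([0.8, 0.2, 0.2, 0.2]))  # 4xAA -> 0.35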


RE: AA and HDR
By smut on 5/24/2006 10:50:42 PM , Rating: 2
GeForce 7 cards could always do certain types of HDR (the type in the majority of HDR-enabled games that I know of that are out) together with AA, just not the specific type of HDR (FP16) used in Oblivion, so there's nothing else they added to the cards like you claim.


Hmmm...my thoughts
By sideswipe911 on 5/27/2006 5:31:21 AM , Rating: 3
SLI & Crossfire - good innovative ideas (and they should have remained as ideas).
Current crop of games are still CPU limited.
6 years ago - CPU power > GPU power.
Present day - roles reversed.
Quad SLI - two's company and four is really pushing it!!
Intel & AMD - stop napping and wake up.

Bottom line - wait for a CPU that's capable of unleashing the true potential of SLI & Crossfire systems, including Quad SLI, then get a SINGLE DX10-capable card that can outperform SLI and its further iterations, while pairing it with the right processor.




RE: Hmmm...my thoughts
By Master Kenobi (blog) on 5/30/2006 2:34:37 PM , Rating: 2
Agreed.


marketing ploy?
By Shankar on 6/1/2006 2:28:10 PM , Rating: 2
Marketing release... nVidia wants the crown of "fastest graphics card manufacturer" back from ATi, so they release this so-called "single graphics card"... lol.


HDCP eh.....?
By nangryo on 5/24/2006 7:21:00 PM , Rating: 2
With the Image Constraint Token on hold for four more years, I wonder if support for HDCP over DVI will really come to market with this card. Perhaps we should wait for the next-gen card (with DX10 support, probably) to really see HDCP implemented in a consumer card.




RE: HDCP eh.....?
By jkresh on 5/24/2006 8:49:27 PM , Rating: 2
I highly doubt anyone who is buying this card will still be using it (maybe in a 3rd or 4th system as a backup to a backup) when ICT actually starts being enforced (if it ever is, as I think even in 4 years there will still be enough users with HDTV sets without HDCP/HDMI to create a backlash if they start enforcing it). Generally, buyers of this type of card will replace it within a year (and certainly by the time UT2007 comes out, along with other games based on that engine).


RE: HDCP eh.....?
By PrinceGaz on 5/25/2006 7:03:44 PM , Rating: 2
I hadn't heard that the Image Constraint Token thingy which would prevent full-resolution images being outputted has been delayed by four years. I had heard that all early HD-DVD discs (and Blu-Ray as well when they arrive) will not include it, but I didn't know they'd agreed to hold off that long. It's good news indeed if the movie publishers have decided not to include it for the next few years (which will probably mean forever as HDCP will have been broken long before then, indeed it already has) as all it did was make things more difficult for us the movie-buying public.

- Gaz.


Single pcb two gpus
By electriple9 on 5/24/2006 7:33:05 PM , Rating: 1
I think this would only be a single card if you could say it's got one PCB with two GPUs on it.
Thanks




RE: Single pcb two gpus
By Bull Dog on 5/24/2006 8:36:55 PM , Rating: 2
It has 2 (two) PCB's


RE: Single pcb two gpus
By wuZheng on 5/24/2006 9:02:50 PM , Rating: 2
It does have two PCBs; they're just linked together by some PCI-e bus links and nVidia's proprietary technology. But I think anyone can tell from the pic that it's pretty much two of everything, but only one PCI-e x16 connection interface. It's sort of like cheating if you call it the world's "single fastest GPU," but acceptable if you call it the world's "single fastest graphics card," because the term graphics card can refer to a single retail product. Just a thought though.


Two video cards one slot!
By aryaba on 5/25/2006 1:55:42 AM , Rating: 2
This seems to be exactly what I've been waiting for.

Am I correct in assuming that I could run this with two monitors and essentially each one would have its own GPU? If so, two of them hooked to 4 monitors would be perfect.




RE: Two video cards one slot!
By Spoelie on 5/25/2006 4:27:44 AM , Rating: 2
two video cards, two slots.

you better be happy it doesn't take 4 slots.


Good lord
By DigitalFreak on 5/25/2006 10:52:20 AM , Rating: 2
This card will not be faster than, or even as fast as, a 7900GTX SLI setup. Both GPU and memory speeds are quite a bit lower than the standard ones on a 7900GTX. Benching it against a single 7900GTX and saying it won is useless. Duh. 7900GTs in an SLI config will beat or match a single 7900GTX as well.





RE: Good lord
By Bull Dog on 5/25/2006 11:31:35 AM , Rating: 2
Hallelujah! Someone finally realizes the truth. These twits run around saying, "After driver improvements, something less than two 7900GTXs will be equal to them." Numbers don't lie: it's 500MHz on the core, folks - 150MHz less than a 7900GTX. It's 600MHz on the memory, folks - 200MHz less than a 7900GTX.


How about...
By kuyaglen on 5/24/2006 8:31:34 PM , Rating: 2
How about saying... "It's the single fastest display adapter."




multi-monitor?
By Cript on 5/24/2006 9:03:48 PM , Rating: 2
I hope you can use both DVI ports on this card with SLI on (will you even need to toggle it?), unlike current SLI, which disables all but one of your 4 video ports.




grain of salt
By thilanliyan on 5/24/2006 9:23:47 PM , Rating: 2
Almost everything gains like 40%??? I'd take those benchmarks with a grain of salt.




By andrewrocks on 5/25/2006 4:43:42 AM , Rating: 2
If this does come out in quantity @ $600~$700 USD, the prices should fall a lot on the X1900 and the single 7900 series, right?

Right now on Newegg you can pick up eVGA's SIGNATURE PACK for $750:

2x 7900GTs @ 600mhz core and 800mhz on the memory




/
By semo on 5/25/2006 9:08:30 AM , Rating: 2
I can't help but think that these pointless attempts from ATI and nVidia to sell the most expensive hardware to a handful of people are starting to slow down progress.

A 40% gain for adding another GPU+memory is atrocious, considering that in the real world 40% doesn't make much of a difference, since just one of those cards can already run today's games at high resolutions with quite a lot of detail. And let's not forget that this 40% is not such a big deal when we're talking next-gen games (not sure if DX9 cards can even run DX10 games).

They should be making mainstream cards (those that can run today's games at at least 1024x768 + some eye candy) more affordable, or concentrate on next-gen cards so that current-gen ones become cheaper (or previous-gen ones, seeing as how vid cards hold their value these days).




"Well, there may be a reason why they call them 'Mac' trucks! Windows machines will not be trucks." -- Microsoft CEO Steve Ballmer

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki