

MSI's NX9800GX2 graphics card as featured at last week's CeBIT expo in Hannover, Germany  (Source: 3DNews.ru)

Albatron's 9800GX2-1GX will be among the first of the halo twin-G92 behemoths
NVIDIA's high-end is more of the same, and that's a good thing

The long awaited high-end successor to NVIDIA's wildly popular GeForce 8800 series is almost upon us. 

Those anticipating a new architecture, unfortunately, will have to sit this generation out.  The core logic used in the GeForce 9800GX2 and GeForce 9800GTX is none other than the 65nm G92 core found on the GeForce 8800GT, though all 128 shaders are enabled -- as opposed to the 112 enabled on last year's 8800GT.

NVIDIA's GeForce 9800GX2 is positioned as the ultra high-end enthusiast graphics card.  The new adapter takes a page from the GeForce 7950GX2 since it's actually composed of two G92 cores on a single board. Factories are instructed to set the core frequency at 600 MHz and the memory frequency at 2000 MHz.

The GeForce 9800GX2 uses a dual-slot cooler.  Reference designs of this card include twin DVI outputs and a single HDMI port.

In addition to the GeForce 9800GX2, NVIDIA will lift the embargo on the nForce 790i chipset.  nForce 790i is essentially the same as nForce 780i, but with DDR3 memory support.

One week later, on March 25, NVIDIA will lift its embargo on the GeForce 9800GTX.  Partners are instructed to pull out all the stops on the G92 silicon, unlocking the core frequency to 675 MHz and the memory frequency to 2200 MHz. Reference designs of this card include twin DVI outputs and no HDMI options.

The new feature on the 9800GTX is the inclusion of two SLI interfaces on the top of the card for three-way SLI support. 

Also on March 25, NVIDIA will announce its nForce 780a and 750a chipsets.
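
For quick reference, the reference clocks reported above collate as follows. This is a minimal sketch using only the pre-launch figures cited in this article; shipping partner cards may use different clocks.

```python
# Reference clocks as reported in this article (pre-launch figures;
# partner cards often ship with different clocks).
reference_clocks = {
    "GeForce 9800GX2": {"gpus": 2, "core_mhz": 600, "mem_mhz": 2000},
    "GeForce 9800GTX": {"gpus": 1, "core_mhz": 675, "mem_mhz": 2200},
}

for card, s in reference_clocks.items():
    print(f"{card}: {s['gpus']} x G92 @ {s['core_mhz']} MHz core, "
          f"{s['mem_mhz']} MHz effective memory")
```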



Comments

Is it me or ...?
By keitaro on 3/11/2008 4:26:02 AM , Rating: 5
Am I the only one who thinks this upcoming batch of launches seems uninteresting?




RE: Is it me or ...?
By waltzendless on 3/11/2008 4:34:45 AM , Rating: 5
No, I second that opinion. I have to admit that the GX2 looks beastly though, like something I'd use to fight the Governator with.


RE: Is it me or ...?
By Mitch101 on 3/11/2008 10:05:30 AM , Rating: 5
My X-Ray vision is a little poor to see through my case once it's installed.

I also find bright glowy lights to be annoying in a PC. Like someone driving behind you with their high beams on.


RE: Is it me or ...?
By Noya on 3/12/2008 7:00:45 AM , Rating: 2
Or rednecks in SUVs and pickups.


RE: Is it me or ...?
By daftrok on 3/11/2008 10:07:44 PM , Rating: 3
God, I wish he'd take a break from being "Governor" to be in T4... then again, Summer Glau is fine too, both figuratively and literally...


RE: Is it me or ...?
By johnsonx on 3/14/2008 5:10:13 PM , Rating: 4
Why? Because T3 was just so damned fantastic you've got to have more? That's like saying you hope someone has time to crap in your corn flakes this morning, because yesterday's pissed-on corn flakes were just so tasty!


RE: Is it me or ...?
By Polynikes on 3/11/2008 7:16:44 AM , Rating: 5
Quite uninteresting. I planned to wait and see if the 9800GTX looked worthwhile, but now I guess I may as well get an OC'd 8800GTS 512. They'll be cheaper.


RE: Is it me or ...?
By coldpower27 on 3/11/2008 10:57:33 AM , Rating: 1
The only thing you would want a 9800 GTX for is if you wanted Triple SLI on a G92 core; if you're looking to stick with regular SLI, the 9600 GT, 8800 GT, and 8800 GTS 512 look just fine.


RE: Is it me or ...?
By AmberClad on 3/11/2008 7:24:04 AM , Rating: 5
Yeah, I'll most likely be sitting this generation of video cards out. I guess I've been spoiled by the advances in CPUs into expecting massive increases in performance, lower heat output, and more cores for a relatively small increase in price in the GPU segment -- none of which I'm seeing from either AMD or NVidia's offerings.


RE: Is it me or ...?
By ImSpartacus on 3/11/2008 7:58:48 AM , Rating: 5
I agree. I smell 9900's coming out purdy soon if these can't pull their own weight.

I honestly think that the 8800GT and 8800GTS are basically 9800s. They use a G9* core for god's sake. nVidia should've just sucked it up and called the 8800GT the 9800GS, and then the 8800GTS the 9800GT. I can't talk about the 9800GT; maybe it'll be a worthy successor to the 8800GT/GTS. In that case the naming scheme is acceptable.


RE: Is it me or ...?
By RamarC on 3/11/2008 10:00:14 AM , Rating: 5
what did you expect? nvidia has no reason to debut a new gpu lineup since it can stay one speed bump ahead of ati/amd with a g92 refresh.

i expect the next-gen nvidia architecture to be truly revolutionary and double the g92's performance. they've certainly had enough time to do just that.


RE: Is it me or ...?
By Micronite on 3/11/2008 10:32:13 AM , Rating: 3
As long as NVidia can maintain a decent lead over competitors, there is no real incentive for them to market something to compete with themselves.

I think they're being smart. While they have this lead, they're taking their resources and doing things like gobbling up Ageia. Their present lead is enabling them to position themselves to be competitive in the future against threats like Intel. In addition, I wouldn't be surprised if there is stock in the NVidia-AMD rumor.


RE: Is it me or ...?
By StevoLincolnite on 3/11/2008 4:31:03 PM , Rating: 2
Well, in June we should have the Radeon 4870 (RV770) in our hands, with a supposed 50% performance boost over the previous generation. Let's hope ATI can throw some punches and get some more market competition going once again.

I miss the days of 3dfx, Matrox, S3, Rendition, nVidia and ATI all battling it out at the same time making cards cheaper and more powerful.


RE: Is it me or ...?
By Min Jia on 3/11/2008 10:45:14 PM , Rating: 2
And Nvidia GT200 too.


RE: Is it me or ...?
By StevoLincolnite on 3/12/2008 6:04:40 AM , Rating: 2
The GT200 chip is the G100 chip. They just keep changing its codename; the newest codename is D9E.

http://www.techfuzz.com/roadmaps/2008.aspx#nVidia_...


RE: Is it me or ...?
By Min Jia on 3/12/2008 8:38:08 AM , Rating: 3
I think GT200 is the true next gen card, not D9E, which is the G92-based 9800GTX:
http://www.nordichardware.com/news,7472.html


RE: Is it me or ...?
By KristopherKubicki (blog) on 3/12/2008 1:30:07 PM , Rating: 2
Yep - D9E is basically what you're looking at in this article.


RE: Is it me or ...?
By Fnoob on 3/11/2008 5:03:32 PM , Rating: 2
Would it really be financially responsible to acquire Ageia? I can't imagine how many more (higher priced) cards they would have to sell to see an ROI. Can someone point me to a link that shows a physics card actually doing something (besides taking up a PCI slot)?


RE: Is it me or ...?
By Min Jia on 3/11/2008 10:51:43 PM , Rating: 2
Nvidia isn't going to sell any physics cards. They're going to make a PhysX based software engine for their GeForce video cards.


RE: Is it me or ...?
By ImSpartacus on 3/11/2008 3:48:26 PM , Rating: 3
Idk about next gen (lol, 5 digits?), but I certainly think nVidia has some pressure. What is the 3870X2?

AMD isn't making ultra high end cards anymore. They are making scalable, low power cards to CF. I'm pretty sure the 3870X2 cleans house on a single 8800GT. Maybe not two GT's, but I see the 3870X2 as one card, not CF.

AMD might not be the outright winner, but I think after the 3870X2, they are competition. I hope nVidia can come up with something pretty sweet for the 9900 (b/c you know they will exist).


RE: Is it me or ...?
By HHChadW on 3/11/2008 11:46:16 PM , Rating: 4
I think NVIDIA certainly has motivating factors to keep producing high-performing parts, and quite simply that is the future. Both Intel and AMD are working on bringing the entire graphics pipeline onto the CPU die, probably as a specialized core at first. If NVIDIA can't give gamers a reason to buy add-in boards, then NVIDIA could be in serious trouble.


RE: Is it me or ...?
By 7Enigma on 3/13/2008 7:35:56 AM , Rating: 2
Working on new cores and RELEASING new cores are completely different things. Nvidia has held the crown for long enough now that they, like any other business, see no reason to release vastly better technology if they don't have to. It's like keeping an ace up your sleeve: they still make just as much money (since us enthusiasts who upgrade cards frequently are not a large part of the pie), and can be comfortable knowing they already have the next part ready... once their competition actually starts to compete.

And I have a heavy stake in this since I'm just about ready to build a new system. I'm on my 3-year-old system which I built back in Jan '05, and so really REALLY was hoping this refresh would be a major overhaul. Unfortunately for me, AMD/ATI dropped the ball, and so I'm unsure of what I should do (go with an 8800GT, wait a year, get a new mid-range card, or get the new 9800GTX).

I think I'll take a wait and see with how much faster these guys are than the current generation and see if I can justify the guaranteed higher cost.


RE: Is it me or ...?
By imperator3733 on 3/11/2008 10:51:35 AM , Rating: 5
They should really be the 8900 GX2 (9800 GX2), 8900 GTX (9800 GTX), 8900 GT (8800 GTS), and 8900 GS (8800 GT). These are all just half-generation chips.


RE: Is it me or ...?
By coldpower27 on 3/11/08, Rating: -1
RE: Is it me or ...?
By stirfry213 on 3/11/2008 11:15:45 AM , Rating: 2
Maybe, however I think my GF4 Ti4800 lasted me longer than any other video card I've ever bought, in terms of performance that is.


RE: Is it me or ...?
By crimson117 on 3/11/2008 10:09:31 PM , Rating: 2
My 6800GT 256mb has lasted me 3.5 years now...

From my outpost.com order on 9/17/2004:
BFG FX6800 GT 256MB DDR3 FX 6800GT@370MHz/256MB $349.99

:)

Still going, now in its 3rd motherboard :)

Sadly it's AGP so it won't make it into my next build, but I think I got my money's worth. About $100/year spread over 3.5 years.


RE: Is it me or ...?
By Belard on 3/11/2008 11:44:06 AM , Rating: 4
Actually the ATI 3x00 cards are DX10.1 and a lot cheaper than the 2x00 cards they replaced... not super, but not as bad as these.

These new 9x00 cards SHOULD be 8000 series cards... simple.

The 9600GT is not faster than any of the 8800GT cards... so why not call it "8700GT" and these NEW cards SHOULD be 8900xxx

Nvidia blew it with the 8600s. And they just love "88" for some reason. What do we have?

8800gs
8800gt (but comes in 256 or 512)
8800gts-320
8800gts
8800gts/512
8800gtx
8800Ultra

That is 7 8800 types!!

The GF4 Ti series was valid for a new number. They were quite a bit different in design and had lasted me quite a good while. But the silly thing was making the 4800, which was 0~1% faster than the 4600s; oh yeah, it was AGP 8x -- I think that was the big difference. Or how about the GF2 MX, which was replaced by the GF2 MX 200 & 400? (The 400 being 99.99% the same as the plain GF2 MX.)

Nvidia cards I've owned since my Voodoo3 (sniff): TNT2 Ultra, GF2-MX ($150), GF2-Pro ($175), GF3-Ti200 ($180), GF-Ti4200 ($180) > ATI 9800 > GF-7600GT ($180) heatpipe.

Here's a bit outdated site with a chart of several Nvidia GPUs: http://www.geocities.com/nfaq/nufaq06.htm

Like they've said on AnandTech -- Nvidia is totally off on naming their GPUs. G92 for an "8" series card? BTW: I haven't seen ANY reviews of the 8800GS or ATI 3400/3600 series of cards.


RE: Is it me or ...?
By coldpower27 on 3/11/08, Rating: 0
RE: Is it me or ...?
By murphyslabrat on 3/11/2008 6:05:11 PM , Rating: 2
quote:
My point is if people are whining about the naming convention on Nvidia's cards they should be doing pretty much the same with ATi's.

But AMD/ATI had a good reason for the upgrade: they changed the naming scheme to a much more quantifiable system: using the two right-most places to denote individual placement in the series, while the left-most positions do the same thing as they have always done. This leaves a four digit number that is free from the suffix hell that video-cards have endured for the past 15 years.

Basically, by muddying the water now, AMD made the future water clearer. With nVidia, they are just muddying the water.


RE: Is it me or ...?
By The Sword 88 on 3/11/2008 3:38:38 PM , Rating: 2
I'm not going to lie I love the number 88 and I guess Nvidia does as well


RE: Is it me or ...?
By Obsoleet on 3/12/2008 2:26:51 PM , Rating: 2
Why, because it's popular in neo-Nazi circles, used as a symbol of HH/Heil Hitler?

I don't know any other use of "88" unless you just "like it".


RE: Is it me or ...?
By Targon on 3/11/2008 11:46:50 AM , Rating: 4
Moving from a 65nm process to a 55nm process is the primary difference between the Radeon 2000 and 3000 series cards. I wouldn't call that cheating, but it wasn't a full generation/design jump either.

AMD is moving toward a model where the video cards will be based around how many GPUs will be on a video card, or how many cores they put in a given GPU. This is an advantage that things like HyperTransport can really deliver (which NVIDIA has the rights to use as well). Multi-chip technology is something AMD has a bit of an edge in compared to NVIDIA at this point, so the question is how far AMD can scale the technology, as well as getting four video cards each with two GPUs to work. It may seem like overkill, but a system with 8 Radeon GPUs in it WOULD give NVIDIA a run for their money.


RE: Is it me or ...?
By ComfyNumb on 3/11/2008 12:55:37 PM , Rating: 3
Correction... I think it was 80nm (2900) and 65nm (2600 & 2400) to 55nm. And the first cards replaced in the 2000 series were the top and mid-top cards... so it was basically 80nm to 55nm.


RE: Is it me or ...?
By coldpower27 on 3/11/2008 3:28:44 PM , Rating: 2
No, you misunderstand what I mean. ATI made an HD 3000 series that was based off the same technology as the HD 2000 series, with the primary improvement being the 80nm to 55nm shrink; the core logic for the DX10.1 features was likely already there, as it is largely an optical shrink.

Compare as follows: HD 2900 XT to HD 3870, 8800 GTX to 9800 GTX -- both are just cost-efficient forms of the same or slightly greater performance.

Same thing with Nvidia, except in their case they are using it on some of their old marketing line and some of their new marketing line. The 90nm 8xxx series to the 65nm 9xxx series is also mainly a die shrink with a few minor improvements and cost cutbacks.

Such a system would be impressive, but I don't think it's realistically feasible, as GPUs consume too much energy as it is -- 8 GPUs would consume 500-600W on their own, since GPUs, unlike CPUs, are mostly core logic rather than cache.


RE: Is it me or ...?
By ImSpartacus on 3/11/2008 3:35:13 PM , Rating: 2
Yes, I agree. A nice refresh of the 8000's. You know when a core (g92) is spread between two generations, it's just a spacer.

The G92 is nice, but it isn't a new generation. I heard the 9800GT was going to be $399, so the 8800GT still has its spot (right between the $399 9800GT and the $180 9600GT). I honestly wouldn't get more than an 8800GT. It really seems to be a wonder card.


RE: Is it me or ...?
By Techiedude37 on 3/11/2008 8:23:21 AM , Rating: 4
I have zero reasons to upgrade from my current 8800GT card. It'll last me at least another year or two before most games really start choking it off.


RE: Is it me or ...?
By TheDoc9 on 3/11/2008 11:07:59 AM , Rating: 3
Yeah, I was shocked and underwhelmed too; what a disappointment. I waited 6 months for these cards and it looks like I could've saved myself the trouble. Maybe the GX2 will be an incredible performer for a semi-cheap price.


RE: Is it me or ...?
By Golgatha on 3/15/2008 6:24:43 PM , Rating: 2
I will agree with you. The only thing that interests me as a 975X motherboard chipset owner is 8800GTS two-way SLI like performance, which is currently completely unavailable to me due to artificial software limitations.


9800GTX = 8800GTS 512 OC?
By Le Québécois on 3/11/2008 4:08:21 AM , Rating: 2
quote:
GeForce 9800GTX is none other than the 65nm G92 core found on the GeForce 8800GT, though all 128 shaders are enabled -- as opposed to the 112 enabled on last year's 8800GT.


I thought the 8800 GTS 512 already had 128 shader units...

quote:
One week later, on March 25, NVIDIA will lift its embargo on the GeForce 9800GTX. Partners are instructed to pull out all the stops on the G92 silicon, unlocking the core frequency to 675 MHz and the memory frequency to 2200 MHz.


With the GTS 512 at 650 MHz and 1.94 GHz, the 9800GTX seems like nothing more than an overclocked 8800 GTS 512.

With cards like the ASUS GeForce 8800GTS 512 TOP running at 740 MHz and 2.07 GHz, the 9800GTX already seems like old news.

Am I missing something?
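
For what it's worth, the deltas implied by the clocks quoted in this post work out as below. This is a quick sketch using only the reference figures cited here (clock speed, not measured performance); retail cards vary.

```python
# Clock deltas between the 8800 GTS 512 and 9800 GTX reference specs
# quoted above.
gts_512 = {"core_mhz": 650, "mem_mhz": 1940}
gtx_9800 = {"core_mhz": 675, "mem_mhz": 2200}

for key in gts_512:
    gain = (gtx_9800[key] / gts_512[key] - 1) * 100
    print(f"{key}: {gts_512[key]} -> {gtx_9800[key]} (+{gain:.1f}%)")
# core_mhz: +3.8%, mem_mhz: +13.4% -- consistent with "overclocked GTS 512"
```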




RE: 9800GTX = 8800GTS 512 OC?
By Warren21 on 3/11/2008 4:19:45 AM , Rating: 3
No, you're not -- see my post.

9800 GTX = 8800 GTS 512 1.1.


RE: 9800GTX = 8800GTS 512 OC?
By Lifted on 3/11/2008 4:44:52 AM , Rating: 2
quote:
The 8800 GTS 512 core comes clocked at an impressive 650 MHz and the GDDR3 memory works at 1940 MHz allowing for a theoretical memory bandwidth of 62.7GB/s, which is surprisingly low. It is not the amount of onboard memory that lets the 8800 GTS 512 down but rather the 256-bit wide bus that it uses. The GeForce 8800 Ultra features a peak memory bandwidth capacity of 103.7GB/s while the GTX has a throughput of 86.4GB/s as both products utilize a 384-bit wide memory bus.


Perhaps they will increase the bus to 384 or higher?
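
The figures in the quote follow from simple arithmetic: bus width in bytes times effective data rate. A minimal sketch (note the quoted 62.7 GB/s implies an effective rate slightly above the 1940 MHz also quoted):

```python
# Theoretical bandwidth (GB/s) = (bus width in bits / 8) x effective data rate.
def bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

print(bandwidth_gbs(256, 1940))  # 8800 GTS 512: ~62.1 GB/s
print(bandwidth_gbs(384, 1800))  # 8800 GTX: 86.4 GB/s
print(bandwidth_gbs(384, 2160))  # 8800 Ultra: ~103.7 GB/s
```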


By Le Québécois on 3/11/2008 4:56:22 AM , Rating: 2
quote:
Perhaps they will increase the bus to 384 or higher?


Wouldn't they need a different core for that?

Unlocking shader units is one thing, but I don't think it's possible to "unlock" a wider bus while using the same "old" G92 core.


RE: 9800GTX = 8800GTS 512 OC?
By Warren21 on 3/11/2008 5:09:41 AM , Rating: 2
I'd say it's a stretch (and quite a long one) at best to interpret this particular tidbit of journalistic opinion from an 8800 GTS 512 launch article as insight into a 'next-gen' architecture.

It's still the same 256-bit G92 core, except with faster memory to make up for the lack of bus width. Same as AMD did from the 512-bit R600 + 825 MHz GDDR3 to 256-bit RV670 + 1125 MHz GDDR4.


RE: 9800GTX = 8800GTS 512 OC?
By MonkeyPaw on 3/11/2008 7:54:35 AM , Rating: 2
From the leaked GPUz screenshots that I've seen on the interweb, the 9800GTX has a 256bit memory interface.

http://en.expreview.com/2008/03/07/memory-size-768...

Sometimes that link doesn't work, so here's a run-down:

quote:
The core of the card is G92-420, reference clock is 675MHz. SP running at 1688MHz. The card uses 512MB GDDR3 memory, memory interface is 256bit. We have already run the card on our X38 and QX9650 @ 4GHz. The [3Dmark06] score is not too high not too low: 14725.


Seems likely that the GX2 is going to have the same memory configuration, but with 2 GPUs and the different clocks listed above.


RE: 9800GTX = 8800GTS 512 OC?
By xporvista on 3/11/2008 10:22:55 AM , Rating: 2
Based on the benchmarks posted, I'm not too impressed so far. The benchmark scored 14xxx and my system scored 12xxx (Core 2 E6850 @ 3.3, 2GB RAM, 8800 GTS G92 stock, on Vista 32). I think I've seen the ATI hit 18xxx stock, with a 9650.


RE: 9800GTX = 8800GTS 512 OC?
By bfellow on 3/11/2008 4:27:41 PM , Rating: 2
The 2900XT scores the best overall in 3DMark and you're taking that benchmark to mean the 2900XT is better than an 8800GTS/X?


RE: 9800GTX = 8800GTS 512 OC?
By sgtdisturbed47 on 3/11/08, Rating: -1
RE: 9800GTX = 8800GTS 512 OC?
By FITCamaro on 3/11/2008 8:54:47 AM , Rating: 2
It will be the same as having two 8800GTSs in SLI, but likely for a lot more money. Not exactly groundbreaking. Only difference is people who only have one PCIe 2.0 slot will be able to get SLI-like performance.

Get used to dual-GPU cards, people. I think they're starting to push the limit of what they can do in terms of heat and power with just one GPU. So they'll just use two slower GPUs to divide the work between them.


RE: 9800GTX = 8800GTS 512 OC?
By Azured on 3/11/2008 9:02:10 AM , Rating: 2
No they aren't. The 8800GTS has 128 SPs just like the 9800GTX. The GX2 is another matter, though, and it remains to be seen whether it will succeed or flop against the much cheaper HD 3870 X2.


RE: 9800GTX = 8800GTS 512 OC?
By Goty on 3/11/2008 10:05:31 AM , Rating: 5
It's certainly going to be faster than the 3870X2, but whether or not it's going to be ~$200 faster remains to be seen.


RE: 9800GTX = 8800GTS 512 OC?
By HotBBQ on 3/11/2008 11:14:56 AM , Rating: 1
I recently purchased an eVGA 8800GTS 512 OC (I haven't even installed it in my new rig yet!). If the 9800GTX is at all better than what I've got (cooler, faster), I will be using eVGA's 90 day upgrade program. Pay the small difference in cost, get a brand new card.


RE: 9800GTX = 8800GTS 512 OC?
By Anosh on 3/11/2008 11:47:53 AM , Rating: 4
What is this? Commercial for eVGA?


RE: 9800GTX = 8800GTS 512 OC?
By HotBBQ on 3/11/2008 12:10:57 PM , Rating: 2
No, it's not.


RE: 9800GTX = 8800GTS 512 OC?
By nuarbnellaffej on 3/11/2008 1:59:35 PM , Rating: 3
No, I don't think it is only an overclocked 8800GTS; keep in mind that the 9600GT has half the stream processors of the 8800GT, yet manages to do nearly as well.

I assume the 9800 series will have more efficient SPs than the 8800s, maybe more cycles per clock or something; I don't understand hardware that well.

Keep in mind that architecture doesn't change radically every generation.


By Le Québécois on 3/11/2008 5:52:54 PM , Rating: 2
Yes, but the 9600GT is based on a different core, the G94, while the 9800s are based on the same "old" G92 found in the 8800GT and 8800GTS 512.


RE: 9800GTX = 8800GTS 512 OC?
By coldpower27 on 3/11/2008 6:57:24 PM , Rating: 2
The only reason to get the 9800 GTX is Triple SLI support otherwise it's just a higher clocked 8800 GTS 512.


By Le Québécois on 3/11/2008 7:09:50 PM , Rating: 3
Well, if they'd call it 9800PRO I would sell my two 8800 GTs and buy two 9800PROs just for old times' sake... nostalgia :P


Disappointing
By MMilitia on 3/11/2008 6:27:49 AM , Rating: 5
I've long been planning to upgrade my 8800GTS (640) to something with a bit more power for Unreal 3 engine titles and such. If these new cards are genuinely just going to be slightly overclocked G92s, what is the incentive for enthusiasts to drop £300 on the high-end 9 series when they can spend £150 and get a practically identical mid-range 8 series?

Why would Nvidia bother upping the revision to 9xxx when they just as easily could've called these new cards the 8800GTS and 8800GX2?

I must be missing something here because right now these cards appear to be completely redundant for just about every prospective consumer group, with the possible exception of the ultra high-end quad SLI crowd.

It's worth noting there haven't been any decent benchmarks yet though, which may change my opinion.




RE: Disappointing
By paydirt on 3/11/2008 9:06:56 AM , Rating: 5
That's the same as asking why Intel sells Core2Extremes for $1000. Some people will pay up to have the very high end of performance, even if it is just 10-20% faster (and already above 30 fps for current games).


RE: Disappointing
By tallcool1 on 3/11/2008 12:50:29 PM , Rating: 5
300% markup for a 10-20% performance gain?
Yeah, that's the ticket...

Let's use a hypothetical example.
I could buy a car for $25K that has 200HP
or...
I could buy the same car for $75K that has a 220HP engine...

That's when you say: Thank you, may I please have another.


RE: Disappointing
By Alias1431 on 3/12/2008 5:04:53 AM , Rating: 3
Extremes aren't any faster than their brothers. They just have an unlocked multiplier.


Meh
By gramboh on 3/11/2008 3:58:44 AM , Rating: 2
Some nice improvements, I'm sure, but the 8800GTS/GTX has been out since November 2006, almost 16 months. I'm actually glad things have slowed down, so my 8800GTS 640MB hasn't devalued to nothing like a card would have in that timeframe 2 years ago.




RE: Meh
By Proteusza on 3/11/2008 6:21:48 AM , Rating: 2
Yeah, I agree.

It's not enough for me to bother replacing my 8800 GTS 640.

I suppose we will have to wait for R700 and D9P. Or is it D10P?


RE: Meh
By MMilitia on 3/11/2008 7:10:08 AM , Rating: 2
I guess the up-side to the video card market slowing down so much is that now I can concentrate on getting the rest of my system up to date ready for the real 'next-gen'.


RE: Meh
By Proteusza on 3/11/2008 7:20:25 AM , Rating: 2
Yup, time to start selling the house to buy that 8 Core Nehalem and 10800GTX Quad SLI.


RE: Meh
By paydirt on 3/11/2008 9:02:21 AM , Rating: 5
No, you must achieve balance... Number of CPU cores = Number of GPU cores. This is the path to inner peace. Six-core SLI + Six-core CPU. Da Six Shooter.


RE: Meh
By Jedi2155 on 3/11/2008 12:57:56 PM , Rating: 2
I disagree, as games now can't advance as fast as they used to, and forward-thinking games like Crysis are being killed due to the expected hardware not really arriving.

Crysis should have performed great when it was released had graphics hardware kept pace with the typical 12-month release cycle, but we didn't have that over the past year and a half, which resulted in Crysis straining everyone's systems due to the lack of faster hardware and SLI/Crossfire being lackluster implementations. I personally am very disappointed in the current lack of performance increase, as I've been wanting to play Crysis the way it was envisioned rather than the way I am forced to due to the lack of more powerful hardware.


RE: Meh
By coldpower27 on 3/12/2008 12:03:31 AM , Rating: 2
The 12-month cycle wasn't really sustainable, as Nvidia and ATI have been making performance gains greater than what process technology allows at the same cost, so both companies have been achieving greater performance by making the products more costly at the expense of their bottom line. We need to give process tech some time to catch up before Nvidia and ATI can do all that much to make better products.


8800GTX Amazing Value
By MoonRocket on 3/11/2008 10:27:38 AM , Rating: 5
The lackluster performance increases these cards offer only prove one thing.

The 8800GTX will go down as one of the best gaming investments ever.




RE: 8800GTX Amazing Value
By Warren21 on 3/11/2008 10:31:31 AM , Rating: 3
If I'd known then what I know now, I would've bought an 8800 GTX at/near launch, bought one more every six months or so after, and had triple-SLI GTXes running Crysis at V. High 19 x 12 at 30+ FPS -- it would no doubt be an amazing value.


RE: 8800GTX Amazing Value
By FITCamaro on 3/11/2008 10:47:03 AM , Rating: 2
Yes, I see how it's an amazing value when a far cheaper 512MB 8800GTS pretty much equals its performance.


RE: 8800GTX Amazing Value
By imperator3733 on 3/11/2008 11:00:28 AM , Rating: 2
I think his point is that the 8800 GTX has been available for over a year longer than the 8800 GTS 512. There would be no reason for anyone to buy an 8800 GTX today, but overall performance has not improved much in the last 16 months, so the 8800 GTX is still quite competitive.


RE: 8800GTX Amazing Value
By FITCamaro on 3/11/2008 11:11:33 AM , Rating: 3
Even over the original 640MB 8800GTS, its performance didn't justify the price premium. I built a system for a friend and he wanted to get two 8800GTXs. I told him it was better to get two 640MB 8800GTSs and spend the saved money elsewhere since the performance difference wasn't that great.


HDMI missing from top-end?
By Fnoob on 3/11/2008 10:42:16 AM , Rating: 2
quote:
One week later, on March 25, NVIDIA will lift its embargo on the GeForce 9800GTX. Partners are instructed to pull out all the stops on the G92 silicon, unlocking the core frequency to 675 MHz and the memory frequency to 2200 MHz. Reference designs of this card include twin DVI outputs and no HDMI options.

WTF? Is it not possible to run 2500x1600 thru HDMI? Good quality HDMI cables are now very affordable - unlike Dual-Link DVI. Any ideas why this was excluded?




RE: HDMI missing from top-end?
By kattanna on 3/11/2008 11:05:25 AM , Rating: 2
quote:
Any ideas why this was excluded?


yes, so you would spend more on the 9800GX2 card to get that option


RE: HDMI missing from top-end?
By Fnoob on 3/11/2008 4:28:44 PM , Rating: 2
Prolly. But here's wondering how the GX2 will compare to the GTX. $100 per 10% framerate increase?


RE: HDMI missing from top-end?
By imperator3733 on 3/11/2008 11:10:57 AM , Rating: 2
Doesn't AMD have a DVI-to-HDMI adapter that it uses? Would Nvidia do something similar?


RE: HDMI missing from top-end?
By Fnoob on 3/11/2008 4:36:03 PM , Rating: 2
Dunno if AMD makes one, but they are everywhere and cheap. I'm running an 8800GT with a DVI-HDMI adapter to a Sony 1080p LCD. This is a sweet, unsupported method of getting PC input up to 1920x1080 (60Hz), as most LCDs only support much lower res via VGA or DVI for PC connectivity. My old 1900GT would manage the same trick, but only up to 1080i.


RE: HDMI missing from top-end?
By Ajax9000 on 3/11/2008 9:16:48 PM , Rating: 2
quote:
Is it not possible to run 2500x1600 thru HDMI?

Yes, but ...

To do so requires either HDMI 1.2 Type B connectors & cables (items that make hens' teeth seem common) OR HDMI 1.3 (which hasn't yet been implemented by ATI or Nvidia, AFAIK).

So, in effect, no.

(Actually, 2560x1600p24 is possible.)
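
For the curious, the limit comes down to pixel clock: single-link TMDS (HDMI before 1.3) tops out around 165 MHz. A back-of-envelope check, using approximate CVT reduced-blanking totals (an assumption, not a spec quote):

```python
# Required pixel clock = total pixels per frame (including blanking) x refresh.
def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    return h_total * v_total * refresh_hz / 1e6

# 2560x1600 with ~CVT-RB blanking (h_total 2720, v_total 1646):
print(pixel_clock_mhz(2720, 1646, 60))  # ~268.6 MHz -- far above 165 MHz
print(pixel_clock_mhz(2720, 1646, 24))  # ~107.4 MHz -- fits, hence 2560x1600p24
```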


People seem to forget...
By Goty on 3/11/2008 10:11:39 AM , Rating: 2
I'm seeing quite a few posts here that seem to point to the fact that people are disappointed that the 9800 cards just seem like overclocked 8800GTs and GTS 512s, but what everyone seems to forget is that NVIDIA has tweaked the architecture quite a bit between the "old" G92s and these "new" ones. The 9600GT performs around the level of an 8800GT with something like 30 fewer shaders, so these 9800 GPUs should be a good deal faster than their 8800-series progenitors.




RE: People seem to forget...
By Warren21 on 3/11/2008 10:21:15 AM , Rating: 2
Not really; the G94 is based off the same design as the G92 that debuted in the 8800 GT (and peaked spec-wise in the GTS 512). The only reason it might perform "around the level" of an 8800 GT is because it is normally clocked higher. Being a separate design, it was made with fewer transistors (750M vs 505M; G92 & G94 respectively) and generates less heat, while using the same (reference) cooler.


RE: People seem to forget...
By MMilitia on 3/11/2008 10:23:33 AM , Rating: 2
From what I understand, the high end is still using G92 chips, while the 9600 was a G94 part. So any architectural improvements will not apply to the GTX/GX2.

Though I could be wrong about this. I think I got this particular piece of info from another post in this thread.


By KristopherKubicki (blog) on 3/11/2008 12:07:42 PM , Rating: 4
quote:
NVIDIA has tweaked the architecture quite a bit between the "old" G92s and these "new" ones

I hate to disappoint you, but unfortunately that's not the case.


Reviews?
By gigahertz20 on 3/11/2008 4:11:19 AM , Rating: 2
So when will we see reviews of these cards?




RE: Reviews?
By dagamer34 on 3/11/2008 4:42:36 AM , Rating: 2
When the embargo lifts next Tuesday.


RE: Reviews?
By TheDoc9 on 3/11/2008 3:05:01 PM , Rating: 2
No leaked results like last time, and there's an embargo on when the reviews can be released! Reminds me of the AMD K10 release a few months ago. I think we all know these cards will be nothing but an unimpressive/overpriced slight speed jump, if that.

And to the poster that was saying they should wait it out because they're on top and making money: I'm sure Intel thought the same thing with the P4, and 3dfx probably did too with the Voodoo series. I know it's taught that way in business school, but when companies (or anyone) get complacent they lose; nature doesn't like a vacuum of no change and no innovation. At least some CEOs will get rich while they bury the company.


RE: Reviews?
By 7Enigma on 3/13/2008 7:45:16 AM , Rating: 2
As I posted earlier, just because they are not RELEASING better cards does not mean the engineers are not working on them. They didn't just say, "Hey guys, great work on the 8800GTX. How about you take a year vacation and come back fresh!"

I'm sure they've been working just as hard as they did to create the 8800GTX; it's just that they (smartly, I may add) aren't cannibalizing their own products and showing their hand earlier than needed.

Sucks for us, but makes complete sense as a business.


Cards are nice...
By Aikouka on 3/11/2008 9:08:58 AM , Rating: 4
I enjoy gaming bliss on my PC, but it seems that's quite hard to achieve these days. I bought an 8800GTX when it came out and expected some driver issues (that's fairly common early on), but even today, I still have issues while gaming. I'm not talking about some weird graphical anomalies, but entire computer crashes with the nVidia display driver listed as the culprit.

So what I'm trying to say is... these new cards are nice, but will we get stable drivers? The 8800GTX has been out for over a year, and it's sad that I have to try multiple driver versions to attempt to find the one with the least amount of issues that affect me.




RE: Cards are nice...
By imperator3733 on 3/11/2008 11:13:40 AM , Rating: 2
My display driver crashes quite frequently when I play World in Conflict. The screen freezes, then it goes black and reloads everything. I generally lose 10-15 seconds each time that it happens. (This is on my laptop with an 8600M GT)


RE: Cards are nice...
By Jedi2155 on 3/11/2008 1:01:57 PM , Rating: 2
I generally don't have issues unless I use the craziest beta drivers. Even then I don't usually experience issues. And I play a wide variety of games (my Xfire history proves it: http://xfire.com/profile/jedi2158/)

Although I will admit that, compared to ATI, nvidia driver stability leaves a lot to be desired, it's usually not as bad as people make it out to be. Perhaps you just need to clean out your old drivers (with a driver cleaner -- try www.guru3d.com) before you install the new ones, which is what I usually do and is probably why I don't experience as many problems.


45nm Support?
By Simon128D on 3/11/2008 10:48:24 AM , Rating: 3
So does this mean that the 780i boards don't support 45nm quads? I'm pretty sure they do, as I've seen benchmarks with the QX9 range of quad CPUs, but I just have to ask to be sure.




RE: 45nm Support?
By KristopherKubicki (blog) on 3/11/2008 12:10:28 PM , Rating: 2
You are correct, I've updated the article.


RE: 45nm Support?
By Simon128D on 3/11/2008 12:32:07 PM , Rating: 2
Thank you, I was starting to worry ;)


New Cards for DirectX 10
By makots on 3/11/2008 12:08:19 PM , Rating: 2
What I do not understand is why more games are not being produced utilizing DirectX 10. The consoles (360 and PS3) use DX9, so why don't developers create new games using DX10 so the PC can demonstrate its graphical power? These new cards from Nvidia are merely a stepping stone.




RE: New Cards for DirectX 10
By kattanna on 3/11/2008 12:33:18 PM , Rating: 2
Simple: most people are not using Windows Vista, but are still using XP instead.


RE: New Cards for DirectX 10
By AlphaVirus on 3/11/2008 2:35:46 PM , Rating: 3
To add salt to the wound, not many people are enthusiasts or extremists like most DailyTech visitors. To add more issues, not everyone is educated on the differences between DX9, DX10, and DX10.1, and that confusion will stop an excited buyer from making their purchase on that top-dollar GPU.

I think AMD and Nvidia should really think about some sort of joint venture to educate the market on what their purpose is. I know this would be a long stretch as they are fighting to the death every week, but if they don't get their act together they will lose prospective buyers in the long run.


"Pull out all the stops" ?
By Warren21 on 3/11/2008 4:18:25 AM , Rating: 5
675 MHz isn't exactly earth-shattering, seeing as 8800s with the G92 core can be found factory-overclocked near this level (maybe =/+, didn't look very far), and will easily OC past it with a few clicks of RivaTuner or its ilk.

9800 GTX = 8800 GTS 512 with (barely) higher core/shader clocks, faster GDDR3 (still!), redesigned PCB with better power circuitry and a 2nd SLI connector. The cooler is also of the same design yet remade to fit the 11" length.

Whereas the G94 of the 9600 GT is actually a very large improvement over the 8600's, these G92's are already available as the cheaper, equally as fast 8800's.

This whole article reads like free publicity for nVidia more than anything else -- at least the HD 3800 series integrated new features in the RV6x0 cores.




Oh well
By ali 09 on 3/11/2008 4:08:05 AM , Rating: 2
It's not as if nVidia really needs to bring anything more to the table. AMD's 3870 X2 really is only as powerful as an 8800GTX, so they can save their R&D department quite a bit of money by putting out old products in new forms. It may not be the best for the consumer, but in the long term it leads to a stronger and more stable product. Personally I still find my 8800GTS 320 quite usable -- with my E6850 on Vista I can still play Crysis on Medium at 1680x1050, which is more than enough for me. But anyway, I still reckon these cards will be pretty good.

Off topic, would Crysis look better if I used a lower resolution and higher settings, or vice versa? Thanks




RE: Oh well
By Le Québécois on 3/11/2008 4:13:37 AM , Rating: 2
The sweet spot for me in Crysis is running everything at High with an average of 35fps or so.

That's what I get with my two 8800GTs in SLI at 1680x1050. I've tried lowering the settings to Medium to get a higher frame rate, but the game seems to lose too much of its "charm".

In the end, it's your call, but you should give High a try even if it means lowering your resolution a little.


incorrect?
By xzc145 on 3/11/2008 5:30:46 AM , Rating: 2
quote:
nForce 790i is essentially the same as nForce 780i, but with support for 45nm quad-core Intel processors


Fairly certain this is incorrect; the 780i chipset supports 45nm quad-core, while the 680i doesn't (it supports 45nm dual-core with a BIOS update).




RE: incorrect?
By RamarC on 3/11/2008 10:02:30 AM , Rating: 2
790i's basic distinction is support for DDR3.


Display Port?
By TheRequiem on 3/11/2008 10:50:20 AM , Rating: 1
Nvidia has HDMI on the new graphics cards... does that make any sense? I thought the whole point of next-generation graphics was that they would inherit DisplayPort? I mean, they already have high-quality DisplayPort monitors out there from Dell and more coming very, very soon. Yet Nvidia has an HDMI port and not a DisplayPort on its new graphics cards? This is very agitating and a huge letdown... I have been wanting to upgrade for a while now. I guess I'm going to have to skip this WHOLE generation of huge-letdown products from Nvidia.

Can someone out there figure this sh*t out and sort the mess? Jeez.




RE: Display Port?
By Targon on 3/11/2008 11:50:10 AM , Rating: 3
There may be some monitors out there with DisplayPort, but how many people have these compared to displays with HDMI? If you go to buy a 42 inch panel, what type of connectors do you see on them?

Adopting a technology too early will be a waste of money in the short term since not enough people have a display with DisplayPort.


WHOOO HOOOO
By Lightning III on 3/11/2008 11:28:14 AM , Rating: 2
Now this is exciting; I can hardly wait to see what the planned price cut to the 3870 X2 will be.

And you know that will keep them from launching those two gussied-up 8800GTs in the $700-800 price range like they did with the Ultra.

Yup, happy days are here, with CrossFire X changing things up with unheard-of flexibility.

Even Derek Wilson of the infamous video quality slider review had to use Intel's Skulltrail platform -- a board with an Nvidia bridge chip in it -- to come up with a less-than-glowing review.

How did Intel get the only motherboard that supports both SLI and Crossfire, anyway? I guess that's why it costs $617, if you can find it in stock.

And the Skulltrail tiff continues.

I could have sworn I heard Jen say only Nvidia chipsets support SLI.

With Larrabee hiding in the wings, the word "platform" takes on an ominous ring. Let's see if Intel's reputation for not playing nice is deserved.

Oh yes, I do love launch season!




RE: WHOOO HOOOO
By LostInLine on 3/11/2008 4:15:04 PM , Rating: 2
SkullTrail has an nVidia chip on it.


45nm Support?
By Simon128D on 3/11/2008 12:30:04 PM , Rating: 2
OK, so the 790i is to support 45nm quad-core chips; then what about the 780i -- does it only support 45nm dual-core chips?

I'm pretty sure that the 780i supports dual and quad, as I've seen benchmarks for this board running the QX9 range of chips.

I just wanted to ask to be sure, because I want to buy a 780i and DDR3 doesn't interest me right now, so the 790i is not what I want.




RE: 45nm Support?
By Simon128D on 3/11/2008 12:31:16 PM , Rating: 2
Whoops, ignore that; I was having internet problems and thought my post didn't upload. Sorry.


X2 or GX2?
By christianspoer on 3/11/2008 4:04:09 AM , Rating: 1
So is it called the one or the other? It says GX2 in the headline, but X2 in the article...




RE: X2 og GX2?
By Warren21 on 3/11/2008 4:22:02 AM , Rating: 2
X2 = AMD, GX2 = nVidia.


By chimeraforce7 on 3/13/2008 8:36:00 PM , Rating: 3
HEY GUYS I JUST FOUND THIS, ITS EVEN NEWER INFORMATION ON THE 9800 SERIES!

Ive just received word from the most reliable source in the world.....JESUS. He has told me the following about the 9800 JTX (Jesus X-Treme) !

-6500MM (mega metre) process technology, at MSMMMCFPPC
-Over 700 billion transvestites
-Second Generation Unified Vader Architecture
-Triple precision magnitude flux capacitors
-GPGPUPGPU-PEE-YEW! native
-Over one TeraFARTS of shader processing ability like jedi's
-MAD+AD homosexual party
-Fully secular priesthood design (touch those kiddies with extra fps!)
-native core clock speed of 12.8GHz
-1280-bit memory interface (cos god doesn't hold back)
-4TB GDDR7 graphics memory
-DirectX 42.12c support
-OpenJL 500.326.8 compliant shader support
-eDRAM die for "FREE BJ from a priest"
-built in boobs(for real men)
-Improved AA and Ahh f*** off quality levels

"Produced by god, patented by jesus"




Meh.
By marsbound2024 on 3/11/2008 8:02:17 AM , Rating: 2
Well, I suppose I am a bit disappointed. It is starting to look like the 9 series is what the 8 series was SUPPOSED to be: actually a decent midrange card (i.e. 9600GT over 8600GT) and the highest of the high end (i.e. fully-hardware-enabled cards, or the 9800 series over the 8800 series). So I am going to assume that the 9800 series just won't pack too much more of a punch than the 8800. However, I might oughta hold my tongue and wait until I see the benchmarks.




By Beavermatic on 3/11/2008 8:55:29 AM , Rating: 2
Will the 9800GX2 be available next week for purchase?




Directx 10?
By AlvinCool on 3/11/2008 9:21:41 AM , Rating: 2
Since they don't mention it, I would assume these are DirectX 10 and not 10.1?




GX2....again
By CyborgTMT on 3/11/2008 2:28:37 PM , Rating: 2
Maybe this time some decent drivers will come out so my massively overpriced and ultimately worthless 7950GX2 quad will scale.




Hang on a sec
By Sazar on 3/12/2008 1:54:06 PM , Rating: 2
On a single board? I thought this was two PCBs slapped together a la the 7950GX2, not the single-PCB design AMD brought forth with their X2.

Am I missing something?




Hmmmmmmm.....
By Setsunayaki on 3/17/2008 9:13:49 AM , Rating: 2
I give Nvidia credit for working on the 65nm process reduction, but the problem that lies in the industry is the few games out there that require anything more intense than an 8800GT but less than a 9800GX2.

A lot of us who buy video cards play high-end games, but the 9800GX2 would fit the situation where one player wants 85 FPS in a next-generation game on max settings while video capturing the game at 1680 x 1050 or higher resolutions. The card will also work for gamers who shelled out thousands to buy a 2560 x 1600 LCD monitor.

I am on a quad-core processor. I'm also overclocked at times when I need the extra processing power for video encoding and capturing; however, I don't think the average gamer out there needs such a video card.

Looking at the current tech tree, it seems like outside of 2 - 3 games, one can get a perfect gameplay experience with an 8800 video card or AMD(ATI) Equivalent.

I played Unreal Tournament III on an 8800-series video card at max framerate at maximum settings; Crysis played decently, as did Oblivion with HDR and World in Conflict. Note that Crysis is the only reason why one would need a 9800 video card if you are strictly a gamer at high resolution.

Other than that...in my case I don't find anything interesting outside of more processing power.

I kind of feel cheated you know.

I feel cheated because games keep increasing in complexity, which means I have to wait longer for a game to be released, and because of this, it takes longer to make a game as well. The end result is always one video card that is released, but only 2 - 4 games are made that can take advantage of such a high-end card, and yet even so, the highest-end card of the previous generation at lower resolution can play fine.

The industry knew that complexity was increasing and few games were to be made, so they jumped on the DirectX 10 bandwagon, meaning you need Windows Vista, which means 20 - 30% less performance. Then one needs memory, and if it goes over 4GB total on the system, you need a 64-bit OS, which is 20% less performance. Then we use Aero and the video card at the same time on the desktop, so we are processing two heavier things at once in the way of graphics...

Just so that technology is made such that one needs to upgrade, because your existing configuration makes you lose 40 - 50% of your system performance and 20 - 30% of your single-core video card processor that has to work on timeslicing...
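
Taking the poster's percentages at face value (they are claims, not benchmarks), the hits would compound multiplicatively, which is roughly where the 40 - 50% figure comes from:

```python
# Compounding the claimed performance hits (inputs are the poster's
# claims, not measured figures).
def combined_loss(*losses: float) -> float:
    remaining = 1.0
    for loss in losses:
        remaining *= 1.0 - loss
    return 1.0 - remaining

# e.g. a 25% Vista hit compounded with a 20% 64-bit hit:
print(f"{combined_loss(0.25, 0.20):.0%}")  # 40% -- in the claimed 40-50% range
```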

This is the reason why, although I like PCs, I know one major reason games with super-high-end graphics are released is to fuel people into spending more and more money on hardware.

This is why I like console gaming. The hardware exists and, although it has failures, since the industry is stuck with the hardware for 2 - 7 years, they have to actually make games within the limitations of the hardware... You get more games without developers thinking about how to bloat up graphics so that Nvidia can give those CEOs a Christmas bonus for forcing people to upgrade... Same with Intel.

In the end I can play a game without having to upgrade my PC every generation just to get a decent framerate.

Maybe the industry should make better games with better storylines and gameplay instead of bloating up games to force an upgrade.




Missing details...
By Complex Pants on 3/11/08, Rating: -1
RE: Missing details...
By Warren21 on 3/11/08, Rating: -1
RE: Missing details...
By toyota on 3/11/2008 5:04:37 AM , Rating: 2
quote:
Yes, it does come with 768 MB of memory, but on a 256-bit bus a la 8800 GTS 512


The people that have the cards are saying 512MB. Also, based on the way Nvidia does memory, there's no way it would be 768MB.


RE: Missing details...
By Warren21 on 3/11/2008 5:15:53 AM , Rating: 2
Actually, now that you've pointed that out it doesn't make sense.

I thought it had 12 memory chips (per side) like the G80 PCBs. Latest pics show 8 memory chips, so 768 doesn't work out (either 96MB or 48MB chips if 768 / 8).


RE: Missing details...
By James Holden on 3/11/2008 12:05:33 PM , Rating: 2
It must be 512MB or 1GB. This is due to the 256-bit bus, whereas the G80 had a 384-bit bus.


RE: Missing details...
By Warren21 on 3/11/2008 5:35:23 PM , Rating: 2
I know this. More specifically, however, it's actually due to the number of memory chips onboard (like I stated above). While it wouldn't make much sense, nVidia could've used an odd number like 768 or 384 MB if it so pleased them -- it just has to be feasible in terms of memory chip availability.

The G80s used 12 memory chips per side of the PCB. That's how they got the GTS 640MB & 320MB (10 x 64/32MB GDDR chips), and the GTX 768MB (12 x 64). This overcomplicated the design, however, so obviously if they could they'd have reduced it. Hence the new design is more contemporary and sticks to 8 chips, meaning only capacities such as those you've listed (2^X).
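
The arithmetic here is straightforward: each GDDR3 chip contributes a 32-bit slice of the bus, so the chip count fixes both capacity and bus width. A minimal sketch of the configurations mentioned in this thread:

```python
# capacity = chips x MB per chip; bus width = chips x 32 bits per chip.
configs = {
    "8800 GTS 320MB": (10, 32),  # (chips, MB per chip)
    "8800 GTS 640MB": (10, 64),
    "8800 GTX 768MB": (12, 64),
    "9800 GTX 512MB": (8, 64),
}

for card, (chips, mb_per_chip) in configs.items():
    print(f"{card}: {chips * mb_per_chip} MB on a {chips * 32}-bit bus")
```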


RE: Missing details...
By nerdye on 3/11/2008 10:33:19 PM , Rating: 2
I was so hoping for a 512-bit-wide memory bus; that truly would have pushed the GeForce 9 series to be a big enough gain over the GeForce 8 series to warrant its incrementation in product name =(. I have been rocking a 7950GT 512MB for a long time now, and I assumed the 9800s would have the larger memory bus. We can't really expect the 9900 refresh to have a different memory bus, can we? Wouldn't that throw off the whole tick-tock cycle? Does anyone else feel like we got two tocks in a row and were really hoping for a tick, so to speak? haha =)

-E


RE: Missing details...
By Major HooHaa on 3/13/2008 12:55:08 PM , Rating: 2
So with a G92 core, is the 9800GX2 basically two "8800 GTS 512s" stuck together on one card, like the card that ATI (or AMD) produced with their "X2"?

Has the G92 had any improvements made to it, and does it now support DirectX 10.1?


"We don't know how to make a $500 computer that's not a piece of junk." -- Apple CEO Steve Jobs

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki