
The next codenames to remember: GT200 and RV770

After the NVIDIA financial analyst call last week, the company briefly hinted at its upcoming roadmap this year. 

AMD's RV770 architecture, set to launch this summer, will be followed almost immediately by a new graphics architecture from NVIDIA, codenamed GT200.  GeForce 9900, as it's been dubbed, is not simply a derivative of G92 like the current GeForce 9800 GX2 and GTX offerings.

NVIDIA has been extremely tight-lipped about GeForce 9900, as it overlaps considerably with its current high-end offerings.  NVIDIA's latest flagship products, the GeForce 9800 GTX and GeForce 9800 GX2, only made their debut last month.  Unlike the GeForce 9900, these cards were once again based on the G92 die -- nearly the same GPU used in the GeForce 8800 GT. 

Original NVIDIA roadmaps put the GT200 launch in late fall 2008. However, internal memos sent to board partners in early March indicate that the GT200 processor has already been taped out.  The same document states that the GT200 chip is very stable and has been ready to ship for reference designs for several weeks already.

The company gave no reason for moving the launch forward, but with AMD launching the successor to the successful RV670 (Radeon 3850, 3870) this summer, it seems entirely plausible that NVIDIA anticipates another close race. 

AMD partners claim the Radeon RV770 will make its debut this summer for under $300. NVIDIA's 9900 is currently scheduled as an ultra-high-end adapter, priced above the GeForce 9800 GX2 offerings, which retail for more than $500 today.

Channel partners indicate that the 9800 GTX and GX2 will begin phasing out next month in preparation for the GT200 launch. Both companies have made promises to show demonstrations of their next-gen cards at the Computex Taipei trade show on June 3, 2008.


I feel sorry for anyone who purchased 9800GX2
By bill3 on 4/16/2008 9:19:16 PM , Rating: 2
What a piece of crap. Nvidia should have never released that.

I guess you could say people who buy a $600 video card won't miss the money, but it's not really true, plenty of regular joes bought it. I remember a guy on H forums asking if he should buy it when it came out. Said he knew it was a bad idea but he wanted to buy the absolute highest end card for the only time in his life for his new system. It was really like a desire in his heart to buy the very best GPU.

I told him he was crazy, to wait, (real) new cards were likely right around the corner. I was right..

RE: I feel sorry for anyone who purchased 9800GX2
By just4U on 4/17/2008 1:30:54 AM , Rating: 2
you know, I don't have a problem with the 9800GX2. To me that's fine. It's got the 512-bit memory bus with the G92 core. I'd be curious to have a look at it myself.

The 9800GTX on the other hand, hmm, it really brings nothing new to the table over an 8800GTS/512 and probably shouldn't have had the GTX logo put on it, as it's sort of misleading.

RE: I feel sorry for anyone who purchased 9800GX2
By Warren21 on 4/17/2008 1:55:19 AM , Rating: 2
No, it's actually 2 x 256-bit, just like the 3870 X2. It really is two cards in SLI, no two ways about it.

RE: I feel sorry for anyone who purchased 9800GX2
By just4U on 4/17/2008 11:11:20 AM , Rating: 2
Interesting comment, Warren. Originally, when I went to check out the X2's specs on the site I tend to deal with, the brands had said 512-bit memory interface.. Now they say 256+256, or 512 (combined).

Damn, I didn't know that. I'd thought they'd revamped the memory controller for the X2's.

RE: I feel sorry for anyone who purchased 9800GX2
By Warren21 on 4/17/2008 5:26:55 PM , Rating: 2
For sure a 1GB pool of shared memory with a revamped 512-bit controller which handles both cores would provide better performance (or at least I would think so...).

Both the X2 and the GX2 were sort of afterthoughts, however (the X2 less so), so they were designed from cores not meant to be put into 'one card'. I guess a true 512-bit bus didn't show enough of a performance gain to outweigh the costs of redesigning a memory bus/PCB/etc.

By overzealot on 4/20/2008 1:05:43 PM , Rating: 2
I can cope with them referring to the GX2 and 3870 X2 as 512-bit interface cards (it's technically true, even if it is two 256-bit busses that can each only be accessed by their particular GPU), but I wish they wouldn't refer to them as 1GB. Sure, they take up 1GB of address space (making 32-bit users cry), but as far as textures and framebuffer go, you're just as good as with 512MB.
It's intended to confuse people, and it does.
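The arithmetic behind that complaint can be sketched in a few lines: on a dual-GPU board each chip has its own bus and its own mirrored copy of the working set, so the advertised totals double while the usable figures don't. This is a hypothetical back-of-the-envelope model (the function and numbers are illustrative, not vendor specs):

```python
# Toy model of a dual-GPU board where each GPU has its own memory bus
# and textures/framebuffer are mirrored across both memory pools.

def dual_gpu_card(per_gpu_mem_mb, per_gpu_bus_bits):
    """Contrast the advertised (summed) numbers with the effective ones."""
    return {
        "advertised_memory_mb": 2 * per_gpu_mem_mb,   # what the box says
        "effective_memory_mb": per_gpu_mem_mb,        # mirrored data: only one copy is usable
        "advertised_bus_bits": 2 * per_gpu_bus_bits,  # the "512-bit" marketing claim
        "per_gpu_bus_bits": per_gpu_bus_bits,         # what each GPU can actually access
    }

# Numbers in the 9800 GX2 / 3870 X2 ballpark: 2 x 512MB, 2 x 256-bit.
gx2 = dual_gpu_card(per_gpu_mem_mb=512, per_gpu_bus_bits=256)
print(gx2["advertised_memory_mb"], gx2["effective_memory_mb"])  # 1024 512
```

In other words, a "1GB, 512-bit" dual-GPU card behaves like a 512MB, 256-bit card as far as any single frame's data is concerned.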


Copyright 2016 DailyTech LLC.