AMD's RV670 processor runs on a single-slot cooler design, though working samples unveiled last week used a two-slot solution instead.
Next-generation GPUs are the fastest things on the planet -- if they were released a year ago

Traditionally the Fall graphics refresh has been the battle of the titans -- ATI and NVIDIA both would debut behemoth video cards in an attempt to snag the headlines from one another.

Much of that changed when AMD acquired ATI last year.  Not only did ATI miss the Radeon HD 2900 launch window by almost six months, but NVIDIA's high-end GeForce 8800 became the undisputed ultra-high-end GPU as well.

This Fall, we will not get an ultra-high-end replacement from AMD or NVIDIA. Instead, November will be a clash of the sub-titans.  NVIDIA's mid-range G92 will go head-to-head with ATI's RV670.

ATI's RV670 has been called many things in the past. It was originally a 65 nanometer die-shrink of the R600-class GPU, then a 55 nanometer shrink. Taiwan Semiconductor Manufacturing Company, Asia's largest semiconductor foundry, confirmed AMD would go with a 55nm R600 shrink in a memo forwarded to DailyTech earlier this year.

When TSMC debuted its 55nm process earlier this Spring, the company claimed "significant die cost savings from 65 nm, while offering the same speed and 10 to 20 percent lower power consumption."  Since R600 was originally manufactured on an 80nm node, thermal improvements should be fairly dramatic on RV670.

Last week at the World Cyber Games, Sapphire demonstrated a working RV670 using a dual-slot cooler.  Sapphire and ATI engineers indicated to DailyTech that this dual-slot configuration will likely be replaced with a single-slot solution by the time of launch.

NVIDIA's G92 has also carried many names.  Originally slated as the 65nm "fill-in" GPU between GeForce 8600 and GeForce 8800, the company began changing documentation earlier this month as ATI's offerings began to firm up. 

NVIDIA confirmed the specifications of G92 with board partners earlier this week.  The GeForce 8800 GT will feature a 600 MHz core clock, a 900 MHz memory clock and a 256 bit memory interface.

Newest guidance from NVIDIA, released Monday, claims the 8800 GT will feature 112 stream processors and a shader clock locked at 1500 MHz. 

The one thing that didn't change on G92 is the process node.  NVIDIA's foundry partner, TSMC, forwarded a second memo to DailyTech confirming G92 is in mass production at the company's Fab 12, with samples available now on the 65nm process node.  NVIDIA's GeForce 8500 and GeForce 8600 are manufactured on TSMC's 80nm node; GeForce 8800 GT will be the company's first 65nm graphics processor.

NVIDIA guidance suggests G92 will be here next month, followed by AMD's marketing blitz for RV670, RD790 and Phenom.  All three AMD offerings are expected to launch on the same day, which AMD distributors have penciled in for late November. Intel is expected to launch its 45nm Penryn processors on November 12, and any NVIDIA launch will likely coincide with that announcement.

Late last week, Maximum PC reported that NVIDIA senior vice president Dan Vivoli commented that NVIDIA would be releasing new hardware to go along with the upcoming title Crysis.  Electronic Arts has confirmed a launch date of November 15, 2007 for Crysis.

However, G92 fans might get a quick preview of the new GPU on October 29, 2007, when the company officially lifts the embargo on 8800 GT.

Neither AMD nor NVIDIA has released "firm" pricing for the products, though we can reasonably infer several key points regarding the price.  Since RV670 is effectively a smaller R600, performance will be very similar to existing R600-based cards on the market today.  However, since the card only utilizes a single-slot cooler and a considerably smaller die, the cost of these cards should be lower than existing R600s.

G92, which was originally called GeForce 8700 until just last week, has a soft suggested retail price of $250, according to NVIDIA board partners.  Since the GeForce 8800 GT will be launching first, it's fairly likely that AMD will adjust the suggested price of RV670 depending on the outcome of initial GeForce 8800 GT feedback.

Comments

By Some1ne on 10/9/2007 5:07:24 PM , Rating: 3
So the real next-gen cards are due out...when?

Has the pace of graphics innovation slowed down primarily because after being acquired by AMD, ATI offerings have ceased to be competitive with those from nvidia in terms of sheer performance? Is nvidia content to just milk their position at the top, for however long they happen to have no real competition from AMD/ATI?

RE: Sub-titans...
By James Holden on 10/9/2007 5:08:30 PM , Rating: 5
Has the pace of graphics innovation slowed down primarily because after being acquired by AMD, ATI offerings have ceased to be competitive with those from nvidia in terms of sheer performance? Is nvidia content to just milk their position at the top, for however long they happen to have no real competition from AMD/ATI?


RE: Sub-titans...
By kilkennycat on 10/9/07, Rating: 0
RE: Sub-titans...
By retrospooty on 10/9/2007 9:37:49 PM , Rating: 5
All of what you say may be true, except the "no" part. Nvidia is milking the hell out of the AMD/ATI stumble. They have been milking the 8800GTX for over 1 year, and will continue to do so as long as they can.

RE: Sub-titans...
By masher2 on 10/9/2007 10:56:47 PM , Rating: 1
> "They have been milking the 8800GTX for over 1 year"

Slightly under a year, actually. The GTX was released Nov 2006.

RE: Sub-titans...
By Dactyl on 10/9/2007 10:58:19 PM , Rating: 3
If by "milking" you mean "making profits while they develop the next generation of cards," then NVidia has been milking the situation for the past 10+ years.

Sometimes they have a good year (8800), sometimes they have a bad year (FX5900). NVidia isn't "milking" anything, because it's not resting on its 8800 line. NVidia did not fire its design team, pocket their salaries, and decide to sell 8800s for the next five years.

RE: Sub-titans...
By mushi799 on 10/10/2007 5:23:41 AM , Rating: 3
It's milking by today's standards. The current video war, or lack thereof, is dismal. Usually by now we get hints/previews of the next generation, but none so far.

RE: Sub-titans...
By retrospooty on 10/10/2007 9:48:11 AM , Rating: 5
"If by "milking" you mean "making profits while they develop the next generation of cards,""

No, milking means they are not pushing the next gen out as fast as they could have and would have if AMD had a viable competitor product; thus they are "milking" profit out of the current chip. See Intel prior to the Athlon for reference.

RE: Sub-titans...
By daniyarm on 10/10/2007 11:30:25 AM , Rating: 1
So you would rather see Nvidia release a new GPU and then try to get the drivers right for the next year or so? I'd rather have them offer the whole solution at once. I think they learned their lesson with 8800. You can't rush a product to market when software support is in beta stages.

RE: Sub-titans...
By retrospooty on 10/10/2007 10:30:34 PM , Rating: 3
They need to do both, like they used to; it's not either/or. The driver team is not related to the hardware team. The drivers suck now because they let a lot of the talent go, either through layoffs or attrition; they did not replace the talent that left the department.

RE: Sub-titans...
By 1078feba on 10/10/2007 12:50:56 PM , Rating: 2
Hopefully they're making profits. $425,000,000 in G80 R&D is quite a bit to recoup.

Although I can see the counterpoint: a lack of direct competition at the highest level most assuredly would lead a company to at least sit back and take a deep breath before rolling up its sleeves and digging in again.

The flip side of this is that with the 8800 series now so prevalent, with so many sold, NVidia has serious impetus to keep working on the drivers to get the absolute max performance out of them, one would think.

Mike Magee, the old geezer who runs the Inquirer and also has a weekly column in CPU magazine, used his space in November's issue (that's right, November; subscribers get it early) to lay it on pretty thick for Intel's upcoming Larrabee.

Times, they are a-changin'.

RE: Sub-titans...
By thartist on 10/10/2007 4:12:45 PM , Rating: 2
You don't have to take it to such an extreme. nVIDIA is not holding still in development, but it has definitely slowed down on some fronts.

Do the thinking yourself: do you think we would be under such frozen video card line-ups if the AMD/ATI catastrophe hadn't occurred?

RE: Sub-titans...
By Ryanman on 10/14/2007 2:57:34 PM , Rating: 1
Yeah, they are milking. And they have every damn right to do so.

RE: Sub-titans...
By kilkennycat on 10/10/2007 5:19:46 AM , Rating: 2
The acquisition of ATi was (imho) the biggest blunder AMD ever made. Going several billion dollars into debt for a company with a consistent record of non-performance in meeting promised new-product delivery schedules. And ATi has continued to non-perform in that department to this day. This has severely hurt AMD's bottom line, besides the interest payments on the debt. With the imminent arrival of Penryn and the closely following low-power mobile products on Intel's 45nm process, AMD is going to be in a world of hurt.

RE: Sub-titans...
By murphyslabrat on 10/10/2007 12:45:25 PM , Rating: 2
AMD is about 75% smaller than Intel. When you are in that kind of position, you cannot muscle your way up the ladder without a vastly superior product, and "playing it safe" will not get you there. You have to take risks, but sometimes they hurt.

AMD's K9 processors are an example of a failed gamble. Designed to be massively parallel in preparation for excellent multitasking, they performed poorly and sucked energy. And this failure was costly, though to what degree is currently unknown (at least to me; if anyone has any articles, please share).

AMD has also taken a risk with the acquisition of ATI, the intent of which is apparent, as anyone who has dredged through their hordes of "pretty pictures detailing market dominance" can tell you: a complete GPGPU solution, integrated into the CPU core. This would be an excellent solution, as it would make PC buying simpler for an uneducated consumer and make integrated graphics redundant for an AMD box. This is in addition to the immediate benefits, namely revenue from ATI's customer base, potentially doubling gross income. This would encourage investment in AMD, as it would appear to be twice the company it was before.

Obviously, as you say, it was an immensely risky move, and one that, as of yet, is a greater cause of gray hairs than old age for many of those involved. With the volatile nature of this market, you cannot afford downtime, as a few bad years can ruin the next decade. Three years ago would have been an excellent time to perform this venture, as AMD was gaining market share and ATI contrasted a horrible year (the X000 series) with an immensely successful period immediately prior. Furthermore, Intel had its head up its butt with Netburst, and is only now recovering from the poor image it had generated in the enthusiast market.

However, they didn't. They only attempted this venture as the horizon started to look particularly bleak. Whether this is too little, too late remains to be seen; but nonetheless, as evidenced by Via, AMD is going to be around for a while yet.

RE: Sub-titans...
By RedFlyer7 on 10/11/2007 12:06:10 AM , Rating: 3
No doubt everyone and their grandma was expecting the G92 codename to be their new flagship 9800-series product, released in November to coincide with the Intel Penryn CPU release, followed by the usual 6-month trickle-down of the new technology into mid-range and low-end products for the new generation. Obviously all parties are disappointed except for maybe AMD/ATI. This is somewhat reflected in nVidia's recent share-price fluctuations once the 8800GT news was exposed.

Milking the tech is OK, that's business, but hopefully they're not sitting on their laurels. Lessons from the past demonstrated how ATI dominated with the 9700 Pro and subsequently lost it for the next 4 years with the intro of nVidia's 6800, 7800, and 8800 series. For 3 generations they have lost the battle to nVidia, and though nVidia holds the crown now, the balance can shift on a dime. Any of you old geezers remember S3? S3 once sold more video chips than Intel. How about Cirrus Logic? Trident? All once wore the crown and dominated the field, yet where are they now? This leads to the next question: what of the future?

AMD's acquisition of ATI may seem like a bad move at the moment because there are no immediate benefits, but in the long run this is a great move. The dream of a system on a chip (SoC) is one step closer. Even with the power of today's Core 2 Duo Penryns, the cost/performance ratio of adding more cores to a chip drops off dramatically beyond 4 cores, except in a limited range of specialty applications.

"The next evolutionary step is to have multiple CPU cores side by side with multiple GPU cores and possibly other additional custom DSP cores like physics processors."

This is where you will get the most performance in the future. Let's examine the position of each horse in this race:

1. AMD/ATI have embarked on this road already with their Fusion project. From the CPU+GPU-on-the-same-die perspective they are way ahead, since they have both mature and cutting-edge CPU and graphics products. Chipset-wise they were dead last behind Intel and nVidia, but that should not be difficult to remedy. The great leap forward has already begun for them, although in the immediate future they may have to stomach Intel dominance in the CPU world for another year, and nVidia dominance in the GPU world.

I am an nVidia fan and have 7 cards from their product families, but I have great faith in AMD/ATI. After all, wasn't it AMD who came out with the Athlon when the Pentium was king? Wasn't it AMD who pioneered the on-die memory controller to increase performance? Wasn't it AMD who created the HyperTransport model for improved motherboard I/O instead of just plain FSB+PCI I/O? Wasn't it AMD who created the AMD64 (x86-64) instruction set to support 64-bit OSes like Vista x64, which Intel was later forced to copy and support even to this day in its Core 2 Duo line? And don't forget it was AMD who first introduced dual-core to the world. The need for survival brings out the best in this Darwinian world, and the case remains for AMD to be the cutting-edge leader once again. Time is of the essence, because if you look farther into the future, they have the lead, although it may not seem like it. Let the price-cutting for survival begin until Fusion is released.

2. Intel. Where to begin? Their history is well known. Their strength is not innovation unless there is competition (thank you, AMD). What they do have is second-to-none manufacturing prowess. They can sweep the competition under a tidal wave of CPUs, chipsets, etc. They have been dominating for the past 2 years with the release of the Core Duos and Core 2 Duos. They release their chips on smaller process tech, faster and in greater quantity than anyone else. AMD, however, can come back strong by using world-class manufacturers like TSMC or UMC, who are no slouches and can give Intel a run for its money in terms of technology and mass-production capabilities.

Intel's CPU tech, although not innovative, copies and enhances the original in some cases. Remember when the Japanese used to be known for being able to copy and make better products than the US? Intel has a mature CPU platform, but its weakness is graphics. Mostly they have been content for the last 2 decades to make integrated low-performance cores. Now, to compete, they may be forced overnight to make high-end GPUs, if not leading then at least on a comparable level to ATI/nVidia. Why not just snap up nVidia then? Simply because, as big as Intel is, it cannot afford nVidia. That is why they are frantically trying to build a graphics team from the ground up, and they will use any means necessary to hire the best engineers.

3. nVidia, the current king of the discrete graphics market, can possibly eat some of the integrated market share with their recent IGP releases. They are the undisputed leader at the moment, but what they lack is a general CPU product. We have already seen that in the graphics market the crown can pass from one company to another in the blink of an eye. What are their options? Cuddle up in bed with Intel or one of the other CPU makers (such as Sun with Niagara), or even buy a platform like Sony's Cell chip to create that dream of an SoC with CPU+GPU on the same die. However, this is all unlikely, as the hardest mountain to climb is convincing software developers to support yet another architecture. Some people may say nVidia is in the worst position and could be relegated to second or third tier, just supplying chips to Intel. Yet somehow I doubt nVidia's CEO, good ole Jen-Hsun, will allow that. nVidia simply needs to continue doing what they do best: run a tight ship and produce the best products at a lower price.

RE: Sub-titans...
By NT78stonewobble on 10/11/2007 3:18:49 PM , Rating: 2
Lessons from the past demonstrated how ATI dominated with the 9700 Pro and subsequently lost it for the next 4 years with the intro of the nVidia 6800 series -> 7800 series and 8800 series. For 3 generations they have lost the battle to nVidia, and though nVidia holds the crown now, the balance can shift on a dime.

Well, actually the X800s and X850s were superior compared to the 6800s.

ATI was down again when it was the X1800 versus the 7800s, and ahead again with the X1900-X1950s.

RE: Sub-titans...
By 3kliksphilip on 10/13/2007 5:50:18 PM , Rating: 2
The X800s are now obsolete due to their SM2.0-only support. I think that the X1950 Pro is ATI/AMD's lifeline. It's a fantastic mid-range product which kicks the GeForce 8600 / ATI X2600... in the face.

RE: Sub-titans...
By Hlafordlaes on 10/12/2007 5:48:08 PM , Rating: 2
Agreed. Had the odd thought that a VIA-NVidia linkup might be mutually beneficial. VIA's got the old x86 CPU from Cyrix, and has made inroads in the mini-ITX market. However, they are about to get their lunch eaten as Intel is launching an SFF initiative. VIA's boards are OK for general-purpose computing, but it's the poor integrated graphics that are really hurting the ability to drive VIA's mini-ITX into the set-top and HTPC market. A deal with NVidia would seem wise. As far as NVidia taking advantage of VIA's CPUs, I'll let more knowledgeable posters punt on that. I'd love to think that such a match-up could become a third option to Intel/AMD, even if in niche form factors.

RE: Sub-titans...
By sxr7171 on 10/9/07, Rating: -1
RE: Sub-titans...
By retrospooty on 10/10/2007 9:50:20 AM , Rating: 2
Intel has a viable competitor in AMD. The A64s are not far behind, and the Phenom may well be a bit ahead once they fix the yield issues. Intel milked the hell out of us before the Athlon, and would do it again if AMD wasn't producing competitive chips. Yes, the A64 is behind, but not by far, especially when you look at the price/performance ratio.

RE: Sub-titans...
By GTaudiophile on 10/10/07, Rating: 0
RE: Sub-titans...
By JWalk on 10/28/2007 9:17:27 PM , Rating: 2
Right. And I am sure that if ATI had "won the war" then they would have continued to innovate at a fast pace, right? Nvidia is doing what any company would do. They have the opportunity to slow things down a bit and take their time developing their next products. Am I personally happy that there aren't new GPUs coming as fast and cheap as before? No. But I don't blame Nvidia for doing a good job.

RE: Sub-titans...
By ultimatebob on 10/11/2007 10:38:07 PM , Rating: 2
Unfortunately. I really wish that S3 or Matrox would get back into the gaming graphics card market, since we really need some competition on the high end.

RE: Sub-titans...
By Farfignewton on 10/13/2007 6:13:57 PM , Rating: 2
I really wish that S3 or Matrox would get back into the gaming graphics card market, since we really need some competition on the high end.

I've been watching the 3d accelerator market since the 3dfx voodoo, but I guess I must have blinked or something and missed the entire high end S3 and/or Matrox period. Matrox was never more than mid-level that I can recall, and if S3 was still in the game they'd probably be pushing something worse than integrated graphics, like video acceleration through USB 1. ;)

RE: Sub-titans...
By Martimus on 10/15/2007 3:03:50 PM , Rating: 2
Matrox was a major player before the 3D accelerator market. They were often considered the premier 2D graphics card producer. Even early this decade, some considered them the best dual monitor videocard on the market.

RE: Sub-titans...
By jabber on 10/16/2007 12:58:36 PM , Rating: 2
Indeed, around 1998 the setup du jour was a Matrox Millennium II partnered with a 3dfx Voodoo2. It didn't get much better than that!

Though I was running with a Matrox Mystique 330(?) and a Matrox M3D at the time. The game support wasn't so good, but the image quality from the PowerVR setup was better.

Happy days.

RE: Sub-titans...
By noirsoft on 10/9/2007 5:17:23 PM , Rating: 2
I would say that the technical demands of DX10 + Vista and the new hardware/driver architecture required can just as easily explain the slight dip. This time next year will say for sure if there's an actual slowdown in improvements.

RE: Sub-titans...
By masher2 on 10/9/2007 6:49:59 PM , Rating: 3
I'll generate some heat for saying so, but I believe we're seeing NVidia and AMD both forced into a strange semblance of Intel's "tick-tock" business model. I don't think we'll ever go back to seeing major chip revisions every 12 months, or to their debuting top-end GPUs on the smallest process node.

RE: Sub-titans...
By Orbs on 10/9/2007 7:09:55 PM , Rating: 2
I don't know if I agree with that. The process changes/die shrinks can still happen "off schedule" per the new architecture, and long before Intel coined "tick-tock", nVidia and ATI were introducing new architectures in the fall followed by a "spring refresh". If the refresh coincided with a new process or die shrink, we could still get a major revision every 12 months.

RE: Sub-titans...
By Captain Orgazmo on 10/9/2007 8:43:58 PM , Rating: 5
I actually hope it slows down. Being male, I am compelled to buy all new gadgets, and as a result my wallet looks like it has spent the last year in a gulag in Siberia. Also, most games I play are good enough to have been around for a couple of years (or more), and the newest graphics card was unnecessary, but, like I said, I'm male. So I thank God or the Martians that my vid card will be acceptable for the next few months anyways.

RE: Sub-titans...
By helios220 on 10/10/2007 9:40:46 AM , Rating: 2

Look at what happens in the console world; a set of baseline hardware is released with little to no performance changes for several years... yet the quality of visuals in the games released for that console increase significantly over time.

When hardware isn't dramatically changing all of the time, it makes developers find new ways to optimize their design and techniques to provide better quality software rather than just assuming the consumer will take it up the @$$ to buy $600 graphics cards every year.

While I am always thrilled by the advent of new technology, in the end I get sick of always knowing that no matter how much money I throw down on a card it'll be replaced by something better within a matter of months or at best a year.

RE: Sub-titans...
By mars777 on 10/10/2007 2:50:40 PM , Rating: 2
I highly doubt you will see many significantly visually improved games on the X360, for one.

Halo3 and Crysis are pushing its hardware on the maximum already.

If visual improvements do come, they will not be related to the hardware specs, but to developer tricks and magic, like the Gran Turismo line on the PS2.

The last Gran Turismo on the PS2 was a product of artists and programmers that we should all bow to: placement of objects in the scene based on typical player paths, using textures and even sprites on segments of the scene where it doesn't degrade quality, etc.

This is programming, not the EA sports crap :)

RE: Sub-titans...
By Amiga500 on 10/9/2007 5:29:58 PM , Rating: 2
A good warning for the CPU market then eh?

Take heed those that love to see AMD neck deep in brown stuff.

I believe the next gen of GPUs is due out in Q2 next year... maybe.

RE: Sub-titans...
By 3kliksphilip on 10/9/2007 6:23:54 PM , Rating: 2
I see no need for ultra-high-end products. A GeForce 8800 or ATI X2900 will play all games out now at playable frame rates. When DX10 becomes more commonplace, a more powerful GPU will be needed. I'm just happy that the mid-range is gradually being filled in. The 8600s were pathetic. As resolutions increase, NVIDIA and ATI seem to think that mid-range products should only run at acceptable levels at 1024 x 768 or the like. Still, they got their £240 from me because of it.

RE: Sub-titans...
By MrBungle123 on 10/9/2007 7:24:22 PM , Rating: 4
You may see no need, but I like to run games at my monitor's max resolution of 1600x1200 with everything on High... that requires a high-end video card with newer games.

RE: Sub-titans...
By Lakku on 10/9/2007 8:31:42 PM , Rating: 1
I don't know what the rest of your PC is like, but I can play any new game on a 8800gtx at 1680x1050 with settings maxed with more than playable framerates. That includes playing games in DX10 mode if available. I can't always enable 4x or higher FSAA, but 2x is usually playable, or I can do with none at all. Crysis will play on a single 8800gtx without FSAA at these resolutions, no problem. So, I'm not sure why you'd need even more power at the 1600x1200 resolution unless you're being really picky. :)

RE: Sub-titans...
By sxr7171 on 10/9/2007 11:36:20 PM , Rating: 2
Why not have cards that can run all games at 1920x1200 with 4x AA minimum? They introduced up to 16x AA, so why not have a card that can run games with it enabled? I know it doesn't affect gameplay, but it looks very nice. Should we all stagnate at 1680x1050 with no AA forever?

If anything we need to be moving to higher resolutions and higher PPI with our OSes supporting scaling. Now that's what would separate your Xbox 360 from a high end gaming PC.

Manufacturers should reach higher and not rest on their laurels.

RE: Sub-titans...
By hrah20 on 10/9/2007 11:58:49 PM , Rating: 3
(Why not have cards that can run all games at 1920x1200 with 4x AA minimum?)

I was thinking the same.

RE: Sub-titans...
By Lakku on 10/10/2007 12:09:43 AM , Rating: 2
I didn't say rest forever; but considering the cards now can play any game at almost any resolution, most with AA, video card companies have no incentive to release something new, especially when one company isn't offering much competition on the high end. Eventually you may get your wish, and that will come with time, but no company is in any rush to push farther when the current situation is just fine for the most part.

RE: Sub-titans...
By murphyslabrat on 10/10/2007 1:03:12 PM , Rating: 2
The answer? Because that will cost more. More power = higher-quality design/silicon, and that equals more cost. The design itself takes longer to complete, with a higher chance of bugs, and costs more, as it takes up more die space. With the advent of efficient 3-D layouts this might change; or they could just maintain the current level of performance and you pay half the price for it. Then, of course, if you require higher clock rates, you deal with higher-quality transistors, or with being more selective about your chips, which ends up with more of them being tossed or budget-binned. Both of these solutions result in either higher manufacturing cost or lower yields, both of which also drive up the price like a highway chase scene.

So, the reason why? Because most people prefer a physics/light/AI-intensive game (aka realistic) to beautifully rendered, smooth-edged, 2560x1920 Quake 2 models. If resolution must be sacrificed to give me relatively cheap lifelike scenes, I am all for it.

RE: Sub-titans...
By johnadams on 10/10/2007 8:09:29 AM , Rating: 2
The emulation scene should write emulators for modern consoles that make full use of the GPU to do as much of the processing as possible. Then we'd have a lot of use for GPUs more powerful than an overclocked 8800GTX. Imagine being able to load and play PS3 games in Vista at full frame rates, or at even higher resolutions with additional filtering and anti-aliasing.

RE: Sub-titans...
By sheh on 10/10/2007 9:03:52 PM , Rating: 2
First they'll have to finish with the previous gen emulators... (Maybe in a few months or a year the PS2 would be mostly conquered.)

RE: Sub-titans...
By nrb on 10/10/2007 8:50:35 AM , Rating: 2
So the real next-gen cards are due out...when?
According to rumours, in January there will be new cards released that feature two G92 or RV670 chips on one card. Supposedly the RV670 version will be two chips on one PCB, while the nVidia equivalent will be two distinct PCBs fitting into one expansion slot (like the 7950GX2).

In Q2/2008 ATI aims to launch its next chip, R700. This is rumoured to be designed for multi-chip configurations, so there will probably be both low- and high-end variants. We don't know anything about what Nvidia might be preparing to counter it, but it is reasonable to assume there will be something.

So, depending on your definitions, the answer to your question is either "January" or "Q2".

RE: Sub-titans...
By AstroCreep on 10/10/2007 3:02:09 PM , Rating: 2
Is nvidia content to just milk their position at the top, for however long they happen to have no real competition from AMD/ATI?

Wouldn't you?

While I don't necessarily think that nVidia is "Milking" their current position, I think it would be a very arrogant move to reveal too much of their current hand before it's time for the flop. They very likely may, but if they want to keep their current advantage it would be wise to play their cards close to their chest...for now.

terrible news
By LumbergTech on 10/9/2007 5:28:09 PM , Rating: 1
This is horrible news for people who like having high frame rates... the 8800 series is terrible in DirectX 10 performance, and now the only solution will be to get DX10 cards in SLI if you care to actually get 60 fps in a wide variety of games.

RE: terrible news
By maroon1 on 10/9/2007 5:53:57 PM , Rating: 2
the 8800 series is terrible in direct x 10 performance

At least it is better than HD2900 series in DX10

Here is the latest DX10 review

RE: terrible news
By LumbergTech on 10/9/2007 7:19:19 PM , Rating: 2
WOW... yeah, I didn't mean to pick on nVidia specifically... I was just pointing out that DX10 performance isn't too good most of the time... that ATI card does horrifically in World in Conflict.

RE: terrible news
By Lakku on 10/9/2007 8:51:50 PM , Rating: 2
The semi-sad part is that the nVidia drivers used aren't the newest betas released specifically for Crysis and Bioshock, etc. Those drivers have fixed a lot of lingering problems with the 8x00 series and improve DX10/9 performance of Bioshock by a bit.

RE: terrible news
By 1078feba on 10/10/2007 12:32:47 PM , Rating: 2
I agree. Even though that series has been out for 11 months, I think there is still a lot of room for improvement in the drivers, particularly in DX 10.

Interesting to note that Yerli specifically mentions that Crysis runs better in Vista x64 than x32.

RE: terrible news
By zander55 on 10/10/2007 6:52:35 PM , Rating: 2
Those benchmarks are already obsolete. Catalyst 7.10 is already a reality and promises huge improvements, especially in DX10 mode. These benchmarks show decent improvement, but they're far from comprehensive.

RE: terrible news
By jeffery on 10/9/2007 6:56:18 PM , Rating: 2
The inability of current-gen hardware to run DX10 games at an acceptable performance level is not entirely due to hardware deficiency (although the mid-level parts are severely hamstrung by their 128-bit memory interface). I think much of the dramatic frame-chugging observed even with top-tier hardware is due to a combination of driver infancy and the fact that most games are not really taking advantage of the efficiency of the new API (i.e., DX10 "features" are simply tacked on).

I was hoping for more
By Orbs on 10/9/2007 5:14:38 PM , Rating: 2
It seems like it's been FOREVER since nVidia's 8800 line first debuted. I guess that's what happens when there's little competition (since the 8800 launch, up until very recently, AMD/ATI offered none really).

I guess that means spring at the earliest for a top-of-the-line card from either camp. So what do I put in my next PC is the big question...

RE: I was hoping for more
By i4mt3hwin on 10/9/2007 5:20:43 PM , Rating: 5
It's also probably what happens when you spend $450+ million on R&D for the chip. Nvidia is going to milk every last penny's worth out of the 8xxx series and I say good for them. If companies are willing to spend huge sums of money on furthering the research of whatever it may be, they deserve a little profit on the side.

RE: I was hoping for more
By Axbattler on 10/9/2007 7:46:18 PM , Rating: 2
The GeForce 8800 GTX reminds me of the Radeon 9700 Pro. As far as top-end graphics cards go, they are standing the test of time pretty well.

RE: I was hoping for more
By sxr7171 on 10/9/2007 11:39:14 PM , Rating: 1
Yeah, the 9700 Pro was a legend. It absolutely murdered anything else out there. Those were nice days, when ATI was competitive. I'm still waiting for AMD/ATI's second coming, but I won't be holding my breath.

RE: I was hoping for more
By CyborgTMT on 10/10/2007 1:26:32 AM , Rating: 1
My 9700pro died a couple of months ago, I still haven't been able to bring myself to throw it away. As much as I overclocked it, I'm surprised it lasted this long. My 8500 still works, if only I had a mobo that it would fit in.

good for nvidia, bad for people who buy from them
By VERTIGGO on 10/9/2007 6:17:08 PM , Rating: 3
So we're pretty much effed when it comes to upgrading for Crysis. We either have to spend top dollar for almost-sweet performance, or stay with what we have.

By cyriene on 10/9/2007 6:29:48 PM , Rating: 3
No doubt. I was looking to upgrade from my 7900GS OC, but can't imagine spending top dollar on a 8800GTX that has been out for a year. If it were a new iteration of a high-end card it may almost be worth it, but not for a card that old just waiting to be replaced...

Guess I'll have to go with lower settings for a bit.

By MrBungle123 on 10/9/2007 7:17:15 PM , Rating: 2
Pretty much what I'm thinking... I was hoping to replace my SLI'd 6800 GTs (which were fast three years ago but seriously lacking now) with a 9800 GTS, but it looks like that's not coming out any time soon. So now, do I drop $400+ on a year-old 8800 GTS and get mediocre performance in new games, or wait ANOTHER 3-6 months for the new generation of cards?

By Lakku on 10/9/2007 8:37:15 PM , Rating: 2
Crysis will play fine on a single 8800 GTX at anything at or under 1920x1200. Every time they have demoed the game, it's been on an 8800 GTX, and that card runs it just fine, at least without FSAA. You know someone is going to do a round-up when Crysis comes out, so just wait and see how all the cards perform. But if you had to buy now, you'd be fine playing Crysis at high or max settings with an 8800 GTX, though you'd be out 500 bucks.

What's with the naming scheme
By andreslin on 10/10/2007 3:55:39 PM , Rating: 2
Isn't G9X supposed to be the GeForce 9 series?
It really makes no sense to me that a GeForce 8 card uses G92 as its codename.
They have the freedom to name the product whatever they like, but it won't do them much good to confuse the consumer.

Also, I was hoping the real GeForce 9 would be out this winter, so I could finally kiss my snail machine goodbye... sigh.

RE: What's with the naming scheme
By DLeRium on 10/10/2007 7:22:04 PM , Rating: 3
I think it confuses enthusiast consumers like us, who actually remember core names like R600 and such. We are the same people who care about SLI systems and overclocking. Everyone else couldn't care less whether it's G100, G50 or G20.

R600 is 80nm GPU
By defter on 10/9/2007 6:19:52 PM , Rating: 2
Not 90nm.

8800GTX is NOT AMD/ATI's target
By rhog on 10/10/2007 1:43:15 PM , Rating: 2
I am always amazed by the people who post comments here thinking that the only way to win in the graphics card market is to have the fastest cards. This is simply not the case. AMD bought ATI to build integrated chipsets, just like the number-one graphics chip seller in the world: Intel.

As an owner of both NVIDIA (7050) and AMD (690G) solutions, I can say NVIDIA is hurting. The NVIDIA chipset is not as good on many fronts, including power consumption, image quality and even speed. While I am sure AMD would love to have a 2900 XTX beat NVIDIA's 8800 GTX, not very many of those cards are sold relative to the number of integrated chipsets. This is also why AMD came out with the low-power Athlon X2 BE series: a very nice and inexpensive solution that can display HD content while using very little power (which should make the greenies happy). NVIDIA's chipset, by contrast, is just a rehash of their previous generation.

AMD seems to me to be focused on a different market than NVIDIA, going where the money is rather than just the fame. Intel doesn't build the fastest video solutions, but they do build the most.

RV670 looks good, but too little to late...
By Warren21 on 10/9/07, Rating: -1
By Warren21 on 10/9/2007 5:54:49 PM , Rating: 2
Title should read *too late...

RE: RV670 looks good, but too little to late...
By defter on 10/9/2007 6:20:36 PM , Rating: 2
RV670 has 256bit memory controller. 512bit one would be insane for a mid-range part.

RE: RV670 looks good, but too little to late...
By Warren21 on 10/9/2007 6:36:09 PM , Rating: 2
RV670 isn't intended for only one SKU. Maybe the RV670 'GT' or RV670 'Pro' models [will feature 256-bit memory], but having the rumoured 825MHz core / 1200MHz (2.4GHz DDR) RV670 XT version limited to a 256-bit interface seems counter-productive (HD2900 XT = 512-bit, HD2950 XT = 256-bit?)

Numerous sites made outrageous claims months ago: that the HD 2900 Pro would, for example, have a reduced number of processor units or a 256-bit interface...

Thus, I think it's too early to badge this ASIC as 'midrange' just yet...

By James Holden on 10/9/2007 6:42:38 PM , Rating: 4
Numerous sites claimed outrageous facts

Be careful when you lump sites like Inquirer and "facts" into the same sentence.

By Vanilla Thunder on 10/10/2007 11:43:15 AM , Rating: 3
Diss the Inq all you want. Some of it's rubbish, and some of it turns out to be very true. Either way, it's nowhere near as pretentious as this site's readers can be, and it's wicked fun to read.

Snog yourselves.

Vanilla Thunder

By Jkm3141 on 10/29/2007 2:07:29 AM , Rating: 2
Well put.

RE: RV670 looks good, but too little to late...
By Anh Huynh on 10/9/2007 7:06:16 PM , Rating: 2
It's midrange because of the pricing structure, not the performance. Neither company has unveiled plans for its next-generation flagship quite yet, beyond minor code names.

By Frallan on 10/10/2007 3:20:34 AM , Rating: 1
Oh so true. The Inq reports to the best of their knowledge. Given that their knowledge is based on what they overheard from a drunk technician at some conference, you can't call it facts.

However, credit where credit is due: they often call the truth before anyone else.

RE: RV670 looks good, but too little to late...
By defter on 10/10/2007 3:18:55 AM , Rating: 2
but having the rumoured 825MHz Core / 1200MHz (2.4GHz DDR) RV670 XT version limited to a 256-bit interface seems counter-productive

Why? Actually, if such a high memory clock is used, then there won't be much need for a wide memory bus. Keep in mind that the current 2900 XT doesn't really benefit from its extra memory bandwidth.

A 2.4GHz memory clock with a 256-bit bus would give about 77GB/s of memory bandwidth. The 8800 GTS has 64GB/s of bandwidth and is doing fine against the 2900 XT, which has 106GB/s.

(HD2900 XT = 512-bit, HD2950 XT = 256-bit?)

They could call RV670 a 2900 Pro. There have been many examples of this kind: the GeForce 7950 GT is slower than the 7900 GTX, the Radeon X1950 Pro is slower than the X1900 XTX, and so on.
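Editor's note: the bandwidth figures in the comment above can be verified directly, since peak memory bandwidth is just the effective transfer rate times the bus width in bytes. A quick back-of-envelope sketch (the function name and exact clock figures are my own illustration, not from the thread):

```python
def mem_bandwidth_gb_s(transfer_rate_mt_s, bus_width_bits):
    """Peak memory bandwidth in GB/s: (transfers/sec) x (bytes per transfer)."""
    return transfer_rate_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

# Rumoured RV670 XT: 2.4 GT/s effective on a 256-bit bus
print(mem_bandwidth_gb_s(2400, 256))  # -> 76.8, the "about 77GB/s" above

# GeForce 8800 GTS: 1.6 GT/s effective on a 320-bit bus
print(mem_bandwidth_gb_s(1600, 320))  # -> 64.0

# Radeon HD 2900 XT: 1.65 GT/s effective on a 512-bit bus
print(mem_bandwidth_gb_s(1650, 512))  # -> 105.6, the "106GB/s" above
```

So the rumoured part would land between the 8800 GTS and the 2900 XT in raw bandwidth, which is the basis of defter's argument that a 256-bit bus is sufficient.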

By defter on 10/10/2007 3:19:26 AM , Rating: 2
Sorry, I meant: RV670 could be a 2950Pro

By Darkskypoet on 10/27/2007 4:14:33 PM , Rating: 3
Completely agree. In fact, you may actually see a GDDR4 part worth having with this chip. As well, you could see higher clocks come not only from the die shrink, but also from the lowered complexity/transistor count. This could give more to the new part than the simple die shrink alone, and lead to a better overclocking experience.

Additionally, if one is looking to throw two chips on a card, the drop in memory bus complexity is a must to bring cards out at a decent price point. Imagine the memory traces otherwise, and the complexity of the PCB... yuck.

R600 was also meant to run well in multi-GPU configurations, so I'd expect this to carry through to the RV670. I'd also expect the RV670 to perform better than the R600 in many, if not all, situations (if they want it to).

Remember, the R600 wasn't just a gaming card. It is a GPGPU, and as such considerations other than just winning the top spot for video cards are important. NVIDIA is just bringing Tesla into the field, whereas AMD/ATI have had their GPGPU solutions selling to server vendors (and Folding nuts) since the X1900 (perhaps the X1800). Because of this, and limited AMD/ATI resources, the R600 was a compromise from the beginning, not a shot at the super high end at all.
