
AMD's RV670 processor runs on a single-slot cooler design, though working samples unveiled last week used a two-slot solution instead.
Next-generation GPUs are the fastest things on the planet -- if they were released a year ago

Traditionally the Fall graphics refresh has been the battle of the titans -- ATI and NVIDIA both would debut behemoth video cards in an attempt to snag the headlines from one another.

Much of that changed when AMD acquired ATI last year.  Not only did ATI miss the Radeon HD 2900 launch window by almost six months, but NVIDIA's high-end GeForce 8800 became the undisputed ultra-high-end GPU as well.

This Fall, we will not get an ultra-high-end replacement from AMD or NVIDIA. Instead, November will be a clash of the sub-titans.  NVIDIA's mid-range G92 will go head-to-head with ATI's RV670.

ATI's RV670 has been called many things in the past. It was originally a 65 nanometer die-shrink of the R600 class GPU; then a 55 nanometer shrink. Taiwan Semiconductor Manufacturing Company, Asia's largest core-logic foundry, confirmed AMD would go with a 55nm R600 shrink in a memo forwarded to DailyTech earlier this year.

When TSMC debuted its 55nm process earlier this Spring, the company claimed "significant die cost savings from 65 nm, while offering the same speed and 10 to 20 percent lower power consumption."  Since R600 was originally manufactured on an 80nm node, thermal improvements should be fairly dramatic on RV670.
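To put the shrink in perspective, die area ideally scales with the square of the feature-size ratio. The sketch below is a back-of-the-envelope calculation only; real die sizes depend on layout and added logic, so the exact RV670 figure is not claimed here.

```python
# Ideal (optical) die-area scaling between process nodes: at the same
# transistor count, area scales with the square of the feature-size ratio.
def ideal_area_ratio(old_nm: float, new_nm: float) -> float:
    return (new_nm / old_nm) ** 2

# R600 (80 nm) -> RV670 (55 nm): the shrink alone would cut die area
# to roughly 47% of the original, before any design changes.
print(f"{ideal_area_ratio(80, 55):.2f}")  # -> 0.47
```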

Last week at the World Cyber Games, Sapphire demonstrated a working RV670 using a dual-slot cooler.  Sapphire and ATI engineers indicated to DailyTech that this dual-slot configuration will likely be replaced with a single-slot solution by the time of launch.

NVIDIA's G92 has also carried many names.  Originally slated as the 65nm "fill-in" GPU between GeForce 8600 and GeForce 8800, the company began changing documentation earlier this month as ATI's offerings began to firm up. 

NVIDIA confirmed the specifications of G92 with board partners earlier this week.  The GeForce 8800 GT will feature a 600 MHz core clock, a 900 MHz memory clock and a 256 bit memory interface.

Newest guidance from NVIDIA, released Monday, claims the 8800 GT will feature 112 stream processors and a shader clock locked at 1500 MHz. 
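Those numbers imply a straightforward peak-bandwidth figure. The calculation below assumes the quoted 900 MHz is the base clock of double-data-rate (GDDR3) memory, i.e. 1800 MT/s effective; the article does not state the memory type, so treat this as an illustrative estimate.

```python
# Peak memory bandwidth = effective transfer rate x bus width in bytes.
# Assumes double-data-rate memory: 900 MHz base clock -> 1800 MT/s effective.
def peak_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int,
                       data_rate: int = 2) -> float:
    transfers_per_sec = mem_clock_mhz * 1e6 * data_rate
    return transfers_per_sec * (bus_width_bits / 8) / 1e9

# GeForce 8800 GT guidance: 900 MHz memory clock, 256-bit interface.
print(f"{peak_bandwidth_gbs(900, 256):.1f} GB/s")  # -> 57.6 GB/s
```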

The one thing that didn't change on G92 is the process node.  NVIDIA's foundry partner, TSMC, forwarded a second memo to DailyTech confirming G92 is in mass production at the company's Fab 12, with samples available now on the 65nm process node.  NVIDIA's GeForce 8500 and GeForce 8600 are manufactured on TSMC's 80nm node; the GeForce 8800 GT will be the company's first 65nm graphics processor.

NVIDIA guidance suggests G92 will be here next month, followed by AMD's marketing blitz for RV670, RD790 and Phenom.  All three AMD offerings are expected to launch on the same day, which AMD distributors have penciled in for late November. Intel is expected to launch its 45nm Penryn processors on November 12, and any NVIDIA launch will likely coincide with that announcement.

Late last week, Maximum PC reported that NVIDIA senior vice president Dan Vivoli commented that NVIDIA would be releasing new hardware to go along with the upcoming title Crysis.  The confirmed launch date by Electronic Arts for Crysis is November 15, 2007.

However, G92 fans might get a quick preview of the new GPU on October 29, 2007, when the company officially lifts the embargo on 8800 GT.

Neither AMD nor NVIDIA has released "firm" pricing for the products, though we can reasonably infer several key points regarding pricing.  Since RV670 is effectively a smaller R600, performance will be very similar to existing R600-based cards on the market today.  However, since the card utilizes only a single-slot cooler and a considerably smaller die, the cost of these cards should be lower than existing R600s.

G92, which was originally called GeForce 8700 until just last week, has a soft suggested retail price of $250, according to NVIDIA board partners.  Since the GeForce 8800 GT will be launching first, it's fairly likely that AMD will adjust the suggested price of RV670 depending on the outcome of initial GeForce 8800 GT feedback.


RE: Sub-titans...
By noirsoft on 10/9/2007 5:17:23 PM , Rating: 2
I would say that the technical demands of DX10 + Vista and the new hardware/driver architecture required can just as easily explain the slight dip. This time next year will say for sure if there's an actual slowdown in improvements.

RE: Sub-titans...
By masher2 on 10/9/2007 6:49:59 PM , Rating: 3
I'll generate some heat for saying so, but I believe we're seeing NVidia and AMD both forced into a strange semblance of Intel's "tick-tock" business model. I don't think we'll ever go back to seeing major chip revisions every 12 months, or to their debuting top-end GPUs on the smallest process node.

RE: Sub-titans...
By Orbs on 10/9/2007 7:09:55 PM , Rating: 2
I don't know if I agree with that. The process changes/die shrinks can still happen "off schedule" per the new architecture, and long before Intel coined "tick-tock", nVidia and ATI were introducing new architectures in the fall followed by a "spring refresh". If the refresh coincided with a new process or die shrink, we could still get a major revision every 12 months.

RE: Sub-titans...
By Captain Orgazmo on 10/9/2007 8:43:58 PM , Rating: 5
I actually hope it slows down. Being male, I am compelled to buy all new gadgets, and as a result my wallet looks like it has spent the last year in a gulag in Siberia. Also, most games I play are good enough to have been around for a couple of years (or more), and the newest graphics card was unnecessary, but, like I said, I'm male. So I thank God or the Martians that my vid card will be acceptable for the next few months anyways.

RE: Sub-titans...
By helios220 on 10/10/2007 9:40:46 AM , Rating: 2

Look at what happens in the console world; a set of baseline hardware is released with little to no performance changes for several years... yet the quality of visuals in the games released for that console increase significantly over time.

When hardware isn't dramatically changing all of the time, it makes developers find new ways to optimize their design and techniques to provide better quality software rather than just assuming the consumer will take it up the @$$ to buy $600 graphics cards every year.

While I am always thrilled by the advent of new technology, in the end I get sick of always knowing that no matter how much money I throw down on a card it'll be replaced by something better within a matter of months or at best a year.

RE: Sub-titans...
By mars777 on 10/10/2007 2:50:40 PM , Rating: 2
I highly doubt you will see many significantly visually improved games on the X360, for one.

Halo 3 and Crysis are already pushing its hardware to the maximum.

If visual improvements do come, they will not be related to the hardware specs, but to developer tricks and magic, like the Gran Turismo line on the PS2.

The last Gran Turismo on the PS2 was a product of artists/programmers that we should all bow to. Placement of objects in the scene related to typical player paths, using textures and even sprites on segments of the scene where it doesn't degrade quality, etc.

This is programming, not the EA sports crap :)

