



AMD's RV670 processor runs on a single-slot cooler design, though working samples unveiled last week used a two-slot solution instead.  (Source: Coolaler.com)
Next-generation GPUs are the fastest things on the planet -- if they were released a year ago

Traditionally the Fall graphics refresh has been the battle of the titans -- ATI and NVIDIA both would debut behemoth video cards in an attempt to snag the headlines from one another.

Much of that changed when AMD acquired ATI last year.  Not only did ATI miss the Radeon HD 2900 launch window by almost six months, but NVIDIA's high-end GeForce 8800 became the undisputed ultra-high-end GPU as well.

This Fall, we will not get an ultra-high-end replacement from AMD or NVIDIA. Instead, November will be a clash of the sub-titans.  NVIDIA's mid-range G92 will go head-to-head with ATI's RV670.

ATI's RV670 has been called many things in the past. It was originally planned as a 65 nanometer die shrink of the R600-class GPU, and then as a 55 nanometer shrink. Taiwan Semiconductor Manufacturing Company, Asia's largest core-logic foundry, confirmed AMD would go with a 55nm R600 shrink in a memo forwarded to DailyTech earlier this year.

When TSMC debuted its 55nm process earlier this Spring, the company claimed "significant die cost savings from 65 nm, while offering the same speed and 10 to 20 percent lower power consumption."  Since R600 was originally manufactured on an 80nm node, thermal improvements should be fairly dramatic on RV670.
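
For perspective, die area in an ideal optical shrink scales with the square of the feature size. The short Python sketch below is our own back-of-envelope illustration of the best case, not TSMC or AMD guidance; real layouts rarely shrink this cleanly.

# Ideal die-area scaling for an optical shrink: area scales with the
# square of the linear feature size. Real layouts never shrink this
# perfectly, so treat these as best-case figures.

def shrink_ratio(old_nm: float, new_nm: float) -> float:
    return (new_nm / old_nm) ** 2

print(f"80nm -> 55nm: {shrink_ratio(80, 55):.0%} of the original die area")  # ~47%
print(f"65nm -> 55nm: {shrink_ratio(65, 55):.0%} of the original die area")  # ~72%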

Last week at the World Cyber Games, Sapphire demonstrated a working RV670 using a dual-slot cooler.  Sapphire and ATI engineers indicated to DailyTech that this dual-slot configuration will likely be replaced with a single-slot solution by the time of launch.

NVIDIA's G92 has also carried many names.  The chip was originally slated as the 65nm "fill-in" GPU between GeForce 8600 and GeForce 8800, but the company began changing its documentation earlier this month as ATI's offerings began to firm up.

NVIDIA confirmed the specifications of G92 with board partners earlier this week.  The GeForce 8800 GT will feature a 600 MHz core clock, a 900 MHz memory clock and a 256 bit memory interface.

Newest guidance from NVIDIA, released Monday, claims the 8800 GT will feature 112 stream processors and a shader clock locked at 1500 MHz. 
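
Those figures are enough to pencil out the card's theoretical throughput. The Python sketch below is our own back-of-envelope math, assuming GDDR3-style double-data-rate signaling and a G80-style three floating-point operations per stream processor per clock (a MADD plus a MUL); neither assumption comes from NVIDIA's guidance.

# Theoretical throughput implied by NVIDIA's quoted 8800 GT specs.
# Assumptions (not confirmed by NVIDIA): GDDR3 double-data-rate memory,
# and 3 FLOPs per stream processor per clock (MADD + MUL).

MEM_CLOCK_MHZ = 900        # quoted memory clock
BUS_WIDTH_BITS = 256       # quoted memory interface width
SHADER_CLOCK_MHZ = 1500    # quoted shader clock
STREAM_PROCESSORS = 112    # quoted stream processor count

# DDR transfers data twice per clock; divide bits by 8 for bytes.
bandwidth_gb_s = MEM_CLOCK_MHZ * 2 * (BUS_WIDTH_BITS / 8) / 1000
print(f"Memory bandwidth: {bandwidth_gb_s:.1f} GB/s")    # 57.6 GB/s

# MADD (2 FLOPs) + MUL (1 FLOP) per stream processor per clock.
gflops = STREAM_PROCESSORS * SHADER_CLOCK_MHZ * 3 / 1000
print(f"Peak shader throughput: {gflops:.0f} GFLOPS")    # 504 GFLOPS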

The one thing that didn't change on G92 is the process node.  NVIDIA's foundry partner, TSMC, forwarded a second memo to DailyTech confirming G92 is in mass production at the company's Fab 12, with samples available now on the 65nm process node.  NVIDIA's GeForce 8500 and GeForce 8600 are manufactured on TSMC's 80nm node; GeForce 8800 GT will be the company's first 65nm graphics processor.

NVIDIA guidance suggests G92 will be here next month, followed by AMD's marketing blitz for RV670, RD790 and Phenom.  All three AMD offerings are expected to launch on the same day, which AMD distributors have penciled in for late November. Intel is expected to launch its 45nm Penryn processors on November 12, and any NVIDIA launch will likely coincide with that announcement.

Late last week, Maximum PC reported that NVIDIA senior vice president Dan Vivoli commented that NVIDIA would be releasing new hardware to go along with the upcoming title Crysis.  Electronic Arts has confirmed a launch date of November 15, 2007, for Crysis.

However, G92 fans might get a quick preview of the new GPU on October 29, 2007, when the company officially lifts the embargo on 8800 GT.

Neither AMD nor NVIDIA has released "firm" pricing for the products, though we can reasonably infer several key points regarding price.  Since RV670 is effectively a smaller R600, performance will be very similar to existing R600-based cards on the market today.  However, since the card utilizes only a single-slot cooler and a considerably smaller die, the cost of these cards should be lower than existing R600s.

G92, which was originally called GeForce 8700 until just last week, has a soft suggested retail price of $250, according to NVIDIA board partners.  Since the GeForce 8800 GT will be launching first, it's fairly likely that AMD will adjust the suggested price of RV670 depending on the outcome of initial GeForce 8800 GT feedback.






RE: Sub-titans...
By James Holden on 10/9/2007 5:08:30 PM , Rating: 5
quote:
Has the pace of graphics innovation slowed down primarily because after being acquired by AMD, ATI offerings have ceased to be competitive with those from nvidia in terms of sheer performance? Is nvidia content to just milk their position at the top, for however long they happen to have no real competition from AMD/ATI?

Yes.


RE: Sub-titans...
By kilkennycat on 10/9/07, Rating: 0
RE: Sub-titans...
By retrospooty on 10/9/2007 9:37:49 PM , Rating: 5
All of what you say may be true, except the "no" part. Nvidia is milking the hell out of the AMD/ATI stumble. They have been milking the 8800GTX for over 1 year, and will continue to do so as long as they can.


RE: Sub-titans...
By masher2 (blog) on 10/9/2007 10:56:47 PM , Rating: 1
> "They have been milking the 8800GTX for over 1 year"

Slightly under a year, actually. The GTX was released Nov 2006.


RE: Sub-titans...
By Dactyl on 10/9/2007 10:58:19 PM , Rating: 3
If by "milking" you mean "making profits while they develop the next generation of cards," then NVidia has been milking the situation for the past 10+ years.

Sometimes they have a good year (8800), sometimes they have a bad year (FX5900). NVidia isn't "milking" anything, because it's not resting on its 8800 line. NVidia did not fire its design team, pocket their salaries, and decide to sell 8800s for the next five years.


RE: Sub-titans...
By mushi799 on 10/10/2007 5:23:41 AM , Rating: 3
It's milking by today's standards. The current video war, or lack thereof, is dismal. Usually by now we get hints/previews of the next generation, but none so far.


RE: Sub-titans...
By retrospooty on 10/10/2007 9:48:11 AM , Rating: 5
"If by "milking" you mean "making profits while they develop the next generation of cards,""


No, milking means they are not pushing the next gen out as fast as they could have and would have if AMD had a viable competing product; thus they are "milking" profit out of the current chip. See Intel prior to the Athlon for reference.


RE: Sub-titans...
By daniyarm on 10/10/2007 11:30:25 AM , Rating: 1
So you would rather see Nvidia release a new GPU and then try to get the drivers right for the next year or so? I'd rather have them offer the whole solution at once. I think they learned their lesson with 8800. You can't rush a product to market when software support is in beta stages.


RE: Sub-titans...
By retrospooty on 10/10/2007 10:30:34 PM , Rating: 3
They need to do both, like they used to; it's not either/or. The driver team is not related to the hardware team. The drivers suck now because they let a lot of the talent go, either through layoffs or attrition, and they did not replace the talent that left the department.


RE: Sub-titans...
By 1078feba on 10/10/2007 12:50:56 PM , Rating: 2
Hopefully they're making profits. $425,000,000 in G80 R&D is quite a bit to recoup.

Although I can see the counterpoint: a lack of direct competition at the highest level most assuredly would lead a company to at least sit back and take a deep breath before rolling up its sleeves and digging in again.

The flip side of this is that with the 8800 series now so prevalent, with so many sold, it gives NVidia serious impetus to keep working on the drivers to get the absolute max performance out of them...one would think.

Mike Magee, the old geezer who runs the Inquirer and also has a weekly column in CPU magazine, used his space in November's issue (that's right, November, subscribers get it early) to lay it on pretty thick for Intel's upcoming Larrabee.

http://en.wikipedia.org/wiki/Intel_Larrabee

http://arstechnica.com/news.ars/post/20070917-inte...

http://arstechnica.com/articles/paedia/hardware/cl...

Times, they are a-changin'.


RE: Sub-titans...
By thartist on 10/10/2007 4:12:45 PM , Rating: 2
You don't have to take it to such an extreme. nVIDIA is not holding still in development, but it has definitely slowed down on some fronts.

Do the thinking yourself: do you think we would be under such frozen videocard line-ups if the AMD/ATI catastrophe hadn't occurred?


RE: Sub-titans...
By Ryanman on 10/14/2007 2:57:34 PM , Rating: 1
Yeah, they are milking. And they have every damn right to do so.


RE: Sub-titans...
By kilkennycat on 10/10/2007 5:19:46 AM , Rating: 2
The acquisition of ATi was (imho) the biggest blunder AMD ever made. Going several billion dollars into debt for a company with a consistent record of non-performance on meeting promised new product delivery schedules. And ATi has continued to non-perform in that department up to this day. This has severely hurt AMD's bottom line, besides the interest payments on the debt. With the imminent arrival of Penryn and the closely-following low-power mobile products on the 45nm Intel process, AMD is going to be in a world of hurt.


RE: Sub-titans...
By murphyslabrat on 10/10/2007 12:45:25 PM , Rating: 2
AMD is about 75% smaller than Intel. When you are in that kind of position, you cannot muscle your way up the ladder without a vastly superior product, and "playing it safe" will not get you there. You have to take risks, but sometimes they hurt.

AMD's K9 processors are an example of a failed gamble. Designed to be massively parallel in preparation for excellent multitasking, they performed poorly and sucked energy. And this failure was costly, though to what degree is currently unknown (at least to me; if anyone has any articles, please share).

AMD has also taken a risk with the acquisition of ATI, the intent of which is apparent, as anyone who has dredged through their hordes of "pretty pictures detailing market dominance" can tell you: a complete GPGPU solution, integrated into the CPU core. This would be an excellent solution, as it would make PC buying simpler for an uneducated consumer and make integrated graphics redundant for an AMD box. This is in addition to the immediate benefits, namely revenue from ATI's customer base, as well as potentially doubling the gross income. This would encourage investment in AMD, as it would appear to be twice the company it was before.

Obviously, as you say, it was an immensely risky move; and one that, as of yet, is a greater cause for gray hairs than old age is for many of those involved. With the volatile nature of this market, you cannot afford downtime, as a few bad years can ruin the next decade. Three years ago would have been an excellent time to perform this venture, as AMD was gaining market share and ATI contrasted a horrible year (the X000 series) with an immensely successful period immediately prior. Furthermore, Intel had its head up its butt with NetBurst, and is only now recovering from the poor image it had generated in the enthusiast market.

However, they didn't. They only attempted this venture as the horizon started to look particularly bleak. Whether this is too little, too late remains to be seen; but nonetheless, as evidenced by VIA, AMD is going to be around for a while yet.


RE: Sub-titans...
By RedFlyer7 on 10/11/2007 12:06:10 AM , Rating: 3
No doubt everyone and their grandma was expecting the G92 codename to be their new flagship 9800 XXX series product, released in November to coincide with the Intel Penryn CPU release, followed by the usual 6-month trickle-down of the new technology into their mid-range and low-end products for the new generation. Obviously all parties are disappointed except for maybe AMD/ATI. This is somewhat reflected in nVidia's recent share price fluctuations once the 8800GT news was exposed.

Milking the tech is OK, that's business, but hopefully they are not sitting on their laurels. Lessons from the past demonstrated how ATI dominated with the 9700 Pro and subsequently lost it for the next 4 years with the intro of the nVidia 6800 series -> 7800 series and 8800 series. For 3 generations they have lost the battle to nVidia, and though nVidia holds the crown now, the balance can shift on a dime. Any of you old geezers remember S3? S3 once sold more video chips than Intel. How about Cirrus Logic? Trident? All once wore the crown and dominated the field, yet where are they now? This leads to the next question: what of the future?

AMD's acquisition of ATI may seem like a bad move at the moment because there are no immediate benefits, but in the long run this is a great move. The dream of system on a chip (SOC) is one step closer. Even with the power of today's Core 2 Duo Penryns, the cost/performance ratio of adding more cores to a chip drops dramatically after going beyond 4 cores, except on a limited range of specialty applications.

"The next evolutionary step is to have multiple CPU cores side by side with multiple GPU cores and possibly other additional custom DSP cores like physics processors."

This is where you will get the most performance in the future. Let's examine the position of each horse in this race:

1. AMD/ATI have embarked on this road already with their Fusion project. From the CPU+GPU-on-the-same-die perspective they are way ahead, since they have both mature and cutting-edge CPU and graphics products. Chipset-wise they were dead last behind Intel and nVidia, but that should not be difficult to remedy. The great leap forward has already begun for them, although in the immediate future they may have to stomach Intel's dominance in the CPU world for another year, and nVidia's in the GPU world.

I am an nVidia fan and have 7 cards from their families, but I have great faith in AMD/ATI. After all, wasn't it AMD who came out with the Athlon when the Pentium was king? Wasn't it AMD who pioneered the on-die memory controller to increase performance? Wasn't it AMD who created the HyperTransport model for improved motherboard I/O instead of just plain FSB+PCI I/O? Wasn't it AMD who created the x86-64 instruction set to support 64-bit OSes like Vista x64, which Intel was later forced to copy and support, even to this day in their Core 2 Duo line? And don't forget it was AMD who first introduced dual-core to the world. The need for survival brings out the best in this Darwinian world, and the case remains for AMD to be the cutting-edge leader once again. Time is of the essence, because if you look farther into the future, they have the lead, although it may not seem like it. Let the price-cutting for survival begin until Fusion is released.

2. Intel. Where to begin. Their history is well known. Their strength is not innovation unless there is competition (thank you, AMD). What they do have is second-to-none manufacturing prowess. They can sweep the competition under a tidal wave of CPUs, chipsets, etc. They have been dominating for the past 2 years with the release of the Core Duos and Core 2 Duos. They release their chips on better, lower-micron tech faster and in more quantity than anyone else. AMD, however, can come back strong by using world-class manufacturers like TSMC or UMC, who are no slouches and can give Intel a run for its money in terms of technology and mass production capabilities.

Intel's CPU tech, although not innovative, copies and enhances the original in some cases. Remember when the Japanese used to be known for being able to copy and make better products than the US? Intel has a mature CPU platform, but their weakness is graphics. Mostly they have been content in the last 2 decades to make integrated low-performance cores. Now, to compete, they may be forced overnight to make high-end GPUs, if not leading then at least on a comparable level to ATI/nVidia. Why not just snap up nVidia then? Simply because, as big as Intel is, they cannot afford nVidia. That is why they are frantically trying to build a graphics team from the ground up, and they will use any means necessary to hire the best engineers.

3. nVidia is the current king of the non-integrated graphics market, and can possibly eat some of the integrated market share with their recent IGP releases. They are the undisputed leader at the moment, but what they lack is a general CPU product. We have already seen that in the graphics market the crown can pass from one company to another in the blink of an eye. What are their options? Cuddle up in bed with Intel or one of the other CPU makers, such as Sun's Niagara, or even buy a platform like Sony's Cell chip to create that dream of an SOC with CPU+GPU on the same die. However, this is all unlikely, as it is the hardest mountain to climb to convince software developers to support yet another x86 architecture. Some people may say nVidia is in the worst position and could be relegated to second or third tier, just supplying chips to Intel. Yet somehow I doubt nVidia's CEO, good ole Jen-Hsun, will allow that. nVidia simply needs to continue what they do best: run a tight ship and produce the best products at a lower price.


RE: Sub-titans...
By NT78stonewobble on 10/11/2007 3:18:49 PM , Rating: 2
quote:
Lessons from the past demonstrated how ATI dominated with the 9700 Pro and subsequently lost it for the next 4 years with the intro of the nVidia 6800 series -> 7800 series and 8800 series. For 3 generations they have lost the battle to nVidia, and though nVidia holds the crown now, the balance can shift on a dime.


Well, actually, the X800s and X850s were superior compared to the 6800s.

ATI was down again when it was the X1800 versus the 7800s, and ahead again with the X1900-X1950s.


RE: Sub-titans...
By 3kliksphilip on 10/13/2007 5:50:18 PM , Rating: 2
The X800s are now redundant due to being limited to SM2.0 support. I think that the X1950 Pro is ATI/AMD's lifeline. It's a fantastic mid-range product which kicks the GeForce 8600 / ATI X2600... in the face.


RE: Sub-titans...
By Hlafordlaes on 10/12/2007 5:48:08 PM , Rating: 2
Agreed. Had the odd thought that a VIA-NVidia linkup might be mutually beneficial. VIA's got the old x86 CPU from Cyrix, and has made inroads in the mini-ITX market. However, they are about to get their lunch eaten as Intel is launching an SFF initiative. VIA's boards are OK for general-purpose computing, but it's the poor integrated graphics that are really hurting VIA's ability to drive its mini-ITX into the set-top and HTPC market. A deal with NVidia would seem wise. As for NVidia taking advantage of VIA's CPUs, I'll let more knowledgeable posters punt on that. I'd love to think that such a match-up could become a third option to Intel/AMD, even if in niche form factors.


RE: Sub-titans...
By sxr7171 on 10/9/07, Rating: -1
RE: Sub-titans...
By retrospooty on 10/10/2007 9:50:20 AM , Rating: 2
Intel has a viable competitor with AMD. The A64s are not far behind, and the Phenom may well be a bit ahead when they fix the yield issues. Intel milked the hell out of us before the Athlon, and would do it again if AMD wasn't producing competitive chips. Yes, the A64 is behind, but not by far, especially when you look at the price/performance ratio.


RE: Sub-titans...
By GTaudiophile on 10/10/07, Rating: 0
RE: Sub-titans...
By JWalk on 10/28/2007 9:17:27 PM , Rating: 2
Right. And I am sure that if ATI had "won the war" then they would have continued to innovate at a fast pace, right? Nvidia is doing what any company would do. They have the opportunity to slow things down a bit and take their time developing their next products. Am I personally happy that there aren't new GPUs coming as fast and cheap as before? No. But I don't blame Nvidia for doing a good job.


RE: Sub-titans...
By ultimatebob on 10/11/2007 10:38:07 PM , Rating: 2
Unfortunately. I really wish that S3 or Matrox would get back into the gaming graphics card market, since we really need some competition on the high end.


RE: Sub-titans...
By Farfignewton on 10/13/2007 6:13:57 PM , Rating: 2
quote:
I really wish that S3 or Matrox would get back into the gaming graphics card market, since we really need some competition on the high end.


I've been watching the 3d accelerator market since the 3dfx voodoo, but I guess I must have blinked or something and missed the entire high end S3 and/or Matrox period. Matrox was never more than mid-level that I can recall, and if S3 was still in the game they'd probably be pushing something worse than integrated graphics, like video acceleration through USB 1. ;)


RE: Sub-titans...
By Martimus on 10/15/2007 3:03:50 PM , Rating: 2
Matrox was a major player before the 3D accelerator market took off. They were often considered the premier 2D graphics card producer. Even early this decade, some considered them to make the best dual-monitor video cards on the market.


RE: Sub-titans...
By jabber on 10/16/2007 12:58:36 PM , Rating: 2
Indeed, around 1998 the setup du jour was a Matrox Millennium II partnered with a 3dfx Voodoo2. It didn't get much better than that!

Though I was running with a Matrox Mystique 330(?) and a Matrox M3D at the time. The game support wasn't so good, but the image quality from the PowerVR setup was better.

Happy days.


"And boy have we patented it!" -- Steve Jobs, Macworld 2007













