
Details of AMD's next generation Radeon hit the web

Newly created site Level 505 has leaked benchmarks and specifications of AMD’s upcoming ATI R600 graphics processor. The graphics processor is expected to launch in January 2007, with a revision arriving in March 2007. These early specifications and launch dates line up with what DailyTech has already published and are present on ATI internal roadmaps as of workweek 49.

Preliminary specifications from Level 505 of the ATI R600 are as follows:
  • 64 4-Way SIMD Unified Shaders, 128 Shader Operations/Cycle
  • 32 TMUs, 16 ROPs
  • 512 bit Memory Controller, full 32 bit per chip connection
  • GDDR3 at 900 MHz clock speed (January)
  • GDDR4 at 1.1 GHz clock speed (March, revised edition)
  • Total bandwidth 115 GB/s on GDDR3
  • Total bandwidth 140 GB/s on GDDR4
  • Consumer memory support 1024 MB
  • DX10 full compatibility with draft DX10.1 vendor-specific cap removal (unified programming)
  • 32FP [sic] internal processing
  • Hardware support for GPU clustering (any x^2 [sic] number, not limited to Dual or Quad-GPU)
  • Hardware DVI-HDCP support (High-bandwidth Digital Content Protection)
  • Hardware Quad-DVI output support (Limited to workstation editions)
  • 230W TDP PCI-SIG compliant
This time around, it appears AMD is taking a different approach by equipping the ATI R600 with fewer unified shaders than NVIDIA’s recently launched GeForce 8800 GTX. However, the unified shaders found on the ATI R600 can complete more shader operations per clock cycle.

ATI's internal guidance states the R600 will have 320 stream processors at launch; 64 4-way unified shaders account for only 256 of these stream processors.
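
The gap is simple arithmetic. As a quick sanity check, here is a minimal sketch in Python, assuming each "stream processor" corresponds to a single ALU lane of a SIMD shader unit (the function name is purely illustrative):

    # Assumption: one "stream processor" = one ALU lane of a SIMD shader unit.
    def stream_processors(shader_units, simd_width):
        return shader_units * simd_width

    leaked = stream_processors(64, 4)   # 64 x 4 = 256
    guidance = 320                      # ATI internal guidance
    print(guidance - leaked)            # 64 stream processors unaccounted for
    # Reconciling the two figures would take either 80 4-way units or 64 5-way units.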

Level 505 claims AMD is expected to equip the ATI R600 with GDDR3 and GDDR4 memory, with the GDDR3-equipped model launching in January. Memory clocks have been set at 900 MHz for GDDR3 models and 1.1 GHz for GDDR4 models. As recently as two weeks ago, ATI roadmaps indicated this GDDR3 launch had been canceled. These same roadmaps put the production date for the R600 at February 2007, which would be after a January 22 launch.

Memory bandwidth of the ATI R600 is significantly higher than that of NVIDIA’s GeForce 8800 series. Total memory bandwidth varies from 115 GB/s on GDDR3-equipped models to 140 GB/s on GDDR4-equipped models.
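
Those figures check out against the 512-bit bus. A quick back-of-the-envelope calculation in Python, assuming double-data-rate memory (two transfers per clock), reproduces both numbers:

    # Peak bandwidth = (bus width in bytes) x (memory clock) x (transfers per clock)
    def bandwidth_gb_s(bus_width_bits, memory_clock_mhz, transfers_per_clock=2):
        return (bus_width_bits / 8) * (memory_clock_mhz * 1e6 * transfers_per_clock) / 1e9

    print(bandwidth_gb_s(512, 900))    # ~115.2 GB/s, the quoted GDDR3 figure
    print(bandwidth_gb_s(512, 1100))   # ~140.8 GB/s, the quoted GDDR4 figure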

Other notable hardware features include support for quad DVI outputs, though utilizing all four outputs is limited to FireGL workstation edition cards.

There is also integrated support for multi-GPU clustering technologies such as CrossFire. The implementation on the ATI R600 allows any number of R600 GPUs to operate together in powers of two. Expect multi-GPU configurations with more than two GPUs to be available only for workstation markets, though.
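
As a rough illustration only (the helper below is hypothetical and not taken from ATI's drivers or documentation), the rule as described boils down to a simple check:

    # Hypothetical check of the clustering rule as described above: power-of-two
    # GPU counts are allowed, but more than two GPUs is workstation-only.
    def cluster_allowed(gpu_count, workstation=False):
        is_power_of_two = gpu_count >= 1 and (gpu_count & (gpu_count - 1)) == 0
        return is_power_of_two and (gpu_count <= 2 or workstation)

    print(cluster_allowed(2))                    # True  (consumer CrossFire)
    print(cluster_allowed(4))                    # False (workstation-only)
    print(cluster_allowed(4, workstation=True))  # True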

The published results are very promising, with AMD’s ATI R600 beating out NVIDIA’s GeForce 8800 GTX in most benchmarks. The performance delta varies from 8% up to 42%, depending on the game benchmark.

When DailyTech contacted the site owner to get verification of the benchmarks, the owner replied that the benchmark screenshots could not be published due to origin-specific markers that would trace the card back to its source -- the author mentioned the card is part of the Microsoft Vista driver certification program.

If Level505's comments seem a little too pro-ATI, don't be too surprised.  When asked if the site was affiliated in any way to ATI or AMD, the owner replied to DailyTech with the statement that "two staff members of ours are directly affiliated with AMD's business [development] division."


Comments



THE GUY IS LYING...
By tognoni on 12/31/2006 10:47:28 AM , Rating: 0
Hi... there are way too many gaps in this guy's story for it to be deemed reliable. This is the email I just sent him. I am visibly irritated, mostly because tomorrow I was going to buy an NVidia 8800GTX, and obviously now I will have to at least wait, in case the benchmarks have a truthful foundation...

Hi,

I just read your seemingly objective review of the ATI R600 card. I don’t know if the benchmarks are real or not…but it’s pretty obvious to me that you are lying about everything else.

I also don’t know if you made up this article just to gain attention and have a laugh, or if you’re paid by ATI (this article will definitely slow down sales of the 8800 until the ATI card reaches the market, which is probably going to be quite a bit later than you say, knowing ATI), or if ATI is just using you as the voice of an enthusiast to give a “reliable” view of its new card, or finally if you just wanted to quickly generate a huge amount of traffic on your site hoping to cash in on a quick advertising buck. All this I don’t know, but let’s look at the facts:

1) you are no hardware review site (or any other type of site at all): you have nothing on the site other than this article.

2) you keep talking about driver improvements, Vista, blah blah which will improve the performance of ATI’s card. At least this shows you’re biased towards ATI (which by the way is a lot worse than NVidia at developing drivers…)

3) You talk of a completely new ATI architecture coming out at the end of the year. No single graphics company has ever introduced 2 graphics generations in one year…highly unlikely.

4) you get a sample card and have a chance to be the first one in the world to publish results of the most awaited card of the year, and yet you DON’T test at resolutions higher than 1600x1200, which is where you could really assess the performance of the card (that would at least suggest that somebody passed you the benchmarks, and you in fact do not have the card at your disposal)

5) MUCH more importantly, you say that you cannot publish images of the card because you can’t remove marks that would give away the source of the card. PLEASE, this is bull, you could have come up with a more inventive excuse! Apart from physical ways to cover the marks…have you ever heard of Photoshop? You could remove the marks digitally in minutes…

I was just about to buy an NVidia 8800GTX card. Given that I must give you the benefit of the doubt, I will now wait for a reliable source to publish benchmarks. Hopefully you will not have wasted my time.





RE: THE GUY IS LYING...
By Slappi on 12/31/06, Rating: 0
By KristopherKubicki (blog) on 12/31/2006 12:29:24 PM , Rating: 2
quote:
Yah I was just about to buy a new 8800gtx now I am going to wait.... if this guy turns out to be lying I will spend at least the next few months bashing his site.

Well, he openly states AMD employees are on his staff. I don't really know how much staff he could have, but that statement should affect how you feel about these benchmarks.


RE: THE GUY IS LYING...
By masher2 (blog) on 12/31/2006 12:40:18 PM , Rating: 1
> "This is the email I just sent him, I am visibly irritated..."

If you're getting emotional over video card reports, you might want to relax and get out a bit more :) So what if you buy the 8800? A faster card will come out and beat it...if not in one month, then in 4-5 for sure.

> "No single graphics company has ever introduced 2 graphics generations in one year..."

Actually, this used to be quite common, back in the early days of NVidia. As technology matured, the cycles slowed down. Now, I'm not saying ATI is likely to do so this year. In fact, I find it rather unlikely. But the fact remains, it's been done before.


RE: THE GUY IS LYING...
By arturnowp on 12/31/2006 2:49:58 PM , Rating: 2
A new card and a new architecture are different things. It's possible that there will be a G81 or G85 by the end of 2007, but not a whole new GPU like the G90, not to mention the R700 or something. That happens every 2 years or so.


By KristopherKubicki (blog) on 12/31/2006 2:58:07 PM , Rating: 3
G80 was Nov 8, 2006
G70 was June 22, 2005
NV40 was April 15, 2004
NV30 was Jan 27, 2003


RE: THE GUY IS LYING...
By masher2 (blog) on 12/31/2006 6:39:37 PM , Rating: 2
> "New card and new architecture is diferent thing"

I know that. However, in the early days of GPUs, new architectures came much faster. The Riva TNT (NV4) came out in late 1998. The TNT2 (NV5) was early 1999. The GeForce 256 (NV10) was late 1999.

The OP's comment that "no one has ever" released two new architectures in less than a year is just plain wrong.


RE: THE GUY IS LYING...
By coldpower27 on 1/1/2007 2:26:24 PM , Rating: 2
The TNT and TNT2 would be considered 1 full generation.

The GeForce 256, with the GeForce 2 GTS and Ultra as its refreshes, would be considered the new architecture in relation to that.

I would lump GeForce 3 and GeForce 4 Ti technology together, then GeForce FX on its own, then GeForce 6, and then GeForce 7, though the GeForce 7 series didn't add that much functionality-wise compared to the GeForce 6; it was mainly performance improvements and cost benefits.


RE: THE GUY IS LYING...
By StevoLincolnite on 1/6/2007 10:50:32 AM , Rating: 1
The TNT2 core is almost identical to its predecessor, the RIVA TNT; however, updates included AGP 4X support, up to 32MB of VRAM, and a process shrink from 0.35 µm to 0.25 µm.

So yes, I would consider it a full generation.

The same goes for the GeForce 256 and GeForce 2.
The GeForce 256 was severely memory-bandwidth constrained.

And the same for the GeForce 3 and 4 (except for the MX cards, which were based on the GeForce 2 core).


The GeForce 2 Ultra could even outperform the early GeForce 3 models that were released, that is, until the Ti 500 came about.

Oh, and the GeForce 2 had pixel shaders :) But they were fixed-function, and thus rarely used outside of tech demos.

The GeForce 3, however, added programmable pixel shaders and vertex shaders (1.1, I believe?) and LMA, and the last change was the anti-aliasing, which went from supersampling to multi-sampling.

NVIDIA never released a low-cost version of the GeForce 3; GeForce 2s were still selling like hotcakes.

The GeForce 4 mainly just brought higher clock speeds, improved vertex and pixel shaders, and some more reworking of the anti-aliasing.

The GeForce 5/FX was the first time NVIDIA combined the 3dfx team and their own to design the chip.
The only way NVIDIA was able to remain competitive with the Radeon series was to optimize the drivers to the extreme. They used these "optimizations" to reduce trilinear filtering quality, apply shader "hacks," and make the card not render parts of a world. The drivers looked for specific software and would apply the aggressive optimizations, and people noticed that the FX's picture quality was considerably inferior to the Radeon's.

The FX would have been a stand-alone series, and I think everyone wishes for it to be forgotten as a mistake...

The GeForce 6 and 7 are also similar, mainly just adjustments to the number of pixel pipelines and clock speeds.

The GeForce 8, however... Well, I can't say much; the entire line-up hasn't been released, and I am sure everyone has been watching and reading about it like a hawk :)

All in all, I still like NVIDIA's drivers better; they have support right from the TNT to the GeForce 8. That's 10 generations of 3D accelerator support right there!
And the drivers don't feel so... clunky. Mind you, ATI's have come leaps and bounds since the Radeon 8500 days of hell.

All in all, NVIDIA has used parts of all their graphics card line-ups. The GeForce 6 was based on the GeForce FX, but changed a lot of stuff internally. When it was first released, it still used the same manufacturing process as the FX. NVIDIA had a lot of work to do to turn the FX into what it was meant to be: the GeForce 6.


RE: THE GUY IS LYING...
By psychobriggsy on 1/1/2007 7:01:10 PM , Rating: 5
Nice subject, a definite statement of truth in capitals ...

When really it is a rant expressing your opinion. The only valid point is that the site is brand new and thus not trustworthy, but you should always take things like this with a pinch of salt, and specifications change.

ATI's drivers are high quality now - it isn't 2002 anymore - and I rate your post down for persisting in the ATI driver quality myth. And yes, ATI's drivers should be less optimised than nVidia's G80 drivers, as the G80 has been on the market for a few months already - a good thing for nVidia.

Holding off your purchase will save you money or get you better quality, or you could just put down the money and have a G80 tomorrow and ignore the inevitable improved products coming in the near future.


"This is about the Internet.  Everything on the Internet is encrypted. This is not a BlackBerry-only issue. If they can't deal with the Internet, they should shut it off." -- RIM co-CEO Michael Lazaridis











botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki