ATI reference design for Radeon X1950
Ready or not, here comes GDDR4

This week ATI sent an advisory out to its OEM partners announcing the details of the new Radeon X1950 and X1900 graphics cards.  Both of these new cards are based on the same R580 core, but with some fundamental differences.

R580, the 48-pixel-shader-processor version of the R520 (Radeon X1800), was announced this past January.  R580 features a robust memory controller capable of utilizing several different types of memory, including GDDR4, which was not even available when the Radeon X1900 was first announced.  Since then, Hynix and Samsung have both jumped on the GDDR4 train, with revenue shipments beginning several weeks ago.  The new GDDR4 variants of R580-based Radeons are now called Radeon X1950.  Radeon X1950 retains all of the features of the Radeon X1900; the only additions are a new cooler, GDDR4 memory and different clock frequencies.

Radeon X1950 at launch will come in two flavors: a higher-clocked "XTX" version and a CrossFire version.  Both cards feature 512MB of GDDR4, and the only major difference between the two is that the CrossFire X1950 houses the compositing engine and input interfaces for CrossFire.  Just yesterday, ATI issued an advisory to its partners claiming "Clock frequencies for RADEON X1950 family products are pending and will be provided at a later date."  However, in March of this year ATI released a new policy allowing AIB partners to overclock X1000-series cores at their own discretion.  While we can already confirm some partners are planning 650MHz core versions, there is still a distinct possibility that higher-clocked cards are also in the works.  Memory clock frequencies have not been announced either, though Samsung has announced its GDDR4 is already capable of 3.2GHz in 8x512Mbit configurations.
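As a rough sanity check on those figures, a back-of-the-envelope sketch of what an 8x512Mbit GDDR4 configuration implies for capacity and theoretical peak bandwidth appears below. The 256-bit bus width is an assumption carried over from existing R580 boards, and the 3.2GHz figure is Samsung's quoted per-pin data rate, not an announced X1950 memory clock.

# Back-of-the-envelope GDDR4 math; bus width and data rate are assumptions, not ATI specs.

def capacity_mb(chips, density_mbit):
    """Total frame buffer in megabytes for a given chip count and per-chip density."""
    return chips * density_mbit // 8

def peak_bandwidth_gb_s(data_rate_gbps_per_pin, bus_width_bits):
    """Theoretical peak bandwidth in GB/s: per-pin data rate times bus width in bytes."""
    return data_rate_gbps_per_pin * bus_width_bits / 8

print(capacity_mb(8, 512))              # 8 x 512Mbit chips = 512 MB
print(peak_bandwidth_gb_s(3.2, 256))    # 3.2Gbps/pin on an assumed 256-bit bus = 102.4 GB/s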

The new Radeon X1900XT 256MB is a lower-cost version of the existing Radeon X1900XT that uses only 256MB of GDDR3, allowing the card to hit the $300 price point.  The Radeon X1900XT 256MB will use the same clock frequencies as other Radeon X1900XT cards: 625MHz core and 1.45GHz memory.
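By the same arithmetic, and again assuming the 256-bit memory bus used on existing R580 boards, the 1.45GHz effective GDDR3 data rate works out to roughly 46.4GB/s of peak bandwidth, which gives a sense of the headroom GDDR4 offers:

# Peak bandwidth comparison; the 256-bit bus width is an assumption, not an announced spec.
gddr3_peak = 1.45 * 256 / 8   # Radeon X1900XT 256MB at 1.45GHz effective: 46.4 GB/s
gddr4_peak = 3.2 * 256 / 8    # Samsung's quoted GDDR4 data rate: 102.4 GB/s
print(gddr3_peak, gddr4_peak)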

ATI's advisory documentation claims the Radeon X1950XTX will begin sample availability on August 7, with the CrossFire sampling beginning exactly one week later. Sampling of the Radeon X1900XT 256MB will begin immediately.

Radeon X1900 and X1950 will eventually be replaced by another ASIC, dubbed R600.  R600 is expected to be built on an 80nm process, with new design features above and beyond the R520 and R580 series.


Comments

RE: Is a question allowed?
By NextGenGamer2005 on 7/21/2006 8:04:10 PM , Rating: 4
Two reasons: 1) GPUs are much more complicated (ATI's R580 has 384 million transistors, compared to roughly half that for most CPUs). 2) When a GPU is being used, almost all of its transistors are being used. A 16-pipeline card has all 16 pipelines working constantly. This is in stark contrast to a CPU, where really only 25-50% is actually "on" and working at any one time.

CPUs are also typically made using more advanced techniques. For instance, Intel has been using 65-nm since January, whereas both NVIDIA and ATI are still using 90-nm, with 80-nm right around the corner. By the time they switch to 65-nm in 1H 2007, Intel will be sampling 45-nm products for a 2H 2007 release.

BTW, if AMD does buy ATI, then ATI will have a huge advantage over NVIDIA in the manufacturing department, since I would assume that ATI would use AMD's fabs (which switch to more advanced processes faster than TSMC or UMC).


RE: Is a question allowed?
By Tyler 86 on 7/21/2006 8:10:47 PM , Rating: 2
*cough*

Like CPUs, GPUs are 10-50% on during idling.
Like CPUs, GPUs are 100% on under load.

Construction, size, complexity, and activity are the limiting factors in clock speed.

Smaller manufacturing processes do not necessarily give more clock speed headroom.
Smaller manufacturing processes make the chips cheaper to manufacture.


By slashbinslashbash on 7/23/2006 2:20:47 AM , Rating: 3
I don't have exact numbers, but out of Conroe's 291M transistors, at least 2/3rds of them are cache... based on fuzzy memories of transistor counts for previous processors, the actual CPU cores (cache excluded) of Conroe should take up no more than 50M-75M transistors. Compare that to GPUs, which are over 300M transistors with relatively little of that being cache. That alone will tell you how much more complicated a GPU is compared to a CPU.
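A quick way to sanity-check that estimate is to count the SRAM cells directly. The sketch below assumes Conroe's 4MB shared L2 cache and standard 6-transistor SRAM cells, and ignores tag and control overhead, so it is an illustration rather than Intel's published breakdown.

# Rough estimate of Conroe's cache transistor budget (assumes 4MB L2, 6T SRAM cells).
l2_bits = 4 * 1024 * 1024 * 8                  # 4MB shared L2 expressed in bits
cache_transistors = l2_bits * 6                # ~201 million transistors for the data arrays alone
total_transistors = 291_000_000
print(cache_transistors / total_transistors)   # ~0.69, consistent with the "at least 2/3rds" figure
print(total_transistors - cache_transistors)   # ~90 million left for the two cores and everything else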


RE: Is a question allowed?
By Araemo on 7/26/2006 11:58:43 AM , Rating: 2
Another reason (related to the number of transistors):
A GPU generally does more mathematical work per cycle than a CPU does. A CPU generally has 2-6 ALUs that perform one or two additions, multiplications, etc. per clock. That's a maximum of about 12 ops per clock, and they're relatively simple ops.

A GPU has between 8 and 16 pipelines (do any have 32 yet? I think some NVIDIA ones have 24?), each of which has multiple functional units, with each unit performing a different operation every single clock. Timing all of that to run reliably takes more physical time, and the clock speed is limited by the time it takes every functional unit in the pipeline to do one cycle of work. CPUs have fewer functional units to tune, so it's easier to get the timing down, especially in a reliable manner.
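To put some illustrative numbers on that per-clock gap, the toy comparison below uses the ranges from the comment; the unit counts are examples rather than measured specs for any particular chip.

# Toy per-clock throughput comparison using the comment's illustrative figures.
cpu_alus = 6                   # high end of the 2-6 ALU range
cpu_ops_per_alu = 2            # one or two simple ops each per clock
cpu_ops_per_clock = cpu_alus * cpu_ops_per_alu           # = 12

gpu_pipelines = 16             # the 8-16 range cited (some NVIDIA parts had 24)
units_per_pipeline = 4         # "multiple functional units" per pipeline; assumed value
gpu_ops_per_clock = gpu_pipelines * units_per_pipeline   # = 64

print(cpu_ops_per_clock, gpu_ops_per_clock)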


RE: Is a question allowed?
By Araemo on 7/26/2006 11:59:50 AM , Rating: 2
Oh, and another thing: since the CPU is using simpler math, it takes more cycles to do the same complexity of work. That is why it's faster to run a game on your 500MHz GPU than on your 3000MHz CPU; the GPU is tuned for one specific kind of work, so it doesn't need as high of a clock speed.
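Folding in the clock speeds mentioned in the comment gives a rough sense of why the lower-clocked GPU can still come out ahead on its specialized workload. The per-clock figures carry over from the illustrative sketch above, and the 4-wide vector math typical of shader pipelines of that era (an assumption, not a quoted spec) widens the gap further.

# Ops-per-second comparison at the clock speeds mentioned in the comment.
cpu_ops_per_clock = 12                  # illustrative figure from the sketch above
gpu_ops_per_clock = 64                  # 16 pipelines x 4 units, illustrative
vector_width = 4                        # era-typical vec4 shader math; an assumption

cpu_throughput = cpu_ops_per_clock * 3.0e9                    # 3000MHz CPU: 36 billion ops/s
gpu_throughput = gpu_ops_per_clock * vector_width * 0.5e9     # 500MHz GPU: 128 billion ops/s
print(cpu_throughput, gpu_throughput)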

Related Articles
Samsung Shipping Production GDDR4
July 5, 2006, 10:00 AM
Hynix to Refocus on Graphics Memory
May 30, 2006, 2:46 PM
ATI's New Stance On Overclocking
March 30, 2006, 12:15 PM