
ATI reference design for Radeon X1950
Ready or not, here comes GDDR4

This week ATI sent an advisory out to its OEM partners announcing the details of the new Radeon X1950 and X1900 graphics cards.  Both of these new cards are based on the same R580 core, but with some fundamental differences.

R580, the 48-pixel-shader successor to R520 (Radeon X1800), was announced this past January. R580 features a robust memory controller capable of utilizing several different types of memory, including GDDR4, which was not even available when the Radeon X1900 was first announced.  Since then, Hynix and Samsung have both jumped on the GDDR4 train, with revenue shipments beginning several weeks ago.  The new GDDR4 variants of R580-based Radeons are now called Radeon X1950.  Radeon X1950 will retain all of the features of the Radeon X1900; its only added benefits are a new cooler, GDDR4 memory and different clock frequencies.

Radeon X1950 at launch will come in two flavors: a higher-clocked "XTX" version and a CrossFire version.  Both cards feature 512MB of GDDR4, and the only major difference between the two is that the CrossFire X1950 houses the compositing engine and input interfaces for CrossFire. Just yesterday, ATI issued an advisory to its partners claiming "Clock frequencies for RADEON X1950 family products are pending and will be provided at a later date."  However, in March of this year ATI released a new policy allowing AIB partners to overclock X1000-series cores at their own discretion. While we can already confirm some partners are planning 650MHz core versions, there is still a distinct possibility that higher-clocked cards are also in the works. Memory clock frequencies have not been announced either, though Samsung has announced its GDDR4 is already capable of 3.2GHz in 8x512Mbit configurations.
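For context, the memory figures above can be sanity-checked with some quick arithmetic. The sketch below is a rough calculation, not an announced spec: it assumes the X1950 keeps the 256-bit memory bus of the R580-based Radeon X1900, and treats Samsung's 3.2GHz (effective) rating as an upper bound rather than a confirmed clock.

```python
# Hedged sketch: capacity and theoretical peak bandwidth for the rumored
# X1950 memory configuration. The 256-bit bus is an assumption carried
# over from the R580-based Radeon X1900; 3.2 Gbps/pin is Samsung's rated
# maximum for its GDDR4 parts, not an announced X1950 memory clock.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Eight 512Mbit chips -> 8 * 512 Mbit / 8 bits-per-byte = 512 MB of GDDR4,
# matching the 512MB the advisory describes.
chips, density_mbit = 8, 512
total_mb = chips * density_mbit // 8

print(total_mb)                      # 512 (MB)
print(peak_bandwidth_gbs(256, 3.2))  # 102.4 (GB/s)
```

At Samsung's rated ceiling that would be roughly double the bandwidth of the X1900XT's 1.45GHz GDDR3 on the same bus; the shipping clocks could well land lower.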

The new Radeon X1900XT 256MB is a lower-cost version of the existing Radeon X1900XT that uses only 256MB of GDDR3, giving the card access to the $300 price point.  It will use the same clock frequencies as other Radeon X1900XT cards: 625MHz core and 1.45GHz memory.

ATI's advisory documentation claims the Radeon X1950XTX will begin sample availability on August 7, with the CrossFire sampling beginning exactly one week later. Sampling of the Radeon X1900XT 256MB will begin immediately.

Radeon X1900 and X1950 will be replaced by another ASIC core, dubbed R600.  R600 is expected to be 80nm with new design features above and beyond the R520 and R580 series.

Let's have a round of applause...
By Engine of End on 7/21/2006 9:47:59 PM , Rating: 2
For robust power requirements! I am getting rather tired of the ever-increasing power requirements of these cards. It's only going to get worse when the DirectX 10 cards come to town. Over 300W under load? No thanks.

ATi/nVidia should be following Intel's/AMD's example; focusing on lowering power requirements.

RE: Let's have a round of applause...
By irsyz on 7/21/2006 10:17:03 PM , Rating: 2
they need to have their own Prescott scenario before they hit that plateau in design change

RE: Let's have a round of applause...
By Goty on 7/26/2006 5:38:03 PM , Rating: 2

And on a side note, both companies have already stated that they've committed themselves to reducing the thermal envelope of their chips after the first generation of DX10 cards.

By shabby on 7/21/2006 10:57:43 PM , Rating: 3
I could have sworn nvidia focused on efficiency with the 7900; it has fewer transistors than the previous gen, runs cooler and uses less power.

RE: Let's have a round of applause...
By syne24 on 7/22/2006 1:38:00 AM , Rating: 3
Totally agree, 100%.

ATI/Nvidia need to start getting more efficient. That's one of the reasons I'm NOT going Quad-SLI. There is simply no need for a 1KW power supply for a computer to run daily, gaming desktop or not. That is a ridiculous amount of power. They even have the nerve to suggest a dedicated power supply just for the graphics cards. Bottom line is instead of chasing clock speed, they need to add some fine tuning to it. I'd like to see ATI/Nvidia taking power consumption into some serious consideration down the road. Double-stacking GPUs and dual SLI is NOT the solution.

RE: Let's have a round of applause...
By Jkm3141 on 7/26/2006 1:28:01 AM , Rating: 2
Who started double-stacking GPUs?

It's better to have one smart GPU than two stupid ones, based on power consumption and in some cases performance. Granted, single GPUs will of course require lots of power in the future, but if you buy one of those beasts like the R600 you *GASP* might not need SLI to run decent frame rates. Hell, even with an R580 you don't need dual GPUs to do it. My friend with an X1900XT can run anything he wants at any resolution with as much AA as he wants (max res is 1600x1200). This directly contrasts with an nVidia marketing slide I saw saying you HAVE to have SLI to run a game at 1280x1024 with 4xAA, or even some at 1024x768 with 4xAA. I am glad they didn't say you needed Quad-SLI for 1600x1200 with 4xAA. The picture I'm talking about I uploaded to imageshack here:

By Mojo the Monkey on 8/3/2006 4:40:11 PM , Rating: 2
That's marketing, not reality.

Related Articles
Samsung Shipping Production GDDR4
July 5, 2006, 10:00 AM
Hynix to Refocus on Graphics Memory
May 30, 2006, 2:46 PM
ATI's New Stance On Overclocking
March 30, 2006, 12:15 PM


Copyright 2016 DailyTech LLC.