


The latest AMD ATI roadmaps reveal details of more DirectX 10 graphics cards

DailyTech has just finished its briefing on the upcoming AMD ATI Radeon launches.  The company has three major launches over the next few months: codenamed R600, R610 and R630.

The R600, ATI's ultra-high-end Radeon X1950 successor, has a production date scheduled for February 2007.  The card will launch at or around the Cebit 2007 convention in mid-March.  Shipments will follow shortly after.

Our latest roadmaps indicate R600 will support unified shaders and GDDR4 or GDDR3 memory.  GDDR3 versions of the card running revision "A12" silicon appear to be making the rounds in the blogosphere, and select press can even take a sneak peek at the card under embargo here at CES.  The final silicon for R600 will be "A13."

A GDDR4 version of the card will be the flagship launch product. Clock frequencies on any R600 card have not been set officially yet, and will not appear on marketing material until just a few weeks before the launch.

The company has also added the R610 and R630 GPUs to the roadmap.  In the past, ATI has used lower-numbered codenames to denote entry-level products.  We suspect R610 will be the entry-level R600 cutdown, and R630 the mid-range chipset.  The Radeon roadmap puts both of these products on the shelf before June 2007.

All R600-series GPUs from ATI are compatible with DirectX 10.





Does this not seem like a late launch date?
By Einy0 on 1/9/2007 5:01:38 AM , Rating: 2
This seems a bit late. I wonder if this is somehow an effect of doing the Xbox 360 chip. I know Nvidia did the PS3, but that was sort of a modified 7900GTX chip. The Xbox 360 chip is a lot different than anything we've ever seen on a desktop card...




RE: Does this not seem like a late launch date?
By DingieM on 1/9/2007 5:19:30 AM , Rating: 3
From what I have heard, the Xenon has 64 unified pipelines, 16 of which have been disabled due to yield issues. Based on that information, the R600 can be described as a mixture of the R5xx and the Xenon. In this case the R600 has a full 512-bit internal ring bus (a further development of the R5xx) but no embedded DRAM like the Xenon has. There's a question of how the pipelines in the R600 will be grouped: say, 64 parallel pipelines, or groups of 16? While the Xenon has 3 groups of 16 pipelines, 64 independent groups is far more efficient (i.e. groups of 1).

And I think the R600 is not late at all. Vista has yet to arrive at the end of January, and so have the DX10 games. Even then, all of that has to mature.
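[Editor's note: the grouping question above can be illustrated with a toy simulation. The model, its half-cost divergence penalty, and the function name are assumptions for illustration only, not ATI's actual scheduler: pipelines in a lockstep group must execute both sides of a divergent branch, so smaller groups waste less work.]

```python
import random

def mean_utilization(n_pipes, group_size, p_taken, trials=2000):
    """Estimate average pipeline utilization when pipelines run in
    lockstep groups: if pixels within a group take different branch
    directions, the group must execute both paths (modeled here as
    half the group's work being wasted)."""
    groups = n_pipes // group_size
    total = 0.0
    for _ in range(trials):
        busy = 0.0
        for _ in range(groups):
            # count how many pixels in the group take the branch
            taken = sum(random.random() < p_taken for _ in range(group_size))
            if 0 < taken < group_size:
                busy += group_size / 2   # divergent: both paths executed
            else:
                busy += group_size       # coherent: full throughput
        total += busy / n_pipes
    return total / trials
```

Under this toy model, groups of 1 never diverge, so `mean_utilization(64, 1, 0.5)` is exactly 1.0, while `mean_utilization(64, 16, 0.5)` hovers near 0.5 when the branch is a coin flip per pixel.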


RE: Does this not seem like a late launch date?
By Topweasel on 1/9/2007 9:05:52 AM , Rating: 2
I thought the Xenon was the CPU not the video card?

I double checked the CPU was called Xenon. The GPU was called Xenos.

http://en.wikipedia.org/wiki/Xbox_360


By DingieM on 1/9/2007 9:29:45 AM , Rating: 2
Ah thanx, I always mix them up


By Le Québécois on 1/9/2007 5:23:29 AM , Rating: 2
Late, maybe, but if the product is good, I don't think it really matters. Worst case scenario for ATI, Nvidia comes up with a new product not too long after the R600 release, or lowers their prices; either way, we (the consumers) win.

As for your Xbox360 comment: a friend of mine works at EA, on shader programming for the 360 to be precise. At home he happens to have an X1900XT because, while less powerful than the 360's GPU, it is at its core a very similar GPU that he can use to improve or develop new shaders for the game he's working on. So for the Xbox360, ATI took an approach similar to the one Nvidia took on the PS3.


By IntelUser2000 on 1/9/2007 5:39:51 AM , Rating: 2
quote:
Late, maybe, but if the product is good, I don't think it really matters. Worst case scenario for ATI, Nvidia comes up with a new product not too long after the R600 release, or lowers their prices; either way, we (the consumers) win.


Let's see, the products that came out late from their original date and still met expectations. Hmm... hmm... hmm... I can't think of any.

Prescott... NV30... R520...

Now for software it might be different.


RE: Does this not seem like a late launch date?
By RMSe17 on 1/9/2007 9:12:35 AM , Rating: 2
Too bad Prescott was crap due to its high heat and its inability to reach the clock rate that its extra-long pipeline was designed for.


By IntelUser2000 on 1/9/2007 10:22:39 PM , Rating: 2
quote:
Too bad Prescott was crap due to its high heat and its inability to reach the clock rate that its extra-long pipeline was designed for.


Did you happen to figure out that people like you are annoying?? Did you fail to understand what I was trying to point out??

Can you really not figure out what Prescott/NV30/R520 had in common??

Somehow you have a belief that NV30 and R520 were EXCELLENT products.


RE: Does this not seem like a late launch date?
By Sahrin on 1/9/2007 10:11:05 AM , Rating: 2
The R580 (X1900) is a hybrid of the R300-based and R600 architectures. It does not have unified pipelines or any of the non-graphics enhancements that the R600 does, but it is not derived from the "Xenos" GPU. The Xenos is, if you want to think of it hierarchically, the "parent" of the R600 architecture. The R300 "architecture" was discarded when the switch was made to Unified Shaders.

Conversely, the "RSX" from the PS3 is based on the G70, lacking Unified Shaders and the other non-graphics enhancements of the R600/G80 generation. To look at it in simplistic terms, the PS3 has a 7950 GPU, and the X360 has an R600 GPU.

This is no fault of nVidia's. Sony had initially planned to have no graphics chip of any kind in the PS3, assuming that Cell's almighty abilities would suffice for such operations; developers disagreed, however, and at the last moment Sony approached nVidia (ATI was otherwise engaged) and asked them to hack a graphics chip into the console.


By Le Québécois on 1/9/2007 1:11:50 PM , Rating: 2
quote:
The R580 (X1900) is a hybrid of the R300-based and R600 architectures. It does not have unified pipelines or any of the non-graphics enhancements that the R600 does, but it is not derived from the "Xenos" GPU. The Xenos is, if you want to think of it hierarchically, the "parent" of the R600 architecture.


Yes, you are right, but from what I've been told by this friend of mine, that was the closest GPU you could find on the market when he bought the card last spring.


RE: Does this not seem like a late launch date?
By Brand0 on 1/10/2007 5:35:31 AM , Rating: 2
I don't think your "simplistic terms" are accurate.

The PS3's GPU is based on the 7800 GTX. The RSX only has 256 MB of GDDR3 VRAM @ 700 MHz. The 7950 GTX is much more powerful since it is essentially two 7900 GTX cards put together (though it takes two PCIe x16 slots). In theory, even a single 7900 GTX would be faster than the PS3's RSX.

Also, to say that the X360 has an R600 GPU is quite a stretch. The R600 is a full generation ahead of the X360's GPU, and it supports DX10 (besides the fact that it has faster clock speeds, more VRAM, etc.).

At least that's what I've read...


By FITCamaro on 1/16/2007 7:11:28 AM , Rating: 2
The PS3's GPU is based on the 7800GTX, but it is nothing like the 7950GX2 (the dual-GPU card). Nor is the 7950GX2 two 7900GTX cards put together; it is more akin to two 7900GTs put together, but with a slightly faster core and slower memory at stock speeds. And there is no 7950GTX; there is the 7950GT, which is a version of the 7900GT with faster core and memory speeds, available with 256MB or 512MB of RAM.

And yes, a 7900GTX would be faster than RSX, since RSX is crippled to save cost. It only has a 128-bit memory bus vs. a 7800/7900's 256-bit bus, and only 8 ROPs instead of 16. Both are huge bottlenecks.

Read harder.
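[Editor's note: the cost of the narrower bus is easy to put rough numbers on. A back-of-the-envelope sketch, using the 700 MHz GDDR3 figure quoted earlier in this thread (actual RSX memory clocks may differ); the function name is illustrative:]

```python
def peak_bandwidth_gbs(bus_width_bits, mem_clock_mhz):
    """Peak theoretical bandwidth of a DDR memory interface, in GB/s.

    DDR memory transfers data twice per clock; divide by 8 to go from
    bits to bytes, and by 1000 to go from MB/s to GB/s."""
    return bus_width_bits * 2 * mem_clock_mhz / 8 / 1000

rsx_style = peak_bandwidth_gbs(128, 700)   # 128-bit bus: 22.4 GB/s
gtx_style = peak_bandwidth_gbs(256, 700)   # 256-bit bus: 44.8 GB/s
```

At the same memory clock, halving the bus width halves the peak bandwidth, which is why the narrower bus is such a bottleneck at high resolutions.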


RE: Does this not seem like a late launch date?
By Grated on 1/9/2007 5:25:58 AM , Rating: 2
NVIDIA had delays after the first Xbox, and now ATI has the same problems with the R5x0-R6x0 series. ATI also had the Wii chip (which doesn't seem to be much more than a slightly modified, slightly faster design, but still...).

ATI lost quite a lot of manpower they could have used on the R5xx and R6xx series, but since everyone is back on the PC side, I don't think future chips (the late R6xx series or the R7xx series) will be late.


By Slaimus on 1/9/2007 7:01:43 PM , Rating: 2
They will lose a lot to Fusion as well in the next couple of generations.

Should have picked up those 3D Labs guys!


By RussianSensation on 1/9/2007 7:55:47 AM , Rating: 2
This seems a bit late.

The fact that ATI is able to release its high-end DX10 product a full 4-5 months after G80 just goes to highlight the slow progress of software. I don't want to offend any G80 owners, but in reality there is no point to the G80 even as of now, never mind at launch. There are no new next-generation games, most of which got pushed into 2007 (U3, Crysis), and I am not even mentioning DX10 games, period. Sure, you need fast hardware before software takes advantage of it. But what is the point of launching a DX10 graphics card before there is even a DX10-capable OS with workable drivers? Instead it would have been much better to wait and release high-end, mid-range and low-end G80 cards all at once on Jan 30th, while improving G80 up to that point as far as possible. I wonder how many G80 cards have been sold so far...


By piesquared on 1/9/2007 8:20:18 AM , Rating: 2
Also, I'm sure I remember reading something that said NV originally planned to release G80 in Q1 '06. If, no, IF that was the case, I can't help but wonder how it could be D3D10 compliant in the first place.

disclaimer:

I could very well be wrong about the original release date.

As for R600, I can't imagine ATI not having an advantage here, with them being able to leverage the Xenos architecture and some R580 branching capability. Now, obviously I'm just talking out my ass here, but something just doesn't seem to add up!!


By DigitalFreak on 1/9/2007 8:45:00 AM , Rating: 1
LOL. So the fact that a single 8800GTX runs DirectX 9 games faster than the previous gen Crossfire or SLI means nothing, huh? Goomba...


By RussianSensation on 1/9/2007 9:24:55 AM , Rating: 2
Sure, it shows us what a wonderful card the G80 is. Having said that, I couldn't care less that my games run at 120+ frames per second (which before required a dual-card setup). I'd rather choose a game that runs at 50-60fps but looks a lot prettier than run Doom 3 at 120fps.

Of course you cannot satisfy the majority and G80 does not aim to do so. It is beneficial to those gunning for 1920x1200 or 2048x1536 resolutions and so forth. However, once again, I'd rather play a game that maxes out my card at 1600x1200 or even 1280x1024 but has better graphics than Far Cry at 100000x10000 resolution.

When Doom 3 came out, it suddenly looked better at 1024x768 than almost any game at the time did at 1600x1200. With G80 you have the power and the features to make games that look amazing, but it'll take a year or so before its potential is tapped. That's the way the graphics industry has always worked, and it's why it only makes sense to buy top of the line right away if you have $ to burn.

This is why I could never understand why graphics card companies never aim to launch high-, mid- and low-end segments simultaneously. It makes sense to buy hardware for today, not tomorrow, when G80 will be slow anyway. In which case the 8600GT would have been fast enough, but you can't buy it.


By VooDooAddict on 1/9/2007 12:28:13 PM , Rating: 2
quote:
This is why I could never understand why graphics card companies never aim to launch high-, mid- and low-end segments simultaneously.


Launching when the product is ready, can make money, and can generate good press sounds like a good idea to me.

They gain back some $$ from early adopters for R&D while the mid- and low-end next-gen parts are getting finalized. Makes sense to me.


RE: Does this not seem like a late launch date?
By Ringold on 1/9/2007 5:00:41 PM , Rating: 3
You're exaggerating a bit about what the G80 is capable of versus previous generations. Insane numbers like "Doom 3 at 120fps" or "Far Cry at 100000x100000" don't advance the debate any when the fact is those are both old games, and newer ones, such as Oblivion, still barely get what's commonly accepted as a solid frame rate at the increasingly gold-standard resolutions of 1600x1200 and 1920x1200.

So, in more than a couple of modern games already on the shelves, the 8800 GTS meets your own personal requirements and only provides needlessly high frame rates in games that are years old.


By Whedonic on 1/14/2007 2:23:08 AM , Rating: 2
quote:
and newer ones, such as Oblivion, still barely get what's commonly accepted as a solid frame rate at the increasingly gold-standard resolutions of 1600x1200 and 1920x1200.


Very true... with all of the settings cranked up, Oblivion is finally playable at 1900x1200 with an 8800GTX. So although its DX10 capabilities are as yet wasted, the sheer graphical power can be put to very good use in current-gen games.


By UNCjigga on 1/9/2007 11:32:54 AM , Rating: 3
Dude, you're missing the point. The R600 family *is* in fact very late from a Vista launch window perspective. There are a lot of people holding off on upgrades or buying new PCs because they are waiting for Vista. Typically you see a flurry of activity in PC sales when a new OS debuts. With Vista's emphasis on graphics, we'll also be seeing a lot of video cards sold for folks sticking to existing PCs.

The problem with R600 being late is that while power users and gamers will have an R600 option in March, the mass market will not have a mid-range/low-end DX10 Radeon part until June. This leaves plenty of room for Nvidia to come in and own this market. Nvidia will likely be first to ship a DX10 notebook part, too.

This sort of timing is HUGE for the OEM market--Dell, HP, Lenovo etc. want to be ready with new products when Vista hits, and Nvidia's going to be winning a lot of business if they're the only DX10 game in town--they won't care whether it has unified shaders or not.


By kilkennycat on 1/9/2007 2:17:18 PM , Rating: 2
Umm, the G80 is currently the most powerful GPU for the DX9.0c market. Hence the logical reason for the 8800 video-card release in November 2006: the release timing has nothing really to do with Dx10. nVidia and its third-party board suppliers were ready to go. The 8800 cards are at least 5 months earlier than anything of equivalent power from AMD (ATi), and are also future-proofed with their Dx10 capability. Which high-end PC gaming enthusiast in his/her right mind, desiring to update their rig right now, would NOT buy an 8800-series card? Full Dx9.0c compatibility, with the option to switch to Dx10 whenever the PC owner is ready to make the switch.

nVidia has just released the first Vista-capable WHQL driver for the 8800, but has warned that the driver is still evolving and its Dx10 SLI implementation is incomplete. Not that SLI is necessary for any early usage of the 8800 under Vista. The initial adoption rate for Vista (Dx10.x) amongst current PC owners is going to be very small; adoption will be driven almost entirely by new PCs with Vista pre-installed. Vista/Dx10 will drive the implementation technology on the next-gen GPUs, but WinXP/Dx9.0c will still drive the volume for at least the next 2 years.

One boon for nVidia with the early introduction of the 8800 and its brand-new architecture: nVidia will have had 5 months of production 8800 silicon to mature the driver designs for both Dx9.0c AND Dx10 (and to find any silicon implementation errors in the current run of 8800 GPUs, for rectification with their upcoming silicon re-spin on 80nm or 65nm).


By Polynikes on 1/9/2007 2:44:22 PM , Rating: 2
They released it because they knew people would buy, and as of right now they have no competition in the next-gen GPU market, so they get all the sales.

The 8800 is faster in DX9 games than anything else out there, too.


RE: Does this not seem like a late launch date?
By SunAngel on 1/9/07, Rating: -1
RE: Does this not seem like a late launch date?
By OrSin on 1/9/2007 11:42:17 AM , Rating: 3
"Superior product to market"? The PS3 is not the superior product. It might be a better fit for you; the 360 might be a better fit for someone else. But superior it ain't. Not trying to start a flame war, but until the PS3 actually does something better than the 360, leave it alone. For the record, it might be the better product in 3-4 years as the market changes, but for now it is not. And MS has shown they can upgrade the 360 at any time to easily compete.


RE: Does this not seem like a late launch date?
By SunAngel on 1/9/07, Rating: -1
RE: Does this not seem like a late launch date?
By Lakku on 1/10/2007 1:21:05 AM , Rating: 2
Please explain how Sony has supplied an all-in-one home entertainment system, why Linux would matter in this equation of anything related to entertainment, why installing Vista on the 360 would matter, and how Xbox Live being far superior to what Sony offers doesn't somehow give MS an advantage in bringing everything together right now. Also, how does Sony combat the fact that MS is bringing IPTV to the 360, which makes it a DVR, Media Center Extender (face it, almost everyone who owns a PC has Windows, so this puts MS in a good position and makes the current philosophy behind the 360's design just right the way it is), game machine, DVD player (HD-DVD if you want), and an on-demand video service machine?

Last I checked, the PS3 doesn't do as well as the 360 in serving as a streaming device from my PC for video, audio, and other media. Yes, I own both, and yes, the 360 has a better interface and is a better media extender. From my own experience, the 360 is just a better machine aside from the HD video aspect (meaning you get Bluray by default for one, an HD-DVD add-on for the other), and I certainly don't need Linux on anything other than my PC, should I so choose.

As a side note, the games aren't that impressive for the PS3, nor is the controller, but I bought it as a 499-dollar Bluray player that happens to play games, so no biggie. PS - Bluray had too many good movies coming to it or already out for me not to have the ability to play them, and the 20GB PS3 is a great value for playing them.


Please Spread this news
By iwod on 1/9/2007 6:54:30 AM , Rating: 2
Well, I am no Nvidia supporter.

But hopefully a March launch date will finally put to rest those rumors of a January launch date from that FUD blog.

To me it seems more and more like that blog was created by the AMD/ATI marketing team to hold people back from buying the GeForce 8800 series.

Why does ATI never manage to have their products out on time?





RE: Please Spread this news
By otispunkmeyer on 1/9/2007 8:58:25 AM , Rating: 3
Good things come to those who wait, my friend.

I'd rather have a product that works late than a product of questionable quality on time.


RE: Please Spread this news
By Fibbly on 1/9/2007 9:16:15 AM , Rating: 2
If there were high demand for the 8800 series, you'd have a point, but there is no high demand. Around here, the shelves are full and they have a hard time selling them. And that has been the case ever since launch.


RE: Please Spread this news
By vortmax on 1/9/2007 9:30:53 AM , Rating: 2
Well, the fact that they released them as early as they did was ultimately good for the consumer, by lowering prices of the previous-gen cards. As for the enthusiasts who purchase in the ULTRA-high-end product range, it made them happy too. Plus, Nvidia has regained the DX9-based gaming lead.

I don't have a problem with their early launch from a consumer standpoint.


RE: Please Spread this news
By kilkennycat on 1/9/2007 2:38:07 PM , Rating: 2
Er, exactly which shelves? "Around here"... where? A boutique corner store in NYC? Please give specific details. The major US retailers such as Best Buy and CompUSA, for example, currently advertise and sell 8800GTX and 8800GTS cards ONLY through their online sales channel! Check their websites for in-store purchase... nada...

I personally would not dream of buying any such high-end product at a retail store anyway. The selection and prices are much better online, from many reputable online vendors: Newegg, ZipZoomFly, etc. I am absolutely sure that this opinion of mine is not only shared but implemented in practice by most high-end PC enthusiasts - the actual purchasers of 8800-family cards. Got the Newegg and ZZF sales figures handy?


Power consumption?
By Rob94hawk on 1/9/2007 6:52:18 AM , Rating: 2
Am I gonna need a dedicated 220V line if I build a new PC in the future?!?




RE: Power consumption?
By Fibbly on 1/9/2007 9:22:49 AM , Rating: 2
I don't know where you come from, but on this continent I plug my 'puter into a 230V/10A wall outlet. Plenty of water will flow down the rivers before a desktop computer draws more than 2kW - if that ever happens (which is highly doubtful).


RE: Power consumption?
By Pythias on 1/9/2007 1:37:37 PM , Rating: 2
quote:
Am I gonna need a dedicated 220V line if I build a new PC in the future?!?


Not really. Gaming systems don't eat as much juice as many people claim they do. Search around AnandTech for some of the articles where they benchmark total power usage.
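[Editor's note: for a rough sense of scale, the arithmetic is simple. The 600 W figure below is a deliberately generous assumption for a high-end gaming rig of this era, not a measured value:]

```python
def outlet_load_amps(system_watts, line_volts):
    """Current drawn from the wall outlet for a given system power draw
    (simple P = V * I, ignoring power factor)."""
    return system_watts / line_volts

# Even a generous 600 W system on a 110 V circuit draws under 6 A,
# well within a typical 15 A household branch circuit.
draw = outlet_load_amps(600, 110)
```

So no dedicated 220V line is needed; a standard outlet has ample headroom.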


Not Just High end!
By fierydemise on 1/9/2007 9:27:26 AM , Rating: 2
Maybe the reason R6xx is taking longer is that ATI is actually releasing a full line of cards. G80 may be a screamer, but outside of the ultra-high end there isn't much there. Not to diminish the G80, but it strikes me as something of a sham to take back the high end and declare that nVidia has the performance crown without giving anything to people who don't want to spend $450+ on a video card.




RE: Not Just High end!
By cochy on 1/9/2007 1:40:12 PM , Rating: 2
quote:
The Radeon roadmap puts both of these products on the shelf before June 2007.


Not according to this article. R600 comes some time around March; the mid- and low-end parts, some time around June.


RE: Not Just High end!
By ElJefe69 on 1/9/2007 7:48:45 PM , Rating: 2
Is it me, or do the mid- and low-level versions of new pricey cards suck ass?

They get like half the frames or less, which equals a last-generation card, which... is often the same price. The X1600 should be like 20% slower than the X1900, not like half. So basically, buy the $$$ one and be on top, or settle and still spend like 200 bux.


numbers
By melgross on 1/9/2007 8:54:09 AM , Rating: 2
Not to be TOO picky, but aren't 610, 620, and 630 higher numbers than 600, rather than lower?




RE: numbers
By tkSteveFOX on 1/9/07, Rating: -1
RE: numbers
By threepac3 on 1/9/2007 9:58:30 AM , Rating: 2
I think Nvidia's early entry into the next-gen (DX10) add-in card arena will ultimately be better for them. ATi will indeed come up with a more powerful card - though there is no doubt in my mind that NV is waiting with a new product to supersede ATi's card.


RE: numbers
By SoBizarre on 1/9/2007 5:13:03 PM , Rating: 2
You're right. On both being too picky and the number thing.


no new news
By zam786 on 1/9/2007 5:16:34 AM , Rating: 3
Is that all? We waited 6 months for 3 lines of news?




I'm glad it's delayed...
By A5 on 1/9/2007 10:18:51 AM , Rating: 3
Glad I got an X1950XT ($220 AR @ Newegg) instead of waiting around for stuff I can't afford anyway. :P




