


The latest AMD ATI roadmaps reveal details of more DirectX 10 graphics cards

DailyTech has just finished its briefing on the upcoming AMD ATI Radeon launches.  The company has three major launches over the next few months: codenamed R600, R610 and R630.

The R600, ATI's ultra-high-end successor to the Radeon X1950, has a production date scheduled for February 2007. The card will launch at or around the CeBIT 2007 convention in mid-March. Shipments will follow shortly after.

Our latest roadmaps indicate R600 will support unified shaders and GDDR4 or GDDR3 memory. GDDR3 versions of the card running revision "A12" silicon appear to be making the rounds in the blogosphere, and select press can even take a sneak peek at the card under embargo here at CES. The final silicon for R600 will be "A13."

A GDDR4 version of the card will be the flagship launch product. Clock frequencies for the R600 cards have not yet been officially set and will not appear on marketing material until just a few weeks before launch.

The company has also added the R610 and R630 GPUs to the roadmap. In the past, ATI has used lower-numbered codenames to denote entry-level products. We suspect R610 will be the entry-level cut-down of R600, and R630 will be the mid-range part. The Radeon roadmap puts both of these products on the shelf before June 2007.

All R600-series GPUs from ATI are compatible with DirectX 10.
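
For readers curious what "compatible with DirectX 10" means for software in practice, the sketch below shows one way an application can test for it: simply attempt to create a hardware Direct3D 10 device and see whether the call succeeds. This check is our own illustration, not anything taken from AMD's or NVIDIA's materials, and it assumes a Vista-era Windows SDK (d3d10.h, linked against d3d10.lib).

#include <windows.h>
#include <d3d10.h>
#include <cstdio>

int main()
{
    ID3D10Device* device = NULL;

    // Ask the runtime for a hardware Direct3D 10 device on the default adapter.
    // If this succeeds, the installed GPU and driver expose DirectX 10 support.
    HRESULT hr = D3D10CreateDevice(
        NULL,                        // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,  // hardware only, no reference rasterizer
        NULL,                        // no software rasterizer module
        0,                           // no device creation flags
        D3D10_SDK_VERSION,
        &device);

    if (SUCCEEDED(hr) && device != NULL)
    {
        std::printf("Direct3D 10 hardware device created: this GPU and driver are DX10-capable.\n");
        device->Release();
    }
    else
    {
        std::printf("No Direct3D 10 hardware device available (hr = 0x%08lx).\n",
                    static_cast<unsigned long>(hr));
    }
    return 0;
}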


Comments



By RussianSensation on 1/9/2007 7:55:47 AM , Rating: 2
This seems a bit late.

The fact that ATI can release its high-end DX10 product a full 4-5 months after G80 just goes to highlight the slow progress of software. I don't want to offend any G80 owners, but in reality there is little point to G80 even now, never mind at launch. There are no new next-generation games, most of which got pushed into 2007 (U3, Crysis), and that's before even mentioning DX10 games, period. Sure, you need fast hardware before software can take advantage of it. But what is the point of launching a DX10 graphics card before there is even a DX10-capable OS with workable drivers? It would have been much better to wait and release high-end, mid-range and low-end G80 cards all at once on Jan 30th, while improving G80 as much as possible up to that point. I wonder how many G80 cards have been sold so far...


By piesquared on 1/9/2007 8:20:18 AM , Rating: 2
Also, I'm sure I remember reading something that said NV originally planned to release G80 in Q1 '06. If, no, IF that was the case, I can't help but wonder how it could have been D3D10 compliant in the first place.

disclaimer:

I could very well be wrong about the original release date.

As for R600, I can't imagine ATI not having an advantage here, being able to leverage the Xenos architecture and some of R580's branching capability. Now, obviously I'm just talking out my ass here, but something just doesn't seem to add up!!


By DigitalFreak on 1/9/2007 8:45:00 AM , Rating: 1
LOL. So the fact that a single 8800GTX runs DirectX 9 games faster than the previous gen Crossfire or SLI means nothing, huh? Goomba...


By RussianSensation on 1/9/2007 9:24:55 AM , Rating: 2
Sure, it shows us what a wonderful card G80 is. Having said that, I couldn't care less that my games run at 120+ frames per second (which previously required a dual-card setup). I'd rather play a game that runs at 50-60fps but looks a lot prettier than run Doom 3 at 120fps.

Of course you cannot satisfy the majority and G80 does not aim to do so. It is beneficial to those gunning for 1920x1200 or 2048x1536 resolutions and so forth. However, once again, I'd rather play a game that maxes out my card at 1600x1200 or even 1280x1024 but has better graphics than Far Cry at 100000x10000 resolution.
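
To put rough numbers on the resolution trade-off being argued here, the short sketch below (our illustration, not the commenter's) multiplies out the raw pixel counts of the display modes mentioned above, relative to 1280x1024. Real GPU load also depends on shaders, anti-aliasing and scene complexity, so treat the ratios as indicative only.

#include <cstdio>

int main()
{
    // Display modes named in this discussion, with 1280x1024 as the baseline.
    struct Mode { const char* name; int w; int h; };
    const Mode modes[] = {
        { "1280x1024", 1280, 1024 },
        { "1600x1200", 1600, 1200 },
        { "1920x1200", 1920, 1200 },
        { "2048x1536", 2048, 1536 },
    };

    const double base = 1280.0 * 1024.0;

    for (const Mode& m : modes)
    {
        const double pixels = static_cast<double>(m.w) * m.h;
        std::printf("%s: %.2f million pixels per frame, %.2fx the pixel work of 1280x1024\n",
                    m.name, pixels / 1e6, pixels / base);
    }
    return 0;
}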

When Doom 3 came out, it suddenly looked better at 1024x768 than almost any game at the time looked at 1600x1200. With G80 you have the power and the features to make games that look amazing, but it'll take a year or so before its potential is tapped. That's the way the graphics industry has always worked, and it's why buying top of the line right away only makes sense if you have money to burn.

This is why I could never understand why graphics card companies don't aim to launch high-, mid- and low-end segments simultaneously. It makes sense to buy hardware for today, not for tomorrow, when G80 will be slow anyway. In that case an 8600GT would have been fast enough, but you can't buy it.


By VooDooAddict on 1/9/2007 12:28:13 PM , Rating: 2
quote:
This is why I could never understand why graphics card companies don't aim to launch high-, mid- and low-end segments simultaneously.


Launching when the product is ready, can make money, and can generate good press sounds like a good idea to me.

They gain back some R&D money from early adopters while the mid- and low-end next-gen parts are getting finalized. Makes sense to me.


RE: Does this not seem like a late launch date?
By Ringold on 1/9/2007 5:00:41 PM , Rating: 3
You're exaggerating a bit about what the G80 is capable of versus previous generations. Insane numbers like "Doom 3 at 120fps" or "Far Cry at 100000x100000" don't advance the debate any when the fact is that those are both old games, and newer ones, such as Oblivion, still barely reach what is commonly accepted as a solid frame rate at the increasingly gold-standard resolutions of 1600x1200 and 1920x1200.

So, in more than a couple of modern games already on the shelves, the 8800 GTS meets your own personal requirements and only provides needlessly high frame rates in games that are years old.


By Whedonic on 1/14/2007 2:23:08 AM , Rating: 2
quote:
and newer ones, such as Oblivion, still barely reach what is commonly accepted as a solid frame rate at the increasingly gold-standard resolutions of 1600x1200 and 1920x1200.


Very true...with all of the settings cranked up, Oblivion is finally playable at 1920x1200 with an 8800 GTX. So although its DX10 capabilities are as yet unused, the sheer graphical power can be put to very good use in current-gen games.


By UNCjigga on 1/9/2007 11:32:54 AM , Rating: 3
Dude, you're missing the point. The R600 family *is* in fact very late from a Vista launch-window perspective. There are a lot of people holding off on upgrades or buying new PCs because they are waiting for Vista. Typically you see a flurry of activity in PC sales when a new OS debuts. With Vista's emphasis on graphics, we'll also be seeing a lot of video cards sold to folks sticking with existing PCs.

The problem with R600 being late is that while the power users and gamers will have an R600 option in March, the mass market will not have a mid-range/low-end DX10 Radeon part until June. This leaves plenty of room for Nvidia to come in and own this market. Also, Nvidia will likely be first to ship a DX10 notebook part too.

This sort of timing is HUGE for the OEM market--Dell, HP, Lenovo etc. want to be ready with new products when Vista hits, and Nvidia is going to win a lot of business if they're the only DX10 game in town--the OEMs won't care whether it has unified shaders or not.


By kilkennycat on 1/9/2007 2:17:18 PM , Rating: 2
Umm, the G80 is currently the most powerful GPU for the DX9.0c market. Hence the logical reason for the 8800 video-card release in November 2006: the release timing really has nothing to do with Dx10, and nVidia and its third-party board suppliers were ready to go. The 8800 cards are at least 5 months ahead of anything of equivalent power from AMD (ATi), and are also future-proofed with their Dx10 capability. Which high-end PC gaming enthusiast in his/her right mind, wanting to update their rig right now, would NOT buy an 8800-series card? Full Dx9.0c compatibility, with the option to switch to Dx10 whenever the PC owner is ready to make the switch.

nVidia has just released the first Vista-capable WHQL driver for the 8800, but has warned that the driver is still evolving and its Dx10 SLI implementation is incomplete. Not that SLI is necessary for any early use of the 8800 under Vista. The initial adoption rate for Vista (Dx10.x) amongst current PC owners is going to be very small; adoption will be driven almost entirely by new PCs with Vista pre-installed. Vista/Dx10 will drive the implementation technology on next-gen GPUs, but WinXP/Dx9.0c will still drive the volume for at least the next 2 years.

One boon for nVidia with the early introduction of the 8800 and its brand-new architecture: nVidia will have had 5 months of production 8800 silicon to mature the driver designs for both Dx9.0c AND Dx10 (and to find any silicon implementation errors in the current run of 8800 GPUs, for rectification in their upcoming silicon re-spin on 80nm or 65nm).


By Polynikes on 1/9/2007 2:44:22 PM , Rating: 2
They released it because they knew people would buy, and as of right now they have no competition in the next-gen GPU market, so they get all the sales.

The 8800 is faster in DX9 games than anything else out there, too.


"Spreading the rumors, it's very easy because the people who write about Apple want that story, and you can claim its credible because you spoke to someone at Apple." -- Investment guru Jim Cramer
