

ATI RV570 Details

ATI Desktop Discrete PCIe transitions
New performance and mainstream offerings

During the recent Games Convention 2006 in Germany, we received a new roadmap that outlines ATI’s pre-R600 plans for its complete graphics card lineup. At the top of the chain is the previously released Radeon X1950XTX and its CrossFire counterpart. Slotted right below the Radeon X1950XTX and CrossFire cards will be the Radeon X1900XT 512MB, which will be a carryover product. The Radeon X1900XTX is discontinued and being phased out. This completes ATI’s enthusiast offerings for the time being.

On the performance side of things is the previously released Radeon X1900XT 256MB. Slotted right below the X1900XT 256MB will be the unreleased Radeon X1950 Pro, which replaces the current X1900GT. The Radeon X1950 Pro will be based on the RV570 core, one of ATI’s first 80nm products. Specifications of the RV570 core include 12 pipelines and 36 pixel shaders with a 600 MHz core clock. Memory will be clocked at 1.4 GHz on a 256-bit interface. Radeon X1950 Pro cards will be equipped with 256 MB of graphics memory and sport a single-slot cooler. This will also be ATI’s first card with internal CrossFire compatibility for dongle-less CrossFire. Availability of the Radeon X1950 Pro is expected in October. ATI claims the Radeon X1950 Pro will be faster than the GeForce 7900GT.
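
For reference, those figures work out to roughly 44.8 GB/s of memory bandwidth. The quick calculation below is only a sketch based on the numbers in this roadmap (a 1.4 GHz effective memory clock on a 256-bit bus); it is not a confirmed board specification.

    # Rough memory-bandwidth estimate from the roadmap figures (assumed, not confirmed)
    def memory_bandwidth_gb_per_s(effective_clock_ghz, bus_width_bits):
        bytes_per_transfer = bus_width_bits / 8          # 256-bit bus -> 32 bytes per transfer
        return effective_clock_ghz * bytes_per_transfer  # GHz x bytes/transfer = GB/s

    print(memory_bandwidth_gb_per_s(1.4, 256))  # ~44.8 GB/s for the Radeon X1950 Pro as described
    print(memory_bandwidth_gb_per_s(1.4, 128))  # the same clock on a 128-bit bus would halve that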

On the mainstream side of things is the Radeon X1650 Pro, which is based on the RV530 core and replaces the previous Radeon X1600XT. Joining the mainstream lineup later in September will be the Radeon X1650XT, based on ATI’s upcoming RV560 core, which, like the RV570, is an 80nm part. It will have 8 pipelines with 24 pixel shaders and go up against NVIDIA’s GeForce 7900GS. Radeon X1650XT cards will have 256MB of memory on a 128-bit interface; core and memory clocks are unknown. Availability of the Radeon X1650XT is expected around the same time as the Radeon X1950 Pro.

ATI’s value lineup will consist of the Radeon X1300XT, X1300 Pro, X1300, X550HM and X300SE. The Radeon X1300XT is essentially identical to ATI’s previous mainstream offering, the Radeon X1600 Pro, while the other four products are simply carryovers.

Also mentioned in the roadmaps is ATI’s high-definition-video-capable mainstream part, the RV550, which features ATI’s Universal Video Decoder (UVD). The UVD-equipped RV550 is expected to start sampling in September, with availability starting in December. It will be based on ATI’s RV515 core, which is the equivalent of the Radeon X1300 series.

ATI also plans to move most of its product lineup over to an 80nm fabrication process. While the Radeon X1950 Pro and X1650XT will launch as 80nm products, the Radeon X1650 Pro, X1300XT and X1300 series are still based on a 90nm fabrication process. ATI will switch the RV530XT-based Radeon X1650 Pro over to 80nm with the RV535XT, while the RV530 Pro-based X1300XT will switch over to the 80nm RV535 Pro core. ATI’s RV515 and RV516 value products will switch over to 80nm RV505 variants as well.


Comments



If it's not DX10, keep it in your warehouse ATI
By FXi on 8/28/2006 8:54:35 PM , Rating: 4
Sorry, but by the holidays there needs to be DX10. One month before Vista, I'm not buying last year's news refreshed.

They will be sorely disappointed if this is their only holiday lineup...




By splines on 8/28/2006 10:59:12 PM , Rating: 2
To be honest, I'm not sure how mind-blowing these DX10 cards are going to be. I'm sure they'll run DX9 games fine (with the usual linear increase in processing power and so forth) - but DX10 performance is going to be the selling point, and I can't see the first generation being all that fantastic. Give it six months to a year, see how Crysis performs (personally, I think it's going to murder just about anything that wants to really turn up the pretty, much like Oblivion has done to DX9 cards) and then consider the purchase.

I'm certainly not trading up from the GF7 series until the gains are worth the expense.


By biohazard420420 on 8/29/2006 12:18:56 AM , Rating: 2
OK, here is something I have never understood: what is the point of moving to DX10 and Vista right when they come out? First off, in regards to any MS product (I am a Windows-only user, no experience with Mac or Linux), I see NO need to move to a new OS within a year or several of its release. I don't care how "secure" Vista is supposed to be, but I can probably safely say that XP will be more secure and stable than Vista for at least 6 months or longer.

There are always problems, holes, bugs, whatever you want to call them, with a new MS OS. Heck, I didn't even upgrade to XP from Win 98 until 2003 or 2004, and then only because I had to get a new hard drive. New and forthcoming games will still work on XP for the foreseeable future (at least 2 to 4 years, maybe without all the super cool flashy stuff, but they will still run with most if not all of the current flashy stuff).

And as far as DX10 is concerned, again it is the same thing as before: the new flashy bells and whistles that come with a new API are not really that big of a deal, IMO. Would I like all the new visual effects and "bling," if you will? Sure, but I am not going to pay out the nose for it either. With the exception of a few games (Halo 2 and Crysis are the only two examples I can think of offhand), a lot of new games won't be 100% DX10-optimized. Even so, they will run fine in DX9 without the new DX10 features, and even if you get a new DX10 card and continue to run XP, they will still run the new games and make them look good, maybe even a bit faster than on DX10 since you're not computing the new "bling" in DX10.

I have been running the same rig for about 5 or 6 years, with the exception of a new vid card I got from a friend for cheap and the aforementioned HDD, and it will run most current games just fine for me until I upgrade my PC with a total rebuild in the next 4 to 6 months.



By poohbear on 8/29/2006 12:06:20 PM , Rating: 2
The only thing I care about is a vid card that runs Unreal Engine 3 well, since there are 10+ games using that engine.


By coldpower27 on 9/2/2006 10:33:58 PM , Rating: 2
It is very likely that graphics cards from the R300 generation and up can run Unreal Engine 3.0.

They just won't run the game at maximum settings. Sorta like how the Source engine has pathways for DX 7.0, DX 8.0, DX 8.1 and DX 9.0.

Same thing here: Shader Model 2.0, 3.0 and 4.0 pathways are likely going to be available.
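
As a rough illustration of what such pathways amount to, here is a minimal sketch of an engine falling back to the highest shader-model path the installed card supports. The path names and the pick_render_path function are hypothetical, not taken from the Source engine or Unreal Engine 3.

    # Minimal sketch of shader-model pathway selection; all names are hypothetical.
    RENDER_PATHS = {
        2.0: "sm2_path",   # e.g. Radeon 9700 (R300)-class hardware
        3.0: "sm3_path",   # e.g. Radeon X1900 / GeForce 7-class hardware
        4.0: "sm4_path",   # DX10-class hardware
    }

    def pick_render_path(supported_shader_model):
        # Use the highest pathway the card can actually run.
        usable = [sm for sm in RENDER_PATHS if sm <= supported_shader_model]
        if not usable:
            raise RuntimeError("no supported render path for this hardware")
        return RENDER_PATHS[max(usable)]

    print(pick_render_path(3.0))  # an X1950-class (SM 3.0) card lands on the SM 3.0 path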


"I mean, if you wanna break down someone's door, why don't you start with AT&T, for God sakes? They make your amazing phone unusable as a phone!" -- Jon Stewart on Apple and the iPhone













