
AMD's ATI Radeon HD 2600 features 120 stream processors

AMD’s next-generation graphics products are right around the corner. The next-generation ATI Radeon HD 2000 series is a top-to-bottom DirectX 10 lineup consisting of the HD 2900, HD 2600 and HD 2400 series. Although DailyTech unveiled benchmarks of AMD’s upcoming ATI Radeon HD 2900 XT last week, details of the mainstream and value HD 2600 and HD 2400 series have remained scarce.

AMD’s next-generation ATI Radeon HD 2600, based on the RV630 core, features 120 stream processing units with three SIMDs and two texture units. There will be two ATI Radeon HD 2600 GPUs: Pro and XT models with different clock speeds and memory requirements.

AMD and add-in board partners will offer a variety of ATI Radeon HD 2600-based graphics cards. The different models include variations with GDDR4, GDDR3 and DDR2 memory in 512MB or 256MB configurations with or without video input and output capabilities.

Catering towards value users is the ATI Radeon HD 2400 with its 40 stream processors. Two SIMDs and one texture unit join the 40 stream processors. AMD plans to offer the ATI Radeon HD 2400 in Pro and LE models with varying configurations. ATI Radeon HD 2400-based graphics cards will have 256MB or 128MB of video memory. Add-in board manufacturers are free to set the output configuration, including VGA, dual-link DVI, video input/output, HDMI and DMS-59 outputs on ATI Radeon HD 2400-based graphics cards.
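The shader configurations above can be laid out side by side in a quick sketch (Python used purely for illustration; the per-SIMD division is an inference from the article's totals, since the actual ALU grouping inside each SIMD isn't specified):

```python
# Shader organization for the RV630 and RV610 parts, per the article's figures.
parts = {
    "Radeon HD 2600 (RV630)": {"stream_processors": 120, "simds": 3, "texture_units": 2},
    "Radeon HD 2400 (RV610)": {"stream_processors": 40,  "simds": 2, "texture_units": 1},
}

for name, spec in parts.items():
    # Simple division gives the stream processors per SIMD cluster.
    per_simd = spec["stream_processors"] // spec["simds"]
    print(f"{name}: {spec['stream_processors']} SPs across {spec['simds']} SIMDs "
          f"({per_simd} SPs/SIMD), {spec['texture_units']} texture unit(s)")
```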


Price will make the difference
By cheetah2k on 5/7/2007 1:54:01 AM , Rating: 5
With Nvidia ruling the roost in terms of speed with their high-end video cards, AMD/ATI will have to make sure these babies are cost-efficient against the competition - the 8400 to 8600. From the speculation out there, Nvidia's low-end options aren't much better than their DX9 equivalents (bar the H.264 decoding & DX10 advantages), and this is where AMD needs to perform.

Value for $ AMD, value for $.....

RE: Price will make the difference
By Metroid on 5/7/2007 2:17:31 AM , Rating: 2
Yes, it depends on the facts put on the board. I don't want to buy something that is only good in synthetic benchmarks. I like Nvidia's G8x, but for now I prefer ATI's HD 2x00 models; it will all depend on price/performance for me.

If ATI is launching something inferior to its competitor, I just want the price to be inferior as well. They were saying the 2900 XT may cost around $350 to $450. I think $400 is higher than I was expecting; I was hoping for around $300.

By James Holden on 5/7/2007 2:20:41 AM , Rating: 2
I'm not sure if inferior is the right word there.

RE: Price will make the difference
By thartist on 5/7/2007 3:28:18 PM , Rating: 2
Knowing that Nvidia's 8600 and 8500 are a total disappointment and LET MANY GAMERS DOWN, AMD/ATI MUST take the chance and win that segment of the video card market. I mean, it's not very often that nVIDIA screws up and exposes its neck.

If AMD/ATI doesn't make it here, I won't have my mainstream DX10 card. I may have to wait for the next gen, and most likely it's not only me.

RE: Price will make the difference
By 3kliksphilip on 5/7/2007 3:48:23 PM , Rating: 1
...What's wrong with the 8600? From what I've heard it's roughly the same as the best from the previous generation, only with added DX10 support. Doesn't seem too bad, especially with lower power requirements etc.

RE: Price will make the difference
By Angelus897 on 5/7/2007 4:24:45 PM , Rating: 3
Uhh... the 7900/7950 cards outperform the 8600 by a large margin. The X1950 products even more so. Also, the 8600 is more expensive than those.

RE: Price will make the difference
By Ard on 5/7/2007 4:35:17 PM , Rating: 2
A misperception that people feel is necessary to spread everywhere they go:

Keep reading.

RE: Price will make the difference
By shabby on 5/7/2007 6:03:53 PM , Rating: 2
Ah yes, Kyle Bennett and his canned benchmarks...

RE: Price will make the difference
By jarman on 5/7/2007 10:47:41 PM , Rating: 2
Ah yes, Kyle Bennett and his canned benchmarks...

You must be one of those kids who were really upset when you didn't see your pretty little video card perform as well as you knew it would...

RE: Price will make the difference
By shabby on 5/8/2007 6:59:30 AM , Rating: 2
Nope, my x1900xtx performed pretty well.

RE: Price will make the difference
By Ard on 5/7/07, Rating: 0
RE: Price will make the difference
By phusg on 5/8/2007 4:33:00 AM , Rating: 2
If you actually read some of the article you would see that he isn't basing his opinion solely on benchmarks:
Many people look at the specifications alone and make a pre-conceived notion about performance before even reading what the gameplay experience actually is. We have it in our heads that a 128-bit memory bus is slow and 32 streaming processors doesn’t sound like a lot when we are used to seeing 96 mentioned for the 8800 GTS and 128 for the 8800 GTX.

I'm still not convinced though, as I too was under the impression that the X1950 was still pretty competitive. Anyone have another comparison review that supports Kyle's conclusions?

By Mojo the Monkey on 5/10/2007 1:32:21 PM , Rating: 2
I don't think you get it. Look at the retail price of the 8600 GTS and the current retail price of the X1950 XT (not the Pro). I saw some for ~$170 at Newegg, and they are a leap ahead of the Pro. Why don't you take a look at how those two stack up?

Comparisons should be about cards of the same price.

By shady28 on 5/8/2007 9:55:55 PM , Rating: 2
The post here that notes the low performance of the 8500/8600 is dead on.

The high end of Nvidia's new series, the 8600GTS, is slower in the majority of benchmarks than the older x19xx series ATI cards as well as Nvidia's own 79xx series.

See here :

Note the 8600GTS, top of the line for Nvidia's new midrange models, is soundly beaten in FEAR by the X1950 GT, 7900 GS, X1950 Pro, 7950 GT and X1900 XT. The reason I use FEAR as an example is that it is one of the games the 8600GTS does best in.

There are some places where the 8600GTS does well, but keep in mind the price of the 8600GTS (note the S) - the cheapest one at newegg at this moment is $175 for a 256MB variant.

For that, you can buy a 7950GT 512MB that will - in most cases - smash the 8600GTS in current games.

Top end not in my budget.
By Mitch101 on 5/7/2007 9:07:29 AM , Rating: 5
I don't know about the rest of you, but I generally buy in the sub-$200.00 range. 100 FPS means nothing to me because my LCD monitor refreshes at 60Hz anyhow, and if a card runs the games I want to play at 60 FPS at the resolution I prefer, then getting a more expensive card won't do me any good beyond wasting more money.

I would like a card that decodes H.264 natively, but I would love to see video encoding drivers that could utilize the GPU more than the CPU when re-encoding video.

I love the idea of physics built in, because to me it's all going to come down to eye candy. I know all GPUs can do physics, but there are many ways to improve on this.

I'm not upgrading my monitor any time soon, so eye candy at my monitor's native resolution means more than some ungodly resolution or freakish 300 FPS. Video cards come and go, but monitors generally last me 4 or more years.

I like the idea of the audio being decoded on the card as well, because of the possibility of watching some HD content on my PC. Generally I use my HDTV for that, but I have a projector hooked to my PC, so I don't want to live in fear of a blank screen when H.264 content is played back through the PC because of DRM infections.

With that being said, I couldn't care less about the top end, because no game company in their right mind releases a game that will only play on top-end hardware. So it's not like a particular game will come out that I won't be able to play.

Heck, my X850 XT is still banging out decent frame rates, but it's time for me to go DX10, even though most games for the next year will still be DX9 - maybe even longer, since the adoption rate of Vista is slow and DX10 is a Vista exclusive.

Microsoft - Halo 2 is not a reason to upgrade to DX10. It was a lousy game. Even the developer admitted they didn't like it. Gears of War might be worthy of the DX10 move.

RE: Top end not in my budget.
By FITCamaro on 5/7/2007 9:26:18 AM , Rating: 1
1) Vista adoption rate has been relatively high.

2) Halo 2 isn't DX10 anyway. And Gears of War isn't even announced for PC yet.

RE: Top end not in my budget.
By Mitch101 on 5/7/2007 10:13:44 AM , Rating: 2
1) Vista's adoption rate has actually been very slow and problematic. Most vendors (Dell, HP, IBM) have forced Microsoft's hand and have gone back to selling XP because of the lack of driver support. It won't be until Service Pack 1 in August/September that Vista starts moving forward.

2) Microsoft has said Halo 2 will be a DX10 title, but it was such a lousy game on the Xbox that it's not like DX10 owners are getting a treat with it.

Gears of War would be a much more worthy title, but we all know this isn't going to happen.

RE: Top end not in my budget.
By Angelus897 on 5/7/2007 4:28:57 PM , Rating: 2
It's a DX9 title, just Vista mandatory.

RE: Top end not in my budget.
By kalak on 5/7/2007 10:59:17 AM , Rating: 4
I'm not upgrading my monitor any time soon, so eye candy at my monitor's native resolution means more than some ungodly resolution or freakish 300 FPS. Video cards come and go, but monitors generally last me 4 or more years.

I agree. If a video card can give me 60 FPS at 1280x1024 with all the eye candy on (4X AA, shadows, etc...), it's worth my money!

no game company in their right mind releases a game that will only play on top-end hardware

that's not true...

those games were (and some still are!) barely playable with cards in the sub-$200.00 range.
I hear someone say "I play FEAR at 15 FPS and it's NOT a problem for me!", but that is really WEIRD! The fact is that an FPS like FEAR or Quake running at 15 FPS is a pain in the ass! ;-)

RE: Top end not in my budget.
By Angelus897 on 5/7/2007 4:34:56 PM , Rating: 2
My Athlon XP 2600+ & 7600 GS runs Far Cry maxed out at 1680x1050, and Quake 4 almost maxed out (not a 512MB card, so no Ultra setting) at 1280x1024. FEAR runs at 1024x768 with med-high settings.

RE: Top end not in my budget.
By Spyvie on 5/10/2007 11:04:26 AM , Rating: 2
No Way.

By StevoLincolnite on 5/13/2007 10:35:07 PM , Rating: 2
Yes way. I run Oblivion on my Radeon 9700 Pro at 1280x800 at medium quality settings and I score about 35-40 FPS outdoors.

RE: Top end not in my budget.
By darkpaw on 5/8/2007 8:48:40 AM , Rating: 2
Oblivion ran fine at 30+ FPS on my older system as well as my high-end system. Sure, I couldn't turn on HDR or max out the effects, but it was playable and still looked better than a lot of other games.

128-bit Memory Interface?
By chrispyski on 5/7/2007 2:24:11 AM , Rating: 1
I'm sorry, but AMD/ATI really had the chance to overtake Nvidia in mainstream graphics if they had opted for a larger memory interface... I mean, 128-bit is just going to choke this card like it did Nvidia's 8600 series.

But I suppose the benchmarks will tell all...but I'm not holding my breath.

RE: 128-bit Memory Interface?
By defter on 5/7/2007 5:40:47 AM , Rating: 1
Why do you think a 128-bit bus chokes the 8600 series? For example, the 8600 GTS has roughly the same bandwidth/fillrate ratio as the mid-range 7900 cards. Increasing bandwidth alone would not significantly increase performance.

RE: 128-bit Memory Interface?
By brites on 5/7/2007 6:31:38 AM , Rating: 2
Really... I believe, going by the numbers, that a 128-bit bus does constrain the memory transfer rate... the 7900 series has an MTR of 42.2 GB/s to 51.2 GB/s, while the 8600 series has an MTR of 22.4 to 32 GB/s... that's half the MTR, and that is why the 7900 is a better card than the 8600 (not looking at the lack of H.264 and DX10)...
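The bandwidth figures quoted in this thread follow directly from bus width times effective memory clock. A minimal sketch (the memory clocks below are my assumptions for the stock reference boards, not numbers taken from the thread):

```python
def bandwidth_gbs(bus_bits, effective_clock_mhz):
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# Assumed effective (double-data-rate) memory clocks for the reference boards:
print(bandwidth_gbs(128, 2000))  # 8600 GTS -> 32.0 GB/s
print(bandwidth_gbs(256, 1320))  # 7900 GT  -> 42.24 GB/s
print(bandwidth_gbs(256, 1600))  # 7900 GTX -> 51.2 GB/s
```

Halving the bus width at a comparable clock halves the result, which is exactly the 7900-vs-8600 gap being argued about here.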

RE: 128-bit Memory Interface?
By defter on 5/7/2007 7:19:30 AM , Rating: 2
For example:

8600 GTS:
8 ROPs @ 675MHz => 5400 Mpixels/s fillrate
32GB/s of bandwidth => bandwidth/fillrate ratio of about 5.93

7900 GT:
16 ROPs @ 450MHz => 7200 Mpixels/s fillrate
42.2GB/s of bandwidth => bandwidth/fillrate ratio of about 5.86

Thus, relatively speaking, the 8600 GTS actually has slightly more bandwidth compared to its fillrate...

The 7900 series is faster, especially at higher resolutions with FSAA/AF, because it has more of both: fillrate AND bandwidth. Bandwidth isn't the only thing that matters.
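The arithmetic above checks out and can be reproduced directly; a quick sketch using only the numbers from the post (the ratio works out to bytes of memory bandwidth available per pixel of fillrate):

```python
def bytes_per_pixel(rops, core_mhz, bandwidth_gbs):
    """Peak memory bytes available per pixel drawn."""
    fillrate_mpix = rops * core_mhz               # Mpixels/s (e.g. 8 * 675 = 5400)
    return bandwidth_gbs * 1000 / fillrate_mpix   # GB/s -> MB/s, per Mpixel/s

print(round(bytes_per_pixel(8, 675, 32.0), 2))   # 8600 GTS -> 5.93
print(round(bytes_per_pixel(16, 450, 42.2), 2))  # 7900 GT  -> 5.86
```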

RE: 128-bit Memory Interface?
By Samus on 5/7/2007 7:51:17 AM , Rating: 1
I agree. Having used a 7900 GS and an 8600 GTS, you'd think the 8600 would be competitive, but it isn't. It feels laggier and definitely has a lower AVERAGE frame rate through a Battlefield 2142 map.

It can only be attributed to the memory interface.

RE: 128-bit Memory Interface?
By brites on 5/8/2007 5:56:16 AM , Rating: 2
But in the end the math doesn't matter, because the 8600 GTS doesn't win against the 7900 GS, not even overclocked!!! So what really matters is the benchmarks in games, not the 3DMarks, which sometimes don't show the real power of the "beast"...

RE: 128-bit Memory Interface?
By brites on 5/8/2007 6:03:49 AM , Rating: 2
Just to end this topic (on my part), I would like to see a 256-bit 8600 GTS vs a 128-bit 8600 GTS just to check the difference... and that would be a difference...

RE: 128-bit Memory Interface?
By FITCamaro on 5/7/07, Rating: -1
RE: 128-bit Memory Interface?
By swtethan on 5/7/2007 10:30:55 AM , Rating: 3
... says it right in the chart... 128 bit

One Texture Unit
By ira176 on 5/7/2007 2:30:53 AM , Rating: 2
Will one texture unit be enough for ATI's new mainstream parts, considering my X1950 Pro has 12 texture units, 12 pixel pipelines, 36 pixel shaders and 8 vertex shaders? Will the 2 SIMD units perform a greater function for the future of ATI video cards, maybe replacing the need for a higher number of vertex shaders?

RE: One Texture Unit
By Griswold on 5/7/2007 4:11:42 AM , Rating: 5
The value line has one texture unit. The mainstream cards have two. At any rate, I don't think you can simply compare the numbers here.

RE: One Texture Unit
By defter on 5/7/2007 5:36:14 AM , Rating: 5
One texture "unit" here means a cluster of 4 units.

The article doesn't mention ROP counts. Both the HD 2400 and HD 2600 have 4 ROPs (one cluster).

release date?
By Decoy26517 on 5/7/2007 11:06:48 AM , Rating: 2
Has there been a release date for these GPUs yet? My next project (aaah, a new computer after 5 years...) will have a budget for the top end, and I'm kinda looking forward to getting one of these new cards.

RE: release date?
By Decoy26517 on 5/7/2007 11:10:20 AM , Rating: 2
Guess I should have been more precise as to which card: is there a release date for the X2xxx series yet?

RE: release date?
By jkresh on 5/7/2007 2:59:02 PM , Rating: 2
Supposedly the NDA on the 2900s is up on May 14th at least, and retail availability should be around then. I don't know about the 2600s/2400s...

RE: release date?
By jazkat on 5/7/2007 5:46:08 PM , Rating: 2
I wouldn't bother listening to any information put on this site; they clearly, constantly lie - like with the supposed 2900 XTX card, which AMD hasn't even handed out. That was a 2900 XT OEM card; the XTX comes later, and it will have 1GB of GDDR4.
It will also be 65nm and have lower power usage. Let me just say there is only an XT for now, and that's because ATI has something a lot quicker in the pipeline.

RE: release date?
By Lakku on 5/7/2007 8:31:53 PM , Rating: 2
Quit assuming rumors without evidence will be true. Also, unless you work for DT, you don't have ANY clue what card they had. DT may be right or wrong, it's a RUMOR. Just like all the information you just presented is RUMOR. However, there are far more websites with 'insider' information 'confirming' all the negative news out of DT, and many saying the entire R600 series is too little too late. We will just have to wait and see.

For the moment
By crystal clear on 5/7/2007 6:00:40 AM , Rating: 1
Intel maintained lead in 1Q07 desktop IGP market, says Jon Peddie Research

Further figures from Jon Peddie Research's (JPR) recent report highlight that Intel's desktop integrated graphics processor (IGP) market share increased by 1.6% in the first quarter of 2007 compared to a quarter earlier, maintaining the company's lead in the desktop market. AMD's market share remained flat during the period, while Nvidia's fell 2.3%.

SiS saw its market share increase 0.1% in the first quarter of 2007, while rival VIA saw its share increase 0.7% during the period.

RE: For the moment
By crystal clear on 5/7/2007 6:19:00 AM , Rating: 1
Nvidia surpassed Intel to take first place in terms of desktop graphics market share for the first quarter of 2007, according to estimates by Jon Peddie Research (JPR).

Intel led the mobile graphics market with a 55% share (up from 50%), AMD lost 0.2% to a 23.2% share, and Nvidia slipped to a 20% (from 22.9%) market share for the quarter. SiS and VIA both saw declines in market share.

Jon Peddie Research Reports First-Quarter PC Graphics Shipments: Nvidia Leaps to First Place in Desktop Graphics Chips, Displacing Intel read press release


Going to WinHEC L.A.? Jon Peddie will lead a panel of experts from AMD, Intel, and Nvidia in a debate over the future directions of CPUs and GPUs, multicore architectures, and application models. Wednesday, May 16 at 3:30. about WinHEC . . .

RE: For the moment
By Dianoda on 5/7/2007 9:51:05 AM , Rating: 5
You must work for Jon Peddie Research. You plug them way too many times to just be referencing them as a source. And what you reference just barely relates to the article.

RE: For the moment
By crystal clear on 5/7/2007 6:07:01 PM , Rating: 2
No, I don't work for them or anybody else - I work for myself.

Nor do I promote anybody.

It's just that I forgot to add the additional information, so I made a second comment.

Anyway, I write this just for clarification purposes.

ATI...
By kdog03 on 5/7/2007 3:16:09 AM , Rating: 2
Are we there yet?

RE: ATI...
By Griswold on 5/7/2007 4:12:12 AM , Rating: 2
No, son.

RE: ATI...
By spartan014 on 5/7/2007 5:54:42 AM , Rating: 2
Are we done yet?


RE: ATI...
By JimFear on 5/7/2007 7:22:43 AM , Rating: 2
Don't make me turn this car around!

RE: ATI...
By GlassHouse69 on 5/7/07, Rating: 0
Most likely disappointing...
By RussianSensation on 5/7/2007 7:32:08 AM , Rating: 2
3x fewer stream processors on a 128-bit memory bus? To be fair, this should cost roughly 2-3x less than the HD 2900 XT. The performance difference between mid-range and high end is becoming so drastic that it practically doesn't make sense to buy mid-range cards this gen; go for last gen's top high-end cards instead, or step up to the 8800 GTS 320 while it is in supply for $260.

Looks like this round NV and ATI have something in common with Monica Lewinsky - they both blew the big one!

RE: Most likely disappointing...
By Lightning III on 5/7/2007 10:13:27 AM , Rating: 2
I'll wait for the review of the new universal hardware video decoder. Supposedly it's built into the new HD 2000 cards, and it's looking like the HD 2900 XT will be the best gaming-plus-video-quality combo - all my cards eventually move from the gaming rig to my HTPC anyway.

Ask anybody in the HTPC world: PureVideo is pure crud in comparison. The addition of H.264-only decoding on their midrange cards is the first time I've even seen them acknowledge the market.

RE: Most likely disappointing...
By Lakku on 5/7/2007 2:02:51 PM , Rating: 2
Quit spreading FUD. Anyone who knows, knows that PureVideo was updated just a few months back, around the time the 8800 was introduced. Quit laying down outright lies; in fact, the updated PureVideo HD scores as good as or better than ATI in the HQV benchmark (both at 123, depending on the ATI card used).

RE: Most likely disappointing...
By mars777 on 5/7/2007 10:58:31 PM , Rating: 2
I think he was referencing video playback quality, not in-game video quality.

But we still have to see UVD in action before making decisions on what is best.

HD2950 XTX
By Ard on 5/8/2007 3:34:32 PM , Rating: 2
Somewhat off topic, but certainly related, is AMD's intention to release the 2950 in Q3, right around the same time the bastard that is the X2900 XTX is released. And the fanboys cried foul on DT's benchmarks. Hah!


Copyright 2016 DailyTech LLC.