
2.0 GHz memory frequencies? No problem.

Anh beat me to R600 benchmarks by a mere few hours -- when you snooze, you lose at DailyTech. Needless to say, I feel somewhat compelled to add my benchmarks to the mix as well.

The system I'm using is an Intel Core 2 Extreme QX6800, ASUS P5N32SLI-E with 2x2GB DDR2-800.  My tests this morning used the Catalyst 8.361 RC4 driver.  The card used was a Radeon HD 2900 XT 512MB.

  Core Clock    Memory Clock
  745 MHz       800 MHz
  800 MHz       900 MHz
  845 MHz       950 MHz
  845 MHz       995 MHz

Like Anh, I was able to get pretty close to a 2.0GHz memory clock while still keeping the system stable.  For reference, my GeForce 8800 GTX (core clock at 650 MHz, memory at 2.0 GHz) scores 14128 (1280x1024) and 11867 (1600x1200) on the same system with ForceWare 158.19. 
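For anyone wondering how a 995 MHz memory clock gets "pretty close to 2.0 GHz": GDDR3 is double data rate, transferring on both clock edges, so the effective data rate is twice the clock. A minimal sketch of the arithmetic (the 512-bit bus width used here is the HD 2900 XT's published spec, included as an assumption for illustration):

```python
# Effective DDR data rate and theoretical peak memory bandwidth.
def effective_rate_mhz(memory_clock_mhz: float) -> float:
    """GDDR3 is double data rate: two transfers per clock cycle."""
    return memory_clock_mhz * 2

def bandwidth_gb_s(memory_clock_mhz: float, bus_width_bits: int = 512) -> float:
    """Theoretical peak bandwidth in GB/s (1 GB = 10**9 bytes).

    Assumes the HD 2900 XT's 512-bit memory bus (spec-sheet value,
    not something measured in this article).
    """
    transfers_per_sec = memory_clock_mhz * 1e6 * 2
    return transfers_per_sec * bus_width_bits / 8 / 1e9

print(effective_rate_mhz(995))            # 1990.0 MHz -- just shy of 2.0 GHz
print(round(bandwidth_gb_s(995), 1))      # 127.4 GB/s theoretical peak
```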

I'm currently benchmarking the Radeon HD 2900 XTX, though I'll revisit the XT if anyone has any particular requests.


RE: Grrrrr
By caboosemoose on 4/25/2007 12:56:05 PM , Rating: 0

The point still holds - these numbers are pretty worthless. I have no idea why 3DMark still holds so much currency; it's such a crappy benchmark and really doesn't reflect how most games are coded. And how many people are going to game with an R600 at 12 x 10 with no AA?

Some numbers comparing the XTX to a GTX in various cutting edge games at stock and overclocked speeds at 19 x 12 would be FAR more relevant.

But fine, if you want to know how well an R600 cycles through a synthetic benchmark then all power to you.

I wouldn't object to 3DMark numbers if they came after some real tests. But they'd be the last frigging test I would do.

RE: Grrrrr
By GlassHouse69 on 4/25/2007 10:35:33 PM , Rating: 1
Yeah, synthetic benchmarks are meaningless in a world of actual games.

No point in them. Not knocking the people at anandtech at all! I just think they are a waste of time. People run synthetic tests to judge something they cannot test directly. Games exist, so there's no need for synthetics. Plus, games are the end goal and have uniform settings.

That being said, I have one request:

Show this card on real systems. I don't think people will be buying the chip you used. I know it stops the CPU from bottlenecking the scores, but really, most people on here have a 2.2-2.4GHz Athlon 64, dual core or not even dual core yet. Give us a try on a regular Socket 939 or even AM2 system, I guess. That chip could add 10-15 frames at those resolutions to any video card's results.

It is curious why the testers here haven't gone the full route and tested all the games at various settings... kinda fishy. They could have the best spot on the web for tech geeks. Did someone from ATI ask them not to run high resolutions in normal games until a new driver is out? Hm.

RE: Grrrrr
By Meaker10 on 4/25/2007 11:43:30 PM , Rating: 2
Are you two blind? While the numbers themselves are not so important, the change IS. If a card scores 30% less when you up the resolution, then it likely gets 30% fewer FPS in games when you up the resolution. It really is not hard, is it?
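Meaker10's point -- that the relative drop between resolutions is the useful signal, not the raw score -- can be made concrete with the 8800 GTX numbers quoted in the article (14128 at 1280x1024, 11867 at 1600x1200). A quick sketch with a hypothetical helper:

```python
def pct_drop(score_low_res: int, score_high_res: int) -> float:
    """Percent of performance lost when moving to the higher resolution."""
    return (score_low_res - score_high_res) / score_low_res * 100

# 8800 GTX 3DMark scores from the article:
# 14128 at 1280x1024, 11867 at 1600x1200
print(round(pct_drop(14128, 11867), 1))  # 16.0 -- a ~16% drop
```

On Meaker10's reasoning, that ~16% scaling penalty is a rough proxy for how the card will scale in real games at the same resolutions, even if the absolute 3DMark score predicts nothing by itself.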
