


2.0 GHz memory frequencies? No problem.

Anh beat me to R600 benchmarks by a mere few hours -- when you snooze, you lose at DailyTech. Needless to say, I feel somewhat compelled to add my benchmarks to the mix as well. 

The system I'm using is an Intel Core 2 Extreme QX6800, ASUS P5N32SLI-E with 2x2GB DDR2-800.  My tests this morning used the Catalyst 8.361 RC4 driver.  The card used was a Radeon HD 2900 XT 512MB.

Core Clock    Memory Clock    3DMark06 (1280x1024)    3DMark06 (1600x1200)
745 MHz       800 MHz         13547                   10978
800 MHz       900 MHz         13855                   11164
845 MHz       950 MHz         13908                   11219
845 MHz       995 MHz         14005                   11266

Like Anh, I was able to get pretty close to a 2.0GHz memory clock while still keeping the system stable.  For reference, my GeForce 8800 GTX (core clock at 650 MHz, memory at 2.0 GHz) scores 14128 (1280x1024) and 11867 (1600x1200) on the same system with ForceWare 158.19. 
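For the curious, the diminishing returns in the table are easy to quantify. This is just an illustrative calculation over the scores posted above (nothing here beyond the published numbers):

```python
# Illustrative: how much 3DMark06 score each overclock step buys,
# relative to the stock 745/800 MHz run, using the scores posted above.
runs = [
    (745, 800, 13547, 10978),   # stock clocks
    (800, 900, 13855, 11164),
    (845, 950, 13908, 11219),
    (845, 995, 14005, 11266),   # best stable overclock
]

base_core, base_mem, base_1280, base_1600 = runs[0]
for core, mem, s1280, s1600 in runs[1:]:
    core_gain = 100 * (core - base_core) / base_core
    score_gain = 100 * (s1280 - base_1280) / base_1280
    print(f"core +{core_gain:.1f}%, mem {mem} MHz -> 3DMark06 +{score_gain:.1f}% at 1280x1024")
```

The best run pushes the core 13.4% over stock yet gains only about 3.4% in score, which suggests the benchmark is at least partly CPU- or memory-bound at these settings.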

I'm currently benchmarking the Radeon HD 2900 XTX, though I'll revisit the XT if anyone has any particular requests.



so...
By yacoub on 4/24/2007 11:38:27 PM , Rating: 1
So if an 8800GTX with its much lower clockspeed can outperform even the highest overclock of the 2900 XT, I'd much rather take the quieter, cooler-running 8800. Especially after the R600's competition helps drive the prices down a bit.




RE: so...
By CyborgTMT on 4/25/2007 12:12:39 AM , Rating: 2
So if an overclocked 8800GTX can outperform a reference-design OC'd 2900XT that will cost $100-150 less, by a mere 123 and 401 points respectively on beta drivers, will anyone act like a fanboy?... Guess so.

BTW, please inform us of how you know that the 2900's are louder and hotter?

The only thing I agree with is the hope this drops prices so I can upgrade my SLI setup.


RE: so...
By yacoub on 4/25/2007 7:24:50 AM , Rating: 1
I'm sorry your fanboyism got in the way of you understanding what I wrote. I'll try again:

Quote:
"overclocked 2900XT:
845 MHz
995 MHz
14005
11266

overclocked GeForce 8800 GTX (core clock at 650 MHz, memory at 2.0 GHz) scores 14128 (1280x1024) and 11867 (1600x1200) on the same system with ForceWare 158.19. "

And I said in response to those numbers: So a GTX that is clocked lower (650MHz is a lower core clock than 845MHz right?) will run cooler yet perform better. I'd rather have the card that is clocked lower and will last longer and not be as likely to artifact and also uses a quieter cooler.

I'm not sure how any of that is being a fanboy on my part; it's simply a logical statement. But I like that in your fanboy rage you immediately try to label the other person a fanboy for simply posting a logical position. Grow up.


RE: so...
By CrystalBay on 4/25/2007 7:43:11 AM , Rating: 2
why don't you stay in nvidia Yankoff...


RE: so...
By DingieM on 4/25/2007 8:04:40 AM , Rating: 2
Wake up.

In the G80 generation the shaders run at almost double the main clock, approximately 1.35 GHz.
This is where the G80 gets its speed.

All components inside the R6xx run on the main clock.

Conclusion: one cannot compare those two generations of cards on the sole main clock frequency.

Wake up.
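The point above can be made concrete with a back-of-the-envelope sketch. The shader counts below (128 scalar stream processors on G80, 320 stream processors on R600, the latter figure also mentioned later in this thread) are commonly cited specs rather than numbers from the article, so treat this as a rough illustration only:

```python
# Back-of-the-envelope: raw shader issue rate = shader count x shader clock.
# G80 (8800 GTX): 128 scalar stream processors in a separate ~1.35 GHz shader domain.
# R600 (HD 2900 XT): 320 stream processors clocked at the 745 MHz core clock.
g80_rate = 128 * 1.35e9     # unit-cycles per second
r600_rate = 320 * 745e6

print(f"G80:  {g80_rate / 1e9:.1f} billion unit-cycles/s")
print(f"R600: {r600_rate / 1e9:.1f} billion unit-cycles/s")
# Note the units aren't directly comparable (scalar vs. superscalar shaders),
# but either way the core clock alone tells you very little.
```

Both numbers land in the low hundreds of billions despite a 745 MHz vs ~1.35 GHz clock gap, which is exactly why comparing the two cards on main clock frequency alone is meaningless.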


RE: so...
By Sunday Ironfoot on 4/25/2007 9:04:36 AM , Rating: 1
quote:
And I said in response to those numbers: So a GTX that is clocked lower (650MHz is a lower core clock than 845MHz right?) will run cooler yet perform better. I'd rather have the card that is clocked lower and will last longer and not be as likely to artifact and also uses a quieter cooler.


The 8800 is manufactured at 90nm whereas the new 2900XT/XTX cards are rumoured to be manufactured at 65nm, so they will be much smaller. Smaller = less power consumption = less heat = less noise.

Assuming the rumours are true the 2900 series will run cooler than the 8800 series.

Also, the 2900XT is supposed to be compared to the 8800GTS, not the GTX, as that's the price point the 2900XT is aiming for. Wait for XTX benchmarks and see how they compare with the 8800 GTX.


RE: so...
By Lakku on 4/25/2007 11:44:16 AM , Rating: 1
Exactly, everything you just said is based on rumor. It's a fact the 8800GTX draws about 175 to 185 watts at load. With all the rampant speculation on almost every website about the R600 and its power requirements, how can every ATi fanboy ASSUME everything will change next month? This could be the reason for the delay, but no one has disputed, from ATi or otherwise, the documentation FROM ATi stating the R600 will use 240 to 270 watts (the XTX version as far as I know, because if the XT draws this much, you are kidding yourself if you think it will do well in the OEM sector). I don't care how you spin it: that means it WILL run hotter than an 8800 should it actually use that much power.

But since we are talking about rumor here, places are also saying the XTX won't come out until Q3. It doesn't matter if that source is reliable or not, since you people seem to believe everything good you hear but require proof when you hear bad. So if that turns out to be true, that puts ATi about 9 months or more behind nVidia, and by then DX10 games WILL be out. If it is delayed, you can bet the farm it was delayed due to power issues, or them waiting to get it working on 65nm because it was a power hog.

I think everyone should just wait and see, because as it stands these numbers are NOT impressive, especially if the card is going to be drawing that much power and can't muster more out of a frikkin 512-bit bus with "320" stream shaders. The only saving grace is if the rumors turn out to be true that the XT version will be under $400 MSRP, because then it may just be worth it.


RE: so...
By Goty on 4/25/2007 1:28:13 PM , Rating: 2
First of all, consumed power is not directly related to heat output. You have to take efficiency into consideration, first. A chip that consumed 1000W but was 100% efficient would have zero heat output while a chip that consumed 1000W but was only 50% efficient would put out 50W of heat. I agree with your conclusion, but your logic is flawed.

Also, regarding heat output, who cares how hot the card runs? The heat is being exhausted outside the case for the most part, so the temperature of the card has little-to-no bearing on anything.

Regarding the comment about the XTX being delayed by three months, Sven has stated that he has an XTX in-hand, so that's just utter nonsense.


RE: so...
By Schmide on 4/25/2007 2:01:41 PM , Rating: 2
Conservation of energy!!!

Consumed power is directly related to heat output. The chip is basically a very complex variable resistor, and there is nowhere else for the energy to go. If it uses it, it must go somewhere. It goes to HEAT!!! Efficiency is directly related to power usage, which is directly related to heat output.


RE: so...
By Goty on 4/25/2007 6:04:44 PM , Rating: 1
Exactly, if a chip is 100% efficient, then it will use all the power it is supplied with, leaving none left over to be wasted as heat. Semiconductors are inherently resistive, as you stated, and so some is wasted as heat. The type of semiconductor used, as well as its dopants, all affect its current characteristics. Some combinations are more efficient than others due to their relative band gaps, so they will waste less current as heat than other combinations.

Different manufacturing processes also play a part in efficiency because of the fact that the dielectric used in the transistors gets smaller as the size of the individual transistor goes down. The thinner the dielectric, the more current that is allowed to leak out as heat.


RE: so...
By dunno99 on 4/25/2007 8:30:39 PM , Rating: 3
Don't want to start a flame war or anything, but despite you being able to use a bunch of terms from an EE lecture, you're missing the point about where that energy goes. Remember back in physics, a circuit has to use up all the voltage difference between the source and sink (usually ground)?

Since we know that these graphics cards need some finite amount of current (I) at a certain voltage (V), we know that the power it uses is P = IV. And according to the laws of thermodynamics, energy (and power) is conserved.

Therefore, if P = IV watts is used by the graphics card, where does the energy go? Most likely, the card doesn't have a huge battery or capacitor storing up the energy just to mess with you. So everything turns into heat in the end (we're not talking about heat death here, just the usage of the video card). Therefore, if a card consumes 1000W of power, it will output 1000W of heat, no matter what. Efficiency just means how much of that power is used to do something "useful."

As a final note, I'll just use an example. Assume you have an adder circuit that absolutely requires 1V at 1A. The power it uses is 1W. Its efficiency, we would say, is 100%. However, if you put a huge resistor in front of the adder (in series) that is rated at 1 Ohm, then you would need to raise the voltage input to 2V in order for the adder to do its job. Now, we know that of the 2W you put in, only 1W is doing the actual work, and the rest is wasted. In this case, the efficiency is 50% for the entire circuit.


RE: so...
By Schmide on 4/25/2007 8:59:32 PM , Rating: 2
I really suck at this efficiency crap, but wouldn't any semiconductor be considered 0% efficient, because it basically does no work. When you say a light bulb is say 90% efficient, this means it converts 90% of its energy to light and 10% to heat. A semiconductor produces no light or kinetic movement so it is basically 100% inefficient, and contributes to a circuit exactly the same as a resistor.


RE: so...
By sxr7171 on 4/26/2007 1:40:24 AM , Rating: 2
Yes. Everything becomes heat. There is no "work" in the physical sense.


RE: so...
By dunno99 on 4/26/2007 9:57:37 AM , Rating: 2
Just like sxr7171 said, work is really just how you define it. If you wanted a heater, then a video card is pretty much 100% efficient (discounting generated electromagnetic radiation and whatnot).

In the case of a graphics card, we would consider the ideal amount of power to perform computations and information exchange at the specs of the graphics card, and compare that figure to what it's actually using.


RE: so...
By KCvale on 4/26/2007 10:55:07 AM , Rating: 2
An incandescent light bulb is only about 5% efficient (the light); the other 95% of the energy is wasted as heat.
That's why the push for the world to change to the new coiled fluorescent bulbs.
They are over twice as efficient, putting about 12% of the power into light while using less power (a 20W fluorescent puts out as much light as a 75W bulb), and they last 5 times longer.

A semiconductor is an "active" device; the "work" it performs is the millions of logic gates inside a chip being switched on and off millions of times a second and held in a given state (on or off) the entire time.

A resistor is a "static" device that simply restricts the flow of DC current by dissipating it as heat (in series), or redirects the current flow (and voltage) when in parallel.

Nothing involving electricity is 100% efficient, not even wire.
By using better materials and lower voltages, semiconductors are becoming more efficient, but they always produce some waste in the form of heat.


RE: so...
By Howard on 4/25/2007 9:24:53 PM , Rating: 2
Jesus Christ. All the power used by a chip is turned into heat.


RE: so...
By johnsonx on 4/26/2007 5:23:37 PM , Rating: 1
Actually I think the poster's name was 'Goty'. I haven't seen Jesus Christ here in awhile.


RE: so...
By AntDX316 on 5/18/2007 4:10:33 AM , Rating: 2
the ultra is probably secretly 80 or 65nm


RE: so...
By 91TTZ on 4/25/2007 1:07:42 PM , Rating: 1
You can't compare two different chips like that. How do you know it will run cooler? One chip at 1 GHz may run cooler than another chip at 700 MHz. You just can't tell without seeing the results.


RE: so...
By AnotherGuy on 4/25/2007 12:50:37 AM , Rating: 2
Dude, what are you talking about? The GTX at stock is 1800 MHz, not 2000... some companies did 2GHz versions of the GTX, but that's not stock speed... other companies will do the same with the XT and maybe get it close to 2000....

I think you've got it wrong right now, yacoub... you're thinking XT FOR XTX maybe... because at $450 you're getting the performance of the GTX at $550.... I think you should be bashing nVidia and their drivers instead of AMD... I know it's late and all that, but... it doesn't matter... if you waited for something better than the GTX, then get the damn XTX for whatever it costs... and keep quiet... damn whiners :P


RE: so...
By eman 7613 on 4/29/2007 7:29:07 PM , Rating: 2
Cooler? My 8800GTX runs at 73C during gameplay underclocked, and the damn thing makes an annoying whistle whenever it does anything, which drives me nuts.


