Benchmarks or GTFO!

Yesterday, when Apple unveiled the new iPad, the crew from Cupertino took some time to brag about the new A5X processor in comparison to NVIDIA's Tegra 3. Apple isn't known for publishing benchmarks of its own, so we'll likely have to wait until the new iPad lands in the hands of reviewers and geeks around the web.
 
Apple used the iPad unveiling to boast that the A5X chip inside the new iPad is twice as fast as the A5, and four times more powerful in graphics performance than the Tegra 3.
 
 
NVIDIA isn't buying those claims without proof. The graphics company wants to know how Apple came by those numbers. Ken Brown, a spokesman for NVIDIA, stated, "[It was] certainly flattering" for Apple to compare its newest chip to NVIDIA's part.
 
Brown continued, "We don't have the benchmark information. We have to understand what the application was that was used. Was it one or a variety of applications? What drivers were used? There are so many issues to get into with benchmarks."
 
Anyone who follows tech knows that benchmarks are often handpicked to favor one brand over another when claims like these are made. So it should be interesting to see whether the new iPad's performance lives up to the claims.
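As a toy illustration of how much that handpicking matters (the scores below are hypothetical, not measurements of either chip), the headline "Nx faster" figure depends entirely on which sub-tests get averaged:

# Hypothetical per-test scores for two unnamed chips (higher is better).
scores_a = {"fill_rate": 120.0, "geometry": 40.0, "physics": 30.0}
scores_b = {"fill_rate": 30.0, "geometry": 35.0, "physics": 45.0}

def speedup(tests):
    # Geometric-mean speedup of chip A over chip B across the chosen tests.
    product = 1.0
    for t in tests:
        product *= scores_a[t] / scores_b[t]
    return product ** (1.0 / len(tests))

print(speedup(["fill_rate"]))                         # 4.0   -- A's best case
print(speedup(["fill_rate", "geometry", "physics"]))  # ~1.45 -- broader picture
print(speedup(["physics"]))                           # ~0.67 -- B wins this one

Pick only the first test and chip A is "4x faster"; pick only the last and it loses outright.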

Source: ZDNet



Comments

RE: Apple Picked the Wrong @!**@fight
By theapparition on 3/9/2012 1:28:29 PM , Rating: 2
quote:
You don't understand how GPU's are benchmarked, then.

I understand completely. You don't understand that people don't care about benchmarks on equal footing, only how a device performs in its configured state. Only a complete moron would believe a four-core part of the same series would perform worse than its two-core counterpart at equal settings. I've never said anything to the contrary. The only thing that matters is whether the new part works better at the higher resolution.

What part of that don't you get?

I can't possibly believe you're arguing this. It's pointless, give it up.

Your comment below about testing an engine on equal ground shows your complete lack of knowledge of the real world. The whole package matters, not trying to interpret the sum of its components.

I can see you in a product design meeting right now.

You: "Yes Mr. Product Manager, the overall system is slower, it's quite laggy. But, but.....the GPU is faster. It's twice as fast as the old product tested on the same settings."

PM: "But we don't have the same settings, do we? Didn't you design to the human factors requirements definition?"

You: "But I can product a benchmark that shows it's twice as fast at the old settings"

PM: "And that leaves us with an unusable product"

You: "I don't care, it's twice as fast"

Boss: "Clear out your desk. Security!"

And for the record, I don't trust either Apple's or NVIDIA's claims; both have been wildly inflated in the past. No amount of your doublespeak will justify them.


By testerguy on 3/14/2012 5:07:32 PM , Rating: 2
quote:
I understand completely.


No, you don't. You seem to think that GPU benchmarks are a 'human factors' metric. Guess what: they aren't. They are exactly what they are described as: a way of comparing the relative capabilities of GPUs. No more, no less.

quote:
You don't understand that people don't care about benchmarks on equal footing


My point doesn't depend on what people 'care' about. It depends on the reality of how benchmarks work. Again, you're making a fundamental mistake in not realising what GPU benchmarks are. Your point is essentially that you don't care much about GPU benchmarks; that doesn't change how benchmarks are carried out. Testing at native resolution is another, different metric, equally flawed in its own ways. Benchmarking a GPU with all other variables held constant is not the same thing as testing at native resolution.

quote:
The only thing that matters is if the new part works better at the higher resolution.


Again, you refer to what 'matters' instead of what constitutes a GPU benchmark. You are, once more, confusing your flawed belief that GPU benchmarks don't matter with the idea that the fundamental method of running benchmarks is somehow different.

quote:
Your comment below about testing an engine on equal ground shows your complete lack of knowledge on the real world. The whole package matters, not trying to interpret the sum of it's components.


Actually, your ignorance of what I'm saying shows a complete inability to read. When you benchmark GPUs, you want to do precisely that: benchmark the GPU. Benchmarking a DEVICE, as opposed to a GPU, at its native resolution is a VERY DIFFERENT METRIC, and it is just as flawed. As in the other examples I've given: is 100 FPS at a native resolution of 10 x 10 better than 60 FPS at 2000 x 1000? Of course not. Clearly, then, native FPS isn't the only relevant metric. More to the point, Apple's claim specifically refers to the GPU, not to 'native performance'. So you're wrong on multiple levels.
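To make that concrete, here's a minimal sketch using the hypothetical 10 x 10 and 2000 x 1000 figures above: raw FPS means nothing until you account for the resolution it was measured at, and pixel throughput puts both readings on a common scale.

def pixel_throughput(fps, width, height):
    # Pixels drawn per second at a given frame rate and resolution.
    return fps * width * height

tiny = pixel_throughput(100, 10, 10)      # 10,000 pixels/s at 100 FPS
large = pixel_throughput(60, 2000, 1000)  # 120,000,000 pixels/s at 60 FPS

print(large / tiny)  # 12000.0 -- the "slower" 60 FPS result does 12,000x the work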

Here's a further reason why you're wrong. Just as the iPhone 4 could, the new iPad can render games at the iPad 2's resolution and scale them up, so its 'native' resolution can change depending on what the game designer builds. This gives developers a choice: 4x the resolution, 4x the performance, or some mix of the two. What the GPU benchmarks tell us is that at any given graphics quality, the new iPad's GPU will be up to 4x faster. This is backed up by existing benchmarks showing the iPad 2 already significantly faster than Tegra 3, plus the logical deduction that the new GPU is 2x faster. Again, something we learn through GPU BENCHMARKS. By your flawed method, we would conclude that the GPU in the iPhone 4 is worse than the one in the iPhone 3GS. Clearly nonsense.
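A sketch of that developer trade-off, with assumed figures: the 4x pixel count between 1024 x 768 and 2048 x 1536 is exact, but the 2x GPU speedup and the 30 FPS baseline are illustrative assumptions, not measured numbers.

ipad2_px = 1024 * 768  # 786,432 pixels
new_px = 2048 * 1536   # 3,145,728 pixels -- exactly 4x as many
gpu_gain = 2.0         # assumed raw GPU speedup over the A5 (illustrative)
baseline_fps = 30.0    # assumed frame rate on the iPad 2 (illustrative)

# Option 1: render at the iPad 2 resolution and upscale -> keep the whole speedup.
fps_upscaled = baseline_fps * gpu_gain                      # 60.0 FPS
# Option 2: render at full native resolution -> the speedup is spent on pixels.
fps_native = baseline_fps * gpu_gain / (new_px / ipad2_px)  # 15.0 FPS

print(fps_upscaled, fps_native)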

It's beyond belief that you remain in complete denial about the established and accepted method of benchmarking GPUs; you're absolutely delusional. Your whole argument about the experience of the device as a whole has nothing to do with GPU benchmarks, and confusing the two is the entire foundation of your failure. Your imagined 'conversation' also fails to factor in that you can't simply compare speed, even when looking at the device experience, because the higher resolution can make it worth sacrificing some FPS in certain situations.


"If they're going to pirate somebody, we want it to be us rather than somebody else." -- Microsoft Business Group President Jeff Raikes














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki