
Intel has big ambitions for its low power Oak Trail Atom-based platform, which it says will trash ARM processors in Android performance.  (Source: Intel)

Sadly for Intel, quite the opposite proved true in early benchmarks. ARM badly beat an Oak Trail prototype in app performance and heat.

ASUSTek's Eee Transformer Pad, powered by NVIDIA's dual-core Tegra 2 ARM CPU, proved the most powerful tablet in most benchmarks.  (Source: Android In)
The only benchmark in which Intel's new platform performed admirably was JavaScript performance.

Intel Corp. (INTC) was quick to brag about dramatic process improvements that would propel its Atom chips to new levels of performance during its keynote at Computex 2011 in Taiwan.  The company says it will leverage its die-shrink lead on Core-brand CPUs to push yearly die shrinks for Atom over the next couple of years, hitting the 14 nm node a couple of years before ARM manufacturers.  And it says it will deploy its new tri-gate transistors at Atom's 22 nm node in 2013.

I. Intel Oak Trail Gets Tested

By the looks of early testing, Intel desperately needs all the help it can get.  A dual-core Z6xx-series Atom chip running on the company's new Oak Trail chipset was shown off in a prototype design by Taiwan's Compal Electronics.

The prototype packed two CPU cores running at 1.5 GHz, along with an Intel GMA600 GPU, which is essentially a rebranded PowerVR SGX535.

The new tablet was running Google Inc.'s (GOOG) popular Android 3.0 "Honeycomb" operating system, the second most used tablet OS in the world behind Apple, Inc.'s (AAPL) iOS (found on the iPad and iPad 2).

In a limited set of tests, a Dutch hardware site benchmarked [translated] the new platform and compared it to rivals currently on the market with similarly clocked dual-core CPUs.  The picture wasn't pretty for Intel.

II. Slow

In the CaffeineMark 3 benchmark, the Oak Trail prototype scored a dismal 1562 points, well behind the Asus Eee Transformer Pad (Tegra 2 based; 6246 points) and the Samsung Galaxy Tab 10.1v (Hummingbird Gen. 2; 7194 points).  This is significant because CaffeineMark measures Java performance -- the language most Android apps are written in.  As such, the benchmark provides a key indicator of how fast apps will run on the tablet -- in Intel's case, "very slow".

That result was confirmed by the Linpack benchmark, which gave a result of 9.4 MFLOPS, versus 36 MFLOPS for the Tegra 2.  Similarly, the Quadrant benchmark gave a score of 1978, just below the 2,000 to 2,500 range that Android tablets regularly score.  Some Android phones even score 2,000+.
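For context, a Linpack-style MFLOPS score simply reports how many millions of floating-point operations per second a solver sustains. A minimal sketch of that arithmetic using the scores above (the matrix size and timings are illustrative, not from the benchmark itself):

```python
# Back-of-envelope MFLOPS arithmetic, as reported by Linpack-style
# benchmarks. The matrix size n is a hypothetical example.
def mflops(flop_count, seconds):
    """Millions of floating-point operations per second."""
    return flop_count / seconds / 1e6

# An n x n LU solve performs roughly (2/3) * n**3 floating-point ops.
n = 500
flops = (2 / 3) * n ** 3

oak_trail_time = flops / 9.4e6   # seconds implied by a 9.4 MFLOPS score
tegra2_time = flops / 36e6       # seconds implied by a 36 MFLOPS score

print(round(mflops(flops, oak_trail_time), 1))  # 9.4
print(round(oak_trail_time / tegra2_time, 1))   # 3.8 -- Oak Trail ~3.8x slower
```

In other words, the reported scores imply the Oak Trail prototype needs nearly four times as long as the Tegra 2 for the same floating-point workload.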

While these numbers aren't necessarily a bad thing for all apps (some of which are less demanding), it may mean that on Intel-based Android tablets you'll have to forgo highly demanding apps like the early crop of 3D shooter titles.

The Oak Trail tablet did show some promise, posting the best score (1500 ms; lower is better) in the SunSpider benchmark, a full 376 ms faster than the fastest ARM-based Android tablet, the Asus Eee Transformer Pad.  In other words, while Intel's platform may come up short in apps, it looks like it will handle the internet pretty well.

III. Hot

Unfortunately, two critical performance measures -- Flash performance and battery life -- were not tested.

The site did evaluate Oak Trail's temperature performance, writing [translated]:

The settings menu of the x86 port also showed how hot the Intel CPU in the tablet was running. In this model it ranged between 60 and 65 degrees [Celsius], and that was quite noticeable. The tablet felt warm on the outside, much warmer than previous Honeycomb tablets we owned.

Unfortunately the site did not produce any quantitative numbers to back its claims about case temperature.  However, if the CPU is truly reaching 140-149 °F (60-65 °C), that's a major issue as, at that temperature, heat conduction could make holding the case very uncomfortable (particularly given the tight casing in modern ultra-slender tablets).
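The Fahrenheit range quoted above is the straight conversion of the site's reported 60-65 °C readings; a quick sketch of the arithmetic:

```python
def c_to_f(celsius):
    """Standard Celsius-to-Fahrenheit conversion."""
    return celsius * 9 / 5 + 32

# The site's reported 60-65 degrees C CPU range, in Fahrenheit.
print(c_to_f(60))  # 140.0
print(c_to_f(65))  # 149.0
```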

IV. Hope for Intel?

There's hope on both the performance and temperature front for Intel.  It's thought that a major part of the gap in app performance may be due to optimizations in Android for the ARM architecture.  If Intel pushes hard enough, it may be able to get similar optimizations for x86 worked in.

The temperature is intimately tied to usage and clock speed, so there's no guaranteed way to escape it during times of heavy use.  However, Intel could always address the problem by putting a small fan in its tablets.  While that would produce a fatter, less sleek tablet, it would at least spare the user from discomfort.

And in the long term, the die shrink in Q4 2011 to 32 nm should reduce chip temperatures.

The early numbers do indicate, though, that Oak Trail and Atom-powered Android is a work in progress -- a picture that stands in sharp contrast to Intel's promise that Oak Trail would trash ARM designs in performance.  Once we get numbers on battery life we should be able to see exactly how far behind the platform is.

The Tegra 2 is a dual-core ARM processor from American chipmaker NVIDIA Corp. (NVDA).  In typical builds the processors are overclocked to around 1.5 GHz.

Comments


CPU != Tablet temperature
By MrTeal on 6/3/11, Rating: 0
RE: CPU != Tablet temperature
By JasonMick on 6/3/2011 2:52:34 PM , Rating: 2
Just because the CPU reached 65 degrees, doesn't mean the tablet will. There's going to be a pretty significant difference in temperature between the outside of the tablet and the CPU.


But if the temp is that high, the plastic is still going to get pretty hot, as heat conducts outward through the casing... hence the "become very uncomfortable" part. Of course that's only a qualitative statement, outdoors in the winter it might feel nice at least...

And if they add a fan that could significantly cut the temperature of the case.

RE: CPU != Tablet temperature
By hyvonen on 6/3/2011 3:01:27 PM , Rating: 1
This depends on too many things (like the size of the chip, or even the size of the local hot spot on the chip from which the measurement was taken).

Making a conclusion that because the CPU measurement showed 60C the tablet itself must be hot is just bad physics.

RE: CPU != Tablet temperature
By Samus on 6/3/2011 3:53:52 PM , Rating: 2
I know this is prototype hardware and early silicon... but OUCH.

Better yields will allow for higher clockspeed and lower voltages, slightly increasing performance and decreasing heat output, but realistically, x86 can not compete on efficiency.

RE: CPU != Tablet temperature
By encia on 6/3/2011 7:15:12 PM , Rating: 3
X86 can compete on efficiency i.e. AMD Z-01 APU.

RE: CPU != Tablet temperature
By Samus on 6/3/2011 7:56:28 PM , Rating: 2
I don't completely disagree. I love my HP DM1z (aside from the design flaws) but RISC is inherently superior to x86 in virtually every way, simply because it is modern by allowing software to completely reprogram how hardware compiles data. x86 is a fixed instruction set with various programmable extensions that try to make it modern.

At the end of the day, x86 is three decades old, and RISC will never show its age as it is allowed to evolve around hardware, not revolve around hardware.

ARM is the future. x86 is the past. The only reason we still rely on x86 so much is because of Intel and Microsoft, and to some extent, AMD. But Microsoft is changing the game with Windows 8. They tried to break their x86 roots with Windows NT nearly TWO decades ago, but the time wasn't right. Now it is.

RE: CPU != Tablet temperature
By phantom505 on 6/3/2011 8:44:47 PM , Rating: 2
The return of the Itanic?

RE: CPU != Tablet temperature
By SPOOFE on 6/4/2011 1:24:21 AM , Rating: 2
Not all RISC instructions are created equal. RISC refers more to one class or category of architecture, whereas x86 is a specific instruction set of a CISC architecture (and a lot of Intel's x86 extensions are themselves like RISC sub-processors).

RE: CPU != Tablet temperature
By FauxNews on 6/3/2011 9:23:14 PM , Rating: 2
RISC is inherently superior to x86 in virtually every way, simply because it is modern by allowing software to completely reprogram how hardware compiles data. x86 is a fixed instruction set with various programmable extensions that try to make it modern.

You have absolutely no idea what you're talking about.

x86 has turned its disadvantages into massive advantages, which has allowed it to prevail over most other RISC architectures despite their "superiority".

For example, while people were bragging about PowerPC's 32 registers and how x86 was inferior with its 8 registers, x86 turned around and came out with CPUs with hundreds of internal registers.
Suddenly it took a major disadvantage and turned it into a major advantage.

Everyone who has claimed x86 was "dead" and "inferior" has inevitably ended up eating crow as it quickly eclipsed all of its competitors.

RE: CPU != Tablet temperature
By Samus on 6/4/2011 2:58:28 AM , Rating: 3
Interesting...ranting about integer registers makes me think you have no idea what you're talking about.

Unless you're an electrical engineer like myself, I'm wasting my time going into detail because you'd have no idea what I'm talking about, which is why I was as blunt as possible with my explanation of RISC and x86 instruction sets. You pretty much agreed with my comment indirectly when you stated x86 has evolved around extensions that are RISC in nature.

If you can't beat 'em, join 'em? Maybe that's why x86 processors have had 11 extensions, from additional SIMD registers to separate floating point instructions (MMX forward) to 64-bit memory addressing. It's also worth mentioning x86 CPUs didn't even have a dedicated math co-processor until the 386. Motorola and Texas Instruments had integrated math coprocessors in their chips years before Intel, even in their consumer products.

Intel has superior manufacturing processes and that is the ONLY thing that kept them in the game for so long. If ARM had the support and manufacturing ability, we'd all be running CELL-style architecture in our desktops and Intel would be in the tenths of percentage in performance. Now that Microsoft isn't going to carry Intel's deadbeat red headed stepchild into the future, the superior architecture has a chance.

Keeping x86 alive is like refusing to replace your 1979 car. You can upgrade it all you want, but in the end, all you've done is molested a car that is still 1979 technology at its foundation. It cannot be made better than a modern car using modern technology, but there will always be the old dogs who hold onto their old crap because they refuse to change.

RE: CPU != Tablet temperature
By k20boy on 6/4/2011 1:23:24 PM , Rating: 2
Glad you have to qualify your title or educational background. Being an EE doesn't mean you are a specialist in computer architecture, nor does it mean that just because you took a computer architecture or microprocessor design course n years ago, you work in the industry and understand how the game works in practice. So, unless you are a CPU design engineer, I am skeptical of what you have to say.

Of course, we all learn the textbook beauty of RISC design. Intel has proven that x86 has extreme legs, however. This extends beyond its extreme manufacturing prowess, and I would argue that extensions to the x86 instruction set and some of the particular implementations in the latest Intel microprocessor designs have shown that Intel can make incredible design innovations despite their inherent CISC architecture. I have heard numbers in the low single-digit percentages for the hit that Intel takes on x86 decode logic. With this information, it seems that with our billion-plus transistor CPUs, the CISC vs. RISC debate would be over. The particular implementation is much more important than the particular instruction set.

If you are arguing the scalability of the x86 instruction set, you are dead wrong: just look at the server market (where power is a concern) and how most of the high-end RISC machines cannot compete from a pure performance perspective or even performance/watt. Just in case you needed qualification of my credentials: recent EE/Phys graduate and pursuing an MSEE. Have a good day.

RE: CPU != Tablet temperature
By Samus on 6/4/2011 1:59:57 PM , Rating: 2
Yet, the worlds top 10 super computers all use RISC...

Yea, x86 is just a killer server chip.

Listen, the only reason people use x86 is because they are forced to. If you had a version of Windows compiled and optimized for RISC, much like the current version of Windows compiled and optimized for x86, I can guarantee that at every performance/watt level, the RISC version would be superior in EVERYTHING but encoding/decoding, as Intel's branch prediction units are far superior to everyone else's, even AMD's. This has nothing to do with x86; it has to do with Intel's engineering and R&D budget.

I can't believe you are actually disagreeing RISC is superior to CISC. It boggles my mind.

RE: CPU != Tablet temperature
By k20boy on 6/4/2011 3:08:59 PM , Rating: 2
You are exactly right. Intel's R&D has made the RISC vs. CISC debate extinct. Their design decisions, extensions to the x86 instruction set and superior process node have more than made up for any inherent deficiencies in the CISC model.

You said:
Yet, the worlds top 10 super computers all use RISC...

This may be true of single monolithic systems, but that is not the way supercomputers are built today. Most use some sort of clustering. Also, I said server, not supercomputer; there is a large difference. Just look at any of the articles on Anandtech looking at server performance and you will see that x86 is king. Also, if I was talking about clusters or supercomputers I would point you to the Top 500 list of supercomputers running the High Performance Linpack: notice how most of the systems use x86 CPUs and usually use GPUs as well.

Yes, THEORETICALLY, RISC is superior to CISC. Intel, however, has made this theoretical argument unimportant in practical implementations. Obviously, if one could design from the ground up and not worry about legacy software support, RISC would be the way to go (actually probably something like EPIC would be even better) and Intel would still be able to make further inroads than they have today. This is just not the way the world works and Intel has designed itself out of its problem.

RE: CPU != Tablet temperature
By Targon on 6/4/2011 4:48:05 PM , Rating: 2
It isn't just Intel; the real key is in the overall system architecture, not just CPU design. As system complexity increases, the value of CISC increases as well, while code at a very low level will favor RISC. Think about that for a moment. Yes, there is an increased need for code optimizations in the compilers with CISC, but when a single instruction will do EVERYTHING you need and behind the scenes is broken down into very neat RISC-like micro-ops, that eliminates much of the debate about what is better.

While RISC does have the POTENTIAL to be faster, the increased code design effort generally will mean you never realize that potential.

RE: CPU != Tablet temperature
By harshbarj on 6/4/2011 2:44:26 PM , Rating: 3
It's also worth mentioning x86 CPUs didn't even have a dedicated math co-processor until the 386.

Not true at all. I consider myself an expert on vintage Intel CPU history, and that statement is flat-out incorrect. Intel has had dedicated math co-processors from the very first x86 CPU. Even the IBM PC 5150 (introduced in 1981) had both an 8088 CPU and an 8087 math co-processor slot.

Now if you were talking about an 'integrated' co-processor, you're still incorrect. The first x86 CPU from Intel to integrate the math co-processor was the 486DX line (initially just called the 486; the DX was added with the introduction of the 486SX to differentiate the two products). Intel later produced the 486SX, which lacked a math co-processor but was otherwise identical to the DX chip. ALL 386 processors had a separate co-processor. The 386SX was a 32-bit internal and 16-bit external chip (limiting addressing to 16 MB), while the 386DX, 486SX, and 486DX were all fully 32-bit.

Lastly, I would NOT want to run an ARM processor on a desktop. While okay for cellphones and tablets, they are just too slow for a desktop. Just try to encode a lengthy video on an ARM CPU or render a complex 3D animation. It can be done, if you have some time to kill.

RE: CPU != Tablet temperature
By SPOOFE on 6/4/2011 3:13:13 PM , Rating: 2
It's also worth mentioning x86 CPUs didn't even have a dedicated math co-processor until the 386

Only if your argument is "at one time, RISC had a superiority over CISC", but that's not your argument. Your argument is present tense. The 386 is nowhere near "present", and in CPU terms is millions of years old. You might as well claim humans are inferior to fish because at one time humans didn't exist.

Intel has superior manufacturing processes and that is the ONLY thing that kept them in the game for so long

That's why AMD disappeared in the 90s, right? Right? Oh wait...

Go back to electrical engineering.

RE: CPU != Tablet temperature
By dotpoz on 6/6/2011 4:14:23 AM , Rating: 2
I agree. We are forced to keep an obsolete architecture with VARIABLE INSTRUCTION LENGTH just for compatibility issues.

RE: CPU != Tablet temperature
By encia on 6/3/2011 10:35:37 PM , Rating: 2
My ACER Iconia W500 tablet scores 1010.6 ms in SunSpider 0.9.1 i.e. beating both Oak Trail tablet(1500 ms) and ASUS Eee Transformer Pad (1876 ms).

RE: CPU != Tablet temperature
By Alexvrb on 6/4/2011 11:30:18 PM , Rating: 2
The Bobcat cores are quick, and the GPU clinches the deal. The upcoming Z series looks to improve it even further.

RE: CPU != Tablet temperature
By encia on 6/3/2011 11:21:09 PM , Rating: 2
RE: CPU != Tablet temperature
By encia on 6/4/2011 1:14:01 AM , Rating: 4

Notice how small AMD's x86 decoders are, i.e. 1 to 2 percent of the die size.

Modern x86 CPUs have assimilated two key RISC principles, i.e. translating variable-length instructions into fixed-length micro-ops, and single-cycle instruction throughput.

RE: CPU != Tablet temperature
By Wolfpup on 6/5/2011 4:19:01 AM , Rating: 1
"RISC" and "CISC" don't mean much anymore, and haven't for ages. So called "RISC" chips often have larger more complex instruction sets than older CISC chips. And most x86 chips haven't actually been CISC since the Pentium Pro.

simply because it is modern by allowing software to completely reprogram how hardware compiles data. x86 is a fixed instruction set with various programmable extensions that try to make it modern.

Doesn't even mean anything. They both have "fixed instruction sets".

The only thing I'm interested in here is why an equivalently clocked first-gen Atom is being outperformed by a Cortex-A9. I know the first-gen Atom is in-order, but still, wasn't it supposed to outperform the Cortex-A9 severely, even aside from clock speed advantages?

When power doesn't matter, everyone's picking first gen Atom over ARM...which would back up that idea.

Hence I'm wondering if there's something else going on here -- like an early software build that's not optimized well for Atom, or an extra layer of emulation, or something.

RE: CPU != Tablet temperature
By bitterman0 on 6/3/2011 3:03:49 PM , Rating: 4
Of course that's only a qualitative statement, outdoors in the winter it might feel nice at least...

Or you can dunk the contraption into a liquid nitrogen tank every few minutes and not only it will stay nice and frosty, but it also will achieve super-conductivity(*) and finally reach the performance figures Intel was boasting about(*).


RE: CPU != Tablet temperature
By MrTeal on 6/3/2011 3:16:37 PM , Rating: 3
Just wanted to make sure you were aware. This sentence
However, if the tablet is truly reach temperatures of 140-149 °F, that's a major issue as, at that temperature, holding it could become very uncomfortable.

seems to imply that the tablet itself is that hot.

RE: CPU != Tablet temperature
By mcnabney on 6/3/2011 4:55:30 PM , Rating: 2
That means the tablet itself can cook pork to a safe internal temperature (in several hours).

Also, you can actually cook an egg with a surface temperature of 158.

RE: CPU != Tablet temperature
By JasonMick on 6/3/2011 6:06:42 PM , Rating: 2
seems to imply that the tablet itself is that hot.

Ah, I see what you meant... I changed the text slightly to...
However, if the CPU is truly reaching 140-149 °F, that's a major issue as, at that temperature, heat conduction could make holding the case very uncomfortable (particularly given the tight casing in modern ultra-slender tablets).

My point was that tablets tend to be very tightly packaged, so the CPU is likely butting against some sort of foil separator, which in turn probably touches the back case. That means wherever the CPU rests on the motherboard will likely become locally quite uncomfortable to the touch.

Didn't mean the whole tablet would heat up to 140 degrees, just meant that the portion of the case over the CPU could be quite hot...

RE: CPU != Tablet temperature
By Colin1497 on 6/3/2011 8:01:57 PM , Rating: 2
It might also mean that the heat sinking is poor and the chip isn't cooling well in the prototype, or that they located their temperature sensing diode in the hottest part of a small die. I'm not a huge INTC fan, but temperature doesn't equal power.

Also, the low Java performance seen may just be poorly optimized code in the x86 Honeycomb, since it hasn't been the priority. The chip might suck too, but I'd guess not as much as the article suggests, especially considering the web browser was actually faster.

RE: CPU != Tablet temperature
By Targon on 6/4/2011 4:55:38 PM , Rating: 2
The Intel Atom processors suck, and that is where the real problem comes from. If you have kept up, you will notice that the new chips from AMD are doing quite a bit better in the netbook market than the Atom, and AMD JUST got into that market. Tablets are another story, but if the performance isn't great, and people are seeing high temperatures, that is just bad news for whoever manufactures these Atom based tablets.

RE: CPU != Tablet temperature
By messele on 6/3/2011 2:59:58 PM , Rating: 1
Exactly. A 40 degree delta when measured on a 64mm^2 die becomes significantly diluted when spread over a 30,000mm^2 aluminium plate as found on the rear of a tablet (both figures are roughly approximated). It becomes approx. 0.085 of a degree, something like that? Assuming a 100% even spread of heat of course...

What's really important is battery life, the issues of heat management are relatively easily solved.
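That dilution estimate is just an area-weighted scaling of the die's temperature delta; a sketch of the back-of-envelope arithmetic, using the same uniform-spread simplification the comment assumes:

```python
# Back-of-envelope check of the heat-dilution figure from the comment.
# Assumes (unrealistically, as the commenter concedes) that the die's
# temperature delta spreads perfectly evenly over the plate area.
delta_c = 40.0        # temperature delta at the die, degrees C
die_area = 64.0       # mm^2, roughly approximated
plate_area = 30000.0  # mm^2, rear aluminium plate, roughly approximated

diluted = delta_c * die_area / plate_area
print(round(diluted, 3))  # 0.085
```

As the replies point out, real heat flow is nothing like a uniform spread -- the case spot directly over the CPU gets far hotter than this average suggests.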

RE: CPU != Tablet temperature
By nafhan on 6/3/2011 3:00:54 PM , Rating: 3
For most electronics, high temp = high power usage = poor battery life.

You're correct in that it's not necessarily the CPU (it may be some other component or combination of components). Still... not good.

What else is new?
By bitterman0 on 6/3/2011 2:45:56 PM , Rating: 2
Yet another Atom flop. Come on, out with the die shrink already!

RE: What else is new?
By tviceman on 6/3/2011 3:36:29 PM , Rating: 2
By the time the die shrink hits Nvidia will be hitting back with Tegra 3 & 4. Intel is going to be playing catch up, and Nvidia will always have the upper hand with better graphical capabilities.

RE: What else is new?
By MonkeyPaw on 6/3/2011 5:53:32 PM , Rating: 2
The funny part is, Tegra 2's GPU is relatively slow. Future versions will only embarrass Atom. I'm sorry, but I think Atom is not going to make it, as Intel is NEEDING process shrinks to even catch up. ARM is just designed for low power; x86 isn't. Intel will die trying, but I think it will be another Itanium moment: billions in effort for decades, insufficient adoption.

RE: What else is new?
By SPOOFE on 6/4/2011 1:56:26 AM , Rating: 2
Intel will die trying

Nah, these are itty-bitty chips that Intel is probably funding with change found in their sofa cushions.

RE: What else is new?
By Belard on 6/4/2011 12:37:27 AM , Rating: 2
If someone wants to build an x86 tablet, using an Atom or Tegra2 won't cut it compared to the current AMD Fusion which is (A) faster (B) cheaper (C) less heat (D) less power.

RE: What else is new?
By Khato on 6/4/2011 1:01:57 AM , Rating: 1
Much as I can understand why so many enjoy entertaining such delusions... While it's quite true that the AMD Fusion offerings are faster than the currently available Atoms, they by no means consume less power/generate less heat. Sorry, but it just isn't true. Laptops based on the C-50 idle anywhere from 5 to 10 watts, with full load consumption typically being around 20 watts. The typical E-350 implementation meanwhile ends up at around 10 watts idle and just shy of 30 watts at load. N550 based laptops are in the 4 to 8 watt idle range and 15 watt load... and who knows exactly what Oak Trail based netbooks will be like for power consumption/performance?

RE: What else is new?
By encia on 6/4/2011 1:24:41 AM , Rating: 2
AMD Z-01 APU says Hi.

My AMD C-50 (with the Z-01's 276 MHz GPU clockspeed) is nowhere near 20 watts at full load. I used a Belkin power meter.

RE: What else is new?
By Khato on 6/4/2011 1:37:02 AM , Rating: 1
Let's see, do I want to believe you, or what I've found to be the best laptop review site available? Not saying that the C-50 is bad or power hungry, just that it is higher power consumption than atom. (C-50 still bests it slightly in single-threaded performance, and easily in graphics/media.)

C-50 based Acer Iconia Tab W500 -

Atom N550 based Acer Aspire One D255 -

Oh, and keep in mind that the 'Z' series Atoms are far lower power consumption than the N550. More than easily keeping pace with the amusing Z-01 (which, by the way, is still crippled with the power-sucking M1 FCH last I heard.)

RE: What else is new?
By encia on 6/4/2011 1:57:13 AM , Rating: 2
Atom N550 based Acer Aspire One D255 uses a screen with 1024 x 600 pixels, while Acer Iconia W500 uses 1200 x 800 pixels.

Acer Iconia W500 was benchmarked with docking Keyboard+USB Hub+Pointer.

RE: What else is new?
By Khato on 6/4/2011 2:14:16 AM , Rating: 1
Your point being? Yes, when not under full load (aka, not everything connected and turned on) the power consumption goes down. Yes, there are other differences between the two machines other than the processor used. Yes, you can find both C-50 based and N550 based products that use more power...

But in terms of actual power consumption, the N550+NM10 combination is less than the C-50+M1.

RE: What else is new?
By encia on 6/4/2011 2:49:26 AM , Rating: 2
ACER W500 includes a touch screen while Acer Aspire One D255 does not.

RE: What else is new?
By encia on 6/4/2011 3:07:00 AM , Rating: 2
Against Notebookcheck's "It is no longer really enough for games" claims

Crysis 2 PC on AMD C-50 APU

RE: What else is new?
By Shadowmaster625 on 6/6/2011 9:39:29 AM , Rating: 2
18 watts under a furmark and prime95 is nearly meaningless. That has to be above any typical usage scenario, including the vast majority of gaming scenarios. When you have 80 SPs, there is no way around using a lot of power if you really want to.

RE: What else is new?
By dagamer34 on 6/6/2011 9:03:02 AM , Rating: 2
The ratings seen on CPUs are not for power draw, but heat dissipation. Do remember that.

RE: What else is new?
By encia on 6/4/2011 1:34:22 AM , Rating: 1
AMD Z-01 APU (renamed C-50 with 276Mhz GPU) has a max TDP of 5.9 watts. Normal AMD C-50 has 280Mhz GPU clockspeed.

AMD G-T40E APU has a max TDP of 6.4 watts.

Both the AMD Z-01 and AMD G-T40E APUs include dual cores at 1 GHz and a Radeon HD 6250M, i.e. the same as a normal AMD C-50.

Intel Atom Processor N550 has a max TDP of 8.4 watts.

RE: What else is new?
By Khato on 6/4/2011 1:52:46 AM , Rating: 1
And without a controller hub, the only I/O of either of those chips is memory, display, and one PCI-E x4 connection. As well, I've not read anything stating that the processor can actually operate without a controller hub... In which case you must add the 2.7-4.7W TDP of the Hudson M1 FCH to those figures. (TDP of the FCH is according to this chart - )

By comparison, that N550 (or the N570) has a TDP of 8.5W and gets coupled to the 2.1W TDP NM10. Or you could go with the Z670 with its 3W TDP, adding another 0.75 watts to the TDP for its SM35 chipset.

3.75W for the lowest power atom configuration compared to 8.6W for the Z-01.
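The platform totals in that comparison are simple CPU-plus-chipset TDP sums using the wattages quoted in the thread (and, as another commenter notes, TDP is a heat-dissipation rating, not measured power draw):

```python
# CPU + chipset TDP sums (watts), using the figures quoted in the thread.
# TDP bounds heat dissipation; it is not a measurement of real power draw.
platforms = {
    "Atom Z670 + SM35": 3.0 + 0.75,              # lowest-power Atom config
    "Atom N550 + NM10": 8.5 + 2.1,               # netbook-class Atom config
    "AMD Z-01 + Hudson M1": 5.9 + 2.7,           # low end of the FCH's 2.7-4.7 W range
}
for name, tdp in platforms.items():
    print(f"{name}: {tdp:.2f} W")
```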

RE: What else is new?
By encia on 6/4/2011 2:29:58 AM , Rating: 2
ACER Tablet W500's FCH doesn't match Anandtech's Brazos Hudson FCH feature set.

For example, my FCH has 1 active SATA port instead of 6 active SATA ports.


PS: Doesn't include a tablet-specific FCH.

RE: What else is new?
By encia on 6/4/2011 2:40:53 AM , Rating: 2
The Intel Atom Z670 has 1 CPU core, lacks the x64 ISA, and supports DDR2-800.

RE: What else is new?
By Targon on 6/4/2011 4:59:37 PM , Rating: 2
You forget one critical thing, and that is what the overall system performance is like. Atom based machines feel very slow and feel like cheap crap. While I have not personally tested the new AMD Fusion based machines, what I have read indicates that they have a much better level of performance, and may even be worth using.

If temperatures on the Atom based machine are high, with poor performance, and the Fusion based machines are lower with better performance, which is better for the consumer?

CaffeineMark 3.0 with ACER Iconia W500
By encia on 6/3/2011 7:13:22 PM , Rating: 2
My ACER Iconia W500 tablet (AMD C-50 APU low power edition**@ 1.0Ghz) scores 9206 points(CaffeineMark 3.0).

**Same 276Mhz GPU speed as AMD Z-01 APU. Normal AMD C-50 has 280Mhz GPU speed.

RE: CaffeineMark 3.0 with ACER Iconia W500
By encia on 6/3/2011 10:33:25 PM , Rating: 2
My ACER Iconia W500 tablet scores 1010.6 ms in SunSpider 0.9.1 i.e. beating both Oak Trail tablet(1500 ms) and ASUS Eee Transformer Pad (1876 ms).

RE: CaffeineMark 3.0 with ACER Iconia W500
By Khato on 6/3/2011 11:22:39 PM , Rating: 2
And what OS is your Acer Iconia W500 running? Unless it's Android instead of the stock windows, then those results are not at all comparable to those of the Oak Trail Atom tablet that was tested.

RE: CaffeineMark 3.0 with ACER Iconia W500
By encia on 6/3/2011 11:46:50 PM , Rating: 2
The site was using the Java VM CaffeineMark 3.0 and JScript SunSpider benchmarks. Neither middleware is tied to an OS.

As for Android, I'm waiting for Bluestack i.e. Android runtime for MS Windows.

RE: CaffeineMark 3.0 with ACER Iconia W500
By Khato on 6/4/2011 12:32:27 AM , Rating: 1
No question that the benchmarks themselves aren't tied to the OS. But what do you think translates that non-OS specific code to something that a specific platform can run? That's the component that's causing the Oak Trail Atom tablet to perform abysmally, and as such, unless your tablet is subject to the same translation penalty then its scores are every bit as meaningless as the 75787(146105 embedded) caffeinemark 3.0 score of this sandy bridge computer.

RE: CaffeineMark 3.0 with ACER Iconia W500
By encia on 6/4/2011 12:58:00 AM , Rating: 2
Unlike the ASUS Transformer and ACER Iconia A500 tablets, Intel Sandy Bridge devices don't come in the same form factor, i.e. show me 9-to-10 inch tablets.

Why would one artificially cripple a benchmark?

RE: CaffeineMark 3.0 with ACER Iconia W500
By Khato on 6/4/2011 1:09:35 AM , Rating: 2
It's not an artificial crippling. Benchmarks are dependent upon the software they're run on. If you want to believe that OS implementation doesn't affect benchmark results, then please explain the performance gain going from Android 2.1 to 2.2 -

That JIT compiler that resulted in the huge performance gains doesn't exist in the x86 version of Android, because, ya know, compiling for ARM is different than compiling for x86. And no, they can't just pick up the great JIT compiler used in windows and put it into Android.

RE: CaffeineMark 3.0 with ACER Iconia W500
By encia on 6/4/2011 1:15:35 AM , Rating: 2
The site also benchmarked a non-Android, iOS 4.3 powered tablet, i.e. the Apple iPad 2.

RE: CaffeineMark 3.0 with ACER Iconia W500
By Khato on 6/4/2011 1:26:31 AM , Rating: 2
Which is another product with actual production quality software, so what?

Let's put it this way, what do you think that very same Atom tablet would score if running windows?

RE: CaffeineMark 3.0 with ACER Iconia W500
By encia on 6/4/2011 1:45:55 AM , Rating: 2
It's not my responsibility to post Intel Atom Windows 7 based tablet scores.

I do have my old ASUS Eee PC T101MT tablet with a 1.8 GHz turbo/overclock.

RE: CaffeineMark 3.0 with ACER Iconia W500
By Khato on 6/4/2011 2:04:49 AM , Rating: 2
Nope, it's not. But you're the one that was comparing the performance of a windows tablet to those in the article and claiming a victory for the C-50. As well as the one saying that the benchmarks are not tied to an OS and asking why they'd be artificially crippled.

I'm simply stating the fact that the poor performance of the Oak Trail tablet on Android was purely due to the immature nature of the x86 port. And that both current Windows performance and future Android performance, once the x86 port has been properly tuned, would be roughly equal to the C-50, as is the case in almost all benchmarks to date.

By encia on 6/4/2011 2:13:38 AM , Rating: 2
Go ahead and post Intel Atom Dual Core + Windows 7 scores.

If I double the CaffeineMark 3 score on my old Intel Atom tablet, it still doesn't reach the same 9000+ score.

Both benchmarks have poor multi-threaded CPU support, i.e. 55 percent CPU usage (Task Manager) on my ACER W500.
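The ~55 percent figure above is what a single-threaded benchmark looks like on a dual-core machine: one core is pegged while the other idles. A minimal sketch of the effect (hypothetical workload; Python's `multiprocessing` is used so that two cores can actually be loaded at once):

```python
import time
from multiprocessing import Pool

def burn(n):
    # CPU-bound busy loop standing in for a benchmark kernel.
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    work = [2_000_000, 2_000_000]

    # One process: only one core busy, roughly 50% total CPU on a dual-core.
    t0 = time.perf_counter()
    serial = [burn(n) for n in work]
    t_serial = time.perf_counter() - t0

    # Two worker processes: both cores busy, roughly 100% total CPU,
    # and wall time close to half of the serial run.
    t0 = time.perf_counter()
    with Pool(2) as pool:
        parallel = pool.map(burn, work)
    t_parallel = time.perf_counter() - t0

    print(serial == parallel, t_serial, t_parallel)
```

A benchmark that can't split its work like this leaves half of a dual-core chip's throughput on the table, which is consistent with the Task Manager reading reported above.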

By encia on 6/4/2011 3:13:30 AM , Rating: 2
It's PR damage for Intel, which damages X86 as a whole.



By encia on 6/4/2011 12:16:00 AM , Rating: 2

Running Java bytecode on OpenCL (AMD Radeon HD GPU)

Who is surprised?
By icanhascpu on 6/3/2011 8:36:31 PM , Rating: 2
And better yet, who cares? ATOM is underpowered as hell. WE KNOW THIS.

Most of the people that come to this website know this. We are waiting for the Atom successor that uses the new Intel transistor design. I want to see what those are going to score. I would not be surprised if it doubles the highest marks these tests are seeing now for their lowest-rated CPU.

RE: Who is surprised?
By fteoath64 on 6/4/2011 5:15:32 AM , Rating: 2
We are surprised at the stupidity of this team at Intel. It is clear they are NOT doing any work on the architecture but just relying on die-shrinks to get their power savings.
Little do they realize that the competition is way ahead of them on two fronts. It is just a waste of valuable wafer that could be used for better things. One can hope, but that does not mean we will get a new design.

RE: Who is surprised?
By A11 on 6/4/2011 9:37:41 AM , Rating: 2
It's not stupidity, they sure as hell know what's going on.

It's called making business decisions, some of which may be wrong in the long run but looking at Intel today you can hardly say they've failed much in that regard over the last 30 years.

RE: Who is surprised?
By Targon on 6/4/2011 5:09:49 PM , Rating: 2
Intel has made some HORRIBLE mistakes over the years, and even though it has made the best of them, you can't say that Intel is perfect.

The Pentium Pro....most people don't even remember that failure of a chip, but the design DID end up as the basis for a lot of the Intel processors over the past two decades. The Itanium....we will have to see if Intel will be able to salvage ANYTHING from that flop of a chip.

Atom....poor performance from every machine that has used it. Honestly, if AMD had anything competitive in the netbook space before now, many people wouldn't even know what an Atom processor was, because no one would have bothered using it.

RE: Who is surprised?
By SPOOFE on 6/5/2011 5:50:21 PM , Rating: 1
quote: if AMD had anything competitive in the netbook space before now

... Then Intel may have felt some pressure to update their product sooner rather than later. As it is, they had to wait until they felt a threat from ARM, not AMD, to actually get off their butts!

I think some AMD fans overestimate how much marketing traction you can get by emphasizing the superior graphics of a low-power, low-performance, low-price, low-margin product for cheapie devices.

By dcollins on 6/3/2011 4:54:12 PM , Rating: 5
I would not read into these results too much, although they aren't encouraging. As of Android 2.2, Dalvik relies on a JIT compiler to achieve good performance when executing "java" code. The JIT in Android 3.0 almost certainly does not support x86, so this benchmark is comparing JIT-compiled code against interpreted code, which makes a huge difference, especially in synthetic benchmarks like Linpack. It's not a matter of optimization; if the JIT cannot compile to x86, the code is stuck in the much slower interpreter loop.

Normally a good JIT offers 2-5X the performance of a similar-quality interpreter, but the difference can be even higher in arithmetic code. Using Python as an example (because I'm most familiar with it), PyPy's JIT can perform simple math benchmarks up to 100x faster than CPython's interpreter.

The fact that Atom outperforms Tegra in JavaScript performance drives this point home. V8 obviously has excellent JIT support for x86, since it was originally developed for PCs, so this is the only test that compares apples to apples. For general performance, we'll just have to wait and see.
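The interpreter-versus-JIT gap described above is easy to reproduce outside Android. A sketch of the kind of arithmetic micro-benchmark where a tracing JIT (e.g. PyPy) typically runs the same file many times faster than the stock CPython interpreter; the function and loop size are illustrative only:

```python
import time

def arith(n):
    # Tight integer arithmetic: an interpreter dispatches each
    # bytecode individually, while a JIT compiles this hot loop
    # to native machine code after a few iterations.
    total = 0
    for i in range(n):
        total += (i * i) % 7
    return total

start = time.perf_counter()
checksum = arith(2_000_000)
print(checksum, time.perf_counter() - start)
```

Running the identical script under CPython and PyPy and comparing the elapsed times gives a rough feel for the penalty an x86 Android build pays while its Dalvik JIT is missing.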

RE: Dalvik
By B3an on 6/3/2011 7:10:07 PM , Rating: 1
This is what I was thinking too. I'm positive Android 3.0's JIT does not support x86, so these benchmarks are pretty much useless.

Remember when Android 2.2 came out and in some benches it was literally 5x faster than Android 2.1? This is the kind of difference JIT makes, so in all of these benches the Atom is going to be roughly 3 - 5x slower because of no JIT support. Once it gets this support these benchmark numbers will dramatically increase.

RE: Dalvik
By psonice on 6/6/2011 6:14:33 AM , Rating: 2
Yep, what I was thinking too. The Atom is significantly faster in JavaScript; it's likely to be significantly faster running Java too with a decent Dalvik port. Anyone assuming that these results mean the Atom tablets are going to be slow just doesn't understand what they're looking at!

Also, consider the floating point benchmarks. Tegra 2 famously lacks ARM's NEON engine. This is the equivalent of SSE on the Atom, and it accelerates a lot of FPU operations massively. There's no way a Tegra 2 would beat an Atom at floating point unless the software was seriously crippled or unfinished!

I think intel has a few real or potential problems though:

- heat. A hot tablet isn't going to be fun, and it suggests that the CPU is eating way too much power, and battery life will suffer. Maybe they can improve it a lot before it's released (and finished software will help plenty too), but it's not a promising start.

- The OS/dalvik software. They need the software optimising for it. If they don't put the effort into this, it's not going to be pretty, but this is totally solvable.

- the GPU. An SGX535, seriously? The iPhone 3GS had this GPU, 2 years ago. The iPad 2 GPU is 9x faster (well, yeah, that's a marketing figure, but I develop for it and it is 'many times' faster at least in pretty much all cases).

- Apps. Ok, so no problem with java apps (assuming the dalvik port is good). How about native apps, or apps that contain native code? They're not going to run (or they'll run badly under emulation). There's going to be quite a big compatibility problem - especially for games that need the speed of native code.
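On the floating-point point above: SIMD units like NEON and SSE accelerate exactly the element-wise kernels that FPU benchmarks hammer. A plain-Python version of the classic "saxpy" kernel, shown only to illustrate the operation (real benchmarks run the vectorized native equivalent, where the hardware processes four floats per instruction instead of one):

```python
def saxpy(a, x, y):
    # y <- a*x + y over whole arrays: the textbook kernel that
    # NEON (ARM) and SSE (x86) execute several lanes at a time.
    return [a * xi + yi for xi, yi in zip(x, y)]

print(saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]))
```

A chip missing its SIMD unit, like the NEON-less Tegra 2, has to fall back to one scalar multiply-add per element, which is why its floating-point scores should trail a SIMD-equipped part on comparable software.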

Atom = Pentium 4
By EricMartello on 6/3/2011 6:28:22 PM , Rating: 2
Isn't that what the Atom is? Recycled Pentium 4s? No surprise that it runs so hot. Remember the massive heatsinks P4s used to need even when they were not overclocked?

RE: Atom = Pentium 4
By Goty on 6/3/2011 7:45:16 PM , Rating: 2
Atom's architecture is very different from Netburst. It's actually more similar to the original Pentium than anything else.

RE: Atom = Pentium 4
By Colin1497 on 6/3/2011 8:03:23 PM , Rating: 2
Actually nothing like Pentium 4, it's based on the older x86 chips.

By zephyrprime on 6/3/2011 3:25:01 PM , Rating: 2
It's strange that the performance is so overwhelmingly bad. Was the java bytecode interpreted rather than compiled?

RE: sucktacular
By Khato on 6/3/2011 4:52:44 PM , Rating: 2
Yeah, there's no question that the CaffeineMark 3.0 score for the Atom is worthless. Maybe they're making it emulate an ARM processor or something... Searching around a bit for old x86 CaffeineMark 3.0 scores turns up such gems as a Pentium 233 scoring 4400 with Java 1.1.18 back in 1999... Or you could always go ahead and run it on your computer - this i5 540m gets a bit over 48k.

Compal = Taiwanese
By rocman on 6/3/2011 4:18:20 PM , Rating: 2
Quick correction - Compal is actually a Taiwanese firm.

I know the Atom was pretty sucky, but I thought it was at least on par in performance with the current iterations of ARM offerings. But Intel's got some resources to throw at this situation, so I am not *too* worried for them, yet. The whole thing does remind me a little of Intel's position back with the Netburst architecture. Hopefully they will find a Banias-esque breakthrough for Atom... more competition is always better!

RE: Compal = Taiwanese
By Targon on 6/4/2011 5:14:51 PM , Rating: 2
Atom still is sucky, not was. When graphics come into the picture, Atom based machines end up worse due to being paired with Intel graphics. By the time Intel gets its act together with Atom, AMD will be ready with a new generation that may very well crush Intel in the netbook space, and maybe tablets...but ARM based machines may still hold many advantages.

Intel fails
By cschuele on 6/3/2011 7:02:45 PM , Rating: 3
I haven't seen this mentioned anywhere, but it's a huge, huge mistake by Intel to rush all their desktop PCs to smaller nodes. It's been since early January 2010 that Intel has had CPUs produced on 32nm. However, they won't have a 32nm Atom till the end of this year. That's a full 18-24 months after Intel had the technology.

People with desktop PCs were overclocking quad-cores to 4GHz+ at 45nm with no problems. The real issue is Intel could have changed all their roadmaps to start process upgrades with tablet and SoC designs first rather than the PC chips, since those have GIANT ACTIVE CPU COOLERS which can dissipate the heat and run off the grid, not a tiny underpowered battery!

While 32nm is nice in the PC, it would help in the mobile market far more. A huge miscalculation that has already cost Intel market share (ARM is in every Apple iPhone and iPad to date, and 90% of all smartphones and tablets), and now they will have to work double time to try to get contracts and convince smartphone and tablet makers that their chip is competitive. The other major problem is that the Atom costs about $60 in volume while an ARM CPU costs $25. Oh, and the ARM chip is simpler, faster, and uses at least 50% less power at the same node.

Congrats, big giant Intel, you spent billions of dollars to make process changes and you now finally discovered what should be upgraded first. Good luck in 2014 when you will finally have a chip that might use less power, not due to a good design, but due to a lead in smaller gate size. Too bad they could have had the lead last year if they had the foresight to realize that low-powered chips are much better in things that run off batteries.

RE: Intel fails
By A11 on 6/4/2011 9:44:51 AM , Rating: 1
No, you fail; you forget where Intel's bread and butter comes from, and you also completely ignore that there's a market called servers, which has ridiculous margins.

By arne000 on 6/3/2011 7:00:25 PM , Rating: 2
Feature: cook eggs on the screen... Come on, ARM is the leader in embedded CPUs; Intel cannot just jump in with a cut-back PC CPU and call it 'embedded'.

By Justin Time on 6/3/2011 8:20:48 PM , Rating: 2
I thought they were talking about Medfield and Cedar Trail ?

I also thought that Oak Trail was just a re-packaging of the existing 2nd generation (45nm) Lincroft CPU and Whitney Point chipset, as a SoC with some firmware tweaks, so it's hardly surprising it's no real advance on the existing Atom status, as it retains all of the existing underlying architecture.

Cedar Trail is the 3rd generation (32nm) Atom technology, and is due to ship later this year.

By fteoath64 on 6/4/2011 2:54:12 AM , Rating: 2
Clearly, this chip is stillborn. I cannot understand why they even built a prototype tablet with it. On a prototype breadboard, a few tests would have confirmed that this chip is going to be dead in the water. Not only is Tegra 2 a leader at just 1GHz, the Qualcomm Scorpion is rather lethal too.

So Intel, you know better than to just use process improvements to get the numbers. Go back to the drawing board and do a proper Atom that is power-efficient and competitive with ARM. It might take you 2 years to do this, but Atom has not moved in 3 years, so time was already wasted. Get ON with it, and keep MeeGo out of this! It's Me-GONE!

By ajcarroll on 6/4/2011 6:00:08 AM , Rating: 2
Given the highly competitive JavaScript scores, the most plausible explanation to me is simply that the JIT (just-in-time) compiler has not yet been implemented in Dalvik (the Android Java implementation).

A significant amount of effort has gone into optimizing JavaScript on ARM, so the fact that the Intel chip is competitive in JavaScript indicates that the Intel CPU does not suck in terms of raw non-floating-point CPU performance, as it can keep up with a heavily optimized ARM implementation. The huge discrepancy between JavaScript (which IS just-in-time compiled on ARM) and Java performance suggests to me that the Java implementation is heavily constrained in some manner, and the simplest explanation is that the JIT is not enabled.

Given the positive JavaScript scores, my guess is that once Atom-based tablets are released, raw CPU performance will be very competitive... whether battery life is competitive or not remains to be seen.

x86 vs ARM
By mosu on 6/4/2011 6:19:48 AM , Rating: 2
quote: Now that Microsoft isn't going to carry Intel's deadbeat red headed stepchild into the future, the superior architecture has a chance.
Best phrase of the week!

Asus Ep121
By damianrobertjones on 6/4/2011 8:26:33 AM , Rating: 2
How do they all stack up against the Asus EP121 (Other than battery life)?

Answer = Nowhere near as fast.

By blppt on 6/4/2011 2:00:37 PM , Rating: 2
I find it amusing that everybody is bashing intel for keeping the "red-headed stepchild" alive when they wanted everybody to be using some form of Itanium by now. Remember, it was feisty little AMD that patched its way to x86-64, not Intel, thus extending x86's life.

Why no AMD?
By EasyC on 6/6/2011 11:39:26 AM , Rating: 2
I have an Acer W500 with an AMD APU. It's a more than capable tablet (awesome tablet, btw). Wouldn't they benchmark the Intel on a Windows tablet... rather than Android??

"There is a single light of science, and to brighten it anywhere is to brighten it everywhere." -- Isaac Asimov

Copyright 2016 DailyTech LLC.