
This might be a good chip for tablets, but not so much for smartphones

Numbers have reportedly leaked via VR-Zone on the performance of CPU kingpin Intel Corp.'s (INTC) Medfield, the company's tardy upcoming ultra-mobile CPU.  It's important to exercise a bit of caution, as the credibility of these figures is questionable, and even if they're the real deal, Medfield is still reportedly a half-year or more away from launch. 

With that said, let's dig into them.

I. The Platform

First, let's look at the leaked specs for the tablet platform:
  • 32 nm process
  • 1.6GHz CPU
  • 1GB of DDR2 RAM
  • WiFi
  • Bluetooth
  • FM radio
  • GPU (no details given)
Noticeably absent from the leaked materials was any reference to a baked-in 4G LTE (or 3G GSM/CDMA) modem.  Also absent was the very important CPU core count figure (based on the performance, this appears to be a dual-core chip).

The leak appears to consist of a benchmarked Red Ridge tablet.  Red Ridge is the name of the Android 3.2 Honeycomb tablet reference design, which Intel previewed in September.  Given past information, it appears likely that Red Ridge does have a 3G modem onboard, though whether it's on-die remains to be seen.

Red Ridge tablets
Intel's Red Ridge platform will be the first target for Medfield, after Intel scrapped plans for a smartphone platform. [Image Source: The Verge (left); VR-Zone (right)]

II. A Powerful Little Piece of Silicon

Now the good news -- Medfield appears to be pretty fast.  To give a point of comparison, the top ARM chipmakers' current bread-and-butter smartphone chips -- NVIDIA Corp.'s (NVDA) Tegra 2, Qualcomm Inc.'s (QCOM) MSM8260 third-generation Snapdragon, and Samsung Electronics Co., Ltd.'s (KS:005930) Exynos -- were benchmarked alongside it (VR-Zone's report made it unclear whether the benchmarks were performed by the blog or by Intel) and gave:
Medfield v. the rest

So Medfield is a fast little bugger, capable of beating up on current-generation ARM smartphone chips.  But the numbers are a bit deceptive: Medfield is more of a tablet chip (more on that in a bit), so it should have gone up against Tegra 3, yet for some reason the testers instead put it up against Tegra 2.  As they did not name the Samsung platform tested, it's very possible they pulled a similar shenanigan with Samsung's chip, testing the lower-clocked smartphone variant rather than the higher-clocked tablet variant.

That said, the numbers do indicate that Medfield is going to be in the ballpark of ARM in terms of processing power, possibly even ahead of the ARM chips.

III. Medfield: Battery-Guzzler Edition

Now the bad news: the power budget is quite high.  The platform reportedly idles at 2.6 W and draws a maximum of 3.6 W when playing 720p Flash video.  By launch the maximum power is intended to drop to 2.6 W, while the idle figure is also likely to drop a fair bit.

Still, these numbers are pretty horrible if Intel hopes to squeeze Medfield on a smartphone.  Some quick "napkin math":
  • An average smartphone battery is around 1600 mAh
  • The output voltage is typically 3.7 V
  • The total battery energy is thus about 5.92 Wh
  • At a 2.6 W idle draw, the platform would thus last a bit over two hours at idle in a smartphone before dying
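The napkin math above can be sketched out directly (the figures are the article's estimates, not measured values; the ~1 W loaded Tegra 2 draw reported elsewhere in this article is included for comparison):

```python
# Napkin math: smartphone battery life at a given platform power draw.
# All input figures are rough estimates from the article, not measurements.

def battery_life_hours(capacity_mah: float, voltage_v: float, draw_w: float) -> float:
    """Battery energy (Wh) divided by average power draw (W)."""
    energy_wh = capacity_mah / 1000 * voltage_v  # 1600 mAh at 3.7 V -> ~5.92 Wh
    return energy_wh / draw_w

medfield_idle = battery_life_hours(1600, 3.7, 2.6)  # Medfield platform at idle
tegra2_load   = battery_life_hours(1600, 3.7, 1.0)  # loaded Tegra 2, per reports

print(f"Medfield idle: {medfield_idle:.1f} h")  # ~2.3 h
print(f"Tegra 2 load:  {tegra2_load:.1f} h")    # ~5.9 h
```

Even under load, the Tegra 2 figure comes out well ahead of Medfield sitting idle, which is the crux of the article's smartphone argument.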
Low battery, Android
Intel's new chip could only muster about two hours of battery life in a smartphone.
[Image Source: Namran blog]

In other words there's no way Intel can hope to launch this chip in a smartphone.

It's disappointing to see Intel is still trailing so badly in power.  For example, a loaded Tegra 2 reportedly draws around 1 W, meaning that it could sip the aforementioned battery for around 6 hours before kicking the bucket.  Intel's chip is fast, but it appears to be a "battery-guzzler".

More troubling is the fact that these results come from a 32 nm part, whereas NVIDIA and Qualcomm have 40 nm parts (Samsung is also at the 32 nm node).  In other words, that process advantage Intel is always talking about appears to be nonexistent here.

Intel's best hope power-wise is its 3D FinFET technology, which will be introduced to the Medfield line sometime in the 2013-2014 window.  That will likely be the true test of Intel's fading hopes in the mobile space.  If Intel's 22 nm FinFET transistor chip can't meet or beat ARM in power budget, it's game over.

IV. Launching Soon in a Tablet Near You 

Lastly let's examine what else is known about Medfield.

Intel reportedly hopes to launch the chip in "early 2012".  As laid out here, it seems obvious that this is a tablet-only launch.

The launch is being spearheaded by Intel's new "Mobile and Communications" business unit.  Intel has merged four separate units -- Mobile Communications, Mobile Wireless, Netbook & Tablet PC, and Ultra-Mobility -- to form the new super-unit.

The unit is headed by Mike Bell and Hermann Eul.  Mr. Bell has a particularly interesting history.  He was at Apple, Inc. (AAPL) and helped design the first iPhone.  From there he jumped ship to Palm.  And when Palm was in its final throes pre-acquisition, he jumped ship in 2010 to Intel.  So it's fair to say he has a bit of mobile experience.

Medfield was originally intended to be a smartphone platform.  Instead -- likely due to poor power performance -- it has morphed into a third leg in Intel's tablet push.  Intel already has released Oak Trail -- a beefier platform with PCI support, designed for Windows 7 tablets -- and Moorestown -- a lighter platform ideal for Android tablets.  Presumably Medfield will take the role of a leaner Moorestown, or perhaps step in as a Moorestown replacement.

It has a tough road ahead as Intel has thus far had almost no traction in the ARM-dominated tablet market.  The problems in the tablet department are familiar -- Intel's tablets tend to be powerful, but have poor battery life and run hot.

Source: VR-Zone

Comments

By milli on 12/29/2011 11:43:14 AM , Rating: 5
Jason, please update this article or remove it. I understand that every other website on the web copied this VR-Zone article wrongly, but people are used to Anand posting correct information.

You mention that the platform power usage is 2.6W and so on, but then compare it to Tegra 2's SoC power usage. Even if those numbers from VR-Zone don't include the screen, they still include everything else on the PCB. Even Menlow had an idle power usage of 100mW. How can a more advanced Atom have a much higher idle power usage? Just because of this, I think it's pretty easy to assume that those numbers are not for the SoC alone.

Just because VR-Zone doesn't know the GPU, you shouldn't copy this blindly. I think it's easy to assume that it'll be based on a PowerVR design. They've used the SGX535 before and are using the SGX545 in Cedar Trail. There were some rumors on the web that Intel had some issues with Cedar Trail's drivers. My hunch is that Medfield is also using the SGX545. The SGX545 should support DX10.1 but the drivers are not up to scratch yet. But there's no way to know this ATM.

Basically Intel is offering A15 performance with competitive power usage and x86 compatibility on top (considering how much faster the Atom CPU is compared to the A9): what's there not to like? You should welcome competition in the mobile space. I for one am looking forward to a big fight in the mobile space, and not only from ARM and Intel, but also MIPS.

RE: Huh?
By french toast on 12/29/2011 2:15:28 PM , Rating: 2
Thanks for the informative link, I've been searching for something like that for ages.
However, whilst interesting, it is a bit misleading. For example, I'm not up to date with the part numbers, but I got the impression that the Atom parts were 1.6GHz vs ARM at 1.2GHz?
If so that would even things up a bit more. Plus I'm not sure if they are multi-threaded, are they? Would that matter?

Also, from that CaffeineMark score, it looks about the same performance clock for clock as the Cortex-A9 parts; again, is that multi-threaded?
Also, bear in mind that is the older Tegra 2 at 1GHz and at 40nm..
Tegra 3, again on 40nm, is vastly more powerful in every area and actually consumes LESS power than Tegra 2.
Tegra 2 phones can play 720p video for hours on a smartphone battery.

We still don't know what type of GPU, any 3G/4G modems, memory controller etc., which would make things more interesting.

One other thing: it is not going to be released for 6 months. By then we are talking Cortex-A15/Krait with 2-4 cores, which will be at least 50% more powerful than Atom clock for clock, with 2-4 times the cores, ALSO running at speeds of up to 2-2.5GHz on 32-28nm processes.
And, ON-DIE 4G modems at 28nm.......

They will be shipping in smartphones at the time this is being released to power-draining tablets.
Intel has got no chance of competing in the next 12 months on any level, and it will take a revolutionary next-gen multi-core OoO Atom on 22nm FinFET tech to even have a hope.

The other thing to bear in mind: as process nodes decrease, the power advantages decrease the further you go down. 22nm FinFET is about 50% more power efficient than 32nm, which if released now would be competitive with Tegra 2 designs on 40nm from last year.

RE: Huh?
By french toast on 12/29/2011 2:30:51 PM , Rating: 2
Sorry, one other thing, quote me if I'm wrong about this (I'm no expert), but I read somewhere that the die area of Atom is massive compared to a Cortex-A9, which would put them at a disadvantage profit-wise, and also leave less available space for the GPU part, which in turn would make it less cost-effective to keep up on graphics?? What do you think about that?

Also, if I am right, previous smartphone Atom designs used a single-channel memory controller.. which if carried over leaves less memory bandwidth for CPU/graphics, and would increase die area/power consumption again if matched to the already-shipping ARM SoCs? Am I correct?
..As well as 1GB RAM for mid next year, when top smartphones will already have 2GB??

RE: Huh?
By Khato on 12/29/2011 4:30:07 PM , Rating: 2
Those Phoronix benchmarks are actually mildly surprising - I would have expected the OMAP4660 to be closer than that. I'm also quite impressed with their selection of processors for comparison (N270 is the original single-core Atom at 1.6 GHz, Z530 is a more recent single-core Atom also at 1.6 GHz, the Pentium M 1.86 GHz is a single-core Dothan, and the T2400 is a 1.86 GHz dual-core Yonah.) The inclusion of the Dothan and Yonah not only provides a comparison against the original Core microarchitecture, but also shows what kind of gains a specific benchmark realizes from multi-threading.

Now looking through the various benchmarks yields a few interesting points. First, for single-threaded performance the OMAP4660 is, at best, as fast as atom once the frequency difference is taken into account. Meanwhile the worst case has it lagging by quite a bit even with the result scaled for frequency. Second, the multi-threaded performance of the OMAP4660 is simply abysmal - there's only one multi-threaded benchmark where the dual core A9 beats a single core atom with hyperthreading.

RE: Huh?
By french toast on 12/29/2011 5:13:16 PM , Rating: 1
Yeah, but the Atom is higher clocked and with Hyper-Threading, which Intel puts up against ARM dual-core offerings.. put it this way: Atom will not match ARM in core count.. the power consumption would turn your tablet into a blow heater!

Also they are at much higher TDPs, and I think at much bigger die sizes, even taking into account the die shrink to 32nm.

IMHO the dual-core Cortex-A9 is comparable to an Atom with HT clock for clock... but the A9 is at a fraction of the power, can scale to much higher frequencies, and also has a much smaller die size.

28nm Krait will completely smoke any Atom in any benchmark comparison, and it will be launched in smartphones, with integrated LTE on-die, within 2 months.
