
This might be a good chip for tablets, but not so much for smartphones

Performance numbers for CPU kingpin Intel Corp.'s (INTC) Medfield, the company's tardy upcoming ultra-mobile CPU, have reportedly leaked via VR-Zone.  It's important to exercise a bit of caution, as the credibility of these figures is questionable, and even if they're the real deal, Medfield is still reportedly a half-year or more away from launch.

With that said, let's dig into them.

I. The Platform

First, let's look at the leaked specs for the tablet platform:                                                                                 
  • 32 nm process
  • 1.6GHz CPU
  • 1GB of DDR2 RAM
  • WiFi
  • Bluetooth
  • FM radio
  • GPU (no details given)
Noticeably absent from the leaked materials was any reference to a baked-in 4G LTE (or 3G GSM/CDMA) modem.  Also absent was the very important CPU core count figure (based on the performance, this appears to be a dual-core chip).

The leak appears to consist of a benchmarked Red Ridge tablet.  Red Ridge is the name of the Android 3.2 Honeycomb tablet reference design, which Intel previewed in September.  Given past information, it appears likely that Red Ridge does have a 3G modem onboard, though whether it's on-die remains to be seen.

Red Ridge tablets
Intel's Red Ridge platform will be the first target for Medfield, after Intel scrapped plans for a smartphone platform. [Image Source: The Verge (left); VR-Zone (right)]

II. A Powerful Little Piece of Silicon

Now the good news -- Medfield appears to be pretty fast.  To give a point of comparison, it was benchmarked against top ARM chipmakers' current bread-and-butter smartphone chips: NVIDIA Corp.'s (NVDA) Tegra 2, Qualcomm Inc.'s (QCOM) MSM8260 third-generation Snapdragon, and Samsung Electronics Co., Ltd.'s (KS:005930) Exynos (VR-Zone's report made it unclear whether these benchmarks were performed by the blog or by Intel).  The results:
Medfield v. the rest

So Medfield is a fast little bugger, capable of beating up on current-generation ARM smartphone chips.  But the numbers are a bit deceptive: as Medfield is more of a tablet chip (more on that in a bit), it should have gone up against Tegra 3, but for some reason the testers instead put it up against Tegra 2.  As they did not name the Samsung platform tested, it's very possible they pulled a similar shenanigan with Samsung's chip, testing the lower-clocked smartphone variant rather than the higher-clocked tablet variant.

That said, the numbers do indicate unquestionably that Medfield is going to be in the ballpark of ARM in terms of processing power, possibly even beating the ARM chips.

III. Medfield: Battery-Guzzler Edition

Now the bad news: the power budget is quite high.  The platform reportedly has a 2.6W TDP at idle and a maximum power consumption of 3.6W when playing 720p Flash video.  By launch the maximum power is intended to drop to 2.6W, while the idle figure is also likely to drop a fair bit.

Still, these numbers are pretty horrible if Intel hopes to squeeze Medfield on a smartphone.  Some quick "napkin math":
  • An average smartphone battery is around 1600 mAh
  • The output voltage is typically 3.7 V
  • The total battery power is thus 5.92 Wh
  • Thus the platform would last a bit over two hours at idle in a smartphone before dying
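The napkin math above is easy to verify. A minimal sketch, using the article's reported power figures (the battery capacity and voltage are the typical values from the bullets, not measurements of any specific phone):

```python
# Back-of-the-envelope smartphone battery life at the reported power draws.

def battery_wh(capacity_mah: float, voltage_v: float) -> float:
    """Convert a battery's mAh rating to watt-hours."""
    return capacity_mah / 1000.0 * voltage_v

def runtime_hours(battery_capacity_wh: float, draw_w: float) -> float:
    """Hours until the battery is drained at a constant power draw."""
    return battery_capacity_wh / draw_w

phone_battery = battery_wh(1600, 3.7)
print(round(phone_battery, 2))                      # 5.92 Wh

# Medfield platform at the reported 2.6 W idle:
print(round(runtime_hours(phone_battery, 2.6), 1))  # ~2.3 hours

# A loaded Tegra 2 at the reported ~1 W, for comparison:
print(round(runtime_hours(phone_battery, 1.0), 1))  # ~5.9 hours
```

The same two-line calculation reproduces both the "bit over two hours" figure for Medfield and the "around 6 hours" figure for Tegra 2 cited below.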
Low battery, Android
Intel's new chip could only muster about two hours of battery life in a smartphone.
[Image Source: Namran blog]

In other words there's no way Intel can hope to launch this chip in a smartphone.

It's disappointing to see Intel is still trailing so badly in power.  For example, a loaded Tegra 2 reportedly draws around 1 W, meaning that it could sip the aforementioned battery for around 6 hours before kicking the bucket.  Intel's chip is fast, but it appears to be a "battery-guzzler".

More troubling is the fact that these results come from a 32 nm part, whereas NVIDIA and Qualcomm have 40 nm parts (Samsung is also at the 32 nm node).  In other words, the process advantage Intel is always talking about appears to be nonexistent here.

Intel's best hope power-wise is its 3D FinFET technology, which will be introduced to Medfield sometime in the 2013-2014 window.  That will likely be the true test of Intel's fading hopes in the mobile space.  If Intel's 22 nm FinFET transistor chip can't meet or beat ARM in power budget, it's game over.

IV. Launching Soon in a Tablet Near You 

Lastly let's examine what else is known about Medfield.

Intel reportedly hopes to launch the chip in "early 2012".  As laid out here, it seems obvious that this is a tablet-only launch.

The launch is being spearheaded by Intel's new "Mobile and Communications" business unit.  Intel has merged four separate units -- Mobile Communications, Mobile Wireless, Netbook & Tablet PC, and Ultra-Mobility -- to form the new super-unit.

The unit is headed by Mike Bell and Hermann Eul.  Mr. Bell has a particularly interesting history.  He was at Apple, Inc. (AAPL) and helped design the first iPhone.  From there he jumped ship to Palm.  And when Palm was in its final throes pre-acquisition, he jumped ship in 2010 to Intel.  So it's fair to say he has a bit of mobile experience.

Medfield was originally intended to be a smartphone platform.  Instead -- likely due to poor power performance -- it has morphed into a third leg in Intel's tablet push.  Intel already has released Oak Trail -- a beefier platform with PCI support, designed for Windows 7 tablets -- and Moorestown -- a lighter platform ideal for Android tablets.  Presumably Medfield will take the role of a leaner Moorestown, or perhaps step in as a Moorestown replacement.

It has a tough road ahead as Intel has thus far had almost no traction in the ARM-dominated tablet market.  The problems in the tablet department are familiar -- Intel's tablets tend to be powerful, but have poor battery life and run hot.

Source: VR-Zone


Misleading Power Analysis
By Khato on 12/29/2011 2:42:19 AM , Rating: 5
While the platform power consumption numbers aren't great (as the source article states, Intel is still a fair bit off their goals) they're hardly as bad as the analysis makes them out to be.

First, with respect to the smartphone 'napkin math'... Yes, that's what would happen if you ran a tablet off a smartphone battery. Because that 2.6W idle number isn't just Medfield, it's Medfield + all supporting chips + display.

Second, the comparison to Tegra 2's 1W power consumption is horribly invalid for much the same reason - a 10 inch tablet using a Tegra 2 processor uses a fair amount more than 1W under load. In fact, a Galaxy Tab 10.1 looks to draw a bit over 4W when streaming 720p flash video (25.9Wh battery lasts ~5.5 hours). Oh, and I doubt that Intel's design goal of 2.6W under load is a coincidence - the iPad 2 apparently draws around that much when it's streaming 720p flash video.
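Khato's whole-device estimate follows from dividing battery capacity by observed runtime. A quick sketch using only the figures quoted in the comment (25.9 Wh pack, roughly 5.5 hours of 720p streaming):

```python
# Inferring a tablet's average whole-device power draw from its battery
# capacity and observed runtime. Figures are the ones quoted above.

def average_draw_w(battery_capacity_wh: float, runtime_h: float) -> float:
    """Average power draw implied by draining a battery over a given time."""
    return battery_capacity_wh / runtime_h

print(round(average_draw_w(25.9, 5.5), 1))  # ~4.7 W -- "a bit over 4W"
```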

RE: Misleading Power Analysis
By IntelUser2000 on 12/29/2011 5:43:46 AM , Rating: 2

There was an article about how people were losing trust in online information because inaccuracies like these are rampant. This article is a prime example of that.

To put that in perspective again, 2.6W with Flash means that with a 25WHr battery like in the iPad 2, it would last ~10 hours, which is similar to what the iPad achieves in battery life.

You don't see anyone making fun of A5's power usage do you?

RE: Misleading Power Analysis
By JasonMick (blog) on 12/29/2011 9:16:14 AM , Rating: 1
First, with respect to the smartphone 'napkin math'... Yes, that's what would happen if you ran a tablet off a smartphone battery. Because that 2.6W idle number isn't just Medfield, it's Medfield + all supporting chips + display.

You could be right, but Theo at VR-Zone (see the source link) who somehow got the reference design, appeared to be claiming that the CPU consumption was that high. It's entirely possible that he took the tablet apart and measured the current going to the screen, etc.

If you turned off the 3G/Wi-Fi, your only major current draw would be the screen... so you could pretty easily make a relatively accurate estimate of pure power consumption.

Regardless, I was never trying to make the point that the final power consumption target was inappropriate for a tablet, merely that it was not compatible with a smartphone, which I think we could agree on.

While I respect your comparison with the A5, I think it's clear that Medfield's power performance isn't scaling well in underclocked parts.

Remember Medfield was SUPPOSED to be a smartphone chip. If power performance didn't suck in the underclocked parts, why would Intel be scrapping its smartphone bid and making yet another tablet chip, which it already has two families of?

As I said right up front, Theo didn't do the best job writing up precise details about where he got his numbers, etc. I tried to be precise in detailing where the lack of clarity lies... you raise an excellent point w.r.t. whether the LCD was factored out of the power consumption numbers.

That said, I think it's clear that this release is Intel trying to buy time for Medfield and put it out in some form onto the market, albeit turning a smartphone chip into a tablet one...

As I said, Intel's big opportunity will come in 2013 with Medfield's 3D FinFET equipped successor. That chip might finally scale well enough to see a real smartphone deployment, which Intel has long promised, but thus far failed to do.

RE: Misleading Power Analysis
By IntelUser2000 on 12/29/2011 9:42:22 AM , Rating: 2
Big misunderstanding IMO.

According to the same guy, Intel's aim is for 2W idle and 2.6W Flash video playback. Since the same company claims the smartphone Medfield will be competitive with competitors in power consumption, the 2/2.6 numbers have to be for the platform.

"Few weeks ahead of the official launch, we now have first performance numbers of "Medfield Tablet Platform."

And tell me how do you suppose they'll isolate CPU power on such an integrated device like a prototype Tablet? You are making absolutely no sense here. The Menlow chip from 2008 has idle power usage of 100mW, which is 1/20th of 2W.

RE: Misleading Power Analysis
By JasonMick (blog) on 12/29/2011 10:57:09 AM , Rating: 2
And tell me how do you suppose they'll isolate CPU power on such an integrated device like a prototype Tablet? You are making absolutely no sense here. The Menlow chip from 2008 has idle power usage of 100mW, which is 1/20th of 2W.

Simple -- disassemble the frame, then start the benchmark running and unplug the screen. That'd be the quickest and dirtiest way to isolate the CPU, in my mind.

Again, I don't know if they did that, though...

But again if you think the power performance is so great, why do you think they aren't trying to put the thing on a smartphone, as they originally intended? It just doesn't make sense.

If the power consumption wasn't struggling, they'd be putting Medfield in a smartphone test platform in a heartbeat...

RE: Misleading Power Analysis
By Khato on 12/29/2011 2:01:27 PM , Rating: 2
Uhmmmm, look at page 14 -

Considering that's basically a presentation to investors, I'd expect the information to be relatively accurate... with the expected amount of bias of course.

RE: Misleading Power Analysis
By french toast on 12/30/2011 7:30:59 AM , Rating: 1
I agree, it's going to be funny though, as LG is rumoured to be releasing Medfield PHONES at CES....that is going to be really interesting...
15mm thick slabs weighing close to 200g, with 3G, QVGA screens getting 5 hours battery life on STANDBY...
Intel, it's nearly 2012..not 2008.

RE: Misleading Power Analysis
By name99 on 12/29/2011 3:55:47 PM , Rating: 3
the iPad 2 apparently draws around that much when it's streaming 720p flash video.

It's a little hard to take seriously ANYTHING in a blog comment that talks about iPads playing flash video. You do understand this, right?

RE: Misleading Power Analysis
By Khato on 12/29/2011 4:15:44 PM , Rating: 2
Haha, true enough. Sorry if that confused you - I tend to use the term 'flash video' interchangeably with 'streaming video', since what we're interested in here is actual worst-case power consumption, and video playback where all data is being streamed over wi-fi typically beats out any other usage.

By psychobriggsy on 1/1/2012 4:42:40 PM , Rating: 2
From what I've read, it actually is just the SoC only. Not the display. Not the I/O hub. Just the CPU, Graphics, Sound, etc, in the SoC.

A typical 10" IPS display uses around 2W of power. I think you can see why Apple's chips aren't running at the same speeds as the competition, but get better battery lives.

Most SoCs use well under 1W when streaming video because it is a hardware function with little CPU interaction. 2.6W ... that's 0.6W SoC and 2W display. Medfield will be 3W SoC and 2W display.

Unimpressive
By tviceman on 12/29/2011 1:22:37 AM , Rating: 3
So by the time it comes to market, it will be 40-50% faster than Tegra 2, a chip that will have been out for 2 years and will no longer be in production by then. They didn't compare it to Tegra 3, because the report is supposed to make Medfield look good. In all likelihood Tegra 3 is faster in most situations, and still uses less power.

Just about every 28nm dual-core ARM CPU should be faster, and the power draw of 28nm ARM CPUs will be a third of what Medfield's is supposed to be. In order for Medfield to compete on time between charges, there will have to be a heavier battery onboard, which is totally undesirable for a tablet. Also notice how no information on GPU performance was given. Not a good sign at all.

This chip will power very few products and will have an extremely short life. Intel will still be playing catch-up, and in the meantime companies like Nvidia and Qualcomm are building on the ARM design and releasing innovative new mobile CPUs. Maybe when Intel can migrate their ultra-mobile CPUs to 22nm they can finally compete, but the cycle of playing catch-up is going to be hard to equalize.

RE: Unimpressive
By B3an on 12/29/2011 2:09:28 AM , Rating: 2
Completely agree. And the Samsung Exynos in my GSII would likely match Medfield right now if it also ran at the same clockspeed (1.6GHz). I can actually overclock it to that speed, and even at that speed and 100% load it lasts far longer than 2 hours, which is what Medfield will likely last at idle with a typical smartphone battery.

At this rate there's no way Intel's 22nm will save them when Medfield finally moves to that. Look at Ivy Bridge CPUs based on 22nm: they are not a massive improvement over current 32nm Sandy Bridge for power consumption, and Intel really does need a MASSIVE improvement here, at least 4x better. IF Intel can ever even come close to ARM it won't likely be for at least 3+ years.

RE: Unimpressive
By french toast on 12/30/2011 7:03:48 AM , Rating: 2
Yea, I also agree. Have you got any info on the TDP comparison between Ivy/Sandy Bridge?? I wonder what the power difference is, as I heard they are only optimising for efficiency, not performance, now AMD is out of the game....

Certainly it won't be 4x or anywhere near...and they need that just to compete with current A9 40nm designs...

RE: Unimpressive
By Gondor on 12/29/2011 7:54:59 AM , Rating: 2
Would it be possible to use the "scrap area" of wafers used in Ivy Bridge production (the area around the edge of the wafer) to squeeze many tiny 22 nm Atom chips in there, instead of wasting those parts on chopped-off Ivy Bridge dies which have to be canned anyway?

"Free chips", sort of.

RE: Unimpressive
By steven975 on 12/29/2011 9:33:21 AM , Rating: 2
I'm not sure, but I'm positive any chipmaker has thought of that.

I think the challenges would be the lithography and separation/testing of 2 different products from one wafer, and they may not be set up for that.

RE: Unimpressive
By tviceman on 12/29/2011 2:39:53 PM , Rating: 2
I have no idea if it's possible to fabricate two completely different chips on the same wafer. Anyone?

RE: Unimpressive
By JKflipflop98 on 12/29/2011 9:42:31 PM , Rating: 2
Yes, yes it is.

RE: Unimpressive
By tviceman on 12/30/2011 11:01:31 AM , Rating: 2
Wow, that surprises me. I wonder if this is an often-used technique, especially with wafers meant primarily for big chips, like Nvidia's. Nvidia would be doing themselves a big favor to put their low-end GPUs on the outer parts of the wafer, where the big chips would otherwise get cut off.

Process node
By nafhan on 12/29/2011 9:24:21 AM , Rating: 2
One of Intel's biggest advantages historically has been their excellent manufacturing tech, and they aren't taking advantage of it in the mobile space - Atoms are all built on the "old" process node, with the high end getting the new process node. Imagine if Medfield were about to be released on Ivy Bridge's 22nm process: this story would read very differently.

RE: Process node
By french toast on 12/29/2011 2:22:54 PM , Rating: 2
No it wouldn't. I read somewhere (AnandTech) that manufacturing shrinks get less beneficial the smaller you go now, and that their super-high-tech FinFET 22nm is about 50% more power efficient than their 32nm.

If you compare that to the Medfield chip, that would only be about 4 hours of 720p on a smartphone battery..
Compared to Tegra 2, which already does at least that on 40nm!
What about Tegra 3 on 40nm, which is more power efficient??
Qualcomm Krait at 28nm??? Not a hope in hell.

RE: Process node
By nafhan on 12/29/2011 4:54:17 PM , Rating: 2
Did you read what you just wrote? Medfield performance doesn't appear to be an issue; it's the power envelope that's a problem. Drop power usage by 50% while keeping performance where it is and you'd have a pretty good chip - certainly competitive with those you listed (although not necessarily faster).

Tegra 3 for instance is only more power efficient than Tegra 2 when more compute power is needed or less compute power is needed. With your example of video playback, for instance, Tegra 2 and 3 have about the same power usage.

On a related note, it was probably AMD PR that you remember recently saying that thing about power usage and manufacturing shrinks, and as a general statement it may be true. However, Intel is getting huge benefits from the move to 22nm specifically.

RE: Process node
By french toast on 12/29/2011 5:28:57 PM , Rating: 2
Yes I did, thanks. The info I got was from a recent article from AnandTech, which is highly respectable. I can't be bothered to dig it out! lol.

My point exactly: Tegra 3 is significantly faster than Tegra 2, and yet more power efficient, still on 40nm.
That would blow away the Medfield solution on performance, although we don't know any other details about Medfield's GPU/modem/bandwidth etc.

So if you take Medfield and soup it up with tri-gate 22nm HKMG, it would be about as power efficient as Tegra 3 on 40nm G...yet be SLOWER.

Also take into account that Tegra 3 is the crappiest of the ARM SoCs to come out in the next 12 months, with the Qualcomm Krait launching several months before Medfield, which will blow the Atom away on performance and power efficiency, as it is on 28nm including the 4G modem.

The sad fact is, though, that Atom even on the best tech will still be slower and more power hungry than the current ARM chips when it launches...

RE: Process node
By french toast on 12/29/2011 5:32:04 PM , Rating: 2
And atom will not have that tech for at least 18 months..maybe longer, by which time arm will be shipping some serious hardware.

RE: Process node
By nafhan on 12/30/2011 9:17:08 AM , Rating: 2
And this comment basically brings things back around to my initial comment: Intel needs to put mobile chips on their most advanced process if they want to be competitive, and they are not doing that.

RE: Process node
By french toast on 12/31/2011 11:43:50 AM , Rating: 2
Lol, yea I get what you're saying, and I agree that they should release their best process nodes on the mobile market instead of the desktop, which is sewn up.

BUT what I am saying is that it will still not matter, and they are not comparable in performance either; that was against last year's 'unspecified' chips. Medfield is currently a PDF slide that won't be in phones till mid next year at best..if all goes to plan.

By that time, Tegra 3+, Krait, OMAP5, Exynos 5250/4212, and ST-Ericsson Nova Thor will be the competitors, so YES, performance will defo be an issue.

RE: Process node
By calKing on 1/5/2012 9:33:24 PM , Rating: 2
Guys, the maths is not that simple.
ARM now has a new big.LITTLE-style architecture where a small core takes over when the load is light, just like Tegra 3's fifth companion core. Intel does not have any architecture like that. Also, at the same performance and the same node, you have to see who has more area, Intel or ARM. Intel has lots of legacy.
Also look at the VPs of mobile at Intel: one guy is from Apple/Palm; what does he know about cutting-edge ASIC design? I don't know about the other.

By milli on 12/29/2011 11:43:14 AM , Rating: 5
Jason, please update this article or remove it. I understand that every other website on the web copied this VR-Zone article wrongly, but people got used to Anand posting correct information.

You mention that the platform power usage is 2.6W and so on, but then compare it to Tegra 2's SoC power usage. Even if those numbers from VR-Zone don't include the screen, they still include everything else on the PCB. Even Menlow had an idle power usage of 100mW. How can a more advanced Atom have a much higher idle power usage? Just because of this, I think it's pretty easy to assume that those numbers are not for the SoC alone.

Just because VR-Zone doesn't know the GPU, you shouldn't copy this blindly. I think it's easy to assume that it'll be based on a PowerVR design. They've used the SGX535 before and are using the SGX545 in Cedar Trail. There were some rumors on the web that Intel had some issues with Cedar Trail's drivers. My hunch is that Medfield is also using the SGX545. The SGX545 should support DX10.1 but the drivers are not up to scratch yet. But there's no way to know this ATM.

Basically Intel is offering A15 performance with competitive power usage and x86 compatibility on top, considering how much faster the Atom CPU is compared to A9. What's there not to like? You should welcome competition in the mobile space. I for one am looking forward to a big fight in the mobile space, and not only from ARM and Intel, but also MIPS.

RE: Huh?
By french toast on 12/29/2011 2:15:28 PM , Rating: 2
Thanks for the informative link, I've been searching for something like that for ages.
However, whilst interesting, it is a bit misleading. For example, I'm not up to date with the part numbers, but I got the impression that the Atom parts were 1.6GHz vs ARM at 1.2GHz?
If so, that would even things up a bit more. Plus, I'm not sure if they are multi-threaded, are they? Would that matter?

Also, from that CaffeineMark score, it looks about the same performance clock-for-clock as the Cortex A9 parts; again, is that multithreaded?
Also, bear in mind that is the older Tegra 2 at 1GHz and at 40nm..
Tegra 3, again on 40nm, is vastly more powerful in every area and actually consumes LESS power than Tegra 2.
Tegra 2 phones can play 720p video for hours on a smartphone battery.

We still don't know what type of GPU, any 3G/4G modems, memory controller etc, which would make things more interesting.

One other thing: it is not going to be released for 6 months, by which time we are talking Cortex A15/Krait with 2-4 cores, which will be at least 50% more powerful than Atom clock-for-clock, with 2-4 times the cores, ALSO running at speeds of up to 2-2.5GHz on 32-28nm processes.
And ON-DIE 4G modems at 28nm.......

They will be shipping in smartphones at the time this is being released to power-draining tablets.
Intel has got no chance of competing in the next 12 months on any level, and it will take a revolutionary next-gen multi-core OoO Atom on 22nm FinFET tech to even have a hope.

The other thing to bear in mind: as process nodes decrease, the power advantages decrease the further you go down. 22nm FinFET is about 50% more power efficient than 32nm, which if released now would be competitive with Tegra 2 designs on 40nm from last year.

RE: Huh?
By french toast on 12/29/2011 2:30:51 PM , Rating: 2
Sorry, one other thing, quote me if I'm wrong about this (I'm no expert), but I read somewhere that the die area of Atom is massive compared to a Cortex A9, which would put them at a disadvantage profit-wise and also leave less available space for the GPU part, which in turn would make it less cost-effective to keep up on graphics?? What do you think about that?

Also, if I am right, previous smartphone Atom designs used a single-channel memory controller..which if carried over leaves less memory bandwidth for CPU/graphics, and would increase die area/power consumption again if matched to the already-shipping ARM SoCs? Am I correct?
..As well as 1GB RAM for mid next year, when top smartphones will already have 2GB??

RE: Huh?
By Khato on 12/29/2011 4:30:07 PM , Rating: 2
Those Phoronix benchmarks are actually mildly surprising - I would have expected the OMAP4660 to be closer than that. I'm also quite impressed with their selection of processors for comparison (N270 is the original single-core Atom at 1.6 GHz, Z530 is a more recent single-core Atom also at 1.6 GHz, the Pentium M 1.86 GHz is a single-core Dothan, and the T2400 is a 1.86 GHz dual-core Yonah.) The inclusion of the Dothan and Yonah not only provides a comparison against the original Core microarchitecture, but also shows what kind of gains a specific benchmark realizes from multi-threading.

Now looking through the various benchmarks yields a few interesting points. First, for single-threaded performance the OMAP4660 is, at best, as fast as atom once the frequency difference is taken into account. Meanwhile the worst case has it lagging by quite a bit even with the result scaled for frequency. Second, the multi-threaded performance of the OMAP4660 is simply abysmal - there's only one multi-threaded benchmark where the dual core A9 beats a single core atom with hyperthreading.
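The frequency scaling Khato applies can be made explicit by normalizing each score to a per-GHz figure. The scores below are made-up placeholders; only the normalization method itself is the point:

```python
# Normalizing single-threaded benchmark scores by clock frequency so that
# chips running at different clocks can be compared per-GHz.
# The raw scores here are hypothetical, for illustration only.

def per_ghz(score: float, clock_ghz: float) -> float:
    """Benchmark score per GHz of clock frequency."""
    return score / clock_ghz

atom_z530 = per_ghz(1000, 1.6)  # single-core Atom at 1.6 GHz (placeholder score)
omap4660  = per_ghz(750, 1.2)   # dual-core A9 at 1.2 GHz, one thread (placeholder)

print(round(atom_z530, 1), round(omap4660, 1))  # 625.0 625.0 -- a dead heat per GHz
```

With these placeholder numbers the two chips tie per-GHz; in the actual Phoronix results, as noted above, the A9 at best matches Atom after this scaling and often lags it.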

RE: Huh?
By french toast on 12/29/2011 5:13:16 PM , Rating: 1
Yea, but the Atom is higher clocked and has hyper-threading, which Intel puts up against ARM dual-core offerings...put it this way, Atom will not match ARM in core count..the power consumption would turn your tablet into a blow heater!

Also, they are at much higher TDPs, and I think at much bigger die sizes, even taking into account the die shrink to 32nm.

IMHO the dual-core Cortex A9 is comparable to an Atom with HT clock-for-clock...but the A9 is at a fraction of the power, can scale to much higher frequencies, and also has a much smaller die size.

28nm Krait will completely smoke any Atom in any benchmark comparison, and it will be launched in smartphones, with integrated LTE on die, within 2 months.

Mick, i hate your processor articles
By Iketh on 12/29/2011 10:46:26 PM , Rating: 2
Your logic just doesn't fit with these types of articles.

In other words, that process advantage Intel is always talking about appears to be nonexistent here.

You can never state something like this. You can't say the advantages of 32nm over 40nm don't apply. That's just silly.

What's actually happening is Intel's process allows Intel to get to this point of competing in the tablet market with its current architecture. What Intel has to improve on is the architecture.

RE: Mick, i hate your processor articles
By french toast on 12/30/2011 5:05:34 AM , Rating: 1
He's right. When he says Intel's process advantage is not helping much, it doesn't; despite the better process, it is nowhere near competitive.

Intel x86 uses a lot more transistors to get the same performance, and typically uses more power to do the same job compared to a RISC architecture. It will ALWAYS use more transistors (bigger chips), and its chips will use more power than ARM's on the same process; Intel would have to change to a RISC architecture to do otherwise.

RE: Mick, i hate your processor articles
By french toast on 12/30/2011 5:13:19 AM , Rating: 2
..And when he says that if 22nm FinFET doesn't do the job, then they are probably finished in the short term in smartphones, he's right.
- because Windows will have moved onto ARM, and a lot of the x86 apps will have been converted to ARM, lessening any legacy advantage Intel is banking on.
Intel has 18 months to come up with a product competitive with ARM, and not only that, get it running in smartphones, or it's finished in the mobile space.

(against likely 28/22nm HKMG quad-core Cortex A15s @ 3GHz in smartphones lasting all day.)

RE: Mick, i hate your processor articles
By Iketh on 12/30/2011 5:33:31 AM , Rating: 2
You don't get it either. 32nm has advantages over 40nm. This is indisputable. The advantage is just HIDDEN because of the ARCHITECTURE differences. That's all I'm saying. Nothing to argue here. Jason makes himself sound silly because of his wording.

By french toast on 12/30/2011 6:31:10 AM , Rating: 2
Yea, I do get what you're saying; just on the node itself, 32 is better than 40nm. Fair enough.
I was looking at the whole package, as was the writer.

Well this is completely UNsurprising
By name99 on 12/29/2011 3:17:02 PM , Rating: 2
I've said it before, I'll say it again. Intel doomed themselves to irrelevance when some marketing genius insisted that their ultramobile chips incorporate every random piece of x86 crap that's been added to the platform since 1980.

And just to spell it out for you Intel supporters, the issue is NOT only that those (unwanted by any sane OS vendor or HW vendor) additional features take up die area and use power; it is mainly that verifying all this junk takes forever. Look at these numbers --- the chip isn't available yet, it's going to take a long time to become available, and during all that time ARM is prepping A15s which will be ready to go by the time this thing ships. So Intel can proudly debut something that is (barely) performance competitive with A9 just as no one any longer cares about A9.

It didn't have to be this way. Intel COULD have started from scratch --- use what makes sense from the x86-64 instruction set and ditch EVERYTHING else -- make it fairly easy for compilers and assembly to be rewritten, but not automatic; and no nonsense about being able to run x86 code from 1983. But that's what happens when your tech company roadmap becomes dominated by people who know nothing about technology. At least we can be grateful that (so far) these idiots have only infected Atom and iTanic, and haven't yet got a firm foothold in the main product line. (Though they are certainly trying --- witness the ongoing idiocy regarding what features do and don't get enabled on different Sandy Bridge CPUs, or the ignominious failure of Larrabee.)

RE: Well this is completely UNsurprising
By name99 on 12/29/2011 3:52:01 PM , Rating: 2
To add to my point, note the following

Fundamental to ARMv8 has to be the new instruction set, known as A64; the encoding of instructions to enable an application to utilize a 64-bit machine. ARM took the decision to introduce 64-bit through a new instruction set rather than extension of an existing instruction set for many good reasons. Most notably, and probably as no surprise, because we could develop a new independent instruction set to execute code in a lower power manner than by adding instructions to the existing instruction set. Of course, for compatibility reasons, we still support the entire ARMv7 machine in the new ARMv8 architecture, but when running 64-bit software, this part of the machine is not being used, and the area of complex legacy it had built up does not need to be active when running in the 64-bit ISA, unlike other architectures where 64-bit extension was simply added to the historical complexity and legacy of their 32-bit mode.


I think we all know who the "other architectures where 64-bit extension was simply added to the historical complexity and legacy of their 32-bit mode" are.

By Khato on 12/29/2011 4:46:53 PM , Rating: 2
I think we all know who the "other architectures where 64-bit extension was simply added to the historical complexity and legacy of their 32-bit mode" are.

Yup, they're talking about AMD64. Let's all remember which company saddled x86 with a comparatively ill-conceived 64 bit implementation.

As for the rant against x86... meh, everyone's entitled to their opinions.

Questionable premises.
By SlyNine on 12/29/2011 7:52:39 AM , Rating: 2
You never mentioned how much work can be done per watt, and what idle consumption is.

ADD that to the fact that this chip is a concept and not a finished product! Quickly you will see that the real world can be very different from your portrayal of it.

RE: Questionable premises.
By SlyNine on 12/29/2011 7:55:21 AM , Rating: 2
The platform reportedly has a 2.6W TDP at idle and a maximum power consumption of 3.6W when playing 720P Flash video.

Missed it. Still, in a tablet it's about getting the work done and getting back to idle, just as it is for most devices.
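
To put rough numbers on that race-to-idle point (a sketch only: the 3.6 W load and 2.6 W idle figures come from the leaked specs quoted above, but the task durations and the second chip's numbers are invented for illustration):

```python
# Total energy over a fixed window: time spent at load power plus the
# remainder at idle power. A chip with a higher peak draw can still win
# or lose depending on how low its idle draw is and how fast it finishes.

def energy_joules(load_w, idle_w, work_s, window_s):
    """Energy used over window_s seconds: work_s at load, the rest idle."""
    return load_w * work_s + idle_w * (window_s - work_s)

# Hypothetical comparison over a 60 s window:
# chip A: 10 s of work at 3.6 W, then idling at 2.6 W (leaked figures)
# chip B: 20 s of work at 2.5 W, then idling at 1.5 W (made-up figures)
a = energy_joules(load_w=3.6, idle_w=2.6, work_s=10, window_s=60)
b = energy_joules(load_w=2.5, idle_w=1.5, work_s=20, window_s=60)
print(a, b)  # 166.0 110.0 -- here the high idle draw dominates
```

With these made-up workloads, the faster chip still loses on total energy because of its idle draw, which is exactly why the idle figure matters as much as peak TDP.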

Does it matter when the bottleneck is elsewhere?
By Roffles on 12/29/2011 3:39:08 PM , Rating: 2
What about the saying, "Your computer is only as fast as its slowest component"? Does this not apply to phones?

As a Motorola Bionic owner, it's clear to me the integrated peripherals are the power and performance bottleneck. Lousy read, write, random read and random write speeds of both the internal and external Micro SDHC make archiving large files, networking large files, and playing large HD video files frustrating and pointless. The standalone LTE modem is guilty of killing my battery.

I would like to be reading more articles discussing the performance advancements of Micro SDHC and SOC integrated LTE modems. I mean, are we really going to get an entire generation of quad-core SOC's lacking LTE integration? Yes, a phone is more capable with more processing power, but it's getting to a point where a phone's utility as a portable computer is being spoiled by the things that don't seem to be advancing.

I just hope Google keeps to its word and allows Motorola to continue developing phones as a separate entity. They have something special with Webtop, even if it runs like a dog in the current generation of phones. But future hardware and a continued effort to polish the Webtop software will yield a distinct advantage against the competition.

By french toast on 12/29/2011 4:07:41 PM , Rating: 2
I see your point..
A few things to note: Qualcomm is moving its 4G/LTE modem into its Krait design in a couple of months... that means LTE ON DIE at 28nm...

Same thing is happening with ST-Ericsson at the end of next year.
Samsung has just teamed up with DoCoMo and others for the same thing, and is taking this up a notch... 12.8GB/s of memory bandwidth in their next chip, along with 2GB of RAM.
They are also adding DX11 graphics (Mali-T604); this will be announced in 1 month.

Humm... 9183 Score Galaxy Nexus
By texel on 12/30/2011 1:48:24 AM , Rating: 2
I got a 9183 score on the FlexyCore Caffeine 3 benchmark test... with a Galaxy Nexus on ICS.... No mods activated tonight...


By french toast on 12/30/2011 7:40:15 AM , Rating: 2
Cool, is that the same benchmark as the one in the article? Are they comparable? Is it multithreaded?

If so, that means Cortex A9s are more than a match for Atom, clock for clock, with much, much less power on the old 40nm process.

By danjw1 on 12/29/2011 5:13:56 PM , Rating: 2
They continue to underperform. They are wasting time and effort on something they don't seem to be able to deliver on. Intel is at a point where no one will seriously consider them for a mobile product. They either need to put something out there that blows everyone away, and do it soon, or risk being totally written off as not competent in this space.

Long time for this to play out
By tayb on 12/29/2011 7:28:35 PM , Rating: 2
This market is brand spanking new. Intel has not missed anything, and if they launch a mobile CPU in 2015 that is the top performer, the only thing they'll have missed out on is early revenue. Does anyone seriously think Intel is just going to give up and exit the market if Medfield, Medfield's successor, or the successor to that fails? No. They aren't.

Intel still sells around 17% of all chips in the world and has an enormous revenue stream.

A motivated Intel is a scary Intel.

Dead before it comes out...
By dsx724 on 12/28/11, Rating: -1
RE: Dead before it comes out...
By spread on 12/28/2011 11:45:56 PM , Rating: 4
It's half a year away from launch, so it's probably not even production silicon; engineering samples of previous Intel CPUs were always buggy and inefficient, but they're there to test the design and tweak it before final mass production.

It might be able to compete very well. You will know in 6 months.

RE: Dead before it comes out...
By Samus on 12/29/11, Rating: -1
RE: Dead before it comes out...
By BansheeX on 12/29/2011 12:20:10 AM , Rating: 2
I agree... unless we start making some serious strides in battery technology.

RE: Dead before it comes out...
By StevoLincolnite on 12/29/2011 1:25:59 AM , Rating: 5
I don't completely agree.

The further you move down the manufacturing nodes, the smaller the percentage of the die that retaining x86 and x64 takes up.
I remember reading an interview Anand had with an AMD engineer talking about this a long time ago.

Eventually it will reach a point where the die-area cost of x86, and in turn x64, will be insignificant, yet the backwards compatibility with 15+ years of software and games is simply huge.

For example, take the game known as Master of Orion 2, which is almost 16 years old; it functions brilliantly on my Atom-powered tablet, even with the touch screen - which, funnily enough, means the game pre-dates technology that is now commonplace.
With resolutions on phones now approaching the resolutions we had 15 years ago on the PC... I would love to be able to play such games on the go.

I consider the mid/late 90's the golden age of gaming, when developers weren't afraid of trying new things and you had classics like Master of Orion, Sacrifice, Battlezone, MechWarrior and the like; it would be a shame to lose them in the passage of time, never to be played again.
Better than the sequels that get thrown out year after year with very little changed.
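
The shrinking-overhead argument can be sketched numerically (all transistor budgets below are made-up figures, not real die data; the point is only the trend):

```python
# If the x86 legacy/decode logic needs a roughly fixed transistor budget
# while the total transistor budget per die roughly doubles each node,
# the legacy overhead shrinks as a fraction of the chip at every shrink.

LEGACY_TRANSISTORS = 2_000_000   # assumed fixed cost of legacy support
nodes = ["65nm", "45nm", "32nm", "22nm"]
total = 50_000_000               # assumed total budget at the first node

for node in nodes:
    overhead = LEGACY_TRANSISTORS / total
    print(f"{node}: legacy overhead ~{overhead:.1%}")
    total *= 2                   # budget roughly doubles per node
```

Under these assumptions the overhead falls from 4.0% to 0.5% in four nodes, which is the interview's point: the die-area cost of compatibility trends toward insignificance.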

RE: Dead before it comes out...
By Lerianis on 12/29/11, Rating: 0
RE: Dead before it comes out...
By StevoLincolnite on 12/29/2011 12:26:40 PM , Rating: 3
That is more because of good 'emulation' software or a rewrite to the code that makes it compatible there.

Nope. I used my original copy from 1996, on CD, in its original packaging.
I had to image the disc and chuck it on a flash drive first, but Master of Orion 2 was just an install and play.

The fact is that at some point in the future, even smartphones are going to be 64-bit or the equivalent ARM architecture so that they can address more than 4GB of memory for the OS, thereby making things faster.

Not exactly; ARM doesn't have to go 64-bit to address a memory pool larger than 4GB.
They can (they might have already?) implement LPAE, also known as the Large Physical Address Extension.
A similar thing exists in the x86 world, known as PAE.
It would enable the chips to address up to 1 terabyte of memory.
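
The 1-terabyte figure follows directly from LPAE's 40-bit physical address width (a quick arithmetic sketch; the 32- and 40-bit widths are the only figures taken from the discussion above):

```python
# An n-bit physical address can reach 2**n distinct byte addresses.
# 32 bits -> 4 GiB (the classic limit); LPAE's 40 bits -> 1 TiB.

def addressable_bytes(bits):
    """Bytes addressable with a physical address of the given width."""
    return 1 << bits

print(addressable_bytes(32) // 1024**3, "GiB")  # 4 GiB
print(addressable_bytes(40) // 1024**4, "TiB")  # 1 TiB
```

Note the virtual address space of each process stays 32-bit under LPAE/PAE; only the physical pool the OS can hand out grows.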

RE: Dead before it comes out...
By name99 on 12/29/2011 3:45:54 PM , Rating: 1
You do know that all these issues have already been resolved, right?

A15 brings an extended memory model called LPAE, which behaves somewhat like PAE though, as I understand it, the reason for it is very different.
PAE was about allowing many apps (each using less than 4GB) to utilize the memory of a system with more than 4GB. This is a model that makes sense for servers, but not for phones where few apps (especially really large apps) run at once.
So LPAE is more about providing a clean way for co-processors (like GPU) to share memory with the CPU. (ARM don't say this, but I would not be surprised if another reason for LPAE is to allow flash storage to be brought "directly" into the memory hierarchy, the way IBM AS/400 works. I could see Apple working towards this.)

Meanwhile A64, the ARM 64-bit extension, has also been defined, including the new registers and memory model.

RE: Dead before it comes out...
By Samus on 12/30/2011 1:50:22 AM , Rating: 1
I love all this talk about how x86 is so awesome because of 15+ years of backwards compatibility.

Anything designed to run on x86 that's more than even a FEW YEARS OLD will run at full speed through emulation. Apple bet on that when they moved from PPC to x86, and it worked perfectly. And who can forget DOSBox?

The reality is that a modern operating system (for tablet or phone) isn't going to run x86 programs just because it has an x86 processor, because the operating system (unless it's Windows 8) isn't going to be compatible with the executables (Android on x86 obviously can't run Windows programs).

Intel is literally pushing a bloated tank uphill by sticking with x86 on mobile platforms. The x86 instruction set has been stacked on like a Sega console: you can bolt a Sega CD and a 32X onto a Genesis, but in the end you still have the same old-school core system using more and more electricity.

RE: Dead before it comes out...
By french toast on 12/30/2011 7:18:04 AM , Rating: 1
I'm sorry, I don't get all this talk about 15-year backwards compatibility with ancient x86 games and software...
Who cares??
The smartphone and tablet market... which, let's face it, is what we are talking about here... the MAJORITY of people running ARM versions of Windows or the Google equivalent aren't going to worry about that.

Besides, very quickly the most important x86 apps that would benefit from being on a tablet/smartphone will be rewritten for ARM... problem solved.
Of course there will be a small niche of people that will benefit from all the old apps; in that case they will have to make do with a full laptop/PC variant.

I get the feeling, however, that not a whole lot of those apps will be optimised for a small-screen/touch input method and so won't be used; instead, developers will just release a more up-to-date, touch-friendly version on ARM for the masses.

Qualcomm Krait/Cortex A15 have extensions enabling 40-bit memory addressing; I think that's what you were saying above...

RE: Dead before it comes out...
By sprockkets on 12/29/2011 9:28:27 AM , Rating: 2
Yeah, but this isn't the desktop. You'd have a hell of a time convincing the smartphone world to go to x86, as opposed to just the desktop world staying on x86.

And I'd rather not see x86 and Intel's empire last any longer.

RE: Dead before it comes out...
By NellyFromMA on 12/29/2011 10:21:45 AM , Rating: 2
What makes you think you're not supporting the making of a new ARM empire instead...?

Intel's empire has supported probably virtually all of the daily benefits in your life for the past 20 years, provided you are that old....

ARM has just provided you with a fun handheld for a little bit in recent times, indirectly....

RE: Dead before it comes out...
By sprockkets on 12/29/2011 11:34:56 AM , Rating: 2
ARM allows each company to make its own implementation; with Intel, well, there's that whole AMD saga, and their antitrust BS.

Intel should have just kept XScale and kept working on it.

RE: Dead before it comes out...
By name99 on 12/29/2011 3:23:42 PM , Rating: 1
The further you move down the manufacturing nodes, the smaller the percentage of the die that retaining x86 and x64 takes up.
I remember reading an interview Anand had with an AMD engineer talking about this a long time ago.

Eventually it will reach a point where the die-area cost of x86, and in turn x64, will be insignificant, yet the backwards compatibility with 15+ years of software and games is simply huge.

You just don't get it.
The point is NOT the die cost of x86 compatibility; it is the COMPLEXITY cost. FULL x86 compatibility (including ALL that crap --- segments, 286 mode, virtual 8086 mode, SMM, PAE, MMX sharing registers with x87 FP, which by itself is useless because any sane programmer uses SSE for FP --- on and on it goes) requires phenomenal complexity --- which means Intel cannot change these chips and improve them as fast as ARM.

We've already seen this. Intel have been at this for a few years now. They know it matters. They have vastly more resources than ARM. Yet EVERYTHING they ship is too little too late. That's the cost of complexity.

RE: Dead before it comes out...
By dsx724 on 12/30/2011 5:52:17 AM , Rating: 2
So right!
ARM is a light, scalable architecture with low complexity, thanks to its lack of legacy compatibility. x86 is a minefield of archaic design decisions, instructions, registers, and MMU behavior that hinders everything from VLSI design to production masks from a cost, testing, time, performance, and power perspective. You can pump out 10 ARM designs in the time it takes to design 1 x86 processor, so it's going to be really hard for Intel to compete in the highly integrated space.

RE: Dead before it comes out...
By french toast on 12/30/2011 7:54:58 AM , Rating: 2
That may or may not be true; however, I also read somewhere on AnandTech that the performance improvements get a lot smaller as you go down past 32/22nm.
Someone else on this thread mentioned the small difference between Ivy Bridge and Sandy Bridge as proof.

This whole nonsense about backwards compatibility with 15-year-old apps and games is pointless. All the old apps are not optimised for small smartphone touchscreens (see Windows 7 tablets), so it will greatly benefit developers to re-release any worthwhile apps, newly optimised that way, for the millions of ARM smartphones that will ALREADY be shipping next year.

And only a relatively small number of techies (or geeks) would think about them anyway; most average people will just trot to the available app store and find everything they need right there.

Regardless of whether what you said is true or not, Intel is ALREADY ahead on process nodes and is nowhere near competitive.
22nm doesn't arrive within the next 18 months or so, and that's according to optimistic PDF slides; it would MAYBE make them competitive now... 18 months is WAY WAY too late.

RE: Dead before it comes out...
By aegisofrime on 12/29/2011 1:03:32 AM , Rating: 2
Agreed. Medfield is laughable. Let's not forget that Cortex A15 parts have taped out and are on the verge of release. If Medfield is only slightly beating the Cortex A9, I don't see what chance it has against the A15.

The Exynos 5xxx series should be a beast.

RE: Dead before it comes out...
By JasonMick (blog) on 12/29/2011 9:04:44 AM , Rating: 2
It's half a year away from launch, so it's probably not even production silicon; engineering samples of previous Intel CPUs were always buggy and inefficient, but they're there to test the design and tweak it before final mass production.

It might be able to compete very well. You will know in 6 months.

You could well be right in that it may make a decent tablet chip once in production form... as I said in the piece, at least it's powerful.

That said, I think it's troubling that Intel has scrapped its smartphone effort and turned Medfield into yet another tablet platform. To me that is clearly a desperation move due to the power figures being so bad (even the supposed final target is way too high for a smartphone).

And it is purely nonsensical. Why do you need THREE different platforms (Oak Trail, Moorestown, Medfield) in a market where you're likely selling less than one million units? It seems like gross overkill...

Medfield was SUPPOSED to be a smartphone chip. If it turns out to be a decent tablet chip, great, but what Intel really needed was a competitive smartphone chip.

It should just swallow its pride and license ARM, or scoop up an ARM licensee....

RE: Dead before it comes out...
By NellyFromMA on 12/29/2011 10:24:48 AM , Rating: 2
Wouldn't picking up an ARM license be nearly equivalent to Microsoft saying it now needs to host some of its cloud services on iCloud in order to remain competitive?

I don't think the [Intel] shareholders would take that as a symbol of strength.

RE: Dead before it comes out...
By carniver on 12/29/2011 12:46:35 PM , Rating: 3
Intel can combine ARM architecture and their fabrication prowess to kill all competition first, and then reshape the market as they please

RE: Dead before it comes out...
By zzss on 12/30/2011 11:17:02 AM , Rating: 3
Intel has scrapped its smartphone effort and turned Medfield into yet another tablet platform.
Source, please??
If my information is correct, there will be 2 Medfield SoC SKUs, smartphone and tablet. VR-Zone mixed up the smartphone performance numbers with the tablet power spec. Check the AnandTech posting.

I hope you can do some research before jumping to any conclusions....

By haukionkannel on 12/29/2011 5:16:24 AM , Rating: 2
Well, Intel GPUs are not state of the art, and still Intel is the biggest GPU manufacturer in the world...
So this may not be much, but Intel has money, and if they can get to 22nm and 18nm fast, they can catch the competitors.

I think that Intel at this moment is just trying out the concept, and if everything seems to be "ok" they will put more effort into the next iterations.
It may be enough that this will sell just because it is Intel and it is x86- and x64-compatible...

"What would I do? I'd shut it down and give the money back to the shareholders." -- Michael Dell, after being asked what to do with Apple Computer in 1997

Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki